PRODUCT DESIGNER @ KITMAN LABS
I used a basic design thinking approach to create a feature that lets backroom staff at sports teams rate their players as part of a review process.
Understanding
The initial idea for the feature originated when we signed a partnership with Chelsea F.C. As part of their player development process, they set player goals and review progress over time.
Kitman Labs develops a generalised product and we wanted to be certain this feature would be useful for all our customers, so we reached out to a broad sample of performance-oriented staff from different sports and ran a number of semi-structured interviews. We quickly found that setting goals as a method for developing players is a common approach. The process was also similar across teams, with only slight variations in the frequency of reviews and the level of detail documented.
For example, a team like Chelsea F.C. (English football team) has far more time and staff available to work with each of their players, so the goal-setting process is heavily curated and progress is reviewed every 6 weeks with a number of staff members. Reviews are based on three specific goals per player along with a standard set of metrics, such as Energy and Training Performance, each of which has a clear definition.
A smaller team like Stellenbosch (South African rugby team), with half the number of staff, also creates a set of goals for each player, but the goals are selected from a predefined list rather than tailored to the player's exact development needs, reviews happen less often (twice a year), and the ratings are less in-depth.



Another interesting finding concerned how formalised the process was at different teams. Generally, many teams were not as developed as Chelsea F.C., and little was operationalised. For example, at Saracens Rugby (English rugby team) and the Ducks (American hockey team), each staff member had a different approach to working with players and conducting the review process, even within the same team. At Chelsea F.C., by contrast, things are rigorous and thought out, and the process is standardised; they are also considered the benchmark for player development in sport. This highlighted an opportunity to reflect such a process in our software, offer guidance, and make it available to all customers.
Reinforcing Loops
Below is a breakdown of the As-Is. It was a simple reinforcing loop. We then intervened (To-Be) and tried to change the relationship.
As-Is
This is a simple reinforcing loop: the coach observes the player's performance and gives unstructured, often biased, feedback.


To-Be
Based on the feedback from Chelsea and working with them, we updated the format to include more varied feedback: observations are still important, since coaches can pick up on many things data can't, but data also helps. We also included player self-assessment and proposed a structured analysis by staff. The aim of the designs was to facilitate a more structured and detailed feedback loop.
Definition
As part of the work we decided to concentrate on needs 3, 4 and 5 (listed below) for the MVP. We already have a calendaring tool, so point 1 was covered. Point 2 we felt was covered by the graphing area, where all metrics and goals, once entered, could be visualised per player. Finally, sharing the report was already covered by our messaging feature and our download-to-PDF feature.
1. Schedule
A Sports Scientist needs a way to plan and schedule an assessment in advance so staff members can know what they must do and when.
2. Prepare
A Sports Scientist needs a way to prepare for an assessment so they can come to the meeting informed.
3. Rate
A Sports Scientist needs a way to rate an athlete so they can document how well that athlete is developing.
4. Discuss
A Sports Scientist needs a way to see other staff members' ratings so they can discuss those ratings.
5. Visualise
A Sports Scientist needs a way to plot ratings using different visualisation types so they can analyse the development of an athlete.
6. Share
A Sports Scientist needs a way to share rating reports with staff, athletes, teachers and parents so they can keep them informed and engaged.
Ideation
Once the needs were identified we started moving into an ideation phase. A number of options were identified. Two concepts stood out, but neither was an obvious selection.
Concept 1 – Grouped
The idea behind Concept 1 was the ability for staff to quickly enter multiple ratings across a number of different players and variables, while also being able to reference other players.
Concept 2 – Focused
The idea behind Concept 2 was a focused input for a single player at a time, with that player's previous ratings shown for reference.

Usability Testing & Iterations
After running a set of usability tests with a set script, we got clear signals that Concept 2 – Focused was by far the preferred input mechanism, and its historical aspect allowed previous ratings to be referenced. What was interesting about Concept 1 – Grouped was how useful that view could be at a general meeting, when comparing players and reviewing each player's averages across a season or a longer time frame. We decided to build the Focused concept and to later develop the table-based Grouped concept in the graphing area. A number of smaller interactions were also identified as needing updates.
Updates
1. Added an expand-all function
2. Added the ability to attach a comment and a staff member to a metric; removed talking points
3. Categorised metrics by adding sections
4. Added a link back from the reports section to the forms



Delivery + Outcomes
Delivery
The work was picked up by the engineers while I was running tests with staff at different clubs. I had already shared the work with the front-end and back-end teams, both of which were comfortable building either direction. We all knew both directions would probably be needed in the longer run, but being a small company we could not build both, so we looked to the users to tell us which was the priority. The back-end team set up the infrastructure so that the choice of UI would not constrain how they served the data; it was the same data in both views.
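To illustrate the "same data, two views" point, here is a minimal sketch in TypeScript (the types and field names are hypothetical, not the actual Kitman Labs schema): the Focused view filters and orders ratings for one athlete, while the Grouped view aggregates the same records across athletes.

```typescript
// Hypothetical shape of a single rating entry; both views consume the same records.
interface Rating {
  athleteId: string;
  metric: string;   // e.g. "Energy" or "Training Performance"
  score: number;    // rating on the agreed scale
  ratedBy: string;  // staff member who entered the rating
  ratedAt: Date;
}

// Focused view: one athlete, with their rating history in chronological order
// so previous ratings can be referenced.
function focusedView(ratings: Rating[], athleteId: string): Rating[] {
  return ratings
    .filter(r => r.athleteId === athleteId)
    .sort((a, b) => a.ratedAt.getTime() - b.ratedAt.getTime());
}

// Grouped view: average score per athlete and metric, for comparing players
// side by side at a general meeting.
function groupedView(ratings: Rating[]): Map<string, Map<string, number>> {
  const totals = new Map<string, Map<string, { sum: number; count: number }>>();
  for (const r of ratings) {
    const byMetric =
      totals.get(r.athleteId) ?? new Map<string, { sum: number; count: number }>();
    const agg = byMetric.get(r.metric) ?? { sum: 0, count: 0 };
    byMetric.set(r.metric, { sum: agg.sum + r.score, count: agg.count + 1 });
    totals.set(r.athleteId, byMetric);
  }
  const averages = new Map<string, Map<string, number>>();
  for (const [athleteId, byMetric] of totals) {
    const avg = new Map<string, number>();
    for (const [metric, { sum, count }] of byMetric) {
      avg.set(metric, sum / count);
    }
    averages.set(athleteId, avg);
  }
  return averages;
}
```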
Once the direction was set, the work became standard: working through the details with the squad and QA'ing it.
Outcomes
Internally and externally, this tool saw a lot of usage. Google Analytics showed strong usage stats; we had no benchmark for what counted as good, but all customers were using it, and based on feedback gathered via our field team it was strongly seen as a success. This isn't the most scientific approach to defining success, but it was the best we had.
Looking at customer product accounts, and thanks to the flexibility of the area, club staff were using the tool for many different use cases: medical assessments, physical assessments, general player feedback and a host of others. In essence, the tool is just a data entry tool for highly structured data. That is what made it useful; it went wider than the originally intended use case.
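As a rough sketch of what "highly structured data entry" can mean here (again, the names are illustrative rather than the real product model), a single form-template structure can describe a medical assessment, a physical assessment or a general review: sections group metrics, and each metric carries a scale and an optional comment, mirroring the updates listed earlier.

```typescript
// Hypothetical form template: sections group metrics, and each metric has a
// defined scale, so very different assessments share one structure.
interface MetricDefinition {
  name: string;               // e.g. "Energy"
  scale: { min: number; max: number };
  allowComment: boolean;      // a comment and staff member can be attached
}

interface FormSection {
  title: string;              // e.g. "Physical" or "Medical"
  metrics: MetricDefinition[];
}

interface AssessmentForm {
  name: string;               // e.g. "6-week review" or "Medical assessment"
  sections: FormSection[];
}

// The same template type covers the original review use case and others.
const sixWeekReview: AssessmentForm = {
  name: "6-week review",
  sections: [
    {
      title: "Standard metrics",
      metrics: [
        { name: "Energy", scale: { min: 1, max: 5 }, allowComment: true },
        { name: "Training Performance", scale: { min: 1, max: 5 }, allowComment: true },
      ],
    },
  ],
};
```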
Feature requests came in quickly after release: links to other areas of the product (particularly the reporting area), document uploads, video uploads and a host of other great ideas, which we added to the backlog and started to work through.