Workout Cards +
Rating of Perceived Exertion (RPE)
TrainingPeaks prides itself on providing extensive, science-based workout data analysis for coaches and athletes. Analyzing every detail of a workout is a huge asset for those who want to dig into that level of granularity - but hard data only tells one part of the story. To get the full picture of an athlete’s training, we need subjective feedback. Enter rating of perceived exertion (RPE), a measure of how a workout felt to the athlete. By bringing this type of qualitative data into the equation, coaches can better train their athletes, and athletes can start to see trends in their performance based on their RPE values over time.
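To make the idea concrete, here is a minimal sketch of how subjective feedback can sit alongside objective workout data. The field names and the 1-10 scale are illustrative assumptions, not TrainingPeaks’ actual schema:

```typescript
// A minimal sketch: subjective feedback (RPE, comments) stored next to
// objective metrics. Field names and the 1-10 scale are assumptions for
// illustration, not TrainingPeaks' real data model.
interface WorkoutSummary {
  durationMinutes: number;  // hard data captured by the device
  distanceKm?: number;
  avgHeartRate?: number;
  rpe?: number;             // athlete-reported, e.g. 1 (very easy) to 10 (max effort)
  comment?: string;         // free-form subjective feedback
}

// With RPE stored per workout, a coach can spot trends such as the same
// hard numbers starting to *feel* harder over time.
function meanRpe(workouts: WorkoutSummary[]): number | undefined {
  const rated = workouts.filter(w => w.rpe !== undefined);
  if (rated.length === 0) return undefined;
  return rated.reduce((sum, w) => sum + (w.rpe ?? 0), 0) / rated.length;
}
```

A rising mean RPE across workouts with stable objective numbers is exactly the kind of trend the hard data alone would miss.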
As the product design lead, I was part of the mobile initiatives team, made up of product management, design, engineering, data, and QA. I led the UX and UI work, producing all major deliverables and presenting them to stakeholders throughout the process.
This work was done April through June 2018 and was implemented on the TrainingPeaks web app and both the iOS and Android mobile apps.
When we began the Understand phase to address our lack of a way to track subjective feedback, we blissfully thought we were just after a solution to round out workout data. We started by digging into our UserVoice platform, which held loads of RPE requests from our users, to understand the breadth of those requests and start to understand the problem. We reviewed all requests related to workout compliance, data, and subjective feedback and got clear on why it was important to solve this problem for our users.
As we dove deeper into the problem, we quickly realized that not only did our interface lack the space for any additional data points, but the existing interface was also failing to give coaches and athletes the information they needed to use our product effectively and efficiently.
We have multiple teams that interface with coaches every day, so to better understand the feedback from our users, we met with the Customer Success, Education, Sales, and Coach Management teams. The cross-team collaboration proved extremely valuable not only for understanding the problem, but also for gaining alignment amongst teams. This was a huge added bonus for us as we tackled a much larger problem than originally intended - reworking a major element of our product would be an arduous undertaking without full support from the broader team.
Value and Opportunity
I wanted to get a sense of what athletes expect to see before and after each workout, the data that is most valuable to them, and what type of information they include in comments (if they comment at all). Working with a junior data analyst, I sent out a survey and followed up with moderated interviews.
Through this process I was able to validate the need for RPE and understand the role it would play in both the coach and athlete experiences. This research also uncovered the myriad problems that plagued the data visualization and information architecture of our existing workout cards. The opportunities were endless.
In order to add RPE, we first needed to fix the usability issues that ran rampant across the workout cards. Fortunately, at this point I could paint a clear picture from the research and start the design phase with journey maps, sketches, and wireframes.
The new set of problems
Based on the research:
There is no differentiation between an incomplete workout and a workout that was completed but did not hit the planned targets.
There is no way to tell an incomplete, planned workout from a completed, unplanned workout on the current day. (Confusing, I know.)
There is no clear hierarchy of information.
We are showing data that users don’t want/need.
We still need to solve for adding subjective feedback.
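The first two problems boil down to card states that the old interface collapsed together. A hypothetical classification, with illustrative names and a made-up compliance tolerance rather than TrainingPeaks’ actual logic, might look like this:

```typescript
// Hypothetical workout-card states capturing the distinctions the research
// surfaced. State names and the compliance tolerance are assumptions for
// illustration, not TrainingPeaks' actual model.
type CardState =
  | "planned"              // planned workout whose day has not passed yet
  | "incomplete"           // planned workout whose day has passed, never done
  | "completed-missed"     // completed, but outside the plan's targets
  | "completed-on-target"  // completed and within the plan's targets
  | "unplanned";           // completed workout that was never planned

interface Workout {
  planned: boolean;
  completed: boolean;
  plannedDuration?: number;   // minutes
  completedDuration?: number; // minutes
  dateHasPassed: boolean;
}

function classify(w: Workout, complianceTolerance = 0.2): CardState {
  if (!w.planned && w.completed) return "unplanned";
  if (w.planned && !w.completed) {
    return w.dateHasPassed ? "incomplete" : "planned";
  }
  // Planned and completed: compare the result against the plan's targets.
  const planned = w.plannedDuration ?? 0;
  const done = w.completedDuration ?? 0;
  const withinTarget =
    planned > 0 && Math.abs(done - planned) / planned <= complianceTolerance;
  return withinTarget ? "completed-on-target" : "completed-missed";
}
```

The old cards rendered "incomplete" and "completed-missed" identically, and "planned" and "unplanned" identically on the current day - which is exactly the ambiguity the list above describes.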
Who are we solving for?
For the workout cards, we focused on uncoached athletes, who would need a bit more hand-holding to understand training metrics.
We focused on coaches as the target user for RPE, with uncoached athletes as the secondary focus.
WORKOUT CARD Design
In the design phase, we took everything we had learned and started sketching ideas and wireframes. We knew we needed to clean up the cards and dedicate space for subjective feedback, and we saw a huge opportunity to let the value of TrainingPeaks really shine through an improved interface and user experience.
Using the three strongest designs (as determined by key stakeholders), we created clickable prototypes with InVision. We ran two rounds of moderated, in-person interviews, totaling 15 participants. From these tests we were able to refine the hierarchy to improve usability, create an interface that could be scanned quickly, and give each key data point a home on the card.
Back to RPE
Using the research we gathered early on, we jumped right into UX flows and sketching.
I used see/do diagrams to help illustrate the experience for an athlete inputting RPE after a workout. This helped give clarity when collaborating with team members before we did any design work.
Low fidelity testing
Even though RPE is an action taken by athletes, we found that coaches gain the most value from this feature. This made testing tricky, as we were testing two different experiences for the same feature: the athlete providing the feedback and the coach ingesting that data.
We started with wireframes in Balsamiq and got them in front of as many coaches and athletes as we could, as quickly as possible, so that we could keep momentum and make small, impactful tweaks from test to test.
We saw that for usability we needed to test with athletes, but for the information presented we needed to test with coaches.
With this type of input from athletes and coaches, we created an experience that was intuitive for athletes, and extremely useful for coaches.
High fidelity testing
After a few rounds of tests, we were unable to get a clear answer on one part of the experience - there was no decisive result on where commenting should live within the RPE experience.
We decided to move to high-fidelity prototypes with three variations of the commenting flow. We ran one more round of moderated usability tests with the high-fidelity prototypes, which yielded similar uncertainty.
Design and product ultimately made a final decision based on UX best practices, dev effort, and scope.
Was it successful?
We released the updated workout cards first, without RPE, and saw some backlash from power users who were accustomed to the old layout. We monitored feedback and usage very closely, and once the initial “shock” wore off, even those users were very pleased with the new look and layout. On the flip side, we also heard tons of great feedback from users who loved the new cards from day one.
We heard via social channels, Customer Success, and a follow-up survey that the new cards are much easier to read. In a later iteration of the cards, we made the commenting, peaks, and RPE icons tappable, which has proven hugely valuable for efficiency.
RPE was praised from the moment it was released. Coaches and athletes love “the smileys” and we have heard from many coaches how much this feature helps them coach their athletes.
As far as success metrics go, both the cards and RPE were deemed a success based on increased coach retention and athlete satisfaction. One of the best pieces of feedback? This article from a coach praising the importance of RPE on TrainingPeaks.