
Evaluating an Executive Function Coaching Platform for Teens

  • Writer: Hannah Ngọc-Hân Đào
  • Mar 23
  • 6 min read


In today’s world, many students struggle with staying focused and retaining information, which can make school feel overwhelming. As these challenges grow, parents are increasingly looking for tools that can help teens stay organized, build routines, and feel prepared for their day-to-day responsibilities.


We partnered with Coachbit to explore how a digital coaching experience might support executive function skills—such as planning, focus, and time management—while still respecting teens’ independence, trust, and motivation.



The Team


  • Alison Tu, Lead UX Researcher

  • Sean Peterson, UX Researcher

  • Nicholas Montecino, UX Researcher

Setting the Stage

Coachbit is a digital coaching platform designed to support students ages 10–21 in building stronger organizational skills, improving goal planning, and developing greater independence in their daily lives.


As a startup focused on executive function support, Coachbit partnered with our lab to better understand how students experience their platform. Through this collaboration, we evaluated the effectiveness of the learning path designed for students, identified potential usability issues within the app, and explored how the digital tools and coaching experience work together to support student engagement and growth.

The Tasks

Define how learning paths should be introduced to students.

Test how sequencing impacts motivation and retention.

Assess usability and identify opportunities for improvement.


Research Questions

  1. How should learning paths be structured to balance consistency and flexibility?

  2. What makes a learning path feel engaging, supportive, and achievable for teens?


The Three-Part Approach

Part 1: Heuristic Evaluation

Our first step was to conduct a heuristic evaluation using the Nielsen Norman Group’s 10 usability principles. We chose this method because it was efficient in terms of time, cost, and analysis, and allowed us to review the entire app using well-established UX guidelines.


Each team member evaluated the app individually, then we met to compare our findings. We rated each principle on a scale from 1 to 4 and used sticky notes to label and organize our observations.


After a detailed group discussion, we created a shared chart that brought together all of our observations, identified problems, possible solutions, and key insights in one place.

Heuristic Evaluation Rule #1

Part 2: Qualitative User Interviews

Our next step was to conduct one-on-one qualitative interviews with Coachbit students. The interviews were held online and included student volunteers ages 10–20 who had prior experience using the app.


During the interviews, we wanted to understand whether students clearly understood the learning path, felt it supported motivation and information retention, and whether the content met different learning needs. We also explored how students felt about their coaches and app tools, and whether the overall experience felt supportive and engaging. A preliminary interview guide was created based on the project’s research questions and revised five times before reaching its final version.


Participants were recruited through Coachbit and volunteered to take part in the study. Stakeholders were informed of our research goals and focus areas to ensure we addressed their initial questions. The research team conducted interviews individually, recorded each session for note-taking purposes, and collected both verbal and written consent before beginning the interviews.



Part 2a: Thematic Analysis

After the interviews, our team reviewed transcripts and written notes together to identify key themes. During our initial coding, we focused on recurring comments related to learning paths, flexibility and pacing, coaching experience, engagement and motivation, as well as app tools and overall usability.


We highlighted both repeated pain points and positive experiences shared by participants to better understand common patterns across interviews.


To refine our analysis, we organized our findings into three primary themes:

  • Learning Path Structure: focused on pacing, flexibility, clarity, and progression through modules.

  • Engagement & Emotional Support: included students’ emotional needs, such as encouragement, psychological safety, confidence, and self-efficacy, as well as the role of coaching in motivation.

  • App Functionality & Usability: covered tools, navigation, customization features, and areas causing frustration or confusion.



Organizing the data this way allowed us to more clearly observe trends and relationships across student experiences.
Thematic Analysis From Interviews

Part 3: What We Discovered Along the Way

Our findings suggest that flexibility is essential within learning paths. Rigid structures often led students to rush through content or experience repeated frustration when schedules changed. Learning paths were most effective when paired with coach support, as coaches helped provide clarity, understanding, and accountability throughout student progress.


Roadblocks require clearer purpose and guidance. Many students viewed roadblocks as barriers rather than learning tools. They became more effective when they encouraged conversation between students and coaches, instead of simply preventing forward movement. When framed as opportunities for reflection and discussion, roadblocks supported deeper engagement.


Coaches emerged as the most important part of the Coachbit experience. Across interviews, students consistently described coaches as the primary drivers of engagement, encouragement, reassurance, and accountability. Coaching sessions offered a safe space where students could speak openly about challenges, regain confidence, and plan ahead without judgment.


Rigid habit tracking created frustration for some students. Students reported challenges with AI-based habit verification and time-sensitive habits that felt unforgiving—especially when rewards depended on streaks or consistency. These systems sometimes failed to reflect real-life circumstances, which could negatively impact motivation rather than reinforcing progress.



Design Implications

Based on our research findings, five design opportunities emerged that could strengthen student engagement, understanding, and motivation.

Support Emotional Safety and Trust Early

Early experiences should prioritize emotional reassurance and trust-building.

Students benefit from having space for deeper questions and conversations with their coaches, especially when content or expectations feel unclear.

Encouraging open dialogue early helps reduce confusion and anxiety around progress.

Reframe Roadblocks as Preparation

The current roadblock metaphor often feels like an obstruction or punishment.

Shifting imagery from “stopping” (e.g., construction signs or barriers) to “preparation” (such as training camp, rest, or animal hibernation) may help students better understand their purpose.

This reframing positions roadblocks as supportive moments rather than interruptions.

Increase Flexibility Through Coach Support

Manual coach overrides can help address situations where AI verification does not reflect real student behavior.

Encouraging students to communicate directly with their coach provides accountability while maintaining flexibility.

Temporarily pausing disputed tiles and allowing students to return later is another effective solution.

Prevent Accidental Loss of Progress

Tools such as timers can unintentionally reset a student’s progress after an accidental click.

Currently, students are unable to view progress history, which removes an important motivational cue.


Shift Toward More Active Learning

Reduce text-heavy modules where possible.

Embed interactive tasks directly into lessons, such as:

  • short written reflections

  • drawing or visual storytelling (e.g., comics)

  • hands-on prompts


Include more recap sessions with coaches present, allowing students to demonstrate understanding verbally and reinforce mastery.


Impact & Outcomes

Our research helped us understand how students experience Coachbit’s learning path, and it highlighted how important the coach–student connection is for teens using the app. While the project began by examining how learning paths are introduced and sequenced, we found that students rarely connected with the path on its own. Instead, their relationship with their coach was what gave their learning journey meaning and made the content feel manageable.


These insights reshape how the platform delivers value:

  • Coaches make the learning path work by helping students understand tasks and stay motivated.

  • Tools should support relationships, not replace the coach–student connection.

  • Flexibility is key, since rigid systems often clash with real student routines.

  • Supportive onboarding matters, giving teens clarity and confidence early on.

  • AI should enhance the human connection, not replace the coach’s role.

  • Design should stay flexible and motivating, with pacing and communication shaped for students.

Reflections

This project showed us how important it is to design tools with teens rather than for them. Students were clear, open, and honest about what helped them—and what made things harder. The biggest lesson was that the emotional side of learning matters just as much as the tasks themselves. 


This project reinforced several concepts:

  • Teens are independent; they know what works for them, and rigid learning paths can dull motivation.

  • Emotional support matters, not just clear instructions.

  • Coaches build confidence and make the experience feel human.

  • AI should stay supportive, not replace the coach’s role.

  • Adult assumptions often miss the mark, so tools must match real teen behaviors.


The work prioritized understanding and alignment over premature solution design.


Next Steps

Based on what we learned, the next phase of work should continue supporting the coach–student relationship while testing ways to make learning paths more flexible and more responsive.


  • Co‑design with teens to refine learning paths and pacing.

  • Study long‑term motivation to understand where students slow down or disengage.

  • Use AI to strengthen coaching, not replace it.

  • Keep caregivers involved lightly without harming the coach–teen bond.

  • Prototype flexible features like adjustable pacing and coach overrides.

 
 
 


