Therabot
A voice-to-voice feature for a clinically validated AI therapy chatbot to reduce symptoms of depression & anxiety
Team: 2 professors, 1 software engineer, 3 ML engineers, 1 data scientist, 1 PM, 1 Designer (me).
Timeline: 4 months (ongoing)
Role: Therabot was initially developed in a research lab at Dartmouth, where it launched and passed a clinical trial in 2024. I joined to design a new end-to-end voice-to-voice experience that improved warmth and accessibility, and to redesign the core app flows into a V2 that elevated user trust and usability.
Outcomes:
- 0->1 prototype of AI voice-to-voice: Designed, tested & prototyped the full conversational experience, ready for development
- End-to-end app redesign: Reimagined all core screens & interactions to create a more intuitive & therapeutic V2
- New revamped design system: Rebuilt the visual language to establish a trustworthy tone and enable future scalability
- Validated improved usability: Conducted 15+ user tests and 1 UX survey with 100+ users during the clinical trial
Tools: Figma, Figma Make, Figma Prototype, ChatGPT, Lottiefiles, Lovable

RESULTS
💬 Users reported stronger emotional connection: 90% described the new voice-to-voice experience as more human, empathetic, and calming.
✨ User satisfaction & trust rose from 7.4 → 9.1 / 10 after redesign (clinical-trial survey, n=100+ users).
🎧 Average session length increased 2.3X with voice interactions vs. text-only mode.
PROBLEM
Therapy is inaccessible, and AI support still feels inhuman.
Despite growing access to digital therapy, text-based conversations still feel transactional and detached, failing to provide the emotional presence people need in vulnerable moments. Many users struggle to feel truly heard or understood by AI chatbots - they want human connection and comfort in how the AI speaks, listens and expresses empathy.
BUSINESS CONTEXT
Retention & trust are the next growth levers in mental-health AI.
With the industry projected to exceed $6 billion, many apps lose users within 2 weeks due to low emotional engagement. Designing AI that feels human & trustworthy not only improves well-being - it creates higher retention and differentiation in a crowded market.
OPPORTUNITY
How might we design a voice-to-voice AI experience that makes users feel truly heard and supported?

Enhance emotional connection & trust
Increase accessibility & comfort

Boost engagement & retention
SOLUTION
Therabot
An AI mental-health companion that enables natural text and voice-to-voice conversations that make therapy more human & accessible.

A voice-to-voice experience that emphasizes a natural & warm presence
- Idle, listening, thinking & speaking states
- Crisis mode detection
- AI error handling
Designed using Figma Make & Figma Prototype
Chatbot user flows that prioritize emotional safety & predictability
- Messages list
- Core chat
- Crisis detection
- Voice transcripts

DISCOVERY
No existing tool yet combines emotional intelligence and real-time voice-to-voice interaction
Competitive analysis
I analyzed leading mental-health and voice-assistant products using ChatGPT to identify gaps in emotional design and patterns from effective existing voice-to-voice experiences. I observed key structures & interaction patterns that helped me map out the general architecture.
-> Widely used voice-to-voice products like Gemini, ChatGPT and Meta AI have great flow, but are designed for convenience & accessibility rather than emotional presence.

-> Current mental-health chatbots excel at structured CBT flows & empathetic phrasing, but are text-only or overly scripted.

DISCOVERY
User interviews
User interviews surfaced 3 core challenges preventing Therabot from feeling human, reliable and easy to follow.
I conducted 15+ Zoom testing interviews with new participants to understand usability & trust challenges from a design perspective. Using ChatGPT to extract themes from the transcripts, I found 3 major themes:
AI error confusion
"When it said it didn't understand, I didn't know if it was me or the bot"
"It said things that weren't true about me"
"It took long to respond and said it's thinking"
Emotional tone & therapeutic presence
"It feels a little clinical"
"The language sounds robotic & condescending at times"
"The response rhythm doesn't feel that natural"
Usability & interaction clarity
"The crisis button popped up suddenly it made me anxious"
"It's hard to keep track of conversations"
"Messages and timestamps are confusing for long messages"
DEFINE
The voice-to-voice experience will focus on 3 key pillars:
Design goals


#1: Build psychological safety & trust
Prioritize a predictable environment where users can be vulnerable
#2: Humanize empathy through voice and motion
Use natural pacing, tone and subtle animation to convey warmth
#3: Design for accessibility & natural interactions
Ensure effortless communication via intuitive & seamless flows
DEFINE
User flows & architecture
I mapped the end-to-end architecture to define how human & AI states interact
I detailed an interaction flow capturing every system and user state, which clarified how the interface should behave and feel at each moment to maintain trust.
Because Therabot's AI model carries a 10-15 second latency, I designed micro-interactions (chime placement, thinking animations) to manage user expectations and preserve presence.

I also mapped out the core crisis detection state, AI output errors and user interruptions with pathways designed to pause the session safely and guide the user back to stability.
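The interaction architecture above - idle, listening, thinking and speaking states, plus crisis preemption, latency cover and user interruptions - can be sketched as a small finite-state machine. This is an illustrative sketch only, not Therabot's production code; the state and event names are my assumptions.

```typescript
// Illustrative sketch of the voice session as a finite-state machine.
// State and event names are assumptions, not Therabot's actual API.
type VoiceState = "idle" | "listening" | "thinking" | "speaking" | "crisis";
type VoiceEvent =
  | "userStartsSpeaking" // mic detects speech
  | "userStopsSpeaking"  // utterance complete -> model call begins
  | "responseReady"      // model reply arrives (after the ~10-15 s latency)
  | "responseFinished"   // voice playback done
  | "crisisDetected"     // classifier flags crisis language
  | "crisisResolved";    // user is guided back to stability

// Pure transition function: returns the next state, or the current state
// unchanged when the event is not meaningful there (predictability over surprise).
function nextState(state: VoiceState, event: VoiceEvent): VoiceState {
  if (event === "crisisDetected") return "crisis"; // crisis preempts everything
  switch (state) {
    case "idle":
      return event === "userStartsSpeaking" ? "listening" : state;
    case "listening":
      // The chime + thinking animation covers the 10-15 s model latency here.
      return event === "userStopsSpeaking" ? "thinking" : state;
    case "thinking":
      return event === "responseReady" ? "speaking" : state;
    case "speaking":
      // User interruption: speaking mid-response returns to listening.
      if (event === "userStartsSpeaking") return "listening";
      return event === "responseFinished" ? "idle" : state;
    case "crisis":
      // Session pauses safely until the user is back to stability.
      return event === "crisisResolved" ? "idle" : state;
  }
}
```

Modeling every screen behavior as a transition from this one table is what made the interface predictable: each animation, chime and crisis pathway maps to exactly one state change.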

IDEATE
I designed an architecture that optimizes for natural & focused interactions
Wireframes

A
B
C
Design B felt the most natural for users and created the most focused interaction, with core voice controls easy to access without distraction.
IDEATE
Animation concepts
I explored a range of animation concepts through Figma Make & Lovable.



I ultimately chose a non-entity direction (as objects can feel overly anthropomorphic in mental-health contexts) and instead focused on creating warm presence through motion, color & tone.
I landed on a water-inspired theme with gentle pulses, ripples & flowing gradients - motions associated with calm, groundedness & emotional regulation.
I then brainstormed 50+ iterations of ideas for the water theme (Lovable for idea generation, Figma for control over design details).
ITERATIONS
Figma Make


First
Last
After 60+ prompts/iterations for the animation with Figma Make, I finally landed on the ideal style & motions.
Major design decisions from user testing insights include:
- Expanding to full screen for the biggest therapeutic presence
- Colors felt heavy -> switched to a lighter palette/feel & introduced a warm purple for an empathetic atmosphere
- Mist felt ominous -> switched to clean lines for the water
- Full-screen motion was distracting -> reduced it to predictable waves at the bottom for a calmer & clearer interaction
- Mimicked the natural gravitational motion of water (without overly chaotic waves)
IDEATE
Crisis intervention
I redesigned the crisis mode intervention to feel safer & more supportive.
Testing revealed that the sudden pop-up made the current crisis intervention flow feel jarring & cold - it especially lacked context & warmth.

It was particularly disruptive when the AI detected a false positive.
Before

After
The interaction was redesigned as an inline message & gentle pulse on the support icon, making it flow more naturally with the conversation. It is less disruptive, especially during false positives.
The palette also introduced an empathetic purple and warm context/guidance copy.
IDEATE
I redesigned the visual system from cold/clinical to emotionally supportive.
Visual system
Before
After

I introduced a light mode that offers an emotionally lighter option.
The palette introduces a purple that adds a warm human presence, and reduces the clinical teal to a light mint.
Dark = night sky.
Light = calming mint/purple gradient.
⚡ Case study in progress - updated daily!
(more on AI error handling, redesigns)