Lives To Tell (iOS/Android)

Internship


UI and UX Design/Motion Graphics/Prototyping/Creative Direction.




A platform for preserving life stories, designed to help seniors share their memories before they're lost forever. Lives To Tell uses voice-led prompts and AI-assisted editing to help seniors speak their memories and turn them into something families will keep. The goal was to make recording feel as natural as conversation and to make AI tools approachable for the senior demographic.



Role:

Product Designer

Timeline:

01/2025-03/2026


Software:

Figma
HTML/CSS/JS
Adobe After Effects

Team:

Wei Li
Jane Chen
Claudia Wormley
Jordan Ho
Shreyantan Chandra








Identity Narrative
Only 1 in 3 Americans has ever recorded a conversation with a parent or grandparent. Within three generations, 80% of family stories disappear. Not because families don't care, but because existing tools weren't built for seniors. Typing is tedious. Video feels awkward. Most people never start.




Interface Design
I designed with accessibility as the foundation, not an afterthought: larger touch targets, high-contrast text, simplified navigation, voice-first interactions. The interface needed to work both for someone who has never used a smartphone and for the family member helping their grandmother use it.

Voice prompts guide users through story capture without requiring them to think of questions. AI cleans up recordings automatically. You can start recording within seconds of opening the app.


User Mapping
I interviewed seniors and their adult children to understand what actually stops people from capturing stories. Seniors felt intimidated by technology but comfortable with conversation. They didn't want to feel recorded or interviewed. They wanted to feel like they were just sharing.

I also explored reminiscence therapy research and Ashton Applewhite's work on anti-ageism. This shaped the experience: gentle prompts, natural flow, technology fading into the background.










Accessibility First

Every decision filtered through one question: would my grandmother be able to use this without help?




The Hook Model

I used Nir Eyal's framework to design for return visits. The trigger is a gentle prompt. The action is speaking. The reward is hearing your story played back, polished. The investment is a growing library of memories.




Stakeholder Challenges

Some collaborators questioned whether the accessibility features were necessary. I learned to advocate for them by grounding decisions in research and business impact: seniors who can use the app independently become more engaged users and stronger advocates.












Animations for seniors...





Visualizing a Home for Stories 

The pain points, competitor audit, and AI research were the starting nodes for branching into a variety of solutions. See some of the explorations we considered below.


User records a memory (STEP ONE - RECORD)
A gentle voice prompt invites the user to start speaking, with no typing required. Recording begins within seconds of opening the app.


User drafts the story (STEP TWO - DRAFT STORY)

AI-assisted editing turns the raw recording into a polished draft, with Copilot chat and accessibility-focused Copilot options.


User shares the story (STEP THREE - SHARE)

Share the finished story with family members.



User compiles a keepsake book (STEP FOUR - COMPILE BOOK)

Personalize Your Legacy: choose a cover for the keepsake book.