Chapter 1: The Vanishing Users Mystery

John stared at his dashboard, his coffee cooling beside him. The numbers didn't make sense.
Retention had plummeted. Last month, users were planning trips, sharing itineraries, and checking out
events. Now, they were disappearing like tourists fleeing a bad hostel.
"John, can you jump into the retention issue today?" Lisa, the VP of Product, called from across the
open workspace.
"Already on it," he replied, masking his anxiety with a confident nod.
The First Product Crisis
John had joined the company six months ago, fresh-faced from his previous role as a software
developer. This was his first real PM gig at Wanderly, an app designed to help travelers discover the
best times to visit places, optimize itineraries, and find unique events. With a user base of 7,500 active
travelers, things had been smooth—until now.
He pulled up Mixpanel. The drop in retention wasn't just a minor blip—it was a cliff dive. DAUs had
fallen by 30% in two weeks. Panic set in. What had changed? Was it a competitor move? A bug?
Seasonal trends?
John pulled up the company OKRs on his second monitor. The top objective stared back at him: "Help
15,000 travelers plan seamless trips by Q4." With retention dropping, they weren't just losing users—
they were moving away from Wanderly's core mission and growth trajectory.
Step 1: Diagnosing the Problem
John followed his product management framework: Start with data, develop hypotheses, then
validate with users.
He organized his analysis methodically:
New users: Acquisition was steady at 500 signups weekly.
Activation: First-time completion of trip creation remained stable at 65%.
Retention: 7-day retention had dropped from 42% to 29%, with the steepest decline in power
users.
Feature usage: Itinerary planning sessions had decreased by 43%, while other features showed
minimal change.
John created a cohort analysis, splitting users by acquisition date. The pattern was clear: even loyal
users who had been active for months were now dropping off. This wasn't a gradual decline—
something had fundamentally broken in the core experience.
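The chapter doesn't show John's actual queries, but a cohort retention calculation like the one he ran can be sketched in a few lines. The user IDs, dates, and activity log below are invented for illustration, not Wanderly data:

```python
from datetime import date, timedelta

# Hypothetical activity log: (user_id, signup_date, activity_date).
events = [
    ("u1", date(2024, 3, 1), date(2024, 3, 1)),
    ("u1", date(2024, 3, 1), date(2024, 3, 5)),
    ("u2", date(2024, 3, 1), date(2024, 3, 3)),
    ("u3", date(2024, 3, 8), date(2024, 3, 8)),   # never returns after day 0
    ("u4", date(2024, 3, 8), date(2024, 3, 10)),
]

def seven_day_retention_by_cohort(events):
    """Split users into weekly acquisition cohorts and compute the share
    who returned between day 1 and day 7 after signing up."""
    cohorts = {}  # Monday of the signup week -> {user_id: retained?}
    for user, signup, activity in events:
        week = signup - timedelta(days=signup.weekday())
        users = cohorts.setdefault(week, {})
        returned = timedelta(days=1) <= activity - signup <= timedelta(days=7)
        users[user] = users.get(user, False) or returned
    return {week: sum(u.values()) / len(u) for week, u in cohorts.items()}

print(seven_day_retention_by_cohort(events))
```

With this toy data, the later cohort's retention is half the earlier cohort's, mirroring the step-change pattern John spotted when he split users by acquisition date.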
"This is focused on the itinerary planner," John muttered. "Our flagship feature."
Step 2: Talking to Users
Instead of relying solely on numbers, John turned to qualitative insights. He prepared his interview
script carefully, focusing on open-ended questions that wouldn't lead users to specific answers.
He identified three user segments for interviews:
Recent drop-offs who had been highly active
New users who abandoned during onboarding
Power users who remained active despite the changes
"Walk me through the last time you tried to use our app," he asked each participant, noting not just
their words but their tone and hesitations.
One power user, Maria, had been planning trips on Wanderly since launch. "I want to help you see
exactly where I got stuck," she offered, sharing her screen. John watched as she clicked around
looking for the Add Event button, her frustration visible.
"What would make more sense to you?" John asked.
"The old way," she said immediately. "I could add events directly from the main screen."
This pattern repeated across interviews. John developed a matrix to categorize feedback: usability
issues, missing features, and emotional responses. The data was clear—this wasn't just aesthetics; it
was breaking the core user journey.
One user's email particularly stood out:
"I used to love planning my trips here, but it's become too complicated. I couldn't figure out how to
add a last-minute event to my itinerary!"
Step 3: Investigating Internally
John gathered the key stakeholders: Mei from engineering, Raj from design, and Sarah from customer
support.
"We have a problem to solve together," John started, sharing his screen with the retention graphs.
Raj immediately defended the design change. "The new UI tested well in usability studies. It's cleaner."
"I understand," John acknowledged. "The design is beautiful. But here's what we're seeing in
production." He shared the matrix of user feedback.
Mei from engineering raised a practical concern. "We just deployed this. Rolling back means another
release cycle and potential instability."
"And," she added, "the new design helps us scale for the personalization features on the roadmap for
next quarter. Going back could delay those."
John nodded. "What if we did a hybrid approach? Keep the clean design but surface critical actions
more prominently?"
Together, they sketched possibilities on the whiteboard, with John facilitating the conversation toward
a solution that balanced design principles with user needs and technical constraints.
Further investigation confirmed their hypothesis: a recent update had buried the 'Add Event' button
in a sub-menu. What was meant to simplify the UI had instead frustrated long-time users who were
used to the old flow.
Step 4: Weighing Solutions
Before presenting to Lisa, John stepped back to consider the bigger picture. This wasn't just about a
button. It was about Wanderly's promise to make travel planning seamless. If users couldn't add events
easily, the entire value proposition was at risk.
He pulled up his product strategy document and added a note: "UI changes must maintain core user
flows even when simplifying." This wasn't just a tactical fix—it was a principle that would guide future
decisions.
John drafted a decision matrix with three options:
1. Full rollback: Fastest to implement, but would discard all design improvements and create user
whiplash.
2. Quick fix: Surface just the 'Add Event' button prominently, maintaining most of the new design.
3. Data-driven redesign: Use the opportunity to analyze all core flows and optimize them, taking
longer but potentially solving deeper issues.
He evaluated each against criteria:
Impact on retention
Development time
Alignment with product vision
User experience consistency
Engineering refactoring needs
The clear winner was the quick fix with targeted improvements.
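The chapter doesn't give the matrix's actual numbers, so the weights and 1-to-5 scores below are invented purely to illustrate the mechanics of a weighted decision matrix like John's (higher is better on every criterion, including development time and refactoring burden):

```python
# Hypothetical weights summing to 1.0; not from the book.
criteria_weights = {
    "retention_impact": 0.35,
    "dev_time": 0.20,
    "vision_alignment": 0.20,
    "ux_consistency": 0.15,
    "eng_refactor_fit": 0.10,
}

# Hypothetical 1-5 scores for each option (5 = best on that criterion).
options = {
    "full_rollback": {"retention_impact": 4, "dev_time": 4,
                      "vision_alignment": 2, "ux_consistency": 1,
                      "eng_refactor_fit": 2},
    "quick_fix":     {"retention_impact": 4, "dev_time": 5,
                      "vision_alignment": 4, "ux_consistency": 4,
                      "eng_refactor_fit": 4},
    "full_redesign": {"retention_impact": 5, "dev_time": 1,
                      "vision_alignment": 5, "ux_consistency": 5,
                      "eng_refactor_fit": 2},
}

def weighted_score(scores, weights):
    """Sum each criterion score multiplied by its weight."""
    return sum(weights[c] * s for c, s in scores.items())

best = max(options, key=lambda o: weighted_score(options[o], criteria_weights))
print(best)
```

With these illustrative numbers the quick fix scores highest, because it trades a small amount of retention impact for big wins on speed and design consistency.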
Step 5: Creating Alignment
John created a one-page brief summarizing:
The problem (with data)
User impact
Proposed solution
Timeline and resources needed
Expected outcomes
Success metrics
He shared it with stakeholders before their meeting, giving everyone time to process the information.
"John, how quickly can we fix this?" Lisa asked when they gathered.
"Two weeks," he estimated. "One week to test, one to roll out."
But the design team pushed back—changing the UI so soon after an update could confuse users
even more.
John responded with a three-part approach:
1. Empathy: "Raj, I know your team put a lot of thought into the new design. Many aspects of it are
working great."
2. Data-driven persuasion: "Here's what our power users are saying. We're seeing a direct
correlation between the UI change and drop-off points."
3. Compromise: "What if we launch an in-app tooltip guiding users through the new layout while
running the A/B test? This gives us data to inform long-term design decisions."
Engineering had concerns about release cycles. John worked with Mei to find an implementation
approach that minimized risk while moving quickly.
Step 6: Executing the Plan
John proposed an A/B test:
Control group: Current UI.
Test group: Move 'Add Event' back to the main itinerary screen with enhanced visibility.
He defined clear success metrics:
Primary: 7-day retention recovery
Secondary: Itinerary completion rate
Guardrail: No decrease in other feature usage
The engineering team implemented the change as a component switch rather than a full release,
reducing technical risk. The design team created subtle visual cues to guide users to the newly
positioned button.
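The text doesn't say how the team judged whether the gap between arms was real rather than noise, but a standard way is a two-proportion z-test on retained users in each arm. The sample counts below are hypothetical:

```python
import math

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """z-statistic for the difference between two proportions,
    using the pooled standard error."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical counts: control keeps the buried button,
# test surfaces 'Add Event' on the main itinerary screen.
z = two_proportion_z(successes_a=290, n_a=1000, successes_b=360, n_b=1000)
print(z)
```

If |z| exceeds 1.96, the retention difference is significant at the 95% level; guardrail metrics can be checked the same way to confirm nothing else regressed.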
Step 7: Learning and Iterating
The test results were clear in a week:
Users in the new design saw 25% higher itinerary completion rates.
Support tickets about itinerary issues dropped by 40%.
7-day retention began climbing from 29% back to 36%.
John rolled out the change. Over the next month, retention steadily climbed back by 15%. A full
recovery would take time, but they were on the right track.
The team didn't stop there. The crisis highlighted a gap in their process: they needed better
mechanisms to predict how design changes would affect core user flows. John worked with the data
team to implement more granular funnel analytics, tracking each micro-interaction in critical paths.
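Micro-interaction funnel tracking of the kind John set up can be sketched without any analytics vendor: order the critical steps, then count how deep each user's event stream gets. The step names and event streams here are invented for illustration:

```python
def deepest_step(user_events, steps):
    """Return how many ordered funnel steps this user completed,
    requiring each step to appear after the previous one."""
    idx = 0
    for event in user_events:
        if idx < len(steps) and event == steps[idx]:
            idx += 1
    return idx

def funnel_report(events_by_user, steps):
    """Count how many users reached at least each step of the funnel."""
    depths = [deepest_step(evts, steps) for evts in events_by_user.values()]
    return {step: sum(d > i for d in depths) for i, step in enumerate(steps)}

steps = ["open_planner", "tap_add_event", "confirm_event"]
events_by_user = {
    "a": ["open_planner", "tap_add_event", "confirm_event"],
    "b": ["open_planner", "tap_add_event"],
    "c": ["open_planner"],
    "d": ["open_planner", "confirm_event"],  # never found the buried button
}
print(funnel_report(events_by_user, steps))
```

A sharp drop between two adjacent steps, like the fall-off at `tap_add_event` in this toy data, is exactly the kind of signal that would have flagged the buried button before retention cratered.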
The Takeaways
John's first big crisis taught him:
✅ Always connect tactical fixes to strategic objectives.
✅ Small UI changes can have a massive impact on user behavior.
✅ Cross-functional collaboration requires balancing competing priorities.
✅ Decision frameworks help navigate complex product problems systematically.
✅ User feedback must be weighted by user type and behavior patterns.
As he closed his laptop for the day, Lisa walked by. "Nice work today, John. That's how real PMs solve
problems."
John grinned. He was starting to feel like a real PM now. But more importantly, he understood the
responsibility that came with the role: protecting the user experience while advancing the product
vision, one decision at a time.
