Analytical Problem Solving Case Walkthrough
Prompt:
Uber has experienced four months of steady growth in driver signups. This month, signups have suddenly stalled. As a data analytics professional, you’ve been asked to investigate the issue and walk through your thinking process.
P – Plan
Goal: Understand the problem and ask the right questions before diving into data.
Strong response:
“I understand that signup growth has stalled after four months of upward trend. First, I’d want to clarify a few things to narrow the scope:
- Is this issue isolated to specific cities, regions, or across all markets?
- Have there been any recent changes—internal (e.g., onboarding flow, incentives) or external (e.g., competition, regulations)?
I’d also form a few working hypotheses upfront:
- Did we roll out a new background check or documentation requirement?
- Could Lyft or another competitor have launched a new driver bonus?
My next step would be to align on which success metric matters most: raw signup volume, funnel conversion, or time to activation.”
Weak response:
"Signups dropped. I’d check the data and maybe look at application numbers."
→ Lacks scope, context, and business understanding.
Common mistakes:
- Jumping into data too soon without asking clarifying questions.
- Focusing only on internal changes, ignoring external factors like competition or economic shifts.
- Making unqualified assumptions without stating them clearly.
A – Analyze
Goal: Investigate the problem using structured analytical methods.
Strong response:
"I'd take a hypothesis-driven approach and use funnel analysis to examine each step of the signup process—from initial interest to activation. I'd break it down by:
- City or region to localize the problem.
- Time frame, to understand whether this is a short-term blip or an ongoing trend.
- Signup funnel steps (e.g., application → background check → training → first trip).
I'd also benchmark conversion rates before and after the drop to identify exactly where the issue begins. Then I'd layer in external signals—such as Lyft’s incentive program launches or Google search trends around 'become a rideshare driver.'"
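The before-and-after benchmarking described above can be sketched in a few lines of Python. The funnel counts below are purely illustrative (not real Uber data), and the stage names are assumptions chosen to mirror the funnel steps listed earlier:

```python
# Hypothetical funnel counts by stage, before vs. after the stall
# (illustrative numbers only, not real Uber data)
funnel = {
    "application":      {"before": 10000, "after": 9800},
    "background_check": {"before": 7000,  "after": 4100},
    "training":         {"before": 5600,  "after": 3200},
    "first_trip":       {"before": 4500,  "after": 2550},
}

stages = list(funnel)

def step_conversion(period):
    """Conversion rate from each funnel stage to the next for one period."""
    rates = {}
    for prev, cur in zip(stages, stages[1:]):
        rates[cur] = funnel[cur][period] / funnel[prev][period]
    return rates

before = step_conversion("before")
after = step_conversion("after")

# Relative drop in step conversion; the largest drop localizes the problem
drops = {s: (before[s] - after[s]) / before[s] for s in before}
worst_stage = max(drops, key=drops.get)
```

With these made-up numbers, `worst_stage` comes out as `"background_check"`, which is exactly the kind of localized finding the analysis is meant to produce: the drop is not spread evenly across the funnel but concentrated at one step.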
Weak response:
"I'd look at the signup numbers and see if they dropped."
→ Too surface-level, no method, no segmentation.
Common mistakes:
- No structure in analysis—just poking around in data.
- Failing to segment by key variables (like region or signup stage).
- Not comparing before-and-after metrics to isolate the issue.
C – Construct
Goal: Communicate insights clearly, without overreaching. Focus on what the data shows and what it might mean.
Strong response:
"From the analysis, we found a sharp drop at the background check stage. After digging deeper, I discovered Uber recently switched to a new vendor. This new system requires drivers to enter twice as much information, including uploading additional documents. This appears to be causing friction, with a 40% drop in completion rate since the switch.
I then shared this insight with key stakeholders in ops and onboarding."
Weak response:
"The drop is probably because of the new vendor. That's what the data says."
→ No business framing, no quantification, no next step.
Common mistakes:
- Overstating conclusions without evidence ("This vendor is terrible").
- Sharing too many metrics with no real insight.
- Skipping the "why it matters" and not tying back to business impact.
E – Execute
Goal: Recommend realistic next steps, even if they're rough. Show action-oriented thinking.
Strong response:
"Short term, I'd recommend working with the background check vendor to streamline the form—remove redundant steps, prefill data from earlier parts of the funnel, and make instructions clearer. Long term, we should reassess the added requirements. Are they truly necessary from a compliance or risk perspective? If not, we can negotiate with the vendor to simplify. If friction persists, we may consider switching back or exploring alternatives.
Lastly, I’d propose A/B testing two flows: one with the new vendor vs. a simplified process, to measure the impact on completion and activation."
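To make the A/B test concrete, the two flows can be compared with a standard two-proportion z-test on completion rates. The sample sizes and completion counts below are hypothetical, and this uses only the Python standard library (no scipy):

```python
import math

def two_proportion_ztest(success_a, n_a, success_b, n_b):
    """Two-sided z-test for a difference in completion rates between two flows."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)  # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF, via math.erf
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical results: current vendor flow vs. simplified flow
z, p = two_proportion_ztest(420, 1000, 520, 1000)
```

With these illustrative counts (42% vs. 52% completion on 1,000 drivers each), the difference is highly significant, which is the kind of evidence that would justify pushing the vendor to simplify the form.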
Weak response:
"We should change vendors and get the old one back."
→ Reactionary, no stakeholder awareness, no feasibility check.
Common mistakes:
- Offering vague next steps ("We should fix it somehow.")
- Over-recommending without feasibility ("Let's build our own background check system.")
- Failing to prioritize actions based on business impact and effort.
Conclusion
Using PACE in your interview shows you’re not just analyzing data—you’re solving real business problems, like a true analytics partner.
When Uber (or any company) faces a sudden change, they want an analyst who:
- Clarifies the problem
- Thinks systematically
- Communicates clearly
- Recommends realistic actions
That’s the difference between getting the job—and being the candidate they forget. In the next few lessons, we’ll bring this to life with real mock interviews featuring data analytics professionals from top tech companies. You’ll see exactly how high-performing candidates apply the PACE framework under pressure, break down ambiguous problems, and communicate like true business partners.
Get ready to watch, learn, and start building your own responses with confidence.