The Analyst’s Framework for Data Analysis
In general, we can break down the data analysis process into several key stages. However, it's important to remember that in real-world scenarios, the process is rarely linear.
Data can often be messy or incomplete, and it's common to build an initial version for stakeholder review—then iterate by refining the analysis, adding new data, or adjusting the narrative to tell a clearer, more impactful story.

Let's walk through a real-life example, pretending you are a business analyst at Amazon.
What might be tested
While there isn't always a dedicated "data analysis round," each step of the data analysis process can be evaluated at different points throughout the interview process.
These skills may be tested in various formats, such as:
- Behavioral questions, for example:
  - "How do you approach a vague business problem?" → Testing your ability to define problems clearly and translate them into analytical tasks
  - "How would you present insights to a non-technical audience?" → Assessing your communication and storytelling skills
- Technical questions, such as:
  - "How do you handle incomplete third-party data?"
  - "What's your approach to merging inconsistent datasets from multiple sources?"
- Live case interviews or take-home assignments, where you're expected to apply the full analysis process—from scoping and cleaning data to communicating insights and recommending solutions.
To help you prepare, we’ve also compiled common mistakes observed from informational interviews with data professionals currently working at top tech companies.
Below are quick frameworks and thinking processes for each stage of the data analysis process:
1. Define the problem
5Ws Framework (Who, What, When, Where, Why). Use this to break down vague statements.
- Instead of “Sales are dropping,” ask: Who is affected (SMB vs. enterprise)? What is declining (conversion rate)? When did it start?
Problem Translation Practice. Learn how to translate broad goals into measurable, analytical questions. Ask:
- "What does success look like?"
- "Which specific metric should improve if the issue is resolved?"
Stakeholder Alignment + BRD Writing. Practice writing Business Requirement Documents (BRDs) that outline the business goal, context, scope, assumptions, and success metrics.
- Many data teams at companies like Meta, Uber, and Amazon use BRDs to create clarity and avoid misalignment throughout the analysis.
When interviewers ask these types of questions, they are not expecting you to jump directly into analysis or SQL. Instead, they are testing your ability to:
- Understand business context
- Think critically about the request
- Avoid building solutions based on incomplete or unclear goals
In your answer, it’s important to demonstrate:

2. Collect the right data
Data Source Mapping. Learn where relevant data lives—product logs, internal dashboards, third-party tools like Salesforce or Google Ads—and understand the limitations of each.
- For example, user behavior may be sourced from both clickstream event logs and user profile data. Missing either leads to incomplete insights.
Effective Data Scoping. Know how to request or query only what you need. This means applying appropriate filters (e.g., by geography, time range, user type) and ensuring the data answers your refined problem.
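As a sketch of what scoping looks like in practice, the snippet below filters a hypothetical event log down to exactly the slice a refined question needs (the column names and values here are illustrative assumptions, not a real schema):

```python
import pandas as pd

# Hypothetical event log; column names are assumptions for illustration.
events = pd.DataFrame({
    "user_id": [1, 2, 3, 4],
    "country": ["US", "US", "DE", "US"],
    "user_type": ["SMB", "enterprise", "SMB", "SMB"],
    "event_date": pd.to_datetime(
        ["2024-01-05", "2024-02-10", "2024-01-20", "2023-11-30"]
    ),
})

# Scope to exactly what the refined question needs:
# US SMB users active in Q1 2024 — nothing more.
scoped = events[
    (events["country"] == "US")
    & (events["user_type"] == "SMB")
    & (events["event_date"].between("2024-01-01", "2024-03-31"))
]
```

The same idea applies in SQL: put the geography, time-range, and user-type filters in the `WHERE` clause rather than pulling the whole table and trimming it later.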
Handling Missing or Unavailable Data. Sometimes the data you need doesn’t exist yet—or it exists but is incomplete or unreliable. In these cases, you should:
- Suggest adding new tracking (e.g., through product instrumentation or event logging).
- Consider proxy metrics (e.g., use click-through rates if revenue isn’t tracked per feature).
- Use secondary or third-party data (e.g., census data, industry benchmarks, or scraping tools) when internal data isn't enough, if time and capability permit.
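To make the proxy-metric idea concrete, here is a minimal sketch that ranks features by click-through rate when revenue isn't tracked per feature (feature names and counts are made up for illustration):

```python
import pandas as pd

# Hypothetical per-feature interaction counts (values are illustrative).
features = pd.DataFrame({
    "feature": ["search", "wishlist", "reviews"],
    "impressions": [10000, 4000, 2500],
    "clicks": [1200, 200, 500],
})

# Revenue isn't tracked per feature, so use click-through rate as a proxy
# for how much value each feature drives.
features["ctr"] = features["clicks"] / features["impressions"]
ranked = features.sort_values("ctr", ascending=False)
```

When you use a proxy like this, state the assumption explicitly to stakeholders: you are treating engagement as a stand-in for value until proper revenue tracking exists.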
These questions test your pragmatism, data judgment, and ability to work with ambiguity—essential skills in fast-paced teams or roles with partial data coverage.
In your answer, demonstrate:

Be honest about gaps—but show that you're solution-oriented. Great analysts don't panic when data is incomplete. They work around it and push for long-term improvements.
3. Clean & prepare the data
Data Preprocessing Techniques. Learn to handle missing values, duplicates, outliers, and inconsistent formats.
- Cleaning is critical—unclean data leads to misleading conclusions.
- We highly recommend going through the Data Pre-Processing lesson under the Statistics & Experimentation module.
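The cleaning steps above can be sketched in pandas. This is a minimal illustration on made-up data, covering duplicates, inconsistent formats, missing values, and outliers:

```python
import pandas as pd

# Hypothetical raw export with the usual problems (values are illustrative).
raw = pd.DataFrame({
    "order_id": [101, 101, 102, 103, 104],
    "region": ["us", "us", "US ", "EU", "eu"],
    "amount": [25.0, 25.0, None, 30.0, 9_999.0],  # missing value + outlier
})

df = raw.drop_duplicates(subset="order_id")                # remove duplicate rows
df["region"] = df["region"].str.strip().str.upper()        # normalize inconsistent formats
df["amount"] = df["amount"].fillna(df["amount"].median())  # impute missing values
# Flag extreme values for review rather than silently dropping them.
df["is_outlier"] = df["amount"] > df["amount"].quantile(0.95)
```

Note the last step: flagging outliers instead of deleting them preserves the audit trail and lets stakeholders decide how to treat them.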
Technical Tool Mastery. Know how to join multiple tables, use window functions, filter datasets, and manipulate dataframes.
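For instance, a SQL join plus window functions has a direct pandas equivalent via `merge` and `groupby` transforms. The tables below are hypothetical, used only to show the pattern:

```python
import pandas as pd

# Hypothetical tables; schemas are assumptions for illustration.
users = pd.DataFrame({"user_id": [1, 2], "segment": ["SMB", "enterprise"]})
orders = pd.DataFrame({
    "user_id": [1, 1, 2],
    "order_date": pd.to_datetime(["2024-01-01", "2024-02-01", "2024-01-15"]),
    "amount": [100, 150, 900],
})

# Join the two tables (equivalent to a SQL LEFT JOIN).
joined = orders.merge(users, on="user_id", how="left")

# Window-function equivalents: running total and order rank per user,
# like SUM(...) OVER (PARTITION BY user_id ORDER BY order_date).
joined = joined.sort_values(["user_id", "order_date"])
joined["running_total"] = joined.groupby("user_id")["amount"].cumsum()
joined["order_rank"] = joined.groupby("user_id")["order_date"].rank()
```

Being able to express the same operation in both SQL and pandas is a common expectation in technical screens.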
These questions test your technical skill and attention to detail. Interviewers want to know that you don't blindly trust the data and that you have a solid process for preparing it.
In your answer, demonstrate:

4. Explore & analyze
Methods to Analyze Data. Funnel analysis, cohort analysis, and segmentation analysis are common analytical methods data analysts use to drive quick insights.
We highly recommend going through our lesson, Framework for analytical problem solving questions, to learn each type of analysis in more detail.
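As a quick illustration of one of these methods, here is a minimal funnel analysis on hypothetical event data (step names and users are made up):

```python
import pandas as pd

# Hypothetical funnel events; step names and data are illustrative.
events = pd.DataFrame({
    "user_id": [1, 1, 1, 2, 2, 3],
    "step": ["visit", "signup", "purchase", "visit", "signup", "visit"],
})

steps = ["visit", "signup", "purchase"]
# Count distinct users reaching each funnel step.
counts = [events.loc[events["step"] == s, "user_id"].nunique() for s in steps]
funnel = pd.DataFrame({"step": steps, "users": counts})
# Conversion relative to the top of the funnel.
funnel["conversion"] = funnel["users"] / funnel["users"].iloc[0]
```

The output shows where users drop off: here, only a third of visitors reach purchase, which points you toward the signup-to-purchase transition for deeper analysis.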
Visualization Tools. Quick visualizations help uncover insights, especially during exploratory data analysis. Learn tools like Tableau, Excel/Sheets, or Looker to create visuals.
These questions test your analytical depth and whether you explore data in a structured way rather than jumping straight to solutions. To learn how to answer these questions with a clear framework, jump straight to our course, Analytical Problem Solving Questions.
5. Interpret & recommend
AIM Framework (Analysis → Insight → Meaningful Action). This framework ensures your analysis goes beyond surface-level reporting.
- Analysis: What did the data say? (e.g., "Retention dropped by 20% in the last 3 months.")
- Insight: Why did it happen? (e.g., "Drop is concentrated among mobile users with onboarding friction.")
- Meaningful Action: What should we do next? (e.g., "Redesign the mobile onboarding flow with progressive tips.")
Use this to structure interview responses and stakeholder presentations. It helps interviewers assess whether you can connect the dots between data and business value.
Trade-Off Framing. Real-world recommendations aren’t black-and-white—they come with constraints. Show maturity by discussing:
- Effort vs. Impact: Is your suggestion worth the engineering time or marketing spend?
- Short-term vs. Long-term Solutions: Can we do a quick fix now while planning a more robust rollout later?
- Risk vs. Reward: Will your suggestion affect user experience, ops load, or compliance?
For example: "This change could reduce churn by 10%, but it would require reworking the onboarding flow across platforms—so we'd need buy-in from product and design."
For a deeper dive with an actual case study and mock, we highly recommend checking out our course, Take-Home Case Study for Data Analysts, where we elaborate on how to generate insights and recommendations from data and present them to your audience and stakeholders.
6. Communicate findings
Pyramid Principle. A structured communication technique that starts with your conclusion first—then layers supporting arguments, data, and context underneath.
How to use it in analytics:
- Start with your recommendation or takeaway, e.g., "We should simplify mobile onboarding to improve retention."
- Then give supporting data, e.g., "Retention dropped 20% in the last 3 months, especially among mobile-first users."
- Then add optional context or next steps, e.g., "These users also show lower session length and feature adoption, likely due to onboarding friction."
Storytelling with Data (Cole Nussbaumer Knaflic). A widely respected methodology (and book/course) for presenting data in a clear, engaging, and focused way. It’s about simplifying your message, reducing clutter, and drawing attention to what matters most.
How to apply it:
- Choose the right chart/format to present data: e.g., use a line graph for trends over time, bar charts for comparisons, and scatterplots for correlations.
- Remove clutter: Eliminate unnecessary gridlines, borders, or legends that don’t help interpretation.
- Use color with purpose: Highlight what you want people to notice (e.g., red for drop-offs).
- Narrate your insight: Always pair visuals with a clear takeaway sentence. Don’t make the stakeholder guess.
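The four principles above can be sketched in a few lines of matplotlib. The retention numbers and labels here are illustrative, not real data:

```python
import matplotlib
matplotlib.use("Agg")  # headless rendering
import matplotlib.pyplot as plt

# Hypothetical monthly retention figures; values are illustrative.
months = ["Jan", "Feb", "Mar", "Apr"]
desktop = [0.62, 0.61, 0.63, 0.62]
mobile = [0.58, 0.52, 0.47, 0.45]

fig, ax = plt.subplots()
# Right chart for the job: a line graph for a trend over time.
# Purposeful color: gray for context, red for the series you want noticed.
ax.plot(months, desktop, color="gray", label="Desktop")
ax.plot(months, mobile, color="red", label="Mobile")
# Declutter: drop the top/right borders and skip gridlines.
ax.spines[["top", "right"]].set_visible(False)
ax.legend(frameon=False)
# Narrate the insight: the title is the takeaway sentence, not a label.
ax.set_title("Mobile retention has fallen 13 points since January")
```

Notice the title carries the conclusion, so a stakeholder who only glances at the chart still leaves with the right message.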
If you need a refresher on data visualization and dashboarding, jump straight to our module, Data Visualization & Dashboarding.