The UX research reckoning is here | Judd Antin (Airbnb, Meta)
Judd Antin, former head of research at Facebook and Airbnb, discusses the "user research reckoning," arguing that the discipline needs to evolve. He shares how companies can better leverage researchers by focusing on business impact and avoiding "user-centered performance."
Deep Dive Analysis
18 Topic Outline
Judd Antin's Background and Career Highlights
Reaction to 'The UX Research Reckoning Is Here' Post
The State of User Research and Its Core Problems
Macro, Middle-Range, and Micro Research Framework
Why Research Often Fails to Drive Impact
Importance of Integrating Research Early in Product Development
Key Traits and Skills of Great User Researchers
Researchers' Need for Business and Profit Focus
Understanding and Avoiding User-Centered Performance
Leveraging Intuition vs. Evidence in Product Decisions
Common Tropes and Misconceptions PMs Have About Researchers
A/B Testing vs. User Research: Strengths and Weaknesses
Hindsight Bias and Narrative Fallacy in Product Development
Making Effective Recommendations from Research Insights
Advice for Companies and Researchers to Evolve the Discipline
Ideal Researcher-to-Team Ratio and Headcount Growth
Critique of NPS as a Performance Metric
The Risks and Limitations of Dogfooding Products
7 Key Concepts
User-Centered Performance
This refers to customer obsession or user-centered practice that is symbolic rather than genuinely focused on learning or changing decisions. It's work done to signal customer-centricity, often manifesting as 'check-the-box' research or executive listening sessions that don't lead to actual product changes.
Macro Research
This type of research focuses on big-picture, strategic business goals, forward-looking innovation, market analysis, competitor insights, and long-term product direction. It involves understanding the overall landscape and strategic planning.
Middle-Range Research
This research addresses middle-altitude questions, often focusing on how users think, feel, and behave with a product. While interesting, it is often too diffuse and insufficiently tied to specific business problems or metrics to drive significant impact.
Micro Research
This research is laser-focused on enabling high-quality, pixel-perfect product execution, often involving technical usability testing or understanding A/B test results. It aims to improve specific functionalities or user flows quickly and can drive significant business value.
Falsify vs. Validate
This mental model suggests that the goal of research should be to prove assumptions wrong (falsify) rather than simply confirm them (validate). A mindset of seeking to be wrong leads to more genuine learning and better product decisions, contrasting with user-centered performance.
Hindsight Bias
This cognitive bias, also known as the 'I-knew-it-all-along' effect, makes past events seem more predictable than they actually were. It causes people to believe that research findings were obvious or already known, undermining the perceived value of research.
Narrative Fallacy
This bias describes people's tendency to construct simple, coherent stories about past events, often twisting evidence to fit a desired narrative. It can lead to oversimplified explanations for complex outcomes, hindering true understanding and learning.
10 Questions Answered
Why is the user research field undergoing a "reckoning"?
The user research field is undergoing a reckoning due to widespread layoffs and a perception that research isn't driving sufficient value or impact. This is often because companies haven't properly integrated researchers, leading to reactive, "middle-range" research that is interesting but not impactful enough for business goals.
How should product managers work with researchers?
PMs should integrate researchers into the product process from beginning to end, fostering consistent relationships. They should partner with researchers on ruthless prioritization, protect their time, and be willing to participate in field research to gain firsthand understanding.
What skills make a great user researcher?
Great researchers are multi-method, possessing skills in formative/generative research, evaluative/usability testing, rigorous survey design, applied statistics, and technical skills like SQL, dashboarding, or prompt engineering to interact with data and AI.
Why do researchers need to be more business-oriented?
Researchers need to be more business-oriented to identify the overlap between user needs and business profit. This involves understanding company quarterly reports, shareholder calls, strategic plans, OKRs, metrics, and conversion funnels to frame research questions that drive maximum business impact.
How should teams balance intuition and evidence?
While intuition is valuable, it's prone to biases and blind spots. Teams should use "system two" analytic thinking to check their gut instincts and bring in diverse perspectives (wisdom of the crowd) with varied information and judgment to make more informed decisions.
Is it true that research slows down product development?
This trope is untrue: good research can be done quickly (e.g., micro research in 48 hours) and speeds up development by helping teams "get it right the first time." Getting it wrong and fixing it later is often slower and more costly than upfront research.
How do A/B testing and user research complement each other?
A/B tests are good for causal claims (what happened) but rarely explain *why* something happened. User research complements A/B testing by providing the "how" and "why," helping teams understand the underlying reasons for test results and build better products in the future.
What is wrong with NPS as a metric?
NPS is flawed because its "likelihood to recommend" question uses an unlabeled 11-point (0–10) scale, which is poor survey science: response precision degrades beyond roughly five scale points, and the question itself may not reflect actual behavior. It's also idiosyncratic, making comparisons across companies unreliable.
What should teams measure instead of NPS?
Customer Satisfaction (CSAT) is a better alternative to NPS. A simple CSAT metric, such as "Overall, how satisfied are you with your experience with [product/service]?", has better data properties, is more precise, and is more correlated to business outcomes.
Should PMs rely on dogfooding their own products?
While dogfooding is important, PMs are often not like the typical user and can be biased in their assessment. Relying purely on personal usage can lead to misprioritizing issues or overlooking problems that require a different context of use, priorities, or constraints that the PM doesn't experience.
17 Actionable Insights
1. Integrate Research Early & Consistently
Restructure product development to integrate researchers from beginning to end, fostering consistent relationships. This ensures researchers are active partners, can frame the right questions, and drive maximum business impact, rather than being a reactive “service function.”
2. Researchers: Be Business-Oriented
Researchers must deeply understand company financials (quarterly reports, shareholder calls), strategic plans, OKRs, and conversion funnels. This allows them to align research questions with business goals and articulate insights that directly drive profit and growth.
3. Falsify, Don’t Just Validate
Product managers and researchers should approach insights with a “falsify” mindset, actively seeking to be proven wrong rather than using research to merely validate existing assumptions. This openness fosters genuine learning and better product decisions.
4. Prioritize Macro & Micro Research
Focus research efforts on high-impact macro (big picture, strategic, forward-looking) and micro (technical usability, A/B test analysis) questions. Reduce reliance on less impactful, middle-range research that often slows down product development.
5. Develop Diverse Research Skills
Researchers should cultivate a “Swiss army knife” of skills, including formative/generative research, evaluative/usability testing, rigorous survey design, applied statistics, and technical skills like SQL or prompt engineering, to address a wider range of business problems effectively.
6. PMs: Engage in Field Research
Product managers should actively participate in research by joining researchers in the field, traveling to observe users, and engaging directly with customer feedback. This builds stronger partnerships and provides firsthand insights.
7. Prioritize Researcher Workload
Product managers and researchers should ruthlessly prioritize research projects, aiming for a maximum of three projects (two big, one small) for a researcher at any given time. This ensures high-quality work and prevents researchers from being spread too thinly.
8. Leverage Research for A/B “Why”
After A/B tests, leverage researchers to understand the “how and why” behind the results. This prevents endless speculation, informs future product development, and complements quantitative data with qualitative understanding.
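To make the division of labor concrete, here is a minimal sketch (with illustrative numbers, not data from the episode) of what an A/B test actually delivers: a two-proportion z-test tells you *whether* variant B beat variant A, but nothing about *why*.

```python
# Two-proportion z-test: the "what happened" half of an A/B test.
# Numbers below are hypothetical; qualitative research supplies the "why."
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z statistic and two-sided p-value for the difference in
    conversion rates between control (A) and treatment (B)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Control converts 200/4000 (5.0%); treatment converts 260/4000 (6.5%).
z, p = two_proportion_z(conv_a=200, n_a=4000, conv_b=260, n_b=4000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A significant result here says the new design won; it takes follow-up qualitative work (interviews, usability sessions) to explain the mechanism behind the lift.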
9. Avoid Siloing Insights Disciplines
Companies should integrate various insights disciplines (e.g., UX research, market research, data science, customer service feedback) into a unified function. This prevents information overload and creates a cohesive, holistic understanding of the user and market.
10. Use CSAT, Not NPS
For measuring customer loyalty and satisfaction, use a simple Customer Satisfaction (CSAT) metric instead of Net Promoter Score (NPS). CSAT questions have better data properties, are more precise, and correlate better with business outcomes, while NPS is fundamentally flawed from a survey science perspective.
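The arithmetic behind the two metrics makes the contrast clear. Below is a sketch (with made-up illustrative responses, not data from the episode) of how NPS and a top-two-box CSAT are computed side by side; the promoter/detractor cutoffs are the standard NPS definition, and the 1–5 labeled scale is one common CSAT convention.

```python
# NPS vs. CSAT on hypothetical survey responses.

def nps(scores):
    """Net Promoter Score from 0-10 'likelihood to recommend' ratings:
    % promoters (9-10) minus % detractors (0-6), ignoring passives (7-8)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

def csat_top2(scores):
    """CSAT on a labeled 1-5 scale ('very dissatisfied' ... 'very
    satisfied'), reported as the share rating 4 or 5 ('top-two-box')."""
    return 100 * sum(1 for s in scores if s >= 4) / len(scores)

recommend = [10, 9, 8, 7, 7, 6, 3, 10, 2, 9]   # 0-10 ratings
satisfaction = [5, 5, 4, 4, 3, 3, 2, 5, 1, 4]  # 1-5 ratings

print(f"NPS: {nps(recommend):+.0f}")            # 40% promoters - 30% detractors = +10
print(f"CSAT (top-2-box): {csat_top2(satisfaction):.0f}%")
```

Note how NPS throws away the "passive" 7s and 8s entirely and collapses an 11-point unlabeled scale into three buckets, which is part of the survey-science critique; the 5-point labeled CSAT question uses every response.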
11. PMs: Don’t Over-rely on Dogfooding
While dogfooding your own product is useful for identifying potential issues, be extremely cautious about relying solely on your personal opinion or intuition to prioritize these issues. Your experience differs significantly from the average user, so leverage research to understand the true impact and priority of problems.
12. Researchers: Be Excellent Communicators
Researchers must develop strong communication skills to effectively convey insights. Tailor presentations and language to the specific audience (e.g., PMs vs. executives) to ensure the research is heard, understood, and acted upon, maximizing its impact.
13. Hiring: Seek Multi-Method & Clarity
When hiring researchers, assess their ability to propose multi-method approaches to open-ended research questions and their capacity to break down complex topics into simple, intuitive explanations. Strong communication skills are crucial for translating research into actionable insights for diverse audiences.
14. Startups: Hire Researcher Early
Startups should consider hiring a researcher among their first employees, as a single multi-skilled researcher can provide immense value by accelerating learning, informing critical pivots, and providing evidence for tough decisions, helping founders avoid guessing and move faster.
15. Don’t Ask Users What They Want
Avoid directly asking users what they want, as this is not effective research. Instead, focus on understanding their behaviors, needs, and pain points through observation and nuanced questioning, as direct questions often yield unreliable answers.
16. Beware Hindsight Bias
Recognize and guard against hindsight bias (the “we knew this already” phenomenon) when reviewing research insights. Value research for exposing blind spots and genuinely improving intuition, rather than dismissing insights as obvious in retrospect.
17. Focus on Controllable Factors
Researchers should adopt a stoic mindset, focusing their energy and efforts on aspects of their work and career that they can directly control, such as skill development, relationship building, and communication, rather than dwelling on external factors or systemic issues beyond their influence.
6 Key Quotes
We don't validate, we falsify. We are looking to be wrong.
Judd Antin
User-centered performance refers to customer obsession or user-centered practice that is symbolic rather than focused on learning.
Judd Antin
Good research doesn't slow us down. It speeds us up.
Judd Antin
If a tree fell in the forest and no one was there to hear it... you need to communicate it effectively, and you need to do it in a way that's appropriate to the audience.
Judd Antin
NPS is the best example of the marketing industry marketing itself.
Judd Antin
The thing most PMs have trouble with is realizing you are nothing like the user.
Judd Antin
2 Protocols
Five Tools of a Great Researcher
Judd Antin
- Formative or generative user experience research (looking ahead, innovation-focused, ethnographic).
- Evaluative research (usability testing).
- Basic rigorous survey design (best scaled way to get responses).
- Applied statistics (understanding A/B testing and quantitative data).
- Technical skills like SQL, dashboarding, or prompt engineering (querying data, interacting with generative AI).
Checking Your Gut Instincts
Judd Antin
- Engage 'system two' slow, methodical analytic thinking to scrutinize your initial judgment.
- Bring in the 'wisdom of the crowd' by gathering diverse people with different sources of information and judgment.
- Facilitate open, direct, and candid conversations where disagreement is welcome to expose biases and expand horizons.