Parts 5-6 of the Six-User Interview Guide
Part 5: Synthesis and Pattern Recognition
The Immediate Debrief
Within one hour of each interview, before memories fade:
Capture the Headlines:
Write three things:
- The biggest surprise
- The strongest emotion displayed
- The quote you'll remember
These become your north stars during analysis.
Note the Atmosphere:
Was the participant engaged or dutiful? Confident or confused? These meta-observations inform how you weight their input.
Flag Follow-Up Questions:
What do you wish you'd asked? Write it down for the next interview.
The Analysis Framework
After completing all six interviews, block 4 hours for synthesis. This isn't optional—insights have a half-life.
Hour 1: The Data Dump
Transcribe key quotes if you haven't already. Services like Rev.com cost $1.50/minute and return transcripts in hours.
Print everything or display on a large screen. You need to see patterns across interviews, not within them.
Hour 2: The Affinity Mapping
Write observations on sticky notes—one per note. Include participant identifier (P1, P2, etc.).
Group similar observations. Don't force categories; let them emerge. Common clusters:
- Process breakdowns
- Emotional responses
- Workarounds
- Unmet needs
- Mental model mismatches
Hour 3: The Pattern Identification
Look for observations that appear 3+ times. These are patterns. Document them.
But also note outliers. Sometimes one participant's edge case reveals future mainstream needs.
Create a simple matrix:
- Strong patterns (4-6 participants mentioned)
- Moderate patterns (3 participants mentioned)
- Interesting outliers (1-2 participants, but compelling)
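The matrix above usually lives in a spreadsheet, but its logic is simple enough to sketch in code. The observations and participant tags below are hypothetical examples for illustration, not data from any real study:

```python
# Hypothetical sample observations from an affinity map; each is tagged
# with the participants (P1-P6) who mentioned it.
observations = {
    "screenshots the confirmation page": {"P1", "P2", "P4", "P5"},
    "abandons cart at the shipping step": {"P2", "P3", "P6"},
    "keeps a spreadsheet workaround": {"P4"},
}

def classify(participants):
    """Bucket an observation by how many of the six participants raised it."""
    n = len(participants)
    if n >= 4:
        return "strong pattern"
    if n == 3:
        return "moderate pattern"
    return "interesting outlier"

for obs, who in sorted(observations.items()):
    print(f"{classify(who):20} {obs} ({', '.join(sorted(who))})")
```

The thresholds (4+, exactly 3, 1-2) mirror the matrix above; adjust them if you run more or fewer than six interviews.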
Hour 4: The Insight Development
Transform patterns into insights using this formula:
"Users [specific behavior] because [underlying reason], which means [implication for design]."
Example: "Users screenshot confirmation pages because they don't trust the system to maintain records, which means we need persistent, accessible transaction history."
The Deliverable
Your research needs a vessel to create change. Create a one-page findings document:
The Executive Summary (3 sentences):
- What you studied and why
- Who you talked to
- The single most important finding
The Key Insights (3-5 bullets):
- Each insight in the formula above
- Supporting quote
- Recommended action
The Evidence:
- Link to full transcripts
- Affinity map photo
- Participant overview table
The Next Steps:
- Three immediate fixes based on clear problems
- Two areas needing more research
- One fundamental question raised
Part 6: Making Research Stick
The Socialization Strategy
Research without adoption is an academic exercise. Here's how to ensure your insights create change:
The Stakeholder Preview:
Before distributing findings, preview them with key decision-makers individually. This prevents public defensiveness and builds allies.
Frame previews as seeking input: "I want to make sure I'm interpreting this correctly. Can I run something by you?"
The Story Package:
Create a 2-minute highlight reel of powerful moments. Video clips hit harder than text quotes.
Edit for emotion, not information. One participant struggling with a simple task convinces more than statistics.
The Workshop Format:
Don't just present findings—make stakeholders experience them:
- Play a clip of user struggle
- Ask: "What do you think the problem is?"
- Let them discuss
- Play the user explaining the problem
- Compare their assumptions to reality
This creates memory through participation.
The Implementation Bridge
Based on the Design Council's research on evidence-based design:
Week 1 Post-Research:
Implement one visible quick fix based on findings. This proves research value immediately.
Schedule design sessions for larger issues. Put them on calendars while momentum exists.
Week 2 Post-Research:
Circulate quotes weekly via Slack or email. "User Quote Tuesday" keeps insights alive.
Create user quote posters for workspace walls. Physical presence maintains awareness.
Week 4 Post-Research:
Report back on changes made. "Based on user interviews, we [change] and [metric] improved by [amount]."
This closes the loop and builds appetite for more research.
The Complete Interviewing Checklist
Pre-Interview Phase
□ Define your research question clearly
Write the specific decision this research will inform. Without a commitment to act on the findings, you're doing research theater. The question should be narrow enough to answer in six interviews but broad enough to matter. "Should we redesign everything?" is too broad. "Should the checkout button be blue or green?" is too narrow. "How can we reduce checkout abandonment?" is just right.
Document the decision in writing and share with all stakeholders. Set expectations: "We will use these findings to decide between X and Y." Get explicit agreement from decision-makers that they're prepared to act on findings, even if those findings contradict current plans. Schedule a decision meeting within 48 hours of completing research—put it on calendars now.
□ Map your participant strategy across dimensions
Create a participant matrix with three dimensions: behavior (how they currently solve the problem), context (when/where they engage), and expertise (novice to expert). Don't use demographics unless they directly affect behavior—a 25-year-old and 55-year-old might have identical needs if they share the same problem context.
Aim for: 2 current customers, 2 prospects evaluating solutions, 1 person who churned, 1 person using a competitor. This mix prevents confirmation bias from only talking to happy customers. Document why you're excluding certain groups—this prevents scope creep later.
Resource: Erika Hall's participant selection guide
□ Create and test your screener questions
Write 5 behavioral questions that identify ideal participants. Focus on actions, not opinions: "How many times did you [behavior] last month?" not "Do you care about [topic]?" Include a disqualifier question to filter out people just seeking incentives: "Which of these have you done in the past month?" with one fake option.
Test your screener on a colleague. If it takes over 2 minutes, shorten it. Share screener with team for feedback—they might spot gaps. Document scoring criteria: which answers qualify, which disqualify, which are nice-to-have.
□ Recruit 7-8 participants for 6 interviews
Always over-recruit by 20-30% to account for no-shows. Send initial outreach to 15-20 potential participants to yield 7-8 confirmed. Use calendar scheduling tools (Calendly, etc.) to reduce back-and-forth. Confirm 24 hours before each interview with the meeting link and duration reminder.
For UserInterviews.com: Budget $150-250 per participant total cost. Write detailed screener requirements. Request demographic diversity even if not directly relevant. Allow platform to schedule 8 participants to ensure 6 complete.
Resource: UserInterviews.com guide to screener writing
□ Prepare your discussion guide with timed sections
Write 5-7 core questions that ladder from broad to specific. Start with: "Tell me about the last time you [core behavior]." Include follow-up prompts for each question. Time each section: 5-minute intro, 25-minute core questions, 10-minute context, 5-minute wrap-up.
Mark "must ask" versus "if time allows" questions. Never exceed 10 primary questions—depth beats breadth. Practice on a colleague and time it; you should finish in 35 minutes to allow for tangents. Print multiple copies with space for notes.
□ Set up and test all technical requirements
For remote: Test platform 24 hours before. Send calendar invites with platform link, backup phone number, and clear subject: "[Company] User Interview - Your insights needed." Have backup platform ready (phone call if video fails).
For in-person: Scout location day before. Book for 30 minutes extra (15 before, 15 after). Test power outlets and WiFi. Confirm parking/transit options for participant.
For recording: Test primary and backup devices. Charge everything overnight. Clear storage space—interviews can be 1-2GB each. Test audio in actual environment.
□ Prepare compensation and consent materials
Purchase gift cards or prepare payment method. Have 2 extra in case of issues. Print consent forms with clear language about recording and data use. Prepare receipt template for participants who need expense documentation.
Create backup digital consent process (DocuSign, etc.) for remote interviews. Have participant information sheet explaining how their data will be used. Prepare thank you email template with compensation delivery instructions.
During Interview Phase
□ Arrive early and set up environment
Arrive 30 minutes early for in-person interviews; join 10 minutes early for remote. Test all equipment in actual conditions. Arrange seating at a 90-degree angle, not across the table. Remove distracting items (awards, marketing materials).
Put devices on do not disturb. Close unnecessary computer applications. Have water available for both of you. Display "Interview in Progress" sign if in office.
□ Build rapport before starting recording
Spend 3-5 minutes on casual conversation unrelated to research topic. Ask about their commute, weather, or visible background items (remote). Explain process: duration, general topics, no right/wrong answers.
Get verbal consent for recording: State it clearly, wait for explicit yes. Start recording, then repeat consent on recording. Mention they can pause or stop anytime.
□ Follow the 70/30 rule for talking
You talk a maximum of 30% of the time; they talk 70%. Use minimal encouragers: "mm-hmm," "I see," "tell me more." Count to three after they stop before you speak.
Avoid filling silence—they often continue with deeper thoughts. If you're talking more than asking, stop and redirect to them. Never interrupt unless they're completely off-topic.
□ Ask for specific examples, not generalities
Always anchor in actual events: "Tell me about the last time" not "Do you usually." When they give opinions, ask for examples: "When did that happen?" If they say "always" or "never," probe: "Has there been an exception?"
Use the critical incident technique: Focus on extremes (best/worst experiences). These reveal boundaries and expectations. Document exact quotes when they express strong emotion.
□ Take notes on emotions and environment, not just words
Note when energy changes (up or down). Document body language shifts. Mark moments of hesitation or confidence.
Capture environment details: interruptions, technical issues, distractions. Note when they reference objects or show you something. Write down your own reactions and surprises.
□ Manage time without rushing insights
Set a silent timer visible only to you. At 30 minutes, assess: skip nice-to-haves if behind. Give a 10-minute warning: "We're almost done, one final question."
But if they're revealing crucial insights, let them continue. Great data beats perfect timing. You can skip less important questions.
□ Close with gratitude and next steps
Ask: "What should I have asked that I didn't?" Often reveals overlooked areas. Thank them specifically: "Your point about X was particularly insightful." Provide compensation immediately (or explain delivery method).
Ask permission for follow-up: "Can I email if I have clarifying questions?" Tell them how findings will be used (improve product, inform design). End recording, then handle any logistics.
Post-Interview Phase
□ Debrief within one hour while memory is fresh
Write your top 3 surprises immediately. Note the participant's overall stance (satisfied, frustrated, neutral). Capture the one quote that summarizes their experience.
Rate interview quality (1-5) and note why. Document what you wished you'd asked. Note technical issues or methodology observations for improvement.
□ Process recordings within 24 hours
Save files with consistent naming: YYYY-MM-DD_P#_Topic. Back up to cloud storage immediately. Create rough transcript or detailed notes.
Pull key quotes while audio is fresh in memory. Note timestamps for powerful moments. Share highlights with team to maintain engagement.
□ Start pattern recognition after interview 3
Create simple spreadsheet: participants in rows, emerging themes in columns. Don't force patterns—let them emerge naturally. Note contradictions between participants.
Document edge cases that might represent future needs. Track emotional intensity, not just mention frequency. Adjust remaining interview questions based on patterns.
□ Conduct synthesis session within 48 hours of final interview
Block 4 uninterrupted hours—insights decay quickly. Print all notes or display on large screen. Create affinity map with observations on sticky notes.
Group similar observations without forcing categories. Look for patterns mentioned by 3+ participants. Document strong outliers that might indicate edge cases.
Resource: Affinity diagramming guide
□ Transform patterns into actionable insights
Use the formula: "Users do X because Y, which means Z for our design." Each insight should clearly imply action. Avoid vague insights like "Users want simplicity."
Instead: "Users screenshot confirmations because they don't trust the system to maintain records, which means we need persistent, accessible transaction history." Support each insight with 2-3 participant quotes.
□ Create one-page executive summary
Lead with the single most important finding. Include 3-5 key insights with supporting quotes. List 3 immediate fixes based on clear problems.
Note 2 areas needing further research. Pose 1 fundamental question raised by research. Link to full documentation for those wanting details.
□ Share findings within one week through multiple channels
Schedule findings presentation while momentum exists. Create 2-minute video highlight reel of key moments. Share one-page summary via email/Slack.
Post user quotes in common areas. Schedule follow-up sessions with individual stakeholders. Document in shared repository for future reference.
□ Track implementation and impact
Create simple tracking document: finding, recommended action, actual action, result. Follow up at 2 weeks: What's been implemented? Follow up at 4 weeks: What impact observed?
Share success metrics: "Based on user interviews, we changed X and saw Y improvement." This builds organizational appetite for more research. Document lessons learned for next research cycle.
Common Pitfalls and Recovery Strategies
When Participants Cancel Last-Minute
Expect a 20% no-show rate. Always recruit 1-2 extra participants. Keep their schedules as backup slots. If someone cancels day-of, immediately contact your backup.
For paid recruitment services, report no-shows immediately for credits. Use the freed time to review previous interviews for gaps. Document patterns to improve future screening.
When Technology Fails
Always have backup recording—phone voice memos work. If video fails, continue with audio only. Take detailed notes on what you're missing visually.
If platform fails entirely, switch to phone call. Get participant's permission to record alternative method. Follow up with email clarifying any unclear points.
When Participants Give Shallow Answers
Switch from abstract to concrete: "Show me" instead of "Tell me." Ask for specific examples: "When did this last happen?" Use the teaching technique: "How would you explain this to someone new?"
Try the opposite angle: "When does this work well?" Sometimes shallow answers mean they don't actually have the problem you're studying—that's valuable data too.
When You Realize Your Questions Are Wrong
After 2-3 interviews revealing your assumptions are off, pivot. Keep the same opening question for consistency. Adjust follow-ups based on what you're learning.
Document the pivot and why it happened. This is good research practice, not failure. The ability to adjust shows research maturity.
Your Next Six Interviews Start Now
Stop waiting for perfect conditions. Your next six user interviews don't need a research team, a big budget, or stakeholder buy-in. They need you to send this email today:
"Hi [Customer Name],
We're trying to better understand how people really [core problem]. Your experience would be incredibly valuable.
Could we schedule a 45-minute video call this week? I'll send a $50 [Amazon/relevant] gift card as a thank you.
When works for you?"
Send that to ten customers. Six will respond. Those six conversations will reveal more about your users than months of analytics data.
Because here's what Erika Hall knows that most businesses don't: Your users already have the answers. They're living with the problems every day. They've developed workarounds. They've found alternatives. They know exactly what's broken.
You just have to ask. And then—this is the hard part—you have to listen.
This guide is part of the UX Helpdesk membership resources. For interview training, recruitment assistance, or synthesis support, consult your membership benefits or visit our member portal.