The Mom Test for Startup Ideas: 23 Questions That Actually Work
You've got a startup idea. You're excited. So you do what every founder does — you tell someone about it.
Your mom says it's brilliant. Your friends say they'd totally use it. Your coworker says "dude, you should definitely build that."
So you spend six months building it. Launch day arrives. Crickets.
What went wrong? Everyone said they loved it.
That's exactly the problem. They lied to you. Not maliciously — they were just being nice. And you made it really easy for them to be nice by asking the wrong questions.
This is what Rob Fitzpatrick calls The Mom Test — the idea that your questions should be so good that even your mom can't lie to you. Not because she becomes honest, but because you stop asking questions that have a "polite" answer.
Let's fix your customer discovery process.
Why Most Customer Interviews Are Worthless
Here's what a typical founder's "validation" looks like:
> Founder: "I'm building an app that helps freelancers track their invoices. Would you use something like that?"
>
> Friend: "Oh yeah, totally. That sounds super useful."
The founder walks away thinking they've validated demand. They haven't validated anything. They've collected a compliment disguised as data.
Three types of bad data destroy startups:
1. Compliments — "That's a great idea!" means nothing. It's a social reflex, not market signal.
2. Hypothetical fluff — "I would definitely pay for that" is a fantasy. People are terrible at predicting their own future behavior.
3. Wishlists — "It would be cool if it also did X, Y, and Z" tells you what sounds interesting, not what's actually painful enough to pay for.
If your interview notes are full of "they loved the idea" and "they said they'd use it," you have zero useful data. You have a pile of polite lies.
The Three Rules That Change Everything
The Mom Test boils down to three rules. Simple to understand, brutally hard to follow:
Rule 1: Talk About Their Life, Not Your Idea
The moment you pitch your idea, the conversation stops being about truth and starts being about politeness. People don't want to crush your dreams to your face.
Instead of: "Would you use an app that..."
Ask about: their current reality, past behavior, existing problems.
You're not there to get validation. You're there to understand their world.
Rule 2: Ask About Specifics in the Past, Not Hypotheticals About the Future
"Would you pay $20/month for this?" is a hypothetical. The answer is meaningless.
"How much did you spend last year trying to solve this problem?" is a fact. Facts don't lie.
Past behavior is the single best predictor of future behavior. If someone has never spent money, time, or effort trying to solve the problem you're targeting, they're not going to start just because your app has a nice UI.
Rule 3: Talk Less, Listen More
If you're doing more than 20% of the talking in a customer interview, you're doing it wrong. You're there to learn, not to pitch. Every minute you spend explaining your vision is a minute you're not hearing the truth.
The 23 Questions That Actually Work
Here's your playbook. These questions are organized by what you're actually trying to learn.
Understanding the Problem (Questions 1-7)
These tell you if the problem is real and painful enough to matter.
1. "Tell me about the last time you dealt with [problem area]."
This is the single best opening question in customer discovery. It's specific, grounded in the past, and impossible to answer with a polite lie. If they can't recall a specific instance, the problem isn't that painful.
2. "What was the hardest part about that?"
Don't assume you know what's painful. Let them tell you. The answer often surprises founders — the real pain point is usually adjacent to what you expected.
3. "Why was that hard?"
Go deeper. One "why" is never enough. The first answer is surface-level. The real insight is two or three layers down.
4. "How often does this come up?"
Frequency matters enormously. A problem that happens once a year is a nuisance. A problem that happens daily is a business.
5. "What do you currently do to solve this?"
If the answer is "nothing," that's a red flag. Not because the problem doesn't exist, but because it might not be painful enough to motivate action. People who have a real problem have a current solution — even if it's duct tape and spreadsheets.
6. "What don't you love about your current solution?"
Now you're getting somewhere. The gaps in their current approach are where your opportunity lives.
7. "Have you tried to find a better solution? What happened?"
This is gold. If they've actively searched for alternatives, the problem is real. If they've tried competitors and churned, you're learning exactly what the market needs.
Gauging Willingness to Pay (Questions 8-13)
Ideas are free. Revenue requires people who will actually open their wallets.
8. "How much time do you spend on this per week?"
Time is money. If they're spending 5 hours a week on a workaround, you can do the math on what a solution is worth.
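Here's a minimal sketch of that math. The hourly rate and hours are illustrative assumptions, not numbers from any interview:

```python
def annual_workaround_cost(hours_per_week, hourly_rate, weeks_per_year=48):
    """Rough annual cost of a manual workaround, in dollars."""
    return hours_per_week * hourly_rate * weeks_per_year

# Example: 5 hours/week of manual invoice-chasing at an assumed $60/hour
cost = annual_workaround_cost(5, 60)
print(f"Implied annual cost: ${cost:,.0f}")  # prints "Implied annual cost: $14,400"
```

If solving the problem is worth $14,400 a year to them, a $20/month price point stops sounding like a guess and starts sounding conservative.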
9. "What does this problem cost you — in money, time, or missed opportunities?"
Make them quantify it. Vague pain produces vague willingness to pay. Specific costs justify specific prices.
10. "Have you ever paid for something to help with this?"
Past spending behavior is the strongest signal you'll get. If they've paid for adjacent solutions, they'll pay for yours. If they've never spent a dollar on this category, proceed with extreme caution.
11. "What would you pay to make this problem disappear?"
Use this one carefully — it's slightly hypothetical, but grounded by the preceding questions about real costs. At this point, they've already quantified the pain, so the answer tends to be more honest.
12. "Who else in your organization would need to approve this purchase?"
Critical for B2B. The person with the pain isn't always the person with the budget. If there are three layers of approval between your champion and a purchase order, your sales cycle just got very real.
13. "What's the budget for tools in this category?"
In B2B, budget cycles are real. Knowing whether there's an existing line item for this type of solution tells you if you're selling into existing demand or creating new demand. (Creating new demand is 10x harder.)
Testing Commitment (Questions 14-18)
Talk is cheap. These questions separate tire-kickers from real prospects.
14. "Can I follow up with you when we have a prototype?"
If they say yes and actually give you their email — a real address, not a catch-all that goes straight to spam — you have a warm lead. If they hesitate, they were being polite.
15. "Would you be willing to be a beta tester?"
This costs them time. Real interest involves real commitment. "Sure, send me an email" is weak. "Yes, I'll block out 30 minutes next Tuesday" is strong.
16. "Who else should I talk to about this?"
Referrals are a commitment signal. If they're genuinely excited about the problem being solved, they'll connect you with others who share the pain. If they shrug, the enthusiasm was performative.
17. "Can I sit with you next time you work on this?"
Observation beats interviews every time. If they'll let you watch them struggle through their current process, you'll learn more in an hour than ten interviews would teach you.
18. "Would you pre-order this if I could deliver it in 90 days?"
The ultimate commitment test. A credit card on the table settles every debate about whether demand is real. If you can get 5-10 pre-orders before writing a line of code, you have something.
Understanding the Market (Questions 19-23)
These help you understand the landscape, not just the individual.
19. "What other tools do you use for your workflow?"
Integration context matters. Your product doesn't exist in isolation — it needs to fit into their existing stack. This also reveals adjacent competitors you might not know about.
20. "Where do you go to find new tools or solutions?"
This is your distribution roadmap. Are they on Reddit? Do they ask peers? Do they search Google? The answer tells you where to show up.
21. "What would have to be true for you to switch from your current solution?"
Switching costs are the silent killer of startups. Even if your product is 2x better, the pain of switching might not be worth it. You need to understand the bar.
22. "What's changed about this problem in the last year?"
Timing matters. If the problem is getting worse (regulations, scale, complexity), you're riding a wave. If it's stable or improving, you're fighting gravity.
23. "If you could wave a magic wand and fix one thing about how you handle this, what would it be?"
End with this. It's slightly hypothetical, but by now you've earned honest answers. The response often reveals the priority — the single feature that would make or break their decision.
The Anti-Patterns: Questions That Feel Productive But Aren't
Avoid these like the plague. They sound like good customer discovery but produce nothing useful.
"Do you think it's a good idea?"
Nobody will tell you it's a bad idea to your face. This question has exactly one possible answer in polite society.
"Would you buy a product that does X?"
The gap between "would" and "will" is where startups go to die. Hypothetical purchase intent is meaningless.
"How much would you pay for this?"
Without context about their current spending, this produces fantasy numbers. Ask about past behavior first.
"What features would you want?"
You've just turned your customer into a product manager. They're bad at that job. Observe their problems; design the features yourself.
"Do you agree that [problem] is really annoying?"
Leading questions produce led answers. You've basically told them what to think.
Running Your First 10 Conversations
Theory is nice. Here's how to actually do this.
Finding People to Talk To
You need 10-15 conversations to start seeing patterns. Here's where to find them:
- Your network — but NOT friends and family. Acquaintances and second-degree connections are more honest.
- LinkedIn — cold outreach works if you're genuinely asking for advice, not selling.
- Reddit/communities — find where your target audience hangs out. Contribute first, then DM.
- Coffee shops and co-working spaces — seriously. "Hey, can I buy you a coffee and ask about [problem area]?" works more often than you'd think.
The Conversation Structure
Keep it simple:
1. Open warm (2 min) — context for why you're talking. "I'm exploring [problem area] and trying to understand how people handle it."
2. Their story (15 min) — use questions 1-7. Let them talk. Take notes.
3. Go deeper (10 min) — follow the interesting threads. Use questions 8-13 as needed.
4. Commitment test (3 min) — questions 14-18. See if they'll put skin in the game.
5. Close and capture (2 min) — thank them. Write up your notes within 10 minutes while it's fresh.
Total time: roughly 30 minutes. Don't go much longer. Respect their time and yours.
Reading the Signals
After 10 conversations, you should see patterns. Here's how to interpret them:
Strong signals (green light):
- Multiple people describe the same problem unprompted
- People have already spent money trying to solve it
- They get emotional when describing the pain
- They ask when your solution will be ready
- They offer to introduce you to others with the same problem
Weak signals (proceed with caution):
- People acknowledge the problem but haven't tried to solve it
- Interest but no commitment (no email, no referrals)
- The problem exists but it's low-priority
- Very different pain points across conversations (market is fragmented)
Red flags (reconsider everything):
- Nobody can recall a specific instance of the problem
- "That's interesting" with zero follow-up
- They describe the problem but say it's not worth solving
- You're doing most of the talking in interviews
- Zero willingness to commit time, money, or introductions
The 2026 Twist: AI and Customer Discovery
Here's what's changed. In 2026, two things make customer discovery different:
1. AI can prep your questions, but it can't replace the conversation.
Use AI to generate targeted question lists for your specific niche. Use it to analyze your interview transcripts for patterns. Use it to score your idea's fundamentals before you start interviews. But don't use AI as a substitute for talking to real humans. The unexpected emotional reaction, the pause before an answer, the thing they mention offhand — those happen in real conversations.
2. Founders are building too fast.
With vibe coding and AI-assisted development, you can go from idea to deployed MVP in a weekend. That sounds great until you realize it lets you skip validation entirely. The temptation to "just build it and see" is stronger than ever. Resist it. A weekend of customer conversations will save you from building the wrong thing in a weekend.
The Mom Test isn't outdated because we have AI — it's more important because we have AI. The faster you can build, the more critical it is to build the right thing.
The Spreadsheet That Ties It All Together
After each conversation, log this:
| Field | Example |
| --- | --- |
| Name | Sarah K. |
| Role/Context | Freelance designer, 3 years |
| Key problem described | Chasing late invoices, 4+ hours/week |
| Current solution | Manual follow-up via email |
| Money spent on problem | $0 (just time) |
| Emotional intensity (1-5) | 4 — visibly frustrated |
| Commitment offered | Gave email, will beta test |
| Surprise insight | Problem is worse with international clients |
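Once you've logged ten rows, tallying the strong signals takes a few lines of code. A minimal sketch, assuming each conversation is a dict keyed like the table above (the field names and thresholds here are illustrative, not a standard):

```python
from collections import Counter

def tally_signals(rows):
    """Count strong-signal indicators across interview logs.

    Assumes each row has 'money_spent' (dollars), 'intensity' (1-5),
    and 'commitment' (free text, empty or 'none' if nothing offered).
    """
    signals = Counter()
    for row in rows:
        if float(row["money_spent"]) > 0:
            signals["spent_money"] += 1       # strongest signal: past spending
        if int(row["intensity"]) >= 4:
            signals["high_emotion"] += 1      # visible frustration with the pain
        if row["commitment"].strip().lower() not in ("", "none"):
            signals["committed"] += 1         # email, beta slot, referral, etc.
    return signals

rows = [
    {"money_spent": "0", "intensity": "4", "commitment": "Gave email, will beta test"},
    {"money_spent": "120", "intensity": "2", "commitment": "none"},
]
print(tally_signals(rows))
```

If fewer than half your rows show any of these signals after ten conversations, you're in weak-signal territory, and the red-flag checklist above deserves a hard look.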
Stop Collecting Compliments, Start Collecting Evidence
Most founders fail at validation not because they're lazy, but because they're asking the wrong questions. They walk into conversations hoping to hear "great idea" instead of hoping to learn the truth.
The truth might hurt. It might kill your idea. It might send you in a direction you didn't expect.
That's the point.
The founders who win aren't the ones with the best ideas — they're the ones who figured out what people actually need before spending six months building something nobody asked for.
Twenty-three questions. Ten conversations. That's all that stands between you and knowing whether your startup idea is worth the next year of your life.
Go find out.