Stuck? Ask AI for Terrible Ideas
Why asking for terrible ideas unlocks better ones
You know that feeling when you need ideas and your brain just... stops?
Maybe you’re staring at a blank screen trying to figure out how to approach a messy dataset.
Or you need to present three solutions to a problem and can’t think of even one.
Or you’re drafting an explanation for people outside your field and every phrase sounds either too jargon-heavy or too dumbed-down.
The ideas won’t come. The harder you push, the more stuck you feel.
Try this: ask AI for bad ideas on purpose.
Sounds backwards, but it works. And understanding why reveals something useful about how to use AI when your brain won’t cooperate.
The Problem With “Give Me the Best Answer”
When you ask AI “What’s the best way to solve this problem?” you’re treating it like an answer machine. You’re hoping it delivers a ready-made solution you can use directly.
Two things go wrong here.
First, AI gives you one polished response, and if it doesn’t quite fit your situation, you’re stuck again.
Second, you’re passive - waiting for AI to think instead of getting your own brain moving.
Try this instead:
“Give me 15 ways to approach this problem. Include obvious ones, creative ones, and at least three terrible ideas that probably won’t work.”
Read that list and notice what happens. Your brain starts reacting: “That won’t work because...” or “That’s interesting but we’d need...” or “What if we combined these two?”
That’s the shift. You’re not looking for the perfect answer from AI. You’re generating enough material to get your own judgment working again.
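You can do all of this in any chat window. If you would rather script it - say, to run a few different problem statements in one go - here is a minimal sketch using the OpenAI Python SDK. The model name and the example problem are placeholders; swap in whatever you actually use.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in your environment

# Placeholder problem - describe your own situation here.
prompt = (
    "Give me 15 ways to approach cleaning a messy customer dataset. "
    "Include obvious ones, creative ones, and at least three terrible "
    "ideas that probably won't work."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": prompt}],
)

# Print the list so you have something to react to.
print(response.choices[0].message.content)
```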
Why Terrible Ideas Actually Help
Asking for bad ideas does something specific: it removes the pressure to be brilliant.
When every suggestion needs to be good, you (and AI) get conservative. You stick to safe, obvious options. But when you explicitly ask for terrible ideas, a few things happen:
Permission to experiment. Seeing bad ideas costs nothing. You’re just generating options, not committing to anything. This lowers the stakes enough that your brain stops censoring itself.
Unexpected seeds. Often the “terrible” ideas contain something unexpectedly useful. Maybe the approach is wrong but the underlying question is right. Maybe it won’t work for your situation but reminds you of something that would.
Volume creates momentum. Ten options give you something to react to. One option leaves you evaluating whether it’s “good enough” - which is just another form of being stuck.
The technique works because you’re asking AI to do what it does well (generate volume quickly) while keeping you in charge of what actually matters (judgment about your specific situation).
Five Ways to Use This
Ask for quantity, including bad ideas
“Generate 15 ways to [your problem]. Include obvious ones, creative ones, and at least three terrible ideas.”
This is the core technique. By forcing volume, you create material to react to. The “terrible ideas” part explicitly removes pressure and often surfaces something useful.
Works for: approaching messy data, structuring analysis, explaining technical concepts, designing workflows.
Challenge your assumptions
“What am I likely not considering about [your situation]? What assumptions might I be making?”
When stuck, you’re usually locked into one way of seeing the problem. This prompt questions your framing directly.
Sometimes helps to flip it: “What’s the worst way to solve this?” Seeing what to avoid often clarifies what to do instead.
Get multiple perspectives at once
“How would different people approach this? Give me perspectives from: a software engineer, a business analyst, a project manager, a data scientist, and a skeptical stakeholder.”
Different roles ask different questions. Five perspectives in two minutes beats scheduling five separate conversations. Particularly useful when you’re presenting to cross-functional teams and need to anticipate concerns.
Ask for context-specific options
Generic prompt:
“Help me brainstorm a dashboard”
More useful:
“I’m designing a dashboard for non-technical executives who care about donation growth but get overwhelmed by too many metrics. Budget is limited, timeline is two weeks. What approaches might work?”
The more context you provide - audience, constraints, timeline, technical limits - the more relevant the options. Specificity in prompts produces specificity in output.
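If you find yourself retyping the same background every time, one low-effort option is to keep the context in one place and build the prompt from it. A minimal Python sketch - the field names and wording are purely illustrative:

```python
# Keep the context that makes prompts specific in one place,
# then build the prompt from it. Field names are illustrative only.
context = {
    "audience": "non-technical executives",
    "goal": "see donation growth without being overwhelmed by metrics",
    "constraints": "limited budget, two-week timeline",
}

prompt = (
    f"I'm designing a dashboard for {context['audience']} who want to "
    f"{context['goal']}. Constraints: {context['constraints']}. "
    f"What approaches might work? Give me 10 options, including a few bad ones."
)

print(prompt)  # paste into a chat window, or send via an API
```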
Iterate, don’t stop at one round
When AI lists ideas, follow up:
“Expand #3 into three possible directions”
“Combine #2 and #7 into one approach”
“Give me 5 variations on option #4”
This back-and-forth lets you drill down quickly. Your judgment guides which paths are worth exploring deeper.
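If you script this back-and-forth, the one detail that matters is sending the whole conversation each time, so a follow-up like “Expand #3” still has the numbered list to refer to. A minimal sketch, assuming the OpenAI Python SDK and a placeholder model name; any chat-style API with a message history works the same way.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in your environment

messages = []  # running history, so follow-ups can reference "#3", "#7", etc.

def ask(prompt: str, model: str = "gpt-4o-mini") -> str:
    """Send a message and keep both sides of the exchange in the history."""
    messages.append({"role": "user", "content": prompt})
    response = client.chat.completions.create(model=model, messages=messages)
    content = response.choices[0].message.content
    messages.append({"role": "assistant", "content": content})
    return content

# First round: generate the raw list of options (placeholder problem).
print(ask("Give me 15 ways to explain a churn spike to a non-technical "
          "board. Include at least three terrible ideas."))

# Follow-ups: drill into whatever made you react.
print(ask("Expand #3 into three possible directions."))
print(ask("Combine #2 and #7 into one approach."))
```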
What This Looks Like
Scenario: You need to explain a 15% monthly sales drop to executives who don’t understand statistical variance.
You’ve been drafting for 20 minutes. Everything sounds either too technical or too patronizing.
Weak prompt:
“How do I explain this?”
Better prompt:
“I need to explain a 15% sales drop to non-technical executives who are worried it’s a crisis. I know it’s within normal variance. Give me 5 different ways to frame this - using analogy, visualization description, direct concern-addressing, industry comparison, and concrete example.”
You get five starting points in two minutes. None perfect, but each gives you something to react to.
Maybe the weather analogy feels too simple, but it sparks an idea about seasonal patterns.
Maybe the industry benchmark approach could combine with the concrete example.
Ten minutes later, you have something clearer. Not because AI wrote it - because AI gave you enough material to recognize what you were actually trying to say.
Another scenario: You’re designing a workshop outline and nothing’s clicking.
Try:
“List 10 unexpected ways to explain [concept] to [audience].”
You might see: “compare it to something familiar from their daily work,” “create a myth-busting session,” “build a collaboration exercise.”
You won’t use these verbatim, but they’ve created momentum. Your brain can build from there.
One more: You’re three days from a stakeholder presentation and can’t figure out how to structure findings.
Try:
“I have findings from a six-month customer behavior analysis. My stakeholders care most about actionable insights and budget implications. They have 30 minutes. Give me 5 ways to structure this presentation.”
The AI might suggest: chronological story, problem-solution format, insight-action pairs, comparison to benchmarks, or priority-ranked findings.
Seeing these laid out helps you recognize which structure matches how your stakeholders actually make decisions.
Why bother doing this?
Because there’s a hidden cost of staying stuck.
Every hour you spend staring at a blank page is an hour you’re not refining, testing, or improving what you’re building.
From AI Ideas to Your Ideas
The process isn’t about copying. You’re using AI as a catalyst.
Read what AI generates and notice your reactions.
Which ideas make you think “yes,” “maybe,” or “definitely not”? That’s your expertise responding.
Pick 2-3 that feel promising - not necessarily the “best” ones, but the ones that resonate with your specific situation.
Then develop them in your context.
Maybe AI suggested a decision tree and that reminds you of a flowchart approach from a previous project. That connection is your brain working, not AI’s output.
Watch for this: If you’re tempted to copy-paste an AI response directly, you probably haven’t engaged enough yet. Push for more variations or deliberately ask for flawed versions that force your brain to do the fixing work.
When This Doesn’t Work
The suggestions feel generic
Your prompt needs more context. Instead of “Give me data visualization ideas,” try “I have sales data by region, product, and time. My audience is regional managers who want to understand both their performance and competitive position. They’re visual learners but get overwhelmed by dense charts. What are 10 ways to visualize this for actionable insights?”
You can’t tell which direction to pick
You’re not looking for the objectively “right” answer - most problems don’t have one. Pick two directions, sketch each for 10 minutes, see which still feels solid. That’s usually enough to proceed.
This feels like outsourcing your thinking
It’s not. Professional writers have always used thinking partners. This is the same concept with different mechanics. The ideas you develop after getting unstuck are still yours - you’re just using a different method to access your own judgment.
You’re stuck because the problem isn’t clear yet
Try:
“What questions should I answer before deciding how to approach [problem]?”
Sometimes you need to frame the problem before you can brainstorm solutions.
Sometimes it genuinely wastes time
True. If you already have a clear direction, adding AI brainstorming just creates noise. This is specifically for when you’re stuck and solo brainstorming isn’t working.
Why This Matters for Your Work
The ability to unstick yourself quickly becomes more valuable as AI handles more routine execution.
When you can generate options fast, you spend more time on judgment calls - which direction fits your constraints, which approach your stakeholders will actually use, which solution accounts for things AI can’t see about your situation.
That judgment is what distinguishes people at similar technical levels. The faster you can get from “stuck” to “evaluating options,” the more time you have for thinking that actually requires your experience.
People who can unstick themselves (and help others do the same) become valuable in ambiguous situations. When your team faces a new problem with no obvious solution, being able to generate and evaluate options quickly makes you someone people want in the room.
Going Deeper
Once this feels comfortable:
Keep a template library. Save reusable prompts like these (a small sketch for storing them follows the list):
“List 10 unconventional approaches to...”
“What assumptions might be wrong about...”
“Give me 5 metaphors for explaining...”
“Show me 3 ways this could fail in practice...”
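One way to keep that library usable is a small file of templates with placeholders you fill in per problem. A minimal Python sketch - the template names and wording here are just examples, not a standard:

```python
# A tiny prompt-template library: plain strings with named placeholders.
# Template names and wording are illustrative; adapt them to your own work.
TEMPLATES = {
    "brainstorm": (
        "Give me {n} ways to approach {problem}. Include obvious ones, "
        "creative ones, and at least three terrible ideas."
    ),
    "assumptions": "What assumptions might I be making about {situation}?",
    "metaphors": "Give me 5 metaphors for explaining {concept} to {audience}.",
    "failure_modes": "Show me 3 ways {plan} could fail in practice.",
}

def build_prompt(name: str, **details: str) -> str:
    """Fill a saved template with the specifics of today's problem."""
    return TEMPLATES[name].format(**details)

# Example: reuse the brainstorm template for a new problem.
print(build_prompt("brainstorm", n="15", problem="structuring a churn analysis"))
```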
Ask for the opposite. If stuck on solutions, ask why the problem is unsolvable. Arguments against often reveal what needs addressing for a solution to work.
Request failure modes. “Show me 3 ways to implement this that would technically work but create problems down the road.” Learning what not to do clarifies what you should do.
Use scaffolding prompts. “What are the key questions I should answer before deciding on an approach?” Sometimes you’re stuck because you’re trying to solve before properly framing.
What This Solves (And What It Doesn’t)
This technique addresses one specific problem: the friction of starting when your brain won’t cooperate.
You’ll spend less time staring at blank screens waiting for ideas to arrive. When you get stuck, you’ll have a reliable way to generate starting material instead of hoping inspiration strikes.
Brainstorming sessions - alone or with others - move faster when you can quickly generate options to evaluate rather than struggling to produce anything.
What this won’t do (and this is important to acknowledge):
It won’t make you better at execution.
It won’t speed up work that’s already clear.
It won’t replace the judgment calls that require your specific expertise.
The difference between spending 10 minutes generating options versus spending an hour staring at nothing compounds over time. Not just in productivity - in frustration levels too.
Some people find this becomes their default move when stuck. Others use it occasionally when really jammed. Both work. The point is having the option when you need it.
One Prompt Worth Trying
Next time you’re stuck (and it will happen), try this:
“Give me 15 ways to approach [specific problem]. Include obvious approaches, creative ones, and at least three terrible ideas.”
Then:
Read through the list (5 minutes max)
Pick 2-3 that make you react - either “that’s interesting” or “that definitely won’t work but...”
Sketch one out for 10 minutes
See where it goes
You’re not committing to anything. You’re just generating material to react to.
Most people find they have more to work with than when they started. The ones who don’t usually realize they weren’t actually stuck - they just didn’t like their options. That’s useful information too.
Where do you usually get stuck? Data analysis structure? Explaining concepts? Presentation format? Try it there first.
Hope this was helpful.
Til next time,
Donabel


