3.2.1 - The Difference Between Learning and Being Stuck
Name the learning signal: say what question is being answered, what signal will answer it and what decision comes next.
Learning signal
Are we learning something new or just stuck in loops?
The call
Know whether you are learning or stuck. Otherwise AI helps you iterate without evidence and every cycle feels like progress while nothing changes.
Why it matters
The difference between learning and being stuck is whether each iteration answers a question or just produces more output. AI can generate variations quickly, but human judgement decides whether the latest change moved the needle or just moved the work. The difference is between focused iteration and expensive repetition.
Explainer
A learning signal is not a feeling of progress. It is the specific answer that tells you whether the last iteration worked. Until you can name one question being answered, one signal that answers it and one decision that follows, you are iterating blind. AI can help run experiments, but it cannot tell you when to stop.
Make the learning signal concrete
Compare the broad version with a version you can actually test.
- Too vague: We are iterating and learning as we go.
- Concrete enough to test: After each round of testing, we check whether content creators acted on at least one context-shaped result. If they did, the context layer is working and we iterate on result quality. If they did not, we stop and investigate whether the context is shaping the query at all.
The second version lets two people make the same decision from it.
Check the learning signal
- Pass: You can say what question is being answered, what signal will answer it and what decision comes next.
- Fail: If "learning" still means "we are figuring it out as we go", the signal is not defined well enough yet.
Do not move into the next iteration until this passes.
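The pass/fail check above can be sketched as a small data structure with a gate, purely as an illustration of the three required parts; the class and the vague-phrase list here are hypothetical, not part of the method itself:

```python
from dataclasses import dataclass

@dataclass
class LearningSignal:
    question: str  # what question is being answered
    signal: str    # what signal will answer it
    decision: str  # what decision comes next

    def passes(self) -> bool:
        """Fail if any part is empty or still a vague placeholder."""
        vague = {"", "we are figuring it out as we go", "iterating and learning"}
        return all(part.strip().lower() not in vague
                   for part in (self.question, self.signal, self.decision))

# Using the StartWithYourContext example from later in this section:
signal = LearningSignal(
    question="Does the context layer visibly change the search results?",
    signal="The content creator acts on a context-shaped result.",
    decision="If they act, iterate on result quality; if not, investigate the query shaping.",
)
print(signal.passes())  # → True: all three parts are stated
```

A draft that leaves any part blank, or filled with a placeholder like "we are figuring it out as we go", fails the gate, which is the same bar the checklist above sets in prose.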
How to use AI for the learning signal
- AI chat: Rewrite the learning signal until you can state all three parts clearly.
- vibeCoding: Build the thinnest flow that tests this learning signal in practice before broader build work.
- AI-assisted coding: Carry the same learning signal into implementation and review so the live system keeps the same decision.
Sharpen the learning signal
Copy this prompt into AI chat, replace the bracketed lines with your real learning signal and keep the rest of the prompt exactly as written.
You are checking whether this learning signal is clear enough before you move forward.
Constraint:
The learning signal must be specific enough that two people would call the same work progress from it.
Working draft:
Learning question: [what question is being answered]
Signal: [what signal will answer it]
Next decision: [what decision comes next]
Task:
Decide whether this learning signal is specific enough to guide the next decision. If it is vague, rewrite it so two people would make the same decision from this learning signal.
Check:
- Would two people interpret this the same way?
- Does it stay concrete enough to guide the next step?
- Does it meet this bar: You can say what question is being answered, what signal will answer it and what decision comes next.
Return:
- A corrected learning signal
- A short explanation of what was vague

AI will likely suggest refinements based on what you enter. Use those to sharpen your thinking, not to replace it.
Evaluation
Before accepting the result, check whether two people would make the same decision from it.
Example
To help you work through this, here is a real example. StartWithYourContext is an AI search tool built as part of the vibe2value project. Here is how its learning signal was written using the three parts:
- Question being answered: Does the context layer visibly change the search results compared to searching without context?
- Signal: Whether the content creator acts on a context-shaped result instead of ignoring it or searching elsewhere.
- Decision next: If they act, iterate on result quality. If they do not, investigate how context is shaping the query.
That learning signal is specific enough that two people would make the same decision from it.
When there is more than one side
Not every product has a single learning signal. When a system serves more than one side, each side learns from different evidence and progress for one may mask stagnation on the other.
Multi-sided worked example
For example, StartWithYourContext has two different learning signals:
- Content creator: Are they acting on results? If yes, the context layer is learning. If not, the search is not different enough from generic.
- Developer: Are they completing setup without getting stuck? If yes, the documentation is learning. If not, the integration is harder to follow than it looks.
Both signals tell you whether you are learning or stuck, but they answer different questions. If only one is tracked, the other side can be stuck without anyone noticing.
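One way to make sure neither side goes untracked is to keep a signal per side and flag any side without fresh evidence. This is a hypothetical sketch, not the project's actual tooling; the dictionary keys and structure are invented for illustration:

```python
# One learning signal per side of the system, tracked separately so that
# progress on one side cannot mask stagnation on the other.
signals = {
    "content_creator": {
        "question": "Are they acting on context-shaped results?",
        "answered": True,   # evidence gathered this iteration
    },
    "developer": {
        "question": "Are they completing setup without getting stuck?",
        "answered": False,  # no evidence yet: this side may be stuck
    },
}

stuck_sides = [side for side, s in signals.items() if not s["answered"]]
if stuck_sides:
    print("No learning evidence from:", ", ".join(stuck_sides))
```

The point is not the code but the shape: each side has its own question and its own evidence, and the iteration only counts as learning when every tracked side has an answer.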
Risk and mitigation
- Risk: Iterating without a learning signal, which creates output that feels like progress while the core question stays unanswered.
- Mitigation: Define one learning question per iteration and pause when the signal is unclear.
Key takeaway
Do not move forward until you can say what question is being answered, what signal will answer it and what decision comes next.
Work through this in a workshop
If your learning signal is still unclear, bring it to a free weekly workshop. Bring the messy part of your AI-assisted build and leave with a clearer next step. In some sessions, we walk through practical examples on the Cloudflare Workers stack to show how a rough idea turns into something that actually runs.
What do you think?
How are you telling the difference between learning and being stuck in your iterations and how is AI helping you stay honest about which one it is?