
Discovery Briefs — Framing the Right Problem

Someone walks up to your desk and says: "We need to build a dashboard for our enterprise customers." You hear this and your instinct is to start scoping — what data do we show? What filters? What layout? Within ten minutes you are sketching wireframes. Within an hour you have a ticket in Linear.

Stop. What is the actual problem?

Maybe enterprise customers are churning because they cannot prove ROI to their CFO. Maybe the sales team is losing deals because prospects cannot see how the product performs at scale. Maybe customer success managers are spending four hours a week manually pulling metrics for quarterly business reviews. Each of these is a different problem. Each leads to a different solution — and only one of those solutions might be a dashboard. If you start building the dashboard without understanding which problem you are solving, you will ship a feature that looks right but solves nothing.

This lesson teaches you to pause between the request and the response. The tool is the problem brief — a structured document that reframes "build X" into "here is what is actually wrong, who it affects, and what we need to learn before we decide what to build." You will use the /brief command from the custom product-strategy plugin to generate briefs, and — critically — you will learn to evaluate whether the AI's output actually maintains problem-focus or quietly smuggles in the solution you were trying to avoid.

Solution-Prescriptive vs. Problem-Focused

Every feature request arrives in one of two forms. Recognising which form you are dealing with is the first PM skill this lesson builds.

Dimension | Solution-Prescriptive | Problem-Focused
Sounds like | "We need a dashboard" | "Enterprise customers cannot demonstrate ROI"
Verbs used | Build, add, create, integrate, launch | Struggle, lose, wait, miss, churn
Anchoring risk | High — the team converges on the named solution | Low — the team explores the problem space
Discovery outcome | Validates the proposed solution (confirmation bias) | Discovers the right solution (evidence-based)
Example | "Add a Slack integration" | "Customers miss critical alerts because notifications only exist inside the product"

Most feature requests arrive solution-prescriptive. This is natural — stakeholders think in terms of features because features are concrete and easy to communicate. But the PM's job is to translate "build X" into "because of Y" before committing engineering resources.

The Cost of Skipping the Reframe

A B2B SaaS team that ships a feature without understanding the problem typically discovers the misalignment 6-8 weeks after launch, when adoption data shows the feature is not solving the issue customers actually had. Counting build time, the team has by then spent a full quarter on a solution to the wrong problem — and the real problem is still unsolved.

The Problem Brief Structure

The /brief command produces three types of briefs. This lesson focuses on the Problem Brief — the document you write at the start of discovery, before any solution work begins.

A problem brief has seven sections, and each one serves a specific purpose:

Section | Purpose | What belongs here | What does NOT belong here
THE PROBLEM | State what is happening that should not be happening | Observable pain, current behaviour, gap between expectation and reality | Feature names, solution proposals, implementation details
WHO IS AFFECTED | Identify the persona(s) and quantify the audience | Persona name, segment, approximate count | "All users" (too vague)
EVIDENCE | Ground the problem in data, not opinion | Support tickets, NPS verbatim, behavioural data, lost deal notes, research quotes with sources | Anecdotes without sources, assumptions presented as data
IMPACT OF NOT SOLVING | Make the cost of inaction concrete | Revenue at risk, churn projections, competitive exposure, team time wasted | Vague urgency ("this is important")
WHAT WE DO NOT KNOW YET | Make uncertainty explicit | Open questions, untested assumptions, gaps in evidence | Nothing — this section must always be present
HYPOTHESIS (optional) | State a directional bet as a testable assumption | "We believe that X will solve Y. We will know when Z." | "We should build X" (that is a requirement, not a hypothesis)
DISCOVERY QUESTIONS | Define what research must answer before a spec can be written | 3-5 specific, answerable questions | "Explore user needs" (too vague to act on)

The Most Important Section

The WHAT WE DO NOT KNOW YET section is what separates a problem brief from a stakeholder request dressed up in a template. If your brief does not have this section — or if the section says "N/A" — it is not a problem brief. It is a solution proposal wearing a problem brief's clothing. Acknowledging uncertainty is strength, not weakness.

Worked Example: Reframing "Build a Dashboard" for InsightFlow

Here is the feature request that arrived in your InsightFlow Slack channel from Aisha Patel, Head of Sales:

"We need a dashboard for enterprise customers. Three prospects in the last month asked for reporting capabilities during demos. Can we get this on the Q3 roadmap?"

This is classic solution-prescriptive framing. Aisha has identified a real signal (prospects asking about reporting) and jumped to a solution (dashboard). Your job is to uncover what is actually going wrong.

Run /brief with the request:

/brief
type: "problem"
context: "InsightFlow is a B2B SaaS analytics platform. We are Series B,
50 employees, 200 customers. Our primary persona is Analyst Alex — data
analysts at 100-500 person companies who build dashboards without SQL.

Feature request from Head of Sales: 'We need a dashboard for enterprise
customers. Three prospects asked for reporting capabilities during demos.
Can we get this on the Q3 roadmap?'

Reframe this as a problem brief. Do not propose solutions."

Sample output from /brief:

Product Brief: Enterprise Reporting Gap
Type: Problem Brief
Date: 2026-03-18
Author: [PM]
Team: Product, Sales, Customer Success

THE PROBLEM
Enterprise prospects are evaluating InsightFlow but cannot assess
whether the platform meets their reporting and governance requirements
during the sales process. Three enterprise prospects in the past 30
days requested reporting capabilities that InsightFlow does not
currently surface during demos. Sales cannot demonstrate how
InsightFlow handles the reporting workflows that enterprise buyers
evaluate as table-stakes — leading to stalled deals or lost
opportunities at the enterprise tier.

WHO IS AFFECTED
- Enterprise prospects (companies with 500+ employees) evaluating
InsightFlow during the sales cycle
- Estimated 15-20 enterprise opportunities per quarter based on
current pipeline
- Secondary: existing enterprise customers (12 accounts) who may
have similar unmet needs but have not yet voiced them

EVIDENCE
- 3 enterprise prospects raised reporting capabilities as a gap
during demos in the past 30 days (source: Aisha Patel, Head of Sales)
- Deal sizes for these opportunities: illustrative range $50K-$120K ARR
- Current enterprise win rate unknown — no segmented data available
- No support tickets from existing enterprise customers on this topic
(but absence of complaints ≠ absence of need)

IMPACT OF NOT SOLVING
- Enterprise deals continue to stall or lose to competitors who
demonstrate reporting capabilities in the sales process
- InsightFlow's expansion from analytics to workflow automation
(the Q3 strategic bet) may lack credibility at the enterprise
tier without visible reporting maturity
- Sales team workarounds (manual report exports, custom demos)
consume approximately 4-6 hours per enterprise opportunity

WHAT WE DO NOT KNOW YET
- What specific reporting capabilities did these prospects ask for?
"Reporting" could mean scheduled email reports, embedded analytics,
compliance audit trails, or executive dashboards — each is a
different product surface
- Are these prospects representative of the enterprise segment, or
are they outliers with unusual requirements?
- Do our 12 existing enterprise customers have the same need? Have
they built workarounds we do not know about?
- What do competitors offer in this space? Is this table-stakes or
a differentiator?

DISCOVERY QUESTIONS
1. What specific reporting workflows did each of the three prospects
describe? (Interview Sales — Aisha Patel, AEs on those deals)
2. What reporting capabilities do our top 3 competitors demonstrate
in enterprise sales cycles? (Competitive analysis)
3. Do our existing 12 enterprise customers use workarounds for
reporting needs? (Customer Success interviews — 5 minimum)
4. What is the minimum reporting surface needed to unblock enterprise
sales — and is that the same as what existing customers need?
5. Would reporting capabilities change the deal economics enough to
justify Q3 prioritisation over workflow automation?

Evaluating the Output

Before you accept this brief, check it against the /brief skill's quality rules:

Check 1 — No solution proposals in THE PROBLEM section. Read the problem statement again. Does it mention dashboards, features, or specific product changes? In this output, the problem section describes the gap (prospects cannot assess reporting capabilities) without prescribing what to build. This is correct.

Check 2 — WHAT WE DO NOT KNOW section is present and substantive. The section lists four specific unknowns, each framed as a question. It does not say "N/A" or "we need more research." This is correct.

Check 3 — Discovery questions are specific and answerable. Each question names who to talk to or what method to use. "What specific reporting workflows did each prospect describe?" is answerable. "Explore user needs" would not be. This is correct.

Check 4 — A decision or next step is stated explicitly. The brief implies the next step (answer the discovery questions) but never states it. You might add: "NEXT STEP: Run a 1-week discovery sprint to answer questions 1-3 before Q3 planning." This is a minor gap — prompt the agent to add it.
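Checks 1 and 2 are mechanical enough to automate as a first pass before you read the brief closely. The sketch below is a hypothetical Python lint, not part of the /brief skill: the ALL-CAPS heading convention and the solution-verb list are assumptions based on the sample output above, so adapt both to whatever template your plugin actually emits.

```python
import re

# Assumed list of solution verbs -- extend it as you find new ones.
SOLUTION_VERBS = {"build", "add", "create", "integrate", "launch", "enable"}

def split_sections(brief: str) -> dict:
    """Split a plain-text brief into {HEADING: body} pairs.

    Assumes section headings are ALL-CAPS lines, as in the sample output.
    """
    sections, current = {}, None
    for line in brief.splitlines():
        stripped = line.strip()
        # Treat an all-caps line longer than 3 chars as a heading
        # (the length guard keeps body lines like "N/A" out).
        if stripped.isupper() and len(stripped) > 3:
            current = stripped
            sections[current] = []
        elif current is not None:
            sections[current].append(stripped)
    return {name: " ".join(body).strip() for name, body in sections.items()}

def lint_brief(brief: str) -> list:
    """Return a list of rule violations; an empty list means the checks pass."""
    sections = split_sections(brief)
    violations = []

    # Check 1: no solution verbs in THE PROBLEM section.
    problem_words = set(re.findall(r"[a-z]+", sections.get("THE PROBLEM", "").lower()))
    found = SOLUTION_VERBS & problem_words
    if found:
        violations.append(f"solution language in THE PROBLEM: {sorted(found)}")

    # Check 2: WHAT WE DO NOT KNOW YET must be present and substantive.
    unknowns = sections.get("WHAT WE DO NOT KNOW YET", "")
    if not unknowns or unknowns.upper() == "N/A":
        violations.append("WHAT WE DO NOT KNOW YET is missing or empty")

    return violations

# A brief that smuggles the solution back in fails both checks:
bad = (
    "THE PROBLEM\n"
    "We need to build a dashboard for enterprise customers.\n\n"
    "WHAT WE DO NOT KNOW YET\n"
    "N/A\n"
)
print(lint_brief(bad))  # prints two violations: the verb "build" and the empty unknowns section
```

A word-list check like this only catches surface violations — Checks 3 and 4 still need a human reading the discovery questions for specificity — but it is a cheap guard against the most common failure: the proposed solution reappearing in the problem statement.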

Keep This File

Lessons 3-14 build one continuous product management cycle for InsightFlow. Keep your Cowork session and working folder between lessons. The problem brief you produce in this lesson's exercise feeds directly into Lesson 4, where you will use /interview to create an interview guide that addresses your discovery questions.

The Three Brief Types

The /brief command supports three brief types. This lesson focuses on the Problem Brief, but understanding all three helps you choose the right format at each stage:

Brief Type | When to Use | Key Rule | Output
Problem Brief | At the start of discovery — you know something is wrong but have not investigated yet | No solution proposals | A framing document that aligns the team on the problem
Discovery Brief | Before a research sprint — you need to scope what the team will investigate | Must include ranked questions with explicit success criteria | A research plan with methods, sample, and expected outputs
Initiative Brief | Before writing a PRD — you need executive alignment on a major bet | Must include "the bet," rough effort, and decision needed | A one-page framing document for stakeholder sign-off

The progression is: Problem Brief (what is wrong?) → Discovery Brief (how will we investigate?) → Initiative Brief (should we commit resources?) → PRD (what will we build?). Each document answers the question that the previous one raised.

Exercise: Reframe a Feature Request

Plugin: Custom product-strategy | Command: /brief | Time: 20 minutes

Step 1 — Read the feature request

Your VP of Engineering, Maria Santos, sends this request:

"We need to add real-time collaboration to InsightFlow. Two of our biggest customers say their analysts waste time because they cannot edit dashboards simultaneously. I want this scoped for Q3."

Identify what makes this request solution-prescriptive. What is the solution being proposed? What is the underlying problem that is not yet articulated?

Step 2 — Run /brief to produce a problem brief

/brief
type: "problem"
context: "InsightFlow is a B2B SaaS analytics platform, Series B,
50 employees, 200 customers. Primary persona: Analyst Alex — data
analysts at 100-500 person companies who build dashboards without SQL.

Feature request from VP Engineering: 'We need to add real-time
collaboration to InsightFlow. Two of our biggest customers say their
analysts waste time because they cannot edit dashboards simultaneously.
I want this scoped for Q3.'

Reframe this as a problem brief. Do not propose solutions."

Step 3 — Evaluate the output against the NEVER DO rules

Check each section:

  • Does THE PROBLEM section contain solution language? Look for verbs like "add," "build," "integrate," or "enable." If the agent wrote "InsightFlow needs real-time editing capabilities," that is solution language — the problem statement should describe the pain (analysts waste time on conflicting edits) not the fix.
  • Is the WHAT WE DO NOT KNOW section substantive? If it says "N/A" or is missing, the brief fails. Prompt the agent: "The WHAT WE DO NOT KNOW section is missing. Add at least three specific unknowns about this problem."
  • Are the discovery questions specific and answerable? "Understand collaboration needs" is too vague. "How many hours per week do analysts at [Customer A] and [Customer B] spend resolving conflicting dashboard edits?" is answerable.

Step 4 — Refine the brief

If the output has quality issues, send a follow-up prompt:

Review this problem brief against these rules:
1. THE PROBLEM section must contain no solution proposals — only
observable pain and current behaviour
2. WHAT WE DO NOT KNOW must list at least 3 specific unknowns
3. DISCOVERY QUESTIONS must be answerable with a named method
(interview, data pull, competitive analysis)
4. Add a NEXT STEP section recommending the first action

Fix any violations.

Step 5 — Extend: try a second feature request

Run /brief on a feature request from your own product backlog (or use this one if you do not have a product):

"Customers keep asking for a mobile app."

This request is maximally vague. The resulting problem brief should surface how little you actually know — the WHAT WE DO NOT KNOW section should be the longest section in the brief. If the agent produces a brief where the problem and evidence sections are longer than the unknowns, something is wrong.

What You Built

You produced a problem brief — a structured document that reframes a solution-prescriptive feature request into a problem-focused investigation. The brief aligns your team on what is wrong, who is affected, what evidence exists, and what you still need to learn — without committing to a specific solution.

You also practised the evaluation skill that separates a PM from someone who accepts the first output an AI produces: checking for solution language in the problem statement, verifying that unknowns are explicit, and ensuring discovery questions are specific enough to act on.

The problem brief you created feeds directly into Lesson 4, where you will use /interview to create an interview guide that addresses your discovery questions, and /synthesize-research to process the interview findings into evidence for a feature spec.

Try With AI

Use these prompts in Cowork or your preferred AI assistant.

Prompt 1 — Reproduce (apply what you just learned):

Here is a feature request for InsightFlow:

"Our onboarding takes too long. New users take 3 weeks to build their
first dashboard. We need to add templates — pre-built dashboards they
can clone and customise."

Reframe this as a problem brief using the problem brief format:
THE PROBLEM, WHO IS AFFECTED, EVIDENCE, IMPACT OF NOT SOLVING,
WHAT WE DO NOT KNOW YET, DISCOVERY QUESTIONS.

Rules:
- THE PROBLEM section must not mention templates or any specific solution
- WHAT WE DO NOT KNOW must have at least 3 items
- DISCOVERY QUESTIONS must name the research method for each question

What you're learning: Practising the reframe on a request where the solution (templates) sounds obviously right — which makes it harder to resist including it in the problem statement. The discipline is the same regardless of how good the proposed solution sounds.

Prompt 2 — Adapt (change the context):

Take this feature request from a different product domain:

"Our e-commerce platform needs a recommendation engine. Customers
browse an average of 12 products but only purchase 1.2. A recommendation
engine would increase basket size."

Reframe this as a problem brief. The problem is NOT "we need a
recommendation engine" — the problem is the gap between browsing
behaviour and purchase behaviour. What do we not know about why
customers browse 12 items but buy 1.2?

What you're learning: Applying problem-focused framing outside the InsightFlow context. The principle is domain-independent: the PM's job is always to understand the problem before the solution, regardless of the product.

Prompt 3 — Apply (connect to your domain):

Think of a feature request you have received recently — from a
stakeholder, customer, or your own idea. Write it down as it was
originally phrased.

Now reframe it as a problem brief. Focus on:
1. What is the observable pain? (Not "users need X" — what are users
currently experiencing that is wrong?)
2. What evidence do you have? (Be honest — if you have none, say so)
3. What do you NOT know? (List at least 3 genuine unknowns)
4. What are the 3 discovery questions you would need to answer before
writing a spec?

Compare your problem brief to the original feature request. What
assumptions were hidden in the original phrasing that are now visible?

What you're learning: The ultimate test — applying problem-focused framing to your own work. The goal is to surface assumptions that were invisible in the original feature request but become obvious in the problem brief format.

Continue to Lesson 4: User Research — Interviews & Synthesis →