The PM Co-Pilot: From Messy Brief to Developer-Ready Spec

When the AI knows your codebase, it can structure your specs, surface edge cases, and write acceptance criteria that actually make sense. A workflow for spec-driven development without the three-hour grind.

If you build products, AI is a powerful way to augment your work. In the typical case, you explain a problem to the AI, and it helps you structure your thinking, asks you questions, and clarifies points that might not yet be clear.

But if the AI can also read your codebase and has the full context of the current platform, it can do much more: review the task or problem you are trying to solve against the actual code, and produce something very close to a spec-driven development document. It can even draft the expected outcome in a way that is coherent with the platform you already have.

The Problem: Vague Specs, Slow Feedback Loops

Most of the time, PMs write task cards that are incomplete or unclear. Developers read them and ask clarifying questions. The back-and-forth starts. Days get lost.

Here's what that looks like:

We need to capture and log UTM parameters when users register on our platform. Currently we run Meta ads campaigns, but will expand to other channels in the future like Google Ads and TikTok. When a user registers, extract any UTM parameters from the URL (utm_source, utm_medium, utm_campaign, utm_content, utm_term) and store them tied to that user's record in the database. For example, a user clicking a Meta ad might arrive with ?utm_source=meta&utm_medium=cpc&utm_campaign=summer_sale, and we need to log exactly those values. If no UTM parameters exist, such as direct traffic, organic search, or word-of-mouth, store null values for those fields. Expand the existing users table in MySQL to include columns for each UTM parameter.

The thinking is all there, but it's not structured. Developers will still ask:

  • What happens if the user comes back on a different device?

  • Do we track first-touch or last-touch attribution?

  • What if UTM params are partial?

The Solution: Template + AI Co-Piloting

Instead of spending hours refining that input yourself, you give it to an AI tool that knows your codebase. You ask it to structure the input into a task card using your template. When you hit uncertainty, like "what are the edge cases here?", you ask the AI to walk you through them.

OUTPUT (Structured Task Card):

Card Title: Capture UTM Parameters on User Registration

Description:

Capture and store UTM parameters when users register on the platform. This enables tracking of acquisition sources for marketing attribution (Meta Ads, Google Ads, TikTok, etc.).

UTM Parameters to Capture:

| Parameter | Description | Example |
|-----------|-------------|---------|
| utm_source | Traffic source | meta, google, tiktok |
| utm_medium | Marketing medium | cpc, email, social |
| utm_campaign | Campaign name | summer_sale, brand_awareness |
| utm_content | Ad/content variant | ad_v1, banner_blue |
| utm_term | Paid search keyword | wellness_music |

Example URL:

https://finetunes.app/landing?utm_source=meta&utm_medium=cpc&utm_campaign=summer_sale
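The extraction step can be sketched in a few lines. This is a minimal illustration, not the platform's actual implementation; it assumes a helper named `extract_utm_params` and uses Python's standard `urllib.parse`. Missing parameters come back as `None`, matching the "store null" behavior below.

```python
from urllib.parse import urlparse, parse_qs

# The five UTM fields the card asks us to capture.
UTM_KEYS = ("utm_source", "utm_medium", "utm_campaign", "utm_content", "utm_term")

def extract_utm_params(url: str) -> dict:
    """Return a dict of the five UTM fields; absent ones map to None."""
    query = parse_qs(urlparse(url).query)
    return {key: query.get(key, [None])[0] for key in UTM_KEYS}

params = extract_utm_params(
    "https://finetunes.app/landing?utm_source=meta&utm_medium=cpc&utm_campaign=summer_sale"
)
# params["utm_source"] == "meta"; params["utm_term"] is None
```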

Behavior:

| Scenario | Stored Values |
|----------|---------------|
| User arrives via Meta ad with UTM params | Store all UTM values |
| User arrives via direct traffic (no params) | Store null for all UTM fields |
| User arrives via organic search (no params) | Store null for all UTM fields |
| User arrives with partial UTM params | Store available values, null for missing |

Database Changes:

Add columns to users table:

| Column | Type | Nullable |
|--------|------|----------|
| utm_source | String | Yes |
| utm_medium | String | Yes |
| utm_campaign | String | Yes |
| utm_content | String | Yes |
| utm_term | String | Yes |
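As a rough sketch of the schema change, the snippet below adds the five nullable columns with plain `ALTER TABLE` statements. It uses an in-memory SQLite database so the example is self-contained; the card targets MySQL, where the equivalent statements apply, and the table layout here is hypothetical.

```python
import sqlite3

UTM_COLUMNS = ("utm_source", "utm_medium", "utm_campaign", "utm_content", "utm_term")

# In-memory stand-in for the production MySQL database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")

for column in UTM_COLUMNS:
    # Columns are nullable by default, so missing UTM data is stored as NULL.
    conn.execute(f"ALTER TABLE users ADD COLUMN {column} TEXT")

# Inspect the resulting schema: row[1] is the column name.
columns = [row[1] for row in conn.execute("PRAGMA table_info(users)")]
```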

Implementation Flow:

| Step | Action |
|------|--------|
| 1 | User lands on site with UTM params in URL |
| 2 | Store UTM params in session/local storage |
| 3 | User completes registration |
| 4 | Extract stored UTM params |
| 5 | Save UTM values to user record in database |
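The five steps above can be sketched as two handlers. This is an assumption-laden illustration: a plain dict stands in for session/local storage and for the database row, and the handler names are hypothetical; a real app would use its own session layer and ORM.

```python
from urllib.parse import urlparse, parse_qs

UTM_KEYS = ("utm_source", "utm_medium", "utm_campaign", "utm_content", "utm_term")

def on_landing(url: str, session: dict) -> None:
    """Steps 1-2: capture UTM params from the landing URL into the session."""
    query = parse_qs(urlparse(url).query)
    session["utm"] = {k: query.get(k, [None])[0] for k in UTM_KEYS}

def on_registration(email: str, session: dict) -> dict:
    """Steps 3-5: build the user record with the stored UTM values."""
    utm = session.get("utm") or {k: None for k in UTM_KEYS}
    return {"email": email, **utm}

session = {}
on_landing("https://finetunes.app/landing?utm_source=meta&utm_medium=cpc", session)
user = on_registration("ada@example.com", session)
# user["utm_source"] == "meta"; user["utm_campaign"] is None
```

Because the params live in the session until registration, a user who browses a few pages first still gets attributed, which is exactly the first edge case below.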

Edge Cases:

| Scenario | Handling |
|----------|----------|
| User navigates away before registering | Persist UTM params in session/local storage |
| UTM params overwritten by new visit | Keep first-touch or last-touch? |
| User registers on different device | UTM params lost (acceptable) |

Question: First-Touch or Last-Touch Attribution?

  • First-touch: Keep original UTM params even if user returns via different source
  • Last-touch: Overwrite with most recent UTM params

Acceptance Criteria:

  • Database schema updated with UTM columns
  • UTM params extracted from URL on landing
  • UTM params persisted until registration
  • UTM values saved to user record on registration
  • Null values stored when no UTM params present
  • Works for Meta, Google, TikTok, and any future ad sources

Why This Works

The card is now spec-driven and developer-ready. No ambiguity. Developers know exactly what to build and what to test. The edge cases are visible. Open questions are explicit.

And you didn't spend three hours perfecting it. You gave the AI your raw thinking, asked it to structure it, asked it to surface edge cases, and refined from there.

The Template

You can use this template for any task card. Fill it in yourself, or use it as a starting point with an AI tool. Either way, the structure forces clarity.

## **Task Card Template for Product/Design Specifications**

---

### **Card: [Title — Brief, Action-Oriented]**

**Description:**  
[1-2 sentences explaining what this card accomplishes and why.]

---

**Scope / Applies to:**

| Area | Included |
|------|----------|
| [Page/Component/Feature] | ✓ / ✗ |
| [Page/Component/Feature] | ✓ / ✗ |

---

**Current state (if redesign):**
- [What exists today]
- [Pain point or limitation]

---

**Proposed design / behavior:**

| Element | Description |
|---------|-------------|
| [Element name] | [What it does / how it looks] |
| [Element name] | [What it does / how it looks] |

---

**Visual mockup (ASCII or description):**
┌─────────────────────────┐
│ [Visual representation] │
│ [of the component]      │
└─────────────────────────┘


---

**Key behaviors / interactions:**

| Action | Result |
|--------|--------|
| [User action] | [What happens] |
| [User action] | [What happens] |

---

**States (if applicable):**

| State | Visual / Behavior |
|-------|-------------------|
| Default | [Description] |
| Active | [Description] |
| Disabled | [Description] |

---

**Design variations (for designer):**

| Variation | Description |
|-----------|-------------|
| A | [Option A details] |
| B | [Option B details] |

---

**Dependencies / Blocked by:**

| Feature/Card | Relationship |
|--------------|--------------|
| [Card name] | Blocked by / Blocks / Related |

---

**Questions / Open items:**
- [Any decisions still needed?]

---

**Acceptance criteria:**
- [ ] [Specific, testable requirement]
- [ ] [Specific, testable requirement]
- [ ] [Specific, testable requirement]

---

### **Usage notes:**

- **Keep it scannable** — Use tables over paragraphs
- **Be specific** — Avoid vague language like "improve" or "better"
- **Show, don't just tell** — ASCII mockups clarify intent
- **Separate concerns** — One card = one focused feature/change
- **Link dependencies** — Reference related cards by name
- **Designer vs Dev** — Clarify what's for design exploration vs fixed requirements

Getting Started

  • Try this with your next feature request
  • Pair it with an AI tool that can read your codebase
  • When you're unsure about something, ask the AI to walk you through it
  • Refine, and ship the card

It's not autopilot. It's co-piloting. And it works.