
Type-First Development

What if the compiler could tell you what to build next?

DOMAIN (contracts) → INFRASTRUCTURE (repos) → APPLICATION (logic) → PRESENTATION (UI)
        │                     │                      │                     │
        ▼                     ▼                      ▼                     ▼
    typecheck             typecheck              typecheck             typecheck
    green? ──────────→    green? ──────────→     green? ──────────→    green?  = DONE

Flow engineering tells you WHAT to build. Type-first development tells you HOW to build it — start at the domain, let TypeScript errors pull you outward layer by layer. The compiler becomes the methodology.

The Principle

Three ideas that compound:

Idea | What It Means
Domain-first | Start at the center. Contracts define what exists. Every other layer adapts.
Type-driven | Change a type, run typecheck. Red squiggles ARE your todo list.
Constraint satisfaction | This isn't creative design — it's satisfying constraints. Agents excel at this.

Why this matters for AI products: agents hallucinate architectures. Give them a concrete verification loop and they can't go wrong. Infinite search space becomes a deterministic algorithm.
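
As a minimal sketch of that loop (the names here are illustrative, not from any real codebase): a domain contract plus one outward construction site. Adding a required field to the contract instantly turns every construction site into a typecheck error, which is exactly the agent's next task.

```typescript
// Hypothetical domain contract; every field the compiler sees here becomes
// a constraint on every layer that constructs a DealInput.
interface DealInput {
  name: string;
  amount: number;
  // Add `stage: string` here and the construction site below goes red
  // until it supplies a stage. Red squiggles ARE the todo list.
}

// An outward layer constructing the contract from raw input:
const fromCsvRow = (row: string): DealInput => {
  const [name, amount] = row.split(",");
  return { name, amount: Number(amount) };
};
```

The search space collapses because the compiler enumerates the remaining work: there is no step where the agent must guess which files a contract change touches.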

Cost Escalation

Every layer further from the type system multiplies debugging cost by 10x.

CHEAPEST                                          MOST EXPENSIVE
    │                                                   │
    ▼                                                   ▼
TypeScript ──→ Zod Boundary ──→ PostgreSQL ──→ Production
   (0s)           (<1s)          (10+ min)     (hours + trust)

Real incident: plan-cli.ts piped phase objects with a missing phaseSlug.

Layer | What Happened | Cost
TypeScript | Red squiggle. PlanningPhaseInsert marks phaseSlug required. | 0s
Zod | "phaseSlug: Required" and exit. Schema already exported from Drizzle. | <1s
PostgreSQL | 23502: NOT NULL violation. No field name without reading docs. | 10+ min
Production | 500 error. "Something went wrong." Hotfix cycle. | Hours

Layer | You Get | You Must Do
TypeScript | Exact file, line, field | Nothing
Zod | Field name + message | Read the error
PostgreSQL | Error code + constraint | Map code to column to input to object
Production | Stack trace in logs | Find, reproduce, fix, deploy, notify

The validator existed. The type existed. One line was missing. For agents, the cost isn't time — it's signal. TypeScript gives the exact field to fix. PostgreSQL gives an opaque code the agent can't act on. Break the type boundary, break the agent's steering loop.

// What was there (cost: 10+ minutes)
const phases = JSON.parse(stdin) as Record<string, unknown>[];

// What should have been there (cost: 0)
const phases = phasesInputSchema.parse(JSON.parse(stdin));

Tracking Cost

The 10x multiplier is a claim. Track it.

Every boundary incident gets three fields:

Field | Example
Layer caught | PostgreSQL
Time to resolve | 12 minutes
Layer where it should have been caught | Zod (schema existed, wasn't connected)

Two things to measure:

Metric | Question | How
Does the 10x hold? | Is PostgreSQL really 10x more expensive than Zod? | Compare median time-to-resolve by layer
Are boundaries wired? | Do createInsertSchema exports have matching .parse() calls at every boundary? | grep -r 'createInsertSchema' --include='*.ts' -l vs grep -r '\.parse(' --include='*.ts' -l — schemas without consumers are validators nobody uses

The second grep is the one that catches the actual incident pattern — the validator exists but isn't connected. Track schema-coverage as a codebase health metric alongside test coverage.
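
The wiring check can also run in-process. A sketch (the schemaCoverage name and the Coverage shape are invented for illustration) that takes source text, counts schema exports against parse call sites, and flags validators nobody uses:

```typescript
// Result of scanning a set of TypeScript source strings.
interface Coverage {
  schemasExported: number; // createInsertSchema(...) occurrences
  parseCallSites: number;  // .parse( / .safeParse( occurrences
  suspicious: boolean;     // more schemas than parse sites = unused validators
}

const schemaCoverage = (sources: string[]): Coverage => {
  const count = (re: RegExp): number =>
    sources.reduce((n, src) => n + (src.match(re) ?? []).length, 0);
  const schemasExported = count(/createInsertSchema\(/g);
  // Note: this also counts JSON.parse(, so treat it as a smell detector,
  // not an exact audit.
  const parseCallSites = count(/\.(parse|safeParse)\(/g);
  return {
    schemasExported,
    parseCallSites,
    suspicious: schemasExported > parseCallSites,
  };
};
```

Feeding it real files (read with fs) is the only missing step; the point is that schema-coverage becomes a number a CI job can trend.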

See Cost of Quality for the enforcement-tier view.

The Algorithm

The Boris Rule (paraphrasing Boris Cherny): Give Claude a concrete verification loop and tell it to iterate until checks pass.

1. Make domain change (ports, entities, DTOs)
2. Run typecheck
3. Red? FIX IMMEDIATELY — never proceed with errors
4. Green? Move outward to next layer
5. Repeat until all layers green
6. All green = done

Never batch fixes across layers. Never proceed with red. This is constraint satisfaction, not design.

Pre-Flight

Before any change, answer four questions:

Question | What It Reveals
What outcome? (1-3 sentences) | Maps to Outcome Map
What binary measure makes it "done"? | Test, metric, or demo path
Which layer? (domain / infra / app / UI) | Where to start
Does a generator cover this? | Use it before hand-coding

The flow engineering maps answer the strategic questions. Pre-flight answers the tactical ones.

Layer Model

Dependencies point inward. Updates propagate outward.

┌─────────────────────────────────────────────────┐
│ PRESENTATION                                    │
│   ┌─────────────────────────────────────────┐   │
│   │ APPLICATION                             │   │
│   │   ┌─────────────────────────────────┐   │   │
│   │   │ INFRASTRUCTURE                  │   │   │
│   │   │   ┌─────────────────────────┐   │   │   │
│   │   │   │ DOMAIN                  │   │   │   │
│   │   │   │                         │   │   │   │
│   │   │   │ Ports, Entities         │   │   │   │
│   │   │   │ DTOs, Events            │   │   │   │
│   │   │   │                         │   │   │   │
│   │   │   └─────────────────────────┘   │   │   │
│   │   │ Repos, Adapters                 │   │   │
│   │   └─────────────────────────────────┘   │   │
│   │ Use Cases, Orchestrators                │   │
│   └─────────────────────────────────────────┘   │
│ Components, Actions, Routes                     │
└─────────────────────────────────────────────────┘

Layer | Contains | Rule
Domain | Ports, entities, DTOs, events | Source of truth. Never imports outward.
Infrastructure | Repositories, adapters | Implements domain ports. Only layer touching the database.
Application | Use cases, orchestrators | Composes infrastructure through ports. Business logic lives here.
Presentation | Components, actions, routes | Consumes application layer. Transforms contracts into views.

When a domain contract changes, the compiler lights up every file that needs updating — outward through infrastructure, application, presentation. Red squiggles are breadcrumbs.
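
A minimal sketch of the rule in code, with hypothetical names: the domain owns the port, infrastructure implements it, and any change to the Deal contract surfaces as typecheck errors inside every implementing adapter.

```typescript
// DOMAIN — entity and port. Imports nothing from outer layers.
interface Deal {
  id: string;
  amount: number;
}

interface DealRepository {
  findById(id: string): Promise<Deal | undefined>;
  save(deal: Deal): Promise<void>;
}

// INFRASTRUCTURE — implements the domain port. If Deal gains a required
// field, this adapter fails typecheck until it handles it.
class InMemoryDealRepository implements DealRepository {
  private store = new Map<string, Deal>();

  async findById(id: string): Promise<Deal | undefined> {
    return this.store.get(id);
  }

  async save(deal: Deal): Promise<void> {
    this.store.set(deal.id, deal);
  }
}
```

Because the application layer depends on DealRepository rather than a concrete class, swapping the in-memory adapter for a Postgres one touches only the composition root.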

Maps to Code

Each flow engineering map produces specific code artifacts:

Map | Code Artifacts | Layer
Outcome Map | Ports, DTOs, domain events | Domain
Value Stream Map | Use cases, repositories, adapters | Infrastructure + Application
Dependency Map | Composition roots, task ordering | Application + Presentation
Capability Map | Generators, skills, work charts | Platform
A&ID | Agent configs, instrument schemas | All layers

This is the bridge between pictures and products. The maps aren't documentation ABOUT the code — they produce the code.

Type Boundaries

Data transforms explicitly at each layer boundary:

┌──────────────┐   serialize()   ┌──────────────┐   map()   ┌──────────────┐
│    DOMAIN    │ ──────────────→ │   CONTRACT   │ ────────→ │     VIEW     │
│              │                 │              │           │              │
│ Rich types   │                 │ Wire-safe    │           │ UI-ready     │
│ Date objects │                 │ ISO strings  │           │ Formatted    │
│ Numbers      │                 │ Strings      │           │ Numbers      │
└──────────────┘                 └──────────────┘           └──────────────┘

Boundary | Transformation | Why
Domain to Contract | Date becomes string, number may become string | JSON safety, precision
Contract to View | string becomes number, dates formatted for display | UI consumption

Each transformation is an explicit function. No implicit coercion. The compiler catches every mismatch.
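
A sketch of one such boundary function, using the same optional-amount Deal shape as the example below plus an assumed closedAt date field:

```typescript
// DOMAIN — rich types.
interface Deal {
  amount?: number;
  closedAt?: Date;
}

// CONTRACT — wire-safe. Dates as ISO strings, amounts as strings.
interface SerializedDeal {
  amount: string | undefined;
  closedAt: string | undefined; // ISO 8601
}

// Domain → Contract: every conversion is visible. Nothing coerces implicitly,
// so the compiler flags any field the mapper forgets.
const serializeDeal = (d: Deal): SerializedDeal => ({
  amount: d.amount !== undefined ? String(d.amount) : undefined,
  closedAt: d.closedAt?.toISOString(),
});
```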

The Trap

// Domain: amount is a number
interface Deal {
  amount?: number;
}

// Contract: amount becomes a string (precision)
interface SerializedDeal {
  amount: string | undefined;
}

// View: amount is a number again (for calculations)
interface DealView {
  amount?: number;
}

// The mapper that makes it safe
const toDealView = (s: SerializedDeal): DealView => ({
  amount: s.amount ? Number(s.amount) : undefined,
});

Without the mapper, you assign a string to a number. TypeScript catches it. Without TypeScript, your UI silently displays "150000" where it should calculate 150000. The type boundary IS the safety net.

The as Trap

as tells the compiler "trust me" at the exact boundary where trust should be zero. It does not validate, transform, or check — it tells the compiler to stop looking. JSON.parse returns any, and as paints a type-shaped lie on top.

// WRONG — compiler trusts you, runtime does not
const data = JSON.parse(stdin) as PhaseInput[];

// WRONG — same lie told twice
const data = JSON.parse(stdin) as unknown as PhaseInput[];

// WRONG — honest about lying, still no validation
const data: PhaseInput[] = JSON.parse(stdin);

// CORRECT — Zod validates at the boundary
const data = phasesInputSchema.parse(JSON.parse(stdin));

Pattern | Verdict
JSON.parse(x) as Type | Boundary violation. Replace with Zod.
JSON.parse(x) as Record<string, unknown>[] | Boundary violation. Generic type gives false safety.
request.json() as Type | Boundary violation. Validate in server action or API route.
as any on external data | Boundary violation. Silences every downstream check.
as unknown as Type | Suspicious. Usually masking a shape mismatch.
schema.parse(input) | Correct. Throws on invalid input with field-level errors.
schema.safeParse(input) | Correct. Returns { success, data, error } for graceful handling.

Every external data boundary gets a Zod schema. No as casts. An agent running the type-first algorithm needs the compiler as its oracle — as silences the oracle at the one boundary where it matters most.

Diagnosis

When type errors surface, trace the boundary:

Step | Question | Action
1 | Where is the boundary? | Domain to Contract? Contract to View? Action to Hook?
2 | Is a transformation missing? | Date to string? number to string?
3 | Is the contract type wrong? | Does SerializedX match what the serialize function returns?
4 | Is the consumer wrong? | Is the component expecting domain types instead of contracts?
5 | Is there a missing mapper? | Does a transformation exist between contract and view?

80% of type errors at layer boundaries are serialization mismatches. Check the boundaries first.

The Pit of Success

Falling into the Pit of Success means designing systems where doing the right thing is the path of least resistance — correct behavior is what you land on by default.

Types don't just guide implementation. They produce test specs. A Deal type with amount: number and stage: DealStage simultaneously tells:

  • The compiler: this component must accept a number, not a string
  • The test: this function must return a Deal with a valid stage
Types ──→ Test Specs ──→ Implementation
  │           │               │
  │           │               └─ Compiler catches mismatches (CAN'T ignore errors)
  │           │
  │           └─ Failing tests define "done" (CAN'T ship without green)
  │
  └─ Domain contracts ARE acceptance criteria (CAN'T avoid specifying success)

Generators make this automatic. You can't skip domain types because the generator won't run without a schema. You can't skip tests because the plan template gates on them. The system makes correct execution the only comfortable path.

Types Drive Test Specs

When flow maps produce domain contracts, those contracts become two things at once:

  1. Type definitions — the compiler's todo list
  2. Test expectations — the spec's acceptance criteria

UI fixtures prove this works: deterministic data that mirrors domain types exactly. Same shape for demos, tests, and production. If a fixture type doesn't match the domain type, that's a bug. The fixture IS the test expectation. The domain type IS the acceptance criteria.
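
A sketch of the pattern with a hypothetical Deal type (mirroring the amount: number and stage: DealStage fields mentioned above): annotating the fixture with the domain type makes fixture drift a typecheck failure, and the same type feeds the test expectation.

```typescript
// DOMAIN type — source of truth AND acceptance criteria.
type DealStage = "lead" | "qualified" | "closed";

interface Deal {
  id: string;
  amount: number;
  stage: DealStage;
}

// FIXTURE — deterministic data annotated with the domain type.
// If Deal changes shape, this declaration fails typecheck before any test runs.
export const dealFixtures: Deal[] = [
  { id: "deal-1", amount: 150000, stage: "qualified" },
  { id: "deal-2", amount: 25000, stage: "lead" },
];

// TEST EXPECTATION — derived from the same type. amount is a number,
// so summing is safe; a string fixture would never compile.
const totalPipeline = (deals: Deal[]): number =>
  deals.reduce((sum, d) => sum + d.amount, 0);
```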

Domain Types (source of truth)
        ▼
Fixtures mirror these (test data)
        ▼
UI Components consume fixtures (demos + tests)
        ▼
Server Actions return real data with same shape (production)

Generator-First

Never hand-code what a generator can scaffold. Generators enforce correct layer order automatically — domain first, then infrastructure, then application, then presentation. The generator is a capability map turned into a tool.

When a pattern occurs more than twice, it becomes a generator. When a generator exists, using it is mandatory. This is how capabilities compound — codified knowledge replacing manual effort.

Drizzle Schemas

If you use Drizzle ORM, runtime validators for every table already exist — generated from the table definition.

export const planningPhase = pgTable("planning_phases", {
  id: uuid("id")
    .primaryKey()
    .default(sql`gen_random_uuid()`),
  planId: uuid("plan_id")
    .references(() => planningPlan.id)
    .notNull(),
  phaseSlug: varchar("phase_slug", { length: 255 }).notNull(),
  name: varchar("name", { length: 255 }).notNull(),
  orderIndex: integer("order_index").notNull(),
});

// Generated automatically — phaseSlug required, planId required, id optional:
export const insertPlanningPhaseSchema = createInsertSchema(planningPhase);

The connection that was missing:

// Schema file exported this:
export const insertPlanningPhaseSchema = createInsertSchema(planningPhase);

// CLI file bypassed it:
const phases = JSON.parse(stdin) as Record<string, unknown>[];

The pgTable() call generates both the TypeScript type AND the Zod validator. Same source of truth. The only step is connecting them at the boundary:

const validated = insertPlanningPhaseSchema.parse(rawInput);

The Compounding Loop

Each entity commissioned improves the generator for the next:

Entity 1: Generator v1 → 60% correct, 40% manual fixes → improve generator
Entity 2: Generator v2 → 80% correct, 20% manual fixes → improve generator
Entity 3: Generator v3 → 95% correct, 5% edge cases → improve generator
Entity N: Generator vN → approaching 100% correct output

This only works if the improvement step is structural — baked into every plan template as a mandatory task. "Review generated output. If manual fixes were needed, update the generator template. Commit the improvement before marking the plan complete."

Documented-but-skippable doesn't compound. Enforced-by-template does.

Plan Template Enforcement

The full sequence, enforced by plan templates:

Phase | What | Gate
Explore | Map the value flow, understand domain | Contracts defined
Types | Encode contracts as TypeScript | Typecheck green
Test Specs | Generate failing tests from types | Tests exist and fail
Build | Implement to pass tests, use generators | Tests green
Validate | Outcomes match what exploration predicted | Spec expectations met
Improve | Fold manual fixes back into generators | Generator template updated

The template IS the methodology. When the plan template enforces explore-first and test-spec-before-implementation, agents fall into the pit of success without needing to understand the theory.

Evidence: Before and After Templates

The difference between ad-hoc plans and template-derived plans:

Dimension | Without Templates | With Templates
Task origin | Written from scratch each time | Derived from composable templates with source lines
TDD | Optional, usually skipped | Enforced RED→GREEN sequence in specific tasks
Testing depth | "Add tests" (vague) | Three-tier pattern: unit (T1), integration (T2), e2e (T3)
UI quality gates | None | CDD file limits, data-testid conventions, Page Object Models
Security | Afterthought | Security test triad: positive + negative + DB verification
Assertion quality | "Assert it works" | Assertion levels: value check + DB state verification
Proof of completion | Trust the developer | Proof commands that verify outcomes mechanically
Retrospective | Optional post-mortem | VVFL retrospective task baked into every plan
Cross-team composition | Single author | Tasks routed to specialist teams (meta, intel, UI, PE)

Same 16 tasks. Same feature. The template version enforces every gate structurally — agents can't skip what the template requires. The ad-hoc version relies on the agent knowing and remembering every practice. Under load, memory fails. Templates don't.

Checklists

Before Work

  • Outcome and binary success measure defined
  • Relevant flow map identified
  • Primary layer identified (domain / infra / app / presentation)
  • Existing generators and patterns checked
  • External data boundaries identified (stdin, API, webhooks)
  • Test expectations defined from domain types

During Work

  • Domain changes first — no domain file imports outward
  • Test spec written before implementation (failing test = spec)
  • TypeScript errors used as breadcrumbs, layer by layer
  • Public contracts in domain, internal validation colocated with consumers

Before Commit

  • All touched layers pass typecheck
  • All tests green — outcomes match spec expectations
  • No any or @ts-ignore added to silence errors
  • No as casts on external data boundaries — Zod validates instead
  • If a pattern repeated, generator considered
  • If generated code needed manual fixes, generator improved
  • Relevant flow maps updated with new knowledge
