
Type-First Development

What if the compiler could tell you what to build next?

DOMAIN (contracts) → INFRASTRUCTURE (repos) → APPLICATION (logic) → PRESENTATION (UI)
        │                     │                      │                     │
        ▼                     ▼                      ▼                     ▼
    typecheck             typecheck              typecheck             typecheck
    green? ──────────────→ green? ──────────────→ green? ─────────────→ green? = DONE

Flow engineering tells you WHAT to build. Type-first development tells you HOW to build it — start at the domain, let TypeScript errors pull you outward layer by layer. The compiler becomes the methodology.

The Principle

Three ideas that compound:

| Idea | What It Means |
| --- | --- |
| Domain-first | Start at the center. Contracts define what exists. Every other layer adapts. |
| Type-driven | Change a type, run typecheck. Red squiggles ARE your todo list. |
| Constraint satisfaction | This isn't creative design — it's satisfying constraints. Agents excel at this. |

Composer Model

The type system is the conductor. The engineer is the composer — writes the score (domain contracts), then the type system conducts agents through it.

"Sketch out type signatures first and fill in values later." — Boris Cherny, Programming TypeScript

The composer writes the score. The conductor (compiler) ensures every instrument plays the right notes. The performers (agents) fill in the values. The engineer's primary contribution is not code — it is domain judgment.

This distinction matters. Domain contracts (Layer 1) are design — the hardest, most judgment-intensive work. Everything outward from the domain is constraint satisfaction. Agents excel at constraint satisfaction. Humans excel at domain design. Type-first draws the line: you bring the domain knowledge, the type system handles everything outward.
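Cherny's advice can be sketched directly: write the port's signatures first, then let the compiler drive the fill-in. The names below (Deal, DealRepository, InMemoryDealRepository) are illustrative, not from the source.

```typescript
type DealStage = "lead" | "qualified" | "won";

interface Deal {
  id: string;
  stage: DealStage;
  amount?: number;
}

// The score: a domain port, signatures only.
interface DealRepository {
  findByStage(stage: DealStage): Promise<Deal[]>;
}

// The performance: an adapter that fills in the values. If the port
// changes, the compiler flags this class until it conforms again.
class InMemoryDealRepository implements DealRepository {
  constructor(private readonly deals: Deal[]) {}

  async findByStage(stage: DealStage): Promise<Deal[]> {
    return this.deals.filter((d) => d.stage === stage);
  }
}
```

Deleting `findByStage` from the port, or changing its signature, turns the adapter red: the type error is the todo item.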

The fastest teams don't ask for code. They constrain the task, persist project context, use short cycles, and force verification. The goal is to reduce randomness — fewer hallucinations, fewer unintended refactors, more predictable iterations.

Why this matters for AI products: agents hallucinate architectures. Give them a concrete verification loop and they can't go wrong. Infinite search space becomes a deterministic algorithm.

Cost Escalation

Every layer further from the type system multiplies debugging cost by 10x.

CHEAPEST                                                      MOST EXPENSIVE
    │                                                                │
    ▼                                                                ▼
TypeScript ──→ Zod Boundary ──→ PostgreSQL ──→ Production
   (0s)            (<1s)         (10+ min)     (hours + trust)

Real incident: plan-cli.ts piped phase objects with a missing phaseSlug.

| Layer | What Happened | Cost |
| --- | --- | --- |
| TypeScript | Red squiggle. PlanningPhaseInsert marks phaseSlug required. | 0s |
| Zod | phaseSlug: Required and exit. Schema already exported from Drizzle. | <1s |
| PostgreSQL | 23502: NOT NULL violation. No field name without reading docs. | 10+ min |
| Production | 500 error. "Something went wrong." Hotfix cycle. | Hours |

| Layer | You Get | You Must Do |
| --- | --- | --- |
| TypeScript | Exact file, line, field | Nothing |
| Zod | Field name + message | Read the error |
| PostgreSQL | Error code + constraint | Map code to column to input to object |
| Production | Stack trace in logs | Find, reproduce, fix, deploy, notify |

The validator existed. The type existed. One line was missing. For agents, the cost isn't time — it's signal. TypeScript gives the exact field to fix. PostgreSQL gives an opaque code the agent can't act on. Break the type boundary, break the agent's steering loop.

// What was there (cost: 10+ minutes)
const phases = JSON.parse(stdin) as Record<string, unknown>[];

// What should have been there (cost: 0)
const phases = phasesInputSchema.parse(JSON.parse(stdin));

Tracking Cost

The 10x multiplier is a claim. Track it.

Every boundary incident gets three fields:

| Field | Example |
| --- | --- |
| Layer caught | PostgreSQL |
| Time to resolve | 12 minutes |
| Layer it should have been caught | Zod (schema existed, wasn't connected) |

Two things to measure:

| Metric | Question | How |
| --- | --- | --- |
| Does the 10x hold? | Is PostgreSQL really 10x more expensive than Zod? | Compare median time-to-resolve by layer |
| Are boundaries wired? | Do createInsertSchema exports have matching .parse() calls at every boundary? | grep -r 'createInsertSchema' --include='*.ts' -l vs grep -r '\.parse(' --include='*.ts' -l — schemas without consumers are validators nobody uses |

The second grep is the one that catches the actual incident pattern — the validator exists but isn't connected. Track schema-coverage as a codebase health metric alongside test coverage.
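The same coverage check can be sketched as a pure function over file contents, which is easier to wire into CI than chained greps. This is a hypothetical helper, not an existing tool, and the regexes assume the `export const X = createInsertSchema(...)` pattern shown later in this page.

```typescript
// Schemas that are exported but never parsed anywhere are the incident
// pattern: validators nobody uses.
const findUnconsumedSchemas = (files: Record<string, string>): string[] => {
  const exported = new Set<string>();
  const consumed = new Set<string>();
  const exportRe = /export const (\w+) = createInsertSchema\(/g;
  const parseRe = /(\w+)\.(?:safeParse|parse)\(/g;

  for (const content of Object.values(files)) {
    for (const m of content.matchAll(exportRe)) exported.add(m[1]);
    for (const m of content.matchAll(parseRe)) consumed.add(m[1]);
  }

  // Exported but never consumed: the boundary is not wired.
  return [...exported].filter((name) => !consumed.has(name));
};
```

Schema-coverage then becomes a number: unconsumed schemas divided by exported schemas, trending toward zero.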

See Cost of Quality for the enforcement-tier view.

The Algorithm

The Boris Rule (Boris Cherny): "The most important thing to get great results — give the agent a way to verify its work. If it has that feedback loop, it will 2-3x the quality of the final result."

1. Make domain change (ports, entities, DTOs)
2. Run typecheck
3. Red? FIX IMMEDIATELY — never proceed with errors
4. Green? Move outward to next layer
5. Repeat until all layers green
6. All green = done

Never batch fixes across layers. Never proceed with red. Layer 1 is design — the domain contracts require judgment. Layers 2-4 are constraint satisfaction — the compiler tells you what to do next.

Pre-Flight

Before any change, answer four questions:

| Question | What It Reveals |
| --- | --- |
| What outcome? (1-3 sentences) | Maps to Outcome Map |
| What binary measure makes it "done"? | Test, metric, or demo path |
| Which layer? (domain / infra / app / UI) | Where to start |
| Does a generator cover this? | Use it before hand-coding |

The flow engineering maps answer the strategic questions. Pre-flight answers the tactical ones.

Layer Model

Dependencies point inward. Updates propagate outward.

┌─────────────────────────────────────────────────┐
│ PRESENTATION │
│ ┌─────────────────────────────────────────┐ │
│ │ APPLICATION │ │
│ │ ┌─────────────────────────────────┐ │ │
│ │ │ INFRASTRUCTURE │ │ │
│ │ │ ┌─────────────────────────┐ │ │ │
│ │ │ │ DOMAIN │ │ │ │
│ │ │ │ │ │ │ │
│ │ │ │ Ports, Entities │ │ │ │
│ │ │ │ DTOs, Events │ │ │ │
│ │ │ │ │ │ │ │
│ │ │ └─────────────────────────┘ │ │ │
│ │ │ Repos, Adapters │ │ │
│ │ └─────────────────────────────────┘ │ │
│ │ Use Cases, Orchestrators │ │
│ └─────────────────────────────────────────┘ │
│ Components, Actions, Routes │
└─────────────────────────────────────────────────┘

| Layer | Contains | Rule |
| --- | --- | --- |
| Domain | Ports, entities, DTOs, events | Source of truth. Never imports outward. |
| Infrastructure | Repositories, adapters | Implements domain ports. Only layer touching the database. |
| Application | Use cases, orchestrators | Composes infrastructure through ports. Business logic lives here. |
| Presentation | Components, actions, routes | Consumes application layer. Transforms contracts into views. |

When a domain contract changes, the compiler lights up every file that needs updating — outward through infrastructure, application, presentation. Red squiggles are breadcrumbs.
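One way to make that propagation mechanical: an exhaustive switch in an outer layer fails to compile the moment the domain gains a new variant. The event names below are illustrative.

```typescript
type DomainEvent =
  | { kind: "deal.created"; dealId: string }
  | { kind: "deal.closed"; dealId: string; amount: number };

// A presentation-layer consumer. Adding a third event kind makes the
// `never` assignment below a compile error: a breadcrumb, not a runtime bug.
const renderEvent = (e: DomainEvent): string => {
  switch (e.kind) {
    case "deal.created":
      return `Deal ${e.dealId} created`;
    case "deal.closed":
      return `Deal ${e.dealId} closed at ${e.amount}`;
    default: {
      const exhaustive: never = e;
      throw new Error(`Unhandled event: ${JSON.stringify(exhaustive)}`);
    }
  }
};
```

Add `{ kind: "deal.reopened"; dealId: string }` to DomainEvent and the compiler lights up every such switch, outward through infrastructure, application, presentation.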

Maps to Code

Each flow engineering map produces specific code artifacts:

| Map | Code Artifacts | Layer |
| --- | --- | --- |
| Outcome Map | Ports, DTOs, domain events | Domain |
| Value Stream Map | Use cases, repositories, adapters | Infrastructure + Application |
| Dependency Map | Composition roots, task ordering | Application + Presentation |
| Capability Map | Generators, skills, work charts | Platform |
| A&ID | Agent configs, instrument schemas | All layers |

This is the bridge between pictures and products. The maps aren't documentation ABOUT the code — they produce the code.

Type Boundaries

Data transforms explicitly at each layer boundary:

┌──────────────┐   serialize()   ┌──────────────┐    map()    ┌──────────────┐
│    DOMAIN    │ ──────────────→ │   CONTRACT   │ ──────────→ │     VIEW     │
│              │                 │              │             │              │
│  Rich types  │                 │  Wire-safe   │             │   UI-ready   │
│ Date objects │                 │ ISO strings  │             │  Formatted   │
│   Numbers    │                 │   Strings    │             │   Numbers    │
└──────────────┘                 └──────────────┘             └──────────────┘

| Boundary | Transformation | Why |
| --- | --- | --- |
| Domain to Contract | Date becomes string, number may become string | JSON safety, precision |
| Contract to View | string becomes number, dates formatted for display | UI consumption |

Each transformation is an explicit function. No implicit coercion. The compiler catches every mismatch.
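The Domain to Contract leg can be sketched as one explicit function. The Deal fields here are illustrative, not from the source.

```typescript
interface Deal {
  id: string;
  closedAt: Date;   // rich domain type
  amount: number;
}

interface SerializedDeal {
  id: string;
  closedAt: string; // ISO string: wire-safe
  amount: string;   // string: precision survives JSON
}

// One explicit function per boundary; no implicit coercion.
const serializeDeal = (d: Deal): SerializedDeal => ({
  id: d.id,
  closedAt: d.closedAt.toISOString(),
  amount: String(d.amount),
});
```

If Deal gains a field, serializeDeal fails typecheck until the contract decides what happens to it on the wire.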

The Trap

// Domain: amount is a number
interface Deal {
  amount?: number;
}

// Contract: amount becomes a string (precision)
interface SerializedDeal {
  amount: string | undefined;
}

// View: amount is a number again (for calculations)
interface DealView {
  amount?: number;
}

// The mapper that makes it safe
const toDealView = (s: SerializedDeal): DealView => ({
  amount: s.amount ? Number(s.amount) : undefined,
});

Without the mapper, you assign a string to a number. TypeScript catches it. Without TypeScript, your UI silently displays "150000" where it should calculate 150000. The type boundary IS the safety net.

The as Trap

as tells the compiler "trust me" at the exact boundary where trust should be zero. It does not validate, transform, or check — it tells the compiler to stop looking. JSON.parse returns any, and as paints a type-shaped lie on top.

// WRONG — compiler trusts you, runtime does not
const data = JSON.parse(stdin) as PhaseInput[];

// WRONG — same lie told twice
const data = JSON.parse(stdin) as unknown as PhaseInput[];

// WRONG — honest about lying, still no validation
const data: PhaseInput[] = JSON.parse(stdin);

// CORRECT — Zod validates at the boundary
const data = phasesInputSchema.parse(JSON.parse(stdin));

| Pattern | Verdict |
| --- | --- |
| JSON.parse(x) as Type | Boundary violation. Replace with Zod. |
| JSON.parse(x) as Record<string, unknown>[] | Boundary violation. Generic type gives false safety. |
| request.json() as Type | Boundary violation. Validate in server action or API route. |
| as any on external data | Boundary violation. Silences every downstream check. |
| as unknown as Type | Suspicious. Usually masking a shape mismatch. |
| schema.parse(input) | Correct. Throws on invalid input with field-level errors. |
| schema.safeParse(input) | Correct. Returns { success, data, error } for graceful handling. |

Every external data boundary gets a Zod schema. No as casts. An agent running the type-first algorithm needs the compiler as its oracle — as silences the oracle at the one boundary where it matters most.

Diagnosis

When type errors surface, trace the boundary:

| Step | Question | Action |
| --- | --- | --- |
| 1 | Where is the boundary? | Domain to Contract? Contract to View? Action to Hook? |
| 2 | Is a transformation missing? | Date to string? number to string? |
| 3 | Is the contract type wrong? | Does SerializedX match what the serialize function returns? |
| 4 | Is the consumer wrong? | Is the component expecting domain types instead of contracts? |
| 5 | Is there a missing mapper? | Does a transformation exist between contract and view? |

80% of type errors at layer boundaries are serialization mismatches. Check the boundaries first.

The Pit of Success

Fall into the Pit of Success: design systems where doing the right thing is the path of least resistance.

Types don't just guide implementation. They produce test specs. A Deal type with amount: number and stage: DealStage simultaneously tells:

  • The compiler: this component must accept a number, not a string
  • The test: this function must return a Deal with a valid stage
Types ──→ Test Specs ──→ Implementation
  │           │                │
  │           │                └─ Compiler catches mismatches (CAN'T ignore errors)
  │           │
  │           └─ Failing tests define "done" (CAN'T ship without green)
  │
  └─ Domain contracts ARE acceptance criteria (CAN'T avoid specifying success)

Generators make this automatic. You can't skip domain types because the generator won't run without a schema. You can't skip tests because the plan template gates on them. The system makes correct execution the only comfortable path.

Types Drive Tests

When flow maps produce domain contracts, those contracts become two things at once:

  1. Type definitions — the compiler's todo list
  2. Test expectations — the spec's acceptance criteria

UI fixtures prove this works: deterministic data that mirrors domain types exactly. Same shape for demos, tests, and production. If a fixture type doesn't match the domain type, that's a bug. The fixture IS the test expectation. The domain type IS the acceptance criteria.

Domain Types (source of truth)
        ↓
Fixtures mirror these (test data)
        ↓
UI Components consume fixtures (demos + tests)
        ↓
Server Actions return real data with same shape (production)
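The mirroring can be enforced mechanically with the `satisfies` operator (TypeScript 4.9+). The Deal shape and fixture values below are illustrative.

```typescript
type DealStage = "lead" | "qualified" | "won";

interface Deal {
  id: string;
  stage: DealStage;
  amount?: number;
}

// `satisfies` enforces the domain contract while keeping the literal
// types of the fixture values. If Deal gains a required field, or a
// stage name changes, this fixture fails typecheck.
const dealFixtures = [
  { id: "fixture-1", stage: "qualified", amount: 150000 },
  { id: "fixture-2", stage: "lead" },
] satisfies Deal[];
```

The same array seeds demos, feeds tests, and documents the expected production shape; any drift between fixture and domain type is a compile error, not a runtime surprise.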

Generator-First

Never hand-code what a generator can scaffold. Generators enforce correct layer order automatically — domain first, then infrastructure, then application, then presentation. The generator is a capability map turned into a tool.

When a pattern occurs more than twice, it becomes a generator. When a generator exists, using it is mandatory. This is how capabilities compound — codified knowledge replacing manual effort.

Drizzle Schemas

If you use Drizzle ORM, runtime validators for every table already exist — generated from the table definition.

export const planningPhase = pgTable("planning_phases", {
  id: uuid("id")
    .primaryKey()
    .default(sql`gen_random_uuid()`),
  planId: uuid("plan_id")
    .references(() => planningPlan.id)
    .notNull(),
  phaseSlug: varchar("phase_slug", { length: 255 }).notNull(),
  name: varchar("name", { length: 255 }).notNull(),
  orderIndex: integer("order_index").notNull(),
});

// Generated automatically — phaseSlug required, planId required, id optional:
export const insertPlanningPhaseSchema = createInsertSchema(planningPhase);

The connection that was missing:

// Schema file exported this:
export const insertPlanningPhaseSchema = createInsertSchema(planningPhase);

// CLI file bypassed it:
const phases = JSON.parse(stdin) as Record<string, unknown>[];

The pgTable() call generates both the TypeScript type AND the Zod validator. Same source of truth. The only step is connecting them at the boundary:

const validated = insertPlanningPhaseSchema.parse(rawInput);

Compound Value

Types are the reuse mechanism. A component library where every export has a typed contract becomes more valuable with each consumer.

LIB: Define port (ButtonProps, ColumnDef<TData>)
↓ implements
PROVING GROUND: Demonstrate the component works (design system app)
↓ imports
APP A: Consumes the typed component (admin portal)
↓ imports
APP B: Same component, different data (customer app)
↓ imports
APP N: Each new consumer adds constraints

Four mechanisms make this compound:

| Mechanism | What Happens | Why Types Matter |
| --- | --- | --- |
| Bug fixes propagate | Fix in one consumer benefits all | Type contract guarantees the fix doesn't break others |
| Type narrowing accumulates | Each consumer forces more precise types | Generics widen capability without breaking backward compatibility |
| Testing surface compounds | N apps = N sets of integration tests | Type contract guarantees all usages are valid |
| Constraints stabilize | More consumers = more constraints on the API | The compiler enforces every constraint across the monorepo |

The formula: Component Value = Capability x Consumers x Type Safety x Boundary Integrity. Without type safety, adding consumers increases fragility. With type safety, adding consumers increases stability. Without boundary integrity — types exist but aren't connected — the formula collapses. A schema nobody calls is a validator nobody uses.

This is why the component library gets built in libs first and proven in a design system app before any production app consumes it. The proving ground verifies the typed contract works. Production apps then import with confidence — the port is proven, the adapter is theirs.

The type export is the durable artifact. Components get rewritten. Styles change. Frameworks migrate. But ButtonProps, ColumnDef<TData>, ListboxState<TValue> — these persist. The type is the port in hexagonal architecture. The component is the adapter. The consuming app is the driving adapter. More consumers means more constraints on the port, which makes it more valuable.
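A minimal sketch of that split, using ButtonProps as the port. The render functions are framework-free stand-ins for components, purely for illustration.

```typescript
// The port: the durable artifact. Consumers depend on this, not on
// any particular implementation.
interface ButtonProps {
  label: string;
  variant: "primary" | "secondary";
}

// Two adapters, one port. Either render function can be rewritten or
// replaced wholesale; ButtonProps is the contract that persists.
const renderHtmlButton = (p: ButtonProps): string =>
  `<button class="btn-${p.variant}">${p.label}</button>`;

const renderTextButton = (p: ButtonProps): string =>
  `[${p.variant === "primary" ? "*" : " "}] ${p.label}`;
```

Swapping adapters changes nothing for consumers, because every consumer was typed against the port.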

The Compounding Loop

Each entity commissioned improves the generator for the next:

Entity 1: Generator v1 → 60% correct, 40% manual fixes → improve generator
Entity 2: Generator v2 → 80% correct, 20% manual fixes → improve generator
Entity 3: Generator v3 → 95% correct, 5% edge cases → improve generator
Entity N: Generator vN → approaching 100% correct output

This only works if the improvement step is structural — baked into every plan template as a mandatory task. "Review generated output. If manual fixes were needed, update the generator template. Commit the improvement before marking the plan complete."

Documented-but-skippable doesn't compound. Enforced-by-template does.

Plan Template Enforcement

The full sequence, enforced by plan templates:

| Phase | What | Trophy Layer | Gate |
| --- | --- | --- | --- |
| Explore | Map the value flow, understand domain | | Contracts defined |
| Types | Encode contracts as TypeScript | L0 | Typecheck green |
| Test Specs | Generate failing tests from types | L1 → L2 | Tests exist and fail (L1 for schemas, L2 for server actions, L3 only if browser-dependent) |
| Build | Implement to pass tests, use generators | L1 → L2 | Tests green |
| Validate | Outcomes match what exploration predicted | L3 → L4 | Spec expectations met |
| Improve | Fold manual fixes back into generators | | Generator template updated |

The template IS the methodology. When the plan template enforces explore-first and test-spec-before-implementation, agents fall into the pit of success without needing to understand the theory.

Template Evidence

The difference between ad-hoc plans and template-derived plans:

| Dimension | Without Templates | With Templates |
| --- | --- | --- |
| Task origin | Written from scratch each time | Derived from composable templates with source lines |
| TDD | Optional, usually skipped | Enforced RED→GREEN sequence in specific tasks |
| Testing depth | "Add tests" (vague) | Three-tier pattern: unit (T1), integration (T2), e2e (T3) |
| UI quality gates | None | CDD file limits, data-testid conventions, Page Object Models |
| Security | Afterthought | Security test triad: positive + negative + DB verification |
| Assertion quality | "Assert it works" | Assertion levels: value check + DB state verification |
| Proof of completion | Trust the developer | Proof commands that verify outcomes mechanically |
| Retrospective | Optional post-mortem | VVFL retrospective task baked into every plan |
| Cross-team composition | Single author | Tasks routed to specialist teams (meta, intel, UI, PE) |
Same 16 tasks. Same feature. The template version enforces every gate structurally — agents can't skip what the template requires. The ad-hoc version relies on the agent knowing and remembering every practice. Under load, memory fails. Templates don't.

Component Build Order

The type-first methodology turned into a build order for reusable components. Six steps, each producing artifacts that the next step consumes.

| Step | What | What It Actually Is |
| --- | --- | --- |
| 1. Types | Define data shapes | Demand specification — what does the consumer need? |
| 2. Fixtures | Demo and seed data | Proof the spec is satisfiable — can real data fit this shape? |
| 3. Functions | Pure transform logic | Routing intelligence — transforms between shapes |
| 4. Components | Build in correct lib | Adapters — ports get their visual implementation |
| 5. Page Data | Assembly file, zero rendering | Composition root — the connection point |
| 6. Page | Composition only | What the user sees — composes from steps 1-5 |

Steps 1-4 live in shared libraries. Steps 5-6 live in applications. Step 4 is the boundary where shared infrastructure meets application-specific needs. A new application gets steps 1-4 for free and only writes steps 5-6.

Fixtures Before Components

The move most teams skip. Forcing understanding of the data shape before touching UI means the component contract is already proven before a single element gets written. You can't build the component wrong because you already hold the data it consumes.

Page Data Seam

The page data file (step 5) creates an explicit seam between "what data does this page need" and "how does this page render." That seam is where an agent operates autonomously — types constrain the assembly, component contracts constrain the render. Both sides verified by the compiler.
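The seam can be sketched as two functions with a shared contract between them. The names (DealsPageData, getDealsPageData, renderDealsPage) are hypothetical, and the renderer is a framework-free stand-in.

```typescript
interface DealSummary {
  id: string;
  name: string;
}

// "What data does this page need": the page-data contract, zero rendering.
interface DealsPageData {
  heading: string;
  deals: DealSummary[];
}

// The assembly file (step 5): fetches and shapes data, stubbed here
// with fixture values.
const getDealsPageData = async (): Promise<DealsPageData> => ({
  heading: "Open Deals",
  deals: [{ id: "d1", name: "Acme renewal" }],
});

// "How does this page render" (step 6): consumes only the page-data
// contract, so the compiler verifies both sides of the seam.
const renderDealsPage = (data: DealsPageData): string =>
  [data.heading, ...data.deals.map((d) => `- ${d.name}`)].join("\n");
```

An agent can rework either side of the seam independently: the assembly under the data contract, the render under the component contracts.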

Capability Registry

The lib routing table (interactive, forms, marketing, etc.) is a capability registry. When an application needs a pricing section, it checks the registry first. The question for each new application: what percentage of UI needs are already solved?

That composition ratio is the metric that proves the platform compounds. Track it.

Checklists

Before Work

  • Outcome and binary success measure defined
  • Relevant flow map identified
  • Primary layer identified (domain / infra / app / presentation)
  • Existing generators and patterns checked
  • External data boundaries identified (stdin, API, webhooks)
  • Test expectations defined from domain types

During Work

  • Domain changes first — no domain file imports outward
  • Test spec written before implementation (failing test = spec)
  • TypeScript errors used as breadcrumbs, layer by layer
  • Public contracts in domain, internal validation colocated with consumers

Before Commit

  • All touched layers pass typecheck
  • All tests green — outcomes match spec expectations
  • No any or @ts-ignore added to silence errors
  • No as casts on external data boundaries — Zod validates instead
  • Trophy layer correct — no E2E test written without L2 integration test for referenced server actions
  • If a pattern repeated, generator considered
  • If generated code needed manual fixes, generator improved
  • Relevant flow maps updated with new knowledge

Questions

  • If the type export is the durable artifact and components are disposable, why do teams still design components first and types second?

  • At what point does a component library have enough consumers that its type contracts are more stable than its implementation?
  • When the conductor defines signatures and agents fill values, what skill atrophies — and does it matter?
  • If adding consumers increases stability through constraints, what is the minimum number of consumers before compound value exceeds maintenance cost?