# Product Design
How do you know your design is getting better?
The Product Design Bible defines thresholds. This page tracks whether you're hitting them. Without both, standards are static checklists. With both, Deming's loop runs: Plan (set the threshold) → Do (build to spec) → Check (measure here) → Act (update the standard).
## The Loop

```
STANDARDS (design bible) → defines "4.5:1 contrast ratio"
        ↓
BUILD (execution)        → developer runs checklist
        ↓
MEASURE (this page)      → did we hit the threshold?
        ↓
LEARN (feedback)         → which standards need updating?
        ↓
BETTER STANDARDS         → bible evolves
```
## Leading Indicators
Predict design quality before users encounter problems.
| Metric | Threshold | Tool | Cadence |
|---|---|---|---|
| Lighthouse Performance | 90+ | Chrome DevTools | Every deploy |
| Lighthouse Accessibility | 90+ | Chrome DevTools | Every deploy |
| Token resolution rate | 100% (no `transparent` computed values on visible elements) | DevTools Computed tab | After CSS changes |
| Contrast compliance | 100% of text at 4.5:1+ | axe DevTools | Every deploy |
| Horizontal overflow | Zero unintended scrollbars | DevTools responsive mode (all 4 breakpoints) | Every deploy |
| Touch target compliance | 100% of interactive elements at 44x44px+ | DevTools element inspector | After layout changes |
| Core Web Vitals (LCP) | Under 2.5s | PageSpeed Insights | Weekly |
| Core Web Vitals (CLS) | Under 0.1 | PageSpeed Insights | Weekly |
| Core Web Vitals (INP) | Under 200ms | PageSpeed Insights | Weekly |
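The 4.5:1 contrast threshold comes from WCAG 2.x and can be checked programmatically rather than by eye. A minimal sketch of the relative-luminance formula (function names are illustrative; axe DevTools does this for you):

```javascript
// WCAG 2.x relative luminance: linearize each sRGB channel, then weight.
function luminance([r, g, b]) {
  const lin = (c) => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : ((s + 0.055) / 1.055) ** 2.4;
  };
  return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b);
}

// Contrast ratio: (lighter + 0.05) / (darker + 0.05). Ranges from 1:1 to 21:1.
function contrastRatio(fg, bg) {
  const [hi, lo] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

// Black on white is the maximum, 21:1 — comfortably past the 4.5:1 threshold.
console.log(contrastRatio([0, 0, 0], [255, 255, 255]) >= 4.5); // true
```

Feed it the computed `color` and `background-color` of each text element and the "100% of text at 4.5:1+" row becomes a pass/fail count instead of a judgment call.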
## Lagging Indicators
Confirm design quality through user behavior.
| Metric | What it reveals | Threshold | Tool |
|---|---|---|---|
| Task completion rate | Can users do the job? | 90%+ on primary flows | Analytics funnel |
| Time to first action | Is the CTA obvious? | Under 10s from page load | Session recording |
| Bounce rate | Does the page hook? | Under 40% on landing pages | Analytics |
| Form abandonment | Is friction too high? | Under 20% | Analytics funnel |
| Conversion rate | Does the page convert? | Trend up quarter over quarter | Analytics |
| Accessibility complaints | Are we excluding people? | Zero per quarter | Support tickets |
| Five-second test pass rate | Is the message clear? | 80%+ correct identification | User testing |
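Two of these rows (task completion and form abandonment) fall out of the same funnel counts. A sketch of the arithmetic, assuming you can export started/completed event counts from your analytics tool (the event names here are hypothetical):

```javascript
// Derive two lagging indicators from raw funnel counts.
// Thresholds mirror the table: 90%+ completion, under 20% abandonment.
function funnelMetrics({ started, completed }) {
  const completionRate = completed / started;
  return {
    completionRate,                       // target: 0.90+
    abandonmentRate: 1 - completionRate,  // target: under 0.20
    pass: completionRate >= 0.9,
  };
}

const checkout = funnelMetrics({ started: 200, completed: 184 });
console.log(checkout.completionRate); // 0.92
console.log(checkout.pass);           // true
```

Recording these per release, rather than as a one-off, is what lets the Feedback Path below spot trends.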
## Automated Audit
The Design Review page includes JavaScript audit scripts that produce scored output. Run these against any page and record the results.
| Audit | What it checks | Source |
|---|---|---|
| Rendering audit | Token resolution, computed values, overflow | Rendering Verification |
| Accessibility audit | Headings, alt text, color independence, landmarks | Interaction + Accessibility |
| Full design audit | All visual, responsive, and interactive checks | Design Review |
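The rendering audit's token-resolution check reduces to a simple filter over computed styles. A hedged sketch of that core logic, written as a pure function so it can run anywhere — in the browser you would build the `elements` array from `document.querySelectorAll('*')` plus `getComputedStyle`; the actual audit scripts on the Design Review page may differ:

```javascript
// Token-resolution audit as a pure check over extracted computed styles.
// A fully transparent computed color on a visible element usually means a
// design token failed to resolve.
function auditTokenResolution(elements) {
  const failures = elements.filter(
    (el) =>
      el.visible &&
      (el.color === "rgba(0, 0, 0, 0)" || el.color === "transparent")
  );
  return {
    checked: elements.length,
    failures: failures.map((el) => el.selector),
    pass: failures.length === 0, // threshold: 100% resolution
  };
}

const report = auditTokenResolution([
  { selector: ".hero h1", color: "rgb(17, 24, 39)",  visible: true },
  { selector: ".cta",     color: "rgba(0, 0, 0, 0)", visible: true }, // unresolved
]);
console.log(report.pass);     // false
console.log(report.failures); // [".cta"]
```

Scored output like this is what makes "record the results" meaningful: each run yields a number you can compare against the last deploy.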
## Monitoring Cadence
| Frequency | What to check |
|---|---|
| Every deploy | Lighthouse scores, token resolution, overflow, contrast |
| Weekly | Core Web Vitals, page load trends |
| Monthly | Conversion funnel, task completion, bounce rates |
| Quarterly | Full design audit against bible, five-second tests, accessibility review |
| Annually | Review which thresholds need updating based on accumulated data |
## Feedback Path
When metrics reveal patterns, update the standard:
| Pattern | Action |
|---|---|
| Lighthouse consistently 95+ | Raise threshold to 95 |
| New WCAG version released | Update accessibility thresholds |
| Users fail a flow despite passing checks | Add missing check to bible |
| A threshold is always met effortlessly | Automate it (CI check), remove from manual checklist |
| A threshold is never met | Either fix the system or adjust the threshold with justification |
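The first pattern in the table ("consistently 95+ → raise to 95") can itself be automated. A minimal sketch of that rule; the window size and margin are assumptions for illustration, not values from the bible:

```javascript
// Suggest raising a threshold only when every score in the recent window
// clears the current threshold by a margin — "consistently", not "once".
function suggestThreshold(scores, current, { margin = 5, window = 10 } = {}) {
  const recent = scores.slice(-window);
  if (recent.length === window && recent.every((s) => s >= current + margin)) {
    return current + margin; // e.g. Lighthouse 90 -> 95
  }
  return current;
}

console.log(suggestThreshold([96, 97, 95, 98, 96, 95, 97, 99, 96, 95], 90)); // 95
console.log(suggestThreshold([96, 88, 95], 90)); // 90 — not consistent enough
```

Running this in CI on stored Lighthouse scores turns the feedback path from a quarterly judgment call into a standing prompt.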
This is Wathan's constraint principle applied to measurement: if the system always prevents the violation, stop checking manually and let CI enforce it.
## Context
- Product Design Bible -- The thresholds being measured
- Standards -- Why measurable thresholds matter
- Software Development Metrics -- Parallel metrics for code quality
- Process Optimisation -- PDCA loop mechanics
- Developer Experience -- DX as leading indicator for design quality