Analytics and Tracking

What is the minimum instrumentation that makes a feedback loop readable — and what does it cost not to have it?

Why Measure

A page that converts well and a page that looks like it converts well are identical until you measure. Analytics is the gauge that makes the loop real. Without it, every copy change is an opinion.

The instrumentation stack has three layers:

| Layer | Measures | Changes when |
| --- | --- | --- |
| Pageview | Traffic volume and source | Baseline — rarely |
| Event | Named user actions (click, scroll, expand) | When you add a new conversion goal |
| Session replay | Where users stop, hesitate, or leave | When events show a drop-off but not the cause |

Event-First Architecture

Session-first analytics tools (built around page sessions) are poor instruments for conversion work. An event-first tool fires a named signal each time something meaningful happens — CTA click, scroll past a section, FAQ opened — and those signals compose into the conversion picture you need.

The events to wire before measuring any conversion:

| Event | What it proves | Threshold to watch |
| --- | --- | --- |
| Primary CTA click | Attention converted to intent | ≥ 5% of pageviews |
| Scroll to 50% | Framework section reached | ≥ 30% of pageviews |
| Return visit within 7 days | Compounding attention | ≥ 15% of unique visitors |
| FAQ expanded | Objection surfaced | ≥ 10% of pageviews |
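The event wiring above can be sketched as a minimal tracker. This is an illustrative sketch, not any particular SDK's API: `track`, `rate`, and the event names mirror the table, and the transport (batching, flushing to a collection endpoint) is left as a comment.

```typescript
// Minimal event-first tracker sketch. Event names mirror the table above;
// the transport is a stand-in for whatever analytics SDK you actually use.
type EventName =
  | "primary_cta_click"
  | "scroll_50"
  | "return_visit_7d"
  | "faq_expanded";

interface TrackedEvent {
  name: EventName;
  ts: number;   // epoch milliseconds
  path: string; // page the event fired on
}

const queue: TrackedEvent[] = [];

function track(name: EventName, path: string): TrackedEvent {
  const event: TrackedEvent = { name, ts: Date.now(), path };
  queue.push(event); // real code would batch and flush to an endpoint here
  return event;
}

// Conversion read: events of one name divided by pageviews.
function rate(name: EventName, pageviews: number): number {
  const hits = queue.filter((e) => e.name === name).length;
  return pageviews === 0 ? 0 : hits / pageviews;
}
```

Each named event composes into a rate against pageviews, which is exactly the shape the threshold column expects.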

Do not add events speculatively. Each event is a question you are actively trying to answer. If you do not have a hypothesis for what the number should be, do not wire the event.

Conversion Benchmarks

Conversion rates vary significantly by page type and traffic source. Use these as orientation only — your baseline is the only number that matters for measuring improvement.

| Page type | Good band | Great band |
| --- | --- | --- |
| Landing page | 2–5% | 5–10% |
| Signup or trial form | 20–30% | 30–50% |
| Pricing page | 5–10% | 10–20% |
| Blog to email capture | 1–3% | 3–5% |

A conversion rate that sits inside the Good band but stays flat for more than 4 weeks is a kill signal. Stable mediocrity is not stability — it is drift without correction.
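The kill-signal rule can be made mechanical. A sketch under stated assumptions: `band` takes the Good-band bounds from the table, and `tolerance` (how little movement counts as "flat") is an illustrative value, not a prescription.

```typescript
// Kill-signal check: a rate inside the Good band that has not moved for
// more than `weeks` consecutive weeks is drift, not stability.
function isKillSignal(
  weeklyRates: number[],            // one conversion rate per week, oldest first
  band: { lo: number; hi: number }, // e.g. { lo: 0.02, hi: 0.05 } for a landing page
  tolerance = 0.002,                // movement below this counts as "flat" (illustrative)
  weeks = 4
): boolean {
  if (weeklyRates.length <= weeks) return false; // not enough history yet
  const recent = weeklyRates.slice(-(weeks + 1));
  const inBand = recent.every((r) => r >= band.lo && r <= band.hi);
  const flat = Math.max(...recent) - Math.min(...recent) < tolerance;
  return inBand && flat;
}
```

A rising rate inside the Good band is progress; only the flat one triggers.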

Technical Gauges

Conversion is bounded by technical performance. A page that loads in 4 seconds cannot convert at the same rate as one that loads in 1.5 seconds, regardless of copy quality.

Minimum thresholds before considering copy improvements:

| Metric | Threshold | Why |
| --- | --- | --- |
| LCP (largest contentful paint) | < 2.5s | Above this, bounce rate climbs sharply |
| CLS (cumulative layout shift) | < 0.1 | Layout jumps destroy trust signals |
| INP (interaction to next paint) | < 200ms | Slow response to clicks signals broken UI |
| Performance score | ≥ 90 | Lighthouse composite — below this, technical debt is the conversion problem |
| SEO score | ≥ 95 | Below this, organic traffic is throttled before measurement starts |

Run technical gauges against the deployed URL, not localhost. Scores differ by 20–40 points in some cases.
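The thresholds above can be expressed as a gate. This sketch assumes you already have measured values for the deployed URL (from Lighthouse or field data); the `Vitals` shape and `failedGauges` name are hypothetical, while the numbers come directly from the table.

```typescript
// Gate sketch: check a deployed page's measured vitals against the minimum
// thresholds before spending effort on copy.
interface Vitals {
  lcpMs: number;     // largest contentful paint, milliseconds
  cls: number;       // cumulative layout shift, unitless
  inpMs: number;     // interaction to next paint, milliseconds
  perfScore: number; // Lighthouse performance composite, 0–100
  seoScore: number;  // Lighthouse SEO score, 0–100
}

function failedGauges(v: Vitals): string[] {
  const failures: string[] = [];
  if (v.lcpMs >= 2500) failures.push("LCP");
  if (v.cls >= 0.1) failures.push("CLS");
  if (v.inpMs >= 200) failures.push("INP");
  if (v.perfScore < 90) failures.push("Performance");
  if (v.seoScore < 95) failures.push("SEO");
  return failures; // empty list means copy work is worth measuring
}
```

Any non-empty result means the conversion problem is technical, not copy.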

Privacy

Instrumentation touches visitor data. Two rules that do not bend:

  1. Session replay must mask inputs. Keyboard content is never recorded.
  2. Consent gates pageview tracking in jurisdictions where GDPR or equivalent applies. Measure unblocked traffic separately — consent-rate itself is a signal.
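Both rules can be sketched as code. The consent states and counters here are hypothetical stand-ins for whatever consent platform and analytics SDK you actually run; the point is the shape: denied consent is counted but never recorded, and input masking is unconditional.

```typescript
// Consent-gated pageview sketch. Consent state is a stand-in for your CMP.
type Consent = "granted" | "denied" | "unknown";

let pageviewsTracked = 0;
let consentDenials = 0; // consent-rate is itself a signal; count it separately

function onPageview(consent: Consent): boolean {
  if (consent !== "granted") {
    consentDenials += 1; // measured, but no visitor data recorded
    return false;
  }
  pageviewsTracked += 1; // real code would fire the analytics call here
  return true;
}

// Replay masking rule: keyboard content is never recorded, only its length.
function maskInput(value: string): string {
  return "*".repeat(value.length);
}
```

The denial counter preserves the consent-rate signal without touching any visitor data.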

Context

  • SEO — organic traffic is the pool analytics measures conversion against
  • Landing Page — the page structure analytics instruments
  • Performance — the outer loop that analytics feeds

Questions

What is the minimum set of events that would make the conversion loop readable — and which one, if it moved 50%, would tell you everything you need to know?

  • At what point does adding more events cost more in engineering time and cognitive noise than the signal produces?
  • How do you separate copy drift from audience drift when conversion rates decline over 12 weeks?
  • Which anti-pattern in analytics implementation is most commonly introduced by developers who know enough to be dangerous: over-instrumentation, under-instrumentation, or event naming inconsistency?