Why this matters
Attribution tells you which partner touched a conversion. It can't tell you whether the conversion _would have happened without that partner_. The two questions have very different answers — and one of them determines whether you should keep spending with that partner.
A paid-social publisher that drives 1,000 conversions a month at last-click looks valuable. But if 850 of those buyers would have bought anyway (they saw your organic Google result the next day), you're paying for conversions the publisher didn't actually cause.
Incrementality holdouts give you the causal answer: of the conversions that got last-click credit to this partner, how many were _actually_ driven by the partner, and how many would have happened anyway?
How it works
Trcker runs a live A/B test on every click from every partner you enable holdouts for:
- Click arrives from partner P for brand B (and optionally offer O).
- Trcker checks if there's an active holdout rule for this brand/partner/offer.
- A deterministic hash of the click is compared against the rule's holdout percentage (typically 10%).
- If the click falls in the held-out slice, the click and any downstream conversion are flagged `is_holdout = true`, but the visitor is still redirected to your offer normally. The visitor's experience is unchanged.
- The conversion rate (`converted / clicks`) is compared between the _treated_ and _held-out_ slices every night.
Because the hash is deterministic, the same visitor lands in the same slice every time — so repeat clicks stay consistent.
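The bucketing step can be sketched in a few lines. This is an illustrative sketch, not Trcker's actual implementation — the hash input (rule ID plus a stable visitor key) and the choice of SHA-256 are assumptions:

```python
import hashlib

def in_holdout(rule_id: str, visitor_key: str, holdout_pct: float) -> bool:
    """Deterministically map a click to the treated or held-out slice.

    Hashing the rule ID together with a stable visitor key gives the
    same answer on every repeat click, so a visitor never flips slices.
    (Key layout and hash choice are illustrative assumptions.)
    """
    digest = hashlib.sha256(f"{rule_id}:{visitor_key}".encode()).digest()
    # First 4 bytes -> uniform integer in [0, 2^32); scale to [0, 1)
    bucket = int.from_bytes(digest[:4], "big") / 2**32
    return bucket < holdout_pct

# Repeat clicks from the same visitor are always consistent:
assert in_holdout("rule-7", "visitor-42", 0.10) == in_holdout("rule-7", "visitor-42", 0.10)
```

Because the bucket depends only on the hash, roughly `holdout_pct` of distinct visitors land in the held-out slice, with no per-visitor state to store.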
The stats Trcker computes
Every night at 07:00 UTC, for every active holdout rule, Trcker computes:
Pooled two-proportion z-test — compares the treated conversion rate against the held-out conversion rate. Returns a z-score and a two-sided p-value.
Wald 95% confidence interval — the range of plausible "true" lift values. If the CI crosses zero, the lift isn't statistically significant.
Incremental conversions — treated count minus the count the held-out slice _would_ have produced at the treated volume. This is the number of conversions you can credibly attribute to the partner's _marginal effort_.
Snapshots are upserted per-brand per-partner per-offer per-window, so the nightly job is idempotent and safe to re-run.
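All three statistics can be reproduced from the raw slice counts using the standard formulas. A sketch that mirrors the definitions above (not Trcker's exact code):

```python
from math import sqrt
from statistics import NormalDist

def holdout_stats(treated_clicks, treated_convs, held_clicks, held_convs):
    """Pooled two-proportion z-test, Wald 95% CI on the lift, and
    incremental conversions, computed from the two slices' counts."""
    p_t = treated_convs / treated_clicks   # treated conversion rate
    p_h = held_convs / held_clicks         # held-out conversion rate

    # Pooled two-proportion z-test (two-sided p-value)
    p_pool = (treated_convs + held_convs) / (treated_clicks + held_clicks)
    se_pool = sqrt(p_pool * (1 - p_pool) * (1 / treated_clicks + 1 / held_clicks))
    z = (p_t - p_h) / se_pool
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))

    # Wald 95% CI on the lift (difference in conversion rates)
    se = sqrt(p_t * (1 - p_t) / treated_clicks + p_h * (1 - p_h) / held_clicks)
    ci = (p_t - p_h - 1.96 * se, p_t - p_h + 1.96 * se)

    # Incremental conversions: treated count minus what the held-out
    # rate would have produced at the treated volume
    incremental = treated_convs - p_h * treated_clicks
    return {"z": z, "p_value": p_value, "ci": ci, "incremental": incremental}
```

For example, 90,000 treated clicks with 2,900 conversions against 10,000 held-out clicks with 280 conversions gives 2,900 - 0.028 x 90,000 = 380 incremental conversions.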
When you'll see results
Holdouts need volume to produce a statistically significant signal. A partner sending 50 clicks a month won't reach significance in any reasonable timeframe. Rough rule of thumb:
| Baseline conversion rate | Clicks needed for 95% confidence at 10% lift |
|---|---|
| 1% | ~130,000 |
| 3% | ~45,000 |
| 10% | ~15,000 |
If your partner doesn't send that much volume, the lift number is still directional — it's just not statistically "proven."
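If you want a number for your own baseline rate rather than a rule of thumb, the standard two-proportion sample-size formula gives one. The defaults below (80% power, two-sided α = 0.05, a 90/10 split) are assumptions on our part, and different power and split choices move the answer considerably, so treat the output as directional too:

```python
from math import ceil
from statistics import NormalDist

def clicks_needed(base_rate, rel_lift=0.10, alpha=0.05, power=0.80, holdout_pct=0.10):
    """Total click volume to detect a relative lift with a
    two-proportion z-test and an unbalanced treated/held-out split.
    Assumed defaults: 80% power, two-sided alpha 0.05, 10% holdout."""
    p_h = base_rate                    # held-out (baseline) rate
    p_t = base_rate * (1 + rel_lift)   # treated rate if the lift is real
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    k = (1 - holdout_pct) / holdout_pct  # treated:held-out ratio, e.g. 9:1
    # Held-out arm size for an unbalanced two-proportion test
    n_h = (z_a + z_b) ** 2 * (p_t * (1 - p_t) / k + p_h * (1 - p_h)) / (p_t - p_h) ** 2
    return ceil(n_h / holdout_pct)       # total clicks across both arms
```

The small held-out slice is the bottleneck: with a 10% holdout, most of the required volume exists just to make the held-out arm large enough to measure.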
The difference from attribution
Attribution asks "who touched this conversion?" Incrementality asks "did touching it matter?"
Use attribution for _allocation_ across partners who definitely contribute. Use incrementality to catch partners whose conversions were going to happen anyway — those are the ones to pause or renegotiate.
Holdout rule lifecycle
- Start a holdout from Reports > Incrementality — click "Start a holdout," pick a partner, optionally scope to a specific offer, and set the percentage (10% is the default). The randomizer starts diverting traffic immediately; results appear after the next nightly run.
- End a holdout by clicking "End" on the active rule. The nightly job stops counting new activity but preserves the existing snapshots so you keep the historical measurement.
- Overlap rules: When a click could match multiple rules (e.g. a brand-wide rule and an offer-specific rule for the same partner), the more-specific rule wins.
- Changing the percentage: End the old rule, start a new one. This gives you a clean-break measurement rather than mid-window drift.
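The overlap rule above — most-specific match wins — can be sketched as a small resolver. The rule shape, with `offer` set to `None` for brand-wide rules, is an assumption for illustration:

```python
def resolve_rule(rules, brand, partner, offer):
    """Pick the holdout rule for a click: a rule scoped to the click's
    offer beats a brand-wide rule for the same brand/partner."""
    matches = [
        r for r in rules
        if r["brand"] == brand
        and r["partner"] == partner
        and r["offer"] in (None, offer)   # None = brand-wide rule
    ]
    if not matches:
        return None
    # Offer-scoped rules sort ahead of brand-wide ones
    return max(matches, key=lambda r: r["offer"] is not None)

rules = [
    {"brand": "B", "partner": "P", "offer": None, "pct": 0.10},  # brand-wide
    {"brand": "B", "partner": "P", "offer": "O1", "pct": 0.05},  # offer-specific
]
assert resolve_rule(rules, "B", "P", "O1")["pct"] == 0.05  # specific rule wins
assert resolve_rule(rules, "B", "P", "O2")["pct"] == 0.10  # falls back to brand-wide
```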
Future: per-publisher geo holdouts
Geo-based holdouts — pause a partner in California for 30 days while leaving the rest of the country untouched — are on the roadmap. They're useful when a single creative is saturating a region and you want to see if _any_ partner-driven lift remains.
Related
- Multi-touch attribution — who _touched_ a conversion across all partners
- Fraud detection — filter out invalid clicks before they hit your holdout buckets