The Instinct Tax
Practitioner observation: a pattern identified through direct practice building local-first AI products, grounded in UX research and supported by neuroscience, thermodynamics, and behavioral psychology. The survival percentages are practitioner estimates informed by industry data, not controlled study results. We publish early because the pattern is actionable now.
The grab
You grab your phone and stand up from your chair. That's it. That's the whole decision. You're already moving before you think about where you're going. The phone is in your hand because it's always in your hand. The action is so automatic it doesn't register as a choice.
This is instinct. Not biological instinct — trained instinct. What Daniel Kahneman calls System 1: fast, automatic, effortless.[1] Thousands of repetitions have compressed a complex behavior (check context, assess situation, decide next action) into a single physical gesture. Grab and go. Neuroscience calls this compression myelination — repeated neural signals get physically faster pathways, wrapped in insulation that increases transmission speed up to 100×.[7] The chain doesn't get smarter. It gets faster. That's what instinct is.
Now imagine your phone requires you to open a browser, type an IP address, enter credentials, and navigate to the thing you were already thinking about. By step two, you're no longer thinking about the thing. You're thinking about the tool. The instinct is dead. You're doing a chore.
Every step between impulse and value is a tax on human behavior. Engineers don't feel this tax because machines don't have instincts.
Why engineers build taxed flows
A machine will execute a 47-step authentication handshake in 3 milliseconds. A 12-step onboarding flow and a 1-step onboarding flow are the same to a computer. Steps are free to machines. They cost nothing.
So engineers optimize for correctness, security, and completeness — all valid concerns — and unconsciously design for what machines tolerate, not what humans need. Every flow makes mechanical sense. Every step is "logical." And every step bleeds momentum from the person holding the phone. Luke Wroblewski documented this pattern extensively in mobile-first design: the physical constraints of the device should drive the interaction design, not the other way around.[4]
The result: products that work perfectly and feel terrible. This isn't subjective — it maps to nociception, the nervous system's damage-measurement mechanism.[8] Nociceptors don't fire on opinion. They fire on harm. Cognitive friction works the same way: the system dims rather than crashes, but the cost is real and measurable. Internal tools that nobody uses on mobile. Apps that people install and never open a second time — what the industry measures as Day 1 retention, averaging around 25% across categories.[5] Not because the product is bad, but because the product makes you think about the product instead of the thing you're trying to do. Don Norman calls this the gulf of execution: the gap between what you intend to do and the actions the system requires.[6]
The framework
| Tax level | Steps | What it feels like | Survival |
|---|---|---|---|
| Zero-step | Phone buzzes, glance, done | Instinct preserved | 100% |
| One-step | Open app, content is there | Instinct barely taxed | ~80% |
| Two-step | Open app, log in, find content | Instinct dying | ~50% |
| Three-step | Open browser, type URL, authenticate, navigate | It's a chore now | ~20% |
| Four+ steps | Find docs, locate IP, open browser, type it, log in, navigate... | Only the committed survive | ~5% |
The zero-step row is the goal. Your phone buzzes, you glance, you know. That surface is an I/O channel. Three dots telling you the shape of things without asking you to open anything, navigate anywhere, or think about the tool. The brain stays put; the I/O channel meets you where you already are.
Each step doesn't subtract users linearly — it divides them exponentially. The drop-offs aren't gradual, either — they're phase transitions.[9] Like water at 100°C: heat builds steadily with no visible change, then everything changes at once. A user doesn't slowly disengage. They hit a threshold and they're gone. This pattern aligns with form abandonment research showing completion rates drop roughly 50% as form length doubles,[2] and with BJ Fogg's Behavior Model: when motivation is constant, increasing the difficulty of the action kills the behavior.[3] It's not laziness. It's physics. Human momentum has inertia, and every step is friction.
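The compounding is easy to make concrete with a toy model. Assuming a constant per-step retention rate (an illustration for intuition, not a fit to the table above or to any dataset):

```python
def survival(steps: int, per_step_retention: float = 0.6) -> float:
    """Fraction of users still moving after `steps` taxed steps.

    Toy model: each step multiplies the surviving fraction by a
    constant retention rate, so loss compounds instead of adding up.
    The 0.6 default is illustrative, not an empirical fit.
    """
    return per_step_retention ** steps

# At 0.6 per step: one step keeps 60%, three keep ~22%, five keep ~8%.
for n in range(6):
    print(n, round(survival(n), 2))
```

The point of the model is the shape, not the numbers: going from two steps to four doesn't cost you two steps' worth of users, it squares the loss.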
Case study: phone access for a local AI agent
We build local-first AI agents. One of them, Dash, runs on a home network. The problem: you're at your desk talking to Dash, then you need to walk away. You want to continue the conversation on your phone. Same session, same thread, no interruption.
The wrong way (5 steps, 3 require infrastructure thinking)
- Find the terminal output that shows the server's IP address
- Open your phone's browser
- Type http://192.168.1.137:3577
- Enter your safe word
- Find your conversation
Five steps. Three of them require you to think about infrastructure. By step three you've already decided it's not worth it.
The right way (3 steps, 0 require infrastructure thinking)
- Scan the QR code on your screen
- Enter your safe word
- Tap your conversation
Same number of human-intent steps (safe word, find conversation). The difference: every infrastructure step is gone. You never think about IP addresses, ports, or browsers. Every step is about your intent, not the machine's address.
Every time after: tap the home screen icon. Your conversation is already there.
The engineering behind this isn't trivial — deterministic session IDs derived from the safe word so both devices share identity without a "link your devices" flow, PWA installation for a native app feel, session keys that survive server restarts. But the human doesn't see any of that. They see: grab phone, tap icon, continue talking.
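Deterministic session identity can be sketched in a few lines. This is an illustration of the idea, not Dash's actual code: derive the session ID by hashing a normalized safe word, so any device that knows the word independently computes the same ID, with no pairing step and no device-linking flow.

```python
import hashlib

def session_id(safe_word: str) -> str:
    """Derive a stable session ID from a safe word.

    Any device that knows the word computes the same ID, so there is
    no "link your devices" ceremony. A real system would add a
    server-side salt and rate-limiting; this sketch omits both.
    """
    # Normalize so minor typing differences across devices still
    # resolve to the same session.
    normalized = " ".join(safe_word.lower().split())
    # Hash so the safe word itself never appears in a URL or a log.
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()[:16]
```

Desktop and phone run this independently and land in the same thread: `session_id("open sesame")` equals `session_id("  Open   Sesame ")`.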
Key decisions that preserved instinct
- QR code prints automatically at startup. No "find your IP" step. The server does the work of knowing its own address.
- PWA install makes it a home screen icon. Not a bookmark. Not a URL to remember. An app.
- Deterministic session identity. Same safe word = same session on any device. No device linking, no pairing ceremony, no "sign in on your other device."
- Standalone display mode. No browser chrome. No URL bar. No reminder that "this is a website." It's just Dash.
Every one of these decisions removed a step. Not added a feature — removed a step. The discipline is subtraction, not addition. This mirrors synaptic pruning — the brain's mechanism for building expertise.[11] During development, the brain eliminates unused neural connections to accelerate the ones that fire repeatedly. Instinct isn't more knowledge. It's fewer pathways, each one faster.
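The "server does the work of knowing its own address" decision has a well-known implementation trick, sketched here as an illustration (Dash's real startup code isn't shown, and the port is the example one from above). Connecting a UDP socket sends no packets; it just asks the OS which local interface it would route through. The resulting URL is what gets encoded into the startup QR code.

```python
import socket

def lan_url(port: int = 3577) -> str:
    """Best-effort guess at the server's LAN-reachable URL.

    connect() on a UDP socket transmits nothing; it only resolves
    which local interface the OS would use for an outbound route.
    """
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        s.connect(("192.0.2.1", 1))  # TEST-NET address; never contacted
        ip = s.getsockname()[0]
    except OSError:
        ip = "127.0.0.1"  # no route available; fall back to loopback
    finally:
        s.close()
    return f"http://{ip}:{port}"

print(lan_url())  # encode this string as the QR payload at startup
```

The human never sees this. They see a QR code and a working phone.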
The design principle
Machines will add steps unless told not to. Humans can't remove them once they exist.
Software doesn't feel friction. So it accumulates friction by default. Every security check, every confirmation dialog, every "welcome back" interstitial, every loading state — each one is individually reasonable and collectively fatal.
The discipline is: start from the instinct, then work backward to make the engineering fit inside it. Not the other way around. In thermodynamics, a heat engine converts heat, including the heat that friction generates, into useful work.[10] Product development works the same way — every point of friction is fuel that can be metabolized into smoother pathways. The instinct tax isn't pure waste. It's tuition. The question isn't whether friction exists. It's whether the system digests it into instinct or lets it pile up.
Four questions to ask before shipping any user-facing flow:
- What is the human already doing with their hands? Grabbing phone, looking at screen, walking to the door. Start there.
- What's the minimum number of conscious decisions between that physical action and the value they want?
- Can any of those decisions be made for them? Deterministic sessions, cached auth, pre-loaded state, auto-detected context.
- Does any step require them to think about the tool instead of the task? If yes, that step is a candidate for elimination.
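The fourth question can even be crudely automated. This sketch is a toy heuristic, not a real linter: it flags steps whose wording is about the tool (IPs, URLs, logins, terminals) rather than the task, using the case study's "wrong way" flow from above as input.

```python
# Words that signal a step is about the machine, not the human's intent.
# This keyword list is an illustrative guess, not a validated taxonomy.
INFRA_WORDS = ("ip", "http", "url", "port", "browser",
               "credential", "login", "log in", "terminal", "address")

def audit_flow(steps):
    """Return the steps that smell like infrastructure leakage."""
    return [s for s in steps if any(w in s.lower() for w in INFRA_WORDS)]

wrong_way = [
    "Find the terminal output that shows the server's IP address",
    "Open your phone's browser",
    "Type http://192.168.1.137:3577",
    "Enter your safe word",
    "Find your conversation",
]
# Flags three of five steps as candidates for elimination.
print(audit_flow(wrong_way))
```

Every flagged step is a candidate for the server to absorb, the way the QR code absorbed "find your IP."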
This is operant conditioning applied to product design. Skinner (1938) showed that positive reinforcement — reward for the desired behavior — produces more durable habits than punishment.[12] Every step removed is a reward. Every error dialog is a punishment. We know punishment works. We choose not to use it.
The AI implication
This matters more now than it ever has, because AI agents are about to design a lot of user-facing flows.
An AI agent will happily generate a 12-step onboarding wizard. It's being thorough. It's covering edge cases. It's following best practices. And it's creating a flow that 80% of humans will abandon by step four.
Agents don't have momentum. They execute sequences. A 12-step process and a 1-step process are identical to a machine — both complete in milliseconds. So when an agent builds for humans, it has no internal signal that says "this is too many steps." It optimizes for correctness, not for the feeling of picking up your phone and just being there.
This is the real UX challenge of the AI era. Not making AI smarter. Making AI respect that humans are analog creatures running on momentum, not logic. Every flow an agent designs needs an instinct governor — something that asks: would a human still be moving by this step, or have they already put the phone down? The same logic applies to how the system delivers answers. Polling a dashboard is an instinct tax. Letting answers precipitate and arrive at the surface when they're ready costs the human nothing.
The Gesture
You grab your phone. You stand up. That's the whole interaction budget. Everything your product needs to deliver has to happen inside that gesture, or it's not going to happen at all.
The metric isn't "does the flow work?" The metric is "does the human still want to finish?"
When friction is the feature
A framework this aggressive needs its own counter-argument, or it becomes a blunt instrument.
Not all friction is instinct tax. Some friction is the product working correctly. The distinction matters: infrastructure leakage — steps that exist because of how the system is built — is always a candidate for elimination. But intentional friction — steps that exist because of what the human needs to accomplish — is often the feature itself.
Re-authentication before a wire transfer. A compliance checklist before deploying to production. A calibration step before a medical device takes a reading. A consent dialog before sharing personal data. These aren't taxes. They're guardrails. Removing them doesn't preserve instinct — it removes judgment from moments that require it.
If a step serves human intent, keep it. If it serves machine needs, kill it.
That's the filter. "Enter your IP address" serves the machine — the human doesn't care how packets route. "Confirm you want to send $50,000" serves the human — that pause exists because irreversible actions deserve a moment of conscious thought. One is tax. The other is design.
Two blind spots in the framework worth naming:
- Step thresholds are heuristics, not laws. The survival percentages in the table above are practitioner estimates — useful for intuition, dangerous as policy. A five-step flow for filing taxes will get completed because the stakes justify the friction. A two-step flow for a casual social app might still be too much because the motivation is low. Context sets the threshold, not the step count alone.
- High-value tasks tolerate more friction. Signing a mortgage, configuring a firewall, onboarding to a tool you'll use daily for years — these earn their steps. The instinct tax hits hardest on low-stakes, high-frequency actions: checking a notification, glancing at a dashboard, continuing a conversation. Frequency and stakes together determine how aggressively you should subtract.
The instinct tax is real. But the cure for friction isn't frictionlessness — it's intentionality. Every step should be there on purpose, serving the person holding the phone, not the system behind the screen.
References
[1] Kahneman, D. (2011). Thinking, Fast and Slow. Farrar, Straus and Giroux. System 1 (fast, automatic, intuitive) vs. System 2 (slow, deliberate, effortful) as a dual-process model of cognition.
[2] Wroblewski, L. (2008). Web Form Design: Filling in the Blanks. Rosenfeld Media. Baymard Institute (2023) corroborates: checkout abandonment rates average 69.99% across industries, with checkout complexity cited as a significant factor (18% of abandonment reasons).
[3] Fogg, B.J. (2009). "A Behavior Model for Persuasive Design." Proceedings of the 4th International Conference on Persuasive Technology. Behavior = Motivation × Ability × Trigger. When ability drops (more steps), behavior dies even if motivation is constant.
[4] Wroblewski, L. (2011). Mobile First. A Book Apart. Argues that designing for mobile constraints first produces better interactions for all devices.
[5] Adjust. (2023). Mobile App Trends Report. Average Day 1 retention across app categories ~25%. Average Day 30 retention ~6%. The first interaction is disproportionately decisive.
[6] Norman, D. (2013). The Design of Everyday Things. Revised edition. Basic Books. The "gulf of execution" and "gulf of evaluation" framework for understanding interaction friction.
[7] Fields, R.D. (2008). "White Matter in Learning, Cognition and Psychiatric Disorders." Trends in Neurosciences, 31(7), 361–370. Myelination increases neural signal transmission speed up to 100×, explaining how repeated practice converts deliberate action into automatic response.
[8] Sherrington, C.S. (1906). The Integrative Action of the Nervous System. Yale University Press. Foundational work on nociception — the nervous system's damage-detection mechanism, distinct from subjective pain experience. See also Purves, D. et al. (2018). Neuroscience. 6th ed. Sinauer Associates.
[9] Atkins, P. & de Paula, J. (2014). Atkins' Physical Chemistry. 10th ed. Oxford University Press. Phase transitions as discontinuous state changes: energy accumulates continuously but the transition (solid → liquid → gas) is sudden and total.
[10] Cengel, Y. & Boles, M. (2014). Thermodynamics: An Engineering Approach. 8th ed. McGraw-Hill. Heat engines convert thermal energy into useful work — the thermodynamic basis for systems that metabolize their own inefficiency.
[11] Huttenlocher, P.R. (1979). "Synaptic density in human frontal cortex — developmental changes and effects of aging." Brain Research, 163(2), 195–205. The brain eliminates ~50% of synaptic connections during development, accelerating frequently-used pathways by pruning unused ones.
[12] Skinner, B.F. (1938). The Behavior of the Organism. Appleton-Century-Crofts. Foundational work on operant conditioning: positive reinforcement produces more durable behavioral change than punishment.