
Privacy Isn't a Feature — It's the Product

Apr 25, 2026 · 7 min read · Abhishek Gawde

When someone opens a breathing app mid-panic, something specific is happening in their body. Heart rate elevated. Hands potentially shaking. Prefrontal cortex partially offline. They're not making a considered product decision — they're reaching for something that might help them get through the next 60 seconds.

That moment is exactly the kind of data that wellness platforms are built to capture. Session timestamp. Technique selected. Duration. Frequency of use. The day of the week panic tends to hit. Aggregated over time: a detailed map of a person's worst moments.

Most apps collect this. Most users don't notice. I think it's the most quietly corrosive design decision in the wellness space — and I made a different call with Undulate.

If you're in crisis, please reach out to a professional or contact the 988 Suicide and Crisis Lifeline (call or text 988).

Why Most Apps Track Everything

The business logic is straightforward. User data drives retention features: streaks, personalized recommendations, progress charts, "you've been calmer this week" notifications. Retention keeps users subscribed. Subscriptions are the revenue model. So data collection isn't just permitted — it's structural. The whole system depends on it.

There's also a genuine product argument for it. If you know a user opens the app every Sunday night before their Monday commute, you can build features around that pattern. You can improve the product. You can understand what's working. These aren't fake reasons.

But there's something the product argument skips: what it feels like, on the user's end, to know that your panic attacks are being logged.

The Meta-Anxiety Problem

Anxiety has a particular quality that most non-sufferers underestimate: it is recursive. It feeds on itself. You feel anxious, and then you feel anxious about feeling anxious, and then you feel anxious about how often you've been feeling anxious lately, and suddenly you're spiraling about a spiral.

Anxiety tracking apps lean directly into this recursion. They show you charts of your anxiety over time. They surface patterns. They send notifications when you've missed a streak of calm. The implicit message is: your anxiety is a thing to be monitored and optimized.

I've written before about why tracking anxiety can make it worse — the act of monitoring something you're already afraid of gives it more cognitive real estate, not less. But there's a layer beyond that: the discomfort of knowing your worst moments are stored somewhere, associated with your account, potentially visible to a company's analytics team, possibly covered by terms of service that allow data sharing with "partners."

You don't have to believe this is being misused to find it quietly unsettling. The data existing is enough.

What "Stores Nothing" Actually Means

When I say Undulate stores nothing, I mean it in a specific technical sense that's worth spelling out, because "privacy-focused" has become marketing language that can mean almost anything.

There are no user accounts. That means there's no server-side record of who you are or when you used the app. There's no session history: the app doesn't remember what technique you chose last time, and nothing about a session ever leaves your device. There's no behavioral analytics pipeline ingesting individual usage events. When you close the app, nothing about what you just did is transmitted to a server with your identity attached.
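
To make that concrete, here's a minimal sketch of the pattern in TypeScript. It isn't Undulate's actual source, and the technique names are illustrative; the point is what's absent. Session state exists only in memory while you're breathing, and it's discarded the moment the session ends.

```ts
// Illustrative sketch, not Undulate's actual code. What matters is
// what's missing: no persistence call, no network call, no identifier.

type Technique = "box" | "4-7-8" | "coherent";

interface ActiveSession {
  technique: Technique;
  startedAt: number; // milliseconds since epoch, held in memory only
}

let session: ActiveSession | null = null;

function startSession(technique: Technique): void {
  session = { technique, startedAt: Date.now() };
}

function endSession(): void {
  // No write to disk, no request to a server, no event appended to a
  // history. The session ends and the record of it ceases to exist.
  session = null;
}
```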

The Emergency Calm Link, undulate.app/calm, goes further still. It's a web page. No sign-up. No login. No cookie that persists across sessions. You arrive, you follow the guided breathing for 60 seconds, you leave. Nothing is retained. I have no way of knowing you were there.
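
A page like that is close to the degenerate case of a web server. Here's a hedged sketch of the shape, assuming a self-contained calm.html with inline styles and scripts, using Node's built-in http module; it's illustrative, not the actual deployment.

```ts
// Illustrative sketch of a zero-retention page. Assumes a self-contained
// calm.html (inline CSS/JS, no third-party scripts). Not the real server.

import { createServer } from "node:http";
import { readFileSync } from "node:fs";

const page = readFileSync("calm.html");

createServer((_req, res) => {
  // No Set-Cookie header, no session middleware, no access log with
  // identifiers. The response carries nothing that outlives the tab.
  res.writeHead(200, {
    "Content-Type": "text/html; charset=utf-8",
    "Cache-Control": "no-store",
  });
  res.end(page);
}).listen(3000);
```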

The Honest Caveat

I use aggregate, anonymous analytics to understand whether the app is being used at all — total session counts, which techniques are opened, crash rates. This data has no individual identifiers attached to it. I can see "box breathing was opened 400 times this week" but not "this specific person opened box breathing at 2am on Tuesday." That's a line I chose to draw, and I think it's the right one. But I'm naming it because "no tracking" claims deserve to be specific.
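
To sketch the distinction, here's roughly what an aggregate event can look like, with hypothetical field names and a hypothetical endpoint rather than Undulate's actual schema. The interesting part is the list of fields that aren't there.

```ts
// Illustrative sketch of an aggregate analytics event. Field names and
// endpoint are hypothetical, not Undulate's actual schema.

interface AggregateEvent {
  event: "technique_opened" | "session_completed" | "crash";
  technique?: string; // e.g. "box_breathing"
}

// Deliberately absent: userId, deviceId, sessionId, IP address,
// precise timestamp, location. Events can be counted into weekly
// totals but never joined back into one person's history.

function report(e: AggregateEvent): void {
  // Fire-and-forget: no cookies, no auth, nothing linking two events.
  void fetch("https://example.invalid/collect", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(e),
  });
}

report({ event: "technique_opened", technique: "box_breathing" });
```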

What This Decision Costs

I want to be honest about the tradeoffs here, because most privacy-forward positioning treats the decision as a pure win. It isn't.

Without user accounts, I can't build personalization features. I can't show you which technique you tend to reach for when you're most activated. I can't surface patterns in your own usage, because I deliberately don't have the data. Some of those features would probably be useful for some users, and I can't build them.

Without behavioral analytics on individual sessions, product decisions are harder. I can't tell if a specific user bounced because the UI confused them or because they felt better and didn't need more. I'm operating with less signal than most product teams would accept.

Without retention mechanics — streaks, progress charts, personalized notifications — I have no loop to pull people back to the app. The only thing that brings someone back is that it worked the last time they needed it. That's a harder growth model.

These are real costs. I mention them because I think the privacy argument is most credible when you acknowledge what you're giving up, not just what you're avoiding.

Why It Was the Only Right Call Anyway

The thing that kept coming back to me during the design process was this: the moment someone reaches for a breathing app, they're in a reduced-capacity state. They're not making a fully deliberate choice to share data. They're not pausing to read a privacy policy or opt out of analytics. They're just opening the thing they hope will help.

Informed consent requires capacity to give it. A person mid-panic doesn't have full capacity. Building a data collection system that activates in that moment felt, to me, like taking advantage of a vulnerability rather than addressing it.

There's also something more practical: trust is the foundation of a tool you're supposed to reach for at your worst. If a user has any background awareness that their panic sessions are being logged, that awareness becomes a small tax on using the app. Not a deal-breaker for most people, probably. But a friction that accumulates. The goal is to remove every possible barrier between "I need to breathe" and actually breathing — and uncertainty about data is a barrier.

The Emergency Calm Link as Product Philosophy

The most concrete expression of this philosophy is the free breathing session at undulate.app/calm. I built it partly as a marketing tool — a way for someone to try guided breathing before deciding to buy the app. But it turned into something I think about differently now.

It's a URL you can text to someone who's having a hard time right now. They don't need an app. They don't need an account. They don't need to hand over an email address. They click the link, they breathe for 60 seconds, they close the tab. Nothing is stored on either end.

That's as close to a pure tool as I know how to make. No onboarding friction. No data collection. No residue. Just the thing itself.

I've heard from a few people who keep the link in their phone's notes for exactly this scenario — a moment when they know they'll be too activated to navigate an app store or type a password. A URL in notes takes one tap. That's the whole design brief.

A Note on the Competitive Angle

I want to be careful not to overstate this as a differentiator. Plenty of wellness apps have privacy policies I respect, and some subscription apps genuinely do use data to build better products. I'm not making an argument that data collection in wellness is always wrong.

I'm making a narrower argument: when the use case is acute distress — when the user is at their most vulnerable and has the least capacity for deliberate consent — the right default is to collect nothing. That specific intersection of vulnerability and data collection is where I think the industry norm is wrong.

Undulate was built for that exact intersection. So the decision made itself.

60 seconds. Nothing stored.

The Emergency Calm Link is a free guided breathing session in your browser. No sign-up. No account. No record that you were there. Just the breathing.

Open breathing session