Private AI Chat: What Actually Keeps You Safe
The privacy gap in AI companion apps is now well-documented — and uncomfortable. Here is what the research actually says, and the specific architectural choices that close it.
I have watched more than one person quietly close their AI companion app after seeing what shows up on their credit card statement. Not because the amount was surprising — because the descriptor named the product. In a category where "discreet" is the first word in every marketing page, the actual delivery of that promise is where most apps fall apart. The billing statement is only the most visible example. The rest of the gaps are harder to see, and far more serious.
What does "private AI chat" actually mean in 2026?
Short answer: In this category, privacy means three things at once: your messages are encrypted and not reachable from the public internet; your billing never exposes what the product is; and deleting your account actually removes your data. Most of the market fails on at least one of the three, which is why romance AI apps fare so badly in privacy audits.
The phrase "private AI chat" gets used loosely in this space, so it is worth being specific about what it has to mean to be useful.
For an AI companion — which, let us be honest, is a category where the conversations get personal in ways ordinary chat apps never do — privacy is not one thing. It is at least four things that all have to work, and a failure in any one of them makes the other three worthless.
Wire-level privacy. Are your messages encrypted between your device and the service? If not, everything else is academic.
Storage-level privacy. Once your messages arrive at the service, where are they kept, and who can reach them? A database that sits on the public internet with a password-only defence is not private in any meaningful sense, and this is not a hypothetical — publicly exposed databases are one of the most common sources of AI app breaches.
Billing-level privacy. What shows up on your credit card statement when you pay? If it says the product name or the category, you are one unexpected family member away from an awkward conversation you did not sign up for.
Deletion-level privacy. When you delete your account, does your data actually leave the system, or does it sit in a "soft-deleted" state forever, fully recoverable by anyone at the company who wants to look at it?
Most apps in this category fail on at least two of these four. The 2024 Mozilla Foundation review found all eleven romance AI apps it tested failed its privacy evaluation — every single one carried a Privacy Not Included warning.[1]
The 2024 Mozilla report that named every major romance AI app
Short answer: Mozilla reviewed 11 of the most popular romance AI chatbots in 2024. All 11 failed. One app fired over 24,000 third-party trackers in the first 60 seconds of use. The same report found that roughly 90% of the apps may share or sell data for advertising purposes.
The Mozilla Foundation's Privacy Not Included project is one of the most credible privacy audits in consumer software, and in February 2024 they published a review of the most popular romance AI chatbots.[1]
The headline findings were stark. Of the eleven apps they tested, every single one failed their minimum privacy standard. Ten failed basic security standards on top of that. The report described the category as the worst-performing they had ever reviewed on privacy grounds — and Mozilla has reviewed a lot of categories.
One app fired more than 24,000 third-party trackers in the first sixty seconds of use. Another sent data to advertising networks the moment it launched. Roughly nine in ten of the apps reviewed either explicitly permitted sharing or selling your data for advertising purposes, or refused to confirm that they did not.
The coverage of the report was unsparing. Infosecurity Magazine, the Mozilla Foundation's own blog, and subsequent follow-ups across the consumer technology press all converged on the same description: the category was "creepy", and the companion apps marketed as intimate and safe were, as a class, the least privacy-respecting consumer software Mozilla had ever measured.
Regulators noticed. The Italian data protection authority later fined a leading companion app for privacy failings around minors and data handling, and US advocacy groups filed an FTC complaint in 2025 alleging deceptive design and data practices in the broader category.[2] None of this is resolved — it is the backdrop every honest post about privacy in AI companionship has to start with.
How JustHoney actually handles your data
Short answer: Your messages are encrypted in transit and at rest. Storage is isolated from the public internet. Sessions expire quickly. Billing carries a neutral descriptor. When you delete your account, it is a real delete — not a flag in a database — and within thirty days your data is gone from the active system.
Here is what is actually in place, described without naming specific vendors because that is orthogonal to whether it works.
Encrypted in transit. Every request between your device and JustHoney runs over modern TLS. Table stakes, worth confirming we actually have it.
Encrypted at rest. Your data is stored with encryption at rest, so raw disk contents are not readable without active system keys. Passwords are never stored in plain text — they are hashed with a strong one-way algorithm, so even in a worst-case breach your password itself cannot be recovered from the stored form.
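The one-way hashing pattern described above can be sketched with Python's standard library. This is an illustration of the technique, not JustHoney's production code, and the scrypt parameters here are typical published values rather than the ones any particular service runs:

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> bytes:
    """Derive a one-way hash; the plaintext is never stored anywhere."""
    salt = os.urandom(16)  # unique per password, defeats rainbow tables
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt + digest   # store the salt alongside the derived key

def verify_password(password: str, stored: bytes) -> bool:
    """Re-derive from the candidate and compare in constant time."""
    salt, digest = stored[:16], stored[16:]
    candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return hmac.compare_digest(candidate, digest)
```

The point of the pattern: even an attacker holding the stored bytes gets the salt and the derived key, never the password itself.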
Network-isolated storage. The place where your data lives is not reachable directly from the public internet. It is reachable only from the application services that need to talk to it. This is the exact failure mode behind a surprising number of AI app breaches, and designing it out once is cheaper than explaining it after.
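One way to keep that isolation guarantee honest is to make it checkable at deploy time, for instance by asserting that the database host resolves only to private address ranges. A minimal sketch of such a check, assuming a hypothetical list of resolved addresses:

```python
import ipaddress

def is_private_only(resolved_ips: list[str]) -> bool:
    """True only if every address for the database host is non-public."""
    return all(ipaddress.ip_address(ip).is_private for ip in resolved_ips)

# Hypothetical: a database reachable only on internal subnets passes;
# one with any publicly routable address fails the deploy check.
```

A check like this turns "the database is not on the public internet" from a promise into a test that fails the build when it stops being true.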
Short-lived sessions. Authenticated sessions expire quickly and two-factor codes expire even faster. A leaked session token is useful to an attacker for only a brief window, by design.
Billing: discreet by default. Card processing is handled by a regulated third-party payment processor under a neutral billing descriptor, configured so the JustHoney brand and category do not appear on your statement. The full card details never touch our servers.
Your conversations are not traded. Individual conversations are not shared, not sold, and not used as identifiable training data for downstream AI models. This is documented in the privacy policy and we intend to keep it that way.
Deletion is real deletion. When you delete your account, the system performs an actual delete — not a "marked as inactive" flag that sits there indefinitely. Cached session state is invalidated immediately, and within thirty days the active system is wiped of anything tied to you.
- TLS in transit, encryption at rest, passwords hashed not stored
- Storage isolated from the public internet
- Short-lived session tokens and even shorter two-factor codes
- Discreet billing descriptor — no product name on your statement
- Hard delete on account removal, not a soft-delete flag
- Individual conversations never sold or used as identifiable training data
What happens when you delete your account
Short answer: A real deletion runs against your user record and everything connected to it — messages, generated photos, preferences, session state. Within thirty days, the active system is wiped of anything tied to you. A small set of records persists only where law requires it, and we say so plainly rather than burying it.
Deletion is the single most testable honesty signal in any privacy policy, because it is the one claim a user can actually verify for themselves. Here is what happens when you click delete account.
Your user record is removed directly — not marked inactive, not moved to a "deleted users" table, not hidden behind a flag someone can flip back later. Everything linked to your account (your conversations, your generated photos, your preferences, your session state) is removed at the same time. Cached session data is invalidated immediately, so any token you had is dead the moment the delete completes. Within thirty days, the active system contains nothing tied to you.
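The difference between a hard delete and a soft-delete flag is easy to show concretely. This sketch uses SQLite with cascading foreign keys purely as an illustration of the pattern; the schema and table names are hypothetical, not JustHoney's:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs per-connection
conn.executescript("""
    CREATE TABLE users (id INTEGER PRIMARY KEY);
    CREATE TABLE messages (
        id INTEGER PRIMARY KEY,
        user_id INTEGER NOT NULL REFERENCES users(id) ON DELETE CASCADE,
        body TEXT
    );
""")
conn.execute("INSERT INTO users VALUES (1)")
conn.execute("INSERT INTO messages VALUES (1, 1, 'hello')")

def delete_account(user_id: int) -> None:
    # A hard delete: the row is gone and everything referencing it
    # cascades away with it. There is no is_deleted flag to flip back.
    conn.execute("DELETE FROM users WHERE id = ?", (user_id,))
    conn.commit()
```

Contrast with the soft-delete anti-pattern, `UPDATE users SET is_deleted = 1`, which leaves every message fully intact and recoverable by anyone with database access.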
A small set of records persists for narrowly legitimate reasons, and we will tell you what they are:
• Payment records are retained where tax law requires it. These contain transaction metadata, not the contents of any conversation.
• Backups roll off on their own cycle. This is how backup systems work at any serious company — an entry in a backup is not individually editable — and the live system is clean long before the backup tape catches up.
• Anonymised usage signals may persist in aggregate form for abuse prevention and product improvement. Aggregate is the operative word: there is nothing in that data that can be tied back to you personally.
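What "aggregate" means in practice is that identifiers are dropped before anything is counted. A toy sketch with hypothetical event names, to make the distinction concrete:

```python
from collections import Counter

# Hypothetical raw events: each one carries a user identifier.
events = [
    {"user_id": "u1", "kind": "message_sent"},
    {"user_id": "u1", "kind": "photo_generated"},
    {"user_id": "u2", "kind": "message_sent"},
]

def aggregate(raw_events: list[dict]) -> Counter:
    """Keep only event kinds; the user identifier is discarded entirely."""
    return Counter(e["kind"] for e in raw_events)
```

The aggregate answers "how often does this feature get used" without retaining anything that says who used it.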
If you want a specific confirmation of deletion, or an export of your data before you delete, the privacy policy documents how to request it. EU users have full GDPR rights including access and erasure, and we honour them on documented timelines.
That is the honest version. Not "we delete everything everywhere instantly" — no app in the world can truthfully say that — but the closest thing to it that a real system can actually deliver.
The honest limits of privacy in any AI chat app
No app in this category is perfectly private, and we would rather spell out where the edges sit than let you discover them later. These limits are true for every serious operator in the space, ours included:
- Anonymised and aggregated data may be used internally for abuse prevention and product improvement. "Anonymised" is a real term of art — it means nothing in the data can be traced back to you personally — but it is not the same as "never touched".
- Payment records are retained where tax law requires it. We do not get to opt out of that, and neither do you.
- Backup systems roll off on their own cycle. This is how backups work everywhere — at JustHoney and at any other serious company — and the active system is cleaned up long before the backup tape is.
- Government legal process (subpoenas, court orders, valid warrants) can compel disclosure of records that still exist. No app in the world can truthfully promise otherwise. What we can promise is that we minimise what we keep and we do not volunteer what we have not been asked for.
- The age-verification step involves a certified third-party identity provider because we are legally required to verify age. We receive only a verification result and a confirmed date of birth — not the underlying identity documents.
Frequently asked questions
Are my chats with my companion used to train AI models?
Individual conversations are not shared, not sold, and not used to train downstream models as identifiable data. Anonymised and aggregated data may be used internally for quality and abuse-prevention purposes, and this distinction is documented in the privacy policy.
What will appear on my credit card statement?
A neutral, discreet descriptor through our payment processor. The JustHoney brand and product category do not appear on the statement. This is configured at the processor level, not just displayed client-side.
Can I actually delete my account and have my data removed?
Yes. Deletion is a real deletion, not a flag, and the active system is wiped of anything tied to you within thirty days. Backups roll off on their own cycle afterwards. EU users have full GDPR access and erasure rights on top of the standard deletion flow.
How do you handle the age-verification step?
A certified third-party identity verification provider handles the actual ID check. We receive only a verification result and a confirmed date of birth — not the underlying documents. This is a legal compliance requirement in this category, and we picked a partner specifically to minimise what we keep.
Ready to meet her?
Sign up free in under a minute, or browse hundreds of companions first.
- [1]Mozilla Foundation, Privacy Not Included 2024 romance AI chatbot review — 11 of 11 apps failed. https://www.mozillafoundation.org/en/privacynotincluded/articles/happy-valentines-day-romantic-ai-chatbots-dont-have-your-privacy-at-heart/
- [2]Time coverage of the 2025 FTC complaint filed against a leading romance AI app. https://time.com/7209824/replika-ftc-complaint/
Mara has been writing about AI companion platforms since 2023. She covers how these products are built, how they behave in practice, and where they break — from the team side and the user side.