Kohler’s new Dekoda smart toilet camera, promoted with claims of “end-to-end encryption,” was recently shown to use only standard TLS/HTTPS-style encryption rather than true end-to-end encryption, meaning the company can still access and process users’ intimate bathroom images. Security researcher Simon Fondrie-Teitler flagged the misleading language, pointing out that while traffic is encrypted in transit and data is encrypted at rest, the images are decrypted on Kohler’s servers, which is exactly the scenario true end-to-end encryption is supposed to prevent. Kohler has confirmed the data can be decrypted for processing and may be used, in de-identified form, to train AI, underscoring how marketers misuse encryption terminology to appear privacy-friendly.
Sources: The Register, TechBuzz
Key Takeaways
– Kohler’s “end-to-end encryption” claim for Dekoda is deceptive; the device uses only standard TLS/HTTPS transport encryption, and the data is decrypted on Kohler’s servers, allowing the company to view private data.
– Users’ personal bowel-health images are stored and potentially used by Kohler — including for AI training — under an “anonymized/de-identified data” usage clause.
– The mislabeling underscores a broader problem in IoT privacy: manufacturers often rely on consumers’ technical ignorance to market “secure” devices that nonetheless allow full corporate access.
In-Depth
The Dekoda toilet camera from Kohler promised what many would consider a bold leap: a smart, privacy-aware home device designed to analyze gut-health markers while using “end-to-end encryption” to guarantee user privacy. On its face, the pitch seemed simple enough — get advanced health insights without compromising privacy. But the problem, as revealed by privacy researcher Simon Fondrie-Teitler and reported by multiple tech outlets, lies in what the term “end-to-end encryption” really means — and what Kohler actually delivers.
In secure messaging systems like Signal, WhatsApp, or iMessage, end-to-end encryption ensures that only the communicating user devices can decrypt the content; not even the service provider can view it. This means that even if data is intercepted or stored on remote servers, no third party (including the provider itself) can decrypt and read it. For a toilet camera that captures deeply personal health data, that level of protection would arguably be essential.
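To make the distinction concrete, here is a minimal sketch of the end-to-end model in Python, using the PyNaCl library. The payload and variable names are illustrative assumptions, not Kohler’s or any messenger’s actual code; the point is simply that the decryption key never leaves the user’s device.

```python
# A minimal sketch of the end-to-end model, using the PyNaCl library.
# Payload and names are hypothetical, for illustration only.
from nacl.public import PrivateKey, SealedBox

# The key pair is generated on the user's device; the private key never leaves it.
device_private = PrivateKey.generate()
device_public = device_private.public_key

# Anything encrypted to the device's public key can transit, or sit on,
# the vendor's servers as opaque bytes the vendor cannot read.
ciphertext = SealedBox(device_public).encrypt(b"sensitive health image bytes")

# Only the holder of the device's private key can recover the plaintext.
plaintext = SealedBox(device_private).decrypt(ciphertext)
assert plaintext == b"sensitive health image bytes"
```

A vendor operating this model can store and relay ciphertext, but it cannot analyze the plaintext on its servers, which is precisely why a device that advertises server-side health analysis should raise questions about any end-to-end claim.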
But that’s not what Dekoda does. Instead, it uses standard TLS/HTTPS encryption, which is sufficient for protecting data in transit, and then decrypts that data once it arrives on Kohler’s servers. From there, the company retains full access to the images: the very images the user was led to believe would stay private. Kohler’s own privacy contact apparently confirmed this during correspondence cited by Fondrie-Teitler. The company also admitted the images are stored “at rest” on its systems, processed for the health-analysis feature, and possibly used for machine-learning purposes under a clause promising only “de-identified” data.
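In sketch form, that arrangement looks more like the following. This is a hypothetical illustration using the cryptography library’s Fernet recipe, not Kohler’s actual pipeline: TLS has already terminated at the server, so the application handles plaintext and re-encrypts it “at rest” with a key the vendor controls.

```python
# A sketch of vendor-held-key encryption "at rest": TLS has already terminated
# at the server, so the application sees plaintext and re-encrypts it with a
# key the VENDOR controls. Names and data are illustrative only.
from cryptography.fernet import Fernet

# This key lives on the vendor's infrastructure, not on the user's device.
server_side_cipher = Fernet(Fernet.generate_key())

# The uploaded image is genuinely "encrypted at rest"...
stored_blob = server_side_cipher.encrypt(b"sensitive health image bytes")

# ...but because the vendor holds the key, it can decrypt at any time:
# for the analysis feature, for AI training, or for anything else.
recovered = server_side_cipher.decrypt(stored_blob)
assert recovered == b"sensitive health image bytes"
```

Both sketches produce “encrypted” data; the difference is entirely in who holds the key.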
That admission is troubling for a number of reasons. First, “de-identified” doesn’t guarantee true anonymity, especially with images — what counts as identifying can often be fuzzy, and re-identification risk remains. Second, the use of deeply personal health images for AI training or third-party sharing, even in “de-identified” form, crosses a boundary many consumers likely weren’t aware of when buying the device. Finally, the misleading use of “end-to-end encryption” — a technically buzzworthy phrase — appears designed more to assuage privacy fears than to reflect real security practices.
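On the first point, it may help to see how thin image “de-identification” can be in practice. A common step is metadata stripping; the sketch below, using the Pillow library with hypothetical file names, removes EXIF labels such as timestamps and device identifiers but leaves the pixels, which may themselves identify the subject, untouched.

```python
# A sketch of metadata-based "de-identification" using Pillow.
# File names are hypothetical.
from PIL import Image

img = Image.open("capture.jpg")

# Rebuilding the image from raw pixels drops EXIF metadata
# (timestamps, device identifiers, GPS tags)...
stripped = Image.new(img.mode, img.size)
stripped.putdata(list(img.getdata()))
stripped.save("capture_deidentified.png")

# ...but the visual content, which may itself identify the subject,
# survives intact. The label is gone; the information is not.
```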
The incident illustrates a broader problem in the Internet-of-Things and health-tech spaces: companies are easily tempted to exploit consumers’ lack of technical knowledge. Terms like “end-to-end encryption” carry powerful connotations, but unless devices actually implement the cryptographic guarantees those terms imply, consumers are left with a false sense of security. And when it comes to intimate data — like what you flush away — that false sense of security may have far-reaching consequences.
In the end, the Dekoda story serves as a cautionary tale: connected devices that claim privacy-boosting features deserve careful scrutiny. If you ever rely on “encryption” as a privacy safeguard, take a closer look — because what you think means “only you can see the data” might really mean “the company sees everything.”