    Tech

    Viral Call-Recording App “Neon” Goes Dark After Exposing Users’ Calls and Phone Numbers

Updated: December 25, 2025 · 5 Mins Read

    A rapidly rising app called Neon, which promised to pay users for recording their phone calls and selling the data to AI firms, was abruptly taken offline after a severe security flaw allowed anyone with a Neon account to access other users’ phone numbers, full call transcripts, and raw audio files. TechCrunch’s investigation revealed that Neon’s backend servers were not properly enforcing access controls, meaning that a logged-in user could simply manipulate requests or endpoints to retrieve sensitive information belonging to strangers. The app’s founder, Alex Kiam, responded by shutting down the servers and notifying users of a “temporary pause” for extra security, but notably did not mention the breach itself. In light of mounting scrutiny, Business Insider reports that Neon may remain offline for one to two weeks while a full security audit and fixes are applied. Meanwhile, outlets like Gizmodo and Malwarebytes warn that the incident underscores the risks of monetizing deeply personal data without robust security and transparency.

    Sources: Gizmodo, Business Insider

    Key Takeaways

    – The Neon app’s security flaw was shockingly fundamental—its servers failed to enforce basic access controls, effectively making user authentication a “master key” to all call data in the system.

– In taking the app offline, the founder cited “adding extra layers of security” but omitted any explicit admission of the data exposure, raising transparency and trust concerns.

    – This incident reflects deeper tensions in the AI era: companies are aggressively seeking voice and other personal data for model training, but often lack sufficient safeguards, leaving user privacy dangerously exposed.

    In-Depth

    When a tech product promises money in exchange for something as private as phone calls, skepticism and scrutiny should follow—and in the case of Neon, that scrutiny exposed a disaster waiting to happen. Neon launched just last week and rapidly climbed the App Store charts by telling users: permit us to record your calls, we pay you per minute, and we anonymize and sell the data to AI companies. The pitch sounds tempting to some: turn your voice into a small income stream. But the technical foundations for executing that safely were evidently not there.

TechCrunch’s security team essentially stress-tested Neon by creating a fresh account, placing calls, and using Burp Suite, a network traffic inspection tool, to peer under the hood. What they found was alarming: although the Neon front end showed only a user’s own calls and earnings, the back end exposed far more. By manipulating API endpoints, the investigators were able to fetch call transcripts, raw audio links, and metadata (caller and callee numbers, timestamps, durations, and earnings) belonging to other users. In effect, Neon lacked row-level access controls and proper authorization checks, so any authenticated user could browse the entire dataset. The flaw wasn’t subtle; it was catastrophic.
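The vulnerability class described here is usually called broken object-level authorization (or an insecure direct object reference): the server verifies that a request comes from *a* logged-in user, but never checks that the requested record belongs to *that* user. A minimal sketch of the pattern, with entirely hypothetical data and handler names (this is not Neon’s actual code):

```python
# Illustrative sketch of broken object-level authorization (IDOR) --
# hypothetical names and data, NOT Neon's real backend. The handler
# enforces authentication (a valid session token) but never checks
# ownership of the requested record.

CALLS = {  # call_id -> record; stands in for a database table
    101: {"owner": "alice", "callee": "+15550001", "transcript": "alice's call"},
    202: {"owner": "bob",   "callee": "+15550002", "transcript": "bob's call"},
}
SESSIONS = {"token-alice": "alice", "token-bob": "bob"}  # token -> user

def get_call(token: str, call_id: int) -> dict:
    """Vulnerable handler: authentication without authorization."""
    user = SESSIONS.get(token)
    if user is None:
        raise PermissionError("not logged in")  # authn is enforced...
    return CALLS[call_id]                       # ...but ownership is not

# Any logged-in user can enumerate IDs and read strangers' data:
leaked = get_call("token-alice", 202)  # alice fetches bob's record
print(leaked["callee"], leaked["transcript"])
```

Because the call ID is the only thing guarding the record, an attacker simply increments IDs in the API request, which matches the endpoint manipulation TechCrunch describes.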

    Upon notification, the founder, Alex Kiam, moved to shut down the servers and informed users that the app was entering a “temporary pause” so the company could build “extra layers of security.” But he did not explicitly notify users that their recordings and transcripts may already have been accessed by unknown parties, nor did he immediately commit to full transparency or compensation. Business Insider writes that Kiam expects a week or two offline for remediation and a security audit. Meanwhile, Neon’s ascent to top ranks of the App Store has collapsed under the weight of this scandal.

    Beyond the immediate fallout, the Neon case is a cautionary tale for the broader AI ecosystem. It spotlights the appetite of AI developers for new, large voice datasets—and how startups can lean into monetizing personal data without fully building the controls to protect it. Malwarebytes notes that Neon’s business model hinged on collecting very sensitive personal data, all under claims of anonymization and consent, yet the actual security was weak and the privacy promises ripe for abuse.

For users, the lesson is straightforward: any service that pays you to hand over highly personal data should trigger suspicion. Even if a service promises anonymization, the anonymization layer is only as strong as the underlying system security. And for regulators and platform gatekeepers (like Apple and Google), Neon raises uncomfortable questions: how did an app with such glaring flaws make it onto the App Store at all? Should app review demand stronger security vetting before deployment, particularly for apps handling voice, call, or biometric data?

    In the near term, users who installed Neon should assume their call logs, transcripts, recordings, and possibly identities have been exposed. They should delete the app, request account closure, and monitor for misuse of any personal communication data. Developers thinking of launching similar services must prioritize security from day one—things like least privilege access, encryption, secure APIs, and external audits aren’t optional extras when you’re dealing with deeply personal data.
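The object-level check Neon evidently skipped is small. A hedged sketch of what a correctly scoped lookup looks like, again with hypothetical names and data rather than any real API:

```python
# Hypothetical sketch of the missing safeguard: the handler verifies
# both the session AND that the record belongs to the requester.

CALLS = {
    101: {"owner": "alice", "transcript": "alice's call"},
    202: {"owner": "bob",   "transcript": "bob's call"},
}
SESSIONS = {"token-alice": "alice", "token-bob": "bob"}

def get_call(token: str, call_id: int) -> dict:
    """Handler with object-level authorization enforced."""
    user = SESSIONS.get(token)
    if user is None:
        raise PermissionError("not authenticated")
    record = CALLS.get(call_id)
    # Reject both missing and foreign records with the same error, so
    # the response does not confirm that another user's record exists.
    if record is None or record["owner"] != user:
        raise LookupError("no such call")
    return record
```

In a real backend the same constraint belongs in the data layer as well (for example, a `WHERE owner = :user` clause on every query), so a forgotten check in one handler cannot reopen the hole.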

    The Neon collapse is a wake-up call: in the rush to monetize data for AI, skipping infrastructure discipline or security basics is a gamble that users almost always lose.
