      Tech

      OpenAI’s Sora 2 App Faces Backlash Over Non-consensual Face Use and Fetish Content Generation

Updated: February 21, 2026 · 6 Mins Read

A new investigation reveals that Sora 2, the AI video app released by OpenAI, is already being used to create fetish-style videos featuring individuals’ faces without fully informed consent—one tech journalist found that ten of her top twenty-five cameos were fetish-oriented, including belly-inflation and giantess content. The app lets users opt in to “cameos” of their own face, but some leave their settings open to strangers, opening the door to unsettling usage. OpenAI has acknowledged the misuse, and although the app prohibits nudity and overtly sexual content, niche fetish videos appear to be proliferating. Sora 2 also lets users generate videos of well-known figures (including deceased ones) and is now under fire from celebrity estates—the estate of Martin Luther King Jr., for instance, requested a pause on the use of his likeness—while critics say the platform’s safeguards around consent, likeness rights, and deepfake-style misuse remain inadequate. The technology raises new questions about privacy, digital identity, and the future of AI-driven content creation in the social-media era.

      Sources: Business Insider, The Guardian

      Key Takeaways

      – The Sora 2 app enables users to upload their own face and voice “cameo” to appear in generated videos, but when their settings are open to the public this exposure can lead to non-consensual usage in fetish or deep-fake style content.

      – Despite OpenAI’s stated ban on nudity and sexual content, niche fetish outputs (belly-inflation, giantess, feet, etc.) are reportedly widespread—suggesting the guardrails may be too weak or too narrowly defined.

      – Celebrity and estate push-back (such as demands around Martin Luther King Jr.’s likeness) highlight that the implications extend beyond individuals to public figures and copyright/likeness law—raising serious privacy, ethical, and regulatory issues for generative video platforms.

      In-Depth

      The launch of Sora 2 by OpenAI marks a leap in AI-driven video generation, but with that leap comes a substantial set of risks—and right-leaning watchers might question whether the promise of innovation is outpacing the protections for individuals, especially when it comes to explicit or quasi-explicit content and the use of human likenesses. According to the investigation published by Business Insider, a reporter allowed her face to be used in the “cameo” setting of Sora 2, thinking it would be a fun way to explore the app. Instead she discovered that many of the most popular videos using her face involved fetish content—belly-inflation sequences, giantess tropes, foot fetish clips—all created by strangers using her likeness without meaningful consent. The fact that ten out of her top 25 most-viewed “cameos” were fetish videos suggests a pattern—not random isolated misuse but an ecosystem inclined toward this kind of content. The app’s design—allowing anyone (if you permit it) to create a clip with your likeness—means the door is wide open.

From a conservative vantage, this raises immediate concerns. The first is the erosion of individual control: consent should be clear, informed, and revocable—but opting into a cameo and then having your likeness reused in graphic or fetish contexts falls short of that standard. In a world where digital identity is increasingly precious, giving away “face rights” loosely is akin to relinquishing personal control. The second is that the thriving of niche fetish content suggests the platform’s content definitions are misaligned with common-sense expectations. While nudity and overt sexual acts may be banned, fetish content sits in a gray zone—and in practice that gray zone is becoming saturated. If a platform allows a user’s face to be used in fetish contexts by others, it opens the door to reputational harm, emotional distress, and even legal risk.

      Moreover, the issue extends to public figures and deceased individuals. The Guardian reports increasing concern from actors like Bryan Cranston, who have found their likeness used in Sora 2 videos they did not authorize. In response, OpenAI has promised stronger safeguards and has publicly supported legislation like the NO FAKES Act, which aims to ban unauthorized AI-generated likenesses. Even more striking: the estate of Martin Luther King Jr. requested a pause on his likeness being used after “disrespectful” deepfakes appeared—OpenAI complied, but only after public backlash. This reveals a reactive rather than proactive regulatory posture.

      On the business side, OpenAI’s move to launch Sora 2 as a social-video app (rather than simply a research tool or API) is strategic: vertical-scroll feeds, celebrity-style viral content, and user-generated “viral moment” potential tap directly into the largest social-media growth engine. That’s innovation, but from a policy lens it means the platform is exposed to mass-scale misuse almost immediately. The combination of advanced video generation, voice and face encoding, and social-feed mechanics creates a potent mix of creative empowerment and threat to personal and public rights.

      A right-leaning viewpoint would question whether regulation is simply reacting too slowly. Rapid innovation in Big Tech often outpaces rights protections, and platforms like Sora 2 show that even “responsible AI” firms risk becoming conduits for misuse if guardrails are inadequate. The default setting should lean toward individual control and explicit consent, especially when one’s face or voice can appear in content they never envisioned. Platforms should anticipate worst-case uses—from fetish imagery to impersonation and deep-fake fraud—and build default “opt-in from closed” systems, rather than open by default. Additionally, content policies covering niche fetish contexts should be clearer—if misuse is happening because the rules are fuzzy or enforcement weak, liability should tilt toward stricter standards.

In practical terms for individuals, the Sora 2 scenario is a cautionary tale: be very careful about enabling your likeness in any AI platform, especially if you open it to “anyone.” The promise of creative fun may mask serious implications. For developers and platforms, the takeaway is that video generation is a different domain from still-image generation: motion, voice, and likeness combined raise far deeper personal-rights issues. And for regulators and policy-builders, this case underscores that AI content platforms should face the same kind of pre-emptive scrutiny that broadcast, film, and print media once did—not merely after the fact.

In sum, while Sora 2 may represent a breakthrough in creative technology, the way it is currently being used—especially for fetish content and deepfakes of real people’s faces—raises serious ethical and regulatory concerns. If a user’s face can be swept into a video by an app they barely understand, and emerge in a context they wouldn’t choose, then the societal cost of “innovation” can be high. As generative video becomes mainstream, ensuring individual consent, controlling public-figure impersonation, and clarifying the gray zone of fetish content should be treated not as optional but foundational.

