    Tech

    Microsoft’s OneDrive AI Adds Face-Grouping — But You Can Only Opt Out Three Times a Year

    4 Mins Read

Microsoft is rolling out a new “People” grouping feature in OneDrive that uses AI to detect faces in photos and cluster them by person, ostensibly to streamline photo search and organization. According to Microsoft’s support documentation, turning facial grouping on or off is limited to three toggles per year, and disabling the feature triggers deletion of all grouping data within 30 days. This has drawn concern from privacy advocates, especially since the preview appears to opt users in by default rather than asking permission, and Microsoft has not publicly explained the rationale for the three-times-a-year cap. Meanwhile, some preview users report that the toggle restriction may not always be enforced in practice: in tests, they could freely enable or disable the feature without seeing the three-times warning. Microsoft maintains that face scans and biometric data are used only within the individual user’s account and not to train broader AI models.

    Sources: The Register, PC Gamer

    Key Takeaways

    – The new OneDrive “People” feature groups photos by detected faces and is set to default to “on” (opt-out) in preview, raising questions about user consent.

    – Microsoft’s support pages claim users can toggle the feature off only three times per year, though some users in testing have not encountered that limitation in practice.

    – Microsoft insists that all biometric and face-grouping data stays private to the user’s account (i.e. not used to train shared AI models) and will be deleted 30 days after opt-out.

    In-Depth

    Microsoft’s OneDrive is entering the domain of automated facial recognition with a new “People” grouping feature, currently in limited preview. The idea is simple: as photos are uploaded, AI will detect faces, cluster similar ones, and allow users to name individuals so future search by “person” becomes easier. In theory, this is a welcome productivity boost for people juggling thousands of photos scattered across devices and clouds.

But the execution raises red flags. Rather than asking users to opt in, the feature is being enabled for preview users by default, requiring an opt-out from those who prefer not to participate. More concerning is the reported policy that users can toggle the setting only three times per calendar year, per Microsoft’s documentation. That means if you turn it off and later change your mind, you might exhaust your allotted changes and be locked into whatever choice you made until the next year.

Why such a cap? Microsoft has not offered a clear justification. One speculation is practical: turning the feature off triggers deletion of the face-grouping data (Microsoft says all grouping metadata is deleted within 30 days of opt-out), and turning it back on requires rescanning, so repeated toggling might strain system resources or complicate consistency across large photo libraries. But that is at best a convenience argument. When the data in question is biometric, such as face signatures and embeddings, the stakes are higher: many jurisdictions treat facial identifiers as sensitive biometric information subject to stricter regulation and consent requirements.

In practice, some users suggest the toggle limit isn’t always enforced. Windows Central, for example, reported preview testers being able to freely turn the feature on or off without encountering the “three times” warning, hinting that the limitation may be a legacy UI warning rather than a hard backend control. That discrepancy underscores a transparency problem: Microsoft’s official support page still describes the feature as “coming soon,” even though some previews are already active.

    Microsoft’s assurances are worth noting: it claims face grouping data is only used within the user’s own account and will not feed into global AI training. Additionally, when users disable the feature, Microsoft says it will delete all corresponding grouping data within 30 days. These promises align with principles of data minimization and privacy, but the public has little means to audit or verify them. Without transparency or third-party review, users must trust Microsoft’s word.

From a right-leaning perspective, I see this as emblematic of the tech industry’s push to normalize AI and biometric processing, often ahead of user understanding or regulatory guardrails. Yes, tools like this can add real convenience, but privacy should never be the default sacrifice for ease. Companies should default to opt-in for sensitive features like facial recognition, and if they impose a toggle limit, they should clearly explain it and let users override it in exceptional cases.

    Until more details emerge, users should vigilantly monitor OneDrive’s settings, opt out if they’re uncomfortable, and treat Microsoft’s promises as conditional rather than absolute. With AI increasingly woven into core services, vigilance is the best defense.

