    Microsoft’s OneDrive AI Adds Face-Grouping — But You Can Only Opt Out Three Times a Year

    Microsoft is rolling out a new “People” grouping feature in OneDrive that uses AI to detect faces in photos and cluster them by person, ostensibly to streamline photo search and organization. According to Microsoft’s support documentation, turning this facial grouping on or off is limited to three toggles per year, and disabling the feature leads to deletion of all grouping data within 30 days. This has triggered concern from privacy advocates, especially since the default preview behavior seems to opt users in (rather than asking permission) and Microsoft has not publicly explained the rationale behind the three-times-a-year cap. Meanwhile, some preview users report that in practice the toggle restriction may not always be enforced — in tests, users could freely enable or disable the feature without seeing the three-times warning. Microsoft maintains that the face scans and biometric data are only used for the individual user’s account and not to train broader AI models.

    Sources: The Register, PC Gamer

    Key Takeaways

    – The new OneDrive “People” feature groups photos by detected faces and is set to default to “on” (opt-out) in preview, raising questions about user consent.

    – Microsoft’s support pages claim users can toggle the feature off only three times per year, though some users in testing have not encountered that limitation in practice.

    – Microsoft insists that all biometric and face-grouping data stays private to the user’s account (i.e. not used to train shared AI models) and will be deleted 30 days after opt-out.

    In-Depth

    Microsoft’s OneDrive is entering the domain of automated facial recognition with a new “People” grouping feature, currently in limited preview. The idea is simple: as photos are uploaded, AI will detect faces, cluster similar ones, and allow users to name individuals so future search by “person” becomes easier. In theory, this is a welcome productivity boost for people juggling thousands of photos scattered across devices and clouds.
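
    Microsoft hasn't said how the clustering works under the hood, but features like this generally convert each detected face into a numeric embedding and group embeddings that sit close together. The sketch below illustrates that general idea only; it is not Microsoft's implementation. The random vectors stand in for the output of a real face-detection model, and the similarity threshold is arbitrary.

```python
import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def group_faces(embeddings: list[np.ndarray], threshold: float = 0.75) -> list[list[int]]:
    """Greedily assign each embedding to the most similar existing cluster
    centroid if it clears the threshold; otherwise start a new cluster.
    Returns clusters as lists of indices into `embeddings`."""
    clusters: list[list[int]] = []
    centroids: list[np.ndarray] = []
    for i, emb in enumerate(embeddings):
        best, best_sim = None, threshold
        for c, centroid in enumerate(centroids):
            sim = cosine_similarity(emb, centroid)
            if sim >= best_sim:
                best, best_sim = c, sim
        if best is None:
            clusters.append([i])
            centroids.append(emb.copy())
        else:
            clusters[best].append(i)
            # Running-mean update of the cluster centroid.
            n = len(clusters[best])
            centroids[best] += (emb - centroids[best]) / n
    return clusters


# Toy demo: two "people" simulated as noisy copies of two base vectors.
rng = np.random.default_rng(0)
base_a, base_b = rng.normal(size=128), rng.normal(size=128)
faces = [base_a + rng.normal(scale=0.1, size=128) for _ in range(3)]
faces += [base_b + rng.normal(scale=0.1, size=128) for _ in range(2)]
print(group_faces(faces))  # expected grouping: [[0, 1, 2], [3, 4]]
```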

    But the execution raises red flags. Rather than asking users to opt in, the feature is being enabled for preview users by default — requiring an opt-out for those who prefer not to participate. More concerning is the reported policy that users can only disable the feature three times per calendar year, per Microsoft’s documentation. That means if you toggle it off and later change your mind, you might hit your limit of changes and be locked into whatever choice you made until the next year.
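
    To make the reported cap concrete, here is a hypothetical sketch of how a yearly toggle quota could be enforced server-side. Every name here is invented, and the assumption that the counter resets each calendar year is mine; Microsoft has not described any actual mechanism.

```python
from datetime import datetime, timezone
from typing import Optional

MAX_TOGGLES_PER_YEAR = 3  # the cap described in Microsoft's support docs


class FaceGroupingSetting:
    """Hypothetical per-user setting with a yearly toggle quota.
    Illustrative only; not Microsoft's actual implementation."""

    def __init__(self) -> None:
        self.enabled = True                    # preview reportedly defaults to on
        self._quota_year: Optional[int] = None
        self._toggles_used = 0

    def set_enabled(self, value: bool, now: Optional[datetime] = None) -> bool:
        """Attempt to change the setting; return False once the cap is hit."""
        now = now or datetime.now(timezone.utc)
        if now.year != self._quota_year:       # assumed: quota resets each calendar year
            self._quota_year, self._toggles_used = now.year, 0
        if value == self.enabled:
            return True                        # no change, no quota consumed
        if self._toggles_used >= MAX_TOGGLES_PER_YEAR:
            return False                       # locked in until next year
        self.enabled = value
        self._toggles_used += 1
        return True


# Example: three successful changes, then the fourth is refused.
setting = FaceGroupingSetting()
assert setting.set_enabled(False) and setting.set_enabled(True) and setting.set_enabled(False)
assert setting.set_enabled(True) is False      # quota exhausted for this year
```

    Under this model, a user who turns the feature off, back on, and off again has spent all three changes and is locked in until January, which is exactly the scenario privacy advocates are worried about.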

    Why such a cap? Microsoft has not offered a clear justification. One plausible explanation is operational: turning the feature off triggers deletion of the face-grouping data (Microsoft says all grouping metadata is deleted within 30 days of opt-out), and turning it back on requires rescanning the library, so repeated toggling could strain system resources or complicate consistency across large photo collections. But that is, at best, a convenience argument. When you’re dealing with biometric data (face signatures, embeddings), the stakes are higher: in many jurisdictions, facial identifiers of this kind are treated as sensitive biometric information, subject to stricter regulation and consent requirements.
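
    The deletion side of that cycle can be sketched the same way. The only confirmed detail is the 30-day window after opt-out; the idea that re-enabling cancels the purge and triggers a rescan is an inference, and the function below is illustrative, not Microsoft's code.

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

RETENTION_AFTER_OPT_OUT = timedelta(days=30)  # the window Microsoft has stated


def purge_due(opted_out_at: datetime, re_enabled_at: Optional[datetime],
              now: Optional[datetime] = None) -> bool:
    """Return True once face-grouping metadata should be deleted: the user
    opted out at least 30 days ago and has not re-enabled the feature since.
    Hypothetical logic inferred from Microsoft's public statements."""
    now = now or datetime.now(timezone.utc)
    if re_enabled_at is not None and re_enabled_at > opted_out_at:
        return False  # the user came back; a rescan would rebuild the groups
    return now - opted_out_at >= RETENTION_AFTER_OPT_OUT
```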

    In practice, some users suggest the toggle limit isn’t always enforced. For example, Windows Central reported preview testers being able to freely turn the feature on or off without encountering the “three times” warning — hinting the limitation may be a legacy UI warning rather than a hard backend control. That discrepancy underscores a transparency problem: Microsoft’s official support page still describes the feature as “coming soon,” even though some previews are already active.

    Microsoft’s assurances are worth noting: it claims face grouping data is only used within the user’s own account and will not feed into global AI training. Additionally, when users disable the feature, Microsoft says it will delete all corresponding grouping data within 30 days. These promises align with principles of data minimization and privacy, but the public has little means to audit or verify them. Without transparency or third-party review, users must trust Microsoft’s word.

    From a right-leaning perspective, I see this as emblematic of the tech industry’s push to normalize AI and biometric processing, often ahead of user understanding or regulatory guardrails. Yes, tools like this can add real convenience, but privacy should never be the default sacrifice for ease. Companies should default to opt-in for sensitive features like facial recognition, and if they choose a toggle limit, they should clearly explain it and let users override it in exceptional cases.

    Until more details emerge, users should vigilantly monitor OneDrive’s settings, opt out if they’re uncomfortable, and treat Microsoft’s promises as conditional rather than absolute. With AI increasingly woven into core services, vigilance is the best defense.
