Microsoft's OneDrive spots your mates, remembers their faces, and won't forget easily

Microsoft's OneDrive is increasing the creepiness quotient by using AI to spot faces in photos and group images accordingly. Don't worry, it can be turned off - three times a year.

This writer has been enrolled in a OneDrive mobile feature that groups photos by the people in them. We're not alone - other users have reported it appearing on their devices too.

According to Microsoft, the feature is coming soon but has yet to be released - and it's likely to send a shiver down the spines of privacy campaigners.

It relies on users telling OneDrive who the face is in a given image, and will then create a collection of photos based on the identified person. Obviously, user interaction is required, and asking a user to identify faces in an image is hardly innovative. However, OneDrive's grouping of images based on an identified face is different. According to Microsoft's documentation, a user can only change the setting to enable or disable the new People section three times a year.

The Register asked Microsoft why only three times, but the company has yet to provide an explanation.

Unsurprisingly, Microsoft noted: "Some regions require your consent before we can begin processing your photos" - we can imagine a number of regulators wanting to discuss this. It took until July 2025 before Microsoft was able to make Recall available in the European Economic Area (EEA), partly due to how data was processed.

However, it is that seemingly arbitrary three-times-a-year limit applied to the People section that is most concerning. Why not four? Why not as many times as a user wants?

Turning it off will result in all facial grouping data being permanently removed within 30 days. There is also no indication of what Microsoft means by three times a year. Does the year run from when the setting is first changed, from when facial grouping was first enabled, or from some other date?

This feature is currently in preview and has yet to reach all users. While Microsoft is clear that it won't use facial scans or biometric data in the training of its AI models, and that grouping data can't be shared (for example, if a user shares a photo or album with another user), the idea of images being used in this way might make some customers uncomfortable. ®