Microsoft is planning to introduce an AI-powered “Recall” feature to its Copilot+ line of PCs, and the plan has sparked privacy concerns. Recall will allow users to easily find and revisit anything they’ve seen on their PC by taking periodic snapshots of the screen and analyzing them. The information will be stored in a way that lets users search for it using natural language, and Copilot+ PCs will organize it based on relationships unique to each user. Default configurations will ship with enough storage for up to three months of snapshots.
Microsoft has implemented several measures to protect user privacy and security, including storing all data captured by Recall locally on the user’s Copilot+ PC in encrypted form. The technology will not save continuous audio or video, and users can disable the feature. Enterprise administrators can also disable Recall automatically using Group Policy or mobile device management (MDM) policies.
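For illustration only, the sketch below shows how such a policy could be applied programmatically on a single machine. The registry path and the DisableAIDataAnalysis value name follow Microsoft’s published WindowsAI policy settings, but treat them as assumptions and verify against current documentation before relying on them; in practice administrators would push the equivalent setting through Group Policy or MDM rather than scripting it.

```python
# Minimal sketch (assumption): set the per-user policy value that is
# documented as turning off Recall-style snapshot analysis on Windows.
# Requires Windows; the key path and value name are taken from Microsoft's
# WindowsAI policy CSP documentation and should be verified independently.
import winreg

KEY_PATH = r"Software\Policies\Microsoft\Windows\WindowsAI"


def disable_recall_snapshots() -> None:
    # Create (or open) the policy key under HKEY_CURRENT_USER and set
    # DisableAIDataAnalysis = 1. Group Policy or an MDM policy would write
    # the same value centrally instead of editing the registry directly.
    with winreg.CreateKey(winreg.HKEY_CURRENT_USER, KEY_PATH) as key:
        winreg.SetValueEx(key, "DisableAIDataAnalysis", 0, winreg.REG_DWORD, 1)


if __name__ == "__main__":
    disable_recall_snapshots()
    print("Recall snapshot policy set to disabled for the current user.")
```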
Despite these assurances, concerns have been raised about potential privacy and security risks associated with Recall. Security researcher Kevin Beaumont described Recall as a new “security nightmare” because it can capture sensitive information such as passwords and financial account numbers without content moderation. The UK Information Commissioner’s Office (ICO) has requested more transparency from Microsoft regarding the technology.
Gal Ringel, CEO of Mine, criticized Recall for its invasive nature and its lack of safeguards for hiding sensitive data. Even with Microsoft’s resources for securely processing and storing large amounts of unstructured data, collecting potentially sensitive screenshots carries real risk, and organizations using Recall should weigh their own risk profiles and apply additional privacy safeguards.
The continuous screenshot capture functionality of Recall poses a potential threat to sensitive data if a device is compromised, according to Stephen Kowski, Field CTO at SlashNext. While Microsoft has implemented controls to protect user privacy, there are suggestions for additional safeguards, such as automatically identifying and redacting sensitive data in screenshots and providing clear user consent flows.
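As a rough illustration of what such a redaction safeguard might look like, the sketch below applies simple pattern matching to hypothetical OCR text extracted from a snapshot before it is stored. The patterns, function name, and `ocr_text` parameter are invented for this example and are not part of Recall or any Microsoft API.

```python
# Illustrative sketch only: redact obviously sensitive strings from OCR'd
# snapshot text using regular expressions. Real safeguards would need far
# more robust detection (and likely ML-based classification).
import re

SENSITIVE_PATTERNS = {
    "card_number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "password_field": re.compile(r"(?i)password\s*[:=]\s*\S+"),
}


def redact(ocr_text: str) -> str:
    """Replace anything matching a sensitive pattern with a labeled placeholder."""
    redacted = ocr_text
    for label, pattern in SENSITIVE_PATTERNS.items():
        redacted = pattern.sub(f"[REDACTED {label}]", redacted)
    return redacted


if __name__ == "__main__":
    sample = "password: hunter2 and card 4111 1111 1111 1111"
    print(redact(sample))
```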
Although Recall is similar to the user and entity behavior analytics (UEBA) tools organizations use for endpoint security monitoring, it adds extra exposure to endpoints that attackers can target. Johannes Ullrich, dean of research at the SANS Institute, emphasized that UEBA data collection is designed with security in mind, whereas Recall hands attackers a database of a user’s past activity.
Microsoft has not directly addressed the growing privacy concerns but has highlighted privacy mechanisms implemented around Recall in a blog post. The company’s assurances have not fully alleviated worries about the potential risks associated with the technology.
Article Source: https://www.darkreading.com/data-privacy/microsofts-recall-feature-draws-criticism-from-privacy-advocates