Tuesday, June 11, 2024



Microsoft Updates Recall Function on Copilot+ PCs to Address Privacy Concerns

Microsoft is facing scrutiny from security and privacy experts over its new Recall feature. Recall, available on Microsoft’s Copilot+ line of Windows 11 PCs, builds a visual timeline of a user’s activity by taking a screenshot of the screen every five seconds. The screenshots are stored locally on the device and analyzed by on-device AI so that users can search and query their past activity. Security experts, however, have warned that Recall could be exploited by attackers, compromising user data and privacy.
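To make the mechanism concrete, here is a minimal sketch of a Recall-style capture loop in Python. It is not Microsoft’s implementation: it simply grabs a screenshot every five seconds with Pillow’s ImageGrab and saves it locally, and ocr_and_index() is a hypothetical placeholder for the on-device analysis and indexing step.

    # Minimal sketch of a Recall-style capture loop (not Microsoft's implementation):
    # take a screenshot every five seconds and store it locally for later indexing.
    import time
    from datetime import datetime
    from pathlib import Path

    from PIL import ImageGrab  # Pillow; screen capture on Windows and macOS

    SNAPSHOT_DIR = Path.home() / "recall_sketch_snapshots"
    SNAPSHOT_DIR.mkdir(exist_ok=True)

    def ocr_and_index(image_path: Path) -> None:
        """Hypothetical stand-in for the on-device AI analysis and search indexing."""
        pass

    def capture_loop(interval_seconds: int = 5) -> None:
        while True:
            snapshot = ImageGrab.grab()  # capture the full screen
            path = SNAPSHOT_DIR / f"{datetime.now():%Y%m%d_%H%M%S}.png"
            snapshot.save(path)          # stored locally, as Recall does
            ocr_and_index(path)
            time.sleep(interval_seconds)

    if __name__ == "__main__":
        capture_loop()

The point of the sketch is simply that everything visible on screen ends up as files on the local disk, which is why the storage and access questions below matter so much.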

One of the main worries is that Recall captures and stores all screen activity, including content shown in encrypted apps, and that those stored screenshots could be accessed by anyone using the same device. Milli Sen, the founder of Paradigm, a firm specializing in generative AI, said that privacy had been the biggest concern about Recall from the start, and that those concerns now appear to be justified.

Cybersecurity expert Kevin Beaumont warned that, with Recall, it is now possible to steal everything a user has ever typed or viewed on their Windows PC with just two lines of code. The warning raised significant alarm about the potential for unauthorized access to sensitive information.
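Beaumont’s underlying point is that data stored unencrypted under a user’s profile can be read by any code running as that user, including malware. The sketch below illustrates that general risk; the file path and table name are hypothetical placeholders, not Recall’s actual layout.

    # Illustrative sketch only: an unencrypted local database is readable by any
    # process running under the same user account. Path and schema are hypothetical.
    import sqlite3
    from pathlib import Path

    db_path = Path.home() / "AppData" / "Local" / "ExampleRecallStore" / "index.db"  # hypothetical path

    with sqlite3.connect(db_path) as conn:
        # With no encryption at rest, dumping the data is trivial once an attacker
        # can run code as the logged-in user.
        for row in conn.execute("SELECT * FROM captured_text"):  # hypothetical table
            print(row)

This is why the encryption and authentication changes Microsoft announced, described next, are central to the fix.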

In response to the backlash, Microsoft announced updates to the Recall function. First, Recall will now be disabled by default on all Copilot+ PCs, instead of being enabled by default as originally planned. Second, enrolling in Windows Hello, which lets users sign in with facial recognition, a fingerprint, or a PIN, will be required to enable Recall. This added layer of authentication is intended to ensure that only an authenticated, physically present user can access their Recall timeline.

Furthermore, Microsoft is enhancing data protection through Windows Hello Enhanced Sign-in Security. This means that Recall snapshots can only be decrypted and accessed after user authentication. The search index database will also be encrypted to further safeguard user data.
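The “decrypt only after authentication” idea can be illustrated with a short, hedged sketch using symmetric encryption from Python’s cryptography package; user_authenticated() is a purely hypothetical stand-in for a Windows Hello proof-of-presence check, not an actual Windows API.

    # Sketch of snapshots that are encrypted at rest and decrypted only after the
    # user authenticates. Uses Fernet symmetric encryption from the `cryptography`
    # package; the authentication check is a hypothetical placeholder.
    from pathlib import Path
    from cryptography.fernet import Fernet

    def user_authenticated() -> bool:
        """Hypothetical stand-in for a biometric/PIN proof-of-presence check."""
        return False

    def store_snapshot(raw_bytes: bytes, key: bytes, out_path: Path) -> None:
        # Snapshots are written to disk only in encrypted form.
        out_path.write_bytes(Fernet(key).encrypt(raw_bytes))

    def read_snapshot(path: Path, key: bytes) -> bytes:
        # Decryption is gated on a successful authentication check.
        if not user_authenticated():
            raise PermissionError("Authentication required to view snapshots")
        return Fernet(key).decrypt(path.read_bytes())

    if __name__ == "__main__":
        key = Fernet.generate_key()
        store_snapshot(b"example snapshot bytes", key, Path("snapshot.bin"))

Under this model, the snapshot files on disk are ciphertext, and the application only decrypts them after an authentication check succeeds; in Microsoft’s description, Windows Hello Enhanced Sign-in Security plays that gatekeeping role.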

While Microsoft’s updated privacy and security commitments have been received with cautious optimism, concerns remain about the implementation details. Security researchers are encouraged to conduct deep dives into the updates to verify that the measures are effective.

Kevin Beaumont called on Microsoft to commit to never forcing users to enable the Recall function in the future. He also said that Recall should be turned off by default in Microsoft’s Group Policy and Intune management tools for enterprise organizations. These measures would give users and administrators greater control over their privacy and security.

In light of recent security incidents, including a Chinese hacking incident that compromised U.S. national security interests, Microsoft’s security practices have come under scrutiny. A federal review board found that the hacking incident was preventable and blamed Microsoft’s inadequate security culture. The board emphasized the need for Microsoft to demonstrate the highest standards of security, accountability, and transparency.

As a result, Microsoft’s vice chairman and president, Brad Smith, is set to testify before the House Homeland Security Committee to address the company’s security shortcomings and outline plans to strengthen security measures. This hearing will provide lawmakers with insights into Microsoft’s commitment to protecting user data and ensuring a more secure technology ecosystem.

In conclusion, Microsoft’s Recall function has raised concerns among experts regarding privacy and security risks. The company has responded to the backlash by implementing updates to enhance user authentication and data protection. However, questions remain about the implementation details and the overall security culture at Microsoft. The upcoming hearing will shed light on Microsoft’s plans to address these concerns and improve its security practices.
