Cheraw Chronicle


Microsoft Recall is off to a bad start on the security front


It’s been a week since Microsoft proudly introduced Copilot+ PCs, “the fastest, most secure Windows PCs ever,” as the company put it. However, it now appears that the security of Recall, one of the announced AI applications, leaves a lot to be desired.

Things were off to a reasonably good start for Microsoft. The company continues to boast about performance gains over the MacBook Air and about all the software made possible by the new Neural Processing Unit (NPU). Last Friday, however, it became clear that Recall, presented as a flagship AI tool, could simply run on a laptop with lower hardware specifications than those of Copilot+ PCs, even though Microsoft had positioned it as something exclusive to those machines. While this may say more about Recall than about the capabilities of the new wave of “AI PCs,” as they are being called, it was at least notable. Now it appears that the same tool was not properly secured either.

What does Recall do again?

Computers become a mess over time, and you tend to forget files or certain pieces of information: think of a particular PowerPoint slide or a conversation with a colleague. Recall aims to help you find all these digital items again. It does this by constantly “monitoring” what you do on your computer and then generating an organized record, which you can browse in a kind of digital timeline. That sounds very useful, but it also means giving up some of your privacy, I hear you think. True, although the app does not record anything when you have a private browser window open or when you view copyrighted material. That still leaves a lot of information, especially since the feature is switched on automatically.

But you can at least assume that this information is stored under lock and key, right? Well, that turns out not to be the case, writes security researcher Kevin Beaumont on his blog. “Screenshots are taken every few seconds,” he says. “They are automatically OCR-scanned by Azure AI running on your device and saved to a database in the user’s folder.” The problem is that this information turns out to be stored unencrypted and is therefore readable. Malicious parties with access to it could learn a great deal about you, more than if they had to hack each program individually. In effect, this makes a hacker’s life considerably easier.
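To see why unencrypted storage matters, here is a minimal Python sketch. The file name, table name, and contents below are invented for illustration; the article does not describe the real Recall database layout. The point is only that a plaintext SQLite file can be read by any code running under the same user account, using nothing more than the standard library:

```python
import os
import sqlite3
import tempfile

# Stand-in for an unencrypted on-disk database like the one Beaumont
# describes. The path, schema, and contents here are hypothetical.
db_path = os.path.join(tempfile.gettempdir(), "recall_demo.db")
if os.path.exists(db_path):
    os.remove(db_path)

# Simulate an app writing OCR'd screen text to disk, unencrypted.
conn = sqlite3.connect(db_path)
conn.execute("CREATE TABLE captures (ts TEXT, ocr_text TEXT)")
conn.execute("INSERT INTO captures VALUES ('10:00', 'password: hunter2')")
conn.commit()
conn.close()

# A second, unrelated connection (standing in for any other process
# running as the same user) reads the data straight back out.
snoop = sqlite3.connect(db_path)
rows = snoop.execute("SELECT ts, ocr_text FROM captures").fetchall()
snoop.close()
os.remove(db_path)

print(rows)
```

With encryption at rest, that second read would yield ciphertext instead of usable text; without it, file access alone is enough.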

Why is Microsoft doing this?

You would expect a company like Microsoft to know better and to disclose technical details like these itself, rather than letting others surprise it with them. After all, they turn out not to be particularly difficult to uncover. This is perhaps a symptom of the speed with which Microsoft is launching its AI products. Microsoft undoubtedly saw the collaboration with OpenAI (and thus ChatGPT) as an opportunity to gain an advantage in this race, and quickly integrated AI into several products and services under the name Copilot. The question remains, however, whether its users are really waiting for all this.

With the announcement of the Copilot+ PCs, Microsoft wants to take the lead in AI, but practice will have to show whether this is the right strategy, or whether the company has acted hastily and overconfidently. While Microsoft is investing heavily in better hardware, we are simultaneously seeing growth in cloud software, which runs over the internet. If we need expensive hardware at all, it is for specialized AI software, and if that software ends up seeing too little use in practice, it will prove a painful miscalculation on Microsoft’s part.

