Microsoft plans to run parts of its Copilot software locally on PCs, according to Intel executives. The problem, however, is that many devices do not have enough computing power to make this possible.
Much of Copilot's computation is currently done in the cloud. If part of it could run locally, it would reduce latency and bring performance and privacy benefits, because data would no longer have to be sent to the cloud. These are exactly Microsoft's plans, Tom's Hardware reports based on statements by Intel executives, who discussed the matter during the company's AI Summit in Taipei.
However, hardware remains an important limitation. The intention is for devices to have enough computing power to run parts of the Copilot software locally without users noticing: devices should not become slower and the battery should not drain faster. The latter can be prevented by using a neural processing unit, or NPU, instead of a graphics processing unit, Intel executives say. Microsoft will also insist on this.
TrendForce reported in January that so-called AI PCs with Copilot integration should have NPUs capable of 40 tera operations per second, or TOPS. Intel executives have now confirmed this, but few NPUs currently reach that level. Intel says its Core Ultra chips currently offer between 10 and 15 TOPS, while AMD offers NPUs with up to 16 TOPS. The only processor commonly used to run Windows that meets the requirement is Qualcomm's Snapdragon X Elite, at 45 TOPS. Intel's next generation of Core Ultra chips should also meet the requirement, according to the company.
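As a rough illustration of the figures above, the sketch below compares the cited NPU ratings against the reported 40 TOPS requirement. The numbers are those mentioned in the article; the script itself is purely hypothetical and only makes the comparison explicit.

```python
# Hypothetical sketch: compare reported NPU ratings with the 40 TOPS requirement.
# The threshold and chip figures are taken from the article; nothing else is official.
COPILOT_NPU_REQUIREMENT_TOPS = 40

npu_tops = {
    "Intel Core Ultra (current)": 15,       # article cites 10-15 TOPS; upper bound used here
    "AMD (current NPUs)": 16,
    "Qualcomm Snapdragon X Elite": 45,
}

for chip, tops in npu_tops.items():
    verdict = "meets" if tops >= COPILOT_NPU_REQUIREMENT_TOPS else "falls short of"
    print(f"{chip}: {tops} TOPS, {verdict} the {COPILOT_NPU_REQUIREMENT_TOPS} TOPS requirement")
```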