Anonymous 04/02/2024 (Tue) 03:55 No.61026 del
>>60244

to add:

this is more what I was looking for, but a Mistral LLM would be a plus:

Microsoft Copilot AI will soon run locally on PCs
According to Intel, it could require a specific level of AI processing power.
Mar 28, 2024

Microsoft's Copilot AI service is set to run locally on PCs, Intel told Tom's Hardware. The company also said that next-gen AI PCs would require built-in neural processing units (NPUs) with over 40 TOPS (trillion operations per second) of power — beyond the capabilities of any consumer processor on the market.


Intel said that the AI PCs would be able to run "more elements of Copilot" locally. Currently, Copilot runs nearly everything in the cloud, even small requests. That creates a fair amount of lag that's fine for larger jobs, but not ideal for smaller jobs. Adding local compute capability would decrease that lag, while potentially improving performance and privacy as well.


Microsoft was previously rumored to require 40 TOPS on next-gen AI PCs (along with a modest 16GB of RAM). Right now, Windows doesn't make much use of NPUs, apart from running video effects like background blurring for Surface Studio webcams. ChromeOS and macOS both use NPU power for more video and audio processing features, though, along with OCR, translation, live transcription and more, Ars Technica noted.
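As a back-of-envelope illustration of that 40 TOPS bar (my own math, not the article's): NPU peak throughput is commonly estimated as 2 × MAC units × clock rate, since each multiply-accumulate counts as two operations. The NPU specs below are hypothetical, just to show the arithmetic.

```python
def estimated_tops(mac_units: int, clock_hz: float) -> float:
    """Estimate peak throughput in trillions of operations per second.

    Uses the common convention that one multiply-accumulate (MAC)
    counts as two operations.
    """
    return 2 * mac_units * clock_hz / 1e12


def meets_copilot_bar(tops: float, required: float = 40.0) -> bool:
    """Check against the 40 TOPS figure Intel cites for next-gen AI PCs."""
    return tops >= required


# Hypothetical NPU: 4096 MAC units at 1.4 GHz (illustrative numbers only)
tops = estimated_tops(4096, 1.4e9)
print(f"{tops:.1f} TOPS, meets bar: {meets_copilot_bar(tops)}")
```

By this rough measure, a hypothetical 4096-MAC NPU at 1.4 GHz lands around 11.5 TOPS, well short of the 40 TOPS figure, which gives a sense of why current consumer NPUs fall below the rumored requirement.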
