Beyond the Hype
AI PCs are fast becoming a major talking point for vendors and organizations alike, and for businesses looking to refresh their devices, making the right choice has never been more important.
In a recent webinar, Elliott Jones spoke with AI expert Rob May to discuss whether the talk around AI PCs is all hype or whether there is more to it. In this article, we share the key takeaways from the webinar, along with subtitled video clips, to help you make informed decisions as we enter a new era of computing.
What Is an AI PC?
An AI PC contains a dedicated chipset designed to handle artificial intelligence applications, and it uses AI to improve performance, boost security, and provide greater personalization.
Programs powered by AI are nothing new, but PCs featuring neural processing units (NPUs), chips designed to accelerate machine learning tasks, are a relatively new frontier. With the rise of AI-powered chatbots such as ChatGPT, we’ve heard plenty about Large Language Models (LLMs): models that, trained on vast datasets, can understand and generate human language.
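To make the NPU idea concrete, the short Python sketch below shows one way an application might check whether an NPU-backed accelerator is available before offloading machine learning work to it. It uses the open-source ONNX Runtime library; the specific provider names are illustrative assumptions, since the exact set varies by hardware vendor and by how the runtime was built.

```python
# A minimal sketch, assuming ONNX Runtime is installed, of checking which
# execution providers are available on this machine and preferring an
# NPU-backed one. Provider names such as "QNNExecutionProvider" (Qualcomm)
# and "DmlExecutionProvider" (DirectML on Windows) are illustrative; the
# exact list depends on the hardware and the onnxruntime build.
import onnxruntime as ort

available = ort.get_available_providers()

# Prefer an NPU-capable provider when present, otherwise fall back to the CPU.
preference = ["QNNExecutionProvider", "DmlExecutionProvider", "CPUExecutionProvider"]
chosen = [p for p in preference if p in available] or ["CPUExecutionProvider"]

print("Available providers:", available)
print("Would run inference with:", chosen[0])

# An inference session would then be created with the chosen providers, e.g.:
# session = ort.InferenceSession("model.onnx", providers=chosen)
```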
AI PCs work in a similar way on a smaller, more localized scale, using Small Language Models (SLMs) that, though more limited, are better suited to optimizing individual devices and performing specific, focused tasks. One key benefit of this approach is the ability to selectively decide which data is processed and stored on the machine itself and which is sent to the cloud, which Rob May thinks offers the best of both worlds.
“That mixed approach combines the strength of the cloud for intensive tasks and data storage with the speed and the privacy advantages of local processing,” he explains. This combination of advantages, Rob adds, “lowers latency, saves bandwidth, improves data security, and reduces the amount of sensitive data that’s sent to the cloud.”
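To illustrate the hybrid pattern Rob describes, here is a minimal sketch of a routing rule that keeps small or sensitive requests on the device’s SLM and sends only heavier work to a cloud LLM. The functions run_local_slm() and call_cloud_llm(), the keyword list, and the size threshold are all hypothetical placeholders rather than part of any real product or API.

```python
# A minimal sketch of hybrid local/cloud routing, as described above: sensitive
# or lightweight requests stay on the device's SLM, while intensive work goes
# to a larger cloud model. run_local_slm() and call_cloud_llm() are hypothetical
# stand-ins, and the keyword list and word-count threshold are assumptions.

SENSITIVE_KEYWORDS = {"payroll", "medical", "password"}
LOCAL_WORD_LIMIT = 512  # assumed capacity of the on-device SLM


def run_local_slm(prompt: str) -> str:
    """Placeholder for on-device inference on the NPU."""
    return f"[on-device SLM] {prompt[:40]}"


def call_cloud_llm(prompt: str) -> str:
    """Placeholder for a call to a cloud-hosted LLM."""
    return f"[cloud LLM] {prompt[:40]}"


def route_request(prompt: str) -> str:
    """Keep sensitive or small jobs local; send heavy jobs to the cloud."""
    is_sensitive = any(word in prompt.lower() for word in SENSITIVE_KEYWORDS)
    if is_sensitive or len(prompt.split()) <= LOCAL_WORD_LIMIT:
        # Lower latency, less bandwidth, and no sensitive data leaves the device.
        return run_local_slm(prompt)
    return call_cloud_llm(prompt)


if __name__ == "__main__":
    print(route_request("Summarize this morning's meeting notes"))
    print(route_request("Draft an email about the payroll changes"))
```

A real deployment would use far more nuanced routing rules, but the principle is the one Rob outlines: process what you can locally and reserve the cloud for work that genuinely needs it.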