What the heck is an AI PC? I spoke to Intel, AMD and Qualcomm to find out
Let's get to the bottom of this
I heard the phrase “AI PC” 36 times while I was at MWC 2024, and you’ve probably heard it hundreds more since Intel Core Ultra dropped at the end of last year. Every time I hear it, one question becomes a little louder in my head: what the hell is an AI PC?
Don’t get me wrong, the name kind of gives the game away — it’s a PC that does AI (duh). But as I started talking to the likes of Intel, Qualcomm and AMD, the answer became a lot more complicated than that, with huge future implications too.
So whether you need a quick explainer of this term, or you’re keen to see where this whole AI thing is going, stick around. We’re about to go into the hypothetical tall grass (probably generated by DALL-E).
The official definition of an AI PC
The term "AI PC" was coined by Intel for the company’s Core Ultra (Meteor Lake) chipset launch, so it technically is a bit of marketing speak. Here’s how the company defines it:
- Comes with Copilot — Pretty self-explanatory. You can’t really have an AI PC without the AI to interact with.
- Has the Copilot key — RIP Right Control. Microsoft is going all-in on AI and making a small but significant keyboard change by replacing that key with a dedicated Copilot key. This isn’t a hard and fast rule, though.
- Comes with new NPU-, CPU-, and GPU-powered silicon — Translation: it runs Intel Core Ultra.
One key example that came out of MWC is the Honor MagicBook Pro 16, which packs a combo of an Intel Core Ultra chipset and a dedicated RTX 40-series GPU. I’ll leave my official thoughts for the review, but as of right now, I can safely confirm it’s quite the powerful creator laptop.
What it actually means
We can sit here and say that Intel has exclusive ownership of the term, but it’s certainly not the only company taking AI seriously in its chips. AMD was actually the first to add a dedicated AI engine to its silicon (the Ryzen 7040 series, to be specific), Qualcomm’s Snapdragon X Elite is set to do the same later this year, and Apple is bragging about the M3 MacBook Air being the best consumer laptop for AI.
So instead of sticking firm with Intel’s marketing, let’s dive a little deeper into the future every chipmaker is chasing down at the moment. Put simply, it’s about bringing AI processes down from the cloud and onto the device.
With a cloud server in the middle, you’re dealing with obstacles like latency and potential security scares. Running AI processes on the device itself leads to far faster performance (Intel claims up to 2.2x AI performance for video editing compared to 13th Gen chips) and better power efficiency (a 36% power reduction for video conferencing compared to that same last-gen silicon).
Intel says it has heard this challenge from developers many times, and it’s what led to the NPU being born. “The general trend is to offload as much of it [the AI processes] to endpoint [on-device] as possible,” commented David Feng, VP of Client Computing Group at Intel.
The benefits of this are vast, from faster operations that don’t rely on the cloud and its latency, to on-device security and preservation of battery life. To average joes like you and me, that’s huge when it comes to relying on AI to speed up workflows, such as intelligently filling in parts of an image in Photoshop or asking Copilot in Microsoft 365 to help generate a monster spreadsheet.
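To picture what “offloading to the endpoint” looks like in practice, here’s a minimal Python sketch of my own (not Intel’s or Qualcomm’s actual software stack) using the open-source ONNX Runtime library: it checks which local accelerators are available and prefers them, only falling back to the plain CPU when nothing better exists. The model file name is a placeholder.

```python
# A hypothetical illustration of the "run AI on the endpoint" idea using ONNX
# Runtime. The provider names are real ONNX Runtime identifiers; the model
# file is a placeholder.
import onnxruntime as ort

# Ordered by preference: local accelerators first, plain CPU as a last resort.
PREFERRED = [
    "DmlExecutionProvider",       # DirectML: GPUs/NPUs on Windows
    "OpenVINOExecutionProvider",  # OpenVINO: Intel CPU/GPU/NPU hardware
    "CPUExecutionProvider",       # always-available fallback
]

def load_local_session(model_path: str) -> ort.InferenceSession:
    """Create an inference session on the best locally available hardware."""
    available = ort.get_available_providers()
    providers = [p for p in PREFERRED if p in available]
    return ort.InferenceSession(model_path, providers=providers)

session = load_local_session("assistant-7b-int4.onnx")  # placeholder model
print("Running on:", session.get_providers()[0])
```

The specific library doesn’t matter; the point is that where a model runs (laptop NPU, GPU, or a distant server) is becoming a first-class decision for the people writing this software.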
A lot of this is what pushed Intel’s big vPro platform announcement at MWC: bringing AI acceleration to business laptops, pairing it with NPU-powered security from CrowdStrike, and helping the chipmaker toward its ambitious goal of getting this silicon to 100 million users.
Qualcomm sees it the same way, but talks about the possibilities a little differently. In fact, the company isn’t just talking about an AI PC; it has its sights set on a “next generation PC,” because the vision goes beyond AI.
Beyond mentioning the 75 trillion operations per second (TOPS) of AI performance that Snapdragon X Elite is capable of, Qualcomm CMO Don McGuire talked about how the AI capabilities of multiple devices will start to communicate with one another.
“Computational workloads and use cases are being distributed across diverse form factors, so what freedom does that give us to do something different here?” McGuire pondered, which is something he thinks “Microsoft’s definitely looking at” in its next OS updates.
Because look at the bigger picture: it’s not just the AI PC. Phones are taking advantage of on-device AI too, and we’re seeing new gadgets like the Humane AI Pin and Rabbit R1. The secret sauce is going to be in how they could all interact in the very near future.
What about the future?
At the moment, our use of AI comes down to two categories: functions that run quietly in the background (like webcam auto-framing and exposure control), and tools you interact with like ChatGPT and Google Gemini.
But what does the future of the AI PC look like? Put simply, it’s all about two things: compression and proactivity.
Compression
That first one is big for the businesses going all-in on AI. All you have to do is look at reports that Microsoft was losing more than $20 a month on every $10 GitHub Copilot subscription. For the way people want to use these tools, that cost just isn’t sustainable, which is why being able to shrink the models down is going to be critical.
Currently, the NPU in Intel Core Ultra chips is capable of 11 TOPS, and with the rapid evolution of AI models, it can be argued that demands will start pushing the upper end of that limit soon, potentially up to 10x beyond the 7-billion-parameter Llama 2 model the company shows off alongside the 80 models its processors support.
Feng points to this as a critical part of on-device AI’s viability: “You don’t want to do a 70 billion, but we learned that by improving that compression algorithm, I can get a seven-billion-parameter model two to five years from now that is way better than the same-size model of today.”
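To make that compression point concrete, here’s some back-of-the-envelope math of my own (standard storage sizes, not figures from Intel or Feng). The same 7-billion-parameter model takes up wildly different amounts of memory depending on how many bits each weight is stored in:

```python
# Rough memory footprint of a 7-billion-parameter model at different
# precisions. These are generic storage sizes, not vendor benchmarks.
PARAMS = 7_000_000_000  # roughly the size of Llama 2 7B

BYTES_PER_WEIGHT = {
    "FP32 (full precision)": 4.0,
    "FP16 (half precision)": 2.0,
    "INT8 (8-bit quantized)": 1.0,
    "INT4 (4-bit quantized)": 0.5,
}

for fmt, nbytes in BYTES_PER_WEIGHT.items():
    gigabytes = PARAMS * nbytes / 1e9
    print(f"{fmt}: ~{gigabytes:.1f} GB of weights")

# FP32 (full precision): ~28.0 GB of weights
# FP16 (half precision): ~14.0 GB of weights
# INT8 (8-bit quantized): ~7.0 GB of weights
# INT4 (4-bit quantized): ~3.5 GB of weights
```

Squeezed down to 4 bits per weight, that 7B model is roughly 3.5GB, the kind of footprint a thin-and-light laptop can realistically hold in memory alongside everything else it’s doing. That’s the headroom Feng is describing: better compression means a small model a few years from now can punch well above today’s equivalent.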
Proactivity
“I think in the future, for AI to be generally useful and less novel, it has to learn from you and be more proactive, and suggestive in making recommendations,” McGuire stated. He’s pointing towards how the AI PC and your other connected devices can work together as a “sort of sixth sense, or a second brain.”
That’s not to say there aren’t going to be some obstacles. “What’s gonna happen if there’s a conflict, right?” McGuire questioned, pointing to the multiple competing models that could be at odds with each other. Maybe you want different AIs on the same device for different purposes. That’s going to need to be cleared up in the next few years.
And if it is smoothed out, he envisions a future where your (Snapdragon-powered) car remembers that tonight is date night, sets your route, and orders your groceries for pickup in a flash. That goes beyond being simply proactive and into anticipating your actions.
Just one piece of the puzzle
And that’s when it hit me. As tech lovers (shout-out to my fellow nerds), it’s easy to develop tunnel vision and look at product categories in their own silos. The AI PC is capable of some interesting tasks right now, and those possibilities will continue to grow and become more anticipatory of you and your needs.
But for the complete experience, we have to step back and see how every device you own could work together in the future of AI — elevating it from a novel way of making pictures and turning on dark mode to something that could touch nearly every part of your life in a truly helpful way. Beyond anything we see from the chatbots of today.
In some ways, that’s pretty scary, and that’s a fair reaction to have. Every company I spoke to is hammering the security message hard to reassure users that what these systems learn about you will stay local to your devices. But in other ways, the AI PC vision is going to be far bigger than just the laptop you own, and that’s a very exciting prospect.
Jason brings a decade of tech and gaming journalism experience to his role as a Managing Editor of Computing at Tom's Guide. He has previously written for Laptop Mag, Tom's Hardware, Kotaku, Stuff and BBC Science Focus. In his spare time, you'll find Jason looking for good dogs to pet or thinking about eating pizza if he isn't already.