Intel's next-gen Lunar Lake CPUs will be able to run Microsoft's Copilot AI locally thanks to 45 TOPS NPU performance

Intel's Pat Gelsinger at Intel Vision 2024
(Image credit: Intel)

What you need to know

  • Intel hosted its Vision 2024 event in Phoenix, Arizona this week, where it showed off next-gen Lunar Lake processors and many upcoming AI advancements.
  • It was revealed that the Lunar Lake chips will offer more than 100 TOPS of total package performance, with 45 TOPS set aside for the NPU.
  • Intel Core Ultra CPUs from the current Meteor Lake lineup offer about 34 TOPS of performance total, with 10 TOPS for the NPU alone.
  • Microsoft plans to make its AI assistant Copilot run locally on laptops, which will require at least 40 TOPS of NPU performance.

Intel's Vision 2024 event kicked off this week in Phoenix, Arizona, with a major focus on AI and what the company has in store for its next generation of Lunar Lake processors (CPU).

As first reported by Tom's Hardware, Intel CEO Pat Gelsinger revealed that the next-gen Lunar Lake CPUs will have more than 100 TOPS of total package performance for AI tasks, with the NPU getting 45 TOPS on its own.

TOPS stands for "tera operations per second" — that is, trillions of operations per second — and it's currently the standout metric used to approximate expected AI performance in AI PCs. The term ties in directly with the Neural Processing Unit (NPU), a new PC component optimized specifically to handle AI tasks.
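To make the metric concrete, here's a rough sketch of how a peak TOPS figure is typically derived: multiply-accumulate (MAC) units × 2 operations per MAC (a multiply plus an add) × clock speed. The unit count and clock below are made-up illustrative values chosen to land near Lunar Lake's quoted 45 NPU TOPS, not Intel's actual hardware specs.

```python
def peak_tops(mac_units: int, clock_ghz: float, ops_per_mac: int = 2) -> float:
    """Peak throughput in TOPS (trillions of operations per second)."""
    ops_per_second = mac_units * ops_per_mac * clock_ghz * 1e9
    return ops_per_second / 1e12

# Hypothetical NPU: 12,288 MACs at ~1.83 GHz lands near the 45 TOPS
# figure quoted for Lunar Lake's NPU.
print(round(peak_tops(12_288, 1.83), 1))  # 45.0
```

Note that this is a peak throughput figure; real-world AI workloads rarely sustain it, which is why TOPS is only an approximate yardstick.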

Intel Core Ultra processor with NPU

(Image credit: Windows Central)

With Qualcomm's Snapdragon X Elite CPU offering 45 TOPS of NPU performance and 75 TOPS total across the chip, it's unsurprising that Intel was quick to give an update on next-gen Lunar Lake capabilities. 

Its Meteor Lake CPUs — like the Core Ultra chips we're seeing in the best laptops available today — top out at 34 TOPS total package performance (with the CPU, GPU, and NPU portions counted together), with just 10 TOPS carved out for the NPU alone.

On AMD's side, its Ryzen 8040-series mobile chips have an NPU rated at 16 TOPS; that number rises to 39 TOPS for the whole package. AMD is also hard at work boosting its AI performance to keep up with the competition. Its next-gen "Strix Point" mobile CPUs are expected to offer up to three times as much AI performance thanks to a new XDNA 2 NPU, as revealed at AMD's Advancing AI event in December.
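As a back-of-envelope projection from the article's figures (not an announced spec, and AMD hasn't confirmed whether the "three times" claim applies to the NPU alone or the whole package):

```python
# Ryzen 8040's NPU is rated at 16 TOPS; AMD's "up to 3x" claim for
# Strix Point's XDNA 2 NPU would put it around 48 TOPS if applied
# to the NPU figure directly -- a projection, not an official number.
current_npu_tops = 16
projected_npu_tops = current_npu_tops * 3
print(projected_npu_tops)  # 48
```

If that projection holds, Strix Point would clear the reported 40 TOPS bar for running Copilot locally.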

Intel, AMD, and Qualcomm are clearly putting a lot of emphasis on AI performance as measured by TOPS, and for good reason. AI PCs are all the rage right now, and AI features will continue to embed themselves in the devices we use every day. I don't think it's yet time to rush out and buy an AI PC, but it's clear that things are only going to heat up as we move forward.

Lunar Lake will be able to run Copilot locally

Copilot Pro on Windows

(Image credit: Windows Central)

Microsoft has revealed its requirements for what makes up an AI PC, which includes a new CPU, GPU, and NPU as well as Copilot and a dedicated Copilot key on the keyboard. These requirements are preliminary, and we can expect competing definitions and evolving terms.

Microsoft's next major Windows release is going all-in on AI, and the expected fall release should bring deep AI integration into the OS. Having an NPU with a higher TOPS count will only help with the new AI enhancements. Perhaps most important, at least for now, is that Intel revealed at its late-March AI summit that Copilot will soon run locally on AI PCs. This should bring improvements in everything from security and reliability to accessibility and cost.

Tom's Hardware reported that running Copilot locally on your PC will require at least 40 TOPS of NPU performance, a bar so far cleared only by Qualcomm's as-yet-unreleased Snapdragon X Elite chip.
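Putting the NPU figures reported in this article against that 40 TOPS floor makes the current state of play easy to see. The chip names and numbers below come straight from the article; the threshold is Microsoft's reported requirement.

```python
# Reported Copilot-local requirement: at least 40 TOPS from the NPU.
COPILOT_NPU_FLOOR_TOPS = 40

# NPU-only TOPS figures as cited in the article.
npu_tops = {
    "Intel Meteor Lake (Core Ultra)": 10,
    "Intel Lunar Lake (next-gen)": 45,
    "AMD Ryzen 8040": 16,
    "Qualcomm Snapdragon X Elite": 45,
}

capable = [chip for chip, tops in npu_tops.items()
           if tops >= COPILOT_NPU_FLOOR_TOPS]
print(capable)  # ['Intel Lunar Lake (next-gen)', 'Qualcomm Snapdragon X Elite']
```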

But with Intel and AMD promising big NPU performance boosts in next-gen chips, you should soon have more options if you're interested in putting AI to work.

Cale Hunt

Cale Hunt brings to Windows Central more than eight years of experience writing about laptops, PCs, accessories, games, and beyond. If it runs Windows or in some way complements the hardware, there’s a good chance he knows about it, has written about it, or is already busy testing it. 

  • fjtorres5591
    The chip and PC companies need to be careful with their marketing hype.
    If they're not careful they'll Osborne the whole market and turn hype into FUD.
  • ShinyProton
    Pretty funny.
    They are bragging about the performance of their next generation chip that is matching the one Qualcomm is releasing now.
    This is probably why Intel is also mixing the CPU + GPU into the recipe.

    Also, no price yet 😏
  • fjtorres5591
    ShinyProton said:
    Pretty funny.
    They are bragging about the performance of their next generation chip that is matching the one Qualcomm is releasing now.
    This is probably why Intel is also mixing the CPU + GPU into the recipe.

    Also, no price yet 😏
    In their defense, local "AI" software uses all three processors. If they're not properly balanced, the code will bottleneck at the weakest unit.

    And don't forget Intel has been behind in the architecture wars several times before, when folks were holding wakes for x86 and they have always come back to outpace the flavor of the month, be it SPARC, ALPHA, POWERPC or whatever.

    This *might* be the one time they don't come back but it's still too early to bury them. Especially if they manage to leapfrog TSMC manufacturing lead. All that money they're getting from the gerontocracy might come in handy.

    Finally, bragging about vapor products is a tried and true account control tactic from IBM's heyday. When you're the installed base leader (which Intel is) freezing the market hurts the competition worse than you. Folks won't necessarily rush to buy today's challenger (with the inevitable disruption from switching) if waiting a bit gets them equal or better from the entrenched player.

    As long as Intel's promises remain credible they don't have to lead in specmanship right now.
  • naddy69
    Didn't we see stories here about "AI" needing "nearby nuclear power plants" a few weeks ago? Because "AI" uses absurd amounts of electricity?

    Never mind that the idea of building nuclear power plants to power "AI" is insane. Totally impractical, because it takes MANY years to get a nuclear power plant built/licensed/online.

    So why would I want this junk running locally on my laptop? What about battery life? Current Intel CPUs are already power-sucking battery-killers.

    Again, the "AI" hype here is beyond belief.