Latest Artificial Intelligence Features

Lenovo Qira and HP IQ are both heading your way soon. Which one suits you best?
By Cale Hunt
AI Lenovo and HP have both developed on-device AI assistants designed to make your digital life easier. I dug into the features to find out how the two compare.

These apps make your PC's NPU worthwhile, and they have nothing to do with Copilot+
By Cale Hunt
AI Your AI PC's Neural Processing Unit (NPU) is a useful piece of hardware, but are you making the most of it? I dug up 7 apps that put the NPU to good use; which ones are you using?

Would you trust Microsoft with your medical records? I already do.
By Jennifer Young
Copilot Many are skeptical about the use of AI in healthcare, but the truth is that it's already making a huge difference.

5 things I learned after using the two top AI gaming assistants
By Cale Hunt
PC gaming Hallucinations, ignorance, usage spikes, and more — here's what I learned after testing the two major AI gaming assistants.

Using AI to learn PowerShell has shown me humans are still king
By Richard Devine
AI I've been using AI to try to learn some PowerShell as a complete beginner, and it's been an eye-opening experience so far. It's also proof that human experts are always going to be important.

Running Ollama on WSL vs. Windows: how does it compare?
By Richard Devine
AI On Windows 11, you can run Ollama either natively or through WSL, with the latter potentially important for developers. The good news is that it works well either way.

You don't need to spend a fortune on a GPU to run LLMs in Ollama
By Richard Devine
AI If you're looking at your PC and wondering what sort of GPU you might need to power local LLMs, the good news is it doesn't have to be as expensive as you think. Allow me to explain.

Why an older GPU might crush a newer one for AI
By Richard Devine
AI If you're running LLMs locally on your PC using Ollama, there's one key hardware spec you need to take into consideration. Overlook it, and your performance will tank.

I tried to replicate this Copilot feature with local AI, but it's just not the same
By Richard Devine
AI Using Copilot to summarize web articles is one of my favorite features. I tried to replicate it using an on-device AI model and it just isn't the same.

Why you NEED to use LM Studio over Ollama for local AI if you use AMD or Intel integrated graphics
By Richard Devine
AI I've been playing with Ollama a lot recently, but it falls short in one key area, which sent me back to LM Studio, with great success and no dedicated GPU required.
