PowerColor’s Edge AI aims to significantly reduce GPU power consumption without a big hit to frame rates

Many PC enthusiasts dislike how much power the best top-end GPUs draw and try all kinds of ways to reduce it, such as undervolting, frame rate caps, or lowering the maximum power limit. Graphics card vendor PowerColor is experimenting with a slightly different approach: using an NPU to manage power usage in games, without impacting performance, in a system called Edge AI.

A demonstration of the work in progress was on display at PowerColor's Computex stand. While we didn't get a chance to see it in action ourselves (there was a huge amount to see at the event), tech site IT Home and X user Harukaze5719 managed to grab some pictures of the setup and see it running Final Fantasy XV on two computers.

In one of the computers, PowerColor's engineers hotwired an external NPU to an AMD graphics card and programmed it to manage the GPU's power consumption while rendering. There's no indication yet of exactly what's going on behind the scenes, but NPUs (neural processing units) are specialised processors for handling the math operations involved in AI routines.

What I suspect is going on is that the NPU is running a neural network that takes metrics such as the GPU's load, voltage, and temperature, as well as aspects of the game being rendered, and adjusts the GPU voltage in such a way that power consumption is significantly reduced on average.
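To make that suspicion concrete, here's a minimal sketch of what such a control loop could look like. Everything here is invented for illustration: PowerColor hasn't published any details, so the telemetry fields, the "model" (a single hand-picked linear layer standing in for a trained neural network), and the voltage limits are all assumptions.

```python
import random

# Hypothetical sketch of an Edge AI-style control loop. All names, weights,
# and telemetry values are invented for illustration -- PowerColor has not
# published how its system actually works.

def read_telemetry():
    """Stand-in for reading GPU sensors: load (0-1), core voltage (mV),
    and temperature (C). Real code would query the driver instead."""
    return {"load": random.uniform(0.5, 1.0),
            "voltage_mv": random.uniform(900, 1100),
            "temp_c": random.uniform(55, 85)}

def voltage_offset(telemetry, weights=(-40.0, 0.02, -0.5), bias=10.0):
    """A stand-in 'model': one linear layer mapping telemetry to a voltage
    offset in mV. A real NPU would evaluate a trained neural network here."""
    w_load, w_volt, w_temp = weights
    raw = (w_load * telemetry["load"]
           + w_volt * telemetry["voltage_mv"]
           + w_temp * telemetry["temp_c"]
           + bias)
    # Clamp so the controller can only undervolt, within a safe window.
    return max(-50.0, min(0.0, raw))

offset = voltage_offset(read_telemetry())
print(f"suggested voltage offset: {offset:.1f} mV")
```

The key idea is that the loop only ever nudges voltage downward within a clamped range, so a bad prediction can cost a little performance but can't push the card out of spec.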

In the Final Fantasy XV demonstration, the PC without an NPU ran the game at 118 fps, with the graphics card using 338 W to achieve this. The other setup, with the NPU-GPU combo, was hitting 107 fps at 261 W of power. That's a 23% reduction in energy consumption for a 9% drop in frame rate.
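Those percentages check out against the raw numbers from the demo:

```python
# Figures from the Final Fantasy XV demo: stock card vs NPU-managed card.
fps_stock, watts_stock = 118, 338
fps_npu, watts_npu = 107, 261

power_saving = (watts_stock - watts_npu) / watts_stock  # 77/338, about 23%
fps_drop = (fps_stock - fps_npu) / fps_stock            # 11/118, about 9%

print(f"power saving: {power_saving:.0%}, fps drop: {fps_drop:.0%}")
```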

PowerColor's demo stand actually claims that Edge AI improves performance, but if you're going to showcase a new piece of technology, you really want to check that it does what you say it does before waving it about in public. Even with that minor marketing boo-boo, though, the whole concept of Edge AI looks like it has quite a bit of potential.

Reducing the power consumption of a graphics card has multiple benefits—less heat is dissipated into your gaming room, the whole PC uses less electricity, and the peripheral components on the graphics card will last longer. All that seems worth the relatively small drop in performance.


At the moment, Edge AI requires an external NPU wired to various points on a graphics card, but if it's only monitoring voltages and temperatures, then an internal NPU could do the same job. The majority of NPUs, however, are embedded in laptop chips, such as AMD's new Ryzen AI series, Intel's Core Ultra range, and Qualcomm's Snapdragon X: processors that are rarely going to be paired with a discrete graphics card.

The neural network that Edge AI runs could, in theory, be run on a GPU instead, but unlike NPUs, GPUs aren't really designed to do such work on as little power as possible.

That 77 W decrease in GPU power seen in the Final Fantasy demonstration would probably be far smaller if the routines were GPU-accelerated instead (and the fps drop would likely be larger, too).

I don't think PowerColor is planning on releasing a graphics card with an NPU on the circuit board, as that would eat into the profit margins. Instead, I suspect it's preparing Edge AI to be ready for when NPUs are routinely embedded in desktop CPUs from every vendor. If that's the case, then it's one of the few uses of AI that I'd genuinely look forward to seeing in action.

And if it's a success for PowerColor, you can bet your last dollar that every other graphics card manufacturer will want to replicate it. As long as all these systems are optional to use, that would be a positive step forward for the GPU industry as a whole.
