Why Dion Harris believes Nvidia’s energy-intensive chips could actually help the climate

It takes a ton of energy to train the most advanced AI. According to one estimate, by the year 2027, AI electricity needs could be comparable to the energy use of all of Argentina.

That’s largely to do with the fact that the chips used to train AI models, called graphics processing units or GPUs, are far more powerful—and power hungry—than those used to, say, handle web traffic. AI-related GPUs fill rack upon rack in cavernous data centers, sending power demands soaring and prompting skeptics and climate hawks alike to worry about the potentially devastating impact that large language models could have on the climate.

It’s Dion Harris’s job to ensure that doesn’t happen.

As head of data center product marketing at Nvidia, Harris works with the company’s product teams and its clients to both grow Nvidia’s foothold in data centers and make those data centers more sustainable. An equally important part of that role? Making the public case that the efficiency AI will bring to a range of industries—from manufacturing to smart grid technologies—can actually be a positive thing for the planet long-term. “When you deploy these models, oftentimes, you’re replacing a much more inefficient practice or model,” Harris says. “There are savings that happen downstream, which often get missed.”

Using AI to address, rather than exacerbate, the climate crisis is a worthy, if somewhat speculative, goal. Already, Google has said it will likely miss its climate targets due to the soaring power demands of training AI, and some utility companies are keeping coal plants online just to satisfy the needs of data centers.

Nvidia can’t control any of that. What it can control is the energy efficiency of the chips that fill those data centers, and on that point, it has made substantial strides. Just this year, Nvidia rolled out its next-generation Blackwell GPUs, which the company says are 25 times more energy efficient than prior versions. The company is also working on liquid cooling for its servers, a cooling method that requires far less water than other approaches. A big part of Harris’s work involves ensuring that as Nvidia’s technology advances, its data center partners are ready and able to implement it. “Nvidia always starts with figuring out how we can make our platform as efficient as possible, so that it can drive the right behaviors throughout the value chain,” Harris says.

Harris acknowledges that all of this is a long-term bet on Nvidia’s part. In the midst of an unprecedented scramble to build new data centers to accommodate the growing appetite for AI, the industry’s overall power demands are only poised to grow. But while training AI is undoubtedly driving energy usage up, Harris argues there are also plenty of examples where the deployment of AI is driving energy usage down. The manufacturing giant Foxconn, for one, has said it used AI modeling to cut energy consumption by 30%.

“Those are very real, tangible savings that will have a huge impact,” Harris says. The more we can use AI to realize those energy savings, he argues, the better it will be for the climate in the long run. That is, as long as those savings can outweigh the costs.

This story is part of AI 20, our monthlong series of profiles spotlighting the most interesting technologists, entrepreneurs, corporate leaders, and creative thinkers shaping the world of artificial intelligence.
