Energy Addiction: AI's Next Big Challenge

17 August 2023

Investors should take a closer look at companies that help create a more energy-efficient ecosystem for AI.

There’s a big buzz around artificial intelligence (AI) and its potential to change the world. But much less has been said about its energy footprint. Companies that help solve this energy conundrum could enable a sustainable future for this burgeoning technology—and create opportunities for equity investors.

What’s known as “generative” AI uses machine learning to generate content—including text, audio, video and images. OpenAI’s wildly popular ChatGPT is perhaps the most well-known example. There are countless applications for generative AI, from academic writing to audio and video editing to scientific research. Companies everywhere are hunting for AI applications that can enhance productivity and create business benefits in industries ranging from healthcare to investment management.

But here’s the rub: AI requires massive computational power to train models. And that raises a thorny issue—namely, the energy impact of AI.

Generative AI Is an Energy Hog

What’s behind the magic of machine learning? There are two primary stages. The first is training, which involves gathering information so that machines can learn everything possible to create a model. The second is inference, whereby the machine uses that model to generate content, analyze new data and produce actionable results.

All of this requires energy. The more powerful and complex the AI model, the greater the training time and energy required (Display). 

AI Models’ Computational Complexity Requires Plenty of Power
Training time for AI models in petaFLOP/s-days*
Bar chart showing training compute for selected AI models, measured in petaFLOP/s-days.

Historical analysis does not guarantee future results.
*FLOP/s (floating-point operations per second) is a measure of compute performance used for deep-learning models, which rely on floating-point operations. One petaFLOP/s-day represents the computation performed by a machine sustaining 10¹⁵ neural-net operations per second for one entire day; a model's training time in petaFLOP/s-days expresses its total training computation in that unit.
As of June 30, 2023 
Source: Bank of America, NVIDIA and AllianceBernstein (AB)
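The unit in the display above is easy to make concrete. A minimal sketch of the conversion, using a hypothetical training workload (the numbers below are illustrative, not figures from the chart):

```python
# Convert a total training workload into petaFLOP/s-days.
PFLOP_PER_SEC = 1e15      # one petaFLOP/s = 10**15 operations per second
SECONDS_PER_DAY = 86_400

# Total operations in one petaFLOP/s-day:
ops_per_pflops_day = PFLOP_PER_SEC * SECONDS_PER_DAY  # 8.64e19 operations

# Hypothetical model requiring 3e23 total operations to train:
total_training_ops = 3e23
pflops_days = total_training_ops / ops_per_pflops_day
print(f"{pflops_days:,.0f} petaFLOP/s-days")  # 3,472 petaFLOP/s-days
```

The same conversion underlies every bar in the chart: total training operations divided by 8.64 × 10¹⁹.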

OpenAI’s GPT-3 model is illustrative. The energy needed to train GPT-3 could power an average American home for more than 120 years, according to a report from Stanford University. Meanwhile, a Bay Area chipmaker notes that the energy required to train models that include transformers—a form of deep-learning architecture—has increased 275-fold every two years.
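The Stanford comparison can be sanity-checked with rough public estimates. The figures below are approximate third-party numbers used purely for illustration, not AB data:

```python
# Back-of-the-envelope check of the "120 years" comparison.
gpt3_training_mwh = 1_287     # widely cited estimate of GPT-3 training energy
us_home_mwh_per_year = 10.6   # approximate average US household consumption

years = gpt3_training_mwh / us_home_mwh_per_year
print(f"about {years:.0f} years")  # about 121 years
```

Under these assumptions the arithmetic lands just above 120 years, consistent with the reported figure.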

The Many Sources of Energy Consumption

AI’s energy consumption will come from many corners. In addition to training and running large models, the proliferation of AI-assisted products, including AI search and chatbots, will gobble up terawatt-hours of electricity.

Increasingly complex models will, in turn, require the use of more specialized hardware, such as graphics processing units (GPUs). The good news is that GPUs deliver much more performance per watt than traditional central processing units (CPUs), which could help offset the overall power required to train and run AI models.

Ultimately, these drivers of energy consumption will accelerate the construction of power-hungry data centers, which already account for nearly 1% of global energy use, according to the International Energy Agency. Even before AI began to take off, studies predicted a sharp increase in data center construction, driven by the energy needs of new technologies.

There’s also the issue of emissions to consider. In particular, investors are pushing companies to measure Scope 3 emissions—upstream and downstream emissions that can be difficult to quantify. As AI use increases, the Scope 3 emissions of all data users—including firms that traditionally have low carbon footprints—are likely to grow correspondingly.

How Are Companies Addressing the AI Energy Conundrum?

Fortunately, companies are beginning to address the enormous AI energy challenge. These include firms that are central to AI and those only nibbling at the periphery. We think investors should pay attention to three key areas:

Hardware and Software: Reducing AI-related energy use will require new processor architectures. US semiconductor makers are focused on delivering more energy-efficient performance. In fact, a major chipmaker has set a goal of increasing the energy efficiency of its processors and accelerators used in AI training and high-performance computing by 30 times over a five-year period. Another chipmaker reports that in some applications, such as large-language-model training, its GPU-based servers use roughly one twenty-fifth the energy of CPU-based alternatives. As GPUs from these chipmakers and others take share from CPUs in data centers, energy efficiency should improve even further.

Conserving energy will also require advanced transistor-packaging techniques. Technologies such as dynamic voltage and frequency scaling and thermal management will be needed to make machine learning more efficient. We believe companies involved in semiconductor chip production and inspection will play a significant role in bringing these innovations to market.

Investors will also be hearing more about power semiconductors, which help improve the power management of AI servers and data centers. Power semiconductors regulate current and can lower overall energy use by integrating more functionality in smaller footprints.

Improvements in Data Center Design: As AI adoption fuels the expansion of data center capacity, firms that supply data center components could reap benefits. Key components include power supplies, optical networking, memory systems and cabling. Tech companies that use the data centers themselves also have a strong incentive to continue improving data center design and energy consumption.

Renewable Energy: Renewables made up 21.5% of US electricity generation in 2022, according to the Energy Information Administration. With nearly 80% of US generation still nonrenewable, near-term power for AI is likely to come from traditional fossil fuels.

But over time, AI demand could open the door for more renewable energy use. That’s especially true given that AI data centers will be operated by tech giants, whose net zero policies are among the industry’s best. As a result, we expect that accelerated adoption of AI could improve the investment prospects of the entire renewable-power ecosystem.

Investing in Energy Solutions

In all these areas, we believe that investors should search for quality companies with a technological advantage, persistent pricing power, healthy free-cash-flow generation and resilient business models. Companies with strong fundamentals that are poised to participate in and benefit from increased demand for energy-efficient AI capabilities could provide attractive opportunities for equity investors with a sustainable focus and those with an absolute-return mandate.

As AI adoption accelerates and search engines are replaced by chatbots, the energy impact of this revolutionary form of machine learning should not be overlooked. Initiatives aimed at creating a more energy-efficient AI ecosystem might not be in the spotlight now, but they could eventually unlock attractive return potential for investors who spot promising solutions early.

Claire Walter, Research Analyst—Sustainable Thematic Equities, contributed to this analysis.

The views expressed herein do not constitute research, investment advice or trade recommendations and do not necessarily represent the views of all AB portfolio-management teams. Views are subject to revision over time.
