
AB: Energy Addiction: AI’s Next Big Challenge


NORTHAMPTON, MA / ACCESSWIRE / September 5, 2023 / AllianceBernstein

By Daniel C. Roarty, CFA | Chief Investment Officer-Sustainable Thematic Equities, and Ben Ruegsegger, CFA | Portfolio Manager-Sustainable US Thematic; Senior Research Analyst-Sustainable Thematic Equities

There's a big buzz around artificial intelligence (AI) and its potential to change the world. But much less has been said about its energy footprint. Companies that help solve this energy conundrum could enable a sustainable future for this burgeoning technology and create opportunities for equity investors.

What's known as "generative" AI uses machine learning to generate content, including text, audio, video and images. OpenAI's wildly popular ChatGPT is perhaps the best-known example. There are countless applications for generative AI, from academic writing to audio and video editing to scientific research. Companies everywhere are hunting for AI applications that can enhance productivity and create business benefits in industries ranging from healthcare to investment management.

But here's the rub: AI requires massive computational power to train models. And that raises a thorny issue, namely the energy impact of AI.

Generative AI Is an Energy Hog

What's behind the magic of machine learning? There are two primary stages. The first is training, in which the system ingests large amounts of data and learns the patterns needed to build a model. The second is inference, whereby the machine uses that model to generate content, analyze new data and produce actionable results.
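
For readers who want a concrete picture of the two stages, here is a minimal sketch in Python using a toy regression model rather than a generative AI system; the data and model are illustrative only.

    import numpy as np

    # --- Stage 1: training ---
    # Fit a toy model to example data. Generative AI training does the same thing
    # at vastly larger scale, which is where the energy cost comes from.
    x_train = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y_train = np.array([2.1, 4.0, 6.2, 8.1, 9.9])
    coefficients = np.polyfit(x_train, y_train, deg=1)  # learn a simple linear model

    # --- Stage 2: inference ---
    # Use the trained model to produce results for new inputs. Each query also
    # consumes energy, so inference at scale adds up.
    x_new = np.array([6.0, 7.0])
    predictions = np.polyval(coefficients, x_new)
    print(predictions)  # roughly [12.0, 13.9]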

All of this requires energy. The more powerful and complex the AI model, the greater the training time and energy required (Display).

OpenAI's GPT-3 model is illustrative. The energy needed to train GPT-3 could power an average American home for more than 120 years, according to a report from Stanford University. Meanwhile, Bay Area chipmaker NVIDIA notes that energy requirements for training models that include transformers, a form of deep-learning architecture, have increased by 275 times every two years.
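
As a rough back-of-the-envelope check on that claim, the arithmetic below uses two outside assumptions that do not appear in this article: a widely cited estimate of roughly 1,287 MWh for GPT-3's training run and the EIA's figure of roughly 10,600 kWh per year for an average US household.

    # Back-of-the-envelope check on the "more than 120 years" claim.
    # Both inputs are outside estimates, not figures from this article.
    gpt3_training_mwh = 1_287            # widely cited estimate for GPT-3 training
    avg_us_home_kwh_per_year = 10_600    # approximate average US household usage (EIA)

    years_of_home_power = gpt3_training_mwh * 1_000 / avg_us_home_kwh_per_year
    print(round(years_of_home_power, 1))  # ~121.4 years, consistent with "more than 120"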

The Many Sources of Energy Consumption

AI's energy consumption will come from many corners. In addition to training and running large models, the proliferation of AI-assisted products, including AI search and chatbots, will gobble up terawatt-hours of electricity.

Increasingly complex models will, in turn, require more specialized hardware, such as graphics processing units (GPUs). The good news is that GPUs deliver much more performance per watt than traditional central processing units (CPUs), which could help offset the overall power required to train and run AI models.

Ultimately, these drivers of energy consumption will accelerate the construction of power-hungry data centers, which already account for nearly 1% of global electricity use, according to the International Energy Agency. Even before AI began to take off, studies predicted a sharp increase in data center construction, driven by the energy needs of new technologies.

There's also the issue of emissions to consider. In particular, investors are pushing companies to measure Scope 3 emissions: the indirect emissions that occur upstream and downstream in a company's value chain and that can be difficult to quantify. As AI use increases, the Scope 3 emissions of all data users, including firms that traditionally have low carbon footprints, are likely to grow correspondingly.

How Are Companies Addressing the AI Energy Conundrum?

Fortunately, companies are beginning to address the enormous AI energy challenge. These include firms that are central to AI and those only nibbling at the periphery. We think investors should pay attention to three key areas:

Hardware and Software: Reducing AI-related energy use will require new processor architectures. US semiconductor makers like AMD and NVIDIA are focused on delivering more energy-efficient performance. In fact, AMD has set a goal of increasing the energy efficiency of its processors and accelerators used in AI training and high-performance computing by 30 times over a five-year period. According to NVIDIA, its GPU-based servers use about one twenty-fifth of the energy of CPU-based alternatives in some applications, such as large language model training. As GPUs from AMD, NVIDIA and others take share from CPUs in data centers, energy efficiency should improve even further.
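
To see how such efficiency multiples compound, here is a hypothetical illustration; the 1,000 MWh baseline is an arbitrary example figure, and only the roughly 25-fold ratio comes from the paragraph above.

    # Hypothetical illustration of a ~25x energy-efficiency advantage.
    cpu_cluster_energy_mwh = 1_000   # assumed energy for a training job on CPU servers
    gpu_energy_ratio = 25            # GPU servers cited as using ~1/25th the energy

    gpu_cluster_energy_mwh = cpu_cluster_energy_mwh / gpu_energy_ratio
    savings_mwh = cpu_cluster_energy_mwh - gpu_cluster_energy_mwh
    print(gpu_cluster_energy_mwh, savings_mwh)  # 40.0 MWh used vs. 960.0 MWh saved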

Conserving energy will also require advanced chip-packaging techniques. Technologies such as dynamic voltage and frequency scaling and thermal management will be needed to make machine learning more energy efficient. We believe companies involved in semiconductor chip production and inspection, including Taiwanese chipmaker TSMC and Netherlands-based ASML, will have a significant role to play in bringing these innovations to market.

Investors will also be hearing more about power semiconductors, which help improve the power management of AI servers and data centers. Power semiconductors regulate current and can lower overall energy use by integrating more functionality in smaller footprints. Firms like Monolithic Power Systems, based in Kirkland, Washington, and German semiconductor manufacturer Infineon Technologies are at the forefront of this development, in our view.

Improvements in Data Center Design: As AI adoption fuels the expansion of data center capacity, firms that supply data center components could reap benefits. Key components include power supplies, optical networking, memory systems and cabling. Tech companies that use the data centers themselves, such as Amazon.com, Google and Microsoft, also have a strong incentive to continue improving data center design and energy consumption.

Coming full circle, AI itself is being used to optimize data center operations. In 2022, Google DeepMind released the results of a three-month experiment in which a learning agent called BCOOLER was trained to optimize Google's data center cooling procedures. The result: BCOOLER achieved roughly 13% energy savings, underscoring that energy efficiency is improving in data centers even as their numbers grow.
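
One simple way to see what a cooling-energy saving of that size means for a facility is through power usage effectiveness (PUE), the ratio of total facility power to IT power. The load figures below are hypothetical; only the 13% saving comes from the paragraph above, and this is an illustration of the arithmetic, not of DeepMind's method.

    # Illustrative effect of a 13% cooling-energy saving on data center PUE.
    # IT, cooling and overhead loads are hypothetical example values.
    it_load_mw = 1.00         # servers and networking
    cooling_mw = 0.50         # cooling before optimization
    other_overhead_mw = 0.10  # lighting, power distribution losses, etc.

    pue_before = (it_load_mw + cooling_mw + other_overhead_mw) / it_load_mw
    cooling_after_mw = cooling_mw * (1 - 0.13)  # apply the 13% cooling saving
    pue_after = (it_load_mw + cooling_after_mw + other_overhead_mw) / it_load_mw
    print(round(pue_before, 3), round(pue_after, 3))  # 1.6 -> 1.535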

Renewable Energy: Renewables made up 21.5% of US electricity generation in 2022, according to the Energy Information Administration. With nearly 80% of US electricity generation still coming from nonrenewable sources, much of the near-term power for AI data centers is likely to come from traditional fossil fuels.

But over time, AI demand could open the door for more renewable energy use. That's especially true given that AI data centers will be operated by the likes of Microsoft and Google's parent company, Alphabet, whose net-zero policies are among the industry's best. As a result, we expect that accelerated adoption of AI could improve the investment prospects of the entire renewable-power ecosystem.

Investing in Energy Solutions

In all these areas, we believe that investors should search for quality companies with a technological advantage, persistent pricing power, healthy free-cash-flow generation and resilient business models. Companies with strong fundamentals that are poised to participate in and benefit from increased demand for energy-efficient AI capabilities could provide attractive opportunities for equity investors with a sustainable focus and those with an absolute-return mandate.

As AI adoption accelerates and chatbots begin to supplant traditional search engines, the energy impact of this revolutionary form of machine learning should not be overlooked. Initiatives aimed at creating a more energy-efficient AI ecosystem might not be in the spotlight now, but they could eventually unlock attractive return potential for investors who can spot promising solutions early.

Claire Walter, Research Analyst-Sustainable Thematic Equities, contributed to this analysis.

The views expressed herein do not constitute research, investment advice or trade recommendations and do not necessarily represent the views of all AB portfolio-management teams. Views are subject to revision over time.

Learn more about AB's approach to responsibility here

View additional multimedia and more ESG storytelling from AllianceBernstein on 3blmedia.com.

Contact Info:
Spokesperson: AllianceBernstein
Website: https://www.3blmedia.com/profiles/alliancebernstein
Email: info@3blmedia.com

SOURCE: AllianceBernstein



View source version on accesswire.com:
https://www.accesswire.com/780396/ab-energy-addiction-ais-next-big-challenge
