CONTRIBUTORS
Andy O'Brien, CFA
Portfolio Manager, Global Technology,
United States
On an investment research team, it’s a bit unusual for semiconductor and utilities analysts to be deeply involved with a common theme. But the far-reaching implications of artificial intelligence (AI) are driving research collaboration in new and exciting ways. As AI enthusiasm and investment have intensified, the underlying issue of power consumption has grown from background noise to a loud roar.
Power-hungry GPUs — graphics processing units that power AI technology — are becoming the primary underlying silicon within data centers. An average AI server is as much as 14 times more power-intensive than a traditional enterprise server. Data center power consumption was already growing quickly prior to 2022, but the recent widespread adoption of AI, and its heavy reliance on parallel processors, has served as an accelerant. It is estimated that data center power consumption could increase from 2% of total US electricity consumption to as much as 10% in the next four years.1 This brings potential disruption — and opportunities — to a wide range of businesses.
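To give a rough sense of the arithmetic behind these estimates, the short sketch below shows how a shift toward AI servers that each draw roughly 14 times the power of a traditional server could move data centers from a low-single-digit share of US electricity consumption toward 10%. All of the inputs (total US consumption of roughly 4,000 TWh per year, per-server power draw, and server counts) are illustrative assumptions, not figures from the research model cited in the endnotes.

```python
# Rough, illustrative back-of-envelope estimate of data center power demand as a
# share of total US electricity consumption. Every input below is a hypothetical
# assumption chosen for illustration; none come from the cited research model.

US_ANNUAL_CONSUMPTION_TWH = 4_000   # approximate total US electricity use per year
HOURS_PER_YEAR = 8_760

# Assumed average continuous draw per server (kW), including cooling overhead.
TRADITIONAL_SERVER_KW = 1.0
AI_SERVER_KW = TRADITIONAL_SERVER_KW * 14   # "as much as 14 times more power-intensive"

def share_of_us_consumption(traditional_servers: float, ai_servers: float) -> float:
    """Data center electricity use as a fraction of total US consumption."""
    total_kw = traditional_servers * TRADITIONAL_SERVER_KW + ai_servers * AI_SERVER_KW
    annual_twh = total_kw * HOURS_PER_YEAR / 1e9   # kWh per year -> TWh per year
    return annual_twh / US_ANNUAL_CONSUMPTION_TWH

# Hypothetical starting point: mostly traditional servers, a small AI fleet (~2% share).
today = share_of_us_consumption(traditional_servers=9e6, ai_servers=0.05e6)

# Hypothetical four-year scenario: modest traditional growth plus a rapid AI buildout.
future = share_of_us_consumption(traditional_servers=10e6, ai_servers=2.5e6)

print(f"Illustrative share today:         {today:.1%}")
print(f"Illustrative share in four years: {future:.1%}")
```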
The challenges and opportunities for chip makers
A key concern for semiconductor companies is that at some point, perhaps as soon as 2026, power constraints could limit data center expansion and negatively impact the demand for their GPUs and other accelerators.2 We believe this outcome is relatively unlikely because we see the issue as solvable. Through increased attention, capital, and innovation from a variety of companies exposed to this theme, we expect data center capacity growth to continue at or above planned levels. In addition to gains in data center efficiency, there is an opportunity to increase energy capacity via both renewable and traditional sources.
Beyond the potential constraint on data center capacity growth, another implication for semiconductor companies is an increased focus on the power intensity of chips. We expect NVIDIA’s approach of making the biggest and most power-hungry chips will create opportunities for other players in the space. This will likely benefit companies focused on competitive AI chips as well as those participating in custom silicon efforts. The interplay between performance and power consumption will be increasingly important for chip makers in the years ahead.
In addition to the focus on the core GPU, there are other opportunities for innovation to drive efficiencies. One example is liquid cooling. As data center infrastructure becomes increasingly power-dense, traditional air cooling reaches its limits, and companies will turn toward alternative methods such as direct-to-chip liquid cooling. In our research, we are finding attractive companies with solutions in both air and liquid cooling.
Hyperscalers can help with power consumption challenges
Large cloud service providers such as Microsoft, Amazon, and Google — known as hyperscalers — will also be critical for unlocking solutions to power constraints. Like semiconductor companies, they face a range of challenges, but we expect a wide and innovative set of responses.
A less-discussed risk is the potential for hyperscalers to purchase power that would otherwise be used for retail consumption, such as heating homes. If, for example, a hyperscaler were willing to pay higher rates than a small township, the power supply to consumers could be constrained. This is both a fundamental and a reputational risk for these businesses. Today such scenarios are largely hypothetical, but it will be critical to track advances in power efficiency across the tech stack to ensure these risks remain manageable.
Power consumption should be considered a primary risk — and opportunity — related to the AI theme, and it will need continuous research and tracking. Despite the challenges, we continue to believe rapidly evolving AI technology presents compelling investment opportunities across a range of industries. It is clear to us that we remain in the early stages of a growing wave of investment in generative AI compute, and we believe this trend is sustainable. But we are also tracking a proliferation of new and exciting applications across enterprise and consumer use cases, and these will be critical to ensuring continued growth. Our investment team remains excited about a broad range of investment opportunities set to benefit from the deployment and adoption of generative AI.
Endnotes
1. Source: Putnam Equity Research model that tracks GPU/accelerator growth and the corresponding impact on power consumption. As of June 2024.
2. Source: Putnam Equity Research on when AI power demands could outstrip power supply. Sources include company earnings calls and press reports such as "Musk's xAI, Oracle end talks on $10 bln server deal, the Information reports." Reuters. July 9, 2024.
WHAT ARE THE RISKS?
All investments involve risks, including possible loss of principal.
Equity securities are subject to price fluctuation and possible loss of principal.
Investment strategies which incorporate the identification of thematic investment opportunities, and their performance, may be negatively impacted if the investment manager does not correctly identify such opportunities or if the theme develops in an unexpected manner.
Focusing investments in information technology (IT) and technology-related industries carries much greater risks of adverse developments and price movements in such industries than a strategy that invests in a wider variety of industries.
Any companies and/or case studies referenced herein are used solely for illustrative purposes; any investment may or may not be currently held by any portfolio advised by Franklin Templeton. The information provided is not a recommendation or individual investment advice for any particular security, strategy, or investment product and is not an indication of the trading intent of any Franklin Templeton managed portfolio.
