
Artificial intelligence (AI) energy consumption is rising at a frightening pace: 2 stocks that could rise thanks to this trend


These two companies are working to solve two major problems arising from the rapid adoption of AI.

The proliferation of artificial intelligence (AI) has increased demand for more powerful chips deployed in data centers, both to train complex large language models (LLMs) and to put those models into production through AI inference.

However, packing together multiple powerful chips that consume a lot of power and generate a lot of heat means that data centers now have to contend with two new challenges. The first is finding a way to rein in power consumption. Market research firm IDC predicts that energy consumption in AI data centers will grow at a staggering 45% compound annual growth rate (CAGR) through 2027.

The firm also predicts that total data center power consumption could more than double between 2023 and 2028. Goldman Sachs, meanwhile, forecasts that data center electricity demand could increase by 160% by 2030, suggesting that data center operators will have to spend a lot of money on electricity.

The second problem that AI data centers create is higher heat generation. When multiple high-power chips are deployed in AI server racks, it is inevitable that they will produce a lot of heat. Unsurprisingly, there are concerns that AI data centers could have a negative impact on the climate and place greater strain on the power grid.

Let's look at two companies working to solve these challenges – Nvidia (NVDA 3.13%) and Super Micro Computer (SMCI 2.07%) – and examine how their products could see a significant increase in adoption as data centers tackle the problem of rising heat and power consumption.

1. Nvidia

Nvidia’s graphics processing units (GPUs) have been the chips of choice for AI training and inference, as evidenced by the company’s more than 85% share of the AI chip market. Nvidia’s chips have been used to train popular AI models such as OpenAI’s ChatGPT and Meta Platforms’ Llama, and cloud service providers are increasingly turning to the company’s offerings to train even larger models.

One reason for this is that Nvidia’s AI chips become more powerful with each generation. For example, the chip giant points out that its upcoming Blackwell AI processors will enable companies to “develop and run real-time generative AI on large language models with trillions of parameters, at up to 25 times lower cost and energy consumption than its predecessor.”

More importantly, this remarkable reduction in energy consumption comes with a 30x increase in performance. So not only can AI models now be trained and deployed much faster using Nvidia’s chips, but the same can now be done using much less power. Nvidia, for example, points out that its Blackwell processors can train OpenAI’s GPT-4 LLM using just 3 gigawatts of power, compared to the whopping 5,500 gigawatts that would have been required a decade ago.

Therefore, it will come as no surprise to see Nvidia maintain its lead in the AI chip market, as its processors are likely to remain in high demand thanks to these cost and performance advantages. For this reason, analysts at Japanese investment bank Mizuho predict that Nvidia’s revenue will exceed $200 billion in 2027 (which coincides with fiscal 2026).

That would be more than triple the company’s fiscal 2024 revenue of $61 billion. More importantly, Mizuho’s forecast suggests Nvidia could easily beat Wall Street’s estimates of $178 billion in fiscal 2026 revenue. As such, Nvidia stock’s impressive rise appears to be sustainable, which is why investors would do well to buy it while it’s still trading at a relatively attractive valuation.

2. Super Micro Computer

Server maker Supermicro has received a lot of negative press recently. Between a bearish report from short-seller Hindenburg Research alleging financial irregularities and a Justice Department investigation reported by The Wall Street Journal, investors have been selling Supermicro shares in a panic. News of a delay in filing the company’s annual 10-K report appears to have added to the negative sentiment.

However, investors should note that Hindenburg’s claims may be biased, since a short seller stands to profit if Supermicro’s stock falls, and it remains to be seen whether its arguments hold up. Additionally, the Justice Department has not confirmed whether it is actually investigating Supermicro. That said, Supermicro does have a history of accounting problems, which is likely why investors are panicking.

For now, though, nothing has been proven. What is noteworthy is that Supermicro is addressing the problem of higher heat generation in AI data centers with its liquid-cooled server solutions.

The stock rose sharply on October 7 after the company announced it had shipped over 2,000 liquid-cooled server racks since June. Additionally, Supermicro notes that it plans to deploy more than 100,000 GPUs per quarter with its liquid cooling solutions. The company claims that its direct liquid-cooled server solutions can deliver energy savings of up to 40% and space savings of 80%, which likely explains why its server racks are seeing solid demand.

Even better, Supermicro management pointed out last year that it can ship 5,000 liquid-cooled server racks per month, and it will come as no surprise if that capacity fills up as data center operators look to reduce costs and energy consumption. Finally, Supermicro says the potential 40% power reduction allows operators to deploy more AI servers within a fixed power envelope, increasing computing power and cutting LLM training time, which is critical for large cloud service providers (CSPs) and AI factories.

Meanwhile, overall demand for liquid-cooled data centers is forecast to grow at an annual rate of over 24% through 2033, generating nearly $40 billion in annual revenue in 2033, compared to $4.45 billion last year. Supermicro is already posting impressive growth, and this new opportunity, driven by increased heat and power generation in data centers, could provide additional momentum for the company.
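As a rough sanity check on that forecast (assuming “last year” refers to 2023, i.e., a 10-year span to 2033), the implied compound annual growth rate works out to:

\[
\text{CAGR} = \left(\frac{40}{4.45}\right)^{1/10} - 1 \approx 24.6\%,
\]

which lines up with the “over 24%” annual growth figure cited above.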

Of course, investors would like more clarity on the company’s operations following recent developments, but it’s worth remembering that Supermicro’s earnings are expected to grow at a CAGR of 62% over the next five years. Therefore, this AI stock should be on the radar of investors looking to make the most of the opportunities presented by the AI-related challenges discussed in this article.

Randi Zuckerberg, former director of market development and spokesperson for Facebook and sister of Mark Zuckerberg, CEO of Meta Platforms, is a member of The Motley Fool’s board of directors. Harsh Chauhan has no position in any of the stocks mentioned. The Motley Fool has positions in and recommends Goldman Sachs Group, Meta Platforms, and Nvidia. The Motley Fool has a disclosure policy.