
How AI Is Driving Data Center Transformation - Part 3

Data Centers Explore New Technologies to Go Green and Run Cooler

As data centers incorporate GPUs to handle AI workloads, they face a new challenge. GPUs emit a lot of heat, so traditional cooling solutions such as computer room air conditioning (CRAC) units and computer room air handlers (CRAH) will not suffice. This has prompted data centers to explore innovative cooling technologies designed for AI workloads.

In Part 2 of this series, we looked at how data centers are addressing the need for more power and cooling while managing environmental impact. Part 3 explores the new cooling technologies data centers are deploying and their plans for sustainability.

According to AFCOM's 2024 State of Data Center Report, AI is already having a major influence on data center design and infrastructure. Global hyperscalers and data center service providers are increasing their capacity to support AI workloads. This has a direct impact on power and cooling requirements. In terms of power, the average rack density is expected to rise from 8.5 kW per rack in 2023 to 12 kW per rack by the end of 2024, with 55% of respondents expecting higher rack density in the next 12 to 36 months.

As GPUs are fitted into these racks, servers will generate more heat, increasing both power and cooling requirements. The optimal temperature for operating a data center hall is between 21 and 24°C (69.8 to 75.2°F), which means that any increase in rack density must be accompanied by improvements in cooling capability. As traditional systems can no longer meet these demands, data centers are looking at new solutions, including liquid immersion cooling (LIC), rear-door heat exchangers, direct-to-chip cooling and liquid-to-air cooling.

New Cooling Technologies

Sumit Mukhija, CEO of ST Telemedia Global Data Centres (India), said his company has experimented with LIC and direct-to-chip cooling in its innovation lab. It is also making other modifications in its data centers to prepare for increased rack densities, such as raising floor heights and increasing floor loading capacities to support heavier racks.

Jason Plamondon, regional senior manager - sustainability for Asia Pacific at Equinix, told ISMG his company is investing in "innovative" cooling technologies to improve efficiency. These advancements will also enable more businesses to use "the most performant cooling solutions" for the powerful, high-density hardware that supports compute-intensive workloads such as AI.

"We have expanded our support for advanced liquid cooling technologies such as direct-to-chip to more than 100 of our International Business Exchange, IBX, data centers, in more than 45 metros around the world. This builds on Equinix's existing offering that supports liquid-to-air cooling, through in-rack heat exchangers, at nearly every IBX today," Plamondon said.

Green Data Centers

Equinix is also investing in renewable energy sources to build sustainable data centers. In 2015, it claimed to have become the first data center company to set a 100% renewable energy goal.

"In June 2021, we set a goal to be climate neutral by 2030 - aligned with a near-term science-based target across Scope 1, 2 and 3 as defined by the Greenhouse Gas Protocol and in line with the 2015 Paris Climate Agreement," Plamondon said.

According to Plamondon, Equinix continues to build upon its Future First sustainability strategy, which is designed to deliver "meaningful and measurable progress" that positively impacts its customers, partners, investors and employees. This commitment is a critical step to reduce greenhouse gas emissions.

In a video interview with ISMG, Vipin Shirsat, managing director - India, Princeton Digital Group, said sustainability is a major focus at PDG, which has pledged to be carbon neutral by 2030 across all its sites and markets (see: Global Hyperscalers Head to Asia to Execute AI Strategies).

"Inside PDG, we have a framework that is followed by every data center asset. And this framework guides us toward initiatives we need to take to attain the sustainability goal we have set for ourselves," Shirsat said.

Data Center Efficiency

The efficiency of a data center is measured by a metric called power usage effectiveness, or PUE: the ratio of the total power drawn by a data center to the power used by its computing equipment. A PUE of 1.0 would mean every watt goes to the IT equipment, so data center providers aim to bring their PUE rating as close to 1 as possible. One way to achieve that is to reduce the power consumed by cooling through advanced cooling technologies. Here are some examples of data centers that have succeeded in this endeavor.

In October 2022, PhonePe, a leading Indian fintech platform for mobile payments, announced the launch of its first green data center in India. Built in partnership with Dell Technologies and NTT, the 4.8-megawatt facility, which occupies 13,740 sq. ft. at Mahape, Navi Mumbai, was designed with hybrid cooling technologies such as direct contact liquid cooling (DCLC) and LIC. With DCLC, the PhonePe data center achieved a PUE of 1.3. It also uses LIC, in which servers are immersed in a dielectric oil, achieving a PUE of 1.1. Combining these cooling technologies gives the facility an aggregated PUE of 1.27, reducing its power consumption to 1.58 MW. In comparison, a traditional air-cooled system typically achieves a PUE of about 1.6.
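To make the PUE arithmetic concrete, here is a minimal sketch in Python. The figures are illustrative, loosely derived from the PhonePe numbers above (a 1.58 MW total draw at a PUE of 1.27 implies an IT load of roughly 1.24 MW); the function and variable names are ours for illustration, not from any vendor tool.

    def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
        """Power usage effectiveness: total facility power divided by IT power."""
        if it_equipment_kw <= 0:
            raise ValueError("IT load must be positive")
        return total_facility_kw / it_equipment_kw

    # Illustrative figures: 1,580 kW total facility draw,
    # of which roughly 1,244 kW powers the IT equipment.
    total_kw = 1580.0
    it_kw = 1244.0

    print(f"PUE: {pue(total_kw, it_kw):.2f}")                        # ~1.27
    print(f"Cooling and other overhead: {total_kw - it_kw:.0f} kW")  # ~336 kW

Everything above 1.0 is overhead, so pushing PUE from 1.6 toward 1.1 cuts the non-IT share of the power bill substantially.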

Yotta is another data center company that has invested in DCLC technology. Using copper pipes, chilled water is run from the coolant distribution unit right up to the racks, or to the chip within the server.

"This [type of engineering] is complicated and very difficult to do," said Sunil Gupta, co-founder, managing director and CEO of Yotta. "In the first phase, we went in for rear-door heat exchangers, which is not a new technology. But the future is with LIC technology, which can bring the PUE down to 1.1."

According to Gupta, Yotta's Shakti Cloud is India's fastest AI supercomputing infrastructure, with 16 exaflops of compute capacity supported by 16,384 Nvidia H100 Tensor Core GPUs - the largest deployment of GPUs in India.

Vimal Kaw, senior director - products and services, NTT Global Data Centers, India, predicts that DCLC will become popular in India within the next two and a half years. He said data center infrastructure is not yet ready for LIC because the required equipment is not readily available.

Tim Rosenfield, co-founder and co-CEO of Sustainable Metal Cloud, told CNBC that most data centers are not ready for liquid cooling of any type, whether immersion or direct-to-chip. "The market is figuring out the best way to employ this, and I think there'll be multiple ways," Rosenfield said. Sustainable Metal Cloud said its immersion cooling technology is 28% cheaper to install than other liquid-based solutions and reduces energy consumption by up to 50%.

"There is still a lot of air cooling that still happens in the data center and will continue to happen even in the full high-density AI data center," Giordano Albertazzi, CEO of Vertiv, told CNBC. But he expects liquid-cooling adoption to accelerate in 2024. Vertiv offers thermal management solutions ranging from hybrid air- and liquid-cooling systems to fully liquid-cooled data centers.

Another major player in this area is Supermicro, which offers thermal management and liquid cooling solutions. Thermal management companies have quickly become some of the most sought-after players in the AI data center market.


About the Author

Brian Pereira

Sr Executive Editor - CIO.inc, ISMG

Pereira has nearly three decades of journalism experience. He is the former editor of CHIP, InformationWeek and CISO MAG. He has also written for The Times of India and The Indian Express.



