Enterprise

Q&A: Data centres need better cooling for Southeast Asia AI boom, says Vertiv

By Alfred Siew
Published: August 22, 2024 (last updated 5:18 PM)
9 Min Read
Chee Hoe Ling, vice-president of product management at Vertiv Asia. PHOTO: Vertiv.

Data centres are mushrooming in various parts of Southeast Asia, as the demand for AI workloads and the region’s growing digital economy drive the need for more computing, storage and networking performance.

One big challenge for the region, of course, is the tropical climate that makes it difficult to cool down the more densely packed and hot-running servers in new data centres.

As a result, many data centres have resorted to liquid cooling. Some even immerse an entire server rack in liquid to transfer the heat away more efficiently than by using the air pushed out by fans.

Traditional cooling methods will not be enough to handle the heat generated by more demanding AI workloads in future, says Chee Hoe Ling, vice-president of product management at Vertiv Asia, which makes equipment such as cooling units used in data centres.

However, even if liquid cooling becomes the norm, the answer lies not just in removing heat from one specific part of the data centre, he explains.

“Data centres in this region will require extensive capacity increases across the entire power train, from the grid to chips in each rack,” he adds, in this month’s Q&A.

NOTE: Responses have been edited for brevity and house style.

Q: Briefly, how can we overcome the challenge of cooling down a data centre in tropical countries in Southeast Asia to support the region’s growing digital economy?

A: Cooling methods in tropical climates can’t follow the rulebook of regions with cooler environments. For instance, in warm and humid climates like Singapore, cooling IT equipment accounts for 37 per cent of total energy consumption in data centres.

A different approach to cooling can be taken. The Vertiv Liebert AHU chilled water system is one example. This cooling unit uses water-side or air-side economisation and high-efficiency fans, working with hot-aisle containment to prevent supply and return air from mixing.

The system also operates at higher return air temperature and with higher chilled water temperature, increasing the system efficiency.

It offers a higher-capacity yet compact cooling solution as kilowatt-per-rack demand continues to increase and data centre space is limited. It operates efficiently to combat rising energy costs.

Q: Many technologies have been tested over the years, which now include more exotic ones like liquid immersion. Which ones have proven to be both cost effective and sustainable for the long term?

A: There’s a tangible reason behind the growing profile of liquid immersion, and that is the higher thermal transfer properties of water or other fluids. In fact, liquid cooling can be up to 3,000 times more effective than using air, which is especially pertinent in the face of high-density racks.
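As a rough editorial sanity check on that figure, one can compare the volumetric heat capacity of water and air using standard textbook constants. The quick Python sketch below is an illustration, not Vertiv's calculation:

```python
# Rough sanity check on the "up to 3,000 times" figure: compare how much
# heat a cubic metre of water vs air can absorb per degree of warming.
# Constants are standard textbook values at roughly room conditions.

water_density = 1000.0  # kg/m^3
water_cp = 4186.0       # J/(kg*K), specific heat of water

air_density = 1.2       # kg/m^3 at ~20 deg C, sea level
air_cp = 1005.0         # J/(kg*K), specific heat of air

water_vol_heat = water_density * water_cp  # ~4.19 MJ/(m^3*K)
air_vol_heat = air_density * air_cp        # ~1.21 kJ/(m^3*K)

ratio = water_vol_heat / air_vol_heat
print(f"Water absorbs ~{ratio:,.0f}x more heat per unit volume than air")
# ~3,470x; the practical advantage is lower once pumping power and
# heat-exchanger losses are counted, hence "up to".
```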

Liquid immersion solutions have also demonstrated the potential to reduce cooling energy costs by as much as 95 per cent. These efficiencies were long recognised in mainframe and gaming applications, and now we’re seeing this expand into other applications. I think, in the long run, this will emphatically prove to be better not only for data centre operators’ bottom lines, but for the environment as well.

However, unlike air cooling, liquid cooling affects both the numerator (total data centre power) and the denominator (IT equipment power) in the power usage effectiveness (PUE) calculation. This makes PUE an ineffective way to compare the efficiency of liquid and air-cooling systems.

Instead, total usage effectiveness (TUE) is more helpful. Measured this way, liquid cooling reduces total data centre power by 10.2 per cent while improving TUE by over 15 per cent.
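To make the distinction concrete, here is a minimal sketch of the arithmetic, using the published definitions (PUE is facility power over IT power; TUE = ITUE × PUE, where ITUE is IT power over compute power). The wattage figures are illustrative assumptions, not Vertiv's data:

```python
# Contrast PUE and TUE for an air-cooled vs liquid-cooled facility.
# All kW figures below are illustrative assumptions.

def pue(facility_kw: float, it_kw: float) -> float:
    """Power usage effectiveness: total facility power over IT power."""
    return facility_kw / it_kw

def tue(facility_kw: float, it_kw: float, compute_kw: float) -> float:
    """Total usage effectiveness: total facility power over compute power."""
    itue = it_kw / compute_kw  # overhead inside the IT equipment (fans, PSUs)
    return itue * pue(facility_kw, it_kw)

# Air-cooled baseline: 1,000 kW of compute, 150 kW of server fans/PSU losses,
# 400 kW of facility cooling and power-train overhead.
air_it, air_facility = 1150.0, 1550.0

# Liquid-cooled retrofit: server fans largely removed (IT drops to 1,050 kW)
# and the cooling plant shrinks (facility drops to 1,300 kW).
liq_it, liq_facility = 1050.0, 1300.0

print(f"Air   : PUE={pue(air_facility, air_it):.2f}  TUE={tue(air_facility, air_it, 1000):.2f}")
print(f"Liquid: PUE={pue(liq_facility, liq_it):.2f}  TUE={tue(liq_facility, liq_it, 1000):.2f}")
# PUE moves modestly (1.35 -> 1.24) because both numerator and denominator
# fall, while TUE (1.55 -> 1.30) captures the full ~16% efficiency gain.
```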

Q: AI workloads are driving denser data centres to deliver the performance needed. Are data centres in Southeast Asia able to deliver this, considering the challenges inherent in the tropical climate?

A: Data centres in this region will require extensive capacity increases across the entire power train, from the grid to chips in each rack. Introducing liquid-cooling technologies into the data centre white space, and eventually enterprise server rooms, will be a requirement for most deployments.

Traditional cooling methods will not be able to handle the heat generated by graphics processing units (GPUs) running AI calculations. Investments to upgrade the infrastructure needed to power and cool AI hardware are substantial and navigating these new design challenges is critical.

The transition to accelerated computing will not happen overnight. Data centre and server room designers across Southeast Asia must look for ways to make power and cooling infrastructure future-ready, with considerations for the future growth of their workloads.

Getting enough power to each rack requires upgrades from the grid to the rack. In the white space specifically, this likely means high amperage busway and high-density rack power distribution units (PDUs).
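For a sense of the numbers involved, here is a hedged sizing sketch for a hypothetical 80kW AI rack on a 415V three-phase supply, a common arrangement in the region. The figures are editorial illustrations, not a Vertiv specification:

```python
# Hypothetical sizing of rack power distribution for a dense AI rack.
import math

servers_per_rack = 8
kw_per_server = 10.0  # e.g. a dense GPU node (assumed)
rack_kw = servers_per_rack * kw_per_server  # 80 kW per rack

voltage = 415.0      # line-to-line, three-phase (assumed)
power_factor = 0.95  # assumed

# Three-phase current: I = P / (sqrt(3) * V_LL * PF)
amps = rack_kw * 1000 / (math.sqrt(3) * voltage * power_factor)
print(f"{rack_kw:.0f} kW rack draws ~{amps:.0f} A")  # ~117 A

# With the common 80% continuous-load rule of thumb, the busway tap and
# rack PDU need ratings well above that, hence "high amperage" busway.
print(f"Rated for ~{amps / 0.8:.0f} A with headroom")  # ~146 A
```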

To reject the massive amount of heat generated by hardware running AI workloads, three liquid cooling technologies are emerging as primary options – direct-to-chip, rear-door heat exchange, and immersion cooling.

For direct-to-chip liquid cooling, cold plates sit atop the heat-generating components – usually chips such as CPUs and GPUs – to draw off heat.

Rear-door heat exchangers, meanwhile, replace the rear door of the IT rack with heat-exchanging coils, through which fluid absorbs heat produced in the rack.

Finally, immersion cooling submerges servers and other components in the rack in a thermally conductive dielectric liquid or fluid, eliminating the need for air cooling altogether.
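Whichever option is chosen, the underlying arithmetic is the same heat balance, Q = ṁ × cp × ΔT. The sketch below, with assumed heat loads and coolant temperature rise chosen purely for illustration, gives a feel for the flow rates involved:

```python
# Coolant flow needed to remove a given heat load, from Q = m_dot * cp * dT.
# Loads and temperature rise below are illustrative assumptions.

def coolant_flow_lpm(heat_kw: float, delta_t_k: float,
                     cp_j_per_kg_k: float = 4186.0,
                     density_kg_per_m3: float = 1000.0) -> float:
    """Litres per minute of water-like coolant absorbing heat_kw at delta_t_k rise."""
    m_dot = heat_kw * 1000 / (cp_j_per_kg_k * delta_t_k)  # mass flow, kg/s
    return m_dot / density_kg_per_m3 * 1000 * 60          # convert to L/min

# A 700 W GPU cold plate (direct-to-chip) with a 10 K coolant rise:
print(f"{coolant_flow_lpm(0.7, 10):.1f} L/min per GPU")  # ~1.0 L/min

# A whole 80 kW rack via a rear-door heat exchanger, same 10 K rise:
print(f"{coolant_flow_lpm(80, 10):.0f} L/min per rack")  # ~115 L/min
```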

In addition, with the recently launched Sustainable Tropical Data Centre Testbed (STDCT) in Singapore, there is more reason to be buoyant about unlocking newer, more efficient thermal management technologies tailored to the specific conditions of tropical climates.

Q: Greenfield data centres may come with the latest efficient cooling but older ones need work to improve their PUE. How challenging is this work to redesign or retrofit such facilities in the region?

A: Retrofitting’s energy efficiency benefits overwhelmingly come to the fore when operators pivot from PUE to TUE.

On the issue of modernising infrastructure, there are several chokepoints that data centre managers may be wary of. These include compatibility, space constraints, downtime during upgrades, as well as potential disruptions to ongoing operations.

We understand that data centre managers have highly specific requirements about introducing liquids into the rack. When retrofitting or repurposing existing environments, we can help with optimised designs that minimise disruption to existing workloads by leveraging available cooling infrastructure and heat rejection where possible.

For example, we can integrate direct-to-chip liquid cooling with a rear-door heat exchanger to maintain a room-neutral cooling solution. In this case, the rear-door heat exchanger prevents excess heat from escaping into the room.

For an air-cooled facility looking to add liquid cooling equipment without any modifications to the site itself, we have liquid-to-air design options available. This same strategy can be deployed in a single rack, in a row, or at scale in a large high-performance computing (HPC) deployment. For multi-rack designs, we have also included high amperage busway and high-density rack PDUs to distribute power to each rack.

These options are compatible with a range of different heat rejection options that can be paired with liquid cooling. This establishes a clean and cost-effective transition path to high-density liquid cooling without disrupting other workloads in the data room. 

Tagged: Chee Hoe Ling, data centre, Liebert, liquid cooling, liquid immersion, Q&A, Southeast Asia, sustainability, tropical data centre, Vertiv

By Alfred Siew
Alfred is a writer, speaker and media instructor who has covered the telecom, media and technology scene for more than 20 years. Previously the technology correspondent for The Straits Times, he now edits the Techgoondu.com blog and runs his own technology and media consultancy.