Séverine Hanauer: “Immersion cooling is a promising technology for data centers”


The analyst firm Gartner forecasts a 20.4% increase in global spending on public cloud services in 2024, driven in particular by the enthusiasm for artificial intelligence. But digital usage is causing a sharp rise in data centers’ electricity consumption: they must meet their customers’ demands for computing power while also reducing their carbon footprint.

Recently, more than a hundred European data center operators and industry associations signed the Climate Neutral Data Centre Pact, through which they have committed to achieving climate neutrality by 2030. To reduce their electricity consumption, many data centers are relying on chilled-water cooling.

Explanations from Séverine Hanauer, Director Telco Strategic Segments & Edge Deployment, Southern Europe at the American company Vertiv, one of the world’s leading providers of critical digital infrastructure.

Engineering Technologies: On one hand, we see sustained demand for artificial intelligence (AI) capabilities; on the other, the need to reduce energy consumption, costs and greenhouse gas emissions. Is this a headache for the data center industry?

Séverine Hanauer
A graduate in International Business Techniques, Séverine Hanauer has held various specialist sales positions around high-power UPS systems, followed by several management roles. She is now Director of Strategic Segments Telco & Edge Deployment at Vertiv, a specialist in the protection and optimization of sensitive data center infrastructure. Copyright: Vertiv

Séverine Hanauer: Data center players are making continuous progress across all aspects of their operations and in the construction of their buildings. But artificial intelligence (AI), automation, high-performance computing (HPC) and machine learning are increasing processing demand, leading to higher thermal densities per chip. Despite the various global agreements and the goodwill announced, we are still seeing a worldwide increase in the carbon footprint. If we really want to be kinder to our environment, we must change our behavior and optimize our use of digital tools, whether professional or personal.

Is this why liquid cooling has become an essential option in high-density data centers?

Processor power keeps growing, and the trend is accelerating with the integration of artificial intelligence. As more and more data is processed in the same amount of time, the heat density inside a computer rack continues to rise. This is why so much effort goes into room layout: server racks are positioned to optimize the management of hot and cold airflows, with everything partitioned into cold or hot aisles. Data center operators, as well as companies that run their own facilities, also integrate cooling systems to manage the heat dissipation of IT equipment as efficiently as possible.

But the most important thing is to integrate the cooling technology best suited to the application and to the physical configuration of the site. Among the latest trends, we find various solutions, in particular chilled-water systems that take the form of cooling walls to channel airflows better and to cool as close as possible to the IT equipment. The objective is also to limit the proliferation of traditional cooling units in a computer room.

Data center operators must account for density, which varies with the type of application: high-performance computing (HPC) and AI rely on the latest generations of the most powerful processors. At extreme heat densities, liquid cooling is the only option that removes heat efficiently, thanks to the heat transfer properties of liquids.
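The physics behind that last claim can be sketched with a quick back-of-the-envelope comparison, using standard textbook constants rather than any vendor figures: per cubic metre and per kelvin of temperature rise, water absorbs on the order of 3,000 times more heat than air.

```python
# Back-of-the-envelope: volumetric heat capacity, J/(m^3*K),
# of water vs. air, using standard physical constants.
water_cp = 4186.0    # specific heat of water, J/(kg*K)
water_rho = 1000.0   # density of water, kg/m^3
air_cp = 1005.0      # specific heat of air, J/(kg*K)
air_rho = 1.2        # density of air at ~20 degrees C, kg/m^3

water_vhc = water_cp * water_rho  # ~4.19 MJ/(m^3*K)
air_vhc = air_cp * air_rho        # ~1.21 kJ/(m^3*K)

print(f"Water absorbs ~{water_vhc / air_vhc:.0f}x more heat per unit volume than air")
```

This order-of-magnitude gap is why liquid loops can serve rack densities that air handling alone cannot.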

Direct-to-chip liquid cooling is the closest option to a standard computer rack. The difference lies in the rack design, which must integrate a liquid manifold to distribute coolant to the IT equipment, which must itself be direct-to-chip compatible. For this solution, Vertiv has developed the Liebert® XDU, a unit installed between the liquid production loop and the IT equipment to ensure proper distribution. In this case, it is essential to modify the site’s technical space, which requires specific planning.
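Sizing such a distribution loop comes down to the basic relation Q = ṁ·cp·ΔT. As a hedged illustration (the 100 kW rack load and 10 K coolant temperature rise are assumed example values, not Liebert® XDU specifications):

```python
def required_flow_lps(heat_kw: float, delta_t_k: float,
                      cp: float = 4186.0, rho: float = 1000.0) -> float:
    """Water flow (litres per second) needed to carry away heat_kw
    with a coolant temperature rise of delta_t_k, from Q = m_dot * cp * dT."""
    m_dot = heat_kw * 1000.0 / (cp * delta_t_k)  # mass flow, kg/s
    return m_dot / rho * 1000.0                  # volume flow, L/s

# Assumed example: a 100 kW rack with a 10 K rise needs roughly 2.4 L/s of water.
print(f"{required_flow_lps(100.0, 10.0):.2f} L/s")
```

The same arithmetic explains the planning requirement mentioned above: pipework, pumps and manifolds all have to be dimensioned for these flow rates before the racks arrive.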

Another liquid cooling approach, immersion cooling, requires a rearrangement of the data center: the standard organization in vertical racks is replaced by horizontal tanks in which the IT equipment is entirely immersed in a thermally conductive dielectric fluid. Immersion cooling is a promising technology, but not yet completely mature, because it also requires data center operators to develop new skills before they can be fully operational on these solutions, their challenges and their constraints.

Edge computing is growing. Is that why you designed the Liebert® APM2?

Many companies, small and large, that operate small computer rooms are indeed interested in edge computing, because processing is carried out as close as possible to end users, which reduces the transmission time of the processed data.

Edge computing players may also be interested in liquid cooling technology. For example, they could adopt a hybrid technical architecture, with part of their site equipped with classic racks served by a water cooling system and, in another part, equipment cooled by immersion. They can thus respond to internal or external demand from customers who want to use the most powerful processors.

In terms of backup power, one of Vertiv’s latest UPS models is the Liebert® APM2, an efficient and scalable power solution suited to edge sites. It is compatible with Li-ion and VRLA batteries and offers very high efficiency, up to 97.5% in double-conversion mode. Its performance and modularity thus reduce operating costs and energy dissipation.
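What a 97.5% figure means in practice can be sketched with simple arithmetic. This is an illustrative comparison only: the 100 kW IT load and the 94% efficiency of an older double-conversion unit are assumed values for the example, not figures from Vertiv.

```python
def ups_loss_kw(load_kw: float, efficiency: float) -> float:
    """Heat dissipated by a UPS: input power minus power delivered to the load."""
    return load_kw / efficiency - load_kw

# Assumed example: 100 kW IT load; 97.5% efficiency vs. an older 94% unit.
loss_new = ups_loss_kw(100.0, 0.975)   # ~2.56 kW of losses
loss_old = ups_loss_kw(100.0, 0.94)    # ~6.38 kW of losses
annual_saving_kwh = (loss_old - loss_new) * 8760  # 8760 hours per year

print(f"~{annual_saving_kwh:.0f} kWh saved per year")
```

Note that every kilowatt of UPS loss is also a kilowatt of extra heat the cooling system must remove, so the saving compounds at the site level.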

Featured image credit: vecstock – Freepik

