Data centres are often referred to as the heart, backbone or bottleneck of digitalisation. All comparisons point to the fact that nothing works in the digital world without data centres. Every video we stream, every search query we send, every cloud service we use: data centres are responsible for processing all of our actions online.
As digitalisation permeates all areas of life, data centres are springing up all over the world. The race to process our data as quickly as possible is in full swing—but it comes with consequences. The power consumption of data centres is rapidly increasing, leading to rising emissions. What’s more, their water consumption puts pressure on drought-stricken countries, while rapidly outdated servers add to the global mountain of electronic waste.
Until now, the manufacture of devices was assumed to be responsible for the largest share of digitalisation’s CO2 emissions. But this seems to be changing. Data centres are catching up fast, in no small part due to the increasing dominance of AI, and there’s a consensus that they’ll take the lead in coming years. This threat to the environment is clear to Dr Ralph Hintemann, a partner and senior researcher at the Borderstep Institute for Innovation and Sustainability in Berlin, whose work focuses in large part on the sustainable design of data centres.
In this interview, we asked Ralph Hintemann what’s behind the high CO2 emissions and water consumption of data centres. We talked about why AI servers in particular have such a short lifespan, the potential of waste heat recovery and how local authorities could play a crucial role in making data centres more sustainable.
RESET: Why are data centres so crucial for sustainable digitalisation?
Ralph Hintemann: If you look at digitalisation, you can distinguish between three areas: data centres, telecommunications networks and end devices. In the past, end devices massively dominated the demand for resources. A 2020 study showed that networks and data centres together accounted for less than a third of total CO2 emissions from digitalisation. End devices, on the other hand, accounted for more than two-thirds.
But all of that’s now changing. Video streaming, mobile phone use and many of the apps we use privately are based on services that run in data centres. Above all, however, companies and businesses need data centres. And we are seeing a massive increase in emissions in this area.
The International Energy Agency says that the power consumption of data centres is likely to more than double by 2030. That is enormous growth. And that’s why data centres will dominate CO2 emissions from digitalisation in the long term. For years, the trend with end devices has actually been that they consume less and less power. Data centres, on the other hand, run 24 hours a day, 365 days a year and therefore consume a lot of electricity.
At the moment, however, the manufacture of appliances still dominates CO2 emissions, doesn’t it?
Yes, we currently still assume that end devices are responsible for more CO2 emissions than data centres. But data centres are growing massively. Even the EU has now set the target of tripling data centre capacity in Europe over the next five to seven years. And that could mean three times as much electricity. If that happens, it won’t be long before data centres overtake end devices as the dominant source of digitalisation’s CO2 emissions.
Our growing demand for data centres is mainly due to the AI boom, isn’t it?
Even without AI, the sector is experiencing strong growth because we’re digitising production and administration. And we’re also using more and more digital services in our private lives. However, it’s assumed that artificial intelligence will further accelerate this growth and that we’ll have a significantly higher demand for energy and resources from data centres in the future, primarily due to AI.
What is the current energy requirement of AI?
People always talk about AI in general terms, but there are many different AI applications to consider. Applications that run on a smartphone or laptop don’t consume much power at all. In one project, we’re currently trying to optimise the use of waste heat from data centres with the help of AI. And the models we’re using are programmed on a laptop.
When we talk about AI today, we usually picture large language models (such as ChatGPT) or image and video generation models. And these forms of AI actually require enormous amounts of resources.
There is a study from the USA, where most of the Big Tech companies are based. It shows that AI applications already account for around half of the power consumption of the country’s data centres. Meanwhile, worldwide, AI accounts for around 20 percent of data centre capacity—about as much as the Netherlands’ total electricity requirements for a year. In Germany, on the other hand, we have practically no large AI data centres outside of research.
Does this mean that Large Language Models (LLMs) are trained in special computer centres?
Yes, to train and subsequently use LLMs, we need specially designed computers that use GPUs, i.e. graphics processors. Because of this, the manufacturer Nvidia has become one of the most valuable companies in the world. We could also train LLMs on normal hardware, but it’s not as efficient.
However, graphics processors also require a lot of resources, especially electricity. And that’s not the only problem. GPUs are not used for as long as classic servers. While other servers typically run for six years or longer, AI servers are usually replaced much more quickly, according to our information. And that leads to ever-higher resource consumption.
Why do AI servers have such a short lifespan?
Well, there used to be many data centres, especially large ones, that replaced their classic servers after just two or three years. Today, they are used for much longer. Hardware naturally costs money, after all, and the necessary expansion in computing power could not have been achieved with new hardware alone. Computing processes that do not require as much power have also been shifted to older hardware. Fortunately, there are now more and more companies that specialise in refurbishing decommissioned servers and then reselling them for use in less powerful applications. Today, we expect classic servers to have an average service life of six years or more.
AI servers, on the other hand, are often replaced more quickly because performance requirements in this area are increasing massively. AI models are getting bigger and require more and more computing power—hence, they’re being replaced more quickly.
And what about increasing server performance?
There is a law called Moore’s Law. One of the founders of Intel realised that the number of transistors per chip doubles approximately every one and a half to two years. The structures on the microchips therefore became smaller and smaller, while the computing power became ever greater. This has been the case for decades.
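To make the doubling rule concrete, here is a minimal sketch of the growth it implies. The starting transistor count and the ten-year horizon are illustrative assumptions, not figures from the interview:

```python
# Moore's Law as described above: transistor counts per chip double
# roughly every 1.5 to 2 years. This projects that growth forward.

def transistors_after(years: float, start: float, doubling_period: float) -> float:
    """Project a transistor count assuming doubling every `doubling_period` years."""
    return start * 2 ** (years / doubling_period)

# Starting from an assumed 1-billion-transistor chip:
for period in (1.5, 2.0):
    projected = transistors_after(10, start=1e9, doubling_period=period)
    print(f"doubling every {period} years -> {projected:.1e} transistors after 10 years")
```

Even at the slower two-year cadence, a decade of doubling means a 32-fold increase, which is why the slowdown of this curve matters so much for hardware demand.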
We’ve been reaching the limit for some years: with microchips, it’s hardly possible to go any smaller. That’s why people have been talking about the end of Moore’s Law for some time now. And we can also see from new chips that performance is no longer increasing as quickly.
However, this doesn’t mean that computing power is increasing more slowly overall. Increasing digitalisation and the AI trend require ever faster growth in computing power, and since individual chips no longer deliver the same gains, that growth has to come from more hardware. As a result, resource requirements are also increasing. The end of Moore’s Law therefore has the effect of accelerating the growth of data centres.
When we talk about replacing servers, we’re talking about huge amounts of electronic waste, aren’t we? Data centres are huge buildings full of server cabinets standing in long corridors. Are the entire server cabinets replaced with new ones or just individual components?
The cabinets contain the servers and, of course, network and storage technology. And yes, this technology is typically replaced completely. It’s similar to our PCs and other devices. You might replace a hard drive but that’s not standard procedure. Typically, the devices are replaced completely.
In data centres, there is a growing trend not just to recycle servers but to reuse them. As already mentioned, more and more companies are addressing the issue of server refurbishment. For example, a server is given more RAM and a new hard drive and can be used for a few more years.
The material recycling of servers, on the other hand, is more difficult. Many different elements are built into the devices. We are currently barely able to separate these with our recycling processes.
In addition to increased refurbishment, what else should be done to ensure that servers are used for longer?
The power requirements of digitalisation will continue to increase, even with more efficient servers. To improve this, one option is refurbishment. Another is cascading utilisation: using new servers for applications that require very high performance and older servers for applications with low performance requirements. Everyone is probably already familiar with this principle; for certain tasks, an old smartphone or an old laptop is sufficient.
However, cascading utilisation isn’t currently very common. Typically, servers are purchased and simply replaced after a certain period of time. One obstacle here is that many server manufacturers often only offer support for five years in their contract terms. Many operators therefore replace servers after five years as standard, even though the servers could have run for seven years or longer. Such structures need to be addressed: why shouldn’t a server manufacturer offer seven years of spare parts or updates?
There is another aspect of data centres that is heavily criticised: their high water consumption. There are countless protests across the world against data centre construction because of their thirst for water.
Water consumption is a complex issue because data centres consume water in three different ways. Firstly, the production of IT hardware requires a lot of water. However, there is still little information on this and hardly any awareness.
The second area, which currently causes the highest water consumption, is electricity generation. In oil, gas, coal and nuclear power plants, water is boiled to drive turbines, and large amounts evaporate from the plants’ huge cooling towers.
And then there is the third area: water consumption in the data centres themselves. Here, water is mainly used for efficient cooling. Just as sweat evaporates on our skin and cools us down, data centres are cooled by water evaporation.
Water consumption for cooling can be enormous. A data centre can use over a billion litres of water per year, whether that’s clean drinking water, collected rainwater or processed water. The location of the data centre plays a part in whether cooling with water makes sense from an ecological point of view. If there is plenty of water available near the data centre, it may make sense. But in areas where water is scarce, reducing water supplies further could put strain on people and the environment.
Unfortunately, evaporative cooling works best in areas with dry air and high temperatures—exactly the kind of places where water is scarce. In areas with high humidity and low temperatures, evaporative cooling is less effective and is therefore not used as frequently.
What options do we have to reduce water consumption in data centres?
A data centre only needs large quantities of water directly for one particular cooling technology, evaporative cooling. But we don’t have to use this method.
From our studies, we know that only around 30 to 40 percent of data centres in Germany use evaporative cooling. In my opinion, this type of cooling should only be used where sufficient water is available. And even then, it’s best not to use high-quality drinking water but collected rainwater or grey water. Where water is scarce, cooling should take place without using water.
Why do most data centres use drinking water and not grey water or rainwater?
This is mainly a question of cost. You can use grey water or rainwater, but it has to be purified, which is complex and expensive.
What would be another efficient and sustainable cooling technology?
There are many different approaches to efficient cooling, for example using liquids or water in a closed circuit. Other methods include direct free cooling or using higher temperatures in the data centre.
At RESET, we’ve reported on swimming pools heated by data centre servers. In Stockholm, meanwhile, an entire district is already supplied with waste heat from a data centre. How scalable are these solutions? And are they really helpful in limiting the environmental impact of data centres?
Using waste heat from data centres will play a major role in Germany and Central Europe in the future. We have this enormous growth in data centres. And in purely physical terms, all that happens in a data centre is that electricity is converted into heat and computing power is produced in the process.
At the same time, in Germany and throughout Central Europe, our heat supply is heavily based on fossil fuels such as oil and gas. It therefore makes sense to combine using waste heat from data centres with the energy transition.
In my view, it is an ecological necessity to use as much of this waste heat as possible. The main reason why we didn’t do this in the past was that it was simply cheaper to burn oil and gas to generate heat instead of connecting data centres to heating networks. This is because you would have to use a heat pump to increase the temperature of waste heat before it can be used to heat homes, requiring extra electricity.
In Scandinavia, and especially in Sweden, the framework conditions are different. There are many more district heating networks than in Germany and the price of electricity used to be very low. This means that operating a heat pump quickly pays off and you can use and distribute heat from data centres much more cost-effectively.
But that’s changing now in Germany and Central Europe, too.
Yes, oil and gas will become more expensive and we need to expand our heating networks. And there are new, large heat pumps that make it easier to process heat from data centres. That’s why there are now more waste heat projects in Germany. But even so, we’re still at the beginning.
Do you mean that we’re still in the prototype phase when it comes to using waste heat from data centres? Or why is it not yet standard procedure?
I wouldn’t talk about prototypes; it’s currently about implementation. We don’t need any more innovations for waste heat utilisation: it’s already proven technology.
We’re now starting to implement more waste heat utilisation, but of course, that always takes a certain amount of time. There are rarely situations where a new major heat consumer, such as a new housing development, is situated right next to a data centre. There are projects in Frankfurt and Berlin where residential areas are now being supplied with heat from data centres. But implementation takes a few years, as the flats have to be built first.
Replacing an existing heat supply, on the other hand, is much more complex. One problem is that district heating networks in Germany often operate at very high temperatures of up to 100 degrees Celsius. It takes a lot of energy to bring the waste heat from data centres to this level. It’s hardly worth it unless there are new heating networks that operate at lower temperatures.
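As a rough illustration of why the temperature lift matters, a heat pump's best-case (Carnot) efficiency falls sharply as the target network temperature rises. The sketch below assumes 35 °C server waste heat and a heat pump reaching half the Carnot limit; all figures are illustrative assumptions, not values from the interview:

```python
# Rough estimate of heat pump efficiency (COP) when lifting data centre
# waste heat to a district heating network. Higher network temperatures
# mean a larger lift, a lower COP and more electricity per unit of heat.

def heating_cop(t_source_c: float, t_sink_c: float, carnot_fraction: float = 0.5) -> float:
    """Approximate heating COP as a fraction of the ideal Carnot COP."""
    t_sink_k = t_sink_c + 273.15          # Carnot formula needs Kelvin
    lift = t_sink_c - t_source_c          # temperature difference to bridge
    return carnot_fraction * t_sink_k / lift

waste_heat_c = 35  # assumed waste heat temperature from servers
for network_c in (60, 100):
    cop = heating_cop(waste_heat_c, network_c)
    print(f"lifting {waste_heat_c} C waste heat to {network_c} C: "
          f"COP ~ {cop:.1f} ({1/cop:.2f} kWh electricity per kWh heat)")
```

Under these assumptions, feeding a 100 °C network takes more than twice as much electricity per unit of heat as a 60 °C low-temperature network, which is the economic argument for pairing waste heat with new, cooler networks.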
What about innovations for greater efficiency in data centres? Is there potential there?
In terms of efficiency, the current focus is primarily on making cooling and the entire building technology more resource-efficient. Cooling is already 25 to 30 percent more efficient than it was ten years ago. But we also need more electricity to cool data centres today than we did ten years ago because of a higher demand for computing power.
However, most of the electricity consumption is in the hardware and software. This is where the greatest potential for efficiency lies. But this also presents a major challenge because most software solutions today are not efficient at all: they’re programmed in such a way that they require far more resources than necessary. In the past, it was all about building software as quickly and securely as possible. We’re only just starting to teach the efficiency of software in computer science degree programmes. It’s therefore very difficult to make real progress in this area. We don’t even know how exactly we can measure efficiency. But there are initial approaches, such as the Blue Angel for software.
With artificial intelligence, the efficiency of software and hardware will become even more important. The type of AI, which models and which hardware we use play a major role in power consumption. I see an opportunity for Europe and Germany in particular. Perhaps we can find our position in international competition by making AI so efficient that we don’t need these huge data centres like in the USA.
We’ve talked about various solutions. But what does a truly sustainable data centre look like? How should we design a blueprint for data centres of the future?
For a data centre to be sustainable, it must be powered by renewable electricity and not use offset certificates to earn its green status. It should also use as much of the waste heat from the data centre as possible. This is especially true in Germany and other cooler countries.
The data centre should also be designed for sustainability as early as the construction phase. For example, there are now concepts for data centres using timber construction, wooden racks or more eco-friendly concrete.
It should use the hardware for as long as possible and rely on efficient software and hardware, utilising them to capacity. There are still many data centres where server utilisation is below 20 percent. That is neither efficient nor sustainable.
And when we talk about sustainable data centres, what goes on there should also have something to do with sustainability. For example, a data centre that consumes fewer resources in industrial production or makes our energy networks more efficient. If, on the other hand, a data centre does crypto mining or uses computing processes to tap into new oil reserves, then it’s not promoting sustainability, however sustainable its practices.
Ultimately, the energy and resource requirements, especially in large data centres, will always be considerable. It’s therefore almost impossible to achieve a 100 percent sustainable data centre.
Looking at what is processed in data centres is also interesting. There are now banks where you can have an influence on what happens to your money. But I’ve never heard of data centres excluding the processing of certain data.
Yes, I’m not aware of this either. There are, of course, green cloud or email providers, but what data is being processed doesn’t seem to be an issue for the main data centres so far.
There are now a number of regulations at an EU level to limit the environmental impact of data centres. What do you think, are we on the right track?
The next few years will show whether we’re on the right track. Up until 10 years ago, nobody even looked at data centres. But a lot has changed since then. The EU has set itself the goal of operating data centres and telecommunications infrastructures in a climate-friendly or even climate-neutral way by 2030. Germany has also set corresponding targets. With the Energy Efficiency Directive at an EU level, for the first time, there really are rules that apply specifically to data centres. In Germany, we have also implemented these with the Energy Efficiency Act, which defines clear initial rules.
At an EU level, the main issue at the moment is recording. Data centres have to report where they’re located and how much electricity and water they consume. This will be used to create a register.
In Germany, there are also initial specifications on how efficient data centre buildings must be with their air conditioning and power supply. They also state that 50 percent of electricity must come from renewable sources. From 2027, the target is 100 percent.
In Germany, there is also an obligation to use waste heat for new data centres that go into operation. At an EU level, the only requirement for waste heat utilisation so far is that the possibility must be examined.
However, this also raises the question of whether waste heat utilisation makes sense for all EU countries. In Spain or Greece, for example, it may not be absolutely necessary to prescribe the use of waste heat from data centres. The situation is similar for air conditioning. In countries with higher outside temperatures, more energy is required for cooling. Therefore, somewhat stricter limits are certainly appropriate in Germany than in Greece.
Are there other ways of influencing data centre operators to become more sustainable?
When building new data centres in Berlin, Brandenburg or the Frankfurt area, more and more local authorities or civil society groups are making additional sustainability demands that aren’t prescribed by law. For example, they say that the building must be constructed in a particularly sustainable way and have solar energy on the roof or that reforestation measures are necessary. Fortunately, local authorities are increasingly making such demands. And the operators of data centres very often play along, whether that’s to meet their own sustainability goals or to overcome local opposition and build as quickly as possible.
Of course, this is also a challenge for administrations. They first have to know how to deal with data centres and which requirements make sense. But there are already some great examples of data centres and local authorities working very well together. This starts with the redevelopment of brownfield sites, sustainable building construction and the use of waste heat, through to nature and species conservation compensation measures and support for municipal climate protection measures.
What can I do as an individual to reduce the CO2 emissions of data centres?
It is important to realise that every action on the internet consumes resources. The amount of resources is very low for a message but higher when using AI or streaming videos. When choosing cloud or email services, for example, you can opt for a sustainable provider. The more demand there is, the more market pressure there will be on larger companies.
And as a company or organisation, I could also choose a green data centre.
Exactly. Companies and organisations have significantly more options for action. For example, companies can ask a data centre to disclose its environmental footprint. There’s a lot of movement in this area at the moment, so more transparency is emerging, partly as a result of EU legislation.
Thank you very much for the interview, Ralph!
The post “Data Centres Will Account for the Majority of Digitalisation’s CO2 Emissions”: An Interview With Ralph Hintemann (Borderstep Institute) appeared first on Digital for Good | RESET.ORG.