The case for eco-friendly computing
Industry Voices

[Photo: Deep Green's Exmouth installation]

The exponential growth of data and computing is driving an equally steep rise in the energy consumption and carbon emissions of data centres.

As an industry we consume 3% of the entire world's electricity supply. Amazingly, the amount of compute the world needs is projected to increase 10-fold over the next 10 years; in other words, we will need ten times the current capacity of data centres. Clearly something has to change: scale today's 3% by ten and the industry would need 30% of the world's electricity supply, which simply cannot be made available to data centres. The reality is that all sectors are now competing with each other for electricity as we all rush to decarbonise. There simply aren't going to be enough electrons to go around.

Incremental efficiency gains in existing facilities can be achieved through established cooling best practices and by optimising power usage effectiveness (PUE) ratios, but these measures alone won't be anywhere near enough. If the data centre industry is to expand its capacity to meet this explosion in demand, we urgently need to find a way to operate without requiring more electricity than we currently consume. But how?
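
To see why PUE optimisation alone cannot close the gap, consider a rough, illustrative calculation; the figures below are assumptions for the sake of the sketch, not data from any particular facility. PUE is simply total facility energy divided by the energy delivered to the IT equipment, so even a perfect score of 1.0 leaves the IT load itself untouched.

# Illustrative only: PUE compares total facility energy with the energy
# that actually reaches the IT equipment. The figures below are made up.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness = total facility energy / IT energy."""
    return total_facility_kwh / it_equipment_kwh

it_load_kwh = 1_000_000              # energy consumed by the servers (hypothetical)
cooling_and_overheads_kwh = 350_000  # cooling, power distribution, lighting, etc.

print(pue(it_load_kwh + cooling_and_overheads_kwh, it_load_kwh))  # 1.35
# Driving this towards 1.0 trims the overhead, but the 1,000,000 kWh of
# IT load is unchanged; efficiency gains alone cannot absorb a 10x growth
# in demand.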

Deep Green Technology, a provider of distributed edge data centres, has developed a model for carbon-neutral, sustainable data centres that reduces environmental impact while benefiting local communities. Most importantly of all, we recycle heat, and it is this recycling that is the unlock, the get-out-of-jail card, for the data centre industry in the race to Net Zero. By twinning data centres with sectors that require heat, such as home heating, swimming pools and laundries (which, as it happens, account for around 30% of all industrial and leisure processes), we are performing a kind of alchemy: allowing the data centre industry to expand to ten times its size while reducing the share of the total electricity pie it needs.

Deep Green's pilot site in Exmouth uses 100% renewable energy to power high-density workloads. But what makes us stand out is our immersion cooling and our economic model. Rather than relying solely on power-hungry air-conditioning units, our servers are immersed in mineral oil baths that remove heat far more efficiently. In this way we can capture 97% of the energy that goes into the machines as usable heat, which we then recycle into local buildings and swimming pools, for free. From a carbon perspective, by avoiding scope 1 and 2 emissions and offsetting our reduced scope 3, we provide a carbon-free way of scaling critical infrastructure. Longer term, by taking the compute to where the heat is required, we can expand our estate effectively without limit, and without putting extra strain on the grid. It's a simple but revolutionary idea.
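
As a back-of-envelope sketch of what that 97% figure means in practice (the unit size and running hours below are assumptions for illustration, not Deep Green's published specifications):

# Back-of-envelope sketch with assumed figures: how much usable heat a small
# immersion-cooled edge unit could export if 97% of the electrical input is
# captured, as described above.

unit_power_kw = 200          # hypothetical IT load of one edge unit
capture_ratio = 0.97         # share of input energy recovered as usable heat
hours_per_year = 24 * 365

heat_kw = unit_power_kw * capture_ratio
annual_heat_mwh = heat_kw * hours_per_year / 1000

print(f"{heat_kw:.0f} kW of heat, ~{annual_heat_mwh:.0f} MWh per year")
# ~194 kW of heat, roughly 1,700 MWh a year: energy the host pool or heat
# network no longer has to buy.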

How easy is it for other providers to replicate what we're doing? The biggest obstacle is mindset. Reducing the environmental footprint of a compute estate requires a fundamental rethinking of infrastructure design. Gaining buy-in across an organisation is the essential first step. Time in-market, showing that the solution can go toe-to-toe with a conventional data centre operation on uptime and reliability, is critical. Proofs of concept and small first steps begin to build that trust and confidence.

Ensuring a consistent heat output for our hosts is the next biggest challenge. We generate heat at around 55-60°C, but to remain welcome and useful to our heat hosts, we need to ensure our heat is always there. Big changes in containerisation and in the way cloud-native applications are built will soon naturally maximise the utilisation of each and every server. In the meantime, some clever load balancing is required to keep heat output, cost of capital and asset utilisation in sync.
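
A minimal sketch of the kind of load balancing this implies, with made-up names and thresholds rather than Deep Green's actual scheduler: when interactive demand dips, deferrable batch work is pulled forward so a site's utilisation, and therefore its heat output, never falls below what the heat host relies on.

# Illustrative scheduler sketch (assumed names, capacities and thresholds).
# Tops up a site with queued batch jobs until it meets a utilisation floor,
# keeping heat output steady for the host.

from collections import deque

MIN_UTILISATION = 0.70   # assumed floor needed to keep heat supply consistent
SITE_CAPACITY_KW = 200   # assumed IT capacity of one edge site

def top_up(interactive_kw: float, batch_queue: deque) -> list:
    """Schedule queued batch jobs until the site reaches its utilisation floor."""
    scheduled = []
    load_kw = interactive_kw
    while batch_queue and load_kw / SITE_CAPACITY_KW < MIN_UTILISATION:
        job, job_kw = batch_queue.popleft()
        scheduled.append(job)
        load_kw += job_kw
    return scheduled

queue = deque([("render-frames", 40), ("ml-training-shard", 60), ("backup", 25)])
print(top_up(interactive_kw=90, batch_queue=queue))
# ['render-frames', 'ml-training-shard'] -> 190 kW, 95% utilisation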

Operationally, building and running a decentralised network of meshed data centres is, counter-intuitively, pretty straightforward. We don't fret about cooling the way a conventional data centre might: closed-loop, single-phase immersion is so simple that there is very little to go wrong, unlike fan-cooled kit, and immersing computers actually protects them and extends their lifespans. In terms of maintenance, while the baths of oil are fairly messy today, very soon (we're already trialling the kit) they will be replaced with hot-swappable units that robots can remove without any contact with the oil and send for repair via AVs. Decentralisation actually makes maintenance easier, cheaper and quicker.

When it comes to deployments, it is far easier to land a Deep Green unit than it is to build a conventional data centre. We have no planning, grid-connection, water, network-connectivity or build costs, which in itself dramatically reduces our scope 3. We have thousands of megawatts of sites already signed up, all of which already have the power we need. It takes us less than a week to land and "plug in" to a host site such as a swimming pool. Once we're in, our cooling costs are almost zero.

Consumer scrutiny and regulation such as Europe's Corporate Sustainability Reporting Directive are already mandating that environmental impact be managed proactively, so while data centres have not traditionally played this district-energy role, there won't be anywhere to hide in the years ahead. Starting early to practise these new skills is essential. As Rackspace's head of sustainability remarked, "what's not efficient for the environment will not be efficient for business."

The good news is that demand for zero-emissions digital infrastructure is rising fast. Businesses with a Net Zero target will flock to your banner; they are already flocking to ours. With carbon now priced at $75 a tonne and rising, mimicking innovations like Deep Green's will quickly become a competitive necessity, not just a nice eco-friendly add-on. The firms taking steps today to make green the default will have a first-mover advantage as the industry rapidly transforms.
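
To put a rough number on that carbon price, here is a purely illustrative calculation; the facility size, load factor and grid emissions factor are assumptions, not figures from this article.

# Illustrative carbon-cost arithmetic only; the facility size, grid carbon
# intensity and running hours are assumptions, not published figures.

facility_mw = 1.0                # hypothetical conventional data centre IT load
hours_per_year = 24 * 365
grid_intensity_t_per_mwh = 0.2   # assumed grid emissions factor (tCO2 per MWh)
carbon_price_usd = 75            # per tonne, as cited above

annual_mwh = facility_mw * hours_per_year
annual_tonnes = annual_mwh * grid_intensity_t_per_mwh
print(f"~{annual_tonnes:,.0f} tCO2, ~${annual_tonnes * carbon_price_usd:,.0f} per year")
# ~1,752 tCO2, ~$131,400 per year: a cost that renewable-powered,
# heat-recycling sites largely avoid.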
