Green datacentres: a contradiction in terms?
Telcos’ role in enabling sustainable cloud through decentralised compute.
Traditional operational engineering design tends to concentrate activities into a few large locations. The underlying operating principle is that by physically centralising production activities and the supporting resources, we maximise control, efficiency, resilience and automation, and thereby achieve economies of scale.
Energy generation is an example of this. For years, policy makers and their advisors argued that generating energy from renewables could never be cost-competitive, partly because these facilities were deemed sub-scale and too distributed, which made them expensive to install, connect, operate and maintain. And yet enlightened leadership, determination and human ingenuity have proved otherwise. Much to sceptical commentators’ surprise, renewables now provide much lower-cost energy than the centralised alternatives, including traditional nuclear power plants.
Streamed media delivery has also become increasingly dispersed. Netflix originally distributed its video streaming from servers in a few centralised locations (and from similarly concentrated cloud services). It now has a content distribution network comprising over 16,000 servers, widely distributed deep inside ISPs’ networks. Other examples of this trend to decentralisation abound: additive manufacturing (3D printing), military operations and, more recently, hybrid working.
True, centralising activities makes it easier to physically marshal resources and control operating environments. But it also creates vulnerabilities (attack points and single points of failure), transportation overheads, and delivery delays. And then there are inefficiencies arising from market concentration.
Benefits of Decentralisation
New technologies, particularly information and communications technologies, are turning this thinking on its head. We can now achieve many of the benefits of concentrated operations without the downsides. This trend to decentralisation, enabled by new technology, is delivering key benefits.
Edge computing is both a key enabler of decentralisation and itself an illustration of the decentralisation of cloud compute from vast datacentres to much smaller ‘cloudlets’. Decentralisation is creating demand for edge compute because distributed activities need more distributed data processing. Edge compute is itself a production activity that was previously physically concentrated. Telecoms operators are ideally placed to meet these needs, and they also make excellent anchor tenants with their own – increasingly cloud-native – network workloads.
To make cloud compute more sustainable, we need to re-frame how we think about it. The challenge is that when approaching sustainability, everyone always starts with the datacentre, not the compute.
Today’s mammoth datacentres are becoming more energy efficient, but only in a narrow sense of the term: they are getting better at disposing of the waste heat that the servers generate. On top of the compute energy, datacentres require only an additional 20-40% of energy for cooling, to dump the heat generated by the servers. This means that for every useful unit of energy consumed, they use between 1.2 and 1.4 units of total energy. This ratio of total facility energy to IT energy is the PUE (Power Usage Effectiveness), the metric that datacentre and cloud operators typically use to measure energy performance and demonstrate sustainability credentials.
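The PUE arithmetic above can be sketched in a few lines. This is a minimal illustration of the standard ratio (total facility energy over IT energy); the figures are illustrative examples, not measurements from any real facility.

```python
def pue(it_energy_kwh: float, overhead_energy_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy per unit of IT energy.

    it_energy_kwh: energy consumed by the servers themselves (the useful work)
    overhead_energy_kwh: everything else, chiefly cooling
    """
    return (it_energy_kwh + overhead_energy_kwh) / it_energy_kwh


# 20% cooling overhead on top of the compute energy gives a PUE of 1.2
print(pue(100.0, 20.0))  # 1.2

# 40% overhead gives a PUE of 1.4
print(pue(100.0, 40.0))  # 1.4
```

A perfectly efficient facility in this narrow sense would approach a PUE of 1.0: every unit of energy drawn goes to the servers, none to disposing of their heat.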
This is a bit like claiming that it is more efficient to compress plastic bottles into more compact blocks before committing them to local landfill. Generating waste is … well… wasteful. Re-use is… well… useful.
This distinction can be emphasised by adopting, and setting targets for, a measure such as ERE (Energy Reuse Effectiveness). ERE is calculated like PUE but subtracts any energy that is re-used, so it rewards re-use rather than merely efficient disposal. Whereas datacentres struggle to achieve a PUE below 1.3, EREs of 0.3 have already been achieved.
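The contrast between the two metrics can be sketched as follows, using the standard definition of ERE (total facility energy minus re-used energy, divided by IT energy). The numbers are illustrative assumptions, not vendor data.

```python
def ere(it_kwh: float, overhead_kwh: float, reused_kwh: float) -> float:
    """Energy Reuse Effectiveness: like PUE, but energy re-used elsewhere
    (e.g. waste heat piped to nearby buildings) is subtracted from the total.
    """
    return (it_kwh + overhead_kwh - reused_kwh) / it_kwh


# With no re-use, ERE collapses to PUE (1.3 in this illustrative case).
print(ere(100.0, 30.0, 0.0))    # 1.3

# Re-using most of the heat drives ERE well below 1 - here, 0.3.
print(ere(100.0, 30.0, 100.0))  # 0.3
```

The design point is that PUE can never fall below 1.0, so it caps the ambition at "waste the heat efficiently"; ERE has no such floor, so it keeps rewarding operators who find users for the heat.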
Energy security and accelerated national net-zero goals mean that PUE (and soon ERE) will come under authorities’ scrutiny and then regulation. This has already started in some countries (Singapore, Norway). When households and businesses cannot afford to pay their energy bills, wasting heat in vast quantities will attract legislators’ attention. By re-using waste heat, some (mainly edge) datacentres will be able to achieve EREs of well below 1. The reason why smaller, edge datacentres are better placed to achieve lower EREs is that they are better able to address potential demand for waste heat; re-using the waste heat from typical datacentres is inherently challenging.
We can re-frame this challenge as an opportunity: how could compute evolve to support society’s need for heat? Some innovators (heata.co, Quarnot, Leafcloud) distribute workloads in a highly dispersed way, generating the waste heat from cloud compute literally next to where it can easily be re-used.
Will today’s massive datacentres succumb to the same forces of decentralisation as hydrocarbon-fuelled power plants? Will they suffer the same fate and become potentially stranded assets: dinosaurs from a by-gone era replaced by smaller, more nimble, more efficient species better able to adapt to the challenges of a new, tougher climate? This may not appear to be an immediate threat, but given the right conditions, the current edge trickle could quickly turn into a flood. Only a few years ago, it was unthinkable that to meet their net-zero goals countries would legislate for the demise of the internal combustion engine. In 2021, the UK announced that no new fossil-fuelled vehicles would be sold from 2030. Other countries are following this lead.
Telecoms operators should see this as a big opportunity. Astute investors should also take note. Compare Tesla’s PE ratio to VW’s to understand the risks of assuming that the current approach to cloud infrastructure is sustainable: financially and environmentally.
Author: Philip Laider, Managing Director, Consulting at STL Partners