It is well known that moving from an on-premises infrastructure to a cloud-based architecture can deliver instant benefits to businesses in terms of agility, scalability and cost-efficiency. When considering a cloud migration, however, the environmental implications may not be among the foremost considerations in a company's decision.
However, as the tech community and the wider world become more environmentally conscious, it will become increasingly important for companies to be able to demonstrate their green credentials to customers, clients and other firms in their supply chain.
As a result of this shift in thinking, it is worth examining the environmental implications of moving from on-premises systems to the cloud. This includes the numerous benefits, as well as some potential drawbacks and areas in which progress still needs to be made.
There are several clear environmental benefits of moving to a cloud-based infrastructure. Perhaps the most obvious is that, as growing numbers of companies move from individual data centres to centralised systems, overall energy usage falls dramatically.
On-premises systems, previously the norm for many companies, understandably require large amounts of power. They are generally kept running at all times, demanding a constant power supply as well as round-the-clock cooling to prevent the machinery from overheating.
Naturally, the large numbers of firms migrating from these cumbersome individual infrastructures to the cloud will have a considerable impact on companies' energy usage. According to a study by Lawrence Berkeley National Laboratory and Northwestern University (funded by Google), transitioning to the cloud could cut the energy firms use to run their infrastructure by around 87 per cent.
Similarly, moving from on-premises systems to the cloud can have a huge impact on the emissions generated by a company's infrastructure. In the case of an individual on-premises infrastructure, emissions are generated at nearly every stage of its operation.
Far from only producing harmful emissions while in use, on-site data centres create emissions both before their working life begins and after it ends. Prior to use, the procurement or production of the materials needed to build them creates emissions, as does their assembly and transportation.
After a usage period in which they continue to generate emissions, their disposal generates more, as can the discarded equipment itself.
Obviously, these issues may still exist with the large data centres required to support cloud infrastructure, but the scale is vastly reduced, and as such the greenhouse gas emissions generated are significantly lower.
While the cloud computing sector still has a long way to go until it becomes a completely eco-friendly industry, it does offer far greater scope for harnessing renewable energy sources than on-site infrastructures generally do.
Over recent years, leading cloud providers (many of which are large tech companies) have been investing huge amounts of money in powering their data centres through renewable energy sources such as wind, solar and hydro.
While the cost of an individual company converting their infrastructure to renewable energy may be prohibitive, dedicated providers know that it is an investment worth making in order to reduce their emissions and stand out as an eco-conscious brand. By switching to the cloud, individual companies can, in turn, tap into this.
Where can improvements be made?
Despite its obvious environmental advantages over on-premises infrastructures, cloud computing still has some environmental issues that providers will be looking to address in future. While there have been advances in the use of renewable energy to power cloud data centres, their sheer size and the need to run constantly mean that they still consume huge amounts of energy.
The International Energy Agency estimates that data centre electricity consumption in 2019 was around 200 terawatt-hours, which is close to one per cent of the global total. Clearly, despite the efficiency of centralised resources, this still represents a huge carbon footprint.
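As a rough sanity check on that proportion, the figures can be put side by side. The global consumption value below is an assumption for illustration (roughly 23,000 TWh in 2019, a commonly cited ballpark), not a figure from the article:

```python
# Back-of-envelope check of the IEA figure cited above.
data_centre_twh = 200    # IEA estimate for data centre electricity use, 2019
global_twh = 23_000      # assumed global electricity consumption, 2019 (ballpark)

share = data_centre_twh / global_twh * 100
print(f"Data centres' share of global electricity: {share:.2f}%")
# Prints a value just under one per cent, consistent with the IEA claim.
```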
One of the key issues that will need to be resolved as cloud providers look to the future is the cooling of their data centres. Data centres generate massive amounts of heat as they operate 24/7, meaning that cooling is an absolute necessity. However, the current methods have considerable environmental downsides.
Cooling through electrical systems, similar to air conditioning, obviously has huge drawbacks from a sustainability point of view in terms of energy consumption. Water cooling has been seen as an alternative, but the volume of water required and the energy needed to chill it both have an environmental impact.
Despite these difficulties, cloud providers are still taking considerable steps to offset the impact of their cooling. For instance, an increasing number of new data centres are being located within the Arctic Circle, where an abundant supply of cold air and water enables providers to cool their systems more efficiently.
Cloud computing is such an energy-intensive industry that it is bound to have a considerable environmental impact. However, the centralisation of resources and operations offered by cloud data centres means that a cloud infrastructure has major environmental benefits over on-premises systems. At the same time, the dedication of most cloud providers to improving their sustainability means that, over time, cloud computing will only become more eco-friendly.