It was once said that software is eating the world, and that holds true for sustainability too. The ICT sector is responsible for up to 3.9% of global emissions, almost as much as the airline and shipping industries. And that is before you factor in the energy demands of AI. More applications mean more code, which has to be stored and run somewhere.
While topics such as hardware efficiency, green energy, and e-waste are common targets for workplace sustainability, software has traditionally been absent from the mix.
But the last couple of years have seen the world's biggest software makers on a mission to make software greener, focusing on software that uses fewer physical resources and less energy.
So how do you build or architect applications in the cloud to be more sustainable? Can we develop software that makes hardware more sustainable?
When it comes to software, lowering inefficiencies by reducing technical debt, right-sizing workloads, optimising VMs, and writing efficient code reduces its environmental impact. It also offers startups and enterprises of all sizes a valuable opportunity to become more sustainable and save some money on cloud storage.
I recently attended KubeCon + CloudNativeCon Europe, where green software engineering was a lead topic.
I spoke to Dr Huamin Chen, a Senior Principal Software Engineer and Sustainability Technical team lead at the Red Hat CTO office, to learn more.
The first step to tackling software's energy expenditure is tracking it.
A couple of years ago, Chen joined forces with fellow developers and researchers to launch the project Kepler (Kubernetes-based Efficient Power Level Exporter), a community-driven open source project founded by Red Hat's emerging technologies group with early contributions from IBM Research, Weave Works, and Intel.
The project came about when Chen realised "while research was plentiful in terms of open source sustainability practices, there was little to offer when it came to commercial products and projects."
Kepler utilises cloud-native methodologies and technologies such as CPU performance counters and machine learning models to attribute energy consumption to individual containers based on power-related system counters, and exports the results as metrics.
These metrics can be used to build dashboards that present power consumption at various levels, including containers, pods, namespaces or different compute nodes in the cluster.
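To give a flavour of how these metrics might be consumed, here is a minimal Python sketch that queries a Prometheus server for Kepler's per-container energy counter and totals it per namespace. The Prometheus address, the kepler_container_joules_total metric name, and the container_namespace label are assumptions based on a typical Kepler deployment, so check them against your own setup.

```python
# Minimal sketch: aggregate Kepler energy metrics per namespace via the
# Prometheus HTTP API. Assumes Prometheus is reachable at PROM_URL and that
# Kepler exports a counter named "kepler_container_joules_total" with a
# "container_namespace" label; verify both against your own deployment.
import requests

PROM_URL = "http://localhost:9090"  # assumed Prometheus endpoint

def joules_by_namespace():
    # PromQL: sum the per-container energy counter across each namespace
    query = "sum by (container_namespace) (kepler_container_joules_total)"
    resp = requests.get(f"{PROM_URL}/api/v1/query", params={"query": query})
    resp.raise_for_status()
    results = resp.json()["data"]["result"]
    return {
        r["metric"].get("container_namespace", "unknown"): float(r["value"][1])
        for r in results
    }

if __name__ == "__main__":
    for ns, joules in sorted(joules_by_namespace().items(), key=lambda kv: -kv[1]):
        print(f"{ns}: {joules / 3600:.2f} Wh")  # convert joules to watt-hours
```

The same query, plotted over time, is essentially what a Kepler dashboard panel shows for each namespace or node.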
Kepler metrics can be utilised by a Kubernetes scheduler to place the upcoming workload on the compute node that is projected to improve performance per watt, ultimately reducing the cluster-level power consumption.
Similarly, Kubernetes auto-scalers can use Kepler's power consumption metrics in auto-scaling algorithms to determine the resources needed for better energy efficiency.
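The underlying idea is simple enough to sketch: estimate how much work a pending workload would get done on each node and how much extra power it would draw there, then favour the node with the best projected performance per watt. The toy example below illustrates only the arithmetic; it is not Kepler's or Kubernetes' actual scheduling code, and the numbers are invented.

```python
# Toy illustration of performance-per-watt placement (not the real
# Kubernetes scheduler): pick the node where the workload is projected
# to deliver the most work per additional watt consumed.
from dataclasses import dataclass
from typing import List

@dataclass
class NodeEstimate:
    name: str
    projected_throughput: float   # e.g. requests/sec the workload would achieve
    projected_extra_watts: float  # extra power draw if the workload lands here

def best_node(estimates: List[NodeEstimate]) -> NodeEstimate:
    # Higher throughput per watt wins.
    return max(estimates, key=lambda n: n.projected_throughput / n.projected_extra_watts)

# Hypothetical estimates derived from power metrics such as Kepler's:
candidates = [
    NodeEstimate("node-a", projected_throughput=900, projected_extra_watts=60),
    NodeEstimate("node-b", projected_throughput=800, projected_extra_watts=45),
]
print(best_node(candidates).name)  # node-b: ~17.8 req/s per watt vs 15 for node-a
```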
The project has focused on providing reporting capabilities, especially for industrial applications such as manufacturing, which involve enormous physical infrastructure, AI-controlled environments, and many CPUs and machine learning models.
According to Chen:
"When the customer can see the metrics from Kepler about their workload and consumption, they are more motivated to do something to help to reduce it."
He notes that one of the strengths of Kepler is its strong research foundation:
"R&D may not always have the rigorous scientific methodology or rigour to create commercial grade software that helps the community or the customer.
Our collaborative efforts mean we can generate software products that are accurate, based on the facts, not based on intuition."
Also crucial for Kepler is developing software metrics that speak the same language as the sustainability goals of carbon-zero legislation:
"We want customers to be able to pick up any of our software without reading the fine print or documentation and understand exactly what the software can do from a sustainability standpoint."
All eyes on the environmental impact of AI
Notably, while AI is often used as a tool for tracking and predicting carbon emissions, its own environmental impact is less well understood.
Concern is growing about the massive amounts of energy, cooling, and computational resources required for AI training and inference. Training a common model like BERT can emit as much carbon as a trans-Atlantic flight.
However, there's no universal benchmark for tracking the carbon intensity of AI systems.
Chen and the Kepler team plan to showcase their progress in sustainable AI in the coming months.
He notes: "Consider the hundreds of millions of people using OpenAI; a one percent reduction has a significant impact."
The EU is funding work in this area, including the project SustainML, a collaboration between multiple universities, research institutions, and private companies across Europe. It has developed a tool it calls Carbontracker to enable AI designers to predict the energy consumption of machine learning applications.
It uses machine learning models to estimate the carbon emissions associated with different hardware and software configurations and to predict the carbon footprint of future training runs. It also supports predicting the total duration, energy use, and carbon footprint of training a deep learning model.
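Carbontracker ships as a Python package, and its documented usage pattern is to wrap a training loop with epoch start and end calls. The sketch below follows that pattern, with a stand-in function in place of a real training step.

```python
# Minimal sketch of instrumenting a training loop with carbontracker
# (pip install carbontracker), following the package's documented pattern.
import time
from carbontracker.tracker import CarbonTracker

def train_one_epoch():
    # Stand-in for a real training step; substitute your own model code.
    time.sleep(1)

max_epochs = 3
tracker = CarbonTracker(epochs=max_epochs)

for epoch in range(max_epochs):
    tracker.epoch_start()
    train_one_epoch()
    tracker.epoch_end()

tracker.stop()  # finalise and report measured/predicted energy and CO2eq
```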
There's also the academic project FlexGen, which aims to make it possible to run large language models with fewer resources, such as a single commodity GPU. It is building a high-throughput system that enables new applications of foundation models in throughput-oriented tasks on low-cost hardware, such as a single 16GB GPU.
An opportunity to meet customer demand
So what does all of this mean when it comes to the end user? Well, in 2021, France passed legislation to reduce the environmental impact of digital technology.
This includes focusing on using energy-efficient algorithms, optimised data centres, and establishing eco-design guidelines for software and digital services.
It is foreseeable that other European countries will follow suit, calling on enterprises to provide greater transparency about their software practices.
In terms of being part of the greater good, green software engineering also offers companies a tangible way to meet their ESG targets. This not only helps with regulatory compliance but also answers growing demands from customers and investors. The more prosaic reality is that greener software reduces data storage needs and energy costs.
We're far from sustainable software being front of mind in the larger discourse around sustainability in tech, but software providers are leading the movement, and momentum is bubbling away under the surface, ripe for eruption. This is how we get commercial solutions into the hands of users.
Additional resources
For brevity, I've focused on some key software initiatives. I'd also encourage you to check out the following:
TAG Environmental Sustainability supports projects and initiatives related to the sustainable delivery of cloud-native applications, including building, packaging, deploying, managing, and operating them.
The Green Software Foundation is a nonprofit founded by Accenture, GitHub, Microsoft, and ThoughtWorks, and established with the Linux Foundation and the Joint Development Foundation Projects. It aims to build a trusted ecosystem of people, standards, tooling, and leading practices for building green software, and to reduce the carbon emissions of software.
The Green Web Foundation focuses on enabling the transition to a fossil-free internet, including stewarding the world's largest open database tracking which parts of the internet and websites run on renewable power.
Climate Action Tech is a community of practice of tech workers that provides support and guidance for systemic change.
Lead image: Geralt.