We have already seen how software engineers are starting to choose greener cloud hosting providers. And this is only the beginning of a movement called “sustainable software engineering”. “Everyone has a role to play in the solution,” says Asim Hussain, Green Developer Relations Manager at Microsoft. “Sustainable software engineering is an emerging discipline at the intersection of climate science, software, hardware, power markets, and data center design.” And while it may seem like a novelty today, this mindset will need to become the norm if the emissions of everyday applications are to be reduced.
To equip developers with the basics, Hussain runs an online training course that answers the question: what is sustainable software engineering? In just 33 minutes, attendees are given a whirlwind tour of the eight core principles that underpin this emerging discipline. Hussain is committed to the cause and wants others to adhere to the principles of sustainable software engineering too. To understand what drives such initiatives, consider how much power software applications can consume. One example on many people’s minds is artificial intelligence (AI).
Understanding AI Emissions
In 2019, researchers in the United States published Energy and Policy Considerations for Deep Learning in NLP [PDF]. Deep learning is a powerful tool for generalizing from data and has led to a sea change in AI model performance. Making headlines recently has been the impressive AI-rendered digital art generated from just a few keyword prompts. Voice technology is also advancing by leaps and bounds, and then there is the captivating subject of deepfakes to consider. But the rewards of deep learning only come when you stack many layers of neural networks – hence the name. And that’s a lot of processing. GPT-3, the latest in a series of natural language models, has 175 billion parameters.
The US study looked at simpler natural language models, including GPT-2 (which has two orders of magnitude fewer parameters than its successor GPT-3), and found that even these smaller versions still had a significant carbon footprint. The carbon dioxide emissions associated with training a natural language processing model can run to six figures, which – to put that figure into perspective – is similar to running five average American cars over their lifetimes. And that includes a lifetime’s worth of fuel and the emissions associated with building the cars in the first place.
Green Power and ASICs
Data center providers may claim that their facilities are powered by renewable energy, but even those facilities must still draw power from the grid when there is little or no wind or sun. And there are great visualizations, such as Electricity Maps, which provide a live feed of the energy mix country by country. It is true that hardware improvements affect researchers’ calculations, which the team acknowledges. Tensor processing units (TPUs) – custom-developed application-specific integrated circuits (ASICs) designed to perform common machine learning operations – are becoming more common. And these chips are more economical than GPUs for accelerating certain models.
But even when using TPUs, AI developers still run into the problem of diminishing returns. Pushing models to higher performance scores takes more and more computational time for only a fraction of the gains earned earlier in the training cycle. From a sustainable software engineering perspective, this scenario could be managed through what is known as “demand shaping”. If the code were carbon-aware, the model training rate could be scaled to the availability of renewable energy. We’ve grown accustomed to online meetings that ramp audio and video quality up and down based on available bandwidth, and there’s no reason AI model training couldn’t do the same.
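The article doesn’t prescribe an implementation, but the demand-shaping idea can be sketched in a few lines. Everything here is illustrative: the threshold values and the `training_step_budget` function are hypothetical, and a real system would read live grid data from a service such as Electricity Maps or WattTime rather than using constants.

```python
# Hypothetical carbon-intensity thresholds in gCO2eq/kWh.
# In practice these would be tuned per region and fed by a live API.
LOW_CARBON = 200
HIGH_CARBON = 400

def training_step_budget(intensity: float) -> int:
    """Demand shaping for model training: run at full throughput when
    the grid is clean, throttle toward a trickle when it is carbon-heavy,
    and scale linearly in between."""
    if intensity <= LOW_CARBON:
        return 100   # full training throughput (steps per interval)
    if intensity >= HIGH_CARBON:
        return 10    # minimal progress; wait for cleaner power
    fraction_clean = (HIGH_CARBON - intensity) / (HIGH_CARBON - LOW_CARBON)
    return int(10 + 90 * fraction_clean)
```

This mirrors how a video call degrades gracefully under low bandwidth: the job never stops entirely, it just does less work when the marginal carbon cost of each step is high.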
Timing is everything
Hussain points to the merits of keeping an eye on carbon intensity – the impact of the current electricity mix on emissions. The numbers drop when renewable energy use is high and rise when coal and gas supplement the grid. Getting the timing right means users can reduce the environmental load of their applications without changing a single line of code.
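One simple way to act on carbon intensity, assuming you have an hourly forecast (which providers such as Electricity Maps expose via API), is to shift a deferrable job into the cleanest window. This sketch and the `pick_greenest_window` helper are illustrative, not part of Hussain’s course material:

```python
def pick_greenest_window(forecast: list[float], duration: int) -> int:
    """Given an hourly carbon-intensity forecast (gCO2eq/kWh), return
    the start index of the lowest-average window of `duration` hours
    in which to schedule a deferrable job (batch run, model training)."""
    best_start, best_avg = 0, float("inf")
    for start in range(len(forecast) - duration + 1):
        avg = sum(forecast[start:start + duration]) / duration
        if avg < best_avg:
            best_start, best_avg = start, avg
    return best_start

# Illustrative forecast with a midday dip (e.g. a solar peak):
forecast = [420, 380, 250, 180, 190, 300, 410]
pick_greenest_window(forecast, 2)  # → 3 (hours 3–4 have the lowest average)
```

The application code is unchanged; only the scheduling decision moves, which is exactly the “without changing a single line of code” point above.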
This concept does not only apply to computers, of course. Companies like Equiwatt work with energy companies to encourage consumers to be selective about when they use appliances. Equiwatt offers a downloadable app that alerts users when electricity consumption peaks; the intent is to encourage recipients to turn off non-essential equipment. And to sweeten the deal, users earn points they can redeem for various goods and services.
It’s an interesting business model, and energy companies seem willing to pay tech companies to come up with solutions that burn less gas and coal. Having this information at your fingertips is definitely a sign of progress, and if there’s one thing tech companies do well, it’s data.