
The Solution to AI-Driven Emissions? Greener Code

While AI holds potential for combating climate change, its significant energy demands are exacerbating carbon emissions. The key to reducing AI's carbon footprint lies in GenAI-driven code optimization.

Industry Perspectives

August 20, 2024


By Dr. Leslie Kanthan, TurinTech

Using AI to combat climate change is one of the technology's most exciting use cases, but AI also carries significant energy demands: it consumes large amounts of electricity and generates carbon emissions from the compute power it requires.

Case in point: Google's emissions have climbed nearly 50% in five years due to the energy demands of powering its new AI products, and Microsoft, the biggest financial backer of ChatGPT developer OpenAI, has admitted that its 2030 net zero "moonshot" might not succeed.

Bill Gates says AI's green benefits will outweigh its emissions. That's great, but how?

Most of the solutions offered up are reactive. Transitioning data centers to green energy sources like solar power isn't going to cut it, especially given the new energy demands of training a large language model (LLM). Training an LLM such as GPT-3 has been estimated to use just under 1,300 megawatt-hours (MWh) of electricity, roughly the amount 130 U.S. homes consume in a year (an average U.S. home uses on the order of 10 MWh annually).

The problem is a growing concern for all companies, but data centers, cloud providers, and very large institutions are among the most affected.

Many have endeavored to optimize the logistics of maintaining and supplying hardware in an effort to reduce AI's carbon footprint. While that is one part of a good approach, on its own it won't scale to match the scope of AI's growing emissions.


The answer to reducing AI's carbon footprint lies in starting from the very beginning, through the strategic application of code optimization. The only way to do it thoroughly enough and fast enough to reduce energy consumption in data centers and cloud computing is to power code optimization with AI.

Old Code, Duplicate Code, Sloppy Code

Technology-driven emissions don't start and end with AI. Code is the culprit, and it started when businesses began to digitize.

Picture this: In 1995, a developer wrote some code designed to analyze market risk. In 2020, not knowing that code written 25 years earlier already did the job, someone else in the same organization wrote the same code. It's an all-too-common situation, and the end result is that data centers are processing legacy and duplicate code simultaneously, and unnecessarily.

Organizations are also churning out code faster than ever before, often with the help of AI. Development cycles are getting shorter, and code generation and reuse are often not done with the level of care developers would take if they simply had more time. The result is sloppy code that consumes more compute resources and produces more emissions.


The reality is that there are many types of inefficient code, spanning legacy systems, duplicates, and poorly written or hastily developed code. Beyond that, the sprawling universe of technologies that modern applications are built on, from the web to the cloud to mobile, adds an extra layer of complexity.

Code optimization is critical, and it starts with thoroughly analyzing existing software to find areas where optimization can improve energy efficiency. By refining and streamlining code, companies can reduce memory usage, CPU load, and overall energy consumption. There are added upsides, too: Optimized code improves software performance and reduces costs.
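To make this concrete, here is a minimal, hypothetical sketch in Python (the function names and the market-risk framing are invented for illustration): the same calculation written wastefully, then streamlined so it does less work, and therefore burns less CPU time and energy, for an identical result.

```python
# A minimal, hypothetical illustration of code optimization: the same
# volatility calculation written wastefully and then streamlined.
import math

def volatility_wasteful(prices):
    """Recomputes the mean on every loop pass, doing O(n^2) work."""
    returns = [math.log(prices[i] / prices[i - 1]) for i in range(1, len(prices))]
    squared_devs = []
    for r in returns:
        # The mean is recalculated for every element: wasted CPU cycles.
        mean = sum(returns) / len(returns)
        squared_devs.append((r - mean) ** 2)
    return math.sqrt(sum(squared_devs) / len(squared_devs))

def volatility_optimized(prices):
    """Computes the mean once, doing O(n) work for the same result."""
    returns = [math.log(prices[i] / prices[i - 1]) for i in range(1, len(prices))]
    mean = sum(returns) / len(returns)  # computed a single time
    return math.sqrt(sum((r - mean) ** 2 for r in returns) / len(returns))

if __name__ == "__main__":
    prices = [100.0, 101.5, 99.8, 102.3, 103.1, 101.9]
    assert abs(volatility_wasteful(prices) - volatility_optimized(prices)) < 1e-12
    print(volatility_optimized(prices))
```

Multiplied across millions of executions a day in a data center, hoisting that one redundant computation out of the loop is exactly the kind of small saving that adds up.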

Automated Code Optimization: The Ultimate Differentiator in Achieving Green AI

Code optimization has been around for many years, but as the surge of AI applications increases global data center electricity consumption, it has quickly become clear that manual code optimization will not help us meet our climate change goals.

Manual code optimization is resource-intensive, usually involving many iterations to refine, test, and deploy code that performs optimally. The bigger the codebase, the bigger the problem.

AI models also present a unique challenge as they need to be trained with a lot of data. As already noted, training AI systems is one of the most resource-intensive activities: The Center for the Governance of AI says that over the past 12 years, the amount of compute used to train AI systems has increased by a factor of 350 million.

Code optimization is being supercharged by generative AI — and not a moment too soon. If companies want to build products faster, continue to provide their services at scale, and meet their environmental, social, and governance (ESG) goals, they need their code optimization to keep pace. GenAI is the only hope they have to optimize code thoroughly enough and fast enough to ensure the good that they reap from AI outweighs the environmental impact.

GenAI code optimization platforms can be deployed alongside human developers' expertise to optimize AI code at scale, turning inefficient code into "green" code. They use pre-trained LLMs to automatically scan codebases, identify inefficiencies, and integrate improvements at scale.
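What such a workflow might look like, in very simplified form, is sketched below. This is purely illustrative and assumes nothing about any particular vendor's product; `ask_llm_for_optimization` is a hypothetical placeholder for whatever pre-trained LLM a platform calls, not a real API.

```python
# A highly simplified, hypothetical sketch of a GenAI code-optimization loop.
# Nothing here is a real vendor API; ask_llm_for_optimization is a placeholder.
from pathlib import Path

def ask_llm_for_optimization(source: str) -> str:
    """Placeholder for a call to a pre-trained LLM that proposes optimized code.

    In a real platform this would send the source (plus profiling data) to a
    model and return a rewritten, more efficient version.
    """
    return source  # stub: echoes the input so the sketch runs end to end

def scan_codebase(root: Path, pattern: str = "*.py"):
    """Walk the codebase and yield every source file to analyze."""
    yield from root.rglob(pattern)

def optimize_codebase(root: Path) -> None:
    for path in scan_codebase(root):
        original = path.read_text()
        suggestion = ask_llm_for_optimization(original)
        # In practice, each suggestion would be benchmarked for speed, memory,
        # and energy use, then reviewed by a human developer before merging.
        print(f"Proposed optimization for {path} ({len(suggestion)} chars)")

if __name__ == "__main__":
    optimize_codebase(Path("./my_project"))
```

The key design point is the division of labor: the LLM proposes candidate rewrites across an entire codebase, while benchmarks and human review decide what actually ships.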

Where Will We See the Most Impact?

Large cloud providers are beginning to offer optimized compute and memory services, but it's critical that we continue to optimize AI models at the code level for efficiency without sacrificing performance.

Data centers are often overlooked in the CO2 emissions savings game, but if we can optimize data center code, that is where we will ultimately see the most impact. By reducing the computational load on services and optimizing data processes, storage, and running software, companies could see nearly a 50% reduction in energy use.

Data-heavy applications where velocity matters are another area ripe for disruption when it comes to emissions. This is particularly relevant in the financial sector, where banks, asset managers, and other financial parties use data-heavy models for high-velocity tasks such as modeling, trading, and risk management. The code libraries feeding these applications need to be thoroughly optimized to mitigate risk and maintain application performance for financial institutions, and to run greener code.

These are only a few of the exciting opportunities companies are capitalizing on when it comes to reducing the carbon footprint of their AI programs, and as companies prioritize sustainability, we're set to see GenAI become a true climate-change differentiator.

About the author:

Dr. Leslie Kanthan is the CEO and co-founder of TurinTech AI, the leading AI code optimization company that empowers businesses to build better, faster, and greener AI.

A pioneer in quantitative research, graph theory, and efficient similarity search techniques, Leslie held technical roles at financial institutions including Credit Suisse, Bank of America, Commerzbank, and Santander before founding TurinTech.

During his time in banking, he began to research solutions to solve the inefficiencies of manual machine learning development and code optimization processes. TurinTech has grown from four founders to 40+ employees within the space of a few years, raising over $6 million in funding from IQ Capital and SpeedInvest.

Leslie's expertise extends to predictive analytics and locality sensitive hashing, with his innovative work in these areas now powering technologies used by companies like Uber. In addition to his leadership at TurinTech, he co-founded DataSpartan, a specialist firm in AI, data science, and quantitative finance.
