
Quantifying the Carbon Emissions of Machine Learning

The rapid growth of machine learning (ML) has transformed a number of sectors, opening up previously unattainable prospects and efficiencies. However, as ML models become more complex and computationally demanding, concerns about their environmental impact, specifically their carbon emissions, have gained prominence. From an environmental standpoint, a few crucial aspects of training a neural network have a major influence on the quantity of carbon it emits.

The carbon footprint of ML: to set the scene, this post summarizes the energy-intensive nature of training sophisticated models and operating inference servers, and how that energy use translates into greenhouse gas emissions. The goal is to make readers aware of the environmental impact embedded at the core of ML operations.

What are the carbon impacts of Machine Learning?

The most directly visible impact of training and deploying a Machine Learning model is the emission of CO2 and other greenhouse gases caused by the additional power the equipment draws while running (i.e. its dynamic consumption). Even though dynamic consumption has a large impact, we should not fail to see the forest for the trees: the entirety of the ML pipeline deserves consideration.

Notably, other dimensions of model impact that should be considered include model-preparation overhead, the static (idle) consumption of the equipment, the supporting infrastructure, as well as the overall life cycle assessment of the equipment itself.
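The dynamic-consumption component can be estimated with a back-of-envelope calculation: multiply the hardware's power draw by the run time and the data center's overhead factor (PUE) to get energy, then multiply by the local grid's carbon intensity. A minimal sketch, with all numbers purely illustrative:

```python
def training_emissions_kg(power_kw: float, hours: float,
                          pue: float, intensity_kg_per_kwh: float) -> float:
    """Rough CO2e estimate for a training run.

    energy (kWh)     = power draw (kW) x run time (h) x data-center PUE
    emissions (kgCO2e) = energy (kWh) x grid carbon intensity (kgCO2e/kWh)
    """
    energy_kwh = power_kw * hours * pue
    return energy_kwh * intensity_kg_per_kwh


# Hypothetical example: a 0.3 kW GPU running for 100 h in a data center
# with PUE 1.5, on a grid emitting 0.4 kgCO2e per kWh.
print(training_emissions_kg(0.3, 100, 1.5, 0.4))  # ~18 kgCO2e
```

The same formula makes the levers obvious: shorter runs, more efficient hardware, and lower-carbon grids each scale the result down linearly.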

What are the most impactful steps I can take? 

As a practitioner:

  • Reduce your I/O and redundant computation/data copying/storage: start with smaller datasets to debug your model, and use shared data storage with members of your team so you don’t need to have individual copies.
  • Choose a low-carbon data center: When running models on the cloud, consult a tool like Electricity Map to choose the least carbon-intensive data center.
  • Avoid wasted resources: steer clear of exhaustive grid search, and reuse or fine-tune previously trained models when possible. Also, design your training and experimentation so that a failure discards as little computing time and as few resources as possible.
  • Estimate your emissions: tools like Green Algorithms and ML CO2 Impact allow you to estimate your emissions after the fact.
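The data-center choice above can be automated: given carbon-intensity figures per region (in practice fetched from a service such as Electricity Map), simply pick the minimum. A small sketch with made-up region names and intensity values:

```python
# Hypothetical carbon intensities (gCO2e/kWh) for a few cloud regions;
# real-time values would come from a service such as Electricity Map.
region_intensity = {
    "eu-north": 30,
    "ca-central": 130,
    "us-east": 380,
    "ap-southeast": 500,
}

def lowest_carbon_region(intensities: dict) -> str:
    """Return the region whose grid has the lowest carbon intensity."""
    return min(intensities, key=intensities.get)

print(lowest_carbon_region(region_intensity))  # eu-north
```

With the illustrative numbers here, scheduling the same job in the lowest-carbon region rather than the highest would cut its dynamic-consumption emissions by more than a factor of ten.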

As an institution:

  • Deploy your computation in low-carbon regions when possible.
  • Provide institutional tools for tracking emissions, and enable them by default on your computing infrastructure.
  • Cap computational usage, for instance at a maximum of 72 hours per process, to reduce wasted resources.
  • Carry out awareness campaigns about the environmental impact of ML.
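The 72-hour cap could be enforced with a trivial runtime check inside long-running jobs; a minimal sketch (the checkpoint-and-exit policy is an assumption, not something prescribed above):

```python
import time

MAX_RUNTIME_S = 72 * 3600  # institutional cap: 72 hours per process

def within_cap(start_time: float, now: float = None) -> bool:
    """True while the job is still under the 72-hour runtime cap."""
    now = time.time() if now is None else now
    return (now - start_time) < MAX_RUNTIME_S

# A job that started 80 hours ago would checkpoint and exit:
print(within_cap(start_time=0.0, now=80 * 3600))  # False
```

Pairing such a cap with regular checkpointing means a capped job loses at most one checkpoint interval of work rather than the whole run.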
