Deep Learning and Climate Change: Toward a Sustainable Solution

Emerging technologies continue to shape industries and give rise to new fields. Deep learning, a subfield of machine learning, employs artificial neural networks inspired by the human brain to optimize and make decisions. Its applications range across sectors, from manufacturing to healthcare to e-commerce. For instance, deep learning serves as a driving force behind autonomous vehicles, where advanced algorithms handle route mapping, obstacle prediction, and depth perception to guide a vehicle to its destination. In healthcare, deep learning's role in computer vision makes a significant difference in medical imaging and surgical procedures.

With the growing challenges posed by climate change, more attention has turned toward leveraging advanced technologies to address environmental issues. Researchers and companies have been exploring applications of deep learning in tackling climate change. For example, deep learning algorithms can aid precision agriculture, helping optimize farming practices to feed growing populations without expanding land use. Other applications in the environmental sphere include monitoring deforestation in tropical rainforests and predicting wildfires to manage risk. These applications continue to expand as deep learning approaches become more widely adopted across companies and industries.

However, the positive impacts of deep learning in tackling climate change do not come without consequences. In particular, deep learning's own carbon footprint has become a source of discussion among researchers. Notably, training deep learning models consumes significant energy, which in turn produces a large amount of carbon emissions. In a research paper detailing the carbon emissions of deep learning for natural language processing (NLP), Dr. Emma Strubell, Dr. Andrew McCallum, and Ananya Ganesh, then at the University of Massachusetts Amherst, found that training a single NLP pipeline, including experimentation, emits 78,468 pounds of carbon dioxide equivalent, whereas a year of consumption by the average human accounts for 11,023 pounds. Data centers and computing facilities are also reported to release up to 100 megatonnes of carbon emissions each year, a significant share of global emissions.

To approach the issue of carbon emissions from training deep learning algorithms, researchers have been designing tools for fellow researchers to track the carbon footprint of their deep learning models. For example, a team including Lasse F. Wolff Anthony, Benjamin Kanding, and Dr. Raghavendra Selvan at the University of Copenhagen designed Carbontracker, an open-source tool that predicts and measures the carbon emissions of training a deep learning model. The tool is written in Python, making it easy to integrate with the machine learning frameworks for which Python is commonly used.
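The core idea behind such trackers can be illustrated with a minimal sketch (this is not Carbontracker's actual API): time each training epoch, multiply elapsed time by an assumed average power draw to estimate energy use, and convert that energy to carbon using a grid carbon-intensity figure. Both numeric constants below are illustrative assumptions, not measured values.

```python
import time

# Illustrative assumptions (not measured values):
AVG_POWER_WATTS = 250.0   # assumed average GPU power draw during training
CARBON_INTENSITY = 475.0  # assumed grid intensity in g CO2e per kWh


class SimpleCarbonTracker:
    """Minimal epoch-based energy/carbon estimator, in the spirit of
    tools like Carbontracker (this is NOT its real interface)."""

    def __init__(self):
        self.total_seconds = 0.0

    def epoch_start(self):
        # Record wall-clock time at the start of an epoch.
        self._start = time.perf_counter()

    def epoch_end(self):
        # Accumulate the elapsed training time.
        self.total_seconds += time.perf_counter() - self._start

    def report(self):
        # W * s -> J; /3600 -> Wh; /1000 -> kWh
        energy_kwh = AVG_POWER_WATTS * self.total_seconds / 3600.0 / 1000.0
        grams_co2 = energy_kwh * CARBON_INTENSITY
        return energy_kwh, grams_co2
```

In practice, a training loop would call `epoch_start()` and `epoch_end()` around each epoch and print `report()` at the end; real tools refine this by reading actual hardware power counters and live grid-intensity data rather than fixed constants.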

In addition, another team of researchers, including Dr. Loïc Lannelongue, Dr. Jason Grealey, and Dr. Michael Inouye at the University of Cambridge, created an online calculator to estimate the carbon emissions of computational work such as deep learning model training. The team demonstrates the calculator's range of applications, from weather forecasting to NLP; in weather forecasting, for example, a single forecast day determined by deep learning can lead to 2.3 metric tonnes of carbon emissions.
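Calculators of this kind generally rest on a simple relationship: the energy drawn by the hardware, scaled up by the data center's power usage effectiveness (PUE, the overhead for cooling and infrastructure), multiplied by the carbon intensity of the local grid. The sketch below captures that relationship; the example coefficients are illustrative assumptions, not the Cambridge team's published values.

```python
def estimate_carbon(runtime_hours: float,
                    hardware_power_watts: float,
                    pue: float,
                    carbon_intensity_g_per_kwh: float) -> float:
    """Estimate grams of CO2-equivalent for a computing job.

    energy (kWh) = runtime * power * PUE / 1000
    carbon (g)   = energy * grid carbon intensity
    """
    energy_kwh = runtime_hours * hardware_power_watts * pue / 1000.0
    return energy_kwh * carbon_intensity_g_per_kwh


# Example: a 24-hour job on a 300 W accelerator in a data center with
# PUE 1.6, on a grid emitting 400 g CO2e/kWh (all illustrative numbers):
grams = estimate_carbon(24, 300, 1.6, 400)  # 11.52 kWh -> 4608 g CO2e
```

Because carbon intensity enters as a simple multiplier, the same job can differ in footprint by an order of magnitude or more depending purely on where it runs.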

Nonetheless, the growing emphasis on addressing deep learning's carbon footprint has spurred the development of algorithms that are both effective in accomplishing their objectives and minimal in their environmental impact. One notable approach is designing more efficient training procedures so that the carbon emissions of the process are reduced. Another is identifying optimal data centers for training such models. For instance, according to the University of Cambridge team, data centers located in Norway and Switzerland emit significantly smaller amounts of carbon than those in Australia, South Africa, and several locations in the United States. It is therefore critical for developers to be aware of which data centers are used to train new deep learning models.
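The effect of data center location follows directly from grid carbon intensity: the same training run on a low-carbon grid emits proportionally less. A small comparison sketch, using rough illustrative intensity figures (orders of magnitude only, not official statistics):

```python
# Illustrative grid carbon intensities in g CO2e per kWh
# (rough orders of magnitude, not official statistics).
GRID_INTENSITY = {
    "Norway": 30,       # mostly hydropower
    "Switzerland": 50,  # largely hydro and nuclear
    "Australia": 700,   # largely coal and gas
}


def job_emissions_kg(energy_kwh: float, region: str) -> float:
    """Emissions in kg CO2e for a job of given energy use in a region."""
    return energy_kwh * GRID_INTENSITY[region] / 1000.0


# The same 100 kWh training run in each region:
for region in GRID_INTENSITY:
    print(region, job_emissions_kg(100, region))
```

Under these assumed figures, moving the identical workload from a coal-heavy grid to a hydro-powered one cuts its footprint more than twentyfold, without changing a line of the model itself.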

Major companies are playing a role in addressing the growing issue of carbon emissions from emerging technologies. At Google, for instance, Dr. David Patterson, distinguished software engineer and former professor at the University of California, Berkeley, highlights that the Evolved Transformer model developed by the team at Google Research emits approximately 100 times less carbon dioxide equivalent than the traditional Transformer model. He also emphasizes the two-sided nature of the issue: models must not only be efficient but must also rely on cleaner sources for the energy they do consume. Just as the data centers in Norway and Switzerland allow models to run with smaller carbon footprints, the electricity grid that supplies a data center determines the emissions of the models run there. Additionally, Dr. Sasha Luccioni, a founding member of Climate Change AI, suggests that another way to encourage more sustainable deep learning is to offer tax incentives for data centers that utilize hydro or solar energy. Identifying and encouraging the use of optimal energy sources, coupled with developing more efficient models, can therefore make a difference in tackling the carbon footprint of deep learning.

Alongside building more efficient algorithms and software and leveraging more sustainable data centers, hardware also plays a role in reducing the carbon footprint of deep learning. Quantum computers, for instance, are being examined as potential solutions to the issue of efficient computing due to their ability to represent several states of data at the same time. Graphics processing units (GPUs) are also being optimized to store and reuse data to enable deep learning algorithms to run more efficiently. 

Advancements across software, hardware, and physical data centers will therefore be critical moving forward in addressing the carbon footprint of deep learning and other emerging technologies. Alongside such technical advancements also comes the need for increased conversation around the issue. Bridging research discussions at events such as the Conference and Workshop on Neural Information Processing Systems (NeurIPS) and World Summit AI with environmentally focused events such as the United Nations Climate Change Conferences will help encourage further advancement in the field, as well as ensure that technological advancement and sustainability go hand in hand.

Ultimately, the issue of deep learning's carbon footprint is a central aspect of artificial intelligence ethics. Advancements in AI methods such as deep learning have the potential to address a range of challenges involving climate change; however, developing such technological solutions must also involve accounting for their ramifications. The growing field of Green AI, which emphasizes innovation in advanced computing methods while minimizing the resources used in the process, has spurred significant advancement in sustainable computing. As such innovation accelerates, deep learning, alongside other emerging technologies, has the potential to become a sustainable solution in combating climate change.

Written by Emy Li