Green AI?

Is machine learning an environmental hazard or an environmental solution?

The answer is, apparently, “yes”.

Recently, a number of papers have focused attention on machine learning and climate change, with some interesting findings.

“Tackling Climate Change with Machine Learning” (Rolnick et al., 2019) is a manifesto published in advance of the NeurIPS conference. This extensive and detailed report outlines many ways in which applying ML can have a positive impact on addressing significant aspects of climate change. In summary:

“ML can enable automatic monitoring through remote sensing (e.g. by pinpointing deforestation, gathering data on buildings, and assessing damage after disasters). It can accelerate the process of scientific discovery (e.g. by suggesting new materials for batteries, construction, and carbon capture). ML can optimize systems to improve efficiency (e.g. by consolidating freight, designing carbon markets, and reducing food waste). And it can accelerate computationally expensive physical simulations through hybrid modeling (e.g. climate models and energy scheduling models).”

A report from the AI Now Institute, “AI and Climate Change: How they’re connected, and what we can do about it” (Dobbe & Whittaker, 2019), is not so optimistic:

“The estimated 2020 global footprint [of the tech industry] is comparable to that of the aviation industry, and larger than that of Japan, which is the fifth biggest polluter in the world. Data centers will make up 45% of this footprint (up from 33% in 2010) and network infrastructure 24%.”

They conclude that overall, “we see little action to curb emissions, with the tech industry playing a significant role in the problem.”

While the Rolnick et al. report illustrates that applying ML to environmental challenges has been and will continue to be productive, the story is a bit different when looking at the environmental costs of training the ML models to do this very work.

Strubell et al., in “Energy and Policy Considerations for Deep Learning in NLP” (2019), estimate that “training BERT [a widely used NLP model] on GPU is roughly equivalent to a trans-American flight.” The authors of “Green AI” (Schwartz et al., 2019) note that the amount of compute required to train a state-of-the-art model increased an estimated 300,000 times (!) between 2012 and 2018. More and more data, millions of parameters, and hundreds of GPUs. And it is getting worse. They advocate “making efficiency an evaluation criterion for research alongside accuracy and related measures. In addition, we propose reporting the financial cost or ‘price tag’ of developing, training, and running models to provide baselines for the investigation of increasingly efficient methods.”
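For readers curious what that kind of “price tag” bookkeeping looks like, here is a minimal Python sketch of the estimate Strubell et al. describe: sum the power draw of the hardware, scale by a data-center PUE of 1.58 and the hours of training, and convert with the US EPA average of 0.954 lbs CO2e per kWh (both constants are from their paper). The hardware numbers in the example are purely illustrative, not figures from any of the papers above.

```python
# Sketch of the training-footprint estimate in Strubell et al. (2019).
# Constants from their paper; the example hardware profile is invented.

PUE = 1.58            # average data-center power usage effectiveness
CO2E_PER_KWH = 0.954  # lbs CO2e emitted per kWh, US EPA average

def training_footprint(hours, cpu_watts, dram_watts, gpu_watts, num_gpus):
    """Return (energy in kWh, emissions in lbs CO2e) for one training run."""
    kwh = PUE * hours * (cpu_watts + dram_watts + num_gpus * gpu_watts) / 1000
    return kwh, kwh * CO2E_PER_KWH

# Illustrative run: 64 GPUs at ~300 W each, training for four days.
kwh, co2e = training_footprint(hours=96, cpu_watts=150, dram_watts=50,
                               gpu_watts=300, num_gpus=64)
print(f"~{kwh:,.0f} kWh, ~{co2e:,.0f} lbs CO2e")
```

Even this toy profile lands in the thousands of kWh for a single run, and real research projects involve many runs over many hyperparameter settings, which is exactly why the authors want these numbers reported.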

Whatever directions are taken, the ML community, and the tech industry more generally, will have to take their environmental impact much more seriously. ML can play the role of environmental solution, but not at the growing expense of being an environmental hazard.

…Mike
