Reducing Carbon Footprint in AI: A Framework for Sustainable Training of Large Language Models

Sunbal Iftikhar, Steven Davy

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-reviewed

Abstract

In the world of artificial intelligence (AI), large language models (LLMs) are leading the way, transforming how people understand and use language. These models have significantly impacted various domains, from natural language processing (NLP) to content generation, sparking a wave of innovation and exploration. However, this rapid progress brings to light the environmental implications of LLMs, particularly the significant energy consumption and carbon emissions during their training and operational phases. This calls for a shift towards more energy-efficient practices in training and deploying LLMs, balancing AI innovation with environmental responsibility. This paper emphasizes the need to improve the energy efficiency of LLMs so that their benefits align with environmental sustainability. The discussion covers the significant power consumption associated with training LLMs. We present a generic energy-efficient training framework for LLMs that employs federated learning (FL) and integrates renewable energy (RE), aiming to mitigate the environmental impact of LLMs. Our objective is to encourage the adoption of sustainable AI practices that preserve the capabilities of LLMs while reducing their environmental impact, thus guiding the AI community towards the responsible advancement of technology.
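The abstract names federated learning with renewable-energy integration as the core of the proposed framework but gives no algorithmic detail here. The sketch below is purely illustrative and not the authors' method: it combines standard federated averaging (FedAvg) with a hypothetical carbon-aware client-selection rule, where `select_green_clients`, the 200 gCO2/kWh threshold, and all values are assumptions introduced for this example.

```python
import numpy as np

# Illustrative sketch only: standard FedAvg plus a hypothetical
# carbon-aware selection rule; this is NOT the paper's algorithm.

def fed_avg(client_weights, client_sizes):
    """Aggregate client parameters weighted by local dataset size (FedAvg)."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

def select_green_clients(carbon_intensity, threshold=200.0):
    """Keep clients whose local grid carbon intensity (gCO2/kWh) is low,
    e.g. sites currently running largely on renewable energy."""
    return [i for i, c in enumerate(carbon_intensity) if c <= threshold]

# Toy round: 4 clients holding scalar "models" for clarity.
weights = [np.array([1.0]), np.array([2.0]), np.array([3.0]), np.array([4.0])]
sizes = [100, 200, 100, 100]
carbon = [120.0, 450.0, 90.0, 300.0]  # hypothetical gCO2/kWh per client site

chosen = select_green_clients(carbon)  # only low-carbon sites train this round
global_w = fed_avg([weights[i] for i in chosen],
                   [sizes[i] for i in chosen])
print(chosen, global_w)  # → [0, 2] [2.]
```

In a real deployment the intensity values would come from a grid-data feed, and skipped high-carbon clients would simply rejoin in later rounds, trading some convergence speed for lower emissions.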

Original language: English
Title of host publication: Proceedings of the Future Technologies Conference (FTC) 2024
Editors: Kohei Arai
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 325-336
Number of pages: 12
ISBN (Print): 9783031731099
DOIs
Publication status: Published - 2024
Event: 9th Future Technologies Conference, FTC 2024 - London, United Kingdom
Duration: 14 Nov 2024 - 15 Nov 2024

Publication series

Name: Lecture Notes in Networks and Systems
Volume: 1154 LNNS
ISSN (Print): 2367-3370
ISSN (Electronic): 2367-3389

Conference

Conference: 9th Future Technologies Conference, FTC 2024
Country/Territory: United Kingdom
City: London
Period: 14/11/24 - 15/11/24

Keywords

  • Artificial intelligence
  • Carbon emissions
  • Energy efficiency
  • Federated learning
  • Large language models
  • Natural language processing
  • Renewable energy
