TY - GEN
T1 - Reducing Carbon Footprint in AI
T2 - 9th Future Technologies Conference, FTC 2024
AU - Iftikhar, Sunbal
AU - Davy, Steven
N1 - Publisher Copyright:
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2024.
PY - 2024
Y1 - 2024
N2 - In the world of artificial intelligence (AI), large language models (LLMs) are leading the way, transforming how people understand and use language. These models have significantly impacted various domains, from natural language processing (NLP) to content generation, sparking a wave of innovation and exploration. However, this rapid progress brings to light the environmental implications of LLMs, particularly the significant energy consumption and carbon emissions during their training and operational phases. This requires a shift towards more energy-efficient practices in training and deploying LLMs, balancing AI innovation with environmental responsibility. This paper emphasizes the need to improve the energy efficiency of LLMs so that their benefits align with environmental sustainability. The discussion covers the significant power consumption associated with training LLMs. We present a generic energy-efficient training framework for LLMs that employs federated learning (FL) and integrates renewable energy (RE), aiming to mitigate the environmental impact of LLMs. Our objective is to encourage the implementation of sustainable AI practices that preserve the capabilities of LLMs while reducing their environmental impact, thus guiding the AI community towards the responsible advancement of technology.
AB - In the world of artificial intelligence (AI), large language models (LLMs) are leading the way, transforming how people understand and use language. These models have significantly impacted various domains, from natural language processing (NLP) to content generation, sparking a wave of innovation and exploration. However, this rapid progress brings to light the environmental implications of LLMs, particularly the significant energy consumption and carbon emissions during their training and operational phases. This requires a shift towards more energy-efficient practices in training and deploying LLMs, balancing AI innovation with environmental responsibility. This paper emphasizes the need to improve the energy efficiency of LLMs so that their benefits align with environmental sustainability. The discussion covers the significant power consumption associated with training LLMs. We present a generic energy-efficient training framework for LLMs that employs federated learning (FL) and integrates renewable energy (RE), aiming to mitigate the environmental impact of LLMs. Our objective is to encourage the implementation of sustainable AI practices that preserve the capabilities of LLMs while reducing their environmental impact, thus guiding the AI community towards the responsible advancement of technology.
KW - Artificial intelligence
KW - Carbon emissions
KW - Energy efficiency
KW - Federated learning
KW - Large language models
KW - Natural language processing
KW - Renewable energy
UR - http://www.scopus.com/inward/record.url?scp=85209545270&partnerID=8YFLogxK
U2 - 10.1007/978-3-031-73110-5_22
DO - 10.1007/978-3-031-73110-5_22
M3 - Conference contribution
AN - SCOPUS:85209545270
SN - 9783031731099
T3 - Lecture Notes in Networks and Systems
SP - 325
EP - 336
BT - Proceedings of the Future Technologies Conference (FTC) 2024
A2 - Arai, Kohei
PB - Springer Science and Business Media Deutschland GmbH
Y2 - 14 November 2024 through 15 November 2024
ER -