Explainable AI for 6G Use Cases: Technical Aspects and Research Challenges

Shen Wang, M. Atif Qureshi, Luis Miralles-Pechuan, Thien Huynh-The, Thippa Reddy Gadekallu, Madhusanka Liyanage

Research output: Contribution to journal › Article › peer-review

Abstract

Around 2020, 5G began its commercialization journey, and discussions about next-generation networks such as 6G emerged. Researchers predict that 6G networks will offer higher bandwidth, broader coverage, greater reliability and energy efficiency, and lower latency, and will form an integrated, 'human-centric' network system powered by artificial intelligence (AI). Such a 6G network will drive many real-time automated decisions, ranging from network resource allocation to collision avoidance for self-driving cars. However, there is a risk of losing control over decision-making, because high-speed, data-intensive AI decisions may exceed the comprehension of designers and users. To mitigate this risk, explainable AI (XAI) methods can be used to enhance the transparency of the black-box AI decision-making process. This paper surveys the application of XAI towards the upcoming 6G age, covering 6G technologies (such as intelligent radio and zero-touch network management) and 6G use cases (such as Industry 5.0). Additionally, the paper summarizes the lessons learned from recent attempts and outlines important research challenges in applying XAI to 6G use cases in the near future.
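As a minimal illustrative sketch (not from the paper), the Python snippet below shows how a post-hoc XAI method such as SHAP can attribute a black-box model's decisions to its inputs, in the spirit of the resource-allocation example mentioned above. The classifier, feature names, and synthetic data are assumptions introduced purely for illustration.

    # Minimal sketch: explaining a hypothetical network resource-allocation
    # classifier with SHAP (post-hoc feature attribution). All data and
    # feature names below are synthetic and illustrative only.
    import numpy as np
    import shap
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)
    features = ["signal_strength", "cell_load", "latency_ms", "user_priority"]
    X = rng.random((500, len(features)))
    # Assumed labeling rule: grant extra bandwidth when load is low and priority is high
    y = ((X[:, 1] < 0.5) & (X[:, 3] > 0.5)).astype(int)

    model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

    # TreeExplainer attributes each prediction to the input features,
    # exposing why the "black box" allocated (or denied) resources.
    explainer = shap.TreeExplainer(model)
    shap_values = explainer.shap_values(X[:5])
    print(shap_values)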

Original language: English
Pages (from-to): 2490-2540
Number of pages: 51
Journal: IEEE Open Journal of the Communications Society
Volume: 5
DOIs
Publication status: Published - 2024

Keywords

  • 6G
  • AI
  • B5G
  • XAI
  • explainability
