Overcoming Hardware Limitations in Massive MIMO: A Generative AI Take

Publication: Contribution to book / conference proceedings / anthology / report › Contribution to conference proceedings › Contributed › Peer-reviewed

Abstract

The recent evolution of mobile communication standards indicates that massive multiple-input multiple-output (MIMO) will remain an integral part of wireless systems for the foreseeable future. However, as the number of antenna elements grows to hundreds in the fifth generation (5G) and beyond, traditional signal processing methods become prone to significant hardware impairments that compound across multiple transceiver chains, leading to substantial performance degradation. This paper explores the effectiveness of generative artificial intelligence (AI) techniques in addressing these challenges within massive MIMO systems. For this purpose, the conditional generative adversarial network (CGAN), a special class of generative AI algorithms, is employed to enhance the accuracy of channel state information (CSI) estimation in a hardware-impaired transceiver setup. The problem is treated as an image-denoising task, where the noise is introduced by the hardware impairments and the least squares (LS) estimation error. Through simulations conducted across various antenna array sizes, the potential of generative AI to improve CSI estimation accuracy under hardware impairments is demonstrated, highlighting its capacity to address critical signal processing challenges in next-generation wireless systems.
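To illustrate the approach described above (this is a minimal sketch, not the authors' implementation), the following Python/PyTorch code sets up a CGAN as a denoiser for channel estimates: the complex channel matrix is treated as a two-channel image (real and imaginary parts), the generator maps the noisy LS estimate to a refined estimate, and the discriminator judges candidate channels conditioned on the LS input. The antenna dimensions, network layers, loss weighting, and placeholder training data are assumptions, not values taken from the paper.

# Illustrative sketch (not the authors' code): CGAN-based denoising of a
# noisy least-squares (LS) channel estimate. The complex channel matrix is
# represented as a 2 x N_RX x N_TX tensor (real and imaginary parts).
import torch
import torch.nn as nn

N_RX, N_TX = 64, 8  # assumed antenna array size (receive x transmit)

class Generator(nn.Module):
    """Maps the noisy LS estimate to a refined estimate (same 2-channel shape)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 2, 3, padding=1),
        )
    def forward(self, h_ls):
        return self.net(h_ls)

class Discriminator(nn.Module):
    """Scores (condition, candidate) pairs; the noisy LS estimate is the condition."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(4, 32, 3, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, 1),
        )
    def forward(self, h_ls, h_candidate):
        return self.net(torch.cat([h_ls, h_candidate], dim=1))

def train_step(gen, disc, opt_g, opt_d, h_ls, h_true, adv_weight=0.01):
    """One CGAN step: adversarial loss plus an L1 reconstruction term (assumed weighting)."""
    bce = nn.BCEWithLogitsLoss()
    # Discriminator update: real pairs vs. generated pairs.
    opt_d.zero_grad()
    with torch.no_grad():
        h_fake = gen(h_ls)
    d_real = disc(h_ls, h_true)
    d_fake = disc(h_ls, h_fake)
    loss_d = bce(d_real, torch.ones_like(d_real)) + bce(d_fake, torch.zeros_like(d_fake))
    loss_d.backward()
    opt_d.step()
    # Generator update: fool the discriminator while staying close to the true channel.
    opt_g.zero_grad()
    h_fake = gen(h_ls)
    d_fake = disc(h_ls, h_fake)
    loss_g = adv_weight * bce(d_fake, torch.ones_like(d_fake)) + nn.functional.l1_loss(h_fake, h_true)
    loss_g.backward()
    opt_g.step()
    return loss_d.item(), loss_g.item()

# Usage with synthetic placeholder data; an actual study would use simulated
# hardware-impaired channels and their LS estimates instead.
gen, disc = Generator(), Discriminator()
opt_g = torch.optim.Adam(gen.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(disc.parameters(), lr=2e-4)
h_true = torch.randn(16, 2, N_RX, N_TX)          # ground-truth channels
h_ls = h_true + 0.3 * torch.randn_like(h_true)   # noisy LS estimates
print(train_step(gen, disc, opt_g, opt_d, h_ls, h_true))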

Details

Original language: English
Title: 2025 IEEE Wireless Communications and Networking Conference (WCNC)
Pages: 1-6
ISBN (electronic): 979-8-3503-6836-9
Publication status: Published - May 2025
Peer-review status: Yes

Publication series

Series: IEEE Conference on Wireless Communications and Networking (WCNC)
ISSN: 1525-3511

External IDs

ORCID /0000-0003-3045-6271/work/190570400
ORCID /0000-0001-8165-5735/work/193707089
Scopus 105006464593

Keywords

  • Massive MIMO, channel estimation, conditional generative adversarial networks (CGAN), generative AI, machine learning (ML)