On the outage probability of channel prediction enabled max-min radio resource allocation

Research output: Contribution to book/Conference proceedings/Anthology/Report › Conference contribution › Contributed

Abstract

To realize the next generation of ultra-reliable low-latency communications (URLLC) with less radio resource consumption, methods relying solely on the addition of redundancy do not suffice. An alternative is to monitor the wireless channel and prevent outages due to small-scale fading by allocating users to suitable radio resources based on the channel gain. To overcome monitoring delays, predicted channel information is utilized. In this paper, we analyze the outage probability of a max-min radio resource allocation (RRA) approach when Gaussian errors are present in the channel state information (CSI). A Rayleigh fast-fading channel is assumed as the communication channel. The outage probability is lower- and upper-bounded to obtain analytical performance approximations. Furthermore, the performance is also evaluated empirically by means of extensive Monte-Carlo simulations. The analysis shows that the achievable outage probability mostly depends on the size of the resource pool and the quality of the CSI.
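For illustration, the sketch below shows how an outage probability of this kind can be estimated by Monte-Carlo simulation: users are assigned distinct Rayleigh-faded resources via a max-min rule applied to Gaussian-corrupted channel predictions, and an outage is counted whenever a user's true gain on its allocated resource falls below a threshold. This is not the paper's exact model; the number of users and resources, the error variance, the outage threshold, and the exhaustive max-min search are all illustrative assumptions.

```python
# Minimal Monte-Carlo sketch of max-min RRA with imperfect CSI (assumed model).
import itertools
import numpy as np

rng = np.random.default_rng(0)

U, R = 2, 8          # users and resource-pool size (assumed)
sigma_e = 0.1        # std. dev. of the Gaussian CSI prediction error (assumed)
gamma_th = 0.05      # outage threshold on the true channel power gain (assumed)
n_trials = 20_000

outages = 0
for _ in range(n_trials):
    # True Rayleigh-fading gains: unit-power complex Gaussian entries per user/resource.
    h = (rng.standard_normal((U, R)) + 1j * rng.standard_normal((U, R))) / np.sqrt(2)
    # Predicted CSI = true channel plus complex Gaussian prediction error.
    e = sigma_e * (rng.standard_normal((U, R)) + 1j * rng.standard_normal((U, R))) / np.sqrt(2)
    g_true = np.abs(h) ** 2
    g_pred = np.abs(h + e) ** 2

    # Max-min allocation: choose distinct resources that maximise the
    # minimum *predicted* gain (exhaustive search, feasible for small U and R).
    best = max(itertools.permutations(range(R), U),
               key=lambda a: min(g_pred[u, r] for u, r in enumerate(a)))

    # Outage if any user's *true* gain on its allocated resource is below threshold.
    if min(g_true[u, r] for u, r in enumerate(best)) < gamma_th:
        outages += 1

print(f"empirical outage probability ≈ {outages / n_trials:.4g}")
```

Increasing R or decreasing sigma_e in this toy setup lowers the estimated outage probability, which mirrors the abstract's statement that the achievable outage probability mostly depends on the size of the resource pool and the quality of the CSI.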

Details

Original language: English
Title of host publication: Proceedings of the IEEE Wireless Communications and Networking Conference (WCNC)
Pages: 638-643
Number of pages: 6
ISBN (electronic): 978-1-6654-4266-4
Publication status: Published - 13 Apr 2022
Peer-reviewed: No

External IDs

Scopus 85130707469
dblp conf/wcnc/TrasslSSSF22
Mendeley 147fc08a-34ce-3a6e-9a40-6007fb6d25a3
unpaywall 10.1109/wcnc51071.2022.9771598
ORCID /0000-0002-0738-556X/work/177360497

Keywords

  • URLLC, channel prediction, radio resource scheduling