Deep Reinforcement Learning for Dynamic Radio Access Selection over Future Wireless Networks

CC González, EF Pupo… - 2022 IEEE International Symposium on Broadband Multimedia Systems …, 2022 - ieeexplore.ieee.org
Although the fifth generation (5G) of mobile communication systems is still at its initial stage, the research community has started to focus on its successor. The sixth generation (6G) is expected to provide massive-scale communication and always-on intelligent connectivity, enabling several emerging applications with stricter quality of service (QoS) requirements. Softwarization technologies, the network slicing paradigm, and artificial intelligence (AI) will be critical pieces of 6G for managing ultra-dense heterogeneous environments composed of terrestrial and non-terrestrial networks. This work aims to find the most efficient combination of access network and network slices (NSs) in 6G heterogeneous scenarios to satisfy the user's request and maximize the QoS. We propose a Dynamic Radio Access Network Selection (DRANS) algorithm based on Deep Reinforcement Learning (DRL) as a suitable method to handle the constant changes in network conditions and the diversity of users' demands. We address the DRL problem with an adaptation of the Deep Q-Network (DQN) approach termed Double Deep Q-Network (DDQN). The proposal is evaluated through numerical simulations, focusing on the effective utilization of network resources and the convergence rate.
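The abstract does not detail the learning update, but the Double DQN rule it refers to is standard: the online network selects the next action and a separate target network evaluates it, which reduces the overestimation bias of plain DQN. The sketch below illustrates only that target computation; the state dimension, action count, layer sizes, and discount factor are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of a Double DQN (DDQN) target computation.
# All dimensions and hyperparameters below are assumed for illustration.
import torch
import torch.nn as nn

state_dim, n_actions, gamma = 8, 4, 0.99  # assumed state size, action count, discount

online_net = nn.Sequential(nn.Linear(state_dim, 64), nn.ReLU(), nn.Linear(64, n_actions))
target_net = nn.Sequential(nn.Linear(state_dim, 64), nn.ReLU(), nn.Linear(64, n_actions))
target_net.load_state_dict(online_net.state_dict())  # periodically synced copy

def ddqn_targets(rewards, next_states, dones):
    """Double DQN target: the online network picks the next action,
    the target network scores it."""
    with torch.no_grad():
        next_actions = online_net(next_states).argmax(dim=1, keepdim=True)
        next_q = target_net(next_states).gather(1, next_actions).squeeze(1)
        return rewards + gamma * (1.0 - dones) * next_q

# Dummy batch of transitions, just to show the shapes involved
batch = 5
targets = ddqn_targets(torch.rand(batch), torch.rand(batch, state_dim), torch.zeros(batch))
print(targets.shape)  # torch.Size([5])
```

In a full agent these targets would be regressed against the online network's Q-values for the taken actions, with the target network refreshed every fixed number of steps.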