https://doi.org/10.1140/epjqt/s40507-025-00381-y
Research
Performance comparison of the quantum and classical deep Q-learning approaches in dynamic environments control
Department of Energy Engineering, Sharif University of Technology, P.O. Box 14565-114, Tehran, Iran
Received: 27 December 2024 / Accepted: 5 June 2025 / Published online: 16 June 2025
Studies of Quantum Reinforcement Learning (QRL) algorithms for the control of dynamic environments remain scarce, representing a significant gap in this field. This study contributes to bridging this gap by demonstrating the potential of quantum RL algorithms to handle dynamic environments effectively. In this research, the performance and robustness of Quantum Deep Q-learning Networks (DQN) were examined in two dynamic environments, Cart Pole and Lunar Lander, using three distinct quantum Ansatz layers: RealAmplitudes, EfficientSU2, and TwoLocal. The quantum DQNs were compared with classical DQN algorithms in terms of convergence speed, loss minimization, and Q-value behavior. The RealAmplitudes Ansatz was observed to outperform the other quantum circuits, demonstrating faster convergence and superior performance in minimizing the loss function. To assess robustness, the pole length was increased in the Cart Pole environment, and a wind function was added to the Lunar Lander environment after the 50th episode. All three quantum Ansatz layers were found to maintain robust performance under disturbed conditions, with consistent reward values, loss minimization, and stable Q-value distributions. Although the proposed QRL approaches demonstrate competitive results overall, classical RL can surpass them in convergence speed under specific conditions.
Key words: Quantum Deep Q-learning Network / Reinforcement Learning / Quantum Ansatz / Dynamic environments
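The three Ansatz layers named in the abstract correspond to parameterized circuits available in Qiskit's circuit library. The sketch below is a minimal, hypothetical illustration of how such variational Q-network circuits could be assembled, assuming angle encoding of the four-dimensional Cart Pole observation via a ZFeatureMap and per-qubit measurements as Q-value outputs; the paper's exact encoding, depth, and readout may differ.

```python
# Hypothetical sketch: building the three variational circuits compared in the paper.
# Assumptions (not taken from the paper): 4 qubits (one per Cart Pole state variable),
# ZFeatureMap angle encoding, and reps=2 for every Ansatz.
from qiskit.circuit.library import ZFeatureMap, RealAmplitudes, EfficientSU2, TwoLocal

n_qubits = 4  # assumed: one qubit per Cart Pole observation dimension

feature_map = ZFeatureMap(n_qubits)  # encodes the observation into rotation angles

ansatze = {
    "RealAmplitudes": RealAmplitudes(n_qubits, reps=2),
    "EfficientSU2":   EfficientSU2(n_qubits, reps=2),
    "TwoLocal":       TwoLocal(n_qubits, rotation_blocks="ry",
                               entanglement_blocks="cz", reps=2),
}

# Compose encoding + variational layer; the trainable parameters of each circuit
# would be optimized against the DQN temporal-difference loss.
circuits = {name: feature_map.compose(ansatz) for name, ansatz in ansatze.items()}
for name, qc in circuits.items():
    trainable = qc.num_parameters - feature_map.num_parameters
    print(f"{name}: {trainable} trainable parameters")
```

In a full quantum DQN training loop, expectation values measured on these circuits would play the role of the classical network's Q-value outputs, with the circuit parameters updated by a gradient-based optimizer.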
© The Author(s) 2025
Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.