Surface light-induced changes in thin polymer films
Andrew Skumanich
SPIE Optics Quebec 1993
In recent years, with the development of quantum machine learning, Quantum Neural Networks (QNNs) have gained increasing attention in Natural Language Processing (NLP) and have achieved a series of promising results. However, most existing QNN models focus on the architectures of the Quantum Recurrent Neural Network (QRNN) and the Quantum Self-Attention Mechanism (QSAM). In this work, we propose a novel QNN model based on quantum convolution. We develop a quantum depthwise convolution that significantly reduces the number of parameters and lowers computational complexity. We also introduce a multi-scale feature fusion mechanism that enhances model performance by integrating word-level and sentence-level features. Additionally, we propose quantum word embedding and quantum sentence embedding, which provide embedding vectors more efficiently. Through experiments on two benchmark text classification datasets, we demonstrate that our model outperforms a wide range of state-of-the-art QNN models. Notably, our model achieves a new state-of-the-art test accuracy of 96.77% on the RP dataset. We also show the advantage of our quantum model over its classical counterparts: it improves test accuracy while using fewer parameters. Finally, an ablation study confirms the effectiveness of the multi-scale feature fusion mechanism and the quantum depthwise convolution in enhancing model performance.
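The parameter savings claimed for depthwise convolution can be illustrated with its classical analogue. The sketch below is a hypothetical illustration, not code from the paper: it counts weights for a standard 1-D convolution, where every output channel mixes all input channels, versus a depthwise 1-D convolution, where each channel is filtered independently.

```python
def standard_conv1d_params(c_in: int, c_out: int, kernel: int) -> int:
    # Standard conv: one kernel of length `kernel` per (input, output) channel pair.
    return c_in * c_out * kernel

def depthwise_conv1d_params(c_in: int, kernel: int) -> int:
    # Depthwise conv: a single kernel per input channel, no cross-channel mixing.
    return c_in * kernel

# Example with hypothetical sizes: 64 channels, kernel width 3.
print(standard_conv1d_params(64, 64, 3))  # 12288
print(depthwise_conv1d_params(64, 3))     # 192
```

For these illustrative sizes the depthwise variant uses 64x fewer parameters; the quantum version described in the abstract pursues an analogous reduction.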