Abstract
Convolutional neural networks (CNNs) have rapidly risen in popularity for many machine learning applications, particularly in the field of image recognition. Much of the benefit of these networks comes from their ability to extract features from the data in a hierarchical manner. These features are extracted using various transformational layers, notably the convolutional layer, which gives the model its name. In this work, we introduce a new type of transformational layer called a quantum convolution, or quanvolutional layer. Quanvolutional layers operate on input data by locally transforming the data using a number of random quantum circuits, in a way that is similar to the transformations performed by random convolutional filter layers. Provided these quantum transformations produce meaningful features for classification purposes, this algorithm could be of practical use for near-term quantum computers, as it requires small quantum circuits with little to no error correction. We empirically evaluated the potential benefit of these quantum transformations by comparing three types of models built on the MNIST dataset: CNNs, quanvolutional neural networks (QNNs), and CNNs with additional non-linearities introduced. Our results showed that the QNN models achieved both higher test set accuracy and faster training compared with the purely classical CNNs.
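As an illustration of the idea summarized above, the following is a minimal NumPy sketch of what a single quanvolutional filter computes when simulated classically. The threshold encoding, the Haar-random unitary standing in for a random quantum circuit, and the average-number-of-ones decoding are illustrative assumptions, not the exact encoding and decoding schemes used in the paper.

import numpy as np

def random_unitary(dim, rng):
    # Haar-random unitary via QR of a complex Gaussian matrix; it stands in for a
    # fixed random quantum circuit acting on log2(dim) qubits.
    z = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    q, r = np.linalg.qr(z)
    phases = np.diag(r) / np.abs(np.diag(r))
    return q * phases

def quanv_patch(patch, unitary, threshold=0.5):
    # Transform one 2x2 image patch into a single scalar feature.
    bits = (patch.flatten() > threshold).astype(int)          # threshold encoding -> basis state
    index = int("".join(map(str, bits)), 2)
    state = np.zeros(unitary.shape[0], dtype=complex)
    state[index] = 1.0
    probs = np.abs(unitary @ state) ** 2                       # simulated measurement probabilities
    ones = np.array([bin(i).count("1") for i in range(len(probs))])
    return float(probs @ ones) / 4.0                           # average fraction of qubits measured in |1>

def quanvolve(image, unitary, k=2, stride=2):
    # Slide the quantum "filter" over the image, exactly like a classical convolution window.
    rows = (image.shape[0] - k) // stride + 1
    cols = (image.shape[1] - k) // stride + 1
    out = np.zeros((rows, cols))
    for i in range(rows):
        for j in range(cols):
            patch = image[i * stride:i * stride + k, j * stride:j * stride + k]
            out[i, j] = quanv_patch(patch, unitary)
    return out

rng = np.random.default_rng(0)
image = rng.random((28, 28))         # stand-in for a normalized MNIST digit
U = random_unitary(16, rng)          # 4 qubits <-> one 2x2 patch
print(quanvolve(image, U).shape)     # (14, 14) feature map from one quanvolutional filter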
Notes
The QNN topology chosen in this work is not inherently fixed. As mentioned in Section 2.2, the QNN framework was designed to give users complete control over the number and order of quanvolutional layers in the architecture. The topology explored in this work was chosen because it was the simplest QNN architecture to use as a baseline for comparison against other purely classical networks. Future work will focus on exploring the impact of more complex architectural variations (a hypothetical layer-ordering sketch is given after these notes).
It is not a hard requirement in general that the number of qubits in the quanvolutional filter match the dimensionality of the input data. In many quantum circuit applications, ancilla qubits are required to perform elements of an overall computation. Such transformations using ancilla qubits, however, are outside the scope of this work and are an interesting topic for future exploration.
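As a purely illustrative complement to the first note, the sketch below writes QNN topologies as ordered lists of layer specifications, making the number and placement of quanvolutional layers an explicit configuration choice. All layer names, filter counts, and kernel sizes are hypothetical and do not describe the architecture actually used in this work.

# Hypothetical topology descriptions: an architecture is an ordered list of layers,
# so quanvolutional and classical layers can be mixed in any number and order.
baseline_qnn = [
    {"layer": "quanv", "filters": 8,  "kernel": (2, 2)},   # single quanvolutional layer up front
    {"layer": "conv",  "filters": 16, "kernel": (3, 3)},
    {"layer": "pool",  "size": (2, 2)},
    {"layer": "dense", "units": 10},                        # classifier head (10 MNIST classes)
]

deeper_qnn = [                                              # a more complex variation for future work
    {"layer": "quanv", "filters": 8,  "kernel": (2, 2)},
    {"layer": "conv",  "filters": 16, "kernel": (3, 3)},
    {"layer": "quanv", "filters": 8,  "kernel": (2, 2)},    # quanvolutional layers can also appear deeper
    {"layer": "pool",  "size": (2, 2)},
    {"layer": "dense", "units": 10},
]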
Acknowledgments
We would like to thank Duncan Fletcher for his significant editing input and Peter Wittek for helpful comments.