
Quanvolutional neural networks: powering image recognition with quantum circuits

  • Research Article
  • Published in Quantum Machine Intelligence

Abstract

Convolutional neural networks (CNNs) have rapidly risen in popularity for many machine learning applications, particularly in the field of image recognition. Much of the benefit generated from these networks comes from their ability to extract features from the data in a hierarchical manner. These features are extracted using various transformational layers, notably the convolutional layer which gives the model its name. In this work, we introduce a new type of transformational layer called a quantum convolution, or quanvolutional layer. Quanvolutional layers operate on input data by locally transforming the data using a number of random quantum circuits, in a way that is similar to the transformations performed by random convolutional filter layers. Provided these quantum transformations produce meaningful features for classification purposes, this algorithm could be of practical use for near-term quantum computers, as it requires small quantum circuits with little to no error correction. In this work, we empirically evaluated the potential benefit of these quantum transformations by comparing three types of models built on the MNIST dataset: CNNs, quanvolutional neural networks (QNNs), and CNNs with additional non-linearities introduced. Our results showed that the QNN models had both higher test set accuracy and faster training compared with the purely classical CNNs.
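To make the quanvolutional layer concrete, the following is a minimal sketch of a single quanvolutional filter, written against the PennyLane simulator. The RY-rotation encoding, the qml.RandomLayers random circuit, and the Pauli-Z decoding are illustrative assumptions standing in for the paper's encoder, random circuit, and decoder, not its exact construction.

    # Minimal quanvolutional filter sketch (assumes PennyLane is installed).
    import numpy as np
    import pennylane as qml

    n_qubits = 4  # one qubit per pixel in a 2x2 image patch
    dev = qml.device("default.qubit", wires=n_qubits)

    # Draw the random circuit parameters once; like a random (untrained)
    # convolutional filter, the same circuit is reused for every patch.
    rng = np.random.default_rng(seed=0)
    rand_params = rng.uniform(0, 2 * np.pi, size=(1, n_qubits))

    @qml.qnode(dev)
    def quanv_filter(patch):
        # Encode: map each normalized pixel value to a rotation angle.
        for i, pixel in enumerate(patch):
            qml.RY(np.pi * pixel, wires=i)
        # Transform: apply the fixed random quantum circuit.
        qml.RandomLayers(rand_params, wires=list(range(n_qubits)), seed=0)
        # Decode: one scalar output feature per qubit.
        return [qml.expval(qml.PauliZ(i)) for i in range(n_qubits)]

    # Example: one 2x2 patch of pixel intensities in [0, 1] -> 4 features.
    patch = np.array([0.0, 0.9, 0.5, 0.1])
    features = quanv_filter(patch)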


Notes

  1. The QNN topology chosen in this work is not fixed by the method. As mentioned in Section 2.2, the QNN framework was designed to give users complete control over the number and order of quanvolutional layers in the architecture. The topology explored in this work was chosen because it is the simplest QNN architecture to use as a baseline for comparison against purely classical networks; a minimal sketch of this baseline appears after these notes. Future work will focus on exploring the impact of more complex architectural variations.

  2. It is not a hard requirement, in general, that the number of qubits in the quanvolutional filter match the dimensionality of the input data. In many quantum circuit applications, ancilla qubits are required to perform elements of an overall computation. Transformations using ancilla qubits, however, are outside the scope of this work and remain an interesting topic for future exploration.
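As a concrete illustration of the baseline referenced in Note 1, the sketch below slides the filter defined after the abstract over an image with a non-overlapping 2x2 window, producing feature maps that any standard classical CNN stack can then consume. The quanvolve helper name is hypothetical, introduced here for illustration only.

    # Sketch of the simplest QNN topology: one quanvolutional preprocessing
    # layer feeding an otherwise classical network.
    import numpy as np

    def quanvolve(image, filter_fn, n_out=4):
        """Apply one quanvolutional filter over a 2D image, stride 2."""
        h, w = image.shape
        out = np.zeros((h // 2, w // 2, n_out))
        for r in range(0, h - 1, 2):
            for c in range(0, w - 1, 2):
                patch = image[r:r + 2, c:c + 2].flatten()
                out[r // 2, c // 2] = np.array(filter_fn(patch))
        return out

    # For a 28x28 MNIST image this yields a (14, 14, 4) tensor:
    # feature_maps = quanvolve(mnist_image, quanv_filter)
    # which can then feed a standard conv -> pool -> dense stack. Per Note 2,
    # a filter could also use ancilla qubits (e.g., a 5-qubit device for a
    # 2x2 patch), provided the decoding still yields one feature vector.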


Acknowledgments

We would like to thank Duncan Fletcher for his significant editing input and Peter Wittek for his helpful comments.

Author information


Corresponding author

Correspondence to Maxwell Henderson.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Henderson, M., Shakya, S., Pradhan, S. et al. Quanvolutional neural networks: powering image recognition with quantum circuits. Quantum Mach. Intell. 2, 2 (2020). https://doi.org/10.1007/s42484-020-00012-y

