Photonic Reservoir Computer with Output Expansion for Unsupervized Parameter Drift Compensation
Figure 1. <p>Illustration of the output expansion scheme. In the example shown, 3 neurons <math display="inline"><semantics> <msub> <mi>X</mi> <mi>m</mi> </msub> </semantics></math> are measured, 1 random feature <math display="inline"><semantics> <msub> <mi>Y</mi> <mi>R</mi> </msub> </semantics></math> is constructed with random weights <math display="inline"><semantics> <msub> <mi>W</mi> <mi>R</mi> </msub> </semantics></math>, and this auxiliary feature is mixed with <math display="inline"><semantics> <msub> <mi>X</mi> <mi>m</mi> </msub> </semantics></math> to obtain a total of 6 output features <math display="inline"><semantics> <msub> <mi>X</mi> <mrow> <mi>o</mi> <mi>u</mi> <mi>t</mi> </mrow> </msub> </semantics></math>. These output features are combined with trained readout weights <math display="inline"><semantics> <msub> <mi>W</mi> <mrow> <mi>o</mi> <mi>u</mi> <mi>t</mi> </mrow> </msub> </semantics></math> to form a task-solving reservoir output <math display="inline"><semantics> <msub> <mi>Y</mi> <mrow> <mi>o</mi> <mi>u</mi> <mi>t</mi> </mrow> </msub> </semantics></math>. The output expansion contains polynomial functions of <math display="inline"><semantics> <msub> <mi>X</mi> <mi>m</mi> </msub> </semantics></math> of first and second degree; the corresponding subsets of readout weights are labeled <math display="inline"><semantics> <msubsup> <mi>W</mi> <mrow> <mi>o</mi> <mi>u</mi> <mi>t</mi> </mrow> <mrow> <mo>(</mo> <mn>1</mn> <mo>)</mo> </mrow> </msubsup> </semantics></math> and <math display="inline"><semantics> <msubsup> <mi>W</mi> <mrow> <mi>o</mi> <mi>u</mi> <mi>t</mi> </mrow> <mrow> <mo>(</mo> <mn>2</mn> <mo>)</mo> </mrow> </msubsup> </semantics></math>. Larger numbers of neurons <span class="html-italic">N</span> and auxiliary features <span class="html-italic">P</span> are supported by the proposed scheme.</p>
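The expansion described in the caption can be sketched numerically. The snippet below is an illustrative NumPy sketch, not the authors' code; the sizes N = 3, P = 1 and all random values are assumptions. It builds the N(P + 1) = 6 output features by concatenating the measured responses with their products with the random feature, i.e., first- and second-degree polynomial terms in X_m.

```python
import numpy as np

rng = np.random.default_rng(0)

N, P = 3, 1                       # measured neurons and auxiliary features, as in the figure
X_m = rng.normal(size=N)          # measured neural responses X_m
W_R = rng.normal(size=(P, N))     # fixed random weights W_R
Y_R = W_R @ X_m                   # random auxiliary features Y_R (linear in X_m)

# Output expansion: the N first-degree terms X_m plus, for each auxiliary
# feature, the N second-degree products Y_R[p] * X_m -> N * (P + 1) features.
X_out = np.concatenate([X_m] + [Y_R[p] * X_m for p in range(P)])

W_out = rng.normal(size=X_out.size)   # readout weights (trained in practice, random here)
Y_out = W_out @ X_out                 # task-solving reservoir output
```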
Figure 2. <p>Illustration of the weight-tuning scheme. The example shows 3 measured neural responses <math display="inline"><semantics> <msub> <mi>X</mi> <mi>m</mi> </msub> </semantics></math> which are combined with 3 time-dependent readout weights <math display="inline"><semantics> <msub> <mover accent="true"> <mi>W</mi> <mo stretchy="false">˜</mo> </mover> <mrow> <mi>o</mi> <mi>u</mi> <mi>t</mi> </mrow> </msub> </semantics></math> following Equation (<a href="#FD16-entropy-23-00955" class="html-disp-formula">16</a>) to form 1 task-solving output <math display="inline"><semantics> <msub> <mi>Y</mi> <mrow> <mi>o</mi> <mi>u</mi> <mi>t</mi> </mrow> </msub> </semantics></math> following Equation (<a href="#FD17-entropy-23-00955" class="html-disp-formula">17</a>). The example has 1 random auxiliary feature <math display="inline"><semantics> <msub> <mi>Y</mi> <mi>R</mi> </msub> </semantics></math>, which is obtained with random weights <math display="inline"><semantics> <msub> <mi>W</mi> <mi>R</mi> </msub> </semantics></math> and used to tune <math display="inline"><semantics> <msub> <mover accent="true"> <mi>W</mi> <mo stretchy="false">˜</mo> </mover> <mrow> <mi>o</mi> <mi>u</mi> <mi>t</mi> </mrow> </msub> </semantics></math>. Larger numbers of neurons <span class="html-italic">N</span> and auxiliary features <span class="html-italic">P</span> are supported by the proposed scheme. This example is equivalent to the scheme shown in <a href="#entropy-23-00955-f001" class="html-fig">Figure 1</a>.</p>
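The equivalence between the two schemes follows from factoring the readout: applying static weights to the expanded feature vector gives the same output as applying feature-dependent weights directly to the measured responses. A hypothetical numerical check (NumPy; all sizes and random values are assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

N, P = 3, 1
X_m = rng.normal(size=N)          # measured neural responses
W_R = rng.normal(size=(P, N))
Y_R = W_R @ X_m                   # random auxiliary features

W1 = rng.normal(size=N)           # first-degree readout weights W_out^(1)
W2 = rng.normal(size=(P, N))      # second-degree readout weights W_out^(2)

# Output-expansion view: static weights on the expanded feature vector.
X_out = np.concatenate([X_m] + [Y_R[p] * X_m for p in range(P)])
y_expanded = np.concatenate([W1] + [W2[p] for p in range(P)]) @ X_out

# Weight-tuning view: feature-dependent weights applied directly to X_m.
W_tilde = W1 + Y_R @ W2           # weights tuned by the auxiliary features
y_tuned = W_tilde @ X_m
```

Both views produce identical outputs (`y_expanded == y_tuned` up to floating-point rounding), which is why the two figures describe the same computation.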
Figure 3. <p>Schematic of the fiber-ring cavity of length <span class="html-italic">L</span> used to implement an optical reservoir. In the input layer, a polarization controller maps the input polarization onto a polarization eigenmode of the cavity. Data is injected by means of a Mach–Zehnder modulator (MZM). A coupler with power transmission coefficient <math display="inline"><semantics> <mrow> <mi>T</mi> <mo>=</mo> <mn>50</mn> <mo>%</mo> </mrow> </semantics></math> couples the input field <math display="inline"><semantics> <mrow> <msubsup> <mi>E</mi> <mrow> <mi>i</mi> <mi>n</mi> </mrow> <mrow> <mo>(</mo> <mi>n</mi> <mo>)</mo> </mrow> </msubsup> <mrow> <mo>(</mo> <mi>τ</mi> <mo>)</mo> </mrow> </mrow> </semantics></math> to the cavity field <math display="inline"><semantics> <mrow> <msup> <mi>E</mi> <mrow> <mo>(</mo> <mi>n</mi> <mo>)</mo> </mrow> </msup> <mrow> <mo>(</mo> <mi>z</mi> <mo>,</mo> <mi>τ</mi> <mo>)</mo> </mrow> </mrow> </semantics></math> and couples to the output field <math display="inline"><semantics> <mrow> <msubsup> <mi>E</mi> <mrow> <mi>o</mi> <mi>u</mi> <mi>t</mi> </mrow> <mrow> <mo>(</mo> <mi>n</mi> <mo>)</mo> </mrow> </msubsup> <mrow> <mo>(</mo> <mi>τ</mi> <mo>)</mo> </mrow> </mrow> </semantics></math>, where <span class="html-italic">n</span> is the roundtrip index, <math display="inline"><semantics> <mi>τ</mi> </semantics></math> is time (with <math display="inline"><semantics> <mrow> <mn>0</mn> <mo><</mo> <mi>τ</mi> <mo><</mo> <msub> <mi>t</mi> <mi>R</mi> </msub> </mrow> </semantics></math>) and <span class="html-italic">z</span> is the longitudinal position in the ring cavity. A photodetector (PD) records the neural responses to be processed by a digital computer where the output expansion is realized.</p>
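As a rough illustration only, the roundtrip dynamics of such a cavity can be iterated as a lumped map. The sketch below ignores the longitudinal (z) dependence and any nonlinearity of the actual fiber ring, and the coupler sign/phase convention (factor i on the cross port) is an assumption; the slowly drifting roundtrip phase θ is the parameter whose variations the paper's scheme compensates.

```python
import numpy as np

rng = np.random.default_rng(2)

T = 0.5                    # coupler power transmission coefficient, as in the figure
n_steps = 200              # number of roundtrips simulated
theta = 0.1                # roundtrip phase; drifts slowly in practice, constant here
E_cav = 0.0 + 0.0j         # intracavity field

E_in = rng.normal(size=n_steps) + 0j      # modulated input field, one sample per roundtrip
E_out = np.empty(n_steps, dtype=complex)

for n in range(n_steps):
    # One pass through the 50/50 coupler (assumed convention: i on the cross port),
    # then one roundtrip of propagation accumulating the phase theta.
    E_out[n] = np.sqrt(1 - T) * E_in[n] + 1j * np.sqrt(T) * E_cav
    E_cav = (1j * np.sqrt(T) * E_in[n] + np.sqrt(1 - T) * E_cav) * np.exp(1j * theta)

# The photodetector records intensities, i.e. the measured neural responses.
X_m = np.abs(E_out) ** 2
```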
Figure 4. <p>Example of experimental phase variations over different iterations of the experiment. The solid line is the measured phase, based on the pulse interference, and the dots represent the phase estimated using a linear combination of all 20 random features. The iterations take place approximately every second. The experiment is carefully shielded so that <math display="inline"><semantics> <mi>θ</mi> </semantics></math> varies slowly.</p>
Figure 5. <p>Example of simulated phase variations within 1 iteration of the simulated experiment. The solid line represents the true phase variations, covering the full <math display="inline"><semantics> <mrow> <mn>2</mn> <mi>π</mi> </mrow> </semantics></math> range and with kHz bandwidth, and the dots represent the estimated phase, obtained using a linear combination of all 20 random features.</p>
Figure 6. <p>Correlation coefficients obtained when mapping increasing sets of random features to <math display="inline"><semantics> <mrow> <mi>cos</mi> <mi>θ</mi> </mrow> </semantics></math> using linear regression. For the experimental comparison, an estimate of <math display="inline"><semantics> <mrow> <mi>cos</mi> <mi>θ</mi> </mrow> </semantics></math> is used, whereas in simulation, the known value of <math display="inline"><semantics> <mrow> <mi>cos</mi> <mi>θ</mi> </mrow> </semantics></math> is used. Error bars are obtained by running experiments/simulations for several iterations and using different sets of random weights for the construction of the random features.</p>
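The regression behind this figure can be mimicked on synthetic data. The sketch below is a toy model, not the paper's reservoir: the way cos θ enters the neural responses, and all sizes and random values, are assumptions. Increasing sets of random features are mapped to cos θ by least squares, and the correlation coefficient of the fit is reported.

```python
import numpy as np

rng = np.random.default_rng(3)

T_steps, N, P = 500, 20, 20
theta = np.cumsum(rng.normal(scale=0.05, size=T_steps))  # slowly drifting phase
target = np.cos(theta)

# Toy neural responses: input-driven noise plus a component whose strength
# follows cos(theta); the fixed random per-neuron coupling is an assumption.
sensitivity = rng.normal(size=N)
X = rng.normal(size=(T_steps, N)) + 0.5 * target[:, None] * sensitivity
Y_R = X @ rng.normal(size=(N, P))   # P random output features

# Least-squares map from increasing sets of random features to cos(theta).
correlations = {}
for P_used in (1, 5, 20):
    coef, *_ = np.linalg.lstsq(Y_R[:, :P_used], target, rcond=None)
    correlations[P_used] = np.corrcoef(Y_R[:, :P_used] @ coef, target)[0, 1]
    print(f"P = {P_used:2d}: correlation = {correlations[P_used]:.3f}")
```

In this toy setting, as in the figure, larger sets of random features generally track cos θ more faithfully.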
Figure 7. <p>Experimental and simulated memory capacity of the reservoir computer when the number of random output features used is increased from 0 to 20. The stacked vertical bars are color-coded to represent (from the bottom up) the total memory capacities of degrees 1 (dark blue), 2 (red), 3 (orange), 4 (purple) and 5 (green) and all higher degrees combined (light blue). As such, the total height represents the total memory capacity of the system.</p>
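Degree-1 memory capacity can be estimated as in the sketch below, which uses a generic tanh echo-state network as a stand-in for the photonic reservoir (all sizes and weights are assumptions). Each term is the squared correlation between the best linear readout and a delayed copy of the i.i.d. input, summed over delays.

```python
import numpy as np

rng = np.random.default_rng(4)

T_steps, N = 2000, 20
u = rng.uniform(-1, 1, size=T_steps)   # i.i.d. input sequence

# Toy echo-state reservoir standing in for the photonic system (assumption).
W = rng.normal(scale=0.3, size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius < 1
w_in = rng.normal(size=N)
X = np.zeros((T_steps, N))
for t in range(1, T_steps):
    X[t] = np.tanh(W @ X[t - 1] + w_in * u[t])

# Degree-1 memory capacity: sum over delays k of the squared correlation
# between the best linear readout and the input delayed by k steps.
mc_total, max_delay, washout = 0.0, 30, 100
for k in range(1, max_delay + 1):
    Xk, yk = X[washout:, :], u[washout - k:T_steps - k]
    coef, *_ = np.linalg.lstsq(Xk, yk, rcond=None)
    mc_total += np.corrcoef(Xk @ coef, yk)[0, 1] ** 2
print(f"total degree-1 memory capacity ~ {mc_total:.2f} (bounded by N = {N})")
```

Higher-degree capacities, as stacked in the figure, are obtained the same way by regressing onto Legendre polynomials of the delayed inputs rather than the delayed inputs themselves.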
Figure 8. <p>Experimental and simulated results on the 4-level channel equalization benchmark task. The symbol error rate is reported as a function of the number of random features that are used to tune the task-related readout weights. Error bars are obtained by running experiments/simulations for several iterations and using different sets of random weights for the construction of the random features.</p>
Figure 9. <p>Additional simulated results on the 4-level channel equalization benchmark task. The symbol error rate is reported as a function of the total number of readout weights <math display="inline"><semantics> <msup> <mi>N</mi> <mo>′</mo> </msup> </semantics></math> that must be optimized. Different curves represent reservoirs with different numbers of neurons <span class="html-italic">N</span>, ranging from 10 to 40. For each system, the number <span class="html-italic">P</span> of random features that are used in the output expansion is varied as 0, 1, 2, 5 and 10 from left to right, which affects <math display="inline"><semantics> <mrow> <msup> <mi>N</mi> <mo>′</mo> </msup> <mo>=</mo> <mi>N</mi> <mrow> <mo>(</mo> <mi>P</mi> <mo>+</mo> <mn>1</mn> <mo>)</mo> </mrow> </mrow> </semantics></math>. Error bars are obtained by running experiments/simulations for several iterations and using different sets of random weights for the construction of the random features.</p>
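For reference, the channel of this benchmark, in its standard formulation from the reservoir-computing literature (Jaeger and Haas, 2004), can be generated as below. The noise level and the memoryless nearest-symbol decision used to compute a baseline symbol error rate are assumptions for illustration; the reservoir equalizer in the paper must do substantially better than this baseline.

```python
import numpy as np

rng = np.random.default_rng(5)

symbols = np.array([-3.0, -1.0, 1.0, 3.0])
d = rng.choice(symbols, size=5000)   # transmitted 4-level symbol sequence

# Linear inter-symbol interference: q(n) mixes d(n+2) down to d(n-7),
# following the standard tap values of the benchmark.
taps = np.array([0.08, -0.12, 1.0, 0.18, -0.1, 0.091, -0.05, 0.04, 0.03, 0.01])
q = np.convolve(d, taps)[2:2 + d.size]   # index 2 aligns q(n) with d(n)

# Memoryless nonlinearity plus additive noise (noise level is an assumption).
u = q + 0.036 * q**2 - 0.011 * q**3 + rng.normal(scale=0.05, size=d.size)

# Baseline symbol error rate from a memoryless nearest-symbol decision on u(n);
# the equalizer's task is to recover d(n) from u(n) with a lower error rate.
decisions = symbols[np.argmin(np.abs(u[:, None] - symbols[None, :]), axis=1)]
ser = np.mean(decisions != d)
print(f"baseline symbol error rate: {ser:.3f}")
```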
Abstract
1. Introduction
2. Materials and Methods
2.1. Reservoir Computing with Output Layer Expansion
2.2. Output Expansion with First and Second Degree Polynomials
2.3. Slow Noise and Feature Dependent Weights
2.4. Setup
3. Results
3.1. Ability of Random Features to Capture Parameter Variations
3.2. Memory Capacity
3.3. Nonlinear Channel Equalization Task
4. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Appendix A. Analysis of Phase-Drift in Simple Delay-Based Reservoir Computer
Appendix B. Construction of Slow Features
References
- Maass, W.; Natschläger, T.; Markram, H. Real-time computing without stable states: A new framework for neural computation based on perturbations. Neural Comput. 2002, 14, 2531–2560.
- Jaeger, H.; Haas, H. Harnessing nonlinearity: Predicting chaotic systems and saving energy in wireless communication. Science 2004, 304, 78–80.
- Verstraeten, D.; Schrauwen, B.; d’Haene, M.; Stroobandt, D. An experimental unification of reservoir computing methods. Neural Netw. 2007, 20, 391–403.
- Appeltant, L.; Soriano, M.C.; Van der Sande, G.; Danckaert, J.; Massar, S.; Dambre, J.; Schrauwen, B.; Mirasso, C.R.; Fischer, I. Information processing using a single dynamical node as complex system. Nat. Commun. 2011, 2, 468.
- Paquot, Y.; Duport, F.; Smerieri, A.; Dambre, J.; Schrauwen, B.; Haelterman, M.; Massar, S. Optoelectronic reservoir computing. Sci. Rep. 2012, 2, 287.
- Larger, L.; Soriano, M.C.; Brunner, D.; Appeltant, L.; Gutiérrez, J.M.; Pesquera, L.; Mirasso, C.R.; Fischer, I. Photonic information processing beyond Turing: An optoelectronic implementation of reservoir computing. Opt. Express 2012, 20, 3241–3249.
- Duport, F.; Schneider, B.; Smerieri, A.; Haelterman, M.; Massar, S. All-optical reservoir computing. Opt. Express 2012, 20, 22783–22795.
- Brunner, D.; Soriano, M.C.; Mirasso, C.R.; Fischer, I. Parallel photonic information processing at gigabyte per second data rates using transient states. Nat. Commun. 2013, 4, 1364.
- Vinckier, Q.; Duport, F.; Smerieri, A.; Vandoorne, K.; Bienstman, P.; Haelterman, M.; Massar, S. High-performance photonic reservoir computer based on a coherently driven passive cavity. Optica 2015, 2, 438–446.
- Duport, F.; Smerieri, A.; Akrout, A.; Haelterman, M.; Massar, S. Fully analogue photonic reservoir computer. Sci. Rep. 2016, 6, 22381.
- Larger, L.; Baylón-Fuentes, A.; Martinenghi, R.; Udaltsov, V.S.; Chembo, Y.K.; Jacquot, M. High-speed photonic reservoir computing using a time-delay-based architecture: Million words per second classification. Phys. Rev. X 2017, 7, 011015.
- Pauwels, J.; Verschaffelt, G.; Massar, S.; Van der Sande, G. Distributed Kerr Non-linearity in a Coherent All-Optical Fiber-Ring Reservoir Computer. Front. Phys. 2019, 7, 138.
- Vandoorne, K.; Dambre, J.; Verstraeten, D.; Schrauwen, B.; Bienstman, P. Parallel reservoir computing using optical amplifiers. IEEE Trans. Neural Netw. 2011, 22, 1469–1481.
- Vandoorne, K.; Mechet, P.; Van Vaerenbergh, T.; Fiers, M.; Morthier, G.; Verstraeten, D.; Schrauwen, B.; Dambre, J.; Bienstman, P. Experimental demonstration of reservoir computing on a silicon photonics chip. Nat. Commun. 2014, 5, 3541.
- Bueno, J.; Maktoobi, S.; Froehly, L.; Fischer, I.; Jacquot, M.; Larger, L.; Brunner, D. Reinforcement learning in a large-scale photonic recurrent neural network. Optica 2018, 5, 756–760.
- Katumba, A.; Heyvaert, J.; Schneider, B.; Uvin, S.; Dambre, J.; Bienstman, P. Low-loss photonic reservoir computing with multimode photonic integrated circuits. Sci. Rep. 2018, 8, 2653.
- Harkhoe, K.; Van der Sande, G. Dual-mode semiconductor lasers in reservoir computing. In Neuro-Inspired Photonic Computing; International Society for Optics and Photonics: Bellingham, WA, USA, 2018; Volume 10689, p. 106890B.
- Mesaritakis, C.; Syvridis, D. Reservoir computing based on transverse modes in a single optical waveguide. Opt. Lett. 2019, 44, 1218–1221.
- Sunada, S.; Kanno, K.; Uchida, A. Using multidimensional speckle dynamics for high-speed, large-scale, parallel photonic computing. Opt. Express 2020, 28, 30349–30361.
- Paudel, U.; Luengo-Kovac, M.; Shaw, T.J.; Valley, G.C. Optical reservoir computer using speckle in a multimode waveguide. In AI and Optical Data Sciences; Jalali, B., Kitayama, K., Eds.; International Society for Optics and Photonics, SPIE: Bellingham, WA, USA, 2020; Volume 11299, pp. 19–24.
- Rafayelyan, M.; Dong, J.; Tan, Y.; Krzakala, F.; Gigan, S. Large-scale optical reservoir computing for spatiotemporal chaotic systems prediction. Phys. Rev. X 2020, 10, 041037.
- Van der Sande, G.; Brunner, D.; Soriano, M.C. Advances in photonic reservoir computing. Nanophotonics 2017, 6, 561–576.
- Wyffels, F.; Schrauwen, B.; Stroobandt, D. Stable output feedback in reservoir computing using ridge regression. In International Conference on Artificial Neural Networks; Springer: Berlin/Heidelberg, Germany, 2008; pp. 808–817.
- Soriano, M.C.; Ortín, S.; Brunner, D.; Larger, L.; Mirasso, C.R.; Fischer, I.; Pesquera, L. Optoelectronic reservoir computing: Tackling noise-induced performance degradation. Opt. Express 2013, 21, 12–20.
- Alata, R.; Pauwels, J.; Haelterman, M.; Massar, S. Phase noise robustness of a coherent spatially parallel optical reservoir. IEEE J. Sel. Top. Quantum Electron. 2019, 26, 1–10.
- Wiskott, L.; Sejnowski, T.J. Slow feature analysis: Unsupervised learning of invariances. Neural Comput. 2002, 14, 715–770.
- Jaeger, H. Short Term Memory in Echo State Networks; GMD-Forschungszentrum Informationstechnik, 2001; Volume 5. Available online: http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.720.3974 (accessed on 15 July 2021).
- Dambre, J.; Verstraeten, D.; Schrauwen, B.; Massar, S. Information processing capacity of dynamical systems. Sci. Rep. 2012, 2, 514.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Pauwels, J.; Van der Sande, G.; Verschaffelt, G.; Massar, S. Photonic Reservoir Computer with Output Expansion for Unsupervized Parameter Drift Compensation. Entropy 2021, 23, 955. https://doi.org/10.3390/e23080955