
Two Matrix-Type Projection Neural Networks for Matrix-Valued Optimization with Application to Image Restoration

Neural Processing Letters

Abstract

In recent years, matrix-valued optimization algorithms have been studied to improve the computational performance of vector-valued optimization algorithms. This paper presents two matrix-type projection neural networks, one continuous-time and one discrete-time, for solving matrix-valued optimization problems. The proposed continuous-time neural network may be viewed as a significant extension of the vector-type double projection neural network. More importantly, the proposed discrete-time projection neural network is suitable for parallel implementation over matrix state spaces. Under pseudo-monotonicity and Lipschitz continuity conditions, both proposed matrix-type projection neural networks are guaranteed to converge globally to the optimal solution. Finally, the proposed matrix-type projection neural networks are applied effectively to image restoration. Computed examples show that the two proposed matrix-type projection neural networks are substantially faster than vector-type projection neural networks.
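For readers unfamiliar with projection-type networks, the following is a minimal sketch, in Python/NumPy, of the discrete-time projection idea the abstract refers to: a box-constrained matrix-valued problem, minimize f(X) subject to L <= X <= U, is solved by repeatedly taking a gradient step and projecting back onto the constraint set. This is an illustration under stated assumptions, not the authors' exact model; the quadratic objective, step-size rule, and stopping criterion below are chosen only for the example.

```python
import numpy as np


def project_box(X, L, U):
    # Projection onto the constraint set {X : L <= X <= U}, taken elementwise.
    return np.clip(X, L, U)


def matrix_projection_iteration(grad_f, X0, L, U, step, tol=1e-8, max_iter=5000):
    # Discrete-time projection iteration X_{k+1} = P_Omega(X_k - step * grad_f(X_k)),
    # stopped when the Frobenius norm of the update falls below tol.
    X = project_box(X0, L, U)
    for _ in range(max_iter):
        X_next = project_box(X - step * grad_f(X), L, U)
        if np.linalg.norm(X_next - X, ord="fro") < tol:
            return X_next
        X = X_next
    return X


if __name__ == "__main__":
    # Toy example (assumed for illustration): minimize 0.5 * ||A X - B||_F^2
    # subject to 0 <= X <= 1, where X is a matrix variable.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((8, 5))
    B = rng.standard_normal((8, 4))

    def grad(X):
        # Gradient of 0.5 * ||A X - B||_F^2 with respect to X.
        return A.T @ (A @ X - B)

    step = 1.0 / np.linalg.norm(A, 2) ** 2  # below 2 / Lipschitz constant of grad
    X_star = matrix_projection_iteration(grad, np.zeros((5, 4)), 0.0, 1.0, step)
    print("constraint satisfied:", bool(np.all((X_star >= 0) & (X_star <= 1))))
```

The iteration operates directly on the matrix variable X rather than on its vectorization, which is the property that makes matrix-type formulations attractive for parallel, elementwise implementation.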




Author information

Corresponding author: Youshen Xia.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This work is supported by the National Natural Science Foundation of China under Grant No. 61473330.


About this article


Cite this article

Huang, L., Xia, Y., Huang, L. et al. Two Matrix-Type Projection Neural Networks for Matrix-Valued Optimization with Application to Image Restoration. Neural Process Lett 53, 1685–1707 (2021). https://doi.org/10.1007/s11063-019-10086-w


