Continuous Subgradient Method

  • Chapter in: Convex Optimization with Computational Errors

Part of the book series: Springer Optimization and Its Applications (SOIA, volume 155)

Abstract

In this chapter we study the continuous subgradient algorithm for minimizing convex nonsmooth functions and for computing the saddle points of convex–concave functions in the presence of computational errors. The problem is described by an objective function and a set of feasible points. Each step of the algorithm requires two calculations: a subgradient of the objective function and a projection onto the feasible set. Each of these calculations carries a computational error produced by the computer system, and in general the two errors are different. We show that the algorithm generates a good approximate solution whenever all the computational errors are bounded from above by a small positive constant. Moreover, if the errors of the two calculations are known, we determine what approximate solution can be obtained and how much time is needed to obtain it.
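
To make the error model concrete, here is a minimal numerical sketch in Python. It is an illustration, not the chapter's algorithm verbatim: a discretized subgradient projection iteration in which both the subgradient evaluation and the projection are perturbed by random vectors whose norms are bounded by small constants. The objective f(x) = ||x - a||_1, the feasible Euclidean ball, the step size h, and the error bounds delta_s and delta_p are all illustrative assumptions.

```python
import numpy as np

# A discretized subgradient projection iteration with two sources of
# computational error: one in the subgradient evaluation and one in the
# projection onto the feasible set.

def f(x, a):
    # Convex nonsmooth objective: f(x) = ||x - a||_1.
    return np.abs(x - a).sum()

def subgradient(x, a):
    # A subgradient of ||x - a||_1: the sign vector (0 is admissible at kinks).
    return np.sign(x - a)

def project_ball(x, radius):
    # Exact Euclidean projection onto C = {x : ||x||_2 <= radius}.
    n = np.linalg.norm(x)
    return x if n <= radius else (radius / n) * x

def bounded_error(delta, dim, rng):
    # A random vector with norm at most delta, modelling a computational
    # error bounded from above by the small positive constant delta.
    u = rng.standard_normal(dim)
    return delta * rng.random() * u / np.linalg.norm(u)

rng = np.random.default_rng(0)
a = np.array([1.0, -0.5, 0.25])  # target; lies inside the feasible ball
radius = 2.0                     # feasible set: Euclidean ball of this radius
delta_s, delta_p = 1e-3, 1e-3    # error bounds for the two calculations
h = 0.05                         # step size (discretizing the continuous flow)

x = np.zeros(3)
for _ in range(2000):
    # Calculation 1: inexact subgradient of the objective.
    v = subgradient(x, a) + bounded_error(delta_s, 3, rng)
    # Calculation 2: inexact projection onto the feasible set.
    x = project_ball(x - h * v, radius) + bounded_error(delta_p, 3, rng)

print("approximate minimizer:", x)
print("objective value: %.4f" % f(x, a))
```

Note that the error added after the projection step may push the iterate slightly outside the feasible set; this is consistent with treating the projection itself as an inexact computation, as the chapter's error model does.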

Copyright information

© 2020 Springer Nature Switzerland AG

About this chapter

Cite this chapter

Zaslavski, A.J. (2020). Continuous Subgradient Method. In: Convex Optimization with Computational Errors. Springer Optimization and Its Applications, vol 155. Springer, Cham. https://doi.org/10.1007/978-3-030-37822-6_6
