Bundle Enrichment Method for Nonsmooth Difference of Convex Programming Problems
Figure: Performance profiles using the number of function evaluations (20 starting points).
Figure: Performance profiles using the number of subgradient evaluations (20 starting points).
Figure: Performance profiles using CPU time (20 starting points).
Abstract
1. Introduction
- The first category consists of the DCA and its modifications. The basic idea of the DCA is to linearize the concave part around the current iterate by using its subgradient, while keeping the convex part as it is in the minimization process (a minimal sketch of one such iteration is given after this list). To improve the convergence of the DCA, various modifications have been developed, for instance, in [21,22,23]. The boosted DC algorithm (BDCA), proposed in [21,22], accelerates the convergence of the DCA by using an additional line search step. The inertial DCA, introduced in [23], defines trial points whose sequence of functional values is not necessarily monotonically decreasing; this helps prevent the algorithm from converging to a critical point that is not d-stationary. In [24], the BDCA is combined with a simple derivative-free optimization method, which makes it possible to enforce d-stationarity (the lack of a descent direction) at the obtained point. To avoid the difficulty of solving the DCA’s subproblem, in [25], the first DC component is replaced with a convex model, and the second DC component is used without any approximation;
- The methods in the second category, which we refer to as DC-Bundle, are various extensions of the bundle methods for convex problems. The piecewise linear underestimates or subgradients of both DC components are utilized to compute search directions. The methods include the codifferential method [26], the proximal bundle method with the concave affine model [27,28], the proximal bundle method for DC optimization [29,30], the proximal point algorithm [31], the proximal linearized algorithm [32], the double bundle method [33], and the nonlinearly constrained DC bundle method [34];
- The methods in the third category are those that use the convex piecewise linear model of the first DC component and one subgradient of the second DC component to compute search directions at each iteration [35,36]. They differ from those in the first category as at each iteration of these methods, the model of the first DC component is updated and the new subgradient of the second component is calculated. They also differ from those in the second category as they use only one subgradient of the second DC component at each iteration whereas the DC-Bundle methods may use more than one subgradient of this component to build its piecewise linear model. Note that some overlapping is unavoidable when classifying methods into different categories: for instance, the proximal bundle method for DC optimization [30] may be classified into both the second and the third categories. The other methods in the third category include the aggregate subgradient method for DC optimization [35] and the augmented subgradient method [36]. In the former method, the aggregate subgradient of the first DC component and one subgradient of the second component are utilized to compute search directions. In the latter method, augmented subgradients of convex functions are defined and used to model the first DC component. Then, this model and one subgradient of the second DC component are used for computing search directions.
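As a concrete illustration of the first category, the following is a minimal sketch of a generic DCA iteration: the second DC component f2 is linearized at the current point via one of its subgradients, and the resulting convex subproblem is solved. The oracles subgrad_f2 and argmin_linearized are illustrative placeholders and do not reproduce any specific implementation cited above.

```python
import numpy as np

def dca(x0, subgrad_f2, argmin_linearized, max_iter=100, tol=1e-6):
    """Generic DCA loop for minimizing f(x) = f1(x) - f2(x), with f1, f2 convex.

    subgrad_f2(x) must return one subgradient of f2 at x; argmin_linearized(g2)
    must return a minimizer of the convex subproblem x -> f1(x) - g2 @ x, i.e.,
    f1 is kept as it is while f2 is replaced by its linearization at the
    current iterate.  Both oracles are user-supplied.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g2 = subgrad_f2(x)             # one subgradient of the second DC component
        x_new = argmin_linearized(g2)  # solve the convexified subproblem
        if np.linalg.norm(x_new - x) <= tol * (1.0 + np.linalg.norm(x)):
            return x_new               # successive iterates (almost) coincide: critical point
        x = x_new
    return x
```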
- developing a model function based on cutting-plane approximations of f1 and, possibly, f2;
- adding a regularization term (typically a proximity term) to the model for stabilization purposes; a generic instance of both ingredients is sketched right after this list.
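In standard bundle notation (which need not coincide with the notation adopted later in the paper), these two ingredients can be illustrated as follows. Given a bundle of points x_i with subgradients g_i of f1, the cutting-plane model of f1 and the proximally stabilized subproblem around the current stability center x_k read

\[
\check f_1(x) \;=\; \max_{i \in B_1}\bigl\{ f_1(x_i) + g_i^{\top}(x - x_i) \bigr\},
\qquad g_i \in \partial f_1(x_i),
\]
\[
x^{+} \;\in\; \arg\min_{x}\; \check f_1(x) \;-\; \check f_2(x) \;+\; \tfrac{1}{2t}\,\lVert x - x_k \rVert^{2},
\]

where B_1 indexes the bundle of f1, \(\check f_2\) is an analogous (possibly single-linearization) cutting-plane model of f2, and t > 0 is the proximity parameter.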
- The information coming from the previous iterates contributes to the approximation of exactly one of the DC components f1 and f2, depending on the sign of the linearization error relative to the current iterate. Every time a new point is generated, subgradients of both component functions f1 and f2 are calculated, but only one of them enters the calculation of the search direction; the choice is driven, at each of the successive iterations, by the sign of the linearization error with respect to the current point, and this sign may change from one iteration to the next. We therefore obtain a dynamic bundling strategy, which implies a substantial reduction (approximately one-half) of the size of the auxiliary problems to be solved (a schematic sketch of this routing step follows this list);
- If the displacement suggested by the model minimization provides no sufficient decrease of the objective function (a null step in bundle parlance), then only the cutting-plane approximation of the second DC component f2 is temporarily enriched.
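The routing step just described (information entering the approximation of exactly one DC component, according to the sign of the linearization error) can be sketched as follows. The precise definition of the linearization error and the sign convention that selects the bundle are those of the paper; the test below is only an illustrative placeholder, with illustrative data structures.

```python
import numpy as np

def route_bundle_info(x_center, f_center, y, f1_y, f2_y, g1_y, g2_y, B1, B2):
    """Schematic bundle routing driven by the sign of a linearization error.

    Only one of the two subgradients g1_y (of f1 at y) and g2_y (of f2 at y)
    is stored; the sign test below stands in for the paper's rule and does not
    reproduce its exact definition.
    """
    f_y, g_y = f1_y - f2_y, g1_y - g2_y
    # linearization error of f at y with respect to the current stability center
    alpha = f_center - (f_y + g_y @ (x_center - y))
    if alpha >= 0.0:
        B1.append((y - x_center, g1_y, alpha))    # enrich the model of f1 only
    else:
        B2.append((y - x_center, g2_y, -alpha))   # enrich the model of f2 only
    return B1, B2
```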
2. Notations and Background
3. The New Model Function
4. The Proposed BEM-DC Algorithm
Algorithm 1: BEM-DC
- In BEM-DC, only one of the two subgradients gathered at any iteration j (one for the DC component f1 and one for f2) enters the calculation of the tentative displacement at each of the successive iterations; the choice is driven by the sign of the linearization error at the current iterate. This is not the case in PBDC;
- In BEM-DC, there exists a temporary bundle whose “birth and death” take place within the procedure for escaping from a null step, and the bundle size does not increase once a descent is achieved; PBDC, instead, adds more information to the bundles at each step and, thus, increases the bundle sizes at every iteration;
- In BEM-DC, a second direction is checked for descent before a null step is declared (see Remark 2), whereas only one candidate direction is used in each step of PBDC; a generic form of the underlying sufficient-decrease test is sketched below.
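For completeness, the serious/null step decision that both methods rely on can be written, in its generic bundle form, as a sufficient-decrease test. The threshold m and the interface below are illustrative and do not reproduce the exact tests of BEM-DC or PBDC.

```python
def is_serious_step(f, x_k, f_k, d, predicted_decrease, m=0.1):
    """Generic bundle-style sufficient-decrease test (illustrative form).

    The tentative displacement d is accepted (serious step) if the actual
    decrease of f at x_k + d is at least a fraction m of the decrease
    predicted by the model; otherwise a null step is declared and the
    cutting-plane model is enriched before a new direction is computed.
    """
    return f(x_k + d) <= f_k - m * predicted_decrease
```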
5. Termination Property of Algorithm BEM-DC
6. Numerical Experiments
6.1. Solvers and Parameters
- Augmented subgradient method for nonsmooth DC optimization (ASM-DC) [36];
- Aggregate subgradient method for nonsmooth DC optimization (AggSub) [35];
- Proximal bundle method for nonsmooth DC optimization (PBDC) [29];
- Proximal bundle method for nonsmooth DC programming (PBMDC) [30];
- Sequential DC programming with cutting-plane models (SDCA-LP) [25].
6.2. Evaluation Measures and Notations
- Prob. is the label of the problem;
- n is the number of variables;
- nf is the number of function evaluations for the objective function f;
- nsub is the number of subgradient evaluations, defined as the average of the numbers of subgradient evaluations for the DC components f1 and f2;
- CPU is the computational time in seconds;
- f_A is the best value of the objective function obtained by algorithm A. (A sketch of how the performance profiles in the figures are built from these measures follows below.)
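The performance profiles reported in the figures follow the approach of Dolan and Moré (see the reference list). The following is a minimal sketch of how such profiles can be computed from the measures above (nf, nsub, or CPU time); the function name and interface are illustrative.

```python
import numpy as np

def performance_profile(costs, taus):
    """Dolan-More performance profiles from an (n_problems x n_solvers) cost matrix.

    costs[p, s] is the cost (e.g., nf, nsub, or CPU time) of solver s on
    problem p, with np.inf marking a failure.  Returns rho[s, k], the fraction
    of problems on which solver s is within a factor taus[k] of the best solver.
    """
    costs = np.asarray(costs, dtype=float)
    best = np.min(costs, axis=1, keepdims=True)   # best cost per problem
    ratios = costs / best                         # performance ratios
    rho = np.array([[np.mean(ratios[:, s] <= tau) for tau in taus]
                    for s in range(costs.shape[1])])
    return rho
```

For instance, performance_profile(costs, np.linspace(1, 10, 50)) yields, for each solver, the curve plotted against the factor tau.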
6.3. Results
6.3.1. Results with One Starting Point
6.3.2. Results with 20 Starting Points
6.3.3. Results for Large-Scale Problems
7. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Bertsekas, D.P. Nonlinear programming. In Theoretical Solutions Manual, 3rd ed.; Athena Scientific: Nashua, NH, USA, 2016. [Google Scholar]
- Hiriart-Urruty, J.B. Generalized differentiability/duality and optimization for problems dealing with differences of convex functions. In Lecture Notes in Economics and Mathematical Systems; Springer: Berlin/Heidelberg, Germany, 1986; Volume 256, pp. 37–70. [Google Scholar]
- Strekalovsky, A.S. Global optimality conditions for nonconvex optimization. J. Glob. Optim. 1998, 12, 415–434. [Google Scholar] [CrossRef]
- Strekalovsky, A.S. On a Global Search in D.C. Optimization Problems. In Optimization and Applications; Springer: Cham, Switzerland, 2020; pp. 222–236. [Google Scholar]
- Strekalovsky, A.S. Local Search for Nonsmooth DC Optimization with DC Equality and Inequality Constraints. In Numerical Nonsmooth Optimization; Springer: Cham, Switzerland, 2020; pp. 229–261. [Google Scholar]
- de Oliveira, W. The ABC of DC programming. Set-Valued Var. Anal. 2020, 28, 679–706. [Google Scholar] [CrossRef]
- Horst, R.; Thoai, N.V. DC programming: Overview. J. Optim. Theory Appl. 1999, 103, 1–43. [Google Scholar] [CrossRef]
- An, L.T.H.; Tao, P.D. The DC (difference of convex functions) programming and DCA revisited with DC models of real world nonconvex optimization problems. Ann. Oper. Res. 2005, 133, 23–46. [Google Scholar] [CrossRef]
- Holmberg, K.; Tuy, H. A production-transportation problem with stochastic demand and concave production costs. Math. Program. 1999, 85, 157–179. [Google Scholar] [CrossRef]
- Pey-Chun, C.; Hansen, P.; Jaumard, B.; Tuy, H. Solution of the multisource Weber and conditional Weber problems by DC programming. Oper. Res. 1998, 46, 548–562. [Google Scholar] [CrossRef]
- Khalaf, W.; Astorino, A.; D’Alessandro, P.; Gaudioso, M. A DC optimization-based clustering technique for edge detection. Optim. Lett. 2017, 11, 627–640. [Google Scholar] [CrossRef]
- Sun, X.K. Regularity conditions characterizing Fenchel–Lagrange duality and Farkas-type results in DC infinite programming. J. Math. Anal. Appl. 2014, 414, 590–611. [Google Scholar] [CrossRef]
- Bagirov, A.M.; Karmitsa, N.; Taheri, S. Partitional Clustering via Nonsmooth Optimization; Springer: Berlin/Heidelberg, Germany, 2020. [Google Scholar]
- Bagirov, A.M.; Taheri, S.; Cimen, E. Incremental DC Optimization Algorithm for Large-Scale Clusterwise Linear Regression. J. Comput. Appl. Math. 2021, 389, 113323. [Google Scholar] [CrossRef]
- Sun, X.; Teo, K.L.; Zeng, J.; Liu, L. Robust approximate optimal solutions for nonlinear semi-infinite programming with uncertainty. Optimization 2020, 69, 2109–2129. [Google Scholar] [CrossRef]
- Sun, X.; Tan, W.; Teo, K.L. Characterizing a Class of Robust Vector Polynomial Optimization via Sum of Squares Conditions. J. Optim. Theory Appl. 2023, 197, 737–764. [Google Scholar] [CrossRef]
- Sun, X.; Teo, K.L.; Long, X.J. Some characterizations of approximate solutions for robust semi-infinite optimization problems. J. Optim. Theory Appl. 2021, 191, 281–310. [Google Scholar] [CrossRef]
- Tuy, H. Convex Analysis and Global Optimization; Kluwer: Dordrecht, The Netherlands, 1998. [Google Scholar]
- Tao, P.D.; Souad, E.B. Algorithms for solving a class of nonconvex optimization problems: Methods of subgradient. North-Holl. Math. Stud. 1986, 129, 249–271. [Google Scholar]
- An, L.T.H.; Tao, P.D.; Ngai, H.V. Exact penalty and error bounds in DC programming. J. Glob. Optim. 2012, 52, 509–535. [Google Scholar]
- Artacho, F.J.A.; Fleming, R.M.T.; Vuong, P.T. Accelerating the DC algorithm for smooth functions. Math. Program. 2018, 169, 95–118. [Google Scholar] [CrossRef]
- Artacho, F.J.A.; Vuong, P.T. The Boosted Difference of Convex Functions Algorithm for Nonsmooth Functions. SIAM J. Optim. 2020, 30, 980–1006. [Google Scholar] [CrossRef]
- de Oliveira, W.; Tcheou, M.P. An inertial algorithm for DC programming. Set-Valued Var. Anal. 2019, 27, 895–919. [Google Scholar] [CrossRef]
- Artacho, F.J.A.; Campoy, R.; Vuong, P.T. Using positive spanning sets to achieve d-stationarity with the boosted DC algorithm. Vietnam. J. Math. 2020, 48, 363–376. [Google Scholar] [CrossRef]
- de Oliveira, W. Sequential difference-of-convex programming. J. Optim. Theory Appl. 2020, 186, 936–959. [Google Scholar] [CrossRef]
- Dolgopolik, M.V. A convergence analysis of the method of codifferential descent. Comput. Optim. Appl. 2018, 71, 879–913. [Google Scholar] [CrossRef]
- Gaudioso, M.; Giallombardo, G.; Miglionico, G. Minimizing piecewise-concave functions over polytopes. Math. Oper. Res. 2018, 43, 580–597. [Google Scholar] [CrossRef]
- Gaudioso, M.; Giallombardo, G.; Miglionico, G.; Bagirov, A.M. Minimizing nonsmooth DC functions via successive DC piecewise-affine approximations. J. Glob. Optim. 2018, 71, 37–55. [Google Scholar] [CrossRef]
- Joki, K.; Bagirov, A.M.; Karmitsa, N.; Mäkelä, M.M. A proximal bundle method for nonsmooth DC optimization utilizing nonconvex cutting planes. J. Glob. Optim. 2017, 68, 501–535. [Google Scholar] [CrossRef]
- de Oliveira, W. Proximal bundle methods for nonsmooth DC programming. J. Glob. Optim. 2019, 75, 523–563. [Google Scholar] [CrossRef]
- Sun, W.Y.; Sampaio, R.J.B.; Candido, M.A.B. Proximal point algorithm for minimization of DC functions. J. Comput. Math. 2003, 21, 451–462. [Google Scholar]
- Souza, J.C.O.; Oliveira, P.R.; Soubeyran, A. Global convergence of a proximal linearized algorithm for difference of convex functions. Optim. Lett. 2016, 10, 1529–1539. [Google Scholar] [CrossRef]
- Joki, K.; Bagirov, A.M.; Karmitsa, N.; Mäkelä, M.M.; Taheri, S. Double bundle method for finding Clarke stationary points in nonsmooth DC programming. SIAM J. Optim. 2018, 28, 1892–1919. [Google Scholar] [CrossRef]
- Ackooij, W.; Demassey, S.; Javal, P.; Morais, H.; de Oliveira, W.; Swaminathan, B. A bundle method for nonsmooth DC programming with application to chance-constrained problems. Comput. Optim. Appl. 2021, 78, 451–490. [Google Scholar] [CrossRef]
- Bagirov, A.M.; Taheri, S.; Joki, K.; Karmitsa, N.; Mäkelä, M.M. Aggregate subgradient method for nonsmooth DC optimization. Optim. Lett. 2020, 15, 83–96. [Google Scholar] [CrossRef]
- Bagirov, A.M.; Hoseini, M.N.; Taheri, S. An augmented subgradient method for minimizing nonsmooth DC functions. Comput. Optim. Appl. 2021, 80, 411–438. [Google Scholar] [CrossRef]
- Astorino, A.; Frangioni, A.; Gaudioso, M.; Gorgone, E. Piecewise quadratic approximations in convex numerical optimization. SIAM J. Optim. 2011, 21, 1418–1438. [Google Scholar] [CrossRef]
- Bagirov, A.M.; Gaudioso, M.; Karmitsa, N.; Mäkelä, M.M.; Taheri, S. (Eds.) Numerical Nonsmooth Optimization, State of the Art Algorithms; Springer: Berlin/Heidelberg, Germany, 2020. [Google Scholar]
- Gaudioso, M.; Monaco, M.F. Variants to the cutting plane approach for convex nondifferentiable optimization. Optimization 1992, 25, 65–75. [Google Scholar] [CrossRef]
- Hiriart-Urruty, J.B.; Lemaréchal, C. Convex Analysis and Minimization Algorithms; Springer: Berlin, Germany, 1993; Volumes I and II. [Google Scholar]
- Bagirov, A.M.; Karmitsa, N.; Mäkelä, M.M. Introduction to Nonsmooth Optimization: Theory, Practice and Software; Springer: New York, NY, USA, 2014. [Google Scholar]
- Mäkelä, M.M.; Neittaanmäki, P. Nonsmooth Optimization; World Scientific: Singapore, 1992. [Google Scholar]
- Clarke, F.H. Optimization and Nonsmooth Analysis; John Wiley & Sons: New York, NY, USA, 1983. [Google Scholar]
- Demyanov, V.F.; Vasilev, L.V. Nondifferentiable Optimization; Springer: New York, NY, USA, 1985. [Google Scholar]
- Polyak, B.T. Minimization of unsmooth functionals. USSR Comput. Math. Math. Phys. 1969, 9, 14–29. [Google Scholar] [CrossRef]
- Kiwiel, K.C. Proximity control in bundle methods for convex nondifferentiable minimization. Math. Program. 1990, 46, 105–122. [Google Scholar] [CrossRef]
- Dolan, E.D.; Moré, J.J. Benchmarking optimization software with performance profiles. Math. Program. 2002, 91, 201–213. [Google Scholar] [CrossRef]
- Gould, N.; Scott, J. A note on performance profiles for benchmarking software. ACM Trans. Math. Softw. 2016, 43, 1–5. [Google Scholar] [CrossRef]
Prob. | n | BEM-DC | ASM-DC | AggSub | PBDC | DCA | PBMDC | SDCA-LP |
---|---|---|---|---|---|---|---|---|
1 | 2 | 2.00000 | 2.00000 | 2.00000 | 2.00000 | 2.00000 | 2.00000 | – |
2 | 2 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | 1.00000 | 0.00000 | 0.00000 |
3 | 4 | 0.00000 | 0.00001 | 0.00001 | 0.00000 | 0.00000 | 0.00000 | 2.00000 |
4 | 2 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | 0.00000 |
4 | 5 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | 0.00000 |
4 | 10 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | 0.00000 |
4 | 50 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | 0.00000 |
4 | 100 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | 0.00000 |
4 | 200 | 0.00000 | – | – | 0.00000 | – | 0.00000 | 0.00000 |
5 | 2 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | 0.00000 |
5 | 5 | 0.00000 | 0.00000 | – | 0.00000 | 0.00000 | 0.00000 | 0.00000 |
5 | 10 | 0.00001 | 0.00000 | – | 0.00000 | 0.00000 | 0.00000 | 0.00001 |
5 | 50 | 0.00004 | 0.00001 | – | 0.00000 | 0.00011 | 0.00000 | 0.00002 |
5 | 100 | 0.00000 | 0.00001 | – | 0.00000 | 0.00031 | 0.00000 | 0.00007 |
5 | 200 | 0.00000 | 0.00001 | – | 0.00000 | 0.00003 | 0.00000 | 0.00008 |
6 | 2 | −2.50000 | −2.50000 | −2.50000 | −2.50000 | −2.50000 | −2.50000 | −2.50000 |
7 | 2 | 0.50000 | 0.50000 | 0.50000 | 0.50000 | 1.00000 | – | 0.50000 |
8 | 3 | 3.50000 | 3.50000 | 3.50000 | 3.50000 | – | 3.50000 | – |
9 | 4 | 9.20000 | 9.20000 | 9.20000 | 1.83333 | 9.20000 | 1.83333 | – |
11 | 3 | 116.33333 | 116.33333 | 116.33377 | 116.33333 | 116.33333 | 116.33330 | 116.33330 |
12 | 2 | 0.61804 | 0.61804 | 0.61804 | 1.61803 | 0.61803 | – | 1.61809 |
12 | 5 | 0.61803 | 0.61804 | 0.61804 | 1.61803 | 0.61803 | – | 0.61803 |
12 | 10 | 0.61803 | 0.61804 | 0.61804 | 0.61803 | 0.61803 | – | – |
12 | 50 | 0.61803 | 0.61804 | 0.61804 | – | 0.61804 | – | – |
12 | 100 | 0.61803 | 0.61803 | 0.61804 | – | 0.61804 | – | – |
12 | 200 | 0.61803 | 0.61803 | 0.61804 | – | 0.61804 | – | – |
13 | 10 | 0.00000 | 0.00000 | 0.00000 | – | 0.00000 | – | – |
14 | 2 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | 0.00000 |
14 | 5 | 0.00001 | 0.00000 | – | 0.00000 | 0.00000 | 0.00001 | 0.00001 |
14 | 10 | 0.00000 | 0.00000 | – | 0.00000 | 0.00000 | 0.00001 | 0.00003 |
14 | 50 | 0.00002 | 0.00000 | – | 0.00000 | 0.00000 | 0.00002 | 0.00005 |
14 | 100 | 0.00007 | 0.00001 | – | 0.00000 | 0.00000 | – | 0.00006 |
14 | 200 | 0.00003 | 0.00003 | – | 0.00000 | 0.00000 | – | 0.00003 |
Prob. | n | BEM-DC nf | nsub | ASM-DC nf | nsub | AggSub nf | nsub | PBDC nf | nsub | DCA nf | nsub | PBMDC nf | nsub | SDCA-LP |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1 | 2 | 105 | 25 | 189 | 52 | 162 | 75 | 22 | 17 | 60 | 41 | 10 | 16 | – |
2 | 2 | 175 | 25 | 109 | 41 | 255 | 92 | 18 | 13 | 72 | 54 | 3 | 5 | 7 |
3 | 4 | 87 | 17 | 149 | 53 | 391 | 174 | 23 | 11 | 222 | 181 | 5 | 8 | 16 |
4 | 2 | 75 | 17 | 43 | 19 | 64 | 31 | 6 | 3 | 52 | 28 | 2 | 2 | 6 |
4 | 5 | 40 | 9 | 126 | 44 | 235 | 120 | 13 | 6 | 165 | 124 | 3 | 4 | 13 |
4 | 10 | 70 | 18 | 256 | 88 | 545 | 273 | 16 | 10 | 361 | 309 | 5 | 8 | 26 |
4 | 50 | 1865 | 874 | 3430 | 1124 | 3206 | 1597 | 52 | 32 | 2915 | 2807 | 25 | 38 | 103 |
4 | 100 | 9049 | 4379 | 10,878 | 3539 | 6824 | 3405 | 102 | 66 | 6312 | 6232 | 50 | 75 | 204 |
4 | 200 | 40,713 | 19,626 | – | – | – | – | 475 | 546 | – | – | 101 | 151 | 403 |
5 | 2 | 8 | 3 | 231 | 50 | 75 | 31 | 18 | 4 | 37 | 24 | 2 | 3 | 8 |
5 | 5 | 64 | 26 | 118 | 44 | – | – | 15 | 7 | 474 | 297 | 6 | 8 | 9 |
5 | 10 | 87 | 42 | 335 | 111 | – | – | 23 | 15 | 54,116 | 53,287 | 12 | 24 | 34 |
5 | 50 | 526 | 235 | 1413 | 457 | – | – | 135 | 124 | 247,813 | 246,256 | 17 | 24 | 55 |
5 | 100 | 141 | 71 | 1337 | 438 | – | – | 47 | 25 | 471,198 | 469,845 | 20 | 29 | 63 |
5 | 200 | 129 | 65 | 1885 | 594 | – | – | 108 | 52 | 462,821 | 461,267 | 18 | 28 | 66 |
6 | 2 | 39 | 10 | 135 | 43 | 105 | 55 | 22 | 14 | 41 | 33 | 17 | 24 | 29 |
7 | 2 | 303 | 48 | 388 | 111 | 285 | 107 | 72 | 47 | 3368 | 2471 | – | – | 49 |
8 | 3 | 125 | 49 | 271 | 75 | 176 | 88 | 60 | 32 | – | – | – | – | – |
9 | 4 | 9 | 3 | 199 | 63 | 169 | 80 | 86 | 72 | 63 | 56 | 17 | 25 | – |
11 | 3 | 146 | 28 | 174 | 61 | 258 | 126 | 10 | 7 | 18 | 17 | 5 | 8 | 14 |
12 | 2 | 130 | 26 | 290 | 79 | 127 | 64 | 20 | 15 | 61 | 60 | – | – | 37 |
12 | 5 | 202 | 52 | 457 | 122 | 213 | 103 | 58 | 39 | 67 | 65 | – | – | 696 |
12 | 10 | 393 | 105 | 831 | 257 | 359 | 170 | 1415 | 1257 | 76 | 74 | – | – | – |
12 | 50 | 739 | 156 | 3015 | 771 | 1411 | 694 | – | – | 99 | 97 | – | – | – |
12 | 100 | 1032 | 173 | 6998 | 2519 | 1239 | 907 | – | – | 117 | 115 | – | – | – |
12 | 200 | 2757 | 174 | 15,302 | 5090 | 2850 | 1397 | – | – | 338 | 337 | – | – | – |
13 | 10 | 145 | 21 | 26 | 13 | 70 | 38 | – | – | 20 | 19 | – | – | – |
14 | 2 | 19 | 4 | 50 | 23 | 71 | 34 | 9 | 5 | 56 | 54 | 11 | 16 | 29 |
14 | 5 | 145 | 26 | 206 | 73 | 226 | 100 | 109 | 105 | 63 | 61 | 22 | 31 | 58 |
14 | 10 | 249 | 59 | 357 | 107 | 322 | 155 | 6169 | 6151 | 61 | 60 | 34 | 50 | 107 |
14 | 50 | 1168 | 439 | 672 | 211 | 1653 | 790 | 991 | 966 | 16 | 14 | 59 | 89 | 280 |
14 | 100 | 2273 | 708 | 1779 | 549 | – | – | 780 | 740 | 18 | 16 | – | – | 552 |
14 | 200 | 2591 | 724 | 1746 | 541 | – | – | 983 | 961 | 28 | 26 | – | – | 619 |
n | 2 | 3 | 4 | 5 | 10 | 50 | 100 |
---|---|---|---|---|---|---|---|
1 starting point | 0.000 | 0.000 | 0.001 | 0.012 | 0.009 | 0.672 | 16.328 |
20 starting points | 0.000 | 0.000 | 0.002 | 0.018 | 0.616 | 14.613 | 42.461 |
n | BEM-DC | ASM-DC | AggSub | PBDC | DCA | PBMDC | SDCA-LP |
---|---|---|---|---|---|---|---|
200 | 0.00000 | 0.00006 | 0.00007 | 98.56721 | 153.29050 | − | − | |
500 | 0.00000 | 0.00009 | 0.00007 | 249.00000 | 539.99295 | − | − | |
1000 | 0.00000 | 0.00032 | 0.00032 | 499.00000 | 510.84581 | − | − | |
1500 | 0.00000 | 0.00072 | 0.00051 | 749.00000 | − | − | ||
2000 | 0.00000 | 0.00247 | 0.00083 | 999.00000 | − | − | ||
200 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | ||
500 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | ||
1000 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | 0.00001 | ||
1500 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | 0.00002 | ||
2000 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | 0.00000 | 0.00003 | ||
200 | 0.00004 | 0.00000 | 4.66526 | 0.00000 | 0.16124 | 0.00001 | 0.00005 | |
500 | 0.00001 | 0.00001 | 7.91275 | 0.00000 | 0.08352 | 0.00002 | 0.00015 | |
1000 | 0.00006 | 0.00037 | 17.32160 | 0.00000 | 0.05060 | 0.00001 | 0.00004 | |
1500 | 0.00002 | 0.00157 | 10.50161 | 0.00000 | 2.78735 | 0.00001 | 0.00011 | |
2000 | 0.00008 | 0.00056 | 12.63033 | 0.00000 | 26.17493 | 0.00002 | 0.00003 |
n | BEM-DC nf | nsub | ASM-DC nf | nsub | AggSub nf | nsub | PBDC nf | nsub | DCA nf | nsub | PBMDC nf | nsub | SDCA-LP |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
200 | 49 | 3 | 2328 | 628 | 1075 | 277 | 60 | 48 | 2508 | 2502 | − | − | − | |
500 | 94 | 3 | 2083 | 456 | 1400 | 364 | 146 | 115 | 2006 | 2002 | − | − | − | |
1000 | 158 | 3 | 2165 | 497 | 2127 | 607 | 50 | 37 | 3511 | 3503 | − | − | − | |
1500 | 218 | 4 | 2691 | 664 | 1734 | 479 | 102 | 80 | 1003 | 1001 | − | − | − | |
2000 | 244 | 4 | 3123 | 793 | 1299 | 631 | 49 | 34 | 4526 | 4515 | − | − | − | |
200 | 863 | 248 | 1835 | 593 | 1604 | 787 | 974 | 780 | 5010 | 5005 | 108 | 177 | 4509 | |
500 | 1430 | 424 | 2161 | 705 | 2342 | 1157 | 1355 | 1057 | 5010 | 5005 | 144 | 23 | 2264 | |
1000 | 3589 | 1057 | 3153 | 1041 | 2277 | 1125 | 1664 | 1416 | 5904 | 5896 | 219 | 360 | 1655 | |
1500 | 4293 | 1329 | 3966 | 1310 | 2722 | 1344 | 2044 | 1740 | 4158 | 4153 | 295 | 482 | 1509 | |
2000 | 6433 | 1817 | 4433 | 1466 | 2936 | 1459 | 2244 | 1914 | 5511 | 5505 | 299 | 497 | 955 | |
200 | 1989 | 536 | 1044 | 333 | 14639 | 7151 | 1928 | 1845 | 3006 | 3003 | 55 | 97 | 97 | |
500 | 4598 | 729 | 1444 | 469 | 12869 | 6252 | 2539 | 2513 | 3507 | 3503 | 58 | 99 | 3555 | |
1000 | 13,844 | 3006 | 3445 | 1107 | 21,456 | 10,555 | 2980 | 2958 | 3006 | 3003 | 73 | 126 | 370 | |
1500 | 12,384 | 2291 | 2408 | 730 | 41,096 | 20,239 | 2763 | 2747 | 1503 | 1501 | 69 | 121 | 1130 | |
2000 | 17,520 | 3156 | 4017 | 1283 | 34,758 | 17,149 | 3766 | 3741 | 1503 | 1501 | 73 | 127 | 1090 |