A new global optimization method for univariate constrained twice-differentiable NLP problems
DOI: 10.1007/s10898-006-9121-1
zbMATH: 1156.65057
OpenAlex: W1978390448
MaRDI QID: Q946347
Young Cheol Park, Tai-Yong Lee, Min Ho Chang
Publication date: 23 September 2008
Published in: Journal of Global Optimization
Full work available at URL: https://doi.org/10.1007/s10898-006-9121-1
Keywords: global optimization; numerical examples; nonconvex programming; nonlinear programming; convex cut function; difference of convex underestimator; index branch-and-bound algorithm
MSC classification: Numerical mathematical programming methods (65K05); Polyhedral combinatorics, branch-and-bound, branch-and-cut (90C57); Nonconvex programming, global optimization (90C26); Nonlinear programming (90C30)
Related Items (1)
Cites Work
- Global optimization of univariate Lipschitz functions. II: New algorithms and computational comparison
- Global one-dimensional optimization using smooth auxiliary functions
- On using estimates of Lipschitz constants in global optimization
- Index information algorithm with local tuning for solving multidimensional global optimization problems with multiextremal constraints
- Computational experience with a new class of convex underestimators: Box-constrained NLP problems
- Trigonometric convex underestimator for the base functions in Fourier space
- A deterministic algorithm for global optimization
- Rigorous convex underestimators for general twice-differentiable problems
- Outer approximation algorithm for nondifferentiable optimization problems
- Index branch-and-bound algorithm for Lipschitz univariate global optimization with multiextremal constraints
- Trilinear monomials with mixed sign domains: Facets of the convex and concave envelopes
- \(\alpha BB\): A global optimization method for general constrained nonconvex problems
- Sequential and parallel algorithms for global minimizing functions with Lipschitzian derivatives
- A new class of improved convex underestimators for twice continuously differentiable constrained NLPs
- Convex underestimation of twice continuously differentiable functions by piecewise quadratic perturbation: spline \(\alpha\)BB underestimators
- Iterative Methods for the Localization of the Global Maximum
- A Fortran 90 environment for research and prototyping of enclosure algorithms for nonlinear equations and global optimization
- An algorithm for finding the absolute extremum of a function
- Quasiconvex relaxations based on interval arithmetic