Pages that link to "Item:Q877590"
From MaRDI portal
The following pages link to Newton's method and its use in optimization (Q877590):
Displaying 33 items.
- A multi-layer line search method to improve the initialization of optimization algorithms (Q320070)
- Efficient optimal eighth-order derivative-free methods for nonlinear equations (Q351456)
- Accelerating the convergence in the single-source and multi-source Weber problems (Q428088)
- Split Newton iterative algorithm and its application (Q606762)
- Nonmonotone adaptive trust region method (Q621656)
- Geometrically constructed families of Newton's method for unconstrained optimization and nonlinear equations (Q638098)
- Categorizing with catastrophic radii in numerical minimization (Q829871)
- Zonotopes and the LP-Newton method (Q833464)
- A generalization of Müller's iteration method based on standard information (Q937186)
- Newton waveform relaxation method for solving algebraic nonlinear equations (Q945285)
- Two-step relaxation Newton algorithm for solving nonlinear algebraic equations (Q980444)
- Some modifications of King's family with optimal eighth order of convergence (Q1931065)
- A generalized univariate Newton method motivated by proximal regularization (Q1935256)
- Extended Newton-type method for nonlinear functions with values in a cone (Q1993494)
- On the bang-bang control approach via a component-wise line search strategy for unconstrained optimization (Q2061320)
- On Newton's method for solving generalized equations (Q2099273)
- A generalized multivariable Newton method (Q2138447)
- A generalized Newton method for a class of discrete-time linear complementarity systems (Q2184087)
- Multiview attenuation estimation and correction (Q2320441)
- Laplace approximation and natural gradient for Gaussian process regression with heteroscedastic Student-\(t\) model (Q2329797)
- Improvements of the Newton-Raphson method (Q2668036)
- What, if anything, is new in optimization? (Q2760091)
- Blind deconvolution by a Newton method on the non-unitary hypersphere (Q2847023)
- A modification of the Newton method from a viewpoint of statistical testing methods (Q3988303)
- (Q5224221)
- Kantorovich's Theorem on Newton's Method for Solving Strongly Regular Generalized Equation (Q5737734)
- Optimal data splitting in distributed optimization for machine learning (Q6124406)
- Super-Universal Regularized Newton Method (Q6136654)
- Newton's method for interval-valued multiobjective optimization problem (Q6189860)
- On the inversion-free Newton's method and its applications (Q6612368)
- Developing a new conjugate gradient algorithm with the benefit of some desirable properties of the Newton algorithm for unconstrained optimization (Q6615082)
- On the complexity of extending the convergence domain of Newton's method under the weak majorant condition (Q6618694)
- Integrating differential evolution into gazelle optimization for advanced global optimization and engineering applications (Q6669080)