
Книга: Quasi-Newton Method: Maxima and Minima, Newton's Method in Optimization, Stationary Point, Hessian Matrix, Gradient, Argonne National Laboratory, Positive-Definite Matrix

Item no. 10196255
Weight: 0.340 kg
Year of publication: 2010
Pages: 196; Binding: Paperback

High Quality Content by WIKIPEDIA articles! In optimization, quasi-Newton methods (also known as variable metric methods) are well-known algorithms for finding local maxima and minima of functions. Quasi-Newton methods are based on Newton's method for finding a stationary point of a function, i.e. a point where the gradient is 0. Newton's method assumes that the function can be locally approximated as a quadratic in the region around the optimum, and uses the first and second derivatives (gradient and Hessian) to find the stationary point. In quasi-Newton methods the Hessian matrix of second derivatives of the function to be minimized does not need to be computed; instead, the Hessian estimate is updated by analyzing successive gradient vectors. Quasi-Newton methods generalize the secant method, which finds the root of the first derivative, to multidimensional problems. In multiple dimensions the secant equation is under-determined, and quasi-Newton methods differ in how they constrain the solution, typically by adding a simple low-rank update to the current estimate of the Hessian.
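The update scheme described above can be sketched in code. The following is a minimal illustration (not taken from the book) of one popular quasi-Newton method, BFGS: the inverse-Hessian estimate `H` is built purely from successive gradient differences via a rank-two update that satisfies the secant equation, so no second derivatives are ever evaluated. The backtracking line search and the tolerances are illustrative choices, not part of any particular reference implementation.

```python
import numpy as np

def bfgs_minimize(f, grad, x0, tol=1e-8, max_iter=500):
    """Minimal BFGS sketch: H approximates the inverse Hessian and is
    updated from gradient differences only (no second derivatives)."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                      # initial inverse-Hessian estimate
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                     # quasi-Newton search direction
        t = 1.0                        # backtracking (Armijo) line search
        while f(x + t * p) > f(x) + 1e-4 * t * (g @ p):
            t *= 0.5
        s = t * p                      # step  s_k = x_{k+1} - x_k
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g                  # gradient change y_k
        sy = s @ y
        if sy > 1e-12:                 # curvature condition keeps H positive definite
            rho = 1.0 / sy
            I = np.eye(n)
            # rank-two BFGS update of the inverse Hessian; the updated H
            # satisfies the secant equation  H @ y == s
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# usage: minimize the Rosenbrock function from a standard starting point
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
x_star = bfgs_minimize(f, grad, np.array([-1.2, 1.0]))
```

Note how the constraint on the under-determined secant equation appears here as the specific rank-two form of the update; other quasi-Newton methods (DFP, SR1) choose a different low-rank correction.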
