Power Magnetic Devices
Author: Scott D. Sudhoff
Publisher: John Wiley & Sons Limited
ISBN: 9781119674634
on the case where f(x) ∈ ℝ and x ∈ ℝn. Algorithms to solve this problem include gradient methods, Newton’s method, conjugate direction methods, quasi‐Newton methods, and the Nelder–Mead simplex method, to name a few. Let us focus on Newton’s method as being somewhat representative.

      In order to set the stage for Newton’s method, let us first define some operators. The first derivative or gradient of our objective function is denoted ∇f(x) and is defined as

      (1.3-1) ∇f(x) = [∂f/∂x₁ ∂f/∂x₂ ⋯ ∂f/∂xₙ]ᵀ

      The second derivative or Hessian of f(x), denoted F(x), is defined as

      (1.3-2) F(x) = [∂²f/∂xᵢ∂xⱼ], i, j = 1, …, n
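When closed-form derivatives are inconvenient, the gradient (1.3-1) and Hessian (1.3-2) may be approximated numerically by central differences. The sketch below is illustrative and not from the text; the function names, step sizes, and the test function f are assumptions.

```python
import numpy as np

def num_gradient(f, x, h=1e-6):
    """Central-difference approximation of the gradient (1.3-1)."""
    x = np.asarray(x, dtype=float)
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

def num_hessian(f, x, h=1e-4):
    """Central-difference approximation of the Hessian (1.3-2)."""
    x = np.asarray(x, dtype=float)
    n = x.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = h
            ej = np.zeros(n); ej[j] = h
            # Mixed second difference for the (i, j) entry
            H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                       - f(x - ei + ej) + f(x - ei - ej)) / (4 * h * h)
    return H

# An assumed test function (not from the text): f(x) = x1^2 + 3 x1 x2
f = lambda x: x[0]**2 + 3 * x[0] * x[1]
```

For this f, the exact gradient at x = (1, 2) is (8, 3) and the exact Hessian is [[2, 3], [3, 0]], which the approximations recover to within the truncation error of the differences.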

      If x* is a local minimizer of f, and if x* is in the interior of Ω, it can be shown that ∇f(x*) = 0 and that F(x*) is positive semidefinite; these are the first- and second-order necessary conditions for a local minimizer.

      At this point, we are poised to set forth Newton’s method of finding function minimizers. This method is iterative and is based on a kth estimate of the solution denoted x[k]. An update formula is then applied to generate a (hopefully) improved estimate x[k + 1] of the minimizer. The update formula is derived by first approximating f as a second-order Taylor series about the current estimated solution x[k]. In particular,

      f(x) ≈ f(x[k]) + ∇fᵀ(x[k])(x − x[k]) + ½ (x − x[k])ᵀ F(x[k]) (x − x[k])

      Setting the gradient of this quadratic approximation with respect to x to zero yields the update formula

      x[k + 1] = x[k] − F⁻¹(x[k]) ∇f(x[k])

      which is Newton’s method.

      Clearly, Newton’s method requires f ∈ C², which is to say that f is in the set of twice continuously differentiable functions. Note that the selection of the initial solution, x[1], can have a significant impact on which (if any) local solution is found. Also note that the method is equally likely to yield a local maximizer as a local minimizer, since the update merely seeks a point where ∇f(x) = 0.
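The update x[k + 1] = x[k] − F⁻¹(x[k])∇f(x[k]) can be sketched in a few lines of Python. This is a minimal sketch rather than the author's code; the function names, tolerance, and iteration cap are assumptions, and the linear system is solved directly rather than forming the Hessian inverse explicitly.

```python
import numpy as np

def newton_minimize(grad, hess, x0, tol=1e-8, max_iter=50):
    """Iterate x[k+1] = x[k] - F(x[k])^{-1} grad f(x[k]) until the
    gradient is (nearly) zero.  grad and hess are callables returning
    the gradient (1.3-1) and Hessian (1.3-2) of f at a point x."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break  # stationary point found: grad f(x) ~ 0
        # Solve F(x[k]) d = -grad f(x[k]) instead of inverting F
        x = x + np.linalg.solve(hess(x), -g)
    return x
```

For a quadratic f the Hessian is constant and a single update lands exactly on the stationary point; for general f ∈ C², convergence (when it occurs) is quadratic near the solution. Note that the stopping test ∇f ≈ 0 does not distinguish minimizers from maximizers or saddle points, consistent with the caveat above.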

      Example 1.3A

      Let us apply Newton’s method to find the minimizer of the function

      (1.3A-2)equation

      and

      (1.3A-3)equation



      k   x[k]      f(x[k])   ∇f(x[k])   F(x[k])
      1   [image]   43        [image]    [image]
      2   [image]   14.2      [image]    [image]
      3   [image]   9.25      [image]    [image]
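The vector and matrix entries of the table above survive only as image placeholders, so the book's iterates cannot be reproduced here. As a substitute illustration (using an assumed function, not the text's Example 1.3A), the run below traces f(x[k]) over several Newton iterations for f(x) = x₁⁴ + x₂², whose minimizer is x* = (0, 0):

```python
import numpy as np

# An assumed illustrative function (not the book's Example 1.3A):
#   f(x) = x1^4 + x2^2,  minimized at x* = (0, 0)
f    = lambda x: x[0]**4 + x[1]**2
grad = lambda x: np.array([4 * x[0]**3, 2 * x[1]])
hess = lambda x: np.array([[12 * x[0]**2, 0.0], [0.0, 2.0]])

x = np.array([1.0, 1.0])      # initial estimate x[1]
trace = [f(x)]
for _ in range(5):
    # Newton update: x[k+1] = x[k] - F(x[k])^{-1} grad f(x[k])
    x = x + np.linalg.solve(hess(x), -grad(x))
    trace.append(f(x))
```

For this f each update scales x₁ by 2/3, so x₂ converges in one step but x₁ only linearly: the Hessian is singular at the minimizer (the x₁⁴ term has zero curvature there), which defeats the quadratic convergence Newton's method normally enjoys.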