Optimizers
This chapter describes optimizers implemented in oneDAL.
The Newton-CG optimizer iteratively minimizes a convex function using its gradient and a Hessian-product operator.
Mathematical Formulation
Computing
The Newton-CG optimizer, also known as the Hessian-free optimizer, minimizes convex functions without computing the Hessian matrix explicitly. Instead, it uses a Hessian-product operator. In the Newton method, the descent direction is computed as $d_k = -H_k^{-1} g_k$, where $g_k$ and $H_k$ are the gradient and Hessian matrix of the loss function at the $k$-th iteration. The Newton-CG method uses the Conjugate Gradient solver to find an approximate solution to the equation $H_k d_k = -g_k$. This solver can solve a system of linear equations $Ax = b$ taking the vector $b$ and a functor that computes the matrix-vector product $Av$ as input.
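The following C++ sketch is illustrative only and does not use the oneDAL API: it solves $H_k d_k = -g_k$ for a toy quadratic function with a matrix-free Conjugate Gradient loop, accessing the Hessian only through a Hessian-vector product functor, which is the core idea of Newton-CG. The names newton_cg_direction and hessp are hypothetical, introduced only for this example.

// Minimal sketch of the Newton-CG descent-direction computation.
// The Hessian is never formed; it is accessed only through hessp(v) = H * v.
#include <cstddef>
#include <cstdio>
#include <functional>
#include <vector>

using vec = std::vector<double>;

double dot(const vec& a, const vec& b) {
    double s = 0.0;
    for (std::size_t i = 0; i < a.size(); ++i) s += a[i] * b[i];
    return s;
}

// Solve H d = -g with Conjugate Gradients, given a Hessian-vector product functor.
vec newton_cg_direction(const std::function<vec(const vec&)>& hessp,
                        const vec& g, int max_iter = 100, double tol = 1e-10) {
    const std::size_t n = g.size();
    vec d(n, 0.0);                                     // current approximation of the direction
    vec r(n), p(n);
    for (std::size_t i = 0; i < n; ++i) r[i] = -g[i];  // residual for the start point d = 0
    p = r;
    double rs_old = dot(r, r);
    for (int k = 0; k < max_iter && rs_old > tol; ++k) {
        vec hp = hessp(p);                             // the only access to the Hessian
        double alpha = rs_old / dot(p, hp);
        for (std::size_t i = 0; i < n; ++i) {
            d[i] += alpha * p[i];
            r[i] -= alpha * hp[i];
        }
        double rs_new = dot(r, r);
        for (std::size_t i = 0; i < n; ++i) p[i] = r[i] + (rs_new / rs_old) * p[i];
        rs_old = rs_new;
    }
    return d;
}

int main() {
    // Toy quadratic f(x) = 0.5 x^T H x with a fixed 2x2 Hessian,
    // so hessp(v) = H v and the gradient at x is H x.
    const double H[2][2] = {{4.0, 1.0}, {1.0, 3.0}};
    auto hessp = [&](const vec& v) {
        return vec{H[0][0] * v[0] + H[0][1] * v[1],
                   H[1][0] * v[0] + H[1][1] * v[1]};
    };
    vec x = {2.0, 1.0};
    vec g = hessp(x);                                  // gradient of the quadratic at x
    vec d = newton_cg_direction(hessp, g);             // Newton step: solves H d = -g
    std::printf("descent direction: (%f, %f)\n", d[0], d[1]);
    return 0;
}

For this quadratic the exact Newton step is d = -x, so the program prints (-2, -1); the point is that the direction is obtained from Hessian-vector products alone, without ever materializing the Hessian matrix.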
Computation Method: dense
This method defines the Newton-CG optimizer that other algorithms use for convex optimization. There is no separate computation mode for minimizing a function directly.
Programming Interface
Refer to API Reference: Newton-CG optimizer.