Elastic Net
Elastic Net is a method for modeling the relationship between a dependent variable (which may be a vector) and one or more explanatory variables by fitting a regularized least squares model. The Elastic Net regression model has a special penalty, a sum of $L_1$ and $L_2$ regularizations, that takes advantage of both the Ridge Regression and LASSO algorithms. This penalty is particularly useful in situations with many correlated predictor variables [Friedman2010].
Details
Let $(x_1, \ldots, x_p)$ be a vector of input variables and $y = (y_1, \ldots, y_k)$ be the response. For each $j = 1, \ldots, k$, the Elastic Net model has a form similar to the linear and ridge regression models [Hoerl70] with one exception: the coefficients are estimated by minimizing a mean squared error (MSE) objective function that is regularized by $L_1$ and $L_2$ penalties:

$$y_j = \beta_{0j} + x_1 \beta_{1j} + \ldots + x_p \beta_{pj}$$

Here $x_i$, $i = 1, \ldots, p$, are referred to as independent variables, and $y_j$, $j = 1, \ldots, k$, is referred to as the dependent variable, or response.
Training Stage
Let $(x_{i1}, \ldots, x_{ip}, y_{i1}, \ldots, y_{ik})$, $i = 1, \ldots, n$, be a set of training data (for the regression task, $n \gg p$; for feature selection, $p$ could be greater than $n$). The matrix $X$ of size $n \times p$ contains observations $x_{ij}$, $i = 1, \ldots, n$, $j = 1, \ldots, p$, of independent variables.
For each $y_j$, $j = 1, \ldots, k$, the Elastic Net regression estimates $(\beta_{0j}, \beta_{1j}, \ldots, \beta_{pj})$ by minimizing the objective function:

$$E(\beta_j) = \frac{1}{2n} \sum_{i=1}^{n} \left(y_{ij} - \beta_{0j} - \sum_{q=1}^{p} \beta_{qj} x_{iq}\right)^2 + \lambda_{1j} \sum_{q=1}^{p} |\beta_{qj}| + \frac{\lambda_{2j}}{2} \sum_{q=1}^{p} \beta_{qj}^2$$

In the equation above, the first term is a mean squared error function, and the second and third are regularization terms that penalize the $L_1$ and $L_2$ norms of the vector $\beta_j$, where $\lambda_{1j} \geq 0$, $\lambda_{2j} \geq 0$, $j = 1, \ldots, k$.
For more details, see [Hastie2009] and [Friedman2010].
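To illustrate the objective, the sketch below evaluates it directly with NumPy for a single response ($k = 1$). It is only an illustration, not the library's implementation; the function and parameter names (elastic_net_objective, lam1, lam2, beta0, beta) are hypothetical.

```python
import numpy as np

def elastic_net_objective(X, y, beta0, beta, lam1, lam2):
    """Elastic Net objective for one response:
    (1/2n) * sum of squared residuals + lam1 * L1 penalty + (lam2/2) * L2 penalty."""
    n = X.shape[0]
    residual = y - (beta0 + X @ beta)       # y_i - beta_0 - sum_q beta_q * x_iq
    mse_term = (residual @ residual) / (2.0 * n)
    l1_term = lam1 * np.abs(beta).sum()     # penalizes the L1 norm of beta
    l2_term = 0.5 * lam2 * (beta @ beta)    # penalizes the L2 norm of beta
    return mse_term + l1_term + l2_term
```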
By default, the Coordinate Descent iterative solver is used to minimize the objective function. The SAGA solver is also applicable for the minimization.
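For intuition about how coordinate descent applies to this objective, here is a simplified, single-response cyclic coordinate descent sketch. It is a generic textbook variant, not the library's solver; the names (elastic_net_cd, soft_threshold, lam1, lam2) and the stopping rule are assumptions made for the example.

```python
import numpy as np

def soft_threshold(z, gamma):
    """Soft-thresholding operator arising from the L1 part of the update."""
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

def elastic_net_cd(X, y, lam1, lam2, n_iters=100, tol=1e-6):
    """Cyclic coordinate descent for one response.
    Minimizes (1/2n)*||y - b0 - X b||^2 + lam1*||b||_1 + (lam2/2)*||b||^2."""
    n, p = X.shape
    beta = np.zeros(p)
    beta0 = y.mean()
    col_sq = (X ** 2).sum(axis=0) / n            # (1/n) * x_q^T x_q per column
    for _ in range(n_iters):
        beta_old = beta.copy()
        residual = y - beta0 - X @ beta
        for q in range(p):
            residual += X[:, q] * beta[q]        # partial residual excluding coordinate q
            rho = X[:, q] @ residual / n
            beta[q] = soft_threshold(rho, lam1) / (col_sq[q] + lam2)
            residual -= X[:, q] * beta[q]        # restore full residual with updated beta_q
        beta0 = (y - X @ beta).mean()            # re-fit the intercept
        if np.max(np.abs(beta - beta_old)) < tol:
            break
    return beta0, beta
```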
Prediction Stage
Prediction based on Elastic Net regression is done for an input vector of independent variables $(x_1, \ldots, x_p)$ using the equation $y_j = \beta_{0j} + x_1 \beta_{1j} + \ldots + x_p \beta_{pj}$ for each $j = 1, \ldots, k$.
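As a minimal illustration, the prediction equation is a single affine transform of the input. The sketch below assumes the beta0 and beta arrays produced by the training sketch above; the function name elastic_net_predict is hypothetical.

```python
import numpy as np

def elastic_net_predict(X_new, beta0, beta):
    """Predicted response: y = beta_0 + x_1*beta_1 + ... + x_p*beta_p for each row of X_new."""
    return beta0 + X_new @ beta
```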