Continuous Generalized Gradient Descent



Continuous generalized gradient descent (CGGD) is a method for computing trajectories in high-dimensional parameter spaces for variable selection and prediction in regression models. Examples include proportional gradient shrinkage as an extension of the LASSO and LARS, threshold gradient descent with right-continuous variable selectors, threshold ridge regression, and many more obtained by suitable combinations of variable selectors and functional forms of a kernel. In all these problems, the generalized gradient descent trajectories are continuous piecewise analytic vector-valued curves arising as solutions to matrix differential equations. The algorithm is monotone and convergent in the loss or negative log-likelihood function.
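As a rough illustration of one member of this family, the sketch below approximates a threshold gradient descent path for squared-error loss by small Euler steps: at each step only the coefficients whose gradient magnitude is within a factor tau of the largest are updated. This is an illustrative sketch, not the cggd package's implementation; all names, defaults, and the discretization are assumptions.

```python
import numpy as np

def tgd_path(X, y, tau=0.9, dt=0.01, n_steps=500):
    """Illustrative threshold gradient descent for squared-error loss.

    Approximates a continuous descent trajectory by Euler steps in which
    a right-continuous variable selector h picks the coordinates whose
    gradient magnitude is within a factor `tau` of the largest.
    Names and defaults are illustrative, not taken from the cggd package.
    """
    n, p = X.shape
    beta = np.zeros(p)
    path = [beta.copy()]
    for _ in range(n_steps):
        g = X.T @ (y - X @ beta) / n   # negative gradient of 0.5 * MSE
        # selector: 1 for coordinates near the maximal gradient magnitude
        h = (np.abs(g) >= tau * np.abs(g).max()).astype(float)
        beta = beta + dt * h * g       # small step along selected coordinates
        path.append(beta.copy())
    return np.array(path)              # (n_steps + 1) x p trajectory
```

With tau close to 1 the selector updates only the most correlated coordinates, giving sparse, stagewise-like paths; with tau = 0 every coordinate is selected and the path reduces to plain gradient descent. For small enough step size dt, the loss decreases monotonically along the path, mirroring the monotone convergence property stated above.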

The paper: Cun-Hui Zhang, "Continuous Generalized Gradient Descent," Journal of Computational and Graphical Statistics, 2007.

An R package implementing CGGD, written by Ofer Melnik and Cun-Hui Zhang, is available from CRAN.
The authors take no responsibility, stated or implied, for the contents, use, applicability, or results of the CGGD software package.