Alexander Genkin

Regularization, Sparsity, and Logistic Regression
This talk will include:
  • Introductory example of regularization and sparsity: 3D reconstruction of neural circuits
  • Logistic regression; classification and class probability estimation; applications
  • L2 regularization; ridge regression; decorrelation and shrinkage; bias-variance trade-off
  • Ridge logistic regression
  • Sparsity and feature selection; the lasso and its properties
  • Lasso logistic regression
  • Comparing ridge and lasso logistic regression; elastic net (see the first sketch after this list)
  • Homotopy methods; LARS; its logistic-regression analogue
  • Bregman regularization path
  • Feature selection consistency; consistency conditions for the lasso; non-convex penalties (SCAD, MCP)
  • Prediction accuracy estimation; covariance penalty; Efron’s conditional bootstrap and its connection to cross-validation
  • Search for the optimal regularization parameter; cross-validation and its properties
  • Automated CV search (see the second sketch after this list)
  • Other sparsity methods: group lasso; graphical lasso
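
To make the ridge/lasso contrast above concrete, here is a minimal sketch, not part of the talk materials: it fits L2- and L1-penalized logistic regression on synthetic data with scikit-learn and counts nonzero coefficients. The dataset sizes and C values are illustrative assumptions.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression

    # Synthetic data: 50 features, only 5 of which are informative.
    X, y = make_classification(n_samples=200, n_features=50,
                               n_informative=5, random_state=0)

    # Ridge (L2) logistic regression: shrinks all coefficients toward zero.
    ridge = LogisticRegression(penalty="l2", C=1.0).fit(X, y)

    # Lasso (L1) logistic regression: drives many coefficients exactly to zero,
    # performing feature selection along the way.
    lasso = LogisticRegression(penalty="l1", solver="liblinear", C=1.0).fit(X, y)

    print("nonzero coefficients, ridge:", int(np.sum(ridge.coef_ != 0)))
    print("nonzero coefficients, lasso:", int(np.sum(lasso.coef_ != 0)))

The L2 fit typically keeps all 50 coefficients nonzero but small, while the L1 fit zeroes out most of them: the sparsity-versus-shrinkage distinction the talk develops.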
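
A companion sketch, again an illustrative assumption rather than the speaker’s code, for the regularization-parameter search: scikit-learn’s LogisticRegressionCV picks the inverse regularization strength C by cross-validation over a grid.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegressionCV

    X, y = make_classification(n_samples=200, n_features=50,
                               n_informative=5, random_state=0)

    # 5-fold cross-validation over a logarithmic grid of C values
    # for an L1-penalized logistic regression.
    clf = LogisticRegressionCV(Cs=np.logspace(-3, 3, 13), cv=5,
                               penalty="l1", solver="liblinear")
    clf.fit(X, y)

    print("selected C:", clf.C_[0])
    print("nonzero coefficients:", int(np.sum(clf.coef_ != 0)))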