A Phase Transition for Finding Needles in Nonlinear Haystacks with LASSO Artificial Neural Networks

Publication Type
Journal Article
Publication Year
2022
Authors
Ma, Xiaoyu
Sardy, Sylvain
Hengartner, Nick
Bobenko, Nikolai
Lin, Yen Ting
Abstract

To fit sparse linear associations, a LASSO sparsity-inducing penalty with a single hyperparameter provably allows recovery of the important features (needles) with high probability in certain regimes, even if the sample size is smaller than the dimension of the input vector (haystack). More recently, learners known as artificial neural networks (ANN) have shown great success in many machine learning tasks, in particular in fitting nonlinear associations. Small learning rates, stochastic gradient descent, and large training sets help cope with the explosion in the number of parameters present in deep neural networks, yet few ANN learners have been developed and studied to find needles in nonlinear haystacks. Driven by a single hyperparameter, the authors’ ANN learner exhibits, as in the sparse linear case, a phase transition in the probability of retrieving the needles, which is not observed with other ANN learners. To select the penalty parameter, the authors generalize the universal threshold of Donoho and Johnstone (Biometrika 81(3):425–455, 1994), a rule that improves on cross-validation, which is both conservative (too many false detections) and expensive. In the spirit of simulated annealing, the authors propose a warm-start sparsity-inducing algorithm to solve the high-dimensional, non-convex, and non-differentiable optimization problem. Monte Carlo experiments on simulated and real data quantify the effectiveness of the approach.
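For context, the classical universal threshold of Donoho and Johnstone for the Gaussian sequence model is λ = σ√(2 log n), where σ is the noise level and n the sample size; the paper generalizes this zero-thresholding rule to the ANN setting. The sketch below is a minimal, hypothetical illustration of the two ingredients named in the abstract, not the authors' exact method: an L1 (LASSO) penalty on the input-layer weights of an ANN to induce feature sparsity, and a warm-start loop over a sequence of penalty values in the spirit of annealing. The names (SparseInputMLP, soft_threshold, fit_path), the architecture, and the proximal-gradient update are assumptions made for illustration.

```python
# Minimal sketch (assumed, illustrative): L1-penalized ANN trained along a
# penalty path with warm starts, using a proximal (soft-threshold) step.
import torch
import torch.nn as nn

def soft_threshold(w, thresh):
    """Proximal operator of the L1 norm: sign(w) * max(|w| - thresh, 0)."""
    return torch.sign(w) * torch.clamp(w.abs() - thresh, min=0.0)

class SparseInputMLP(nn.Module):
    """One-hidden-layer network; the L1 penalty acts on the input layer,
    so zeroed input-weight columns correspond to discarded features."""
    def __init__(self, p, hidden=32):
        super().__init__()
        self.input_layer = nn.Linear(p, hidden)
        self.head = nn.Sequential(nn.ReLU(), nn.Linear(hidden, 1))

    def forward(self, x):
        return self.head(self.input_layer(x)).squeeze(-1)

def fit_path(X, y, lambda_grid, lr=1e-2, epochs=200):
    """Warm start along the penalty path: each value of lam starts from the
    previous solution, loosely in the annealing spirit of the abstract."""
    model = SparseInputMLP(X.shape[1])
    fits = []
    for lam in lambda_grid:
        opt = torch.optim.SGD(model.parameters(), lr=lr)
        for _ in range(epochs):
            opt.zero_grad()
            loss = nn.functional.mse_loss(model(X), y)
            loss.backward()
            opt.step()
            with torch.no_grad():  # proximal step on the penalized weights
                w = model.input_layer.weight
                w.copy_(soft_threshold(w, lr * lam))
        # Features whose input-weight column is entirely zero are dropped;
        # the remaining nonzero columns are the retained "needles".
        support = model.input_layer.weight.abs().sum(dim=0) > 0
        fits.append((lam, support.clone()))
    return model, fits
```

A typical call might sweep lambda_grid = torch.logspace(1, -2, 10) from strong to weak penalization and keep the fit at the penalty value chosen by the selection rule in use; in the paper that rule is the generalized universal threshold rather than cross-validation.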

Citation
Date
Issue
6
Volume
32
Publication Title
Statistics and Computing
ISSN
1573-1375
DOI
10.1007/s11222-022-10169-0
Publication Tags
Manual Tags
JDACS4C
Automatic Tags
model selection
neural networks
phase transition
sparsity
universal threshold