Abstract
In this paper, a two-step sparse learning approach is proposed for variable selection and model parameter estimation, with optimally tuned hyperparameters in each step. In Step one, a sparse learning algorithm is applied to all data to produce a sequence of candidate subsets of selected variables by varying the hyperparameter value. In Step two, for each subset of selected variables from Step one, Lasso, ridge regression, elastic-net, or adaptive Lasso is employed to find the hyperparameters that yield the best cross-validation error. Among all subsets, the one with the overall minimum cross-validation error is selected as globally optimal. The effectiveness of the proposed approach is demonstrated on an industrial NOx emission dataset and on the Dow challenge dataset for predicting product impurity.
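The following is a minimal sketch of the two-step procedure as described above, written with scikit-learn. The estimator choices, hyperparameter grids, and the function name `two_step_sparse_learning` are illustrative assumptions rather than the authors' implementation; adaptive Lasso is omitted here because scikit-learn does not provide it directly.

```python
# Sketch of the two-step sparse learning procedure (assumptions: Lasso for
# Step 1 subset generation; Lasso/ridge/elastic-net with CV for Step 2).
import numpy as np
from sklearn.linear_model import Lasso, LassoCV, RidgeCV, ElasticNetCV
from sklearn.model_selection import cross_val_score


def two_step_sparse_learning(X, y, step1_alphas=np.logspace(-3, 1, 30), cv=5):
    # Step 1: sweep the sparsity hyperparameter to generate candidate
    # subsets of selected variables (indices of nonzero Lasso coefficients).
    candidate_subsets = []
    for alpha in step1_alphas:
        coef = Lasso(alpha=alpha, max_iter=10_000).fit(X, y).coef_
        subset = tuple(np.flatnonzero(coef))
        if subset and subset not in candidate_subsets:
            candidate_subsets.append(subset)

    # Step 2: for each candidate subset, tune each second-stage estimator by
    # cross-validation and record its mean-squared CV error.
    best = {"cv_mse": np.inf, "subset": None, "model": None}
    second_stage = [
        LassoCV(cv=cv, max_iter=10_000),
        RidgeCV(alphas=np.logspace(-3, 3, 25)),
        ElasticNetCV(cv=cv, max_iter=10_000),
    ]
    for subset in candidate_subsets:
        X_sub = X[:, list(subset)]
        for est in second_stage:
            mse = -cross_val_score(
                est, X_sub, y, cv=cv, scoring="neg_mean_squared_error"
            ).mean()
            if mse < best["cv_mse"]:
                best = {"cv_mse": mse, "subset": subset,
                        "model": est.fit(X_sub, y)}

    # The subset/model pair with the overall minimum CV error is taken as
    # the globally optimal selection.
    return best
```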
| Original language | English |
|---|---|
| Pages (from-to) | 57-64 |
| Number of pages | 8 |
| Journal | IFAC-PapersOnLine |
| Volume | 55 |
| Issue number | 7 |
| Early online date | 5 Aug 2022 |
| DOIs | |
| Publication status | Published - 2022 |
| Externally published | Yes |
Bibliographical note
Publisher Copyright: © 2022 Elsevier B.V. All rights reserved.
Keywords
- Inferential modeling
- industrial applications
- regularization
- sparse statistical learning
- variable selection