A Novel Two-step Sparse Learning Approach for Variable Selection and Optimal Predictive Modeling

Yiren LIU, S. Joe QIN*

*Corresponding author for this work

Research output: Journal Article (refereed), peer-reviewed

4 Citations (Scopus)

Abstract

In this paper, a two-step sparse learning approach is proposed for variable selection and model parameter estimation with optimally tuned hyperparameters in each step. In Step one, a sparse learning algorithm is applied to all the data to produce a sequence of candidate subsets of selected variables by varying the hyperparameter value. In Step two, for each subset of selected variables from Step one, Lasso, ridge regression, elastic net, or adaptive Lasso is employed to find the optimal hyperparameters with the best cross-validation error. Among all subsets, the one with the overall minimum cross-validation error is selected as globally optimal. The effectiveness of the proposed approach is demonstrated using an industrial NOx emission dataset and the Dow challenge dataset for predicting product impurity.
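The two-step procedure described in the abstract can be sketched as follows. This is a minimal illustration using scikit-learn and synthetic data, not the authors' implementation: the Lasso path stands in for the Step-one sparse learning algorithm, and the candidate estimators in Step two are Lasso, ridge, and elastic net with cross-validated hyperparameters (adaptive Lasso is omitted since scikit-learn has no built-in for it). All data shapes and coefficient values are assumptions for the demo.

```python
# Hedged sketch of the two-step sparse learning approach (assumed demo data).
import numpy as np
from sklearn.linear_model import lasso_path, LassoCV, RidgeCV, ElasticNetCV
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n, p = 200, 30
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = [3.0, -2.0, 1.5, 1.0, -1.0]   # only 5 truly relevant variables
y = X @ beta + 0.5 * rng.standard_normal(n)

# Step one: vary the sparsity hyperparameter (Lasso alpha here) to generate
# a sequence of candidate subsets of selected variables.
alphas, coefs, _ = lasso_path(X, y, n_alphas=50)   # coefs: (p, n_alphas)
subsets = []
for k in range(coefs.shape[1]):
    support = tuple(np.flatnonzero(np.abs(coefs[:, k]) > 1e-8))
    if support and support not in subsets:
        subsets.append(support)

# Step two: for each candidate subset, tune each estimator by cross-validation
# and keep the (subset, estimator) pair with the lowest CV error overall.
best = (np.inf, None, None)
for support in subsets:
    Xs = X[:, list(support)]
    for est in (LassoCV(cv=5), RidgeCV(), ElasticNetCV(cv=5)):
        mse = -cross_val_score(est, Xs, y, cv=5,
                               scoring="neg_mean_squared_error").mean()
        if mse < best[0]:
            best = (mse, support, type(est).__name__)

print("best CV MSE:", round(best[0], 3))
print("selected variables:", best[1])
print("winning estimator:", best[2])
```

On this synthetic problem the globally optimal subset recovers the truly relevant variables; on real data such as the NOx or Dow datasets, Step one would be run on the full variable set in the same way.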
Original language: English
Pages (from-to): 57-64
Number of pages: 8
Journal: IFAC-PapersOnLine
Volume: 55
Issue number: 7
Early online date: 5 Aug 2022
Publication status: Published - 2022
Externally published: Yes

