In this paper, a two-step sparse learning approach is proposed for variable selection and model parameter estimation, with optimally tuned hyperparameters in each step. In Step one, a sparse learning algorithm is applied to all the data to produce a sequence of candidate subsets of selected variables by varying the hyperparameter value. In Step two, for each subset of variables selected in Step one, Lasso, ridge regression, elastic net, or adaptive Lasso is employed to find the hyperparameter values that yield the best cross-validation error. Among all subsets, the one with the overall minimum cross-validation error is selected as globally optimal. The effectiveness of the proposed approach is demonstrated on an industrial NOx emission dataset and on the Dow challenge dataset for predicting product impurity.
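The two-step procedure can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes the Lasso regularization path (via scikit-learn's `lasso_path`) as the Step-one sparse learner and cross-validated ridge regression as the Step-two estimator; the synthetic data and all variable names are hypothetical.

```python
import numpy as np
from sklearn.linear_model import lasso_path, RidgeCV
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Hypothetical data: only the first 3 of 10 variables are informative.
X = rng.normal(size=(100, 10))
y = X[:, 0] + 0.5 * X[:, 1] - 0.7 * X[:, 2] + 0.1 * rng.normal(size=100)

# Step one: vary the Lasso hyperparameter (alpha) along its path to
# generate a sequence of candidate subsets of selected variables.
alphas, coefs, _ = lasso_path(X, y, n_alphas=30)
subsets = {tuple(np.flatnonzero(np.abs(coefs[:, i]) > 1e-8))
           for i in range(len(alphas))}
subsets.discard(())  # drop the empty subset, if present

# Step two: for each candidate subset, tune the estimator's
# hyperparameter by cross-validation and record the CV error.
best = None  # (cv_mse, subset)
for subset in subsets:
    model = RidgeCV(alphas=np.logspace(-4, 2, 20))
    cv_mse = -cross_val_score(model, X[:, list(subset)], y, cv=5,
                              scoring="neg_mean_squared_error").mean()
    if best is None or cv_mse < best[0]:
        best = (cv_mse, subset)

# The subset with the overall minimum CV error is taken as globally optimal.
print("selected subset:", sorted(best[1]))
print("cross-validation MSE: %.4f" % best[0])
```

Under this setup the globally optimal subset should recover the three informative variables; swapping `RidgeCV` for `LassoCV` or `ElasticNetCV` gives the other Step-two estimators mentioned above.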