Partial least squares, steepest descent, and conjugate gradient for regularized predictive modeling

S. Joe QIN*, Yiren LIU, Shiqin TANG

*Corresponding author for this work

Research output: Journal Publications › Journal Article (refereed) › peer review

7 Citations (Scopus)

Abstract

In this article, we explore the connection of partial least squares (PLS) to other regularized regression algorithms, including the Lasso and ridge regression, and consider a steepest descent alternative to the PLS algorithm. First, the PLS latent variable analysis is emphasized and formulated as a standalone procedure. The connections of PLS to the conjugate gradient method, the Krylov subspace, and the Cayley–Hamilton theorem for the matrix pseudo-inverse are explored based on known results in the literature. Comparisons of PLS with the Lasso and ridge regression are given in terms of the different resolutions along their regularization paths, leading to an explanation of why PLS sometimes does not outperform the Lasso and ridge regression. As an attempt to increase the resolution along the regularization path, a steepest descent PLS is formulated as a regularized regression alternative to PLS and is compared to other regularized algorithms via simulations and an industrial case study.
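The PLS–conjugate gradient connection the abstract alludes to can be checked numerically: for a single response with coefficients started at zero, PLS with k components yields the same regression vector as k conjugate-gradient iterations on the normal equations X'X b = X'y, since both minimize the fitting error over the same Krylov subspace. The sketch below (not from the article; all function names are illustrative) verifies this with a NIPALS-style PLS1 implementation and a textbook conjugate gradient loop:

```python
# Illustrative sketch of the known PLS1 / conjugate-gradient equivalence on the
# normal equations; not the article's own code.
import numpy as np

def pls1(X, y, k):
    """PLS1 (NIPALS with deflation) regression coefficients for k components."""
    Xk, yk = X.copy(), y.copy()
    W, P, q = [], [], []
    for _ in range(k):
        w = Xk.T @ yk
        w /= np.linalg.norm(w)            # weight vector
        t = Xk @ w                        # score vector
        tt = t @ t
        p = Xk.T @ t / tt                 # X loading
        ql = t @ yk / tt                  # y loading
        Xk = Xk - np.outer(t, p)          # deflate X
        yk = yk - t * ql                  # deflate y
        W.append(w); P.append(p); q.append(ql)
    W, P, q = np.column_stack(W), np.column_stack(P), np.array(q)
    return W @ np.linalg.solve(P.T @ W, q)   # b = W (P'W)^{-1} q

def cg_normal_eq(X, y, k):
    """k conjugate-gradient steps on X'X b = X'y, starting from b = 0."""
    A, c = X.T @ X, X.T @ y
    b = np.zeros(X.shape[1])
    r = c.copy(); p = r.copy()
    for _ in range(k):
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)
        b = b + alpha * p
        r_new = r - alpha * Ap
        p = r_new + ((r_new @ r_new) / (r @ r)) * p
        r = r_new
    return b

# Synthetic centered data: the two iteration counts trace the same path.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 8)); X -= X.mean(axis=0)
y = X @ rng.standard_normal(8) + 0.1 * rng.standard_normal(50); y -= y.mean()
for k in (1, 2, 3):
    assert np.allclose(pls1(X, y, k), cg_normal_eq(X, y, k))
```

Because each iteration (or component) adds one Krylov basis direction, the number of components acts as the regularization knob, which is the coarse "resolution" of the PLS regularization path that the article contrasts with the continuous paths of the Lasso and ridge regression.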

Original language: English
Article number: e17992
Number of pages: 16
Journal: AIChE Journal
Volume: 69
Issue number: 4
Early online date: 10 Dec 2022
DOIs
Publication status: Published - Apr 2023

Bibliographical note

Funding Information:
The work described in this article is partially supported by a Collaborative Research Fund grant (Project No. CityU C1115‐20G) and a General Research Fund grant (No. 11303421) from the Research Grants Council of the Hong Kong Special Administrative Region, China, a Natural Science Foundation of China grant (U20A20189), an ITF‐Guangdong Technology Cooperation Funding Scheme (No. GHP/145/20), a Math and Application project (2021YFA1003504) under the National Key R&D Program, and a Shenzhen‐Hong Kong‐Macau Science and Technology Project Category C (9240086).


Publisher Copyright:
© 2022 American Institute of Chemical Engineers.

Keywords

  • conjugate gradient
  • latent variable analysis
  • partial least squares analysis
  • partial least squares regression
  • regularized regression
  • steepest descent
