Deep learning models such as the long short-term memory (LSTM) network have been applied to dynamic inferential modeling. However, many studies treat the LSTM as a black box without examining whether its internal gates are necessary or useful for inferential modeling. In this paper, we interpret the LSTM as a state space realization and compare it with linear state space modeling and statistical learning methods, including N4SID, partial least squares, the Lasso, and support vector regression. Two case studies, on an industrial 660 MW boiler and a debutanizer column process, indicate that the LSTM underperforms all of the other methods. The LSTM is, however, shown to outperform the linear methods on a simulated reactor process whose data exhibit severely excited nonlinearity. By dissecting the sub-components of a simple LSTM model, we scrutinize the effectiveness of the LSTM gates and nonlinear activation functions.
|Journal|Computers and Chemical Engineering|
|Early online date|26 Apr 2023|
|Publication status|E-pub ahead of print - 26 Apr 2023|
Bibliographical note

Funding Information:
Partial financial support for this work is acknowledged from a Natural Science Foundation of China Project (U20A20189), a General Research Fund by the Research Grants Council of the Hong Kong Special Administrative Region, China (Project No. 11303421), an ITF - Guangdong-Hong Kong Technology Cooperation Funding Scheme (Project Ref. No. GHP/145/20), a Math and Application Project (2021YFA1003504) under the National Key R&D Program, a Shenzhen-Hong Kong-Macau Science and Technology Project Category C (9240086), an InnoHK initiative of The Government of the HKSAR for the Laboratory for AI-Powered Financial Technologies, and a Collaborative Research Fund (No. C1143-20G) by RGC of Hong Kong.
© 2023 Elsevier Ltd
- Dynamic inferential modeling
- Partial least squares
- Regularized learning
- Subspace identification