Abstract
Recently, deep neural networks have been successfully applied to battery degradation modeling, and onboard graphics processing units can potentially be used to host these models. However, most existing degradation models rely on recursive operations, which are slow on hardware optimized for vectorized computations. This article proposes a two-encoder, one-decoder conditional temporal convolutional network that uses no recursive operations, thus allowing for the conditional modeling of battery degradation and improving upon the prediction speed of a long short-term memory encoder-decoder by a factor of up to 70 for long tests. Datasets containing 115 batteries from multiple sources and 195 000 steps are used to verify the proposed model. The data are augmented by randomly selecting the prediction starting point during training. The proposed model achieves an average prediction error of less than 4% of the maximum available capacity on these large datasets. In addition, prediction results and convolution weight analysis indicate that the model can learn the temporal and conditional dynamics of typical lithium-ion batteries and is robust to the selection of the prediction starting point. We further demonstrate a prediction under hypothetical future conditions and a Monte Carlo prediction under unknown future conditions, and the degradation trends are accurately predicted.
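The abstract only specifies the overall structure (two causal-convolution encoders, one convolutional decoder, no recurrence), so the sketch below is a minimal, hypothetical PyTorch illustration of that idea rather than the authors' implementation. The class names, channel widths, number of blocks, and the fusion scheme (broadcasting the last encoded capacity feature over the prediction horizon alongside the encoded operating conditions) are all assumptions introduced for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class CausalConvBlock(nn.Module):
    """One dilated causal 1-D convolution block (no recursive operations)."""

    def __init__(self, channels, kernel_size=3, dilation=1):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation        # left-pad only => causal
        self.conv = nn.Conv1d(channels, channels, kernel_size, dilation=dilation)
        self.act = nn.ReLU()

    def forward(self, x):                              # x: (batch, channels, time)
        x = F.pad(x, (self.pad, 0))
        return self.act(self.conv(x))


class TCNStack(nn.Module):
    """Stack of causal blocks with exponentially growing dilation (assumed depth)."""

    def __init__(self, in_dim, channels=32, n_blocks=4):
        super().__init__()
        self.inp = nn.Conv1d(in_dim, channels, 1)
        self.blocks = nn.ModuleList(
            CausalConvBlock(channels, dilation=2 ** i) for i in range(n_blocks)
        )

    def forward(self, x):                              # x: (batch, time, in_dim)
        h = self.inp(x.transpose(1, 2))
        for blk in self.blocks:
            h = h + blk(h)                             # residual connection
        return h                                       # (batch, channels, time)


class DualEncoderConditionalTCN(nn.Module):
    """
    Hypothetical sketch: one encoder for the observed capacity history, a second
    encoder for the past and assumed future operating conditions, and a purely
    convolutional decoder mapping the fused features to a capacity trajectory.
    """

    def __init__(self, cap_dim=1, cond_dim=3, channels=32):
        super().__init__()
        self.capacity_encoder = TCNStack(cap_dim, channels)
        self.condition_encoder = TCNStack(cond_dim, channels)
        self.decoder = nn.Sequential(
            nn.Conv1d(2 * channels, channels, 1), nn.ReLU(),
            nn.Conv1d(channels, 1, 1),
        )

    def forward(self, capacity_hist, conditions):
        # capacity_hist: (batch, T_past, 1); conditions: (batch, T_total, cond_dim)
        h_cap = self.capacity_encoder(capacity_hist)   # (B, C, T_past)
        h_cond = self.condition_encoder(conditions)    # (B, C, T_total)
        # Broadcast the last encoded capacity state over the horizon (an assumption).
        h_cap_last = h_cap[:, :, -1:].expand(-1, -1, h_cond.size(-1))
        fused = torch.cat([h_cap_last, h_cond], dim=1)
        return self.decoder(fused).transpose(1, 2)     # (B, T_total, 1)
```

Because every operation above is a (dilated) convolution applied over the whole sequence at once, the forward pass vectorizes across time steps on a GPU, which is the property the abstract credits for the speedup over a step-by-step recurrent encoder-decoder.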
| Original language | English |
| --- | --- |
| Pages (from-to) | 1695-1709 |
| Number of pages | 15 |
| Journal | IEEE Transactions on Transportation Electrification |
| Volume | 8 |
| Issue number | 2 |
| DOIs | |
| Publication status | Published - Jun 2022 |
| Externally published | Yes |
Bibliographical note
The authors would like to thank Prof. Jingjing Li, School of Computer Science and Engineering, University of Electronic Science and Technology of China, for useful discussions. They would also like to thank the editors and anonymous reviewers for useful suggestions to improve the article.

Keywords
- Lithium-ion battery (LIB)
- recurrent neural network (RNN)
- remaining useful life (RUL)
- sequence-to-sequence (seq2seq)
- temporal convolutional network (TCN)