Investigation of alternative attention modules in transformer models for remaining useful life predictions: Addressing challenges in high-frequency time-series data
Research output: Contribution to journal › Conference article › Contributed › peer-review
Abstract
Predicting the Remaining Useful Life of machine components or systems is a pivotal technology in condition-based maintenance and essential for ensuring reliability and safety in various production-engineering applications. The influx of extensive industrial data has notably enhanced the efficacy of data-driven Remaining Useful Life prediction models, especially deep learning models. One promising deep learning architecture is the Transformer, with the Self-Attention mechanism at its core. However, inherent limitations arise when applying Self-Attention to high-frequency time-series data with large window sizes: owing to its high computational complexity, hardware constraints hinder the practical deployment of Transformers with Self-Attention in production-engineering applications. This study investigates alternative Attention modules with reduced complexity, making the approach more applicable to high-frequency time-series data. To allow comparability, the study uses the well-known C-MAPSS dataset for benchmarking Remaining Useful Life approaches. Although this dataset does not consist of high-frequency data, it demonstrates the usefulness of alternative Attention modules without notable losses in model accuracy.
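The abstract does not name the specific Attention modules evaluated, so the following is only a minimal illustrative sketch of the underlying complexity argument: standard softmax Self-Attention materializes an n × n score matrix (quadratic in the window size n), whereas a well-known reduced-complexity alternative, kernelized linear attention (Katharopoulos et al., 2020), aggregates a d × d summary first and scales linearly in n. All function names, shapes, and the choice of linear attention here are assumptions for illustration, not the paper's method.

```python
import torch

def softmax_attention(q, k, v):
    # Standard scaled dot-product Self-Attention: the (n x n) score
    # matrix makes compute and memory O(n^2) in the window size n.
    scores = q @ k.transpose(-2, -1) / q.shape[-1] ** 0.5
    return torch.softmax(scores, dim=-1) @ v

def linear_attention(q, k, v, eps=1e-6):
    # Illustrative reduced-complexity alternative (kernelized linear
    # attention): with feature map phi(x) = elu(x) + 1, aggregating
    # k^T v into a (d x d) summary first gives O(n * d^2) cost,
    # i.e. linear in the window size n.
    q = torch.nn.functional.elu(q) + 1
    k = torch.nn.functional.elu(k) + 1
    kv = k.transpose(-2, -1) @ v                               # (d x d) summary
    z = q @ k.sum(dim=-2, keepdim=True).transpose(-2, -1) + eps  # normalizer
    return (q @ kv) / z

# Hypothetical dimensions: a large window (n) with a small head dim (d),
# the regime where the quadratic score matrix becomes the bottleneck.
n, d = 4096, 64
q = k = v = torch.randn(1, n, d)
assert softmax_attention(q, k, v).shape == linear_attention(q, k, v).shape
```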
Details
Original language | English |
---|---|
Pages (from-to) | 85-90 |
Number of pages | 6 |
Journal | Procedia CIRP |
Volume | 122 |
Publication status | Published - 2024 |
Peer-reviewed | Yes |
Conference
Title | 31st CIRP Conference on Life Cycle Engineering |
---|---|
Abbreviated title | LCE 2024 |
Conference number | 31 |
Duration | 19 - 21 June 2024 |
Location | Politecnico di Torino |
City | Turin |
Country | Italy |
External IDs
ORCID | /0000-0001-7540-4235/work/160952782 |
---|---|
Keywords
- Attention, Complexity, Deep learning, Remaining useful life, Transformer