Investigation of alternative attention modules in transformer models for remaining useful life predictions: Addressing challenges in high-frequency time-series data
Publication: Contribution to journal › Conference article › Contributed › Peer-reviewed
Contributors
Abstract
Predicting the Remaining Useful Life of machine components or systems is a pivotal technology in condition-based maintenance and essential for ensuring reliability and safety in various production-engineering applications. The influx of extensive industrial data has notably enhanced the efficacy of data-driven Remaining Useful Life prediction models, especially deep learning models. One promising deep learning architecture is the Transformer-based model with the Self-Attention mechanism at its core. However, inherent limitations arise when applying Self-Attention to high-frequency time-series data with large window sizes: because its computational complexity grows quadratically with the sequence length, hardware limitations hinder the practical implementation of Transformers with Self-Attention in production-engineering applications. This study investigates alternative Attention modules with reduced complexity, making the architecture more applicable to high-frequency time-series data. To allow comparability, this study uses the well-known C-MAPSS dataset for benchmarking Remaining Useful Life approaches. Although this dataset does not consist of high-frequency data, it demonstrates the usefulness of alternative Attention modules without noteworthy losses in model accuracy.
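The abstract does not name the specific alternative Attention modules evaluated in the paper, but the complexity argument it makes can be illustrated with a short sketch. Standard Self-Attention materializes an n × n score matrix, so cost grows quadratically with the window size n; kernelized "linear" attention (Katharopoulos et al., 2020) is one well-known family of reduced-complexity alternatives, used here purely as an example, with all function names as illustrative placeholders:

```python
import torch

def softmax_attention(q, k, v):
    # Standard scaled dot-product Self-Attention.
    # The n x n score matrix makes time and memory O(n^2) in the
    # window size n, which is the bottleneck the abstract describes.
    scores = q @ k.transpose(-2, -1) / q.shape[-1] ** 0.5    # (n, n)
    return torch.softmax(scores, dim=-1) @ v

def linear_attention(q, k, v, eps=1e-6):
    # One reduced-complexity alternative (not necessarily the one
    # used in the paper): replacing the softmax with a positive
    # feature map phi lets the product be reassociated as
    # phi(q) @ (phi(k)^T v), giving O(n * d^2) cost instead of O(n^2 * d).
    phi = lambda x: torch.nn.functional.elu(x) + 1.0
    q, k = phi(q), phi(k)
    kv = k.transpose(-2, -1) @ v                             # (d, d) summary
    z = q @ k.sum(dim=-2, keepdim=True).transpose(-2, -1)    # (n, 1) normalizer
    return (q @ kv) / (z + eps)

# For a window of n = 10_000 high-frequency samples with d = 64
# channels, the softmax variant would materialize a 10_000 x 10_000
# score matrix, while the linear variant only forms a 64 x 64 summary.
n, d = 10_000, 64
q, k, v = (torch.randn(n, d) for _ in range(3))
out = linear_attention(q, k, v)    # shape: (n, d)
```

Other alternatives with a similar goal include sparse and low-rank Attention variants (e.g., Longformer, Linformer); which family the paper adopts is not stated in this abstract.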
Details
Original language | English |
---|---|
Pages (from - to) | 85-90 |
Number of pages | 6 |
Journal | Procedia CIRP |
Volume | 122 |
Publication status | Published - 2024 |
Peer-review status | Yes |
Conference
Title | 31st CIRP Conference on Life Cycle Engineering |
---|---|
Short title | LCE 2024 |
Event number | 31 |
Duration | 19 - 21 June 2024 |
Location | Politecnico di Torino |
City | Turin |
Country | Italy |
External IDs
ORCID | /0000-0001-7540-4235/work/160952782 |
---|---|
Keywords
- Attention, Complexity, Deep learning, Remaining useful life, Transformer