Investigation of alternative attention modules in transformer models for remaining useful life predictions: Addressing challenges in high-frequency time-series data

Publication: Contribution to journal › Conference article › Contributed › Peer reviewed

Contributors

Abstract

Predicting the Remaining Useful Life of machine components or systems is a pivotal technology in condition-based maintenance and essential for ensuring reliability and safety in various production-engineering applications. The influx of extensive industrial data has notably enhanced the efficacy of data-driven Remaining Useful Life prediction models, especially deep learning models. One of the promising deep learning architectures is the Transformer-based model with the Self-Attention mechanism at its core. However, inherent limitations arise when applying Self-Attention to high-frequency time-series data with large window sizes. Due to its high computational complexity, hardware limitations hinder the practical implementation of Transformers with Self-Attention in production-engineering applications. This study looks into the utilization of alternative Attention modules with reduced complexity, making them more applicable to high-frequency time-series data. To allow comparability, this study uses the well-known C-MAPSS dataset for benchmarking Remaining Useful Life approaches. Although this dataset does not consist of high-frequency data, the results demonstrate the usefulness of alternative Attention modules without noteworthy loss in model accuracy.
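The abstract's core argument rests on the quadratic cost of Self-Attention in the window size n, which reduced-complexity Attention modules avoid. The paper's specific modules are not named here, so the following is only a minimal sketch contrasting standard scaled dot-product attention with one common reduced-complexity stand-in, a kernelized linear-attention variant; the feature map, shapes, and function names are illustrative assumptions, not the authors' implementation.

# Sketch only: softmax self-attention (O(n^2 * d)) vs. a kernelized
# linear-attention approximation (O(n * d^2)); illustrative, not the
# paper's actual Attention modules.
import numpy as np

def softmax_attention(Q, K, V):
    # Standard scaled dot-product attention: the (n x n) score matrix
    # makes time and memory grow quadratically with the window size n.
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                 # (n, n)
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                            # (n, d)

def linear_attention(Q, K, V, phi=lambda x: np.maximum(x, 0.0) + 1.0):
    # Kernelized attention: with a positive feature map phi, the product
    # can be reassociated so the cost is linear in n (O(n * d^2)).
    Qp, Kp = phi(Q), phi(K)                       # (n, d)
    KV = Kp.T @ V                                 # (d, d), computed once
    Z = Qp @ Kp.sum(axis=0, keepdims=True).T      # (n, 1) normalizer
    return (Qp @ KV) / Z                          # (n, d)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, d = 2048, 64                               # large window, small feature dim
    Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))
    print(softmax_attention(Q, K, V).shape)       # (2048, 64)
    print(linear_attention(Q, K, V).shape)        # (2048, 64), no (n x n) matrix

For a high-frequency signal, n dominates d by orders of magnitude, so avoiding the (n x n) score matrix is what makes such modules feasible on constrained production hardware.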

Details

Original language: English
Pages (from - to): 85-90
Number of pages: 6
Journal: Procedia CIRP
Volume: 122
Publication status: Published - 2024
Peer-review status: Yes

Conference

Title: 31st CIRP Conference on Life Cycle Engineering, LCE 2024
Duration: 19 - 21 June 2024
City: Turin
Country: Italy

External IDs

ORCID /0000-0001-7540-4235/work/160952782

Keywords

  • Attention, Complexity, Deep learning, Remaining useful life, Transformer