Investigation of alternative attention modules in transformer models for remaining useful life predictions: Addressing challenges in high-frequency time-series data

Research output: Contribution to journal › Conference article › Contributed › Peer-reviewed

Contributors

Abstract

Predicting the Remaining Useful Life of machine components or systems is a pivotal technology in condition-based maintenance and essential for ensuring reliability and safety in various production-engineering applications. The influx of extensive industrial data has notably enhanced the efficacy of data-driven Remaining Useful Life prediction models, especially deep learning models. One of the promising deep learning architectures is the Transformer-based model with the Self-Attention mechanism at its core. However, inherent limitations arise when applying Self-Attention to high-frequency time-series data with large window sizes. Due to its high computational complexity, hardware limitations hinder the practical implementation of Transformers with Self-Attention in production-engineering applications. This study investigates the use of alternative Attention modules with reduced complexity, making them more applicable to high-frequency time-series data. To allow comparability, this study uses the well-known C-MAPSS dataset for benchmarking Remaining Useful Life approaches. Although this dataset does not consist of high-frequency data, the results demonstrate the usefulness of alternative Attention modules without noteworthy losses in model accuracy.
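The complexity issue the abstract refers to can be illustrated with a minimal sketch: standard scaled dot-product Self-Attention materializes an n x n score matrix over a window of length n, whereas kernelized (linear) Attention variants avoid it. The code below is an illustrative comparison only, not the paper's specific module; the tensor shapes and the elu-based feature map are assumptions chosen for demonstration.

```python
# Minimal sketch (assumption, not the paper's implementation): standard
# Self-Attention is O(n^2) in the window size n, while a kernelized
# "linear attention" variant reorders the computation to O(n).
import torch
import torch.nn.functional as F


def standard_self_attention(q, k, v):
    # q, k, v: (batch, n, d); builds an explicit (n, n) score matrix.
    scores = q @ k.transpose(-2, -1) / q.shape[-1] ** 0.5   # (batch, n, n)
    return F.softmax(scores, dim=-1) @ v                    # (batch, n, d)


def linear_attention(q, k, v, eps=1e-6):
    # Kernelized attention: phi(Q) (phi(K)^T V) never forms the n x n matrix.
    phi_q, phi_k = F.elu(q) + 1, F.elu(k) + 1                # positive feature maps
    kv = phi_k.transpose(-2, -1) @ v                         # (batch, d, d)
    z = phi_q @ phi_k.sum(dim=1, keepdim=True).transpose(-2, -1) + eps  # (batch, n, 1)
    return (phi_q @ kv) / z                                  # (batch, n, d)


if __name__ == "__main__":
    b, n, d = 2, 1024, 64            # n = time-series window size (assumed value)
    q = k = v = torch.randn(b, n, d)
    print(standard_self_attention(q, k, v).shape)  # torch.Size([2, 1024, 64])
    print(linear_attention(q, k, v).shape)         # torch.Size([2, 1024, 64])
```

For large windows, the memory of the standard variant grows with n squared, which is the hardware limitation the study addresses by substituting lower-complexity Attention modules.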

Details

Original language: English
Pages (from-to): 85-90
Number of pages: 6
Journal: Procedia CIRP
Volume: 122
Publication status: Published - 2024
Peer-reviewed: Yes

Conference

Title: 31st CIRP Conference on Life Cycle Engineering
Abbreviated title: LCE 2024
Conference number: 31
Duration: 19 - 21 June 2024
Location: Politecnico di Torino
City: Turin
Country: Italy

External IDs

ORCID: /0000-0001-7540-4235/work/160952782

Keywords

  • Attention, Complexity, Deep learning, Remaining useful life, Transformer