Abstracting local transformer attention for enhancing interpretability on time series data

DC Element | Value | Language
dc.contributor.author: Schwenke, L.
dc.contributor.author: Atzmueller, M.
dc.contributor.editor: Seidl, T.
dc.contributor.editor: Fromm, M.
dc.contributor.editor: Obermeier, S.
dc.date.accessioned: 2021-12-23T16:35:07Z
dc.date.available: 2021-12-23T16:35:07Z
dc.date.issued: 2021
dc.identifier.issn: 1613-0073
dc.identifier.uri: https://osnascholar.ub.uni-osnabrueck.de/handle/unios/18338
dc.description: Conference: Learning, Knowledge, Data, Analytics Workshops (LWDA 2021); Conference date: 1–3 September 2021; Conference code: 173242
dc.description.abstract: Transformers have demonstrated considerable performance on sequential data, recently also on time series data. However, enhancing their interpretability and explainability is still a major research problem, as with other prominent deep learning approaches. In this paper, we tackle this issue specifically for time series data, building on our previous research on attention abstraction, aggregation and visualization. In particular, we combine two of our initial attention aggregation techniques and perform a detailed evaluation of this extended scope with our previously used local attention abstraction technique, demonstrating its efficacy on one synthetic as well as three real-world datasets. © 2021 Copyright for this paper by its authors.
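Note (illustration only): the abstract refers to aggregating transformer attention and abstracting the time series locally. The following minimal Python sketch shows one plausible reading of such a pipeline: per-head attention matrices are averaged into a single map, collapsed into a per-timestep relevance score, and thresholded so that only salient time steps are kept. The function names, the mean/max aggregation choices and the global threshold rule are assumptions made here for illustration, not the authors' exact method from the paper.

import numpy as np

def aggregate_attention(attention, agg="mean"):
    # attention: array of shape (layers, heads, T, T) with softmaxed rows.
    # Collapse the layer and head axes into a single (T, T) attention map.
    if agg == "max":
        return attention.max(axis=(0, 1))
    return attention.mean(axis=(0, 1))

def abstract_series(series, attention, threshold=0.8):
    # Score each time step by the normalized attention it receives, then
    # keep only steps above a global threshold (hypothetical abstraction rule).
    att_map = aggregate_attention(attention)    # (T, T)
    relevance = att_map.mean(axis=0)            # column mean: attention received
    relevance = relevance / relevance.max()     # scale scores to [0, 1]
    keep = relevance >= threshold
    return series[keep], keep

# Usage with random stand-in data (2 layers, 4 heads, 32 time steps):
rng = np.random.default_rng(0)
T = 32
series = rng.normal(size=T)
attention = rng.random((2, 4, T, T))
attention /= attention.sum(axis=-1, keepdims=True)  # rows sum to 1, like softmax
abstracted, mask = abstract_series(series, attention)
print(f"kept {mask.sum()} of {T} time steps")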
dc.description.sponsorship: Interreg; This work has been funded by the Interreg North-West Europe program (Interreg NWE), project Di-Plast - Digital Circular Economy for the Plastics Industry (NWE729).
dc.language.iso: en
dc.publisher: CEUR-WS
dc.relation.ispartof: CEUR Workshop Proceedings
dc.subject: Abstracting
dc.subject: Attention
dc.subject: Data visualization
dc.subject: Deep Learning
dc.subject: Interpretability
dc.subject: Learning approach
dc.subject: Performance
dc.subject: Research problems
dc.subject: Sequential data
dc.subject: Time Series Analysis
dc.subject: Time-series analysis
dc.subject: Time-series data
dc.subject: Transformer
dc.title: Abstracting local transformer attention for enhancing interpretability on time series data
dc.type: conference paper
dc.identifier.scopus: 2-s2.0-85118864405
dc.identifier.url: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85118864405&partnerID=40&md5=fe0191bb6a00b9dbb952f875884b8ffa
dc.description.volume: 2993
dc.description.startpage: 205
dc.description.endpage: 218
dcterms.isPartOf.abbreviation: CEUR Workshop Proc.