Show simple item record

dc.contributor.author: Liang, Jiaming
dc.contributor.author: Cao, Lei
dc.contributor.author: Madden, Samuel
dc.contributor.author: Ives, Zack
dc.contributor.author: Li, Guoliang
dc.date.accessioned: 2024-04-04T17:42:23Z
dc.date.available: 2024-04-04T17:42:23Z
dc.date.issued: 2024-03-12
dc.identifier.issn: 2836-6573
dc.identifier.uri: https://hdl.handle.net/1721.1/154072
dc.description.abstract: Timeseries analytics is important in many real-world applications. Recently, the Transformer model, popular in natural language processing, has been leveraged to learn high quality feature embeddings from timeseries: embeddings are key to the performance of various timeseries analytics tasks such as similarity-based timeseries queries within vector databases. However, quadratic time and space complexities limit Transformers' scalability, especially for long timeseries. To address these issues, we develop a timeseries analytics tool, RITA, which uses a novel attention mechanism, named group attention, to address this scalability issue. Group attention dynamically clusters the objects based on their similarity into a small number of groups and approximately computes the attention at the coarse group granularity. It thus significantly reduces the time and space complexity, yet provides a theoretical guarantee on the quality of the computed attention. The dynamic scheduler of RITA continuously adapts the number of groups and the batch size in the training process, ensuring group attention always uses the fewest groups needed to meet the approximation quality requirement. Extensive experiments on various timeseries datasets and analytics tasks demonstrate that RITA outperforms the state-of-the-art in accuracy and is significantly faster, with speedups of up to 63X. (en_US)
dc.publisher: Association for Computing Machinery (ACM) (en_US)
dc.relation.isversionof: 10.1145/3639317 (en_US)
dc.rights: Creative Commons Attribution (en_US)
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/ (en_US)
dc.source: Association for Computing Machinery (en_US)
dc.title: RITA: Group Attention is All You Need for Timeseries Analytics (en_US)
dc.type: Article (en_US)
dc.identifier.citation: Jiaming Liang, Lei Cao, Samuel Madden, Zachary Ives, and Guoliang Li. 2024. RITA: Group Attention is All You Need for Timeseries Analytics. Proc. ACM Manag. Data 2, 1 (SIGMOD), Article 62 (February 2024), 28 pages. (en_US)
dc.contributor.department: Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory
dc.relation.journal: Proceedings of the ACM on Management of Data (en_US)
dc.identifier.mitlicense: PUBLISHER_CC
dc.eprint.version: Final published version (en_US)
dc.type.uri: http://purl.org/eprint/type/JournalArticle (en_US)
eprint.status: http://purl.org/eprint/status/PeerReviewed (en_US)
dc.date.updated: 2024-04-01T07:49:13Z
dc.language.rfc3066: en
dc.rights.holder: The author(s)
dspace.date.submission: 2024-04-01T07:49:13Z
mit.journal.volume: 2 (en_US)
mit.journal.issue: 1 (en_US)
mit.license: PUBLISHER_CC
mit.metadata.status: Authority Work and Publication Information Needed (en_US)
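The group-attention idea described in the abstract (cluster similar objects into a few groups, then compute attention at group granularity) can be illustrated with a minimal NumPy sketch. This is an assumption-laden illustration, not RITA's actual implementation: it uses plain k-means as a stand-in for RITA's dynamic grouping, and mean-pooled group values; the function name `group_attention` and all parameters are hypothetical.

```python
import numpy as np

def group_attention(Q, K, V, n_groups=4, n_iters=10, seed=0):
    """Approximate softmax attention: cluster the n keys into n_groups
    groups, score each query against the G centroids instead of the n
    keys (O(n*G) rather than O(n^2) scores), and weight each group's
    score by its size so every member key is still counted."""
    n, d = K.shape
    rng = np.random.default_rng(seed)
    # Plain k-means over key vectors (stand-in for RITA's dynamic grouping).
    centers = K[rng.choice(n, size=n_groups, replace=False)].copy()
    for _ in range(n_iters):
        labels = np.argmin(((K[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        for g in range(n_groups):
            if (labels == g).any():
                centers[g] = K[labels == g].mean(axis=0)
    sizes = np.bincount(labels, minlength=n_groups)
    # Mean value vector per group; empty groups (size 0) contribute nothing.
    Vg = np.zeros((n_groups, V.shape[1]))
    for g in range(n_groups):
        if sizes[g]:
            Vg[g] = V[labels == g].mean(axis=0)
    # Scaled dot-product scores against centroids, size-weighted softmax.
    scores = Q @ centers.T / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=1, keepdims=True)) * sizes
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ Vg
```

When keys inside a group are near-identical, replacing them with their centroid changes the softmax scores only slightly, which is the intuition behind the approximation-quality guarantee the abstract mentions.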


