Proceedings of the AAAI Conference on Artificial Intelligence, 35, No. 16: AAAI-21 Technical Tracks 16

Nyströmformer: A Nyström-based Algorithm for Approximating Self-Attention

February 1, 2023


Abstract:

Transformers have emerged as a powerful tool for a broad range of natural language processing tasks. A key component that drives the impressive performance of Transformers is the self-attention mechanism, which encodes the influence or dependence of other tokens on each specific token. While beneficial, the quadratic complexity of self-attention in the input sequence length has limited its application to longer sequences, a topic being actively studied in the community. To address this limitation, we propose Nyströmformer, a model that exhibits favorable scalability as a function of sequence length. Our idea is based on adapting the Nyström method to approximate standard self-attention with O(n) complexity. The scalability of Nyströmformer enables application to longer sequences with thousands of tokens. We perform evaluations on multiple downstream tasks on the GLUE benchmark and IMDB reviews with standard sequence length, and find that our Nyströmformer performs comparably to, and in a few cases slightly better than, standard self-attention. On longer sequence tasks in the Long Range Arena (LRA) benchmark, Nyströmformer performs favorably relative to other efficient self-attention methods. Our code is available at https://github.com/mlpen/Nystromformer.
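To make the abstract's idea concrete, the Nyström approximation replaces the n × n attention matrix softmax(QKᵀ/√d) with a product of three smaller maps built from m ≪ n landmark queries and keys, bringing the cost down from O(n²) to roughly O(nm). Below is a minimal NumPy sketch under simplifying assumptions, not the paper's implementation: the landmarks are taken as segment means (one of the choices the paper uses), but the m × m pseudo-inverse is computed with `np.linalg.pinv` directly, whereas the paper approximates it iteratively; `softmax`, `nystrom_attention`, and `num_landmarks` are illustrative names.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def nystrom_attention(Q, K, V, num_landmarks):
    """Nystrom approximation of softmax(Q K^T / sqrt(d)) @ V.

    Three factors are formed: F (n x m), pinv(A) (m x m), B (m x n);
    no n x n matrix is ever materialized, so cost is ~O(n * m).
    """
    n, d = Q.shape
    m = num_landmarks
    assert n % m == 0, "sketch assumes n divisible by m"
    scale = 1.0 / np.sqrt(d)
    # Landmarks: mean of each contiguous segment of length n/m.
    Q_l = Q.reshape(m, n // m, d).mean(axis=1)
    K_l = K.reshape(m, n // m, d).mean(axis=1)
    F = softmax(Q @ K_l.T * scale)    # queries vs. landmark keys, n x m
    A = softmax(Q_l @ K_l.T * scale)  # landmark-landmark block, m x m
    B = softmax(Q_l @ K.T * scale)    # landmark queries vs. keys, m x n
    return F @ (np.linalg.pinv(A) @ (B @ V))
```

When the number of landmarks equals the sequence length, the segment means reduce to the rows themselves and the product F · A⁻¹ · A · V collapses back to exact softmax attention, which is a convenient sanity check for the sketch.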

Authors

Yunyang Xiong, University of Wisconsin-Madison
Zhanpeng Zeng, University of Wisconsin-Madison
Rudrasis Chakraborty, UC Berkeley
Mingxing Tan, Google Brain
Glenn Fung, American Family Insurance
Yin Li, University of Wisconsin-Madison
Vikas Singh, University of Wisconsin-Madison

DOI: 10.1609/aaai.v35i16.17664


Topics: AAAI


HOW TO CITE:

Xiong, Y., Zeng, Z., Chakraborty, R., Tan, M., Fung, G., Li, Y., & Singh, V. (2021). Nyströmformer: A Nyström-based Algorithm for Approximating Self-Attention. Proceedings of the AAAI Conference on Artificial Intelligence, 35(16), 14138-14148.


ISSN: 2374-3468


Published by AAAI Press, Palo Alto, California, USA.
Copyright 2022, Association for the Advancement of Artificial Intelligence, 1900 Embarcadero Road, Suite 101, Palo Alto, California 94303. All Rights Reserved.
