AdapterHub Documentation

AdapterHub is a framework that simplifies the integration, training, and usage of adapter modules for Transformer-based language models. It integrates adapters for downstream tasks (Houlsby et al., 2019), adapters for cross-lingual transfer (Pfeiffer et al., 2020a), and AdapterFusion (Pfeiffer et al., 2020b).
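At their core, the adapter modules of Houlsby et al. (2019) are small bottleneck layers inserted into each transformer layer: a down-projection, a nonlinearity, an up-projection, and a residual connection. The following is a minimal numpy sketch of that computation, not the framework's actual implementation; all dimensions and weight values are hypothetical.

```python
import numpy as np

def bottleneck_adapter(hidden, w_down, w_up):
    # Down-project to a small bottleneck dimension, apply a
    # nonlinearity (ReLU here for simplicity), up-project back to
    # the hidden size, and add the residual connection.
    return np.maximum(0.0, hidden @ w_down) @ w_up + hidden

# Tiny demo with hypothetical sizes: hidden size 16, bottleneck size 2.
rng = np.random.default_rng(0)
hidden = rng.standard_normal((4, 16))        # 4 token positions
w_down = 0.1 * rng.standard_normal((16, 2))  # down-projection
w_up = 0.1 * rng.standard_normal((2, 16))    # up-projection
adapted = bottleneck_adapter(hidden, w_down, w_up)
```

Because only the small projection matrices are trained while the pre-trained model stays frozen, adapters for different tasks can be swapped in and out of the same base model cheaply.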

The framework consists of two main components:

  • adapter-transformers, an extension of Hugging Face's Transformers library that adds adapter components to transformer models

  • The Hub, a central repository collecting pre-trained adapter modules

The adapter-transformers section documents how adapters are integrated into the Transformers library and how adapter training works.

The section on the Hub describes the fundamentals of the pre-trained adapter repository and how to contribute new adapters.

Currently, we support the PyTorch versions of all models listed in the Supported Models section.

Supported Models


Citation

   @article{pfeiffer2020adapterhub,
      title={AdapterHub: A Framework for Adapting Transformers},
      author={Jonas Pfeiffer and
              Andreas R\"uckl\'{e} and
              Clifton Poth and
              Aishwarya Kamath and
              Ivan Vuli\'{c} and
              Sebastian Ruder and
              Kyunghyun Cho and
              Iryna Gurevych},
      journal={arXiv preprint},
      year={2020}
   }
