AdapterHub Documentation

Note

This documentation is based on the new Adapters library.

The documentation based on the legacy adapter-transformers library can be found at: https://docs-legacy.adapterhub.ml.

AdapterHub is a framework that simplifies the integration, training, and usage of adapters and other efficient fine-tuning methods for Transformer-based language models. For a full list of currently implemented methods, see the table in our repository.

The framework consists of two main components:

- _Adapters_: an add-on to Hugging Face's Transformers library that adds adapters into transformer models (a usage sketch follows below)

- AdapterHub.ml: a central collection of pre-trained adapter modules

Currently, we support the PyTorch versions of all models as listed on the Model Overview page.
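For a quick impression of the _Adapters_ component, the following minimal sketch adds a new bottleneck adapter to a standard Transformers model and prepares it for training. The checkpoint roberta-base and the adapter name my_adapter are illustrative placeholders; "seq_bn" selects a sequential bottleneck adapter configuration.

from transformers import AutoModelForSequenceClassification
import adapters

# Load any supported Hugging Face checkpoint (placeholder: roberta-base).
model = AutoModelForSequenceClassification.from_pretrained("roberta-base", num_labels=2)

# Retrofit the plain Transformers model with adapter support.
adapters.init(model)

# Add a new sequential bottleneck adapter under an illustrative name.
model.add_adapter("my_adapter", config="seq_bn")

# Freeze the pre-trained weights and activate the adapter, so that
# only the adapter parameters are updated during training.
model.train_adapter("my_adapter")

Pre-trained adapters shared on AdapterHub.ml can likewise be fetched into the same model with model.load_adapter(...).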

Citation

If you use the _Adapters_ library in your work, please consider citing our library paper: Adapters: A Unified Library for Parameter-Efficient and Modular Transfer Learning (https://arxiv.org/abs/2311.11077).

@misc{poth2023adapters,
      title={Adapters: A Unified Library for Parameter-Efficient and Modular Transfer Learning},
      author={Clifton Poth and Hannah Sterz and Indraneil Paul and Sukannya Purkayastha and Leon Engländer and Timo Imhof and Ivan Vulić and Sebastian Ruder and Iryna Gurevych and Jonas Pfeiffer},
      year={2023},
      eprint={2311.11077},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}

Alternatively, for the predecessor _adapter-transformers_, the Hub infrastructure, and adapters uploaded by the AdapterHub team, please consider citing our initial paper: AdapterHub: A Framework for Adapting Transformers (https://www.aclweb.org/anthology/2020.emnlp-demos.7).

@inproceedings{pfeiffer2020AdapterHub,
      title={AdapterHub: A Framework for Adapting Transformers},
      author={Jonas Pfeiffer and Andreas R\"uckl\'{e} and Clifton Poth and Aishwarya Kamath and Ivan Vuli\'{c} and Sebastian Ruder and Kyunghyun Cho and Iryna Gurevych},
      booktitle={Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP 2020): Systems Demonstrations},
      year={2020},
      address={Online},
      publisher={Association for Computational Linguistics},
      url={https://www.aclweb.org/anthology/2020.emnlp-demos.7},
      pages={46--54}
}
