AdapterHub Documentation¶
AdapterHub is a framework simplifying the integration, training and usage of adapter modules for Transformer-based language models. It integrates adapters for downstream tasks (Houlsby et al., 2019), adapters for cross-lingual transfer (Pfeiffer et al., 2020a) and AdapterFusion (Pfeiffer et al., 2020b).
The framework consists of two main components:
- adapter-transformers, an extension of HuggingFace's Transformers library that adds adapter components to transformer models
- The Hub, a central repository collecting pre-trained adapter modules
The adapter-transformers section documents the integration of adapters into the transformers library and how adapter training works.
The section on Adapter-Hub describes the fundamentals of the pre-trained adapter repository and how to contribute new adapters.
Currently, we support the PyTorch versions of all models listed in the Supported Models section.
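To illustrate the core idea behind the adapter modules the framework integrates: a bottleneck adapter (Houlsby et al., 2019) down-projects the hidden state, applies a nonlinearity, up-projects back, and adds a residual connection. Below is a minimal NumPy sketch of this forward pass, independent of the actual library code; the dimensions and weight names (`W_down`, `W_up`) are illustrative assumptions, not the library's API.

```python
import numpy as np

def adapter_forward(h, W_down, W_up):
    """Bottleneck adapter sketch: down-project, ReLU, up-project, residual add."""
    z = np.maximum(h @ W_down, 0.0)  # down-projection followed by ReLU
    return h + z @ W_up              # up-projection plus residual connection

# Illustrative sizes: hidden size d, bottleneck size m (m << d in practice).
d, m = 8, 2
rng = np.random.default_rng(0)
h = rng.standard_normal((1, d))
W_down = rng.standard_normal((d, m)) * 0.01
W_up = np.zeros((m, d))  # near-identity initialization: adapter starts as a no-op

out = adapter_forward(h, W_down, W_up)
print(np.allclose(out, h))  # with a zero up-projection, the output equals the input
```

Because only the small projection matrices are trained while the pre-trained model stays frozen, adapters are cheap to store and share, which is what makes a central repository like the Hub practical.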
adapter-transformers
Adapter-Hub
Adapter-Related Classes
Supported Models
Citation¶
@inproceedings{pfeiffer2020AdapterHub,
title={AdapterHub: A Framework for Adapting Transformers},
author={Jonas Pfeiffer and
Andreas R\"uckl\'{e} and
Clifton Poth and
Aishwarya Kamath and
Ivan Vuli\'{c} and
Sebastian Ruder and
Kyunghyun Cho and
Iryna Gurevych},
booktitle={Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP 2020): Systems Demonstrations},
year={2020},
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2020.emnlp-demos.7",
pages = "46--54",
}