The adapters library is the successor to the adapter-transformers library. The essential difference is that adapters is now a stand-alone package: it is disentangled from Hugging Face's transformers package and is no longer a drop-in replacement for it.
This results in some breaking changes. To transition your code from adapter-transformers to adapters, you need to consider the following changes:
Package and Namespace¶
To use the library, you need to install adapters and transformers in the same environment (unlike adapter-transformers, which contained transformers and could not be installed in the same environment as it).
Run the following to install both (installing adapters will automatically trigger the installation of transformers if it is not yet installed in the environment):
pip install adapters
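If you want to verify the setup, a quick import check like the following should work (assuming both packages expose a __version__ attribute, as recent releases do):
python -c "import adapters, transformers; print(adapters.__version__, transformers.__version__)"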
This also changes the namespace to adapters. For all imports of adapter classes, change the import from transformers to adapters.
This mainly affects the following classes:
The AdapterModel classes, e.g. AutoAdapterModel
Adapter configurations, e.g. SeqBnConfig
Adapter composition blocks, e.g. Stack
The AdapterTrainer class
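For example, an import that previously went through the transformers namespace now comes directly from adapters (shown here with AutoAdapterModel; the same pattern applies to the other classes above):
# before, with adapter-transformers:
# from transformers.adapters import AutoAdapterModel

# now, with adapters:
from adapters import AutoAdapterModel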
The Hugging Face model classes, such as
BertModel, cannot be used directly with adapters. They must first be initialised for adding adapters:
import adapters
from transformers import AutoModel

model = AutoModel.from_pretrained("bert-base-uncased")
adapters.init(model)  # prepare model for use with adapters
The only necessary change is the call to adapters.init().
Note that no additional initialisation is required to use the AdapterModel classes such as
BertAdapterModel. These classes are provided by the
adapters library and are already prepared for using adapters in training and inference.
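A minimal sketch of that path (the model name is illustrative):
from adapters import AutoAdapterModel

# AdapterModel classes are adapter-ready out of the box; no adapters.init() call is needed
model = AutoAdapterModel.from_pretrained("bert-base-uncased")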
Bottleneck Configuration Names¶
The adapters library supports the configuration of adapters using config strings. Compared to the adapter-transformers library, we have changed some of these strings to make them more consistent and intuitive. For example:
"pfeiffer" → "seq_bn"
"houlsby" → "double_seq_bn"
"pfeiffer+inv" → "seq_bn_inv"
"houlsby+inv" → "double_seq_bn_inv"
"parallel" → "par_bn"
For a complete list of config strings and classes, see the adapters documentation. We strongly recommend using the new config strings, but we will continue to support the old config strings for the time being to make the transition easier.
Note that along with the config strings, the corresponding adapter config classes have been renamed, e.g. PfeifferConfig is now SeqBnConfig.
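For instance, a bottleneck config that was previously instantiated as PfeifferConfig is now created as follows (the reduction_factor value is illustrative):
from adapters import SeqBnConfig

# formerly: from transformers.adapters import PfeifferConfig
config = SeqBnConfig(reduction_factor=16)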
Another consequence of this is that the AdapterConfig class is no longer specific to bottleneck adapters; it is now the base class of all adapter configurations (previously named AdapterConfigBase). The role of this class has therefore changed. However, you can still load adapter configs with:
from adapters import AdapterConfig
adapter_config = AdapterConfig.load("lora")
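Because AdapterConfig is now the common base class, the same loader resolves config strings for any adapter type, not just bottleneck adapters (the strings here are illustrative):
from adapters import AdapterConfig

# each string resolves to its specific config class
lora_config = AdapterConfig.load("lora")   # a LoRA config
bn_config = AdapterConfig.load("seq_bn")   # a bottleneck config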
Features that are not supported by adapters¶
Compared to adapter-transformers, there are a few features that are no longer supported by the adapters library:
Using transformers pipelines with adapters.
Using invertible adapters in the Hugging Face model classes. To use invertible adapters, you must use the AdapterModel classes.
Loading model and adapter checkpoints saved with save_pretrained using Hugging Face classes. This is only supported by the AdapterModel classes (see the sketch after this list).
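A sketch of the supported path for saving and reloading a full checkpoint with adapters (paths and names are illustrative):
from adapters import AutoAdapterModel

model = AutoAdapterModel.from_pretrained("bert-base-uncased")
model.add_adapter("my_adapter", config="seq_bn")
model.save_pretrained("./checkpoint")  # saves the model together with its adapters

# reloading must also go through an AdapterModel class,
# not a plain Hugging Face model class
model = AutoAdapterModel.from_pretrained("./checkpoint")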
What has remained the same¶
The functionality for adding, activating, and training adapters has not changed, except for the renaming of some adapter configs. You still add and activate adapters as follows:
# add adapter to the model
model.add_adapter("adapter_name", config="lora")
# activate adapter
model.set_active_adapters("adapter_name")
# freeze model weights and activate adapter
model.train_adapter("adapter_name")
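Putting the unchanged workflow together with the new initialisation step, a minimal end-to-end sketch (the adapter name and config string are illustrative):
import adapters
from transformers import AutoModel

model = AutoModel.from_pretrained("bert-base-uncased")
adapters.init(model)  # required for plain Hugging Face model classes

model.add_adapter("my_adapter", config="seq_bn")
model.train_adapter("my_adapter")  # freezes base model weights and activates the adapter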