AdapterHub Documentation

Getting Started

  • Installation
  • Quick Start
  • Adapter Training
  • Transitioning from adapter-transformers

Adapter Methods

  • Overview and Configuration
  • Adapter Methods
  • Method Combinations
  • Multi Task Methods

Advanced

  • Adapter Activation and Composition
  • Merging Adapters
  • Prediction Heads
  • Embeddings

Loading and Sharing

  • Loading Pre-Trained Adapters
  • Integration with Hugging Face’s Model Hub

Supported Models

  • Model Overview
  • Custom Models
  • ALBERT
  • Auto Classes
  • BART
  • BEiT
  • BERT
  • BertGeneration
  • CLIP
  • DeBERTa
  • DeBERTa-v2
  • DistilBERT
  • ELECTRA
  • Encoder Decoder Models
  • OpenAI GPT2
  • EleutherAI GPT-J-6B
  • LLaMA
  • Mistral
  • MBart
  • MT5
  • PLBART
  • RoBERTa
  • T5
  • Vision Transformer (ViT)
  • Whisper
  • XLM-RoBERTa
  • X-MOD

Adapter-Related Classes

  • Adapter Configuration
  • Model Adapters Config
  • Adapter Implementation
  • Adapter Model Interface
  • Model Mixins
  • Adapter Training
  • Adapter Utilities

Contributing

  • Contributing to AdapterHub
  • Adding Adapter Methods
  • Adding Adapters to a Model
  • Extending the Library


© Copyright 2020-2024, AdapterHub Team.
