MBart

The MBart model was presented in Multilingual Denoising Pre-training for Neural Machine Translation by Yinhan Liu, Jiatao Gu, Naman Goyal, Xian Li, Sergey Edunov, Marjan Ghazvininejad, Mike Lewis, and Luke Zettlemoyer.

According to the abstract, mBART is a sequence-to-sequence denoising auto-encoder pretrained on large-scale monolingual corpora in many languages using the BART objective. It is one of the first methods for pretraining a complete sequence-to-sequence model by denoising full texts in multiple languages, whereas previous approaches focused only on the encoder, only on the decoder, or on reconstructing parts of the text.
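As a quick orientation before the class reference below, here is a minimal translation sketch using a fine-tuned mBART checkpoint. The checkpoint name facebook/mbart-large-en-ro and the language codes en_XX/ro_RO come from the Hugging Face model hub and serve purely as an illustration; any fine-tuned MBart checkpoint can be substituted.

```python
from transformers import MBartForConditionalGeneration, MBartTokenizer

# Illustrative English-to-Romanian checkpoint from the Hugging Face hub.
tokenizer = MBartTokenizer.from_pretrained(
    "facebook/mbart-large-en-ro", src_lang="en_XX", tgt_lang="ro_RO"
)
model = MBartForConditionalGeneration.from_pretrained("facebook/mbart-large-en-ro")

batch = tokenizer("UN Chief Says There Is No Military Solution in Syria", return_tensors="pt")

# MBart expects the target-language code as the first decoder token.
translated_tokens = model.generate(
    **batch, decoder_start_token_id=tokenizer.lang_code_to_id["ro_RO"]
)
print(tokenizer.batch_decode(translated_tokens, skip_special_tokens=True)[0])
```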

Note

This class is nearly identical to the PyTorch implementation of MBart in Hugging Face Transformers. For more information, visit the corresponding section in their documentation.
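The one addition over the upstream implementation is MBartModelWithHeads, listed below, which supports the adapter workflow. A rough sketch of that workflow follows; the adapter/head name "sentiment" is a placeholder and the checkpoint name is again only illustrative.

```python
from transformers import MBartModelWithHeads

# Illustrative pretrained checkpoint.
model = MBartModelWithHeads.from_pretrained("facebook/mbart-large-cc25")

# Add a new adapter and a matching classification head ("sentiment" is a placeholder name).
model.add_adapter("sentiment")
model.add_classification_head("sentiment", num_labels=2)

# Freeze the pretrained weights so that only the adapter (and head) are trained.
model.train_adapter("sentiment")
```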

MBartConfig

MBartTokenizer

MBartTokenizerFast

MBart50Tokenizer

MBart50TokenizerFast

MBartModel

MBartModelWithHeads

MBartForConditionalGeneration

MBartForQuestionAnswering

MBartForSequenceClassification

MBartForCausalLM