XLM-RoBERTa

The XLM-RoBERTa model was proposed in Unsupervised Cross-lingual Representation Learning at Scale by Alexis Conneau, Kartikay Khandelwal, Naman Goyal, Vishrav Chaudhary, Guillaume Wenzek, Francisco Guzmán, Edouard Grave, Myle Ott, Luke Zettlemoyer and Veselin Stoyanov. It is based on Facebook's RoBERTa model, released in 2019, and is a large multilingual language model trained on 2.5 TB of filtered CommonCrawl data.

Note

This class is nearly identical to the PyTorch implementation of XLM-RoBERTa in Hugging Face Transformers. For more information, visit the corresponding section in their documentation.

XLMRobertaConfig

class transformers.XLMRobertaConfig(pad_token_id=1, bos_token_id=0, eos_token_id=2, **kwargs)

This class overrides RobertaConfig. Please check the superclass for the appropriate documentation alongside usage examples.
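
As a minimal sketch (not part of the original reference), a configuration can be instantiated directly and used to build a model with randomly initialized weights:

```python
from transformers import XLMRobertaConfig, XLMRobertaModel

# Build a default configuration; all hyperparameters take their default values
config = XLMRobertaConfig()

# Instantiate a model from the configuration.
# Note: this creates randomly initialized weights, it does not load a checkpoint.
model = XLMRobertaModel(config)
```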

XLMRobertaTokenizer

class transformers.XLMRobertaTokenizer(*args, **kwargs)

Constructs an XLM-RoBERTa tokenizer. Adapted from RobertaTokenizer and XLNetTokenizer; based on SentencePiece.
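
A short usage sketch, assuming the standard "xlm-roberta-base" checkpoint from the Hugging Face hub:

```python
from transformers import XLMRobertaTokenizer

# Load the SentencePiece vocabulary of the pretrained checkpoint
tokenizer = XLMRobertaTokenizer.from_pretrained("xlm-roberta-base")

# Encode a sentence into model-ready tensors
encoding = tokenizer("Hello, world!", return_tensors="pt")
print(encoding["input_ids"])
```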

XLMRobertaModel

class transformers.XLMRobertaModel(config, add_pooling_layer=True)

The bare XLM-RoBERTa Model transformer outputting raw hidden-states without any specific head on top.

This model inherits from PreTrainedModel. Check the superclass documentation for the generic methods the library implements for all its models (such as downloading or saving, resizing the input embeddings, pruning heads, etc.).

This model is also a PyTorch torch.nn.Module subclass. Use it as a regular PyTorch Module and refer to the PyTorch documentation for all matters related to general usage and behavior.

Parameters

config (XLMRobertaConfig) – Model configuration class with all the parameters of the model. Initializing with a config file does not load the weights associated with the model, only the configuration. Check out the from_pretrained() method to load the model weights.

This class overrides RobertaModel. Please check the superclass for the appropriate documentation alongside usage examples.

config_class

alias of transformers.models.xlm_roberta.configuration_xlm_roberta.XLMRobertaConfig
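
A minimal forward pass through the bare model, again assuming the "xlm-roberta-base" checkpoint:

```python
import torch
from transformers import XLMRobertaTokenizer, XLMRobertaModel

tokenizer = XLMRobertaTokenizer.from_pretrained("xlm-roberta-base")
model = XLMRobertaModel.from_pretrained("xlm-roberta-base")

inputs = tokenizer("Hello, world!", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Raw hidden states: (batch_size, sequence_length, hidden_size)
print(outputs.last_hidden_state.shape)
```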

XLMRobertaModelWithHeads

class transformers.XLMRobertaModelWithHeads(config)

XLM-RoBERTa Model with the option to add multiple flexible heads on top.

This model inherits from PreTrainedModel. Check the superclass documentation for the generic methods the library implements for all its models (such as downloading or saving, resizing the input embeddings, pruning heads, etc.).

This model is also a PyTorch torch.nn.Module subclass. Use it as a regular PyTorch Module and refer to the PyTorch documentation for all matters related to general usage and behavior.

Parameters

config (XLMRobertaConfig) – Model configuration class with all the parameters of the model. Initializing with a config file does not load the weights associated with the model, only the configuration. Check out the from_pretrained() method to load the model weights.

This class overrides RobertaModelWithHeads. Please check the superclass for the appropriate documentation alongside usage examples.

config_class

alias of transformers.models.xlm_roberta.configuration_xlm_roberta.XLMRobertaConfig
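
A sketch of the flexible-heads API; the head name and label count below are illustrative choices, not prescribed by this reference:

```python
from transformers import XLMRobertaModelWithHeads

model = XLMRobertaModelWithHeads.from_pretrained("xlm-roberta-base")

# Add a randomly initialized classification head named "sentiment"
# (the name and num_labels=2 are illustrative)
model.add_classification_head("sentiment", num_labels=2)
```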

XLMRobertaForMaskedLM

class transformers.XLMRobertaForMaskedLM(config)

XLM-RoBERTa Model with a language modeling head on top.

This model inherits from PreTrainedModel. Check the superclass documentation for the generic methods the library implements for all its models (such as downloading or saving, resizing the input embeddings, pruning heads, etc.).

This model is also a PyTorch torch.nn.Module subclass. Use it as a regular PyTorch Module and refer to the PyTorch documentation for all matters related to general usage and behavior.

Parameters

config (XLMRobertaConfig) – Model configuration class with all the parameters of the model. Initializing with a config file does not load the weights associated with the model, only the configuration. Check out the from_pretrained() method to load the model weights.

This class overrides RobertaForMaskedLM. Please check the superclass for the appropriate documentation alongside usage examples.

config_class

alias of transformers.models.xlm_roberta.configuration_xlm_roberta.XLMRobertaConfig
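
A sketch of masked-token prediction with the pretrained LM head; XLM-RoBERTa uses <mask> as its mask token:

```python
import torch
from transformers import XLMRobertaTokenizer, XLMRobertaForMaskedLM

tokenizer = XLMRobertaTokenizer.from_pretrained("xlm-roberta-base")
model = XLMRobertaForMaskedLM.from_pretrained("xlm-roberta-base")

inputs = tokenizer("The capital of France is <mask>.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Locate the masked position and take the highest-scoring token
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id)[0].nonzero(as_tuple=True)[0]
predicted_id = logits[0, mask_pos].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```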

XLMRobertaForSequenceClassification

class transformers.XLMRobertaForSequenceClassification(config)

XLM-RoBERTa Model transformer with a sequence classification/regression head on top (a linear layer on top of the pooled output) e.g. for GLUE tasks.

This model inherits from PreTrainedModel. Check the superclass documentation for the generic methods the library implements for all its models (such as downloading or saving, resizing the input embeddings, pruning heads, etc.).

This model is also a PyTorch torch.nn.Module subclass. Use it as a regular PyTorch Module and refer to the PyTorch documentation for all matters related to general usage and behavior.

Parameters

config (XLMRobertaConfig) – Model configuration class with all the parameters of the model. Initializing with a config file does not load the weights associated with the model, only the configuration. Check out the from_pretrained() method to load the model weights.

This class overrides RobertaForSequenceClassification. Please check the superclass for the appropriate documentation alongside usage examples.

config_class

alias of transformers.models.xlm_roberta.configuration_xlm_roberta.XLMRobertaConfig
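
A sketch of a forward pass. Since "xlm-roberta-base" ships without a fine-tuned classification head, the head below is randomly initialized, and num_labels=2 is an illustrative choice:

```python
import torch
from transformers import XLMRobertaTokenizer, XLMRobertaForSequenceClassification

tokenizer = XLMRobertaTokenizer.from_pretrained("xlm-roberta-base")
model = XLMRobertaForSequenceClassification.from_pretrained(
    "xlm-roberta-base", num_labels=2  # head is randomly initialized until fine-tuned
)

inputs = tokenizer("A sentence to classify.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape (batch_size, num_labels)
print(logits.argmax(dim=-1))
```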

XLMRobertaForMultipleChoice

class transformers.XLMRobertaForMultipleChoice(config)

XLM-RoBERTa Model with a multiple choice classification head on top (a linear layer on top of the pooled output and a softmax) e.g. for RocStories/SWAG tasks.

This model inherits from PreTrainedModel. Check the superclass documentation for the generic methods the library implements for all its models (such as downloading or saving, resizing the input embeddings, pruning heads, etc.).

This model is also a PyTorch torch.nn.Module subclass. Use it as a regular PyTorch Module and refer to the PyTorch documentation for all matters related to general usage and behavior.

Parameters

config (XLMRobertaConfig) – Model configuration class with all the parameters of the model. Initializing with a config file does not load the weights associated with the model, only the configuration. Check out the from_pretrained() method to load the model weights.

This class overrides RobertaForMultipleChoice. Please check the superclass for the appropriate documentation alongside usage examples.

config_class

alias of transformers.models.xlm_roberta.configuration_xlm_roberta.XLMRobertaConfig
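
A sketch of the expected input layout: multiple-choice inputs are shaped (batch_size, num_choices, sequence_length), so each (prompt, choice) pair is encoded and a batch dimension is added. The prompt and choices are made up for illustration:

```python
import torch
from transformers import XLMRobertaTokenizer, XLMRobertaForMultipleChoice

tokenizer = XLMRobertaTokenizer.from_pretrained("xlm-roberta-base")
model = XLMRobertaForMultipleChoice.from_pretrained("xlm-roberta-base")

prompt = "The weather today is"
choices = ["sunny and warm.", "a database index."]

# Encode one (prompt, choice) pair per choice: shape (num_choices, seq_len)
encoding = tokenizer([prompt, prompt], choices, return_tensors="pt", padding=True)

# Add the batch dimension: (batch_size=1, num_choices, seq_len)
inputs = {k: v.unsqueeze(0) for k, v in encoding.items()}

with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, num_choices)
print(logits.argmax(dim=-1))
```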

XLMRobertaForTokenClassification

class transformers.XLMRobertaForTokenClassification(config)

XLM-RoBERTa Model with a token classification head on top (a linear layer on top of the hidden-states output) e.g. for Named-Entity-Recognition (NER) tasks.

This model inherits from PreTrainedModel. Check the superclass documentation for the generic methods the library implements for all its models (such as downloading or saving, resizing the input embeddings, pruning heads, etc.).

This model is also a PyTorch torch.nn.Module subclass. Use it as a regular PyTorch Module and refer to the PyTorch documentation for all matters related to general usage and behavior.

Parameters

config (XLMRobertaConfig) – Model configuration class with all the parameters of the model. Initializing with a config file does not load the weights associated with the model, only the configuration. Check out the from_pretrained() method to load the model weights.

This class overrides RobertaForTokenClassification. Please check the superclass for the appropriate documentation alongside usage examples.

config_class

alias of transformers.models.xlm_roberta.configuration_xlm_roberta.XLMRobertaConfig
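
A sketch of per-token prediction; the head is randomly initialized until fine-tuned, and num_labels=5 stands in for a small NER tag set:

```python
import torch
from transformers import XLMRobertaTokenizer, XLMRobertaForTokenClassification

tokenizer = XLMRobertaTokenizer.from_pretrained("xlm-roberta-base")
model = XLMRobertaForTokenClassification.from_pretrained(
    "xlm-roberta-base", num_labels=5  # illustrative label count; head is untrained
)

inputs = tokenizer("Guillaume lives in Paris.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape (batch_size, seq_len, num_labels)

# One predicted label id per input token
print(logits.argmax(dim=-1))
```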