Unverified commit 2965b204, authored by Manuel Faysse, committed by GitHub

add no split modules for xlmrobertaxl (#31223)

parent 821b772a
@@ -572,6 +572,7 @@ class XLMRobertaXLPreTrainedModel(PreTrainedModel):
     config_class = XLMRobertaXLConfig
     base_model_prefix = "roberta"
+    _no_split_modules = ["XLMRobertaXLEmbeddings", "XLMRobertaXLSelfAttention"]
     # Copied from transformers.models.bert.modeling_bert.BertPreTrainedModel._init_weights
     def _init_weights(self, module):
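Context for the change: `_no_split_modules` tells Accelerate's big-model-inference device-map planner which module classes must live entirely on one device when a model is loaded with `device_map="auto"` (splitting, say, a self-attention block across GPUs would break residual connections and intra-block tensor flow). The toy sketch below illustrates the idea only; it is a hypothetical greedy planner, not the actual transformers/accelerate algorithm, and the sizes and module tree are made up:

```python
# Toy illustration of _no_split_modules: classes in the no-split set are
# placed on a single device as a unit; other containers may have their
# children spread across devices. Hypothetical sketch, not accelerate's code.

NO_SPLIT = {"XLMRobertaXLEmbeddings", "XLMRobertaXLSelfAttention"}

def plan_device_map(module, free, device_map, no_split=NO_SPLIT):
    """module = (name, class_name, size, children).
    free = per-device remaining capacity, mutated in place."""
    name, cls, size, children = module
    if cls in no_split or not children:
        # Atomic placement: the whole module must fit on one device.
        for dev, cap in enumerate(free):
            if cap >= size:
                free[dev] -= size
                device_map[name] = dev
                return
        raise MemoryError(f"{name} (size {size}) fits on no device")
    # Splittable container: place each child independently.
    for child in children:
        plan_device_map(child, free, device_map, no_split)

# Made-up module tree with arbitrary sizes.
attn = ("encoder.layer.0.attention", "XLMRobertaXLSelfAttention", 4, [])
ffn = ("encoder.layer.0.ffn", "Linear", 3, [])
layer = ("encoder.layer.0", "XLMRobertaXLLayer", 7, [attn, ffn])
emb = ("embeddings", "XLMRobertaXLEmbeddings", 5, [])
model = ("model", "XLMRobertaXLModel", 12, [emb, layer])

device_map = {}
plan_device_map(model, [6, 10], device_map)
print(device_map)
# → {'embeddings': 0, 'encoder.layer.0.attention': 1, 'encoder.layer.0.ffn': 1}
```

The embeddings block (size 5) fits whole on device 0; the attention block (size 4) does not fit in the remaining capacity, so it moves intact to device 1 rather than being split, which is exactly the guarantee `_no_split_modules` provides.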