Unverified Commit 78c1e7d2 authored by Jing Hua, committed by GitHub

xlm roberta xl config for doctest (#19610)


Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
parent 10ea45b9
@@ -86,12 +86,12 @@ class XLMRobertaXLConfig(PretrainedConfig):
 Examples:

 ```python
->>> from transformers import XLMRobertaXLModel, XLMRobertaXLConfig
+>>> from transformers import XLMRobertaXLConfig, XLMRobertaXLModel

 >>> # Initializing a XLM_ROBERTA_XL bert-base-uncased style configuration
 >>> configuration = XLMRobertaXLConfig()

->>> # Initializing a model from the bert-base-uncased style configuration
+>>> # Initializing a model (with random weights) from the bert-base-uncased style configuration
 >>> model = XLMRobertaXLModel(configuration)

 >>> # Accessing the model configuration
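For reference, the corrected docstring example is meant to run end to end as a doctest. Below is a minimal sketch of the full snippet, assuming `transformers` with XLM-RoBERTa-XL support and `torch` are installed; the final `model.config` access follows the usual pattern of these configuration docstrings and is shown here as an illustrative completion, since the docstring continues beyond the hunk above.

```python
>>> from transformers import XLMRobertaXLConfig, XLMRobertaXLModel

>>> # Initializing a XLM_ROBERTA_XL bert-base-uncased style configuration
>>> configuration = XLMRobertaXLConfig()

>>> # Initializing a model (with random weights) from the bert-base-uncased style configuration
>>> model = XLMRobertaXLModel(configuration)

>>> # Accessing the model configuration
>>> configuration = model.config
```

The import-order swap appears to bring the example in line with the `Config`-then-`Model` ordering used in other configuration docstrings covered by the doc tests.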
@@ -131,6 +131,7 @@ src/transformers/models/whisper/configuration_whisper.py
 src/transformers/models/whisper/modeling_whisper.py
 src/transformers/models/whisper/modeling_tf_whisper.py
 src/transformers/models/xlm_roberta/configuration_xlm_roberta.py
+src/transformers/models/xlm_roberta_xl/configuration_xlm_roberta_xl.py
 src/transformers/models/yolos/configuration_yolos.py
 src/transformers/models/yolos/modeling_yolos.py
 src/transformers/models/x_clip/modeling_x_clip.py
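The second hunk registers the XLM-RoBERTa-XL configuration module in the doc-test allow-list (presumably `utils/documentation_tests.txt`; the file path is not visible in this excerpt), the list of files whose docstring examples the doc-test job actually runs. A minimal sketch of checking that registration, assuming the working directory is a transformers source checkout:

```python
# Minimal sketch (assumes the working directory is a transformers source checkout):
# confirm the XLM-RoBERTa-XL configuration module is listed in the doc-test
# allow-list, which is what the second hunk of this commit adds.
from pathlib import Path

doc_test_list = Path("utils/documentation_tests.txt").read_text().splitlines()
target = "src/transformers/models/xlm_roberta_xl/configuration_xlm_roberta_xl.py"

assert target in doc_test_list, f"{target} is not registered for doc tests"
print(f"{target} is registered for doc tests")
```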