chenpangpang / transformers · Commits

Commit 4cc1bf81 — typos
Authored Jul 27, 2019 by thomwolf
Parent: ac27548b
Showing 3 changed files with 6 additions and 6 deletions:

- pytorch_transformers/modeling_auto.py (+2, -2)
- pytorch_transformers/modeling_utils.py (+2, -2)
- pytorch_transformers/tokenization_bert.py (+2, -2)
pytorch_transformers/modeling_auto.py

@@ -157,7 +157,7 @@ class AutoModel(object):
             - contains `xlnet`: XLNetConfig (XLNet model)
             - contains `xlm`: XLMConfig (XLM model)
-        The model is set in evaluation mode by default using `model.eval()` (Dropout modules are desactivated)
+        The model is set in evaluation mode by default using `model.eval()` (Dropout modules are deactivated)
         To train the model, you should first set it back in training mode with `model.train()`
         Params:
 ...
@@ -179,7 +179,7 @@ class AutoModel(object):
                 - the model was saved using the `save_pretrained(save_directory)` (loaded by suppling the save directory).
             **state_dict**: an optional state dictionnary for the model to use instead of a state dictionary loaded
                 from saved weights file.
-                This option can be used if you want to create a model from a pretrained configuraton but load your own weights.
+                This option can be used if you want to create a model from a pretrained configuration but load your own weights.
                 In this case though, you should check if using `save_pretrained(dir)` and `from_pretrained(save_directory)` is not
                 a simpler option.
             **cache_dir**: (`optional`) string:
 ...
pytorch_transformers/modeling_utils.py

@@ -324,7 +324,7 @@ class PreTrainedModel(nn.Module):
     def from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs):
         r"""Instantiate a pretrained pytorch model from a pre-trained model configuration.
-        The model is set in evaluation mode by default using `model.eval()` (Dropout modules are desactivated)
+        The model is set in evaluation mode by default using `model.eval()` (Dropout modules are deactivated)
         To train the model, you should first set it back in training mode with `model.train()`
         Params:
 ...
@@ -346,7 +346,7 @@ class PreTrainedModel(nn.Module):
                 - the model was saved using the `save_pretrained(save_directory)` (loaded by suppling the save directory).
             **state_dict**: an optional state dictionnary for the model to use instead of a state dictionary loaded
                 from saved weights file.
-                This option can be used if you want to create a model from a pretrained configuraton but load your own weights.
+                This option can be used if you want to create a model from a pretrained configuration but load your own weights.
                 In this case though, you should check if using `save_pretrained(dir)` and `from_pretrained(save_directory)` is not
                 a simpler option.
             **cache_dir**: (`optional`) string:
 ...
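The `model.eval()` / `model.train()` behaviour these docstrings describe can be illustrated with a minimal, self-contained sketch. The `Dropout` class below is a hypothetical toy, not the pytorch_transformers or PyTorch implementation; it only mirrors the documented semantics (identity in eval mode, random zeroing with inverted-dropout scaling in train mode):

```python
import random

class Dropout:
    """Toy dropout layer illustrating train/eval mode semantics."""

    def __init__(self, p=0.5):
        self.p = p              # probability of zeroing an activation
        self.training = True    # modules start in training mode

    def eval(self):
        self.training = False   # dropout deactivated, as after from_pretrained()
        return self

    def train(self):
        self.training = True    # dropout active again
        return self

    def __call__(self, xs):
        if not self.training:
            return list(xs)     # eval mode: identity, fully deterministic
        scale = 1.0 / (1.0 - self.p)  # inverted-dropout rescaling
        return [0.0 if random.random() < self.p else x * scale for x in xs]

drop = Dropout(p=0.5).eval()
assert drop([1.0, 2.0]) == [1.0, 2.0]  # eval mode leaves inputs untouched
```

This is why a freshly loaded model gives deterministic outputs until you call `model.train()`.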
pytorch_transformers/tokenization_bert.py

@@ -119,7 +119,7 @@ class BertTokenizer(PreTrainedTokenizer):
             Only has an effect when do_basic_tokenize=True
         **tokenize_chinese_chars**: (`optional`) boolean (default True)
             Whether to tokenize Chinese characters.
-            This should likely be desactivated for Japanese:
+            This should likely be deactivated for Japanese:
             see: https://github.com/huggingface/pytorch-pretrained-BERT/issues/328
         """
         super(BertTokenizer, self).__init__(unk_token=unk_token, sep_token=sep_token,
 ...
@@ -214,7 +214,7 @@ class BasicTokenizer(object):
             List of token not to split.
         **tokenize_chinese_chars**: (`optional`) boolean (default True)
             Whether to tokenize Chinese characters.
-            This should likely be desactivated for Japanese:
+            This should likely be deactivated for Japanese:
             see: https://github.com/huggingface/pytorch-pretrained-BERT/issues/328
         """
         if never_split is None:
 ...