chenpangpang/transformers, commit d7f98cd3: "Init card for model"
Authored Mar 18, 2020 by Junyi_Li; committed by Julien Chaumond. Parent: 38a555a8.
1 changed file: model_cards/clue/roberta_chinese_3L312_clue_tiny/README.md (new file, mode 100644, +39 / -0)
# Introduction

This model was trained on a TPU; the details are as follows:
## Model

| Model name | Params | Size | Training corpus | Vocab |
| :------------------------------------------------------ | :----- | :---- | :----------------- | :-----------: |
| **`RoBERTa-tiny-clue`**<br/>Super-small model            | 7.5M   | 28.3M | **CLUECorpus2020** | **CLUEVocab** |
| **`RoBERTa-tiny-pair`**<br/>Super-small sentence-pair model | 7.5M | 28.3M | **CLUECorpus2020** | **CLUEVocab** |
| **`RoBERTa-tiny3L768-clue`**<br/>Small model             | 38M    | 110M  | **CLUECorpus2020** | **CLUEVocab** |
| **`RoBERTa-tiny3L312-clue`**<br/>Small model             | <7.5M  | 24M   | **CLUECorpus2020** | **CLUEVocab** |
| **`RoBERTa-large-clue`**<br/>Large model                 | 290M   | 1.20G | **CLUECorpus2020** | **CLUEVocab** |
| **`RoBERTa-large-pair`**<br/>Large sentence-pair model   | 290M   | 1.20G | **CLUECorpus2020** | **CLUEVocab** |
### Usage

With the help of [Huggingface-Transformers 2.5.1](https://github.com/huggingface/transformers), you can use these models as follows:

```python
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("MODEL_NAME")
model = BertModel.from_pretrained("MODEL_NAME")
```
`MODEL_NAME`:

| Model name | Model link |
| -------------------------- | ------------------------------------------------------------ |
| **RoBERTa-tiny-clue**      | [`clue/roberta_chinese_clue_tiny`](https://huggingface.co/clue/roberta_chinese_clue_tiny) |
| **RoBERTa-tiny-pair**      | [`clue/roberta_chinese_pair_tiny`](https://huggingface.co/clue/roberta_chinese_pair_tiny) |
| **RoBERTa-tiny3L768-clue** | [`clue/roberta_chinese_3L768_clue_tiny`](https://huggingface.co/clue/roberta_chinese_3L768_clue_tiny) |
| **RoBERTa-tiny3L312-clue** | [`clue/roberta_chinese_3L312_clue_tiny`](https://huggingface.co/clue/roberta_chinese_3L312_clue_tiny) |
| **RoBERTa-large-clue**     | [`clue/roberta_chinese_clue_large`](https://huggingface.co/clue/roberta_chinese_clue_large) |
| **RoBERTa-large-pair**     | [`clue/roberta_chinese_pair_large`](https://huggingface.co/clue/roberta_chinese_pair_large) |
## Details

Please read our paper: https://arxiv.org/pdf/2003.01355

Please visit our repository: https://github.com/CLUEbenchmark/CLUEPretrainedModels.git