------------------------------------
NeZha Model Summary
------------------------------------



The table below summarizes the pretrained weights of the NeZha models currently supported by PaddleNLP.
For details of each model, please refer to the corresponding links. A minimal loading sketch follows the table.

+----------------------------------------------------------------------------------+--------------+----------------------------------------------------------------------------------+
| Pretrained Weight                                                                | Language     | Details of the model                                                             |
+==================================================================================+==============+==================================================================================+
|``nezha-base-chinese``                                                            | Chinese      | 12-layer, 768-hidden,                                                            |
|                                                                                  |              | 12-heads, 108M parameters.                                                       |
|                                                                                  |              | Trained on Chinese text.                                                         |
+----------------------------------------------------------------------------------+--------------+----------------------------------------------------------------------------------+
|``nezha-large-chinese``                                                           | Chinese      | 24-layer, 1024-hidden,                                                           |
|                                                                                  |              | 16-heads, 336M parameters.                                                       |
|                                                                                  |              | Trained on Chinese text.                                                         |
+----------------------------------------------------------------------------------+--------------+----------------------------------------------------------------------------------+
|``nezha-base-wwm-chinese``                                                        | Chinese      | 12-layer, 768-hidden,                                                            |
|                                                                                  |              | 12-heads, 108M parameters.                                                       |
|                                                                                  |              | Trained on Chinese text.                                                         |
+----------------------------------------------------------------------------------+--------------+----------------------------------------------------------------------------------+
|``nezha-large-wwm-chinese``                                                       | Chinese      | 24-layer, 1024-hidden,                                                           |
|                                                                                  |              | 16-heads, 336M parameters.                                                       |
|                                                                                  |              | Trained on Chinese text.                                                         |
+----------------------------------------------------------------------------------+--------------+----------------------------------------------------------------------------------+
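The snippet below is a minimal sketch of how one of these weight names can be loaded, assuming the ``NeZhaModel`` and ``NeZhaTokenizer`` classes exposed by ``paddlenlp.transformers`` and their ``from_pretrained`` interface; any weight name from the table can be substituted for ``nezha-base-chinese``.

.. code-block:: python

    import paddle
    from paddlenlp.transformers import NeZhaModel, NeZhaTokenizer

    # Load the tokenizer and backbone by the pretrained weight name from the table above.
    tokenizer = NeZhaTokenizer.from_pretrained("nezha-base-chinese")
    model = NeZhaModel.from_pretrained("nezha-base-chinese")

    # Encode a short Chinese sentence and run a forward pass.
    encoded = tokenizer("欢迎使用PaddleNLP！")
    inputs = {k: paddle.to_tensor([v]) for k, v in encoded.items()}
    sequence_output, pooled_output = model(**inputs)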