OpenDAS / ColossalAI · Commits

Commit db40e086, authored Oct 05, 2023 by Zhongkai Zhao

[test] modify model supporting part of low_level_zero plugin (including corresponding docs)

Parent: d1fcc0fa
Showing 3 changed files with 0 additions and 22 deletions:

- docs/source/en/basics/booster_plugins.md (+0, -6)
- docs/source/zh-Hans/basics/booster_plugins.md (+0, -6)
- tests/kit/model_zoo/torchrec/torchrec.py (+0, -10)
docs/source/en/basics/booster_plugins.md

@@ -44,12 +44,6 @@ We've tested compatibility on some famous models, following models may not be supported:
 - `timm.models.convit_base`
 - dlrm and deepfm models in `torchrec`
-- `diffusers.VQModel`
-- `transformers.AlbertModel`
-- `transformers.AlbertForPreTraining`
-- `transformers.BertModel`
-- `transformers.BertForPreTraining`
-- `transformers.GPT2DoubleHeadsModel`
 
 Compatibility problems will be fixed in the future.
docs/source/zh-Hans/basics/booster_plugins.md

@@ -42,12 +42,6 @@ Zero-2 does not support local gradient accumulation. If you insist on using it, although you can accumulate
 - `timm.models.convit_base`
 - dlrm and deepfm models in `torchrec`
-- `diffusers.VQModel`
-- `transformers.AlbertModel`
-- `transformers.AlbertForPreTraining`
-- `transformers.BertModel`
-- `transformers.BertForPreTraining`
-- `transformers.GPT2DoubleHeadsModel`
 
 Compatibility problems will be fixed in the future.
tests/kit/model_zoo/torchrec/torchrec.py

@@ -53,16 +53,6 @@ def output_transform_fn(x):
     return dict(output=x)
 
-def output_transform_fn(x):
-    if isinstance(x, KeyedTensor):
-        output = dict()
-        for key in x.keys():
-            output[key] = x[key]
-        return output
-    else:
-        return dict(output=x)
-
-
 def get_ebc():
     # EmbeddingBagCollection
     eb1_config = EmbeddingBagConfig(name="t1", embedding_dim=SHAPE, num_embeddings=SHAPE, feature_names=["f1"])
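The deleted helper flattened a torchrec `KeyedTensor` output into a plain dict keyed by feature name, and wrapped any other output in `dict(output=x)`. A minimal sketch of that same logic, using a hypothetical `FakeKeyedTensor` stand-in (assumed here so the example runs without torchrec installed):

```python
class FakeKeyedTensor:
    """Hypothetical stand-in for torchrec.KeyedTensor: maps feature keys to values."""

    def __init__(self, data):
        self._data = data

    def keys(self):
        return list(self._data.keys())

    def __getitem__(self, key):
        return self._data[key]


def output_transform_fn(x):
    # Mirror of the removed helper: flatten keyed outputs into a plain dict,
    # otherwise wrap the output under a single "output" key.
    if isinstance(x, FakeKeyedTensor):
        return {key: x[key] for key in x.keys()}
    return dict(output=x)


kt = FakeKeyedTensor({"f1": [1.0], "f2": [2.0]})
print(output_transform_fn(kt))     # {'f1': [1.0], 'f2': [2.0]}
print(output_transform_fn([3.0]))  # {'output': [3.0]}
```

With the real class, `isinstance(x, KeyedTensor)` and `x[key]` behave analogously; the flattening matters because test harnesses typically compare model outputs field by field.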