OpenDAS / ColossalAI · Commits

Commit 05545bfe (unverified)

[ColoTensor] throw error when ColoInitContext meets meta parameter. (#2105)

Authored Dec 09, 2022 by Jiarui Fang; committed by GitHub on Dec 09, 2022
Parent commit: d87baa85
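For readers arriving at this commit: `ColoInitContext` is the context manager in the changed file under which model parameters are converted to `ColoParameter` objects as the model is built. A rough usage sketch of the setting this commit guards; the `device` keyword argument is an assumption inferred from the converter's parameters, not confirmed by this page:

```python
import torch
from colossalai.utils.model.colo_init_context import ColoInitContext

# Modules constructed inside the context get their parameters converted
# by _convert_to_coloparam, the function patched in this commit.
with ColoInitContext(device=torch.device('cpu')):
    model = torch.nn.Linear(8, 8)
```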
Showing 1 changed file with 7 additions and 2 deletions:

colossalai/utils/model/colo_init_context.py (+7, -2)
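For context on what the new guard rejects: a parameter on PyTorch's `meta` device carries only shape and dtype metadata, with no storage, so it cannot be materialized by the `param.to(device=..., dtype=...)` call in `_convert_to_coloparam`. A minimal standalone illustration of the condition the guard tests (plain PyTorch, not ColossalAI code):

```python
import torch

# A meta tensor has shape/dtype but no data; moving it to a real device
# with .to() raises "NotImplementedError: Cannot copy out of meta tensor".
param = torch.nn.Parameter(torch.empty(4, 4, device='meta'))

print(param.device.type)  # 'meta' -- exactly what the new guard checks
print(param.is_meta)      # True   -- equivalent convenience flag
```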
colossalai/utils/model/colo_init_context.py

```diff
@@ -36,8 +36,13 @@ def _convert_to_coloparam(param: torch.nn.Parameter,
         return param
     # detaching tensor is necessary for optimizers.
     requires_grad = param.requires_grad
 
-    # param is the global tensor.
-    colo_param = ColoParameter(param.to(device=device, dtype=dtype), requires_grad=requires_grad)
+    if param.device.type == 'meta':
+        raise NotImplemented(
+            "ColoInitContext is initializing a model with meta parameters! This is not allowed right now!")
+    else:
+        # param is the global tensor.
+        colo_param = ColoParameter(param.to(device=device, dtype=dtype), requires_grad=requires_grad)
+
     # if default_shard_plan exists, shard the param during initialization.
     # This can reduce the model size after initialization.
```
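One caveat in the added guard: `NotImplemented` is Python's built-in sentinel for unsupported binary operations, not an exception class, and it is not callable, so the committed `raise NotImplemented(...)` line would itself fail with `TypeError: 'NotImplementedType' object is not callable` rather than surfacing the intended message. A hypothetical corrected sketch (not part of this commit) would use `NotImplementedError`:

```python
def _reject_meta_param(param) -> None:
    # Hypothetical corrected guard: NotImplementedError is the exception
    # class, whereas NotImplemented is a non-callable sentinel value.
    if param.device.type == 'meta':
        raise NotImplementedError(
            "ColoInitContext is initializing a model with meta parameters! This is not allowed right now!")
```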