OpenDAS / ColossalAI · Commits

Unverified commit b09adff7, authored Apr 04, 2023 by Yuanchen, committed by GitHub Apr 04, 2023

[chat]fix sft training for bloom, gpt and opt (#3418)

fix sft training for bloom, gpt and opt

Parent: 638a07a7
Changes: 3 changed files, with 9 additions and 0 deletions (+9 −0)

applications/Chat/coati/models/bloom/bloom_lm.py (+3 −0)
applications/Chat/coati/models/gpt/gpt_lm.py (+3 −0)
applications/Chat/coati/models/opt/opt_lm.py (+3 −0)
applications/Chat/coati/models/bloom/bloom_lm.py

```diff
@@ -33,3 +33,6 @@ class BLOOMLM(LM):
         if checkpoint:
             model.gradient_checkpointing_enable()
         super().__init__(model, lora_rank, lora_train_bias)
+
+    def forward(self, input_ids, attention_mask=None, labels=None, **kwargs):
+        return self.model(input_ids, attention_mask=attention_mask, labels=labels, **kwargs)
```
applications/Chat/coati/models/gpt/gpt_lm.py

```diff
@@ -33,3 +33,6 @@ class GPTLM(LM):
         if checkpoint:
             model.gradient_checkpointing_enable()
         super().__init__(model, lora_rank, lora_train_bias)
+
+    def forward(self, input_ids, attention_mask=None, labels=None, **kwargs):
+        return self.model(input_ids, attention_mask=attention_mask, labels=labels, **kwargs)
```
applications/Chat/coati/models/opt/opt_lm.py

```diff
@@ -33,3 +33,6 @@ class OPTLM(LM):
         if checkpoint:
             model.gradient_checkpointing_enable()
         super().__init__(model, lora_rank, lora_train_bias)
+
+    def forward(self, input_ids, attention_mask=None, labels=None, **kwargs):
+        return self.model(input_ids, attention_mask=attention_mask, labels=labels, **kwargs)
```
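The fix is the same in all three wrappers: each `*LM` class gains a `forward` that passes `labels` through to the underlying HuggingFace model, since transformers causal-LM models only compute a language-modeling loss when `labels` is supplied. A minimal sketch of the delegation pattern, using a hypothetical stand-in model (not the actual coati or transformers classes):

```python
class FakeCausalLM:
    """Stand-in for a HuggingFace causal LM: returns a loss only when
    `labels` is provided, mirroring transformers' behavior."""

    def __call__(self, input_ids, attention_mask=None, labels=None, **kwargs):
        loss = 0.0 if labels is not None else None
        return {"loss": loss, "logits": [[0.0] * 4 for _ in input_ids]}


class LMWrapper:
    """Hypothetical wrapper showing the pattern the commit applies."""

    def __init__(self, model):
        self.model = model

    # The fix: delegate forward() verbatim, including `labels`,
    # so SFT training can read a loss from the output.
    def forward(self, input_ids, attention_mask=None, labels=None, **kwargs):
        return self.model(input_ids, attention_mask=attention_mask,
                          labels=labels, **kwargs)


lm = LMWrapper(FakeCausalLM())
with_labels = lm.forward([[1, 2, 3]], labels=[[1, 2, 3]])
without_labels = lm.forward([[1, 2, 3]])
print(with_labels["loss"] is not None)   # True: SFT gets a loss
print(without_labels["loss"] is None)    # True: inference path unchanged
```

Without this override, a wrapper that dropped `labels` would return outputs with no loss, which is what broke SFT training for the bloom, gpt, and opt models.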