OpenDAS / ColossalAI · Commit 956b561b

[moe] fix mixtral forward default value (#5329)

Authored Jan 30, 2024 by Hongxin Liu; committed Feb 07, 2024 by ver217.
Parent: b60be18d

1 changed file, 1 addition and 1 deletion:
applications/ColossalMoE/colossal_moe/models/mixtral_policy.py (+1, −1)
@@ -437,7 +437,7 @@ class MixtralPipelineForwards:
         use_cache: Optional[bool] = None,
         output_attentions: Optional[bool] = None,
         output_hidden_states: Optional[bool] = None,
-        output_router_logits: Optional[bool] = True,
+        output_router_logits: Optional[bool] = None,
         return_dict: Optional[bool] = None,
         stage_manager: Optional[PipelineStageManager] = None,
         hidden_states: Optional[torch.FloatTensor] = None,
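Why the one-character change matters: in Hugging Face-style forwards, a keyword default of `None` conventionally means "defer to the model config", while a hardcoded `True` silently overrides whatever the user configured. A minimal sketch of that resolution pattern follows; the names `SimpleConfig` and `forward` are illustrative stand-ins, not the actual ColossalAI or transformers API.

```python
# Illustrative sketch of the None-means-"use config" convention that the
# commit restores for output_router_logits. Names here are hypothetical.
from dataclasses import dataclass
from typing import Optional


@dataclass
class SimpleConfig:
    # Stand-in for a model config flag such as output_router_logits.
    output_router_logits: bool = False


def forward(config: SimpleConfig, output_router_logits: Optional[bool] = None) -> bool:
    # None -> fall back to the config; an explicit True/False wins.
    # A default of True here would ignore the config entirely.
    if output_router_logits is not None:
        return output_router_logits
    return config.output_router_logits


cfg = SimpleConfig(output_router_logits=False)
print(forward(cfg))        # False: config value is honored
print(forward(cfg, True))  # True: explicit caller override
```

With the old `= True` default, a user who set `output_router_logits=False` in their config would still get router logits computed on every pipeline stage; `= None` lets the config decide.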