chenpangpang / transformers · Commits

Commit 93b54368 (unverified), authored Dec 08, 2022 by Younes Belkada, committed by GitHub on Dec 08, 2022

[`BiT`] Small patch fix (#20657)

* patch fix for `fp16`
* use `np` instead

Parent: 0526a075
Changes: 1 changed file, with 3 additions and 1 deletion

src/transformers/models/bit/modeling_bit.py (+3, -1)
src/transformers/models/bit/modeling_bit.py

@@ -18,6 +18,7 @@ import collections
 import math
 from typing import Optional, Tuple
 
+import numpy as np
 import torch
 import torch.utils.checkpoint
 from torch import Tensor, nn
@@ -592,7 +593,8 @@ class BitEncoder(nn.Module):
         dilation = 1
 
         layer_dropouts = [
-            x.tolist() for x in torch.linspace(0, config.drop_path_rate, sum(config.depths)).split(config.depths)
+            x.tolist()
+            for x in torch.Tensor(np.linspace(0, config.drop_path_rate, sum(config.depths))).split(config.depths)
         ]
 
         for stage_idx, (current_depth, current_hidden_size, layer_dropout) in enumerate(
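The patched expression can be sketched standalone. This is a minimal reproduction of the new pattern, using hypothetical values in place of the `BitConfig.drop_path_rate` and `BitConfig.depths` fields; the commit message only says the change is a "patch fix for `fp16`", presumably because computing the schedule in NumPy first sidesteps `torch.linspace` under a half-precision default dtype:

```python
import numpy as np
import torch

# Hypothetical config values standing in for BitConfig.drop_path_rate
# and BitConfig.depths (number of layers per stage).
drop_path_rate = 0.1
depths = [3, 4, 6, 3]

# Build the per-layer drop-path schedule in NumPy (always float64),
# convert to a torch tensor, then split it into one chunk per stage.
layer_dropouts = [
    x.tolist()
    for x in torch.Tensor(np.linspace(0, drop_path_rate, sum(depths))).split(depths)
]

print(len(layer_dropouts))              # one dropout list per stage
print([len(l) for l in layer_dropouts]) # one rate per layer in each stage
```

`Tensor.split` accepts a list of chunk sizes, so each stage receives exactly `depths[i]` linearly spaced drop-path rates, starting at 0 and ending at `drop_path_rate`.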