OpenDAS / OpenFold

Commit 30d50a18, authored Jun 30, 2023 by Geoffrey Yu
parent 1b4fc115

remove some of the printed out messages
Showing 1 changed file with 0 additions and 2 deletions.

openfold/utils/loss.py (+0 −2)
@@ -1606,7 +1606,6 @@ def masked_msa_loss(logits, true_msa, bert_mask, num_classes, eps=1e-8, **kwargs
     Returns:
         Masked MSA loss
     """
-    print(f"line 1609 logits shape: {logits.shape} and num_classes: {num_classes}")
     errors = softmax_cross_entropy(
         logits, torch.nn.functional.one_hot(true_msa, num_classes=num_classes)
     )
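For reference, the `softmax_cross_entropy` call above computes cross-entropy between the logits and a one-hot target along the class dimension. A minimal pure-Python sketch of that computation (the OpenFold version operates on batched torch tensors; the function name is reused here only for illustration):

```python
import math

def softmax_cross_entropy(logits, one_hot_target):
    # Numerically stable log-softmax: subtract the max before exponentiating.
    m = max(logits)
    log_z = m + math.log(sum(math.exp(x - m) for x in logits))
    # Cross-entropy against a one-hot target picks out -log_softmax at the
    # true class; the sum form matches the general definition.
    return -sum(t * (x - log_z) for t, x in zip(one_hot_target, logits))
```

With uniform logits over three classes the loss is log 3, and the loss shrinks as the logit at the true class grows.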
@@ -1997,7 +1996,6 @@ class AlphaFoldLoss(nn.Module):
                 loss = loss.new_tensor(0., requires_grad=True)
             cum_loss = cum_loss + weight * loss
             losses[loss_name] = loss.detach().clone()
         losses["unscaled_loss"] = cum_loss.detach().clone()
         # Scale the loss by the square root of the minimum of the crop size and
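The context in this hunk shows the loss-accounting pattern in `AlphaFoldLoss`: each component loss is folded into a weighted running total, while an unscaled copy is stored for logging. A minimal sketch with plain floats (the `accumulate_losses` helper and its dict layout are hypothetical; the real code works on torch tensors, where `.detach().clone()` keeps the logged copies out of the autograd graph):

```python
def accumulate_losses(components):
    """components: dict mapping loss_name -> (value, weight). Hypothetical helper."""
    cum_loss = 0.0
    logged = {}
    for loss_name, (loss, weight) in components.items():
        # Weighted sum drives optimization; the per-component value is
        # logged unweighted (torch would use loss.detach().clone() here).
        cum_loss = cum_loss + weight * loss
        logged[loss_name] = loss
    # The pre-crop-size-scaling total is logged as "unscaled_loss".
    logged["unscaled_loss"] = cum_loss
    return cum_loss, logged
```

Detaching before logging matters in the torch setting: without it, every stored component would keep the whole computation graph alive across steps.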