Commit 79f9f03d authored by Gustaf Ahdritz's avatar Gustaf Ahdritz

Fix distillation bug

parent b32bfeec
......@@ -208,6 +208,10 @@ and supports the full range of training options that entails, including
 multi-node distributed training. For more information, consult PyTorch
 Lightning documentation and the `--help` flag of the training script.
+
+Note that the data directory can also contain PDB files previously output by
+the model. These are treated as members of the self-distillation set and are
+subjected to distillation-set-only preprocessing steps.
 
 ## Testing
 
 To run unit tests, use
......
......@@ -202,7 +202,7 @@ def sample_msa(protein, max_seq, keep_extra, seed=None):
 @curry1
 def sample_msa_distillation(protein, max_seq):
     if(protein["is_distillation"] == 1):
-        protein = sample_msa(protein, max_seq, keep_extra=False)
+        protein = sample_msa(max_seq, keep_extra=False)(protein)
     return protein
......
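The fix above reflects that `sample_msa` is wrapped by the `@curry1` decorator: calling it supplies every argument *except* the first and returns a new function that takes the first argument (the `protein` dict). The sketch below illustrates the pattern with a minimal `curry1` and a toy stand-in for `sample_msa` (the real function's MSA-sampling logic is not reproduced here); it shows why the corrected two-step call is required.

```python
import functools

def curry1(f):
    """Curry on the first argument: f(x, *args) becomes f(*args)(x).

    A minimal sketch of the AlphaFold-style decorator; the real
    implementation may differ in details.
    """
    @functools.wraps(f)
    def fc(*args, **kwargs):
        return lambda x: f(x, *args, **kwargs)
    return fc

@curry1
def sample_msa(protein, max_seq, keep_extra):
    # Toy stand-in: truncate the MSA to max_seq rows and optionally
    # drop the extra-MSA features. Illustrative only.
    out = dict(protein, msa=protein["msa"][:max_seq])
    if not keep_extra:
        out.pop("extra_msa", None)
    return out

protein = {"msa": list(range(10)), "extra_msa": [0]}

# Correct usage: configure first, then apply to the protein dict.
result = sample_msa(5, keep_extra=False)(protein)
```

Passing the dict positionally, as in the removed line, would hand `protein` to `max_seq` inside the curried wrapper, which is exactly the distillation bug this commit fixes.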