Commit f7ed4a4b authored by Gustaf Ahdritz

Merge branch 'main' of ssh://github.com/aqlaboratory/openfold into main

parents 98b2c663 0fb5b743
@@ -89,9 +89,11 @@ where `data` is the same directory as in the previous step. If `jackhmmer`, `hhb
### Training
After activating the OpenFold environment with `source scripts/activate_conda_env.sh`, install OpenFold by running
```bash
python setup.py install
```
To train the model, you will first need to precompute protein alignments. Create `mmcif_dir/` and download `.cif` files from the PDB (e.g., `wget https://files.rcsb.org/download/4DSN.cif`). Then run:
```bash
...
@@ -449,7 +449,7 @@ class OpenFoldDataModule(pl.LightningDataModule):
class DummyDataset(torch.utils.data.Dataset):
    def __init__(self, batch_path):
        with open(batch_path, "rb") as f:
-            batch = pickle.load(f)
+            self.batch = pickle.load(f)
    def __getitem__(self, idx):
        return copy.deepcopy(self.batch)
...
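The one-character fix in this hunk (`batch` → `self.batch`) is easy to miss: without it, the unpickled batch only ever lived in a local variable of `__init__`, so every `__getitem__` call would raise `AttributeError`. A minimal, torch-free sketch of the fixed pattern (the class body mirrors the diff, but the batch contents and file handling here are illustrative stand-ins, not OpenFold's actual objects):

```python
import copy
import pickle
import tempfile

class DummyDataset:
    """Torch-free stand-in for the fixed DummyDataset above."""

    def __init__(self, batch_path):
        with open(batch_path, "rb") as f:
            # Before the fix this read `batch = pickle.load(f)`: a local
            # variable that vanished when __init__ returned, so __getitem__
            # raised AttributeError on its first call.
            self.batch = pickle.load(f)

    def __len__(self):
        return 1

    def __getitem__(self, idx):
        # Deep-copy so downstream in-place ops cannot mutate the cached batch.
        return copy.deepcopy(self.batch)

# Usage: pickle a fake batch to disk, then read it back through the dataset.
with tempfile.NamedTemporaryFile(suffix=".pkl", delete=False) as f:
    pickle.dump({"aatype": [0, 1, 2]}, f)
    path = f.name

ds = DummyDataset(path)
item = ds[0]
item["aatype"].append(3)             # mutating the returned copy...
assert ds[0]["aatype"] == [0, 1, 2]  # ...leaves the cached batch untouched
```

Returning a deep copy on every access is what makes a single cached batch safe to reuse across training steps.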
@@ -120,9 +120,6 @@ def compute_fape(
    return normed_error
-# DISCREPANCY: From the way this function is written, it's possible that
-# DeepMind clamped 90% of individual residue losses, not 90% of all batches.
-# We defer to the text, which seems to imply the latter.
def backbone_loss(
    backbone_affine_tensor: torch.Tensor,
    backbone_affine_mask: torch.Tensor,
@@ -164,7 +161,6 @@ def backbone_loss(
        1 - use_clamped_fape
    )
-    # Take the mean over the layer dimension
    fape_loss = torch.mean(fape_loss)
    return fape_loss
...
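The lines kept in this hunk implement the clamped/unclamped FAPE mix that the deleted DISCREPANCY comment discusses: each loss is blended as `clamped * use_clamped_fape + unclamped * (1 - use_clamped_fape)`, with `use_clamped_fape` acting as a 0/1 flag (clamped for roughly 90% of training samples, per that comment), and the removed inline comment noted that the final `torch.mean` averages over the layer dimension. A torch-free sketch of that mixing (function and variable names are illustrative, not OpenFold's API):

```python
# Sketch of the clamped-FAPE blend from the hunk above, on plain floats.
# `flags` plays the role of `use_clamped_fape`: 1.0 selects the clamped
# loss, 0.0 the unclamped one; a fractional flag would interpolate.
def mix_fape(clamped, unclamped, flags):
    # Mirrors: fape_loss = clamped * flag + unclamped * (1 - flag)
    return [c * u + unc * (1 - u) for c, unc, u in zip(clamped, unclamped, flags)]

def mean(xs):
    # Stand-in for the final torch.mean over the layer dimension.
    return sum(xs) / len(xs)

clamped = [0.5, 0.4]
unclamped = [0.9, 0.8]
flags = [1.0, 0.0]  # first entry uses the clamped loss, second the unclamped
print(mean(mix_fape(clamped, unclamped, flags)))  # 0.65
```

Writing the blend as a multiply-and-add (rather than an `if`) keeps it differentiable and broadcastable, which is why the tensor version in `backbone_loss` takes the same form.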