Commit 2c278ff0 authored by Nayan Singhal, committed by Facebook Github Bot

Alignment Training task using minibatch

Summary:
1. Define an EpochMinibatchIterator, which extends EpochBatchIterator. It has the same functionality as EpochBatchIterator except for two major changes: it uses static batching, and it uses MiniBatchIterator to get the indices.

2. SplitSeqCollater is used instead of Seq2SeqCollater.
3. LSTM_subsample now stores the previous states and resets them once the sample is over.
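The two ideas above can be sketched in miniature. This is a hypothetical illustration, not the actual fairseq classes: `split_sample` stands in for static batching (cutting one long sample into fixed-length splits), and `StatefulEncoder` stands in for LSTM_subsample's behavior of carrying state across splits and resetting it at a sample boundary.

```python
def split_sample(tokens, split_len):
    """Static batching: slice one long sample into fixed-length splits."""
    return [tokens[i:i + split_len] for i in range(0, len(tokens), split_len)]


class StatefulEncoder:
    """Stand-in for LSTM_subsample's state caching (illustrative only)."""

    def __init__(self):
        self.prev_state = None  # cached state from the previous split

    def forward(self, split, new_sample=False):
        if new_sample:
            self.prev_state = None  # reset once the previous sample is over
        # A real model would run an LSTM step seeded with prev_state; here
        # the "state" is just a running token count for illustration.
        running = 0 if self.prev_state is None else self.prev_state
        self.prev_state = running + len(split)
        return self.prev_state
```

Consecutive calls to `forward` accumulate state until `new_sample=True` clears it, mirroring the reset-per-sample behavior described in point 3.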

Reviewed By: jay-mahadeokar

Differential Revision: D15209023

fbshipit-source-id: 900b8bd1f25159ffc77f8106e26729a3e7422a1f
parent cd1e5c09
...@@ -325,7 +325,7 @@ class FairseqEncoderModel(BaseFairseqModel):
         Returns:
             the encoder's output, typically of shape `(batch, seq_len, vocab)`
         """
-        return self.encoder(src_tokens, src_lengths)
+        return self.encoder(src_tokens, src_lengths, **kwargs)

     def get_normalized_probs(self, net_output, log_probs, sample=None):
         """Get normalized probabilities (or log probs) from a net's output."""
...
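The `**kwargs` change in the diff can be sketched as follows. The class names and the `extra_state` keyword are illustrative assumptions; the point is only that the model's `forward()` now passes any extra keyword arguments through to the encoder unchanged.

```python
class Encoder:
    # Illustrative encoder; `extra_state` is a hypothetical keyword argument.
    def forward(self, src_tokens, src_lengths, extra_state=None):
        return {"tokens": src_tokens, "lengths": src_lengths,
                "extra_state": extra_state}


class EncoderModel:
    def __init__(self, encoder):
        self.encoder = encoder

    def forward(self, src_tokens, src_lengths, **kwargs):
        # Forward any task-specific kwargs (e.g. cached state) to the encoder,
        # as the patched FairseqEncoderModel.forward does above.
        return self.encoder.forward(src_tokens, src_lengths, **kwargs)
```

Without the `**kwargs` pass-through, keyword arguments supplied by a task (such as the cached states described in the commit message) would be silently dropped before reaching the encoder.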