"test/vscode:/vscode.git/clone" did not exist on "c1f401fc580c8b7875a5b7ac415058b31c7a4331"
Commit fd33fb97 authored by Benjamin Fattori's avatar Benjamin Fattori Committed by lintangsutawika
Browse files

comment on padding method for encoder

parent 26a9a445
```diff
@@ -186,11 +186,12 @@ class Seq2SeqHFLM(LM):
                 )
             )
         # TODO: Right now, we pass single EOT token to the Encoder and the full context to the decoder
         rolling_token_windows = [(None,) + x for x in rolling_token_windows]
         pad_amnt = 0
         if self.world_size > 1:
             # TODO: Comment on what we do here
+            # We pad out the external document-level iterator so the inner iterator doesn't hang
             mytensor = torch.tensor(len(rolling_token_windows), device=self.device)
             gathered = (
                 self.accelerator.gather(mytensor).cpu().detach().numpy().tolist()
```
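The comment added in this commit describes a common multi-process pattern: each rank gathers the per-rank window counts and pads its own list up to the maximum, so every rank performs the same number of collective calls and none of them hangs waiting on a rank that finished early. A minimal plain-Python sketch of that padding step (the helper name `pad_windows` is hypothetical; the real code gathers tensor counts via `self.accelerator.gather`):

```python
def pad_windows(local_windows, gathered_counts):
    """Pad this rank's window list to the largest count across all ranks.

    Ranks with fewer documents would otherwise stop issuing collective
    calls early, leaving the other ranks blocked inside gather.
    """
    pad_amnt = max(gathered_counts) - len(local_windows)
    if pad_amnt > 0:
        # Reuse the last real window as filler; its scores are discarded
        # after evaluation by truncating the result list by pad_amnt.
        local_windows = local_windows + [local_windows[-1]] * pad_amnt
    return local_windows, pad_amnt

# Example: this rank has 3 windows, but the largest rank has 5,
# so two filler windows are appended.
padded, pad_amnt = pad_windows(["w0", "w1", "w2"], gathered_counts=[5, 3, 4])
```

After scoring, the padded entries are dropped (e.g. `results = results[: len(results) - pad_amnt]`) so they never affect the reported metrics.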