Commit 473389a3 authored by Jiajun Shen, committed by Facebook Github Bot

Small bug fix for generation when batch_size is small

Summary: Pull Request resolved: https://github.com/fairinternal/fairseq-py/pull/727

Differential Revision: D16332742

Pulled By: myleott

fbshipit-source-id: becedd573c2c071fd21fcb5e55fead554c9bd9d1
parent 61e328cc
@@ -135,7 +135,7 @@ def main(args):
             print('T-{}\t{}'.format(sample_id, target_str))

         # Process top predictions
-        for j, hypo in enumerate(hypos[i][:min(len(hypos), args.nbest)]):
+        for j, hypo in enumerate(hypos[i][:args.nbest]):
             hypo_tokens, hypo_str, alignment = utils.post_process_prediction(
                 hypo_tokens=hypo['tokens'].int().cpu(),
                 src_str=src_str,
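The one-line change above can be illustrated with a small sketch. `hypos` holds one list of hypotheses per sentence in the batch, so `len(hypos)` is the batch size, not the number of hypotheses for sentence `i`. When the batch is smaller than `args.nbest`, the old `min(len(hypos), args.nbest)` slice truncated the printed hypotheses. The data below is hypothetical, chosen only to show the difference:

```python
nbest = 5

# A batch of 2 sentences, each with 5 candidate hypotheses
# (placeholder strings stand in for fairseq's hypothesis dicts).
hypos = [
    ['sent{}-hypo{}'.format(i, j) for j in range(nbest)]
    for i in range(2)
]

i = 0
buggy = hypos[i][:min(len(hypos), nbest)]  # sliced by batch size (2), not nbest
fixed = hypos[i][:nbest]                   # sliced by nbest (5), as intended

print(len(buggy))  # 2 -- hypotheses 3-5 are silently dropped
print(len(fixed))  # 5 -- all requested hypotheses are kept
```

With a batch size of 2 and `nbest=5`, the buggy slice prints only 2 hypotheses per sentence; the fix restores all 5.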