Commit 5df85022 authored by Mohammad Shoeybi

Merge branch 'fixfp16gensamples' into 'master'

fixed a bug with fp16 while generating samples

See merge request ADLR/megatron-lm!29
parents a4cb4153 cfc6924b
@@ -98,6 +98,10 @@ def get_batch(context_tokens, args):
                                            args.reset_attention_mask,
                                            False)
+    # Fp16 conversion.
+    if args.fp16:
+        attention_mask = attention_mask.half()
+
     return tokens, attention_mask, position_ids


 def top_k_logits(logits, top_k=0, top_p=0.0, filter_value=-float('Inf')):
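For context, the sketch below illustrates the pattern this merge request applies: when the model runs with `--fp16`, the attention mask (built in float32 by default) is cast to half precision before being returned, so its dtype matches the model's. This is not the actual Megatron-LM code; the function name `build_batch` and the simple causal-mask construction are illustrative stand-ins for `get_batch` and `get_masks_and_position_ids`.

```python
import argparse
import torch


def build_batch(context_tokens, args):
    """Simplified stand-in for get_batch: returns tokens, attention mask, position ids."""
    tokens = torch.tensor(context_tokens, dtype=torch.long).unsqueeze(0)
    seq_len = tokens.size(1)
    # Lower-triangular causal mask; torch.ones produces float32 by default.
    attention_mask = torch.tril(torch.ones(1, seq_len, seq_len))
    position_ids = torch.arange(seq_len, dtype=torch.long).unsqueeze(0)
    # Fp16 conversion: the fix added by this merge request.
    if args.fp16:
        attention_mask = attention_mask.half()
    return tokens, attention_mask, position_ids


if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("--fp16", action="store_true")
    args = parser.parse_args()
    tokens, mask, pos = build_batch([101, 2009, 2003], args)
    print(mask.dtype)  # torch.float16 when --fp16 is passed, torch.float32 otherwise
```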