Unverified commit e980377a authored by Lyu Han, committed by GitHub


Fix side effect brought by supporting codellama: `sequence_start` is always true when calling `model.get_prompt` (#466)
parent 71945001
@@ -123,7 +123,7 @@ def main(model_path,
             step = 0
             seed = random.getrandbits(64)
         else:
-            prompt = model.get_prompt(prompt, nth_round)
+            prompt = model.get_prompt(prompt, nth_round == 1)
         input_ids = tokenizer.encode(prompt)
         if step + len(input_ids) >= tm_model.session_len:
             print('WARNING: exceed session max length.'
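The root cause: `nth_round` is an integer round counter, and any nonzero integer is truthy, so passing it where a boolean `sequence_start` is expected made every round look like the start of a sequence. A minimal sketch of the effect, using a hypothetical stand-in for `model.get_prompt` (the real chat template logic in lmdeploy differs, but the truthiness bug is the same):

```python
# Hypothetical simplification of model.get_prompt: a chat template
# typically prepends the system/BOS preamble only when a sequence starts.
def get_prompt(prompt, sequence_start=True):
    if sequence_start:
        return f'<BOS><system>{prompt}'
    return f'<user>{prompt}'

# Before the fix: the raw round counter was passed. Round 2 is a
# nonzero int, hence truthy, so the preamble is wrongly re-emitted.
assert get_prompt('hi', 2).startswith('<BOS>')

# After the fix: compare against 1, so only the first round is
# treated as the start of the sequence.
assert get_prompt('hi', 1 == 1).startswith('<BOS>')
assert not get_prompt('hi', 2 == 1).startswith('<BOS>')
```

Writing the comparison `nth_round == 1` at the call site keeps `get_prompt`'s signature boolean-typed instead of relying on implicit int-to-bool coercion.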