Unverified Commit 0085e712 authored by Sylvain Gugger, committed by GitHub

Don't stop at num_epochs when using IterableDataset (#12561)

parent 6f1adc43
...@@ -1110,7 +1110,8 @@ class Trainer: ...@@ -1110,7 +1110,8 @@ class Trainer:
else: else:
# see __init__. max_steps is set when the dataset has no __len__ # see __init__. max_steps is set when the dataset has no __len__
max_steps = args.max_steps max_steps = args.max_steps
num_train_epochs = int(args.num_train_epochs) # Setting a very large number of epochs so we go as many times as necessary over the iterator.
num_train_epochs = sys.maxsize
num_update_steps_per_epoch = max_steps num_update_steps_per_epoch = max_steps
num_train_samples = args.max_steps * total_train_batch_size num_train_samples = args.max_steps * total_train_batch_size
......
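
The change targets the branch of Trainer's training setup used when the train dataset has no __len__ (e.g. an IterableDataset): there is no meaningful epoch count, so training should be bounded by args.max_steps alone. Previously the epoch loop was still capped at int(args.num_train_epochs), which could stop training before max_steps was reached; setting num_train_epochs = sys.maxsize lets the loop restart the iterator as many times as needed. Below is a minimal sketch of the same pattern outside Trainer, assuming PyTorch is available; the ShortStream class and the step budget are hypothetical, chosen only to show several passes over the iterator.

import sys

from torch.utils.data import DataLoader, IterableDataset


class ShortStream(IterableDataset):
    """Hypothetical stream with no __len__: each pass yields only 8 items."""

    def __iter__(self):
        return iter(range(8))


max_steps = 10                   # training is bounded by steps, not epochs
num_train_epochs = sys.maxsize   # "restart the iterator as many times as necessary"

loader = DataLoader(ShortStream(), batch_size=4)   # 2 batches per pass
completed_steps = 0
for epoch in range(num_train_epochs):
    for batch in loader:
        # forward / backward / optimizer.step() would go here
        completed_steps += 1
        if completed_steps >= max_steps:
            break
    if completed_steps >= max_steps:
        break

print(f"stopped after {completed_steps} steps over {epoch + 1} passes")
# -> stopped after 10 steps over 5 passes

With a fixed epoch cap (say 3), the loop above would stop after 6 steps even though max_steps is 10; the sys.maxsize cap removes that premature cutoff while the step check still terminates training.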