Unverified commit 4861c258 authored by Quentin Meeus, committed by GitHub

Add thousands separator in training summary (#22583)

The logger prints a summary at the beginning of training that displays some info such as the number of examples, the number of parameters, the total number of steps, etc. Those numbers can be quite large and difficult to read. I added a thousands separator (Python's `,` format specifier; see the sketch after this list) to improve readability for the following:
- num_examples
- num_train_epochs
- per_device_train_batch_size
- total_train_batch_size
- max_steps
- num_trainable_params
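
For reference, here is a minimal sketch of the formatting this change relies on. The variable names and values below are hypothetical, chosen only to show the separator; the actual change is in the diff further down.

```python
# Python's "," format specifier inserts a comma as the thousands separator.
num_examples = 2_500_000  # hypothetical value for illustration
max_steps = 31250         # hypothetical value for illustration

print(f"  Num examples = {num_examples:,}")           # prints "  Num examples = 2,500,000"
print(f"  Total optimization steps = {max_steps:,}")  # prints "  Total optimization steps = 31,250"
```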
parent 2a91a9ef
```diff
@@ -1764,13 +1764,13 @@ class Trainer:
         # Train!
         logger.info("***** Running training *****")
-        logger.info(f"  Num examples = {num_examples}")
-        logger.info(f"  Num Epochs = {num_train_epochs}")
-        logger.info(f"  Instantaneous batch size per device = {args.per_device_train_batch_size}")
-        logger.info(f"  Total train batch size (w. parallel, distributed & accumulation) = {total_train_batch_size}")
+        logger.info(f"  Num examples = {num_examples:,}")
+        logger.info(f"  Num Epochs = {num_train_epochs:,}")
+        logger.info(f"  Instantaneous batch size per device = {args.per_device_train_batch_size:,}")
+        logger.info(f"  Total train batch size (w. parallel, distributed & accumulation) = {total_train_batch_size:,}")
         logger.info(f"  Gradient Accumulation steps = {args.gradient_accumulation_steps}")
-        logger.info(f"  Total optimization steps = {max_steps}")
-        logger.info(f"  Number of trainable parameters = {get_model_param_count(model, trainable_only=True)}")
+        logger.info(f"  Total optimization steps = {max_steps:,}")
+        logger.info(f"  Number of trainable parameters = {get_model_param_count(model, trainable_only=True):,}")
         self.state.epoch = 0
         start_time = time.time()
```