chenpangpang / transformers · Commit 2834c17a

Clarify batch size displayed when using DataParallel (#24430)

Authored by Sylvain Gugger on Jun 22, 2023; committed by GitHub on Jun 22, 2023 (unverified).
Parent: b6295b26
Showing 1 changed file with 3 additions and 1 deletion:

src/transformers/trainer.py (+3, −1)
```diff
@@ -1671,7 +1671,9 @@ class Trainer:
         logger.info("***** Running training *****")
         logger.info(f"  Num examples = {num_examples:,}")
         logger.info(f"  Num Epochs = {num_train_epochs:,}")
-        logger.info(f"  Instantaneous batch size per device = {self._train_batch_size:,}")
+        logger.info(f"  Instantaneous batch size per device = {self.args.per_device_train_batch_size:,}")
+        if self.args.per_device_train_batch_size != self._train_batch_size:
+            logger.info(f"  Training with DataParallel so batch size has been adjusted to: {self._train_batch_size:,}")
         logger.info(f"  Total train batch size (w. parallel, distributed & accumulation) = {total_train_batch_size:,}")
         logger.info(f"  Gradient Accumulation steps = {args.gradient_accumulation_steps}")
         logger.info(f"  Total optimization steps = {max_steps:,}")
```
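Context for the change: under `torch.nn.DataParallel`, a single DataLoader feeds every GPU, so the batch size actually handed to the loader is the per-device size multiplied by the GPU count. The diff above makes the log report the user-requested per-device size, and only mentions the adjusted value when the two differ. A minimal sketch of that scaling (a simplified illustration, not the actual `Trainer` code; the function name is hypothetical):

```python
def effective_train_batch_size(per_device_train_batch_size: int, n_gpu: int) -> int:
    """Batch size handed to the DataLoader when DataParallel splits
    one large batch across devices; with 0 or 1 GPUs the per-device
    size is used as-is."""
    return per_device_train_batch_size * max(1, n_gpu)

per_device = 8
print(effective_train_batch_size(per_device, n_gpu=4))  # 32 -> logged as the adjusted size
print(effective_train_batch_size(per_device, n_gpu=1))  # 8  -> no adjustment message
```

This is why `self._train_batch_size` can differ from `self.args.per_device_train_batch_size`, and why the new `if` branch only logs the adjustment when they disagree.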