Unverified Commit 8b5da9fc authored by Phuc Van Phan, committed by GitHub

refactor: change default block_size when block_size > max position embeddings (#26069)

* refactor: change default block_size when it is not initialized

* reformat: take the min of 1024 and the model's max position embeddings
parent c63e2701
@@ -574,9 +574,9 @@ def main():
         if block_size > config.max_position_embeddings:
             logger.warning(
                 f"The tokenizer picked seems to have a very large `model_max_length` ({tokenizer.model_max_length}). "
-                "Picking 1024 instead. You can change that default value by passing --block_size xxx."
+                f"Using block_size={min(1024, config.max_position_embeddings)} instead. You can change that default value by passing --block_size xxx."
             )
-            block_size = 1024
+            block_size = min(1024, config.max_position_embeddings)
     else:
         if data_args.block_size > tokenizer.model_max_length:
             logger.warning(
...
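For context, here is a minimal, self-contained sketch of the block-size selection logic this commit produces. `resolve_block_size` and its parameters are hypothetical names standing in for `data_args.block_size`, `tokenizer.model_max_length`, and `config.max_position_embeddings` in the surrounding `run_clm.py`; the clamping in the `else` branch follows the script's existing behavior for a user-supplied value.

```python
import logging

logger = logging.getLogger(__name__)


def resolve_block_size(requested_block_size, model_max_length, max_position_embeddings):
    """Illustrative mirror of the block_size selection after this commit.

    - No --block_size given: start from the tokenizer's model_max_length,
      but cap the fallback at min(1024, max_position_embeddings) so the
      default never exceeds the model's context window.
    - --block_size given: clamp it to the tokenizer's model_max_length.
    """
    if requested_block_size is None:
        block_size = model_max_length
        if block_size > max_position_embeddings:
            logger.warning(
                "The tokenizer picked seems to have a very large `model_max_length` "
                f"({model_max_length}). Using "
                f"block_size={min(1024, max_position_embeddings)} instead."
            )
            block_size = min(1024, max_position_embeddings)
    else:
        if requested_block_size > model_max_length:
            logger.warning(
                f"The block_size passed ({requested_block_size}) is larger than the "
                f"maximum length for the model ({model_max_length}). "
                f"Using block_size={model_max_length}."
            )
        block_size = min(requested_block_size, model_max_length)
    return block_size


# Example: a tokenizer with an effectively unbounded model_max_length and a
# model with max_position_embeddings=512 now defaults to 512, not 1024.
print(resolve_block_size(None, int(1e30), 512))  # -> 512
```

The point of the change is that the fallback is no longer a hard-coded 1024: a model whose `max_position_embeddings` is smaller than 1024 now gets a default `block_size` that fits its context window instead of one that would overflow it.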