Unverified commit 6f54ec3d authored by Guolin Ke, committed by GitHub

[doc] better doc for `keep_training_booster` (#3275)



* [doc] better doc for `keep_training_booster`

* Update python-package/lightgbm/engine.py
Co-authored-by: Nikita Titov <nekit94-08@mail.ru>
parent 9b263735
@@ -128,6 +128,7 @@ def train(params, train_set, num_boost_round=100,
keep_training_booster : bool, optional (default=False)
Whether the returned Booster will be used to keep training.
If False, the returned value will be converted into _InnerPredictor before returning.
When your model is very large and causes memory errors, you can try to set this param to ``True`` to avoid the model conversion performed during the internal call of ``model_to_string``.
You can still use _InnerPredictor as ``init_model`` for future continued training.
callbacks : list of callables or None, optional (default=None)
List of callback functions that are applied at each iteration.