Commit 6a1a538f authored by leasunhy, committed by Nikita Titov

[docs] remove duplicated param in Python-Intro.rst (#2181)

`num_round` is redundant here because it will be overridden by `num_trees` in the `param` dictionary.
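For context, LightGBM treats `num_trees` as an alias of `num_iterations`, and an alias found in the params dict takes precedence over the `num_boost_round` argument passed to `train()`. The sketch below is a hypothetical, simplified illustration of that precedence rule (not the real library code; the helper name `resolve_num_boost_round` is invented for this example):

```python
# Simplified sketch of LightGBM-style alias resolution: if any alias of
# num_iterations appears in the params dict, its value wins over the
# num_boost_round argument given to train()/cv().
NUM_ITERATION_ALIASES = (
    "num_iterations", "num_iteration", "num_tree", "num_trees",
    "num_round", "num_rounds",
)

def resolve_num_boost_round(params, num_boost_round):
    """Return the effective number of boosting rounds."""
    for alias in NUM_ITERATION_ALIASES:
        if alias in params:
            # The value in params overrides the explicit argument,
            # which is why passing both is redundant.
            return params[alias]
    return num_boost_round

param = {'num_leaves': 31, 'num_trees': 100, 'objective': 'binary'}
print(resolve_num_boost_round(param, 10))  # 100, not 10
```

This is why the doc example should set the round count in only one place: either in the params dict or via the positional `num_round` argument, not both.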
parent a66c3440
@@ -141,7 +141,7 @@ For instance:

 .. code:: python

-    param = {'num_leaves': 31, 'num_trees': 100, 'objective': 'binary'}
+    param = {'num_leaves': 31, 'objective': 'binary'}
     param['metric'] = 'auc'

 - You can also specify multiple eval metrics:
@@ -185,7 +185,6 @@ Training with 5-fold CV:

 .. code:: python

-    num_round = 10
     lgb.cv(param, train_data, num_round, nfold=5)

 Early Stopping
@@ -196,7 +195,7 @@ Early stopping requires at least one set in ``valid_sets``. If there is more tha

 .. code:: python

-    bst = lgb.train(param, train_data, num_round, valid_sets=valid_sets, early_stopping_rounds=10)
+    bst = lgb.train(param, train_data, num_round, valid_sets=valid_sets, early_stopping_rounds=5)
     bst.save_model('model.txt', num_iteration=bst.best_iteration)

 The model will train until the validation score stops improving.