Commit 87392e79 authored by Guolin Ke's avatar Guolin Ke

update doc for v2

parent 4f77bd28
...@@ -16,6 +16,8 @@ For more details, please refer to [Features](https://github.com/Microsoft/LightG
News
----
02/20/2017 : Update to LightGBM v2. Added two new features: Gradient-based One-Side Sampling (GOSS) and Exclusive Feature Bundling (EFB). With GOSS and EFB, LightGBM can further speed up training. Details are available in [Features](https://github.com/Microsoft/LightGBM/wiki/Features).
01/08/2017 : Release [**R-package**](./R-package) beta version; you are welcome to try it and provide feedback.
12/05/2016 : [deprecated in v2] **Categorical Features as input directly** (without one-hot coding). Experiments on [Expo data](http://stat-computing.org/dataexpo/2009/) show about an 8x speed-up with the same accuracy compared with one-hot coding (refer to [categorical log](https://github.com/guolinke/boosting_tree_benchmarks/blob/master/lightgbm/lightgbm_dataexpo_speed.log) and [one-hot log](https://github.com/guolinke/boosting_tree_benchmarks/blob/master/lightgbm/lightgbm_dataexpo_onehot_speed.log)).
...
...@@ -29,6 +29,7 @@ The parameter format is ```key1=value1 key2=value2 ... ``` . And parameters can
* ```boosting```, default=```gbdt```, type=enum, options=```gbdt```,```dart```,```goss```, alias=```boost```,```boosting_type```
* ```gbdt```, traditional Gradient Boosting Decision Tree
* ```dart```, [Dropouts meet Multiple Additive Regression Trees](https://arxiv.org/abs/1505.01866)
* ```goss```, Gradient-based One-Side Sampling
* ```data```, default=```""```, type=string, alias=```train```,```train_data```
* training data, LightGBM will train from this data
* ```valid```, default=```""```, type=multi-string, alias=```test```,```valid_data```,```test_data```
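As a concrete illustration of the ```key1=value1 key2=value2``` parameter format, a minimal training config selecting the new ```goss``` booster could look like the sketch below (```train.conf```, ```train.txt```, ```valid.txt```, and the ```binary``` objective are hypothetical choices for this sketch, not part of this commit):

```
# hypothetical train.conf, run as: ./lightgbm config=train.conf
task = train
boosting = goss
objective = binary
data = train.txt
valid = valid.txt
num_iterations = 100
learning_rate = 0.1
```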
...@@ -95,6 +96,10 @@ The parameter format is ```key1=value1 key2=value2 ... ``` . And parameters can
* only used in ```dart```, set to true if you want to use xgboost dart mode
* ```drop_seed```, default=```4```, type=int
* only used in ```dart```, random seed to choose dropping models
* ```top_rate```, default=```0.2```, type=double
* only used in ```goss```, the retained ratio of large-gradient data
* ```other_rate```, default=```0.1```, type=double
* only used in ```goss```, the retained ratio of small-gradient data
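The interplay of ```top_rate``` and ```other_rate``` can be sketched in plain Python (an illustrative re-implementation, not LightGBM's actual code): GOSS keeps the ```top_rate``` fraction of instances with the largest absolute gradients, randomly samples an ```other_rate``` fraction of the remainder, and up-weights those sampled instances by ```(1 - top_rate) / other_rate``` so the total gradient stays approximately unbiased.

```python
import random

def goss_sample(gradients, top_rate=0.2, other_rate=0.1, seed=4):
    """Sketch of Gradient-based One-Side Sampling over one iteration.

    Returns the selected instance indices and a weight per selected
    instance; large-gradient instances keep weight 1.0 and the sampled
    small-gradient instances are amplified by (1 - top_rate) / other_rate.
    """
    n = len(gradients)
    top_n = int(top_rate * n)
    rest_n = int(other_rate * n)
    # Sort instance indices by descending absolute gradient.
    order = sorted(range(n), key=lambda i: -abs(gradients[i]))
    top_idx = order[:top_n]
    # Randomly sample from the remaining small-gradient instances.
    rng = random.Random(seed)
    rest_idx = rng.sample(order[top_n:], rest_n)
    amplify = (1.0 - top_rate) / other_rate
    idx = top_idx + rest_idx
    weights = [1.0] * top_n + [amplify] * rest_n
    return idx, weights
```

With the defaults above, each tree is trained on 30% of the data (20% large-gradient plus 10% sampled), which is the source of the speed-up the news entry describes.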
## IO parameters
...