Commit ca30afe1 authored by Nikita Titov, committed by Qiwei Ye

[python][examples] Added example of custom feval with sklearn wrapper (#1503)

* added example about custom feval with sklearn wrapper

* fixed pylint

* added new example in contents

* unified run example commands for Windows and Linux

* added note about comments in examples conf files
parent c647b66f
@@ -3,6 +3,8 @@ Examples
 You can learn how to use LightGBM by these examples.
+
+Comments in configuration files might be outdated. Actual information about parameters always can be found [here](https://github.com/Microsoft/LightGBM/blob/master/docs/Parameters.rst).
 Machine Learning Challenge Winning Solutions
 ============================================
@@ -5,19 +5,13 @@ Here is an example for LightGBM to run binary classification task.
 ***You should copy executable file to this folder first.***
-Trainin
--------
+Training
+--------
-For Windows, by running following command in this folder:
+Run the following command in this folder:
 ```
-lightgbm.exe config=train.conf
-```
-For Linux, by running following command in this folder:
-```
-./lightgbm config=train.conf
+"./lightgbm" config=train.conf
 ```
 Prediction
@@ -25,14 +19,8 @@ Prediction
 You should finish training first.
-For Windows, by running following command in this folder:
-```
-lightgbm.exe config=predict.conf
-```
-For Linux, by running following command in this folder:
+Run the following command in this folder:
 ```
-./lightgbm config=predict.conf
+"./lightgbm" config=predict.conf
 ```
@@ -8,16 +8,10 @@ Here is an example for LightGBM to run lambdarank task.
 Training
 --------
-For Windows, by running following command in this folder:
+Run the following command in this folder:
 ```
-lightgbm.exe config=train.conf
-```
-For Linux, by running following command in this folder:
-```
-./lightgbm config=train.conf
+"./lightgbm" config=train.conf
 ```
 Prediction
@@ -25,14 +19,8 @@ Prediction
 You should finish training first.
-For Windows, by running following command in this folder:
-```
-lightgbm.exe config=predict.conf
-```
-For Linux, by running following command in this folder:
+Run the following command in this folder:
 ```
-./lightgbm config=predict.conf
+"./lightgbm" config=predict.conf
 ```
@@ -8,16 +8,10 @@ Here is an example for LightGBM to run multiclass classification task.
 Training
 --------
-For Windows, by running following command in this folder:
+Run the following command in this folder:
 ```
-lightgbm.exe config=train.conf
-```
-For Linux, by running following command in this folder:
-```
-./lightgbm config=train.conf
+"./lightgbm" config=train.conf
 ```
 Prediction
@@ -25,14 +19,8 @@ Prediction
 You should finish training first.
-For Windows, by running following command in this folder:
-```
-lightgbm.exe config=predict.conf
-```
-For Linux, by running following command in this folder:
+Run the following command in this folder:
 ```
-./lightgbm config=predict.conf
+"./lightgbm" config=predict.conf
 ```
@@ -3,7 +3,7 @@ Parallel Learning Example
 Here is an example for LightGBM to perform parallel learning for 2 machines.
-1. Edit mlist.txt, write the ip of these 2 machines that you want to run application on.
+1. Edit [mlist.txt](./mlist.txt): write the ip of these 2 machines that you want to run application on.
 ```
 machine1_ip 12400
@@ -14,10 +14,8 @@ Here is an example for LightGBM to perform parallel learning for 2 machines.
 3. Run command in this folder on both 2 machines:
-For Windows: ```lightgbm.exe config=train.conf```
-For Linux: ```./lightgbm config=train.conf```
+```"./lightgbm" config=train.conf```
-This parallel learning example is based on socket. LightGBM also support parallel learning based on mpi.
+This parallel learning example is based on socket. LightGBM also supports parallel learning based on mpi.
 For more details about the usage of parallel learning, please refer to [this](https://github.com/Microsoft/LightGBM/blob/master/docs/Parallel-Learning-Guide.rst).
@@ -29,6 +29,7 @@ Examples include:
 - Create data for learning with sklearn interface
 - Basic train and predict with sklearn interface
 - Feature importances with sklearn interface
+- Self-defined eval metric with sklearn interface
 - Find best parameters for the model with sklearn's GridSearchCV
 - [advanced_example.py](https://github.com/Microsoft/LightGBM/blob/master/examples/python-guide/advanced_example.py)
 - Set feature names
@@ -144,7 +144,7 @@ def loglikelood(preds, train_data):
 # self-defined eval metric
-# f(preds: array, train_data: Dataset) -> name: string, value: array, is_higher_better: bool
+# f(preds: array, train_data: Dataset) -> name: string, eval_result: float, is_higher_better: bool
 # binary error
 def binary_error(preds, train_data):
     labels = train_data.get_label()
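The hunk above cuts off `binary_error` after its first line. As a rough sketch of how such a feval can be completed under the documented `(name, eval_result, is_higher_better)` contract — the 0.5 threshold and error formula here are assumptions, and a stub class stands in for `lightgbm.Dataset`:

```python
import numpy as np


class FakeDataset:
    """Minimal stand-in for lightgbm.Dataset, exposing only get_label()."""

    def __init__(self, labels):
        self._labels = np.asarray(labels)

    def get_label(self):
        return self._labels


def binary_error(preds, train_data):
    # f(preds: array, train_data: Dataset) -> name, eval_result (float), is_higher_better
    labels = train_data.get_label()
    # fraction of rows misclassified at a 0.5 threshold; lower is better,
    # hence is_higher_better=False
    return 'error', float(np.mean(labels != (preds > 0.5))), False


name, err, higher_better = binary_error(np.array([0.1, 0.9, 0.8, 0.3]),
                                        FakeDataset([0, 1, 0, 0]))
# one of the four predictions crosses the threshold on the wrong side,
# so err == 0.25
```

In real use the `train_data` argument is the `lightgbm.Dataset` the booster was trained on, and the function is passed via the `feval` parameter.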
 # coding: utf-8
 # pylint: disable = invalid-name, C0111
-import lightgbm as lgb
+import numpy as np
 import pandas as pd
+import lightgbm as lgb
 from sklearn.metrics import mean_squared_error
 from sklearn.model_selection import GridSearchCV
@@ -35,6 +37,27 @@ print('The rmse of prediction is:', mean_squared_error(y_test, y_pred) ** 0.5)
 # feature importances
 print('Feature importances:', list(gbm.feature_importances_))
+
+# self-defined eval metric
+# f(y_true: array, y_pred: array) -> name: string, eval_result: float, is_higher_better: bool
+# Root Mean Squared Logarithmic Error (RMSLE)
+def rmsle(y_true, y_pred):
+    return 'RMSLE', np.sqrt(np.mean(np.power(np.log1p(y_pred) - np.log1p(y_true), 2))), False
+
+
+print('Start training with custom eval function...')
+# train
+gbm.fit(X_train, y_train,
+        eval_set=[(X_test, y_test)],
+        eval_metric=rmsle,
+        early_stopping_rounds=5)
+
+print('Start predicting...')
+# predict
+y_pred = gbm.predict(X_test, num_iteration=gbm.best_iteration_)
+# eval
+print('The rmsle of prediction is:', rmsle(y_test, y_pred)[1])
+
 # other scikit-learn modules
 estimator = lgb.LGBMRegressor(num_leaves=31)
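The `rmsle` added in this hunk relies on numpy, but the formula itself is easy to sanity-check in isolation. A minimal pure-stdlib sketch of the same computation (test values chosen arbitrarily):

```python
import math


def rmsle(y_true, y_pred):
    # Root Mean Squared Logarithmic Error on plain sequences,
    # mirroring the numpy version in the diff above
    squared_log_errors = [(math.log1p(p) - math.log1p(t)) ** 2
                          for t, p in zip(y_true, y_pred)]
    return math.sqrt(sum(squared_log_errors) / len(squared_log_errors))


# perfect predictions give zero error
assert rmsle([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]) == 0.0
```

`log1p` is used rather than `log` so that zero-valued targets and predictions stay finite, and the metric penalizes relative rather than absolute error.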
@@ -8,16 +8,10 @@ Here is an example for LightGBM to run regression task.
 Training
 --------
-For Windows, by running following command in this folder:
+Run the following command in this folder:
 ```
-lightgbm.exe config=train.conf
-```
-For Linux, by running following command in this folder:
-```
-./lightgbm config=train.conf
+"./lightgbm" config=train.conf
 ```
 Prediction
@@ -25,14 +19,8 @@ Prediction
 You should finish training first.
-For Windows, by running following command in this folder:
-```
-lightgbm.exe config=predict.conf
-```
-For Linux, by running following command in this folder:
+Run the following command in this folder:
 ```
-./lightgbm config=predict.conf
+"./lightgbm" config=predict.conf
 ```