Unverified commit 6356e659 authored by Qingyun Wu, committed by GitHub

[docs] Add FLAML for efficient hyperparameter optimization (#4013)



* add FLAML for HPO in DOC

* add FLAML for HPO

* revise FLAML phrasing

* Update docs/Parameters-Tuning.rst
Co-authored-by: Nikita Titov <nekit94-08@mail.ru>

* Update README.md
Co-authored-by: Nikita Titov <nekit94-08@mail.ru>
parent 3ab6bbf9
@@ -42,7 +42,8 @@ Next you may want to read:
 - [**Parameters**](https://github.com/microsoft/LightGBM/blob/master/docs/Parameters.rst) is an exhaustive list of customization you can make.
 - [**Distributed Learning**](https://github.com/microsoft/LightGBM/blob/master/docs/Parallel-Learning-Guide.rst) and [**GPU Learning**](https://github.com/microsoft/LightGBM/blob/master/docs/GPU-Tutorial.rst) can speed up computation.
 - [**Laurae++ interactive documentation**](https://sites.google.com/view/lauraepp/parameters) is a detailed guide for hyperparameters.
-- [**Optuna Hyperparameter Tuner**](https://medium.com/optuna/lightgbm-tuner-new-optuna-integration-for-hyperparameter-optimization-8b7095e99258) provides automated tuning for LightGBM hyperparameters ([code examples](https://github.com/optuna/optuna/blob/master/examples/)).
+- [**FLAML**](https://www.microsoft.com/en-us/research/project/fast-and-lightweight-automl-for-large-scale-data/articles/flaml-a-fast-and-lightweight-automl-library/) provides automated tuning for LightGBM ([code examples](https://github.com/microsoft/FLAML/blob/main/notebook/flaml_lightgbm.ipynb)).
+- [**Optuna Hyperparameter Tuner**](https://medium.com/optuna/lightgbm-tuner-new-optuna-integration-for-hyperparameter-optimization-8b7095e99258) provides automated tuning for LightGBM hyperparameters ([code examples](https://github.com/optuna/optuna/tree/master/examples/lightgbm)).

 Documentation for contributors:
@@ -59,6 +60,8 @@ Some old update logs are available at [Key Events](https://github.com/microsoft/
 External (Unofficial) Repositories
 ----------------------------------
+FLAML (AutoML library for hyperparameter optimization): https://github.com/microsoft/FLAML
 Optuna (hyperparameter optimization framework): https://github.com/optuna/optuna
 Julia-package: https://github.com/IQVIA-ML/LightGBM.jl
@@ -7,6 +7,7 @@ This page contains parameters tuning guides for different scenarios.
 - `Parameters <./Parameters.rst>`__
 - `Python API <./Python-API.rst>`__
+- `FLAML`_ for automated hyperparameter tuning
 - `Optuna`_ for automated hyperparameter tuning

 Tune Parameters for the Leaf-wise (Best-first) Tree
@@ -214,3 +215,5 @@ Deal with Over-fitting
 - Try increasing ``path_smooth``

 .. _Optuna: https://medium.com/optuna/lightgbm-tuner-new-optuna-integration-for-hyperparameter-optimization-8b7095e99258
+
+.. _FLAML: https://github.com/microsoft/FLAML
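Both links added by this change point at automated hyperparameter tuners. As a rough, library-free illustration of what such tools automate, the sketch below runs a plain random search over two LightGBM-style parameters against a synthetic objective. The objective function and search ranges here are made up for the example, and real tuners like FLAML and Optuna use much smarter, cost-aware search strategies than uniform random sampling.

```python
import random

# Hypothetical search space over two common LightGBM parameters
# (ranges chosen for illustration only).
SPACE = {
    "num_leaves": (2, 256),          # integer range
    "learning_rate": (0.001, 0.3),   # float range
}


def sample_config(rng):
    """Draw one random configuration from the search space."""
    leaves_lo, leaves_hi = SPACE["num_leaves"]
    lr_lo, lr_hi = SPACE["learning_rate"]
    return {
        "num_leaves": rng.randint(leaves_lo, leaves_hi),
        "learning_rate": rng.uniform(lr_lo, lr_hi),
    }


def toy_loss(config):
    """Synthetic stand-in for a validation loss; a real tuner would
    train and evaluate a LightGBM model at this point."""
    return (config["num_leaves"] - 31) ** 2 / 1000 + abs(config["learning_rate"] - 0.05)


def random_search(n_trials=50, seed=0):
    """Keep the best configuration seen across n_trials random draws."""
    rng = random.Random(seed)
    best_config, best_loss = None, float("inf")
    for _ in range(n_trials):
        config = sample_config(rng)
        loss = toy_loss(config)
        if loss < best_loss:
            best_config, best_loss = config, loss
    return best_config, best_loss


if __name__ == "__main__":
    config, loss = random_search()
    print("best config:", config, "loss:", loss)
```

The loop above is the skeleton every tuner shares (propose a configuration, score it, keep the best); the tuners linked in the docs replace the uniform sampler with adaptive strategies and handle time budgets, early stopping, and parallelism.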