LightGBM, Light Gradient Boosting Machine
==========

LightGBM is a gradient boosting framework that uses tree-based learning algorithms. It is designed to be distributed and efficient, with the following advantages:

- Fast training speed
- Low memory usage
- Better accuracy
- Support for parallel learning
- Capable of handling large-scale data

For details, please refer to [Features](https://github.com/Microsoft/LightGBM/wiki/Features).

The [comparison experiments](https://github.com/Microsoft/LightGBM/wiki/Experiments#comparison-experiment) on public datasets show that LightGBM can outperform existing boosting tools in both learning efficiency and accuracy, with significantly lower memory consumption. In addition, the [parallel experiments](https://github.com/Microsoft/LightGBM/wiki/Experiments#parallel-experiment) show that LightGBM can achieve a linear speed-up by using multiple machines for training in specific settings.

Get Started
------------

For a quick start, please follow the [Installation Guide](https://github.com/Microsoft/LightGBM/wiki/Installation-Guide) and [Quick Start](https://github.com/Microsoft/LightGBM/wiki/Quick-Start).

Documents
------------

* [**Wiki**](https://github.com/Microsoft/LightGBM/wiki)
* [**Installation Guide**](https://github.com/Microsoft/LightGBM/wiki/Installation-Guide)
* [**Quick Start**](https://github.com/Microsoft/LightGBM/wiki/Quick-Start)
* [**Examples**](https://github.com/Microsoft/LightGBM/tree/master/examples)
* [**Features**](https://github.com/Microsoft/LightGBM/wiki/Features)
* [**Parallel Learning Guide**](https://github.com/Microsoft/LightGBM/wiki/Parallel-Learning-Guide)
* [**Configuration**](https://github.com/Microsoft/LightGBM/wiki/Configuration)

Microsoft Open Source Code of Conduct
------------

This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/).
For more information, see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments.
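To make the core idea concrete, here is a toy, standard-library-only sketch of what "a gradient boosting framework that uses tree-based learning" means: depth-1 regression trees (stumps) are fit additively to the residuals of the running prediction. This is purely illustrative and is *not* how LightGBM is implemented (LightGBM uses histogram-based, leaf-wise tree growth, among other optimizations described on the Features page); all names below are made up for the example.

```python
# Toy illustration only -- NOT LightGBM's actual algorithm.
# Gradient boosting for squared-error regression on a single 1-D feature,
# using depth-1 trees ("stumps") fit to residuals.

def fit_stump(xs, residuals):
    """Find the threshold split minimizing squared error of per-side means."""
    best = None  # (error, threshold, left_mean, right_mean)
    for t in sorted(set(xs)):
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        if not left or not right:
            continue
        lmean = sum(left) / len(left)
        rmean = sum(right) / len(right)
        err = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or err < best[0]:
            best = (err, t, lmean, rmean)
    _, t, lmean, rmean = best
    return lambda x, t=t, lm=lmean, rm=rmean: lm if x <= t else rm

def gradient_boost(xs, ys, n_rounds=50, learning_rate=0.3):
    """Additively fit stumps to residuals; return the ensemble predictor."""
    base = sum(ys) / len(ys)          # initial prediction: the mean
    preds = [base] * len(ys)
    stumps = []
    for _ in range(n_rounds):
        residuals = [y - p for y, p in zip(ys, preds)]
        stump = fit_stump(xs, residuals)
        stumps.append(stump)
        preds = [p + learning_rate * stump(x) for p, x in zip(preds, xs)]
    return lambda x: base + learning_rate * sum(s(x) for s in stumps)

# Tiny training set, roughly y = x.
xs = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
ys = [1.2, 1.9, 3.1, 3.9, 5.2, 5.8]
model = gradient_boost(xs, ys)
```

The shrinkage step (`learning_rate`) is what keeps each tree's contribution small so later trees can correct earlier ones; real LightGBM exposes the analogous knob but builds far more sophisticated trees over many features.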