LightGBM, Light Gradient Boosting Machine
=========================================

[![Join the chat at https://gitter.im/Microsoft/LightGBM](https://badges.gitter.im/Microsoft/LightGBM.svg)](https://gitter.im/Microsoft/LightGBM?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)
[![Build Status](https://travis-ci.org/Microsoft/LightGBM.svg?branch=master)](https://travis-ci.org/Microsoft/LightGBM)
[![Windows Build Status](https://ci.appveyor.com/api/projects/status/1ys5ot401m0fep6l/branch/master?svg=true)](https://ci.appveyor.com/project/guolinke/lightgbm/branch/master)
[![Documentation Status](https://readthedocs.org/projects/lightgbm/badge/?version=latest)](https://lightgbm.readthedocs.io/)
[![GitHub Issues](https://img.shields.io/github/issues/Microsoft/LightGBM.svg)](https://github.com/Microsoft/LightGBM/issues)
[![License](https://img.shields.io/badge/license-MIT-blue.svg)](https://github.com/Microsoft/LightGBM/blob/master/LICENSE)
[![Python Versions](https://img.shields.io/pypi/pyversions/lightgbm.svg)](https://pypi.python.org/pypi/lightgbm)
[![PyPI Version](https://badge.fury.io/py/lightgbm.svg)](https://badge.fury.io/py/lightgbm)

LightGBM is a gradient boosting framework that uses tree-based learning algorithms. It is designed to be distributed and efficient, with the following advantages:

- Faster training speed and higher efficiency
- Lower memory usage
- Better accuracy
- Parallel and GPU learning supported
- Capable of handling large-scale data

For more details, please refer to [Features](https://github.com/Microsoft/LightGBM/blob/master/docs/Features.md).

[Comparison experiments](https://github.com/Microsoft/LightGBM/blob/master/docs/Experiments.rst#comparison-experiment) on public datasets show that LightGBM can outperform existing boosting frameworks on both efficiency and accuracy, with significantly lower memory consumption. What's more, the [parallel experiments](https://github.com/Microsoft/LightGBM/blob/master/docs/Experiments.rst#parallel-experiment) show that LightGBM can achieve a linear speed-up by using multiple machines for training in specific settings.

News
----

08/15/2017 : Optimal split for categorical features.

07/13/2017 : [Gitter](https://gitter.im/Microsoft/LightGBM) is available.

06/20/2017 : Python-package is on [PyPI](https://pypi.python.org/pypi/lightgbm) now.

06/09/2017 : [LightGBM Slack team](https://lightgbm.slack.com) is available.

05/03/2017 : LightGBM v2 stable release.

04/10/2017 : LightGBM supports GPU-accelerated tree learning now. Please read our [GPU Tutorial](./docs/GPU-Tutorial.md) and [Performance Comparison](./docs/GPU-Performance.rst).

02/20/2017 : Update to LightGBM v2.

02/12/2017: LightGBM v1 stable release.

01/08/2017 : Release of the [**R-package**](https://github.com/Microsoft/LightGBM/tree/master/R-package) beta version; you are welcome to try it and provide feedback.

12/05/2016 : **Categorical Features as input directly** (without one-hot coding). 

12/02/2016 : Release of the [**Python-package**](https://github.com/Microsoft/LightGBM/tree/master/python-package) beta version; you are welcome to try it and provide feedback.

For more detailed update logs, see [Key Events](https://github.com/Microsoft/LightGBM/blob/master/docs/Key-Events.md).


External (unofficial) Repositories
----------------------------------

Julia Package: https://github.com/Allardvm/LightGBM.jl

JPMML: https://github.com/jpmml/jpmml-lightgbm


Get Started and Documentation
-----------------------------

Install by following the [guide](https://github.com/Microsoft/LightGBM/blob/master/docs/Installation-Guide.rst) for the command line program, [Python-package](https://github.com/Microsoft/LightGBM/tree/master/python-package) or [R-package](https://github.com/Microsoft/LightGBM/tree/master/R-package). Then please see the [Quick Start](https://github.com/Microsoft/LightGBM/blob/master/docs/Quick-Start.md) guide.

Our primary documentation is at https://lightgbm.readthedocs.io/ and is generated from this repository.

Next you may want to read:

* [**Examples**](https://github.com/Microsoft/LightGBM/tree/master/examples) showing command line usage of common tasks
* [**Features**](https://github.com/Microsoft/LightGBM/blob/master/docs/Features.md) and algorithms supported by LightGBM
* [**Parameters**](https://github.com/Microsoft/LightGBM/blob/master/docs/Parameters.md) is an exhaustive list of the customizations you can make
* [**Parallel Learning**](https://github.com/Microsoft/LightGBM/blob/master/docs/Parallel-Learning-Guide.rst) and [**GPU Learning**](https://github.com/Microsoft/LightGBM/blob/master/docs/GPU-Tutorial.md) can speed up computation
* [**Laurae++ interactive documentation**](https://sites.google.com/view/lauraepp/parameters) is a detailed guide for hyperparameters

Documentation for contributors:

* [**How we Update readthedocs.io**](https://github.com/Microsoft/LightGBM/blob/master/docs/README.md)
* Check out the [Development Guide](https://github.com/Microsoft/LightGBM/blob/master/docs/development.rst).

Support
-------

* Ask a question [on Stack Overflow with the `lightgbm` tag](https://stackoverflow.com/questions/ask?tags=lightgbm); we monitor it for new questions.
* Discuss on the [LightGBM Gitter](https://gitter.im/Microsoft/LightGBM).
* Open **bug reports** and **feature requests** (not questions) on [GitHub issues](https://github.com/Microsoft/LightGBM/issues).

How to Contribute
-----------------

LightGBM has been developed and used by many active community members. Your help is very valuable in making it better for everyone.

- Check out [call for contributions](https://github.com/Microsoft/LightGBM/issues?q=is%3Aissue+is%3Aopen+label%3Acall-for-contribution) to see what can be improved, or open an issue if there is something you want.
- Contribute to the [tests](https://github.com/Microsoft/LightGBM/tree/master/tests) to make it more reliable.
- Contribute to the [documentation](https://github.com/Microsoft/LightGBM/tree/master/docs) to make it clearer for everyone.
- Contribute to the [examples](https://github.com/Microsoft/LightGBM/tree/master/examples) to share your experience with other users.
- Open an issue if you meet problems during development.

Microsoft Open Source Code of Conduct
-------------------------------------

This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). For more information see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments.