LightGBM, Light Gradient Boosting Machine
=========================================
[![Build Status](https://travis-ci.org/Microsoft/LightGBM.svg?branch=master)](https://travis-ci.org/Microsoft/LightGBM)
[![Documentation Status](https://readthedocs.org/projects/lightgbm/badge/?version=latest)](http://lightgbm.readthedocs.io/)

LightGBM is a gradient boosting framework that uses tree-based learning algorithms. It is designed to be distributed and efficient, with the following advantages:

- Faster training speed and higher efficiency
- Lower memory usage
- Better accuracy
- Parallel and GPU learning supported
- Capable of handling large-scale data

For more details, please refer to [Features](https://github.com/Microsoft/LightGBM/wiki/Features).

[Experiments](https://github.com/Microsoft/LightGBM/wiki/Experiments#comparison-experiment) on public datasets show that LightGBM can outperform existing boosting frameworks in both efficiency and accuracy, with significantly lower memory consumption. Moreover, [parallel experiments](https://github.com/Microsoft/LightGBM/wiki/Experiments#parallel-experiment) show that LightGBM can achieve a linear speed-up by using multiple machines for training in specific settings.

News
----

05/03/2017: LightGBM v2 stable release.

04/10/2017: LightGBM now supports GPU-accelerated tree learning. Please read our [GPU Tutorial](./docs/GPU-Tutorial.md) and [Performance Comparison](./docs/GPU-Performance.md).
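As an illustrative sketch of what switching to GPU training looks like in the Python package (parameter names are taken from the GPU tutorial; treat the exact values here as placeholder assumptions):

```python
# Sketch only: a parameter dict that selects LightGBM's GPU tree learner
# instead of the default CPU one (names assumed from the GPU Tutorial).
params = {
    "objective": "binary",
    "device": "gpu",        # use the GPU tree learner
    "gpu_platform_id": 0,   # which OpenCL platform to use
    "gpu_device_id": 0,     # which device on that platform
    "num_leaves": 31,
    "learning_rate": 0.1,
}
```

The same `device = gpu` setting can be placed in a CLI config file; everything else about training is unchanged.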

02/20/2017: Update to LightGBM v2.

02/12/2017: LightGBM v1 stable release.

01/08/2017: Release of the [**R-package**](./R-package) beta version; please try it out and provide feedback.

12/05/2016: **Categorical features as input directly** (without one-hot encoding). An experiment on [Expo data](http://stat-computing.org/dataexpo/2009/) shows about an 8x speed-up with the same accuracy compared with one-hot encoding.
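The practical point is that a categorical column can be fed as compact integer codes instead of being expanded into one-hot vectors. A minimal, library-free sketch of such an encoding (the helper name is ours, not part of LightGBM's API):

```python
def encode_categorical(column):
    """Map each distinct category to a small integer code --
    the compact representation that avoids one-hot expansion."""
    codes = {}
    return [codes.setdefault(value, len(codes)) for value in column]

# Example: a string-valued carrier column like those in the Expo data
carriers = ["AA", "UA", "AA", "DL", "UA"]
print(encode_categorical(carriers))  # [0, 1, 0, 2, 1]
```

With the data in this form, the column can be declared categorical to LightGBM rather than one-hot encoded, which is where the reported speed-up comes from.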

12/02/2016: Release of the [**python-package**](./python-package) beta version; please try it out and provide feedback.

External (unofficial) Repositories
----------------------------------

Julia Package: https://github.com/Allardvm/LightGBM.jl

JPMML: https://github.com/jpmml/jpmml-lightgbm


Get Started and Documents
-------------------------
To get started, please follow the [Installation Guide](https://github.com/Microsoft/LightGBM/wiki/Installation-Guide) and [Quick Start](https://github.com/Microsoft/LightGBM/wiki/Quick-Start).
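As a taste of the CLI workflow the Quick Start walks through, a training run is driven by a small config file. A hedged sketch (parameter names as documented on the Configuration wiki page; the data file names are placeholders):

```
# train.conf -- minimal training configuration (sketch)
task = train
objective = binary
# placeholder file names; point these at your own data
data = binary.train
valid = binary.test
num_trees = 100
learning_rate = 0.1
num_leaves = 31
```

The run is then launched with `lightgbm config=train.conf` (or `./lightgbm` on Linux).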

* [**Wiki**](https://github.com/Microsoft/LightGBM/wiki)
* [**Installation Guide**](https://github.com/Microsoft/LightGBM/wiki/Installation-Guide)
* [**Quick Start**](https://github.com/Microsoft/LightGBM/wiki/Quick-Start)
* [**Examples**](https://github.com/Microsoft/LightGBM/tree/master/examples)
* [**Features**](https://github.com/Microsoft/LightGBM/wiki/Features)
* [**Parallel Learning Guide**](https://github.com/Microsoft/LightGBM/wiki/Parallel-Learning-Guide)
* [**GPU Learning Tutorial**](https://github.com/Microsoft/LightGBM/blob/master/docs/GPU-Tutorial.md)
* [**Configuration**](https://github.com/Microsoft/LightGBM/wiki/Configuration)
* [**Document Indexer**](https://github.com/Microsoft/LightGBM/blob/master/docs/README.md)

External Links
--------------
Useful resources if you are looking for more details:

* [**Read The Docs**](http://lightgbm.readthedocs.io/en/latest/) for browsable, all-in-one documentation generated from this repository
* [**Laurae++ interactive documentation**](https://sites.google.com/view/lauraepp/parameters) for interactive, detailed documentation of hyperparameters

How to Contribute
-----------------

LightGBM has been developed and used by many active community members. Your help is very valuable to make it better for everyone.

- Check out the [call for contributions](https://github.com/Microsoft/LightGBM/issues?q=is%3Aissue+is%3Aopen+label%3Acall-for-contribution) to see what can be improved, or open an issue if you would like to request something.
- Contribute to the [tests](https://github.com/Microsoft/LightGBM/tree/master/tests) to make the project more reliable.
- Contribute to the [documentation](https://github.com/Microsoft/LightGBM/tree/master/docs) to make it clearer for everyone.
- Contribute to the [examples](https://github.com/Microsoft/LightGBM/tree/master/examples) to share your experience with other users.
- Check out the [Development Guide](./docs/development.md).
- Open an issue if you run into problems during development.

Microsoft Open Source Code of Conduct
-------------------------------------
This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). For more information see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments.