@@ -57,7 +57,7 @@ We set up total 3 settings for experiments. The parameters of these settings are
 1. xgboost:

-.. code::
+.. code:: text

     eta = 0.1
     max_depth = 8
...
@@ -68,7 +68,7 @@ We set up total 3 settings for experiments. The parameters of these settings are
 2. xgboost\_hist (using histogram based algorithm):

-.. code::
+.. code:: text

     eta = 0.1
     num_round = 500
...
@@ -81,7 +81,7 @@ We set up total 3 settings for experiments. The parameters of these settings are
 3. LightGBM:

-.. code::
+.. code:: text

     learning_rate = 0.1
     num_leaves = 255
...
@@ -102,7 +102,7 @@ Result
 Speed
 '''''

 We compared speed using only the training task without any test or metric output. We didn't count the time for IO.
 For the ranking tasks, since XGBoost and LightGBM implement different ranking objective functions, we used ``regression`` objective for speed benchmark, for the fair comparison.
 The following table is the comparison of time cost:
...
@@ -212,7 +212,7 @@ We ran our experiments on 16 Windows servers with the following specifications:
@@ -461,7 +461,7 @@ Type ``run`` and press the Enter key.
 You will probably get something similar to this:

-::
+.. code:: text

     [LightGBM] [Info] This is the GPU trainer!!
     [LightGBM] [Info] Total Bins 6143
...
@@ -476,7 +476,7 @@ You will probably get something similar to this:
 There, write ``backtrace`` and press the Enter key as many times as gdb requests two choices:

-::
+.. code:: text

     Program received signal SIGSEGV, Segmentation fault.
     0x00007ffbb37c11f1 in strlen () from C:\Windows\system32\msvcrt.dll
...
@@ -511,7 +511,7 @@ There, write ``backtrace`` and press the Enter key as many times as gdb requests
 Right-click the command prompt, click "Mark", and select all the text from the first line (with the command prompt containing gdb) to the last line printed, containing all the log, such as:
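
Every hunk in this patch applies the same substitution: a bare literal block (``::``) or an unlabeled ``.. code::`` directive is replaced with an explicit ``.. code:: text`` directive, so that Sphinx renders the block with the plain-text lexer instead of trying to guess a language (or falling back to the default), which can produce highlighting warnings on config fragments and log output. A minimal sketch of the resulting RST, using parameter names from the hunks above:

```rst
.. code:: text

    eta = 0.1
    num_round = 500
```

The directive's content must be indented and preceded by a blank line, as shown.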