Commit bc0579c8 authored by wxchan's avatar wxchan Committed by Guolin Ke

add init_score & test cpp and python result consistency (#1007)

* add init_score & test cpp and python result consistency

* try fix common.h

* Fix tests (#3)

* update atof

* fix bug

* fix tests.

* fix bug

* fix dtypes

* fix categorical feature override

* fix protobuf on vs build (#1004)

* [optional] support protobuf

* fix windows/LightGBM.vcxproj

* add doc

* fix doc

* fix vs support (#2)

* fix vs support

* fix cmake

* fix #1012

* [python] add network config api (#1019)

* add network

* update doc

* add float tolerance in bin finder.

* fix a bug

* update tests

* add double tolerance on tree model

* fix tests

* simplify the double comparison

* fix libsvm zero base

* move double tolerance to the bin finder.

* fix pylint

* clean test.sh

* add sklearn test

* remove underline

* clean codes

* set random_state=None

* add last line

* fix doc

* rename file

* try fix test
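The message above mentions simplifying the double comparison and adding a double tolerance when checking tree models for C++/Python consistency. As a rough illustration of that kind of check, here is a minimal stdlib-only sketch of a combined relative/absolute tolerance comparison; the function name and the tolerance constants are assumptions for illustration, not LightGBM's actual helper:

```python
def approx_equal(a, b, rel_tol=1e-9, abs_tol=1e-15):
    """Return True when a and b agree within a combined relative/absolute tolerance.

    Illustrative only: the constants 1e-9 / 1e-15 are assumed defaults, not the
    values used in the commit. The relative term scales with the larger magnitude,
    while the absolute term handles comparisons near zero.
    """
    return abs(a - b) <= max(rel_tol * max(abs(a), abs(b)), abs_tol)

# Floating-point round-off is absorbed by the tolerance:
print(approx_equal(0.1 + 0.2, 0.3))
```

A combined check like this is why results produced independently by the C++ and Python paths can be asserted equal even though individual doubles differ in the last few bits.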
parent 04d4811b
@@ -377,3 +377,6 @@ lightgbm.model
# VSCode
.vscode
# duplicate version file
python-package/lightgbm/VERSION.txt
0.039
0.187
0.831
0.767
0.351
0.377
0.534
0.000
0.241
0.208
0.250
0.806
0.280
0.192
0.504
0.866
0.241
0.079
0.356
0.748
0.551
0.817
0.960
0.793
0.604
0.493
0.040
0.984
0.383
0.152
0.667
0.284
0.586
0.587
0.446
0.836
0.265
0.449
0.538
0.664
0.784
0.395
0.646
0.151
0.933
0.383
0.730
0.020
0.205
0.487
0.878
0.527
0.930
0.484
0.490
0.120
0.803
0.247
0.900
0.911
0.943
0.520
0.677
0.779
0.131
0.601
0.034
0.498
0.155
0.183
0.365
0.432
0.623
0.074
0.504
0.183
0.574
0.637
0.557
0.738
0.336
0.765
0.433
0.484
0.648
0.018
0.654
0.619
0.310
0.086
0.091
0.923
0.689
0.127
0.357
0.592
0.836
0.044
0.237
0.890
0.009
0.201
0.959
0.613
0.262
0.067
0.028
0.245
0.881
0.416
0.720
0.918
0.408
0.191
0.517
0.908
0.804
0.066
0.693
0.572
0.907
0.122
0.534
0.879
0.410
0.482
0.070
0.278
0.325
0.945
0.283
0.461
0.671
0.162
0.486
0.739
0.867
0.626
0.669
0.126
0.946
0.133
0.775
0.265
0.934
0.720
0.754
0.219
0.443
0.618
0.770
0.104
0.962
0.890
0.270
0.823
0.518
0.462
0.314
0.581
0.730
0.411
0.629
0.699
0.711
0.052
0.860
0.458
0.262
0.242
0.483
0.887
0.378
0.750
0.097
0.476
0.992
0.770
0.211
0.501
0.234
0.410
0.780
0.771
0.228
0.922
0.593
0.380
0.502
0.605
0.560
0.486
0.505
0.176
0.813
0.542
0.131
0.766
0.932
0.947
0.369
0.136
0.518
0.113
0.934
0.184
0.253
0.407
0.383
0.795
0.456
0.171
0.267
0.509
0.147
0.612
0.566
0.715
0.938
0.912
0.946
0.245
0.132
0.302
0.895
0.972
0.859
0.110
0.947
0.423
0.009
0.442
0.046
0.544
0.339
0.473
0.613
0.869
0.662
0.434
0.819
0.906
0.120
0.532
0.285
0.047
0.669
0.863
0.163
0.812
0.853
0.914
0.265
0.904
0.321
0.552
0.051
0.044
0.720
0.444
0.256
0.190
0.670
0.000
0.806
0.079
0.191
0.386
0.485
0.355
0.321
0.964
0.642
0.023
0.430
0.875
0.301
0.095
0.758
0.606
0.570
0.054
0.140
0.623
0.208
0.504
0.545
0.284
0.948
0.842
0.722
0.078
0.106
0.493
0.161
0.978
0.159
0.487
0.364
0.639
0.129
0.430
0.275
0.888
0.041
0.914
0.833
0.298
0.789
0.031
0.967
0.527
0.303
0.363
0.066
0.989
0.039
0.655
0.443
0.949
0.246
0.532
0.482
0.703
0.068
0.194
0.215
0.738
0.189
0.573
0.215
0.862
0.942
0.518
0.352
0.234
0.050
0.269
0.654
0.534
0.944
0.396
0.694
0.489
0.513
0.268
0.455
0.471
0.707
0.941
0.329
0.042
0.496
0.544
0.168
0.760
0.985
0.946
0.197
0.875
0.704
0.454
0.541
0.850
0.480
0.373
0.493
0.579
0.189
0.901
0.674
0.633
0.099
0.604
0.121
0.079
0.527
0.403
0.589
0.089
0.431
0.175
0.987
0.561
0.687
0.325
0.095
0.976
0.286
0.424
0.650
0.025
0.810
0.537
0.278
0.062
0.162
0.895
0.686
0.250
0.066
0.691
0.572
0.405
0.364
0.217
0.670
0.971
0.176
0.597
0.424
0.447
0.254
0.825
0.485
0.543
0.305
0.182
0.086
0.714
0.196
0.690
0.390
0.416
0.469
0.368
0.101
0.310
0.664
0.666
0.286
0.460
0.193
0.210
0.023
0.897
0.211
0.228
0.280
0.127
0.639
0.075
0.134
0.645
0.340
0.708
0.557
0.256
0.651
0.116
0.536
0.437
0.268
0.604
0.871
0.999
0.608
0.405
0.225
0.257
0.479
0.367
0.914
0.368
0.373
0.384
0.837
0.651
0.614
0.334
0.818
0.038
0.871
0.513
0.398
0.497
0.667
0.013
0.872
0.447
0.343
0.138
0.439
0.496
0.404
0.679
0.421
0.961
0.599
0.807
0.109
0.397
0.337
0.569
0.861
0.078
0.073
0.850
0.213
0.669
0.375
0.951
0.732
0.599
0.156
0.156
0.058
0.866
0.601
0.708
0.021
0.970
0.832
0.212
0.182
0.183
0.304
0.525
0.432
0.291
0.612
0.139
0.292
0.366
0.456
0.785
0.200
0.514
0.592
0.046
0.608
0.171
0.065
0.949
0.966
0.808
0.305
0.098
0.684
0.440
0.122
0.495
0.034
0.909
0.259
0.663
0.312
0.520
0.547
0.185
0.970
0.775
0.939
0.895
0.598
0.922
0.088
0.196
0.045
0.325
0.389
0.271
0.829
0.357
0.281
0.543
0.141
0.802
0.075
0.987
0.772
0.199
0.006
0.815
0.707
0.729
0.771
0.074
0.358
0.116
0.863
0.623
0.331
0.064
0.311
0.325
0.730
0.638
0.887
0.472
0.120
0.713
0.761
0.561
0.771
0.494
0.523
0.428
0.025
0.108
0.031
0.636
0.314
0.509
0.908
0.249
0.410
0.756
0.229
0.077
0.290
0.161
0.930
0.808
0.633
0.871
0.804
0.187
0.893
0.539
0.807
0.896
0.318
0.110
0.228
0.427
0.818
0.861
0.007
0.511
0.417
0.222
0.120
0.338
0.943
0.323
0.519
0.703
0.364
0.972
0.962
0.252
0.497
0.301
0.285
0.037
0.610
0.503
0.051
0.279
0.908
0.240
0.145
0.489
0.986
0.242
0.672
0.762
0.238
0.728
0.368
0.632
0.634
0.536
0.090
0.835
0.321
0.187
0.041
0.591
0.678
0.017
0.512
0.226
0.645
0.174
0.691
0.387
0.937
0.138
0.341
0.113
0.925
0.877
0.258
0.660
0.817
0.555
0.530
0.242
0.093
0.897
0.900
0.633
0.339
0.349
0.726
0.897
0.887
0.780
0.642
0.084
0.162
0.899
0.606
0.009
0.101
0.664
0.005
0.161
0.549
0.692
0.652
0.224
0.712
0.237
0.325
0.746
0.650
0.849
0.658
0.568
0.094
0.368
0.265
0.244
0.973
0.393
0.892
0.631
0.795
0.503
0.577
0.493
0.195
0.722
0.281
0.024
0.645
0.177
0.940
0.954
0.915
0.370
0.015
0.928
0.428
0.967
0.964
0.853
0.294
0.385
0.851
0.317
0.169
0.557
0.936
0.696
0.570
0.097
0.615
0.990
0.140
0.518
0.877
0.741
0.697
0.702
0.359
0.294
0.809
0.810
0.867
0.913
0.511
0.502
0.798
0.650
0.702
0.796
0.890
0.338
0.376
0.094
0.578
0.036
0.466
0.543
0.287
0.591
0.031
0.037
0.823
0.360
0.127
0.522
0.770
0.216
0.623
0.085
0.052
0.531
0.541
0.637
0.726
0.976
0.516
0.323
0.795
0.271
0.439
0.078
0.025
0.963
0.836
0.696
0.409
0.173
0.156
0.250
0.549
0.715
0.660
0.280
0.955
0.738
0.554
0.612
0.420
0.248
0.356
0.758
0.014
0.116
0.046
0.041
0.855
0.704
0.474
0.098
0.492
0.473
0.173
0.434
0.399
0.616
0.635
0.045
0.375
0.626
0.503
0.856
0.659
0.163
0.071
0.642
0.027
0.586
0.940
0.575
0.388
0.643
0.458
0.546
0.941
0.386
0.961
0.905
0.196
0.069
0.101
0.018
0.094
0.683
0.071
0.319
0.845
0.023
0.814
0.282
0.118
0.697
0.629
0.877
0.735
0.803
0.282
0.177
0.751
0.807
0.991
0.413
0.372
0.776
0.341
0.931
0.858
0.429
0.751
0.755
0.103
0.903
0.505
0.826
0.320
0.896
0.389
0.011
0.905
0.091
0.319
0.950
0.951
0.573
0.632
0.448
0.293
0.329
0.673
0.752
0.792
0.790
0.091
0.494
0.058
0.550
0.442
0.888
0.351
0.117
0.143
0.762
0.618
0.101
0.084
0.701
0.073
0.822
0.706
0.081
0.085
0.987
0.374
0.371
0.813
0.947
0.986
0.753
0.376
0.084
0.777
0.558
0.424
0.906
0.111
0.493
0.011
0.469
0.056
0.119
0.118
0.649
0.746
0.583
0.962
0.375
0.286
0.869
0.224
0.963
0.012
0.970
0.043
0.891
0.528
0.993
0.074
0.554
0.969
0.523
0.629
0.696
0.455
0.628
0.584
0.901
0.045
0.281
0.950
0.890
0.456
0.620
0.277
0.188
0.464
0.353
0.584
0.078
0.974
0.986
0.698
0.536
0.310
0.814
0.685
0.163
0.911
0.823
0.950
0.726
0.613
0.418
0.933
0.866
0.045
0.026
0.376
0.811
0.987
0.150
0.594
0.381
0.970
0.842
0.838
0.469
0.415
0.273
0.056
0.865
0.813
1.000
0.997
0.555
0.769
0.945
0.850
0.247
0.451
0.129
0.954
0.606
0.229
0.672
0.618
0.358
0.114
0.672
0.520
0.772
0.520
0.852
0.552
0.561
0.877
0.403
0.134
0.029
0.755
0.620
0.704
0.213
0.136
0.015
0.351
0.590
0.392
0.437
0.904
0.348
0.514
0.784
0.397
0.622
0.862
0.950
0.147
0.927
0.492
0.258
0.459
0.980
0.493
0.329
0.633
0.240
0.076
0.129
0.128
0.152
0.139
0.641
0.182
0.346
0.897
0.474
0.668
0.172
0.192
0.041
0.169
0.279
0.177
0.089
0.121
0.461
0.206
0.364
0.503
0.690
0.039
0.799
0.628
0.082
0.874
0.921
0.061
0.277
0.806
0.748
0.185
0.209
0.370
0.485
0.618
0.369
0.463
0.747
0.037
0.252
0.713
0.895
0.512
0.532
0.107
0.447
0.533
0.242
0.269
0.377
0.020
0.322
0.211
0.327
0.120
0.891
0.594
0.679
0.789
0.498
0.087
0.537
0.587
0.745
0.432
0.128
0.284
0.363
0.646
0.571
0.356
0.987
0.606
0.237
0.102
0.153
0.246
0.161
0.187
0.285
0.173
0.897
0.080
0.525
0.410
0.982
0.112
0.398
0.969
0.866
0.817
0.258
0.171
0.669
0.929
0.557
0.572
0.280
0.769
0.187
0.324
0.425
0.508
0.242
0.115
0.611
0.289
0.581
0.154
0.481
0.533
0.052
0.337
0.134
0.063
0.990
0.322
0.810
0.255
0.682
0.760
0.596
0.472
0.412
0.349
0.930
0.831
0.965
0.124
0.731
0.938
0.181
0.066
0.741
0.574
0.842
0.140
0.795
0.202
0.164
0.164
0.815
0.665
0.523
0.359
0.877
0.392
0.817
0.439
0.377
0.463
0.301
0.748
0.503
0.232
0.900
0.384
0.544
0.906
0.624
0.117
0.940
0.628
0.335
0.139
0.794
0.620
0.533
0.894
0.789
0.152
0.312
0.248
0.744
0.034
0.570
0.762
0.877
0.342
0.821
0.111
0.846
0.127
0.397
0.797
0.150
0.229
0.722
0.720
0.641
0.694
0.543
0.252
0.346
0.182
0.908
0.583
0.401
0.462
0.947
0.153
0.586
0.506
0.611
0.018
0.872
0.932
0.565
0.697
0.922
0.707
0.153
0.576
0.607
0.424
0.736
0.934
0.926
0.451
0.113
0.985
0.839
0.125
0.921
0.870
0.519
0.591
0.399
0.055
0.335
0.803
0.005
0.333
0.398
0.537
0.920
0.346
0.347
0.738
0.452
0.225
0.452
0.141
0.176
0.498
0.419
0.915
0.362
0.581
0.632
0.013
0.664
0.178
0.961
0.149
0.415
0.085
0.997
0.502
0.595
0.067
0.750
0.210
0.898
0.205
0.191
0.037
0.472
0.565
0.066
0.776
0.453
0.524
0.441
0.401
0.560
0.155
0.182
0.862
0.946
0.373
0.271
0.644
0.409
0.025
0.156
0.716
0.659
0.027
0.222
0.231
0.672
0.020
0.104
0.800
0.179
0.653
0.238
0.099
0.243
0.722
0.856
0.830
0.397
0.668
0.205
0.293
0.896
0.013
0.086
0.208
0.027
0.181
0.583
0.421
0.893
0.817
0.342
0.259
0.380
0.590
0.268
0.624
0.409
0.552
0.436
0.294
0.948
0.764
0.140
0.868
0.487
0.895
0.800
0.425
0.022
0.269
0.542
0.633
0.258
0.139
0.835
0.984
0.526
0.172
0.272
0.018
0.914
0.118
0.577
0.274
0.554
0.651
0.830
0.206
0.011
0.137
0.900
0.874
0.597
0.601
0.665
0.175
0.914
0.419
0.383
0.519
0.047
0.166
0.738
0.083
0.603
0.245
0.389
0.289
0.356
0.719
0.297
0.566
0.476
0.664
0.937
0.733
0.215
0.031
0.262
0.595
0.051
0.496
0.597
0.334
0.771
0.107
0.075
0.728
0.495
0.688
0.435
0.246
0.819
0.799
0.695
0.272
0.590
0.361
0.092
0.917
0.137
0.950
0.446
0.185
0.542
0.873
0.732
0.807
0.659
0.692
0.849
0.250
0.489
0.221
0.988
0.944
0.039
0.706
0.925
0.181
0.568
0.915
0.034
0.697
0.297
0.924
0.971
0.944
0.474
0.862
0.845
0.319
0.829
0.037
0.596
0.230
0.121
0.077
0.696
0.340
0.725
0.065
0.315
0.539
0.791
0.319
0.626
0.886
0.616
0.233
0.024
0.870
0.021
0.875
0.529
0.939
0.799
0.998
0.351
0.767
0.402
0.480
0.628
0.874
0.984
0.768
0.418
0.421
0.738
0.239
0.110
0.355
0.287
0.296
0.234
0.042
0.018
0.988
0.428
0.384
0.680
0.218
0.950
0.786
0.089
0.418
0.879
0.945
0.467
0.613
0.167
0.991
0.232
0.943
0.650
0.608
0.513
0.231
0.177
0.220
0.186
0.780
0.350
0.058
0.969
0.884
0.928
0.995
0.174
0.396
0.758
0.696
0.154
0.816
0.224
0.224
0.537
0.593
0.580
0.091
0.877
0.266
0.130
0.889
0.956
0.862
0.810
0.655
0.551
0.087
0.408
0.373
0.260
0.723
0.496
0.081
0.220
0.683
0.076
0.851
0.495
0.481
0.592
0.825
0.348
0.678
0.566
0.267
0.879
0.797
0.658
0.851
0.867
0.708
0.837
0.697
0.680
0.619
0.753
0.159
0.881
0.872
0.029
0.826
0.129
0.335
0.744
0.161
0.818
0.832
0.507
0.006
0.287
0.617
0.981
0.632
0.260
0.634
0.540
0.780
0.107
0.761
0.541
0.963
0.342
0.633
0.932
0.103
0.937
0.688
0.068
0.301
0.708
0.067
0.582
0.346
0.621
0.046
0.872
0.973
0.969
0.750
0.130
0.758
0.025
0.022
0.324
0.489
0.770
0.683
0.446
0.274
0.997
0.426
0.451
0.164
0.795
0.694
0.221
0.082
0.680
0.655
0.273
0.951
0.151
0.432
0.944
0.420
0.639
0.398
0.274
0.984
0.409
0.894
0.230
0.213
0.031
0.652
0.369
0.864
0.473
0.968
0.186
0.869
0.777
0.771
0.845
0.761
0.626
0.131
0.033
0.921
0.617
0.797
0.482
0.117
0.125
0.686
0.430
0.201
0.492
0.064
0.582
0.269
0.798
0.310
0.455
0.012
0.072
0.392
0.480
0.600
0.292
0.695
0.860
0.780
0.040
0.481
0.105
0.242
0.987
0.142
0.499
0.618
0.702
0.560
0.010
0.326
0.518
0.088
0.351
0.033
0.079
0.397
0.133
0.568
0.689
0.801
0.200
0.167
0.105
0.636
0.706
0.032
0.936
0.052
0.541
0.709
0.871
0.714
0.802
0.339
0.815
0.080
0.895
0.548
0.817
0.452
0.644
0.526
0.732
0.082
0.060
0.247
0.160
0.872
0.219
0.976
0.337
0.182
0.790
0.659
0.498
0.555
0.719
0.228
0.996
0.975
0.650
0.200
0.680
0.072
0.031
0.258
0.463
0.868
0.727
0.743
0.425
0.346
0.371
0.988
0.040
0.867
0.579
0.439
0.725
0.487
0.873
0.901
0.422
0.277
0.592
0.912
0.211
0.623
0.632
0.733
0.132
0.716
0.909
0.180
0.238
0.971
0.181
0.854
0.492
0.247
0.871
0.445
0.515
0.359
0.593
0.164
0.391
0.969
0.258
0.657
0.325
0.773
0.131
0.970
0.454
0.236
0.073
0.170
0.520
0.337
0.829
0.431
0.249
0.617
0.707
0.167
0.168
0.037
0.736
0.664
0.475
0.844
0.806
0.585
0.868
0.206
0.112
0.270
0.057
0.531
0.937
0.039
0.122
0.452
0.934
0.316
0.507
0.042
0.148
0.987
0.965
0.005
0.952
0.639
0.868
0.455
0.516
0.489
0.667
0.140
0.030
0.308
0.705
0.202
0.673
0.970
0.094
0.673
0.444
0.868
0.177
0.693
0.838
0.945
0.683
0.497
0.618
0.869
0.571
0.030
0.931
0.690
0.677
0.216
0.659
0.394
0.651
0.107
0.658
0.999
0.048
0.977
0.407
0.871
0.782
0.567
0.738
0.879
0.404
0.327
0.668
0.808
0.762
0.798
0.436
0.818
0.120
0.544
0.006
0.325
0.366
0.396
0.695
0.389
0.449
0.238
0.373
0.227
0.073
0.603
0.668
0.619
0.463
0.380
0.863
0.519
0.479
0.026
0.341
0.380
0.399
0.580
0.534
0.608
0.765
0.813
0.718
0.956
0.018
0.196
0.008
0.647
0.898
0.243
0.927
0.060
0.934
0.352
0.101
0.486
0.257
0.285
0.307
0.803
0.539
0.311
0.610
0.716
0.273
0.414
0.122
0.181
0.681
0.181
0.525
0.709
0.107
0.567
0.257
0.963
0.484
0.806
0.550
0.043
0.633
0.951
0.602
0.819
0.884
0.228
0.212
0.611
0.411
0.840
0.900
0.353
0.237
0.781
0.275
0.823
0.424
0.668
0.096
0.624
0.452
0.587
0.168
0.737
0.863
0.217
0.096
0.024
0.642
0.607
0.547
0.232
0.391
0.594
0.497
0.988
0.136
0.695
0.404
0.428
0.718
0.692
0.991
0.128
0.104
0.724
0.578
0.274
0.079
0.086
0.894
0.192
0.323
0.227
0.355
0.069
0.519
0.068
0.800
0.234
0.540
0.880
0.651
0.533
0.324
0.333
0.669
0.994
0.662
0.558
0.731
0.465
0.060
0.562
0.958
0.175
0.690
0.201
0.536
0.097
0.450
0.756
0.348
0.665
0.795
0.927
0.235
0.399
0.152
0.992
0.927
0.540
0.842
0.521
0.624
0.089
0.755
0.128
0.826
0.782
0.709
0.036
0.303
0.263
0.360
0.088
0.937
0.554
0.306
0.397
0.447
0.601
0.516
0.919
0.497
0.992
0.851
0.209
0.931
0.116
0.817
0.381
0.878
0.868
0.806
0.790
0.305
0.081
0.403
0.174
0.695
0.346
0.976
0.641
0.822
0.133
0.862
0.923
0.487
0.606
0.765
0.175
0.503
0.399
0.146
0.368
0.068
0.026
0.135
0.963
0.550
0.966
0.432
0.312
0.506
0.440
0.106
0.641
0.216
0.620
0.650
0.152
0.061
0.781
0.460
0.058
0.995
0.058
0.695
0.984
0.239
0.142
0.121
0.303
0.101
0.692
0.062
0.509
0.997
0.814
0.615
0.306
0.624
0.527
0.426
0.131
0.887
0.450
0.195
0.368
0.414
0.828
0.734
0.769
0.011
0.416
0.481
0.019
0.260
0.760
0.137
0.535
0.215
0.012
0.241
0.976
0.802
0.960
0.488
0.110
0.548
0.454
0.844
0.098
0.488
0.150
0.325
0.737
0.476
0.376
0.394
0.459
0.785
0.892
0.955
0.787
0.315
0.688
0.438
0.255
0.841
0.038
0.902
0.461
0.637
0.659
0.895
0.637
0.614
0.067
0.518
0.150
0.737
0.512
0.680
0.042
0.085
0.716
0.072
0.071
0.012
0.957
0.738
0.353
0.297
0.350
0.775
0.661
0.185
0.174
0.098
0.660
0.764
0.265
0.021
0.082
0.968
0.295
0.769
0.625
0.382
0.206
0.121
0.615
0.775
0.644
0.530
0.042
0.968
0.799
0.293
0.980
0.602
0.582
0.748
0.812
0.656
0.128
0.338
0.928
0.225
0.372
0.432
0.439
0.613
0.943
0.241
0.122
0.197
0.887
0.646
0.286
0.816
0.861
0.847
0.919
0.252
0.755
0.461
0.842
0.728
0.776
0.656
0.177
0.545
0.985
0.937
0.043
0.165
0.132
0.726
0.818
0.214
0.506
0.841
0.733
0.542
0.590
0.508
0.298
0.565
0.689
0.873
0.636
0.761
0.160
0.462
0.009
0.247
0.726
0.992
0.099
0.401
0.800
0.204
0.555
0.733
0.616
0.188
0.355
0.784
0.554
0.005
0.761
0.035
0.746
0.202
0.958
0.368
0.327
0.149
0.306
0.877
0.996
0.368
0.449
0.722
0.886
0.593
0.392
0.413
0.696
0.003
0.620
0.355
0.794
0.093
0.588
0.481
0.642
0.065
0.580
0.561
0.561
0.603
0.676
0.805
0.270
0.825
0.498
0.077
0.059
0.334
0.785
0.708
0.789
0.517
0.440
0.147
0.328
0.434
0.089
0.221
0.598
0.736
0.998
0.933
0.643
0.421
0.636
0.786
0.118
0.410
0.840
0.384
0.572
0.588
0.184
0.362
0.335
0.026
0.024
0.832
0.273
0.518
0.299
0.941
0.259
0.430
0.873
0.842
0.186
0.803
0.458
0.483
0.133
0.081
0.728
0.496
0.437
0.730
0.766
0.159
0.610
0.135
0.751
0.657
0.957
0.069
0.057
0.282
0.262
0.247
0.906
0.250
0.272
0.759
0.450
0.777
0.065
0.488
0.034
0.063
0.906
0.139
0.532
0.411
0.347
0.900
0.022
0.664
0.963
0.560
0.937
0.052
0.419
0.260
0.731
0.981
0.257
0.654
0.198
0.565
0.464
0.972
0.609
0.350
0.114
0.151
0.225
0.251
0.851
0.561
0.523
0.115
0.860
0.723
0.068
0.708
0.544
0.082
0.458
0.485
0.166
0.946
0.850
0.669
0.462
0.412
0.651
0.545
0.062
0.513
0.806
0.459
0.052
0.786
0.201
0.259
0.165
0.330
0.757
0.519
0.205
0.878
0.880
0.871
0.239
0.451
0.985
0.772
0.027
0.065
0.464
0.909
0.539
0.498
0.105
0.657
0.822
0.380
0.776
0.964
0.204
0.523
0.287
0.793
0.578
0.635
0.798
0.396
0.915
0.533
0.158
0.696
0.793
0.317
0.857
0.906
0.277
0.984
0.141
0.202
0.184
0.894
0.654
0.152
0.440
0.615
0.083
0.882
0.804
0.505
0.967
0.418
0.984
0.668
0.635
0.166
0.882
0.427
0.162
0.013
0.560
0.527
0.719
0.890
0.079
0.731
0.187
0.858
0.819
0.541
0.710
0.314
0.471
0.822
0.459
0.358
0.494
0.828
0.335
0.174
0.712
0.826
0.101
0.240
0.142
0.348
0.450
0.749
0.651
0.621
0.352
0.841
0.471
0.979
0.634
0.126
0.676
0.325
0.686
0.070
0.175
0.856
0.227
0.837
0.279
0.643
0.694
0.513
0.305
0.213
0.033
0.304
0.653
0.938
0.871
0.766
0.788
0.665
0.260
0.907
0.671
0.560
0.111
0.447
0.460
0.865
0.547
0.380
0.977
0.111
0.423
0.042
0.740
0.918
0.280
0.858
0.292
0.911
0.754
0.805
0.018
0.963
0.727
0.305
0.829
0.282
0.873
0.113
0.704
0.541
0.097
0.242
0.012
0.469
0.301
0.598
0.297
0.300
0.743
0.048
0.903
0.852
0.668
0.593
0.892
0.185
0.079
0.240
0.795
0.035
0.583
0.995
0.856
0.521
0.064
0.831
0.599
0.115
0.094
0.910
0.669
0.829
0.879
0.572
0.517
0.430
0.317
0.435
0.774
0.602
0.893
0.443
0.607
0.631
0.592
0.703
0.237
0.512
0.104
0.385
0.488
0.652
0.951
0.601
0.744
0.506
0.634
0.071
0.254
0.362
0.472
0.046
0.140
0.277
0.972
0.331
0.482
0.196
0.611
0.281
0.207
0.517
0.006
0.008
0.219
0.037
0.108
0.339
0.803
0.572
0.513
0.293
0.932
0.397
0.087
0.617
0.114
0.345
0.507
0.874
0.494
0.702
0.993
0.131
0.275
0.395
0.422
0.411
0.908
0.714
0.608
0.309
0.824
0.955
0.821
0.002
0.636
0.051
0.258
0.060
0.604
0.687
0.114
0.384
0.456
0.369
0.121
0.419
0.751
0.071
0.080
0.355
0.942
0.669
0.679
0.362
0.594
0.010
0.636
0.913
0.613
0.874
0.724
0.121
0.902
0.066
0.534
0.142
0.012
0.422
0.295
0.486
0.577
0.044
0.123
0.559
0.343
0.729
0.652
0.846
0.692
0.430
0.673
0.275
0.306
0.789
0.446
0.798
0.822
0.858
0.917
0.431
0.319
0.582
0.371
0.601
0.706
0.688
0.375
0.167
0.431
0.143
0.890
0.346
0.154
0.025
0.646
0.637
0.341
0.072
0.410
0.311
0.677
0.606
0.365
0.218
0.988
0.454
0.688
0.141
0.486
0.028
0.505
0.964
0.384
0.039
0.031
0.388
0.160
0.023
0.756
0.459
0.289
0.900
0.116
0.956
0.314
0.888
0.603
0.827
0.984
0.288
0.961
0.389
0.386
0.340
0.541
0.154
0.554
0.542
0.762
0.834
0.440
0.302
0.259
0.195
0.058
0.342
0.270
0.966
0.558
0.347
0.580
0.139
0.444
0.626
0.489
0.402
0.994
0.880
0.623
0.569
0.621
0.201
0.395
0.039
0.476
0.543
0.228
0.964
0.909
0.722
0.533
0.870
0.131
0.791
0.125
0.794
0.276
0.877
0.944
0.149
0.463
0.981
0.483
0.864
0.589
0.375
0.286
0.203
0.762
0.387
0.511
0.492
0.577
0.866
0.981
0.408
0.828
0.765
0.574
0.956
0.200
0.109
0.854
0.439
0.847
0.893
0.062
0.883
0.448
0.510
0.627
0.926
0.019
0.477
0.688
0.723
0.693
0.134
0.299
0.359
0.804
0.279
0.211
0.957
0.009
0.998
0.677
0.828
0.295
0.014
0.738
0.834
0.740
0.143
0.753
0.769
0.659
0.766
0.846
0.614
0.089
0.488
0.078
0.408
0.407
0.066
0.349
0.111
0.808
0.948
0.072
0.955
0.523
0.300
0.077
0.501
0.795
0.707
0.050
0.073
0.403
0.295
0.232
0.281
0.803
0.929
0.405
0.906
0.321
0.476
0.226
0.640
0.979
0.603
0.358
0.648
0.123
0.889
0.503
0.449
0.586
0.625
0.072
0.683
0.242
0.714
0.823
0.804
0.553
0.520
0.143
0.775
0.271
0.497
0.284
0.134
0.630
0.054
0.749
0.318
0.000
0.511
0.047
0.276
0.707
0.063
0.839
0.004
0.247
0.741
0.316
0.102
0.360
0.270
0.843
0.313
0.789
0.892
0.434
0.910
0.377
0.964
0.089
0.687
0.494
0.388
0.633
0.704
0.004
0.167
0.713
0.666
0.966
0.761
0.951
0.703
0.298
0.105
0.782
0.644
0.048
0.360
0.957
0.500
0.433
0.458
0.209
0.369
0.370
0.052
0.768
0.417
0.822
0.850
0.212
0.657
0.472
0.880
0.216
0.678
0.608
0.295
0.137
0.652
0.739
0.316
0.645
0.395
0.713
0.199
0.890
0.287
0.368
0.058
0.112
0.516
0.268
0.835
0.015
0.379
0.337
0.019
0.124
0.414
0.493
0.404
0.531
0.595
0.010
0.464
0.963
0.519
0.678
0.312
0.774
0.773
0.521
0.976
0.126
0.017
0.770
0.807
0.120
0.266
0.018
0.293
0.773
0.518
0.348
0.372
0.001
0.300
0.646
0.974
0.847
0.024
0.899
0.783
0.780
0.458
0.398
0.303
0.066
0.228
0.247
0.484
0.747
0.474
0.058
0.958
0.943
0.785
0.991
0.544
0.963
0.076
0.366
0.225
0.196
0.141
0.622
0.781
0.578
0.147
0.811
0.636
0.388
0.674
0.260
0.345
0.917
0.291
0.470
0.890
0.708
0.062
0.147
0.008
0.631
0.448
0.134
0.958
0.530
0.242
0.501
0.680
0.076
0.275
0.807
0.460
0.547
0.433
0.044
0.166
0.446
0.209
0.050
0.844
0.981
0.793
0.854
0.242
0.961
0.197
0.951
0.995
0.712
0.981
0.570
0.260
0.437
0.594
0.073
0.622
0.981
0.190
0.793
0.908
0.944
0.960
0.521
0.977
0.757
0.162
0.477
0.718
0.247
0.641
0.667
0.163
0.565
0.772
0.499
0.012
0.009
0.357
0.926
0.229
0.634
0.222
0.322
0.848
0.729
0.095
0.429
0.029
0.481
0.662
0.119
0.289
0.398
0.920
0.993
0.045
0.761
0.372
0.392
0.754
0.918
0.951
0.577
0.357
0.788
0.251
0.564
0.359
0.657
0.240
0.192
0.918
0.102
0.506
0.221
0.039
0.036
0.175
0.867
0.282
0.950
0.582
0.437
0.580
0.517
0.759
0.282
0.353
0.894
0.946
0.893
0.419
0.780
0.476
0.498
0.205
0.591
0.186
0.332
0.855
0.207
0.071
0.069
0.941
0.507
0.409
0.811
0.836
0.332
0.694
0.771
0.655
0.152
0.876
0.539
0.282
0.425
0.038
0.128
0.766
0.000
0.417
0.523
0.055
0.973
0.226
0.304
0.304
0.230
0.001
0.729
0.967
0.224
0.663
0.742
0.848
0.423
0.303
0.325
0.713
0.817
0.182
0.371
0.902
0.807
0.985
0.754
0.393
0.591
0.661
0.078
0.544
0.709
0.167
0.781
0.584
0.952
0.042
0.265
0.602
0.297
0.714
0.759
0.103
0.514
0.509
0.369
0.933
0.828
0.697
0.714
0.462
0.921
0.695
0.729
0.862
0.274
0.807
0.195
0.345
0.336
0.979
0.857
0.701
0.727
0.562
0.947
0.496
0.381
0.163
0.786
0.734
0.384
0.025
0.839
0.011
0.704
0.970
0.438
0.235
0.705
0.817
0.546
0.967
0.052
0.505
0.718
0.863
0.179
0.800
0.553
0.397
0.132
0.865
0.157
0.310
0.290
0.871
0.673
0.797
0.250
0.625
0.572
0.833
0.906
0.012
0.674
0.052
0.549
0.288
0.307
0.353
0.621
0.334
0.733
0.405
0.068
0.784
0.286
0.433
0.685
0.332
0.057
0.374
0.944
0.642
0.671
0.632
0.199
0.418
0.751
0.101
0.278
0.276
0.432
0.980
0.068
0.519
0.179
0.971
0.113
0.404
0.738
0.705
0.423
0.347
0.398
0.264
0.205
0.483
0.269
0.287
0.657
0.969
0.604
0.077
0.076
0.951
0.297
0.092
0.599
0.624
0.649
0.267
0.015
0.965
0.251
0.676
0.707
0.610
0.313
0.271
0.598
0.866
0.947
0.106
0.155
0.945
0.737
0.883
0.203
0.588
0.701
0.680
0.408
0.015
0.583
0.253
0.450
0.958
0.399
0.840
0.189
0.672
0.977
0.102
0.008
0.434
0.093
0.748
0.915
0.434
0.259
0.434
0.723
0.009
0.589
0.613
0.638
0.242
0.714
0.091
0.199
0.877
0.739
0.014
0.248
0.214
0.271
0.248
0.063
0.459
0.733
0.607
0.673
0.081
0.951
0.838
0.805
0.823
0.933
0.544
0.200
0.617
0.743
0.738
0.521
0.068
0.371
0.921
0.584
0.538
0.269
0.368
0.895
0.666
0.787
0.454
0.630
0.248
0.705
0.428
0.443
0.649
0.936
0.064
0.825
0.292
0.444
0.022
0.301
0.503
0.056
0.491
0.927
0.105
0.764
0.410
0.655
0.260
0.159
0.160
0.070
0.186
0.664
0.882
0.814
0.685
0.110
0.289
0.310
0.250
0.515
0.536
0.357
0.354
0.829
0.789
0.308
0.914
0.953
0.327
0.354
0.506
0.941
0.876
0.103
0.393
0.553
0.503
0.194
0.859
0.677
0.838
0.859
0.748
0.439
0.611
0.160
0.674
0.179
0.694
0.230
0.118
0.165
0.002
0.719
0.732
0.515
0.161
0.084
0.019
0.166
0.891
0.242
0.354
0.105
0.222
0.519
0.608
0.245
0.058
0.391
0.234
0.220
0.960
0.616
0.557
0.416
0.429
0.541
0.696
0.702
0.172
0.500
0.412
0.871
0.631
0.533
0.115
0.606
0.117
0.337
0.143
0.692
0.206
0.392
0.896
0.204
0.508
0.419
0.018
0.793
0.069
0.474
0.561
0.628
0.689
0.253
0.010
0.723
0.536
0.836
0.821
0.843
0.485
0.334
0.792
0.451
0.183
0.855
0.883
0.466
0.076
0.388
0.804
0.902
0.203
0.067
0.877
0.389
0.542
0.968
0.067
0.648
0.074
0.375
0.804
0.433
0.997
0.559
0.321
0.220
0.351
0.373
0.069
0.370
0.464
0.723
0.657
0.709
0.008
0.218
0.662
0.484
0.005
0.804
0.773
0.548
0.066
0.767
0.584
0.782
0.752
0.803
0.518
0.140
0.671
0.620
0.742
0.170
0.195
0.890
0.750
0.908
0.759
0.597
0.654
0.889
0.579
0.632
0.156
0.474
0.716
0.271
0.202
0.314
0.242
0.215
0.425
0.908
0.507
0.188
0.077
0.696
0.383
0.822
0.660
0.796
0.272
0.692
0.264
0.939
0.636
0.325
0.270
0.191
0.695
0.219
0.595
0.265
0.662
0.815
0.778
0.761
0.188
0.088
0.699
0.368
0.432
0.031
0.260
0.034
0.879
0.243
0.557
0.039
0.667
0.323
0.898
0.888
0.325
0.901
0.996
0.825
0.845
0.249
0.577
0.067
0.095
0.999
0.327
0.748
0.807
0.858
0.998
0.241
0.040
0.411
0.130
0.022
0.360
0.784
0.566
0.313
0.654
0.232
0.014
0.764
0.624
0.762
0.039
0.837
0.620
0.563
0.625
0.864
0.587
0.581
0.991
0.757
0.442
0.707
0.389
0.229
0.597
0.928
0.929
0.342
0.528
0.212
0.996
0.981
0.650
0.804
0.715
0.593
0.053
0.455
0.675
0.678
0.373
0.942
0.167
0.500
0.691
0.697
0.649
0.275
0.156
0.636
0.599
0.179
0.705
0.455
0.668
0.837
0.170
0.019
0.779
0.610
0.700
0.838
0.803
0.961
0.536
0.488
0.402
0.154
0.573
0.277
0.921
0.583
0.593
0.355
0.052
0.032
0.423
0.085
0.607
0.881
0.883
0.659
0.211
0.863
0.886
0.197
0.737
0.287
0.803
0.997
0.030
0.897
0.623
0.973
0.465
0.847
0.062
0.335
0.067
0.975
0.817
0.853
0.938
0.085
0.386
0.071
0.211
0.229
0.469
0.269
0.101
0.167
0.147
0.973
0.759
0.968
0.440
0.278
0.798
0.326
0.299
0.233
0.130
0.256
0.355
0.674
0.063
0.211
0.809
0.147
0.343
0.865
0.155
0.083
0.485
0.302
0.563
0.804
0.137
0.581
0.506
0.144
0.624
0.274
0.488
0.082
0.460
0.306
0.822
0.057
0.419
0.458
0.725
0.574
0.667
0.777
0.862
0.314
0.538
0.840
0.989
0.890
0.371
0.195
0.489
0.742
0.493
0.483
0.838
0.361
0.860
0.407
0.328
0.454
0.762
0.126
0.196
0.951
0.175
0.568
0.579
0.490
0.645
0.230
0.553
0.372
0.662
0.141
0.571
0.185
0.279
0.219
0.183
0.826
0.286
0.927
0.970
0.571
0.143
0.375
0.798
0.367
0.087
0.557
0.845
0.796
0.175
0.673
0.221
0.218
0.874
0.250
0.263
0.001
0.871
0.793
0.627
0.750
0.152
0.458
0.351
0.094
0.486
0.921
0.040
0.291
0.208
0.238
0.908
0.468
0.466
0.761
0.154
0.487
0.430
0.597
1.000
0.769
0.398
0.828
0.171
0.030
0.204
0.340
0.511
0.615
0.911
0.510
0.501
0.050
0.035
0.551
0.438
0.839
0.161
0.025
0.449
0.237
0.050
0.725
0.112
0.608
0.281
0.173
0.380
0.801
0.392
0.751
0.125
0.773
0.237
0.677
0.566
0.929
0.387
0.066
0.019
0.827
0.525
0.775
0.234
0.345
0.030
0.961
0.668
0.933
0.265
0.611
0.678
0.318
0.848
0.947
0.885
0.739
0.277
0.282
0.963
0.010
0.716
0.706
0.623
0.990
0.312
0.340
0.079
0.443
0.261
0.343
0.835
0.935
0.186
0.373
0.929
0.062
0.092
0.163
0.595
0.150
0.968
0.447
0.502
0.247
0.471
0.662
0.751
0.754
0.578
0.904
0.817
0.757
0.056
0.007
0.212
0.664
0.411
0.402
0.885
0.896
0.909
0.314
0.691
0.272
0.191
0.185
0.342
0.430
0.831
0.120
0.735
0.531
0.288
0.493
0.300
0.596
0.434
0.164
0.117
0.547
0.902
0.344
0.733
0.659
0.932
0.821
0.567
0.657
0.898
0.400
0.327
0.011
0.827
0.801
0.104
0.577
0.464
0.119
0.981
0.215
0.067
0.595
0.739
0.032
0.659
0.532
0.103
0.173
0.568
0.296
0.938
0.819
0.984
0.260
0.970
0.431
0.348
0.050
0.053
0.692
0.458
0.227
0.614
0.253
0.578
0.359
0.824
0.821
0.477
0.351
0.363
0.806
0.328
0.209
0.085
0.467
0.482
0.841
0.221
0.381
0.808
0.824
0.385
0.459
0.303
0.933
0.119
0.934
0.684
0.532
0.033
0.476
0.408
0.161
0.656
0.971
0.562
0.715
0.068
0.418
0.116
0.613
0.938
0.662
0.077
0.355
0.551
0.403
0.834
0.815
0.612
0.373
0.255
0.106
0.355
0.413
0.676
0.658
0.070
0.395
0.182
0.157
0.827
0.042
0.419
0.168
0.879
0.557
0.231
0.502
0.731
0.958
0.220
0.886
0.934
0.916
0.635
0.630
0.403
0.752
0.531
0.677
0.428
0.731
0.824
0.146
0.833
0.540
0.845
0.431
0.379
0.915
0.251
0.844
0.484
0.514
0.308
0.573
0.325
0.039
0.272
0.006
0.978
0.966
0.395
0.728
0.346
0.671
0.805
0.947
0.400
0.783
0.266
0.990
0.026
0.604
0.660
0.688
0.120
0.939
0.181
0.623
0.223
0.307
0.546
0.417
0.160
0.171
0.418
0.757
0.898
0.084
0.393
0.100
0.017
0.662
0.602
0.163
0.234
0.024
0.835
0.975
0.135
0.231
0.868
0.926
0.420
0.051
0.039
0.574
0.393
0.029
0.583
0.011
0.787
0.306
0.040
0.588
0.398
0.974
0.544
0.275
0.709
0.271
0.904
0.375
0.550
0.051
0.426
0.832
0.806
0.224
0.226
0.817
0.930
0.095
0.450
0.337
0.871
0.084
0.211
0.752
0.051
0.493
0.442
0.334
0.395
0.530
0.161
0.572
0.805
0.760
0.154
0.149
0.268
0.361
0.408
0.680
0.057
0.035
0.392
0.697
0.193
0.642
0.260
0.886
0.896
0.297
0.230
0.411
0.241
0.672
0.826
0.673
0.824
0.397
0.156
0.738
0.360
0.671
0.271
0.081
0.993
0.156
0.988
0.977
0.794
0.659
0.578
0.866
0.289
0.468
0.619
0.411
0.427
0.330
0.564
0.851
0.202
0.934
0.689
0.823
0.556
0.780
0.016
0.818
0.040
0.890
0.992
0.294
0.210
0.765
0.253
0.866
0.103
0.126
0.979
0.674
0.847
0.324
0.676
0.594
0.603
0.683
0.575
0.429
0.276
0.769
0.226
0.692
0.233
0.625
0.747
0.219
0.060
0.131
0.606
0.849
0.045
0.734
0.341
0.479
0.929
0.332
0.465
0.014
0.082
0.259
0.028
0.631
0.426
0.548
0.175
0.296
0.664
0.965
0.050
0.890
0.577
0.564
0.500
0.069
0.090
0.601
0.341
0.917
0.407
0.143
0.715
0.293
0.525
0.698
0.900
0.792
0.676
0.680
0.946
0.296
0.001
0.272
0.218
0.662
0.634
0.593
0.016
0.729
0.324
0.665
0.557
0.343
0.135
0.094
0.832
0.918
0.650
0.103
0.402
0.729
0.780
0.118
0.000
0.712
0.357
0.254
0.013
0.540
0.851
0.958
0.566
0.514
0.085
0.549
0.380
0.607
0.389
0.240
0.095
0.315
0.097
0.177
0.987
0.444
0.532
0.874
0.996
0.583
0.812
0.327
0.306
0.403
0.673
0.682
0.315
0.133
0.632
0.128
0.578
0.693
0.701
0.753
0.873
0.500
0.730
0.619
0.187
0.026
0.285
0.443
0.617
0.850
0.196
0.126
0.963
0.108
0.478
0.586
0.541
0.086
0.057
0.105
0.586
0.544
0.234
0.638
0.820
0.042
0.498
0.689
0.252
0.308
0.614
0.898
0.810
0.583
0.730
0.365
0.641
0.466
0.190
0.702
0.556
0.359
0.911
0.021
0.316
0.057
0.767
0.702
0.331
0.676
0.396
0.756
0.454
0.412
0.935
0.251
0.120
0.585
0.969
0.378
0.062
0.339
0.506
0.162
0.658
0.998
0.452
0.355
0.401
0.115
0.883
0.415
0.387
0.660
0.442
0.648
0.061
0.814
0.941
0.649
0.955
0.151
0.477
0.602
0.582
0.401
0.338
0.127
0.392
0.163
0.734
0.209
0.059
0.552
0.595
0.872
0.577
0.345
0.803
0.540
0.079
0.566
0.908
0.396
0.296
0.143
0.151
0.433
0.596
0.081
0.940
0.755
0.587
0.828
0.080
0.477
0.630
0.829
0.783
0.277
0.941
0.124
0.875
0.971
0.177
0.722
0.040
0.405
0.516
0.581
0.936
0.672
0.481
0.810
0.951
0.022
0.983
0.086
0.814
0.280
0.132
0.440
0.644
0.381
0.055
0.599
0.902
0.383
0.216
0.443
0.048
0.820
0.826
0.588
0.353
0.800
0.555
0.826
0.631
0.784
0.599
0.414
0.958
0.541
0.605
0.220
0.626
0.572
0.185
0.060
0.604
0.764
0.523
0.227
0.667
0.080
0.442
0.163
0.184
0.202
0.387
0.051
0.398
0.512
0.483
0.383
0.839
0.145
0.506
0.062
0.071
0.574
0.566
0.878
0.558
0.960
0.049
0.098
0.044
0.185
0.541
0.646
0.046
0.946
0.842
0.297
0.077
0.170
0.127
0.123
0.519
0.246
0.358
0.990
0.684
0.949
0.143
0.382
0.555
0.077
0.004
0.670
0.642
0.411
0.490
0.415
0.024
0.335
0.177
0.098
0.957
0.599
0.729
0.315
0.393
0.237
0.097
0.179
0.798
0.678
0.547
0.475
0.922
0.073
0.281
0.350
0.782
0.993
0.241
0.874
0.831
0.225
0.399
0.410
0.978
0.181
0.799
0.334
0.731
0.420
0.578
0.833
0.804
0.866
0.060
0.692
0.140
0.416
0.549
0.403
0.520
0.997
0.135
0.675
0.396
0.133
0.159
0.949
0.880
0.907
0.992
0.208
0.355
0.669
0.484
0.418
0.358
0.594
0.576
0.161
0.472
0.553
0.570
0.210
0.742
0.025
0.355
0.780
0.564
0.261
0.695
0.567
0.796
0.735
0.610
0.488
0.133
0.261
0.419
0.599
0.514
0.288
0.006
0.496
0.286
0.735
0.024
0.585
0.941
0.174
0.472
0.091
0.626
0.551
0.407
0.521
0.897
0.196
0.023
0.862
0.577
0.891
0.597
0.810
0.430
0.750
0.913
0.572
0.181
0.269
0.199
0.247
0.306
0.555
0.588
0.426
0.615
0.082
0.088
0.172
0.518
0.214
0.283
0.400
0.812
0.014
0.649
0.669
0.799
0.933
0.020
0.154
0.886
0.459
0.565
0.663
0.679
0.932
0.997
0.720
0.287
0.880
0.049
0.231
0.897
0.256
0.221
0.138
0.859
0.500
0.675
0.239
0.758
0.760
0.313
0.411
0.210
0.977
0.646
0.935
0.303
0.782
0.766
0.692
0.966
0.394
0.131
0.672
0.729
0.575
0.207
0.504
0.312
0.427
0.645
0.745
0.232
0.405
0.269
0.199
0.783
0.909
0.370
0.751
0.897
0.846
0.183
0.956
0.377
0.339
0.062
0.486
0.109
0.249
0.317
0.809
0.875
0.302
0.859
0.857
0.765
0.147
0.690
0.945
0.683
0.332
0.501
0.646
0.511
0.120
0.312
0.802
0.861
0.137
0.952
0.329
0.662
0.752
0.812
0.946
0.313
0.859
0.132
0.705
0.742
0.678
0.252
0.873
0.169
0.054
0.717
0.475
0.849
0.383
0.168
0.836
0.549
0.189
0.720
0.511
0.604
0.461
0.829
0.830
0.709
0.105
0.839
0.671
0.697
0.363
0.879
0.725
0.113
0.408
0.886
0.047
0.961
0.679
0.631
0.271
0.249
0.942
0.298
0.776
0.393
0.139
0.011
0.544
0.976
0.369
0.357
0.344
0.521
0.441
0.880
0.506
0.508
0.138
0.983
0.250
0.073
0.663
0.595
0.415
0.422
0.785
0.196
0.820
0.649
0.419
0.458
0.257
0.752
0.513
0.321
0.596
0.996
0.435
0.921
0.345
0.055
0.204
0.419
0.220
0.000
0.915
0.844
0.742
0.168
0.120
0.068
0.724
0.929
0.431
0.119
0.512
0.349
0.375
0.621
0.719
0.111
0.036
0.994
0.233
0.054
0.190
0.039
0.392
0.802
0.341
0.444
0.676
0.509
0.861
0.866
0.010
0.778
0.461
0.989
0.483
0.987
0.755
0.249
0.538
0.992
0.858
0.097
0.706
0.346
0.007
0.403
0.235
0.819
0.337
0.828
0.001
0.308
0.241
0.234
0.068
0.187
0.556
0.291
0.417
0.377
0.876
0.905
0.493
0.330
0.107
0.953
0.060
0.298
0.298
0.320
0.164
0.783
0.122
0.635
0.124
0.071
0.909
0.190
0.821
0.545
0.078
0.955
0.247
0.860
0.169
0.854
0.013
0.512
0.769
0.935
0.909
0.820
0.888
0.199
0.284
0.289
0.377
0.393
0.545
0.160
0.691
0.168
0.310
0.503
0.796
0.739
0.646
0.156
0.547
0.087
0.377
0.414
0.040
0.274
0.956
0.954
0.352
0.043
0.179
0.392
0.948
0.141
0.764
0.109
0.981
0.382
0.775
0.343
0.628
0.192
0.146
0.938
0.918
0.887
0.625
0.808
0.069
0.335
0.314
0.636
0.356
0.149
0.596
0.814
0.382
0.132
0.197
0.237
0.300
0.643
0.601
0.696
0.363
0.824
0.204
0.469
0.801
0.180
0.110
0.884
0.343
0.684
0.025
0.690
0.316
0.056
0.610
0.241
0.387
0.081
0.147
0.298
0.090
0.911
0.006
0.484
0.123
0.308
0.926
0.162
0.966
0.206
0.742
0.547
0.884
0.554
0.535
0.195
0.166
0.916
0.328
0.372
0.760
0.339
0.952
0.738
0.597
0.881
0.050
0.257
0.465
0.227
0.744
0.919
0.424
0.342
0.962
0.445
0.489
0.352
0.910
0.569
0.762
0.812
0.342
0.434
0.848
0.363
0.794
0.941
0.623
0.742
0.805
0.845
0.395
0.051
0.402
0.674
0.450
0.749
0.357
0.635
0.186
0.749
0.546
0.204
0.292
0.749
0.483
0.216
0.967
0.715
0.957
0.427
0.475
0.898
0.525
0.707
0.252
0.447
0.123
0.206
0.001
0.039
0.977
0.242
0.663
0.839
0.551
0.153
0.728
0.600
0.731
0.770
0.975
0.574
0.342
0.648
0.068
0.897
0.119
0.328
0.816
0.597
0.394
0.473
0.855
0.340
0.870
0.088
0.777
0.848
0.182
0.430
0.165
0.707
0.535
0.635
0.196
0.212
0.041
0.322
0.560
0.858
0.667
0.435
0.953
0.719
0.930
0.528
0.259
0.053
0.726
0.121
0.303
0.532
0.564
0.601
0.166
0.380
0.617
0.970
0.728
0.923
0.762
0.592
0.192
0.667
0.623
0.602
0.490
0.529
0.334
0.519
0.198
0.805
0.186
0.085
0.436
0.658
0.438
0.277
0.558
0.347
0.933
0.923
0.502
0.328
0.737
0.037
0.475
0.336
0.921
0.012
0.553
0.741
0.485
0.085
0.972
0.518
0.614
0.237
0.483
0.429
0.075
0.106
0.837
0.240
0.195
0.505
0.769
0.062
0.577
0.119
0.036
0.053
0.834
0.118
0.045
0.438
0.844
0.262
0.422
0.040
0.449
0.578
0.571
0.332
0.316
0.107
0.367
0.099
0.767
0.966
0.970
0.865
0.601
0.701
0.285
0.631
0.456
0.482
0.758
0.282
0.318
0.925
0.056
0.486
0.913
0.610
0.546
0.250
0.343
0.852
0.559
0.515
0.100
0.016
0.964
0.377
0.603
0.080
0.683
0.933
0.576
0.126
0.582
0.766
0.406
0.880
0.649
0.899
0.624
0.131
0.310
0.202
0.909
0.760
0.676
0.301
0.184
0.756
0.474
0.226
0.617
0.040
0.326
0.469
0.148
0.985
0.209
0.130
0.204
0.769
0.426
0.565
0.021
0.034
0.446
0.817
0.885
0.087
0.538
0.913
0.388
0.830
0.134
0.935
0.751
0.941
0.677
0.363
0.938
0.276
0.332
0.701
0.765
0.929
0.205
0.798
0.739
0.064
0.387
0.283
0.304
0.983
0.643
0.718
0.977
0.377
0.802
0.435
0.870
0.181
0.948
0.219
0.326
0.756
0.394
0.608
0.445
0.742
0.228
0.058
0.300
0.474
0.168
0.355
0.400
0.057
0.583
0.884
0.152
0.598
0.665
0.419
0.701
0.411
0.505
0.007
0.692
0.661
0.038
0.368
0.979
0.421
0.502
0.910
0.717
0.582
0.798
0.864
0.457
0.475
0.386
0.967
0.697
0.083
0.863
0.481
0.069
0.550
0.417
0.878
0.204
0.826
0.558
0.055
0.972
0.513
0.291
0.469
0.858
0.305
0.193
0.129
0.298
0.780
0.472
0.227
0.166
0.333
0.940
0.343
0.988
0.485
0.179
0.891
0.460
0.710
0.984
0.050
0.097
0.720
0.282
0.184
0.039
0.022
0.774
0.134
0.468
0.606
0.428
0.027
0.784
0.011
0.762
0.893
0.433
0.536
0.477
0.389
0.887
0.894
0.336
0.615
0.927
0.962
0.509
0.453
0.279
0.959
0.143
0.580
0.283
0.279
0.405
0.999
0.055
0.579
0.083
0.155
0.440
0.510
0.259
0.316
0.087
0.303
0.967
0.945
0.335
0.571
0.910
0.006
0.811
0.948
0.091
0.161
0.961
0.538
0.014
0.258
0.450
0.693
0.733
0.995
0.288
0.538
0.533
0.406
0.169
0.977
0.853
0.498
0.603
0.227
0.544
0.695
0.515
0.990
0.611
0.371
0.772
0.170
0.385
0.347
0.432
0.782
0.484
0.794
0.576
0.342
0.810
0.238
0.224
0.505
0.261
0.009
0.648
0.511
0.446
0.967
0.571
0.863
0.139
0.569
0.868
0.583
0.718
0.372
0.092
0.458
0.394
0.867
0.244
0.967
0.958
0.248
0.348
0.896
0.967
0.443
0.321
0.526
0.688
0.273
0.736
0.052
0.049
0.386
0.353
0.316
0.491
0.321
0.192
0.309
0.975
0.411
0.445
0.577
0.983
0.242
0.681
0.716
0.607
0.089
0.677
0.064
0.596
0.740
0.211
0.842
0.030
0.844
0.460
0.218
0.134
0.953
0.073
0.199
0.348
0.620
0.545
0.126
0.290
0.239
0.214
0.411
0.704
0.217
0.367
0.391
0.976
0.955
0.800
0.242
0.068
0.317
0.267
0.506
0.778
0.858
0.845
0.360
0.687
0.313
0.061
0.702
0.010
0.409
0.329
0.693
0.766
0.304
0.812
0.294
0.446
0.727
0.693
0.987
0.225
0.662
0.610
0.915
0.156
0.752
0.568
0.275
0.474
0.334
0.579
0.889
0.569
0.632
0.519
0.051
0.233
0.492
0.310
0.182
0.504
0.323
0.258
0.464
0.133
0.928
0.868
0.708
0.106
0.776
0.808
0.768
0.114
0.644
0.834
0.577
0.646
0.458
0.529
0.034
0.639
0.081
0.435
0.293
0.545
0.133
0.692
0.795
0.163
0.104
0.146
0.877
0.528
0.467
0.107
0.650
0.868
0.828
0.626
0.075
0.112
0.049
0.257
0.136
0.260
0.385
0.943
0.610
0.835
0.710
0.113
0.736
0.275
0.639
0.341
0.998
0.264
0.279
0.931
0.025
0.697
0.725
0.836
0.974
0.976
0.065
0.822
0.646
0.647
0.128
0.650
0.982
0.912
0.439
0.040
0.362
0.826
0.701
0.459
0.055
0.427
0.469
0.929
0.641
0.312
0.525
0.066
0.973
0.360
0.478
0.355
0.683
0.662
0.715
0.268
0.316
0.125
0.656
0.040
0.598
0.870
0.891
0.449
0.459
0.077
0.671
0.821
0.245
0.421
0.839
0.877
0.243
0.551
0.192
0.102
0.911
0.821
0.760
0.049
0.844
0.438
0.342
0.546
0.091
0.757
0.582
0.187
0.188
0.184
0.057
0.588
0.840
0.017
0.049
0.573
0.819
0.327
0.350
0.596
0.274
0.027
0.735
0.626
0.581
0.742
0.675
0.206
0.352
0.126
0.130
0.676
0.859
0.963
0.628
0.184
0.106
0.813
0.579
0.618
0.781
0.703
0.834
0.056
0.748
0.851
0.286
0.640
0.314
0.958
0.033
0.580
0.167
0.712
0.176
0.235
0.494
0.915
0.212
0.144
0.759
0.501
0.912
0.147
0.646
0.233
0.157
0.683
0.594
0.755
0.309
0.448
0.405
0.984
0.292
0.295
0.355
0.565
0.252
0.915
0.266
0.896
0.103
0.657
0.907
0.169
0.968
0.453
0.946
0.677
0.302
0.188
0.304
0.093
0.144
0.105
0.423
0.796
0.141
0.130
0.643
0.197
0.098
0.167
0.924
0.821
0.354
0.656
0.537
0.677
0.683
0.629
0.266
0.039
0.456
0.891
0.569
0.832
0.992
0.164
0.692
0.674
0.980
0.986
0.687
0.244
0.799
0.107
0.781
0.910
0.383
0.631
0.430
0.438
0.310
0.762
0.936
0.941
0.175
0.875
0.737
0.253
0.067
0.268
0.912
0.633
0.721
0.257
0.719
0.028
0.375
0.627
0.146
0.823
0.235
0.652
0.405
0.596
0.107
0.017
0.910
0.270
0.820
0.999
0.211
0.265
0.663
0.965
0.916
0.503
0.795
0.136
0.977
0.964
0.164
0.299
0.100
0.272
0.816
0.378
0.088
0.342
0.981
0.630
0.416
0.167
0.986
0.821
0.281
0.070
0.057
0.120
0.581
0.929
0.497
0.843
0.446
0.709
0.732
0.174
0.361
0.637
0.071
0.362
0.535
0.996
0.475
0.816
0.432
0.796
0.595
0.887
0.411
0.604
0.630
0.417
0.144
0.094
0.017
0.874
0.236
0.212
0.736
0.320
0.506
0.163
0.960
0.852
0.864
0.407
0.762
0.144
0.009
0.020
0.587
0.455
0.518
0.007
0.423
0.718
0.443
0.732
0.233
0.813
0.588
0.211
0.237
0.628
0.830
0.416
0.258
0.867
0.669
0.448
0.488
0.204
0.627
0.359
0.996
0.480
0.486
0.566
0.917
0.801
0.451
0.719
0.586
0.492
0.192
0.464
0.246
0.915
0.335
0.952
0.533
0.632
0.543
0.129
0.376
0.099
0.623
0.447
0.778
0.220
0.165
0.406
0.927
0.710
0.419
0.517
0.084
0.985
0.666
0.285
0.586
0.109
0.966
0.175
0.648
0.245
0.718
0.777
0.673
0.614
0.892
0.079
0.331
0.248
0.988
0.389
0.733
0.685
0.212
0.239
0.281
0.411
0.234
0.020
0.879
0.966
0.342
0.339
0.007
0.680
0.825
0.147
0.160
0.790
0.616
0.045
0.623
0.309
0.369
0.770
0.811
0.400
0.947
0.406
0.772
0.484
0.898
0.908
0.087
0.936
0.826
0.791
0.201
0.805
0.850
0.289
0.952
0.050
0.150
0.538
0.576
0.645
0.017
0.960
0.045
0.143
0.014
0.567
0.932
0.666
0.823
0.013
0.542
0.460
0.499
0.072
0.684
0.503
0.765
0.485
0.149
0.648
0.172
0.872
0.613
0.157
0.962
0.518
0.073
0.627
0.253
0.804
0.818
0.979
0.502
0.455
0.753
0.132
0.547
0.546
0.089
0.418
0.853
0.856
0.099
0.091
0.263
0.875
0.129
0.720
0.101
0.623
0.356
0.789
0.234
0.639
0.236
0.714
0.874
0.127
0.867
0.594
0.127
0.428
0.161
0.700
0.759
0.106
0.506
0.815
0.936
0.514
0.950
0.536
0.394
0.848
0.493
0.472
0.490
0.847
0.832
0.404
0.333
0.492
0.573
0.238
0.794
0.486
0.334
0.107
0.239
0.949
0.305
0.169
0.461
0.307
0.035
0.323
0.544
0.552
0.357
0.447
0.966
0.377
0.258
0.243
0.458
0.760
0.206
0.709
0.021
0.419
0.776
0.339
0.564
0.205
0.742
0.052
0.853
0.198
0.874
0.353
0.723
0.319
0.906
0.127
0.179
0.493
0.082
0.196
0.944
0.976
0.376
0.923
0.545
0.270
0.702
0.659
0.989
0.265
0.672
0.052
0.114
0.004
0.999
0.802
0.787
0.510
0.300
0.151
0.285
0.376
0.756
0.540
0.077
0.018
0.102
0.410
0.933
0.637
0.802
0.466
0.362
0.108
0.876
0.783
0.168
0.512
0.487
0.625
0.946
0.580
0.540
0.050
0.063
0.187
0.032
0.743
0.723
0.136
0.836
0.155
0.301
0.249
0.372
0.255
0.489
0.407
0.987
0.056
0.142
0.053
0.626
0.055
0.836
0.512
0.527
0.562
0.171
0.198
0.319
0.593
0.210
0.116
0.970
0.022
0.803
0.788
0.297
0.136
0.146
0.697
0.643
0.783
0.937
0.692
0.156
0.243
0.842
0.209
0.073
0.703
0.403
0.920
0.650
0.090
0.948
0.259
0.337
0.402
0.051
0.497
0.123
0.401
0.351
0.773
0.803
0.153
0.367
0.373
0.151
0.798
0.707
0.190
0.367
0.106
0.628
0.121
0.986
0.781
0.390
0.579
0.911
0.554
0.174
0.379
0.843
0.644
0.651
0.867
0.894
0.920
0.095
0.544
0.741
0.757
0.607
0.167
0.797
0.720
0.086
0.464
0.174
0.337
0.018
0.400
0.550
0.620
0.663
0.836
0.215
0.120
0.852
0.157
0.975
0.843
0.501
0.981
0.097
0.303
0.638
0.350
0.928
0.066
0.549
0.591
0.653
0.256
0.365
0.844
0.195
0.827
0.563
0.762
0.714
0.208
0.569
0.868
0.432
0.967
0.452
0.143
0.389
0.952
0.320
0.868
0.881
0.350
0.083
0.725
0.312
0.783
0.443
0.485
0.778
0.382
0.443
0.376
0.367
0.526
0.809
0.362
0.524
0.671
0.269
0.278
0.933
0.724
0.577
0.648
0.029
0.618
0.510
0.546
0.359
0.081
0.193
0.951
0.409
0.467
0.053
0.038
0.717
0.538
0.509
0.243
0.745
0.168
0.190
0.461
0.286
0.247
0.645
0.651
0.825
0.418
0.073
0.274
0.980
0.596
0.710
0.007
0.518
0.038
0.569
0.921
0.968
0.541
0.458
0.945
0.674
0.065
0.466
0.240
0.470
0.455
0.582
0.357
0.164
0.504
0.830
0.664
0.338
0.936
0.912
0.209
0.039
0.741
0.828
0.968
0.890
0.601
0.530
0.028
0.247
0.460
0.629
0.557
0.743
0.280
0.565
0.088
0.414
0.750
0.693
0.991
0.278
0.498
0.330
0.441
0.857
0.684
0.882
0.204
0.873
0.060
0.889
0.329
0.315
0.512
0.941
0.048
0.352
0.857
0.396
0.199
0.425
0.386
0.915
0.762
0.505
0.988
0.465
0.058
0.262
0.436
0.157
0.948
0.767
0.268
0.591
0.721
0.941
0.882
0.515
0.775
0.562
0.553
0.115
0.126
0.530
0.269
0.021
0.723
0.032
0.595
0.539
0.969
0.484
0.064
0.552
0.841
0.997
0.430
0.320
0.907
0.191
0.615
0.406
0.041
0.347
0.265
0.686
0.949
0.268
0.811
0.055
0.538
0.071
0.934
0.766
0.658
0.824
0.648
0.969
0.714
0.925
0.517
0.117
0.528
0.127
0.592
0.482
0.623
0.010
0.195
0.491
0.951
0.384
0.424
0.715
0.604
0.299
0.411
0.218
0.966
0.131
0.372
0.026
0.222
0.780
0.201
0.080
0.474
0.010
0.085
0.808
0.771
0.915
0.855
0.683
0.052
0.783
0.846
0.799
0.508
0.233
0.845
0.727
0.981
0.156
0.226
0.212
0.793
0.187
0.067
0.999
0.025
0.835
0.365
0.842
0.555
0.902
0.926
0.009
0.075
0.250
0.669
0.888
0.460
0.607
0.188
0.805
0.967
0.269
0.495
0.086
0.559
0.503
0.144
0.442
0.334
0.522
0.586
0.008
0.761
0.713
0.366
0.053
0.755
0.955
0.240
0.324
0.184
0.577
0.155
0.356
0.697
0.560
0.386
0.420
0.562
0.006
0.079
0.881
0.078
0.707
0.920
0.271
0.370
0.569
0.882
0.942
0.740
0.087
0.669
0.410
0.471
0.049
0.191
0.791
0.454
0.302
0.384
0.854
0.749
0.272
0.118
0.252
0.921
0.582
0.533
0.596
0.366
0.807
0.523
0.117
0.129
0.537
0.937
0.308
0.870
0.764
0.248
0.036
0.102
0.728
0.200
0.371
0.502
0.725
0.113
0.406
0.485
0.541
0.818
0.172
0.808
0.693
0.556
0.700
0.412
0.138
0.666
0.525
0.639
0.338
0.907
0.587
0.085
0.450
0.756
0.794
0.495
0.895
0.268
0.411
0.899
0.896
0.686
0.859
0.089
0.865
0.321
0.178
0.239
0.418
0.251
0.321
0.135
0.296
0.572
0.442
0.347
0.746
0.894
0.757
0.332
0.744
0.152
0.130
0.346
0.808
0.246
0.063
0.132
0.859
0.634
0.253
0.555
0.528
0.655
0.233
0.864
0.309
0.694
0.016
0.373
0.352
0.543
0.073
0.790
0.218
0.677
0.803
0.562
0.741
0.662
0.356
0.809
0.295
0.571
0.745
0.379
0.303
0.459
0.818
0.930
0.545
0.661
0.903
0.099
0.232
0.709
0.824
0.857
0.415
0.089
0.438
0.231
0.291
0.772
0.076
0.828
0.428
0.596
0.090
0.337
0.090
0.962
0.807
0.460
0.224
0.933
0.345
0.161
0.563
0.944
0.004
0.238
0.938
0.171
0.113
0.570
0.862
0.477
0.995
0.679
0.074
0.993
0.408
0.287
0.558
0.304
0.086
0.001
0.450
0.837
0.572
0.999
0.287
0.280
0.160
0.164
0.285
0.793
0.906
0.975
0.232
0.881
0.328
0.113
0.870
0.636
0.165
0.430
0.287
0.737
0.453
0.989
0.055
0.355
0.482
0.503
0.316
0.729
0.875
0.939
0.805
0.851
0.551
0.826
0.304
0.398
0.317
0.467
0.286
0.962
0.313
0.098
0.527
0.057
0.111
0.219
0.464
0.433
0.800
0.739
0.426
0.334
0.316
0.774
0.052
0.013
0.796
0.477
0.131
0.236
0.933
0.860
0.710
0.482
0.520
0.975
0.966
0.215
0.811
0.197
0.214
0.676
0.405
0.932
0.089
0.512
0.919
0.904
0.157
0.367
0.619
0.159
0.968
0.915
0.167
0.121
0.085
0.909
0.194
0.416
0.449
0.599
0.072
0.227
0.931
0.908
0.781
0.019
0.327
0.628
0.275
0.910
0.634
0.072
0.909
0.958
0.078
0.532
0.303
0.393
0.180
0.578
0.958
0.503
0.083
0.620
0.340
0.856
0.435
0.193
0.191
0.769
0.266
0.767
0.206
0.322
0.465
0.943
0.932
0.207
0.358
0.149
0.510
0.461
0.901
0.930
0.384
0.492
0.896
0.807
0.004
0.320
0.422
0.230
0.200
0.038
0.643
0.263
0.052
0.231
0.802
0.851
0.483
0.416
0.978
0.201
0.215
0.292
0.173
0.379
0.911
0.615
0.412
0.934
0.048
0.210
0.566
0.616
0.040
0.403
0.649
0.698
0.638
0.321
0.094
0.427
0.129
0.948
0.475
0.317
0.766
0.498
0.310
0.812
0.867
0.673
0.047
0.566
0.763
0.137
0.229
0.882
0.020
0.753
0.474
0.447
0.007
0.819
0.360
0.541
0.677
0.703
0.956
0.531
0.400
0.954
0.642
0.298
0.126
0.843
0.362
0.660
0.471
0.199
0.602
0.061
0.413
0.547
0.436
0.749
0.828
0.771
0.039
0.194
0.537
0.935
0.835
0.843
0.303
0.429
0.471
0.157
0.031
0.947
0.241
0.240
0.083
0.929
0.580
0.667
0.912
0.346
0.622
......@@ -160,6 +160,21 @@ inline static const char* Atoi(const char* p, int* out) {
return p;
}
template<class T>
inline static double Pow(T base, int power) {
if (power < 0) {
return 1.0 / Pow(base, -power);
} else if (power == 0) {
return 1;
} else if (power % 2 == 0) {
return Pow(base*base, power / 2);
} else if (power % 3 == 0) {
return Pow(base*base*base, power / 3);
} else {
return base * Pow(base, power - 1);
}
}
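The `Pow` template above computes integer powers recursively, squaring or cubing the base to halve or third the exponent and falling back to one multiplication for other exponents. A minimal Python sketch of the same scheme (the name `pow_rec` is illustrative, not part of the library):

```python
def pow_rec(base, power):
    """Integer power by repeated squaring/cubing, mirroring Common::Pow."""
    if power < 0:
        return 1.0 / pow_rec(base, -power)
    if power == 0:
        return 1.0
    if power % 2 == 0:
        return pow_rec(base * base, power // 2)
    if power % 3 == 0:
        return pow_rec(base * base * base, power // 3)
    return base * pow_rec(base, power - 1)
```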
inline static const char* Atof(const char* p, double* out) {
int frac;
double sign, value, scale;
......@@ -168,7 +183,6 @@ inline static const char* Atof(const char* p, double* out) {
while (*p == ' ') {
++p;
}
// Get sign, if any.
sign = 1.0;
if (*p == '-') {
......@@ -187,13 +201,15 @@ inline static const char* Atof(const char* p, double* out) {
// Get digits after decimal point, if any.
if (*p == '.') {
double pow10 = 10.0;
double right = 0.0;
int nn = 0;
++p;
while (*p >= '0' && *p <= '9') {
value += (*p - '0') / pow10;
pow10 *= 10.0;
right = (*p - '0') + right * 10.0;
++nn;
++p;
}
value += right / Pow(10.0, nn);
}
// Handle exponent, if any.
......@@ -250,8 +266,6 @@ inline static const char* Atof(const char* p, double* out) {
return p;
}
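The `Atof` change above replaces the old per-digit `value += digit / pow10` accumulation with a single division: fractional digits are first gathered into an integer `right`, then divided once by `10^n`, which avoids compounding a rounding error at every digit. A small Python sketch of that fix, under illustrative names:

```python
def parse_fraction(digits: str) -> float:
    """Accumulate fractional digits as an integer, then divide once
    by 10**n -- mirroring the single-division fix to Atof."""
    right = 0
    for ch in digits:
        right = right * 10 + (ord(ch) - ord('0'))  # same as C: right*10 + (*p-'0')
    return right / 10.0 ** len(digits)
```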
inline bool AtoiAndCheck(const char* p, int* out) {
const char* after = Atoi(p, out);
if (*after != '\0') {
......@@ -632,6 +646,15 @@ inline bool FindInBitset(const uint32_t* bits, int n, T pos) {
return (bits[i1] >> i2) & 1;
}
inline static bool CheckDoubleEqualOrdered(double a, double b) {
double upper = std::nextafter(a, INFINITY);
return b <= upper;
}
inline static double GetDoubleUpperBound(double a) {
  return std::nextafter(a, INFINITY);
}
} // namespace Common
} // namespace LightGBM
......
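The two helpers added to `common.h` implement a one-ULP ordered tolerance: `b` is treated as equal to `a` if it does not exceed the next representable double above `a`. A Python equivalent using `math.nextafter` (Python 3.9+; the function name here is illustrative):

```python
import math

def check_double_equal_ordered(a: float, b: float) -> bool:
    """True if b <= nextafter(a, +inf), i.e. b exceeds a by at most
    one ULP -- mirroring Common::CheckDoubleEqualOrdered (a <= b assumed)."""
    upper = math.nextafter(a, math.inf)
    return b <= upper
```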
......@@ -131,7 +131,7 @@ def param_dict_to_str(data):
pairs.append(str(key) + '=' + ','.join(map(str, val)))
elif isinstance(val, string_type) or isinstance(val, numeric_types) or is_numeric(val):
pairs.append(str(key) + '=' + str(val))
else:
elif val is not None:
raise TypeError('Unknown type of parameter:%s, got:%s'
% (key, type(val).__name__))
return ' '.join(pairs)
......@@ -555,8 +555,8 @@ class _InnerPredictor(object):
class Dataset(object):
"""Dataset in LightGBM."""
def __init__(self, data, label=None, max_bin=255, reference=None,
weight=None, group=None, silent=False,
def __init__(self, data, label=None, max_bin=None, reference=None,
weight=None, group=None, init_score=None, silent=False,
feature_name='auto', categorical_feature='auto', params=None,
free_raw_data=True):
"""Constract Dataset.
......@@ -566,9 +566,9 @@ class Dataset(object):
data : string, numpy array or scipy.sparse
Data source of Dataset.
If string, it represents the path to txt file.
label : list or numpy 1-D array, optional (default=None)
label : list, numpy 1-D array or None, optional (default=None)
Label of the data.
max_bin : int, optional (default=255)
max_bin : int or None, optional (default=None)
Max number of discrete bins for features.
reference : Dataset or None, optional (default=None)
If this is Dataset for validation, training data should be used as reference.
......@@ -576,6 +576,8 @@ class Dataset(object):
Weight for each instance.
group : list, numpy 1-D array or None, optional (default=None)
Group/query size for Dataset.
init_score : list, numpy 1-D array or None, optional (default=None)
Init score for Dataset.
silent : bool, optional (default=False)
Whether to print messages during construction.
feature_name : list of strings or 'auto', optional (default="auto")
......@@ -598,6 +600,7 @@ class Dataset(object):
self.reference = reference
self.weight = weight
self.group = group
self.init_score = init_score
self.silent = silent
self.feature_name = feature_name
self.categorical_feature = categorical_feature
......@@ -616,8 +619,8 @@ class Dataset(object):
_safe_call(_LIB.LGBM_DatasetFree(self.handle))
self.handle = None
def _lazy_init(self, data, label=None, max_bin=255, reference=None,
weight=None, group=None, predictor=None,
def _lazy_init(self, data, label=None, max_bin=None, reference=None,
weight=None, group=None, init_score=None, predictor=None,
silent=False, feature_name='auto',
categorical_feature='auto', params=None):
if data is None:
......@@ -633,7 +636,8 @@ class Dataset(object):
params = {} if params is None else params
self.max_bin = max_bin
self.predictor = predictor
params["max_bin"] = max_bin
if self.max_bin is not None:
params["max_bin"] = self.max_bin
if "verbosity" in params:
params.setdefault("verbose", params.pop("verbosity"))
if silent:
......@@ -655,6 +659,10 @@ class Dataset(object):
raise TypeError("Wrong type({}) or unknown name({}) in categorical_feature"
.format(type(name).__name__, name))
if categorical_indices:
if "categorical_feature" in params or "categorical_column" in params:
warnings.warn('categorical_feature in param dict is overridden.')
params.pop("categorical_feature", None)
params.pop("categorical_column", None)
params['categorical_column'] = sorted(categorical_indices)
params_str = param_dict_to_str(params)
......@@ -697,7 +705,11 @@ class Dataset(object):
if group is not None:
self.set_group(group)
# load init score
if isinstance(self.predictor, _InnerPredictor):
if init_score is not None:
self.set_init_score(init_score)
if self.predictor is not None:
warnings.warn("The prediction of init_model will be overrided by init_score.")
elif isinstance(self.predictor, _InnerPredictor):
init_score = self.predictor.predict(data,
raw_score=True,
data_has_header=self.data_has_header,
......@@ -802,7 +814,7 @@ class Dataset(object):
if self.used_indices is None:
"""create valid"""
self._lazy_init(self.data, label=self.label, max_bin=self.max_bin, reference=self.reference,
weight=self.weight, group=self.group, predictor=self._predictor,
weight=self.weight, group=self.group, init_score=self.init_score, predictor=self._predictor,
silent=self.silent, feature_name=self.feature_name, params=self.params)
else:
"""construct subset"""
......@@ -820,15 +832,15 @@ class Dataset(object):
else:
"""create train"""
self._lazy_init(self.data, label=self.label, max_bin=self.max_bin,
weight=self.weight, group=self.group, predictor=self._predictor,
silent=self.silent, feature_name=self.feature_name,
weight=self.weight, group=self.group, init_score=self.init_score,
predictor=self._predictor, silent=self.silent, feature_name=self.feature_name,
categorical_feature=self.categorical_feature, params=self.params)
if self.free_raw_data:
self.data = None
return self
def create_valid(self, data, label=None, weight=None, group=None,
silent=False, params=None):
init_score=None, silent=False, params=None):
"""Create validation data align with current Dataset.
Parameters
......@@ -842,6 +854,8 @@ class Dataset(object):
Weight for each instance.
group : list, numpy 1-D array or None, optional (default=None)
Group/query size for Dataset.
init_score : list, numpy 1-D array or None, optional (default=None)
Init score for Dataset.
silent : bool, optional (default=False)
Whether to print messages during construction.
params: dict or None, optional (default=None)
......@@ -853,8 +867,8 @@ class Dataset(object):
Returns self.
"""
ret = Dataset(data, label=label, max_bin=self.max_bin, reference=self,
weight=weight, group=group, silent=silent, params=params,
free_raw_data=self.free_raw_data)
weight=weight, group=group, init_score=init_score,
silent=silent, params=params, free_raw_data=self.free_raw_data)
ret._predictor = self._predictor
ret.pandas_categorical = self.pandas_categorical
return ret
......
......@@ -95,13 +95,13 @@ def train(params, train_set, num_boost_round=100,
"""create predictor first"""
for alias in ["num_boost_round", "num_iterations", "num_iteration", "num_tree", "num_trees", "num_round", "num_rounds"]:
if alias in params:
num_boost_round = int(params.pop(alias))
warnings.warn("Found `{}` in params. Will use it instead of argument".format(alias))
num_boost_round = params.pop(alias)
break
for alias in ["early_stopping_round", "early_stopping_rounds", "early_stopping"]:
if alias in params:
if alias in params and params[alias] is not None:
early_stopping_rounds = int(params.pop(alias))
warnings.warn("Found `{}` in params. Will use it instead of argument".format(alias))
early_stopping_rounds = params.pop(alias)
break
if isinstance(init_model, string_type):
......
......@@ -142,7 +142,7 @@ class LGBMModel(_LGBMModelBase):
subsample_for_bin=200000, objective=None,
min_split_gain=0., min_child_weight=1e-3, min_child_samples=20,
subsample=1., subsample_freq=1, colsample_bytree=1.,
reg_alpha=0., reg_lambda=0., random_state=0,
reg_alpha=0., reg_lambda=0., random_state=None,
n_jobs=-1, silent=True, **kwargs):
"""Construct a gradient boosting model.
......@@ -185,8 +185,9 @@ class LGBMModel(_LGBMModelBase):
L1 regularization term on weights.
reg_lambda : float, optional (default=0.)
L2 regularization term on weights.
random_state : int, optional (default=0)
random_state : int or None, optional (default=None)
Random number seed.
Will use default seeds in c++ code if set to None.
n_jobs : int, optional (default=-1)
Number of parallel threads.
silent : bool, optional (default=True)
......
......@@ -79,10 +79,13 @@ namespace LightGBM {
for (int i = 0; i < num_distinct_values - 1; ++i) {
cur_cnt_inbin += counts[i];
if (cur_cnt_inbin >= min_data_in_bin) {
bin_upper_bound.push_back((distinct_values[i] + distinct_values[i + 1]) / 2);
auto val = Common::GetDoubleUpperBound((distinct_values[i] + distinct_values[i + 1]) / 2.0);
if (bin_upper_bound.empty() || !Common::CheckDoubleEqualOrdered(bin_upper_bound.back(), val)) {
bin_upper_bound.push_back(val);
cur_cnt_inbin = 0;
}
}
}
cur_cnt_inbin += counts[num_distinct_values - 1];
bin_upper_bound.push_back(std::numeric_limits<double>::infinity());
} else {
......@@ -131,12 +134,15 @@ namespace LightGBM {
}
++bin_cnt;
// update bin upper bound
bin_upper_bound.resize(bin_cnt);
bin_upper_bound.clear();
for (int i = 0; i < bin_cnt - 1; ++i) {
bin_upper_bound[i] = (upper_bounds[i] + lower_bounds[i + 1]) / 2.0f;
auto val = Common::GetDoubleUpperBound((upper_bounds[i] + lower_bounds[i + 1]) / 2.0);
if (bin_upper_bound.empty() || !Common::CheckDoubleEqualOrdered(bin_upper_bound.back(), val)) {
bin_upper_bound.push_back(val);
}
}
// last bin upper bound
bin_upper_bound[bin_cnt - 1] = std::numeric_limits<double>::infinity();
bin_upper_bound.push_back(std::numeric_limits<double>::infinity());
}
return bin_upper_bound;
}
......@@ -241,7 +247,7 @@ namespace LightGBM {
}
for (int i = 1; i < num_sample_values; ++i) {
if (values[i] != values[i - 1]) {
if (!Common::CheckDoubleEqualOrdered(values[i - 1], values[i])) {
if (values[i - 1] < 0.0f && values[i] > 0.0f) {
distinct_values.push_back(0.0f);
counts.push_back(zero_cnt);
......@@ -249,6 +255,8 @@ namespace LightGBM {
distinct_values.push_back(values[i]);
counts.push_back(1);
} else {
// use the larger value
distinct_values.back() = values[i];
++counts.back();
}
}
......
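The bin-finder change above stops emitting a new bin boundary when the candidate is within one ULP of the previous one, and shifts each midpoint up by one ULP so values equal to the midpoint fall below the bound consistently. A Python sketch of the deduplication logic (assuming sorted midpoints; names are illustrative):

```python
import math

def find_bin_upper_bounds(midpoints):
    """Deduplicate candidate bin boundaries within one ULP, mirroring
    the float-tolerance change to the bin finder (midpoints sorted)."""
    bounds = []
    for m in midpoints:
        val = math.nextafter(m, math.inf)  # upper bound of the midpoint
        # skip a candidate that is not above the last bound by more than one ULP
        if not bounds or val > math.nextafter(bounds[-1], math.inf):
            bounds.append(val)
    bounds.append(math.inf)  # last bin is unbounded above
    return bounds
```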
# coding: utf-8
# pylint: skip-file
import os
import unittest
import lightgbm as lgb
import numpy as np
from sklearn.datasets import load_svmlight_file
class FileLoader(object):
def __init__(self, directory, prefix, config_file='train.conf'):
directory = os.path.join(os.path.dirname(os.path.realpath(__file__)), directory)
self.directory = directory
self.prefix = prefix
self.params = {'gpu_use_dp': True}
with open(os.path.join(directory, config_file), 'r') as f:
for line in f.readlines():
line = line.strip()
if line and not line.startswith('#'):
key, value = [token.strip() for token in line.split('=')]
if 'early_stopping' not in key: # disable early_stopping
self.params[key] = value
def load_dataset(self, suffix, is_sparse=False):
filename = os.path.join(self.directory, self.prefix + suffix)
if is_sparse:
X, Y = load_svmlight_file(filename, dtype=np.float64, zero_based=True)
return X, Y, filename
else:
mat = np.loadtxt(filename, dtype=np.float64)
return mat[:, 1:], mat[:, 0], filename
def load_field(self, suffix):
return np.loadtxt(os.path.join(self.directory, self.prefix + suffix))
def load_cpp_result(self, result_file='LightGBM_predict_result.txt'):
return np.loadtxt(os.path.join(self.directory, result_file))
def train_predict_check(self, lgb_train, X_test, X_test_fn, sk_pred):
gbm = lgb.train(self.params, lgb_train)
y_pred = gbm.predict(X_test)
cpp_pred = gbm.predict(X_test_fn)
np.testing.assert_array_almost_equal(y_pred, cpp_pred, decimal=5)
np.testing.assert_array_almost_equal(y_pred, sk_pred, decimal=5)
class TestEngine(unittest.TestCase):
def test_binary(self):
fd = FileLoader('../../examples/binary_classification', 'binary')
X_train, y_train, _ = fd.load_dataset('.train')
X_test, _, X_test_fn = fd.load_dataset('.test')
weight_train = fd.load_field('.train.weight')
lgb_train = lgb.Dataset(X_train, y_train, params=fd.params, weight=weight_train)
gbm = lgb.LGBMClassifier(**fd.params)
gbm.fit(X_train, y_train, sample_weight=weight_train)
sk_pred = gbm.predict_proba(X_test)[:, 1]
fd.train_predict_check(lgb_train, X_test, X_test_fn, sk_pred)
def test_multiclass(self):
fd = FileLoader('../../examples/multiclass_classification', 'multiclass')
X_train, y_train, _ = fd.load_dataset('.train')
X_test, _, X_test_fn = fd.load_dataset('.test')
lgb_train = lgb.Dataset(X_train, y_train)
gbm = lgb.LGBMClassifier(**fd.params)
gbm.fit(X_train, y_train)
sk_pred = gbm.predict_proba(X_test)
fd.train_predict_check(lgb_train, X_test, X_test_fn, sk_pred)
def test_regression(self):
fd = FileLoader('../../examples/regression', 'regression')
X_train, y_train, _ = fd.load_dataset('.train')
X_test, _, X_test_fn = fd.load_dataset('.test')
init_score_train = fd.load_field('.train.init')
lgb_train = lgb.Dataset(X_train, y_train, init_score=init_score_train)
gbm = lgb.LGBMRegressor(**fd.params)
gbm.fit(X_train, y_train, init_score=init_score_train)
sk_pred = gbm.predict(X_test)
fd.train_predict_check(lgb_train, X_test, X_test_fn, sk_pred)
def test_lambdarank(self):
fd = FileLoader('../../examples/lambdarank', 'rank')
X_train, y_train, _ = fd.load_dataset('.train', is_sparse=True)
X_test, _, X_test_fn = fd.load_dataset('.test', is_sparse=True)
group_train = fd.load_field('.train.query')
lgb_train = lgb.Dataset(X_train, y_train, group=group_train)
gbm = lgb.LGBMRanker(**fd.params)
gbm.fit(X_train, y_train, group=group_train)
sk_pred = gbm.predict(X_test)
fd.train_predict_check(lgb_train, X_test, X_test_fn, sk_pred)