1. 13 Apr, 2019 2 commits
  2. 11 Apr, 2019 3 commits
  3. 10 Apr, 2019 2 commits
  4. 09 Apr, 2019 1 commit
      [ci] update CI stuff (#2079) · 691b8428
      Nikita Titov authored
      * updated boost submodule
      
      * updated docker with new stable Clang and CMake
      
      * switch to dev docker
      
      * updated setup script
      
      * updated MinGW on Appveyor
      
      * updated Azure config to use docker for GPU task
      
      * do not upgrade gcc - takes too long
      
      * test: switch compilers
      
      * switch compilers back
      
      * get back to main docker
  5. 04 Apr, 2019 1 commit
      Add Cost Effective Gradient Boosting (#2014) · 76102284
      remcob-gr authored
      * Add configuration parameters for CEGB (see the usage sketch after this entry).
      
      * Add skeleton CEGB tree learner
      
      Like the original CEGB version, this inherits from SerialTreeLearner.
      Currently, it changes nothing from the original.
      
      * Track features used in CEGB tree learner.
      
      * Pull CEGB tradeoff and coupled feature penalty from config.
      
      * Implement finding best splits for CEGB
      
      This is heavily based on the serial version; it just adds the use of the coupled penalties.
      
      * Set proper defaults for cegb parameters.
      
      * Ensure sanity checks don't switch off CEGB.
      
      * Implement per-data-point feature penalties in CEGB.
      
      * Implement split penalty and remove unused parameters.
      
      * Merge changes from CEGB tree learner into serial tree learner
      
      * Represent features_used_in_data by a bitset, to reduce the memory overhead of CEGB, and add sanity checks for the lengths of the penalty vectors.
      
      * Fix bug where CEGB would incorrectly penalise a previously used feature
      
      The tree learner did not update the gains of previously computed leaf splits when splitting a leaf elsewhere in the tree.
      This caused it to prefer new features due to incorrectly penalising splitting on previously used features.
      
      * Document CEGB parameters and add them to the appropriate section.
      
      * Remove leftover reference to cegb tree learner.
      
      * Remove outdated diff.
      
      * Fix warnings
      
      * Fix minor issues identified by @StrikerRUS.
      
      * Add docs section on CEGB, including citation.
      
      * Fix link.
      
      * Fix CI failure.
      
      * Add some unit tests
      
      * Fix pylint issues.
      
      * Fix remaining pylint issue
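
      The CEGB options introduced in this PR are ordinary entries in the LightGBM parameter dict. A minimal sketch on made-up data; the penalty values below are illustrative only, not recommended settings:

      ```python
      import lightgbm as lgb
      import numpy as np

      X = np.random.rand(500, 4)
      y = np.random.randint(0, 2, 500)

      params = {
          "objective": "binary",
          # Trade-off between prediction cost and training loss.
          "cegb_tradeoff": 0.5,
          # Flat penalty charged for every split made.
          "cegb_penalty_split": 0.1,
          # Coupled penalty: one value per feature, charged when the feature is used at all.
          "cegb_penalty_feature_coupled": [1.0, 1.0, 2.0, 2.0],
          # Lazy penalty: one value per feature, charged per data point on which
          # the feature has to be computed.
          "cegb_penalty_feature_lazy": [0.1, 0.1, 0.2, 0.2],
      }

      booster = lgb.train(params, lgb.Dataset(X, label=y), num_boost_round=10)
      ```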
  6. 02 Apr, 2019 1 commit
  7. 01 Apr, 2019 1 commit
  8. 26 Mar, 2019 3 commits
  9. 25 Mar, 2019 3 commits
      Add API method LGBM_BoosterPredictForMats (#2008) · 823fc03c
      mjmckp authored
      * Fix index out-of-range exception generated by BaggingHelper on small datasets.
      
      Prior to this change, the line "score_t threshold = tmp_gradients[top_k - 1];" would generate an exception, since tmp_gradients would be empty when the cnt input value to the function is zero.
      
      * Update goss.hpp
      
      * Update goss.hpp
      
      * Add API method LGBM_BoosterPredictForMats, which runs prediction on a data set given as an array of pointers to rows (as opposed to the existing LGBM_BoosterPredictForMat, which requires the data as one contiguous array); see the calling sketch after this entry
      
      * Fix incorrect upstream merge
      
      * Add link to LightGBM.NET
      
      * Fix indenting to 2 spaces
      
      * Dummy edit to trigger CI
      
      * Dummy edit to trigger CI
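
      A rough sketch (not part of the PR) of driving the new entry point from Python via ctypes. The argument order is an assumption modelled on LGBM_BoosterPredictForMat and should be verified against c_api.h; the constants and library name are likewise assumptions:

      ```python
      import ctypes
      import numpy as np

      lib = ctypes.CDLL("lib_lightgbm.so")   # platform-dependent library name

      C_API_DTYPE_FLOAT64 = 1                # assumed constant values from c_api.h
      C_API_PREDICT_NORMAL = 0

      def predict_for_mats(booster_handle, rows):
          """Predict for a list of contiguous 1-D float64 numpy rows."""
          nrow, ncol = len(rows), len(rows[0])
          # The key difference from LGBM_BoosterPredictForMat: one pointer per row
          # instead of a single pointer to a contiguous nrow x ncol block.
          row_ptrs = (ctypes.c_void_p * nrow)(*(r.ctypes.data for r in rows))
          out_len = ctypes.c_int64(0)
          out_result = (ctypes.c_double * nrow)()
          ret = lib.LGBM_BoosterPredictForMats(
              booster_handle,
              row_ptrs,
              ctypes.c_int(C_API_DTYPE_FLOAT64),
              ctypes.c_int32(nrow),
              ctypes.c_int32(ncol),
              ctypes.c_int(C_API_PREDICT_NORMAL),
              ctypes.c_int(-1),          # num_iteration: use all trees
              ctypes.c_char_p(b""),      # extra prediction parameters
              ctypes.byref(out_len),
              out_result)
          assert ret == 0
          return np.ctypeslib.as_array(out_result)
      ```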
      [python] Use first_metric_only flag for early_stopping function. (#2049) · 011cc90a
      kenmatsu4 authored
      * Use first_metric_only flag for early_stopping function.
      
      To apply early stopping based on only the first metric, this adds a first_metric_only flag to the early_stopping function (see the sketch after this entry).
      
      * update comment
      
      * Revert "update comment"
      
      This reverts commit 1e75a1a415cc16cfbe795181e148ebfe91469be4.
      
      * added test
      
      * fixed docstring
      
      * cut comment and save one line
      
      * document new feature
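
      A minimal sketch of the new flag on made-up data; with first_metric_only=True, only the first metric is consulted when deciding whether to stop:

      ```python
      import lightgbm as lgb
      import numpy as np

      X, y = np.random.rand(1000, 5), np.random.randint(0, 2, 1000)
      train_set = lgb.Dataset(X[:800], label=y[:800])
      valid_set = lgb.Dataset(X[800:], label=y[800:], reference=train_set)

      params = {"objective": "binary", "metric": ["auc", "binary_logloss"]}
      booster = lgb.train(
          params,
          train_set,
          num_boost_round=100,
          valid_sets=[valid_set],
          callbacks=[
              # Stop once the first metric has not improved for 10 rounds,
              # ignoring the remaining metrics in the list.
              lgb.early_stopping(stopping_rounds=10, first_metric_only=True),
          ],
      )
      ```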
      remove warnings · 548fc91e
      Guolin Ke authored
  10. 22 Mar, 2019 1 commit
  11. 20 Mar, 2019 1 commit
  12. 18 Mar, 2019 2 commits
  13. 16 Mar, 2019 1 commit
  14. 14 Mar, 2019 4 commits
  15. 09 Mar, 2019 2 commits
  16. 07 Mar, 2019 2 commits
  17. 26 Feb, 2019 1 commit
      Add ability to move features from one data set to another in memory (#2006) · 219c943d
      remcob-gr authored
      * Initial attempt to implement appending features in-memory to another data set
      
      The intent is to make it easy to munge files together, without needing to round-trip via numpy or write multiple copies to disk (a usage sketch follows this entry).
      In turn, that enables working more efficiently with data sets that were written separately.
      
      * Implement Dataset.dump_text, and fix small bug in appending of group bin boundaries.
      
      Dumping to text enables us to compare results, without having to worry about issues like features being reordered.
      
      * Add basic tests for validation logic for add_features_from.
      
      * Remove various internal mapping items from dataset text dumps
      
      These are too sensitive to the exact feature order chosen, which is not visible to the user.
      Including them in tests appears unnecessary, as the data dumping code should provide enough coverage.
      
      * Add test that add_features_from results in identical data sets according to dump_text.
      
      * Add test that booster behaviour after using add_features_from matches that of training on the full data
      
      This checks:
      - That training after add_features_from works at all
      - That add_features_from does not cause training to misbehave
      
      * Expose feature_penalty and monotone_types/constraints via get_field
      
      These getters allow us to check that add_features_from does the right thing with these vectors.
      
      * Add tests that add_features correctly handles feature_penalty and monotone_constraints.
      
      * Ensure add_features_from properly frees the added dataset and add unit test for this
      
      Since add_features_from moves the feature group pointers from the added dataset to the dataset being added to, the added dataset is invalid after the call.
      We must ensure we do not try to access this handle.
      
      * Remove some obsolete TODOs
      
      * Tidy up DumpTextFile by using a single iterator for each feature
      
      These iterators were also passed around as raw pointers without being freed; this is now fixed.
      
      * Factor out offsetting logic in AddFeaturesFrom
      
      * Remove obsolete TODO
      
      * Remove another TODO
      
      This one is debatable: test code can be a bit messy and duplicate-heavy, and factoring it out tends to end badly.
      Leaving this for now; will revisit if adding more tests later becomes a mess.
      
      * Add documentation for newly-added methods.
      
      * Fix whitespace issues identified by pylint.
      
      * Fix a few more whitespace issues.
      
      * Fix doc comments
      
      * Implement deep copying for feature groups.
      
      * Replace awkward std::move usage with emplace_back, and reduce the vector size to num_features rather than num_total_features.
      
      * Copy feature groups in addFeaturesFrom, rather than moving them.
      
      * Fix bugs in FeatureGroup copy constructor and ensure source dataset remains usable
      
      * Add reserve to PushVector and PushOffset
      
      * Move definition of Clone into class body
      
      * Fix PR review issues
      
      * Fix for loop increment style.
      
      * Fix test failure
      
      * Some more docstring fixes.
      
      * Remove blank line
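
      A brief usage sketch of the Python-side add_features_from added here, on made-up data; both Datasets are constructed before the call, which the validation logic mentioned above appears to require:

      ```python
      import lightgbm as lgb
      import numpy as np

      # Two column-wise slices of the same 100 rows, e.g. written out separately.
      X_left = np.random.rand(100, 3)
      X_right = np.random.rand(100, 2)
      y = np.random.randint(0, 2, 100)

      d_left = lgb.Dataset(X_left, label=y).construct()
      d_right = lgb.Dataset(X_right).construct()

      # Pull the feature groups of d_right into d_left in memory; afterwards d_left
      # holds all five features without a numpy round-trip or extra files on disk.
      d_left.add_features_from(d_right)

      # dump_text (also added in this PR) can then be used to inspect the merged set.
      ```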
  18. 24 Feb, 2019 1 commit
  19. 21 Feb, 2019 1 commit
  20. 20 Feb, 2019 1 commit
  21. 18 Feb, 2019 3 commits
  22. 07 Feb, 2019 1 commit
  23. 06 Feb, 2019 1 commit
  24. 05 Feb, 2019 1 commit