  1. 04 Nov, 2021 1 commit
  2. 03 Nov, 2021 3 commits
  3. 29 Oct, 2021 1 commit
  4. 28 Oct, 2021 1 commit
  5. 27 Oct, 2021 1 commit
  6. 26 Oct, 2021 2 commits
  7. 21 Oct, 2021 1 commit
    • [Sampling] Implement dgl.compact_graphs() for the GPU (#3423) · a8c81018
      Xin Yao authored
      * gpu compact graph template
      
      * cuda compact graph draft
      
      * fix typo
      
      * compact graphs
      
      * pass unit test but fail in training
      
      * example using EdgeDataLoader on the GPU
      
      * refactor cuda_compact_graph and cuda_to_block
      
      * update training scripts
      
      * fix linting
      
      * fix linting
      
      * fix exclude_edges for the GPU
      
      * add --data-cpu & fix copyright
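The commit above adds a GPU implementation of dgl.compact_graphs(), which drops nodes that do not appear in any edge and relabels the remaining ones. A minimal pure-Python sketch of that compaction idea (illustrative only; the actual DGL/CUDA implementation handles heterogeneous graphs and shares the relabeling across multiple graphs):

```python
def compact_graph(edges):
    """Relabel nodes so that only nodes incident to at least one edge
    remain, with consecutive IDs -- the core idea behind
    dgl.compact_graphs (conceptual sketch, not the DGL API)."""
    # Collect the node IDs that actually occur in some edge.
    used = sorted({v for e in edges for v in e})
    # Map each surviving original ID to a compact new ID.
    remap = {old: new for new, old in enumerate(used)}
    new_edges = [(remap[u], remap[v]) for u, v in edges]
    # used[i] is the original ID of compacted node i (the induced node map).
    return new_edges, used
```

For example, compacting the edges [(0, 5), (5, 9)] keeps only nodes 0, 5, 9 and relabels them 0, 1, 2.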
  8. 19 Oct, 2021 1 commit
  9. 18 Oct, 2021 4 commits
  10. 15 Oct, 2021 2 commits
  11. 14 Oct, 2021 6 commits
  12. 12 Oct, 2021 2 commits
  13. 11 Oct, 2021 2 commits
  14. 07 Oct, 2021 1 commit
    • [Model] Refine GraphSAINT (#3328) · aef96dfa
      K authored
      * The start of Jiahang Li's experiments on GraphSAINT.
      
      * a nightly build
      
      * a nightly build
      
      Check the basic pipeline of the code. Next, check the details of the samplers, the GCN layer (forward propagation), and the loss (backward propagation).
      
      * a nightly build
      
      * Implement GraphSAINT with torch.dataloader
      
      There are still some bugs with sampling in the training procedure.
      
      * Test validity
      
      Validity is verified on the ppi_node experiments; the other setups are not yet tested.
      1. Online sampling on the ppi_node experiments performs perfectly.
      2. Sampling speed is a bit slow because of the operations in [dgl.subgraphs]; the next step is to improve this part by parallelizing the conversion.
      3. Figure out why the offline+online sampling method performs badly, which does not make sense.
      4. Run experiments on the other setups.
      
      * Implement saint with torch.dataloader
      
      Use torch.dataloader to speed up SAINT sampling in the experiments. Except for the experiments on the overly large Amazon dataset, we've ...
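The commit above wraps GraphSAINT's subgraph sampling in a torch dataloader. As a rough illustration of the node-sampler idea only (a uniform-sampling simplification in plain Python; GraphSAINT itself samples nodes with degree-dependent probabilities, and the repo's version builds DGL subgraphs, so all names here are hypothetical):

```python
import random

def saint_node_sample(edges, num_nodes, budget, seed=0):
    """Sample `budget` nodes uniformly at random and return the
    node-induced subgraph -- a simplified stand-in for GraphSAINT's
    node sampler (which weights nodes by degree)."""
    rng = random.Random(seed)
    sampled = set(rng.sample(range(num_nodes), budget))
    # Keep only edges whose both endpoints were sampled (induced subgraph).
    sub_edges = [(u, v) for u, v in edges if u in sampled and v in sampled]
    return sorted(sampled), sub_edges
```

In a training loop, a torch DataLoader would call a sampler like this once per minibatch, so that subgraph construction overlaps with GPU computation.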
  15. 30 Sep, 2021 1 commit
  16. 29 Sep, 2021 1 commit
  17. 28 Sep, 2021 1 commit
  18. 23 Sep, 2021 2 commits
  19. 22 Sep, 2021 1 commit
  20. 21 Sep, 2021 3 commits
  21. 20 Sep, 2021 1 commit
  22. 19 Sep, 2021 1 commit
  23. 17 Sep, 2021 1 commit