Commit 25d39342 authored by liyinhao

change copyright, change README, add README in configs

parent 5070ebff
# Deep Hough Voting for 3D Object Detection in Point Clouds
## Introduction
```
@inproceedings{qi2019deep,
author = {Qi, Charles R and Litany, Or and He, Kaiming and Guibas, Leonidas J},
title = {Deep Hough Voting for 3D Object Detection in Point Clouds},
booktitle = {Proceedings of the IEEE International Conference on Computer Vision},
year = {2019}
}
```
## Results and models
| Backbone | Style | Lr schd | Mem (GB) | Inf time (fps) | box AP | Download |
| :------: | :---: | :-----: | :------: | :------------: | :----: | :------: |
## Different regression loss
| Backbone | Loss type | Mem (GB) | Inf time (fps) | box AP | Download |
| :------: | :-------: | :------: | :------------: | :----: | :------: |
## Pre-trained Models
| Backbone | Style | Lr schd | Mem (GB) | Inf time (fps) | box AP | Download |
| :------: | :---: | :-----: | :------: | :------------: | :----: | :------: |
@@ -3,10 +3,14 @@ We follow the procedure in [votenet](https://github.com/facebookresearch/votenet).
1. Download ScanNet v2 data [HERE](https://github.com/ScanNet/ScanNet). Link or move the 'scans' folder to this level of directory.
2. In this directory, extract point clouds and annotations by running `python batch_load_scannet_data.py`.
3. Enter the project root directory and generate training data by running
```bash
python tools/create_data.py scannet --root-path ./data/scannet --out-dir ./data/scannet --extra-tag scannet
```
The directory structure after pre-processing should be as below (a small sanity check for the generated files is sketched after the tree):
```
scannet
├── scannet_utils.py
...
```
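After running `create_data.py`, a quick way to confirm the generated annotation files are loadable is sketched below. The info file names (`scannet_infos_train.pkl`, `scannet_infos_val.pkl`) are assumptions derived from the `--extra-tag scannet` argument; adjust them to whatever the script actually writes.

```python
# Hedged sanity check: confirm the preprocessed info files exist and load.
# The file names below are assumptions derived from `--extra-tag scannet`.
import os
import pickle

data_root = './data/scannet'
for split in ('train', 'val'):
    path = os.path.join(data_root, f'scannet_infos_{split}.pkl')
    if not os.path.isfile(path):
        print(f'missing: {path}')
        continue
    with open(path, 'rb') as f:
        infos = pickle.load(f)
    print(f'{path}: {len(infos)} samples')
```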
# Modified from
# https://github.com/facebookresearch/votenet/blob/master/scannet/batch_load_scannet_data.py
# Copyright (c) Facebook, Inc. and its affiliates.
#
# This source code is licensed under the MIT license found in the
# LICENSE file in the root directory of this source tree.
"""Batch mode in loading Scannet scenes with vertices and ground truth labels """Batch mode in loading Scannet scenes with vertices and ground truth labels
for semantic and instance segmentations for semantic and instance segmentations
Usage example: python ./batch_load_scannet_data.py Usage example: python ./batch_load_scannet_data.py
""" """
import argparse import argparse
import datetime import datetime
import os import os
......
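The docstring above only hints at what "batch mode" means. A minimal sketch of such a driver loop is shown below; `export_one_scan` is a placeholder for the per-scene exporter the real script defines, and the folder names are assumptions.

```python
# Hedged sketch of a batch-export driver loop over all downloaded scans.
import os

SCANNET_DIR = 'scans'  # assumed location of the linked 'scans' folder
OUTPUT_DIR = 'scannet_train_detection_data'  # assumed output folder name


def export_one_scan(scan_name, output_prefix):
    """Placeholder for the per-scene exporter; the real script writes
    vertices and per-point labels as .npy files under output_prefix."""
    pass


def batch_export():
    os.makedirs(OUTPUT_DIR, exist_ok=True)
    for scan_name in sorted(os.listdir(SCANNET_DIR)):
        prefix = os.path.join(OUTPUT_DIR, scan_name)
        if os.path.isfile(prefix + '_vert.npy'):
            continue  # scene already exported, skip it
        export_one_scan(scan_name, prefix)


if __name__ == '__main__':
    batch_export()
```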
# Modified from
# https://github.com/facebookresearch/votenet/blob/master/scannet/load_scannet_data.py
# Copyright (c) Facebook, Inc. and its affiliates.
#
# This source code is licensed under the MIT license found in the
# LICENSE file in the root directory of this source tree.
"""Load Scannet scenes with vertices and ground truth labels """Load Scannet scenes with vertices and ground truth labels
for semantic and instance segmentations for semantic and instance segmentations
""" """
import argparse import argparse
import inspect import inspect
import json import json
......
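The `json` import is used to parse ScanNet's per-scene `*.aggregation.json` files, which group mesh segments into object instances. A minimal sketch of recovering an object-id-to-label mapping from that file, following the public ScanNet format:

```python
# ScanNet's *.aggregation.json stores a 'segGroups' list, where each entry
# has an 'objectId', a raw category 'label', and its member 'segments'.
import json


def read_aggregation(filename):
    with open(filename) as f:
        data = json.load(f)
    object_id_to_label = {}
    for group in data['segGroups']:
        object_id_to_label[group['objectId']] = group['label']
    return object_id_to_label
```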
# Modified from
# https://github.com/facebookresearch/votenet/blob/master/scannet/scannet_utils.py
# Copyright (c) Facebook, Inc. and its affiliates.
#
# This source code is licensed under the MIT license found in the
# LICENSE file in the root directory of this source tree.
"""Ref: https://github.com/ScanNet/ScanNet/blob/master/BenchmarkScripts """Ref: https://github.com/ScanNet/ScanNet/blob/master/BenchmarkScripts
""" """
import csv import csv
import os import os
......
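The `csv` import points at one of this utility module's main jobs: reading the `scannetv2-labels.combined.tsv` label mapping that ships with the ScanNet benchmark scripts. A hedged sketch follows; the column names are taken from that file, but the real helper may differ in detail.

```python
# Map raw ScanNet category strings to NYU40 class ids using the
# tab-separated label mapping from the ScanNet benchmark scripts.
import csv


def read_label_mapping(filename, label_from='raw_category', label_to='nyu40id'):
    mapping = {}
    with open(filename) as f:
        reader = csv.DictReader(f, delimiter='\t')
        for row in reader:
            mapping[row[label_from]] = int(row[label_to])
    return mapping
```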
@@ -5,14 +5,17 @@ We follow the procedure in [votenet](https://github.com/facebookresearch/votenet).
2. Enter the `matlab` folder and extract point clouds and annotations by running `extract_split.m`, `extract_rgbd_data_v2.m` and `extract_rgbd_data_v1.m`.
3. Go back to this directory and prepare data by running `python sunrgbd_data.py --gen_v1_data`.
4. Enter the project root directory and generate training data by running
```bash
python tools/create_data.py sunrgbd --root-path ./data/sunrgbd --out-dir ./data/sunrgbd --extra-tag sunrgbd
```
NOTE: SUNRGBDtoolbox.zip should have MD5 hash `18d22e1761d36352f37232cba102f91f` (you can check the hash with `md5 SUNRGBDtoolbox.zip` on macOS or `md5sum SUNRGBDtoolbox.zip` on Linux).
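For a platform-independent check, the same hash can be computed with the Python standard library:

```python
# Compute the MD5 of the toolbox archive in chunks to keep memory use low.
import hashlib


def md5_of(path, chunk_size=1 << 20):
    h = hashlib.md5()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(chunk_size), b''):
            h.update(chunk)
    return h.hexdigest()


assert md5_of('SUNRGBDtoolbox.zip') == '18d22e1761d36352f37232cba102f91f'
```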
The directory structure after pre-processing should be as below:
```
sunrgbd
├── sunrgbd_utils.py
...
```
% Modified from
% https://github.com/facebookresearch/votenet/blob/master/sunrgbd/matlab/extract_split.m
% Copyright (c) Facebook, Inc. and its affiliates.
%
% This source code is licensed under the MIT license found in the
% LICENSE file in the root directory of this source tree.
%% Dump train/val split.
% Author: Charles R. Qi
...
# Modified from
# https://github.com/facebookresearch/votenet/blob/master/sunrgbd/sunrgbd_data.py
# Copyright (c) Facebook, Inc. and its affiliates.
#
# This source code is licensed under the MIT license found in the
# LICENSE file in the root directory of this source tree.
''' Helper class and functions for loading SUN RGB-D objects
Author: Charles R. Qi
@@ -10,6 +13,7 @@ Note: removed unused code for frustum preparation.
Changed the way of data visualization (removed dependency on mayavi).
Load depth with scipy.io
'''
import argparse
import os
import sys
...
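"Load depth with scipy.io" refers to reading depth point clouds stored as MATLAB `.mat` files. A minimal sketch is shown below; the `'instance'` variable name inside the `.mat` file is an assumption.

```python
# Hedged sketch: load a SUN RGB-D depth point cloud from a MATLAB .mat file.
# The 'instance' key is an assumption about the variable name in the file.
import numpy as np
import scipy.io


def load_depth_points(mat_path):
    depth = scipy.io.loadmat(mat_path)['instance']  # assumed variable name
    return np.asarray(depth, dtype=np.float32)
```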
# Modified from
# https://github.com/facebookresearch/votenet/blob/master/sunrgbd/sunrgbd_utils.py
# Copyright (c) Facebook, Inc. and its affiliates.
#
# This source code is licensed under the MIT license found in the
# LICENSE file in the root directory of this source tree.
"""Provides Python helper function to read My SUNRGBD dataset. """Provides Python helper function to read My SUNRGBD dataset.
Author: Charles R. Qi Author: Charles R. Qi
......