# NerfAcc
[![Core Tests.](https://github.com/KAIR-BAIR/nerfacc/actions/workflows/code_checks.yml/badge.svg)](https://github.com/KAIR-BAIR/nerfacc/actions/workflows/code_checks.yml)
[![Documentation Status](https://readthedocs.com/projects/plenoptix-nerfacc/badge/?version=latest)](https://www.nerfacc.com/en/latest/?badge=latest)
[![Downloads](https://pepy.tech/badge/nerfacc)](https://pepy.tech/project/nerfacc)

https://www.nerfacc.com/

NerfAcc is a PyTorch NeRF acceleration toolbox for both training and inference. It focuses on
efficient volumetric rendering of radiance fields, which is universal and plug-and-play for most NeRFs.

Using NerfAcc, 

- The `vanilla NeRF` model with 8-layer MLPs can be trained to *better quality* (+~0.5 PSNR)
  in *1 hour* rather than *days* as in the paper.
- The `Instant-NGP NeRF` model can be trained to *equal quality* in *4.5 minutes*,
  compared to the official pure-CUDA implementation.
- The `D-NeRF` model for *dynamic* objects can also be trained in *1 hour*
  rather than *2 days* as in the paper, and with *better quality* (+~2.5 PSNR).
- Both *bounded* and *unbounded* scenes are supported.

**And it is a pure Python interface with flexible APIs!**

## Installation

```bash
pip install nerfacc
```
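
Optionally, a quick sanity check (illustrative only, not part of NerfAcc itself) is to import the package and print the installed version:

``` python
# Hypothetical post-install check: import nerfacc and report its version.
from importlib.metadata import version

import nerfacc  # raises ImportError if the installation failed

print("nerfacc version:", version("nerfacc"))
```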

## Usage

The idea of NerfAcc is to perform efficient ray marching and volumetric rendering, so NerfAcc can work with any user-defined radiance field. To plug the NerfAcc rendering pipeline into your code and enjoy the acceleration, you only need to define two functions for your radiance field:
- `sigma_fn`: Compute density at each sample. It will be used by `nerfacc.ray_marching()` to skip the empty and occluded space during ray marching, which is where the major speedup comes from. 
- `rgb_sigma_fn`: Compute color and density at each sample. It will be used by `nerfacc.rendering()` to conduct differentiable volumetric rendering. This function will receive gradients to update your network.

A simple example looks like this:

``` python
from typing import Tuple

import torch
import torch.nn.functional as F
from torch import Tensor

import nerfacc

radiance_field = ...  # network: a NeRF model
rays_o: Tensor = ...  # ray origins. (n_rays, 3)
rays_d: Tensor = ...  # ray normalized directions. (n_rays, 3)
optimizer = ...  # optimizer

def sigma_fn(
    t_starts: Tensor, t_ends: Tensor, ray_indices: Tensor
) -> Tensor:
    """ Query density values from a user-defined radiance field.
    :params t_starts: Start of the sample interval along the ray. (n_samples, 1).
    :params t_ends: End of the sample interval along the ray. (n_samples, 1).
    :params ray_indices: Ray indices that each sample belongs to. (n_samples,).
    :returns The post-activation density values. (n_samples, 1).
    """
    t_origins = rays_o[ray_indices]  # (n_samples, 3)
    t_dirs = rays_d[ray_indices]  # (n_samples, 3)
    positions = t_origins + t_dirs * (t_starts + t_ends) / 2.0
    sigmas = radiance_field.query_density(positions) 
    return sigmas  # (n_samples, 1)

def rgb_sigma_fn(
    t_starts: Tensor, t_ends: Tensor, ray_indices: Tensor
) -> Tuple[Tensor, Tensor]:
    """ Query rgb and density values from a user-defined radiance field.
    :params t_starts: Start of the sample interval along the ray. (n_samples, 1).
    :params t_ends: End of the sample interval along the ray. (n_samples, 1).
    :params ray_indices: Ray indices that each sample belongs to. (n_samples,).
    :returns The post-activation rgb and density values. 
        (n_samples, 3), (n_samples, 1).
    """
    t_origins = rays_o[ray_indices]  # (n_samples, 3)
    t_dirs = rays_d[ray_indices]  # (n_samples, 3)
    positions = t_origins + t_dirs * (t_starts + t_ends) / 2.0
    rgbs, sigmas = radiance_field(positions, condition=t_dirs)  
    return rgbs, sigmas  # (n_samples, 3), (n_samples, 1)

# Efficient Raymarching: Skip empty and occluded space, pack samples from all rays.
# packed_info: (n_rays, 2). t_starts: (n_samples, 1). t_ends: (n_samples, 1).
with torch.no_grad():
    packed_info, t_starts, t_ends = nerfacc.ray_marching(
        rays_o, rays_d, sigma_fn=sigma_fn, near_plane=0.2, far_plane=1.0, 
        early_stop_eps=1e-4, alpha_thre=1e-2, 
    )

# Differentiable Volumetric Rendering.
# colors: (n_rays, 3). opacity: (n_rays, 1). depth: (n_rays, 1).
color, opacity, depth = nerfacc.rendering(rgb_sigma_fn, packed_info, t_starts, t_ends)

# Optimize: Both the network and rays will receive gradients
optimizer.zero_grad()
loss = F.mse_loss(color, color_gt)
loss.backward()
optimizer.step()
```
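
At inference time, the same two query functions and the same `nerfacc.ray_marching()` / `nerfacc.rendering()` calls can be reused. The sketch below is illustrative (not part of the official examples) and assumes the same `radiance_field`, `rays_o`, and `rays_d` as above; it renders test rays in chunks to keep peak GPU memory bounded.

``` python
import torch
import nerfacc

chunk = 8192  # rays per chunk; tune to your GPU memory
colors = []
with torch.no_grad():
    for i in range(0, rays_o.shape[0], chunk):
        o, d = rays_o[i : i + chunk], rays_d[i : i + chunk]

        # Re-bind the query functions to the current chunk, since
        # `ray_indices` refers to the rays passed to `ray_marching()`.
        def sigma_fn(t_starts, t_ends, ray_indices):
            positions = o[ray_indices] + d[ray_indices] * (t_starts + t_ends) / 2.0
            return radiance_field.query_density(positions)

        def rgb_sigma_fn(t_starts, t_ends, ray_indices):
            t_dirs = d[ray_indices]
            positions = o[ray_indices] + t_dirs * (t_starts + t_ends) / 2.0
            return radiance_field(positions, condition=t_dirs)

        packed_info, t_starts, t_ends = nerfacc.ray_marching(
            o, d, sigma_fn=sigma_fn, near_plane=0.2, far_plane=1.0,
            early_stop_eps=1e-4, alpha_thre=1e-2,
        )
        color, opacity, depth = nerfacc.rendering(
            rgb_sigma_fn, packed_info, t_starts, t_ends
        )
        colors.append(color)

colors = torch.cat(colors, dim=0)  # (n_rays, 3)
```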

## Examples

Before running these example scripts, please check each script to see which dataset it needs, and download
the dataset first.

``` bash
# Instant-NGP NeRF in 4.5 minutes with reproduced performance!
# See results here: https://www.nerfacc.com/en/latest/examples/ngp.html
python examples/train_ngp_nerf.py --train_split train --scene lego
```

``` bash
# Vanilla MLP NeRF in 1 hour with better performance!
# See results here: https://www.nerfacc.com/en/latest/examples/vanilla.html
python examples/train_mlp_nerf.py --train_split train --scene lego
```

```bash
# D-NeRF for Dynamic objects in 1 hour with better performance!
# See results here: https://www.nerfacc.com/en/latest/examples/dnerf.html
python examples/train_mlp_dnerf.py --train_split train --scene lego
```

```bash
# Instant-NGP on unbounded scenes in 20 minutes!
# See results here: https://www.nerfacc.com/en/latest/examples/unbounded.html
python examples/train_ngp_nerf.py --train_split train --scene garden --auto_aabb --unbounded --cone_angle=0.004
```


## Citation

```bibtex
@article{li2022nerfacc,
  title={NerfAcc: A General NeRF Acceleration Toolbox.},
  author={Li, Ruilong and Tancik, Matthew and Kanazawa, Angjoo},
  journal={arXiv preprint arXiv:2210.04847},
  year={2022}
}
```