Commit b384c4e7 authored by Konstantin Lopuhin, committed by Soumith Chintala

Build GPU extensions whenever CUDA_HOME is available (#911)

This follows the same logic and motivation as
https://github.com/pytorch/pytorch/pull/8244/:
build a Docker image (on a machine with no GPU available at build time),
then run it with nvidia-docker.
parent 37bd11df
...
@@ -88,7 +88,7 @@ def get_extensions():
     define_macros = []
-    if torch.cuda.is_available() and CUDA_HOME is not None:
+    if CUDA_HOME is not None:
         extension = CUDAExtension
         sources += source_cuda
         define_macros += [('WITH_CUDA', None)]
...
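For context, here is a minimal sketch of the resulting pattern in a setup.py-style build script. The directory layout (`csrc/`), the extension name `my_package._C`, and the variable names are illustrative assumptions, not taken from this repository; only the `CUDA_HOME` gating mirrors the change above.

```python
import glob
import os

from torch.utils.cpp_extension import CppExtension, CUDAExtension, CUDA_HOME


def get_extensions():
    # Hypothetical source layout: C++ sources in csrc/, CUDA sources in csrc/cuda/.
    this_dir = os.path.dirname(os.path.abspath(__file__))
    extensions_dir = os.path.join(this_dir, 'csrc')

    sources = glob.glob(os.path.join(extensions_dir, '*.cpp'))
    source_cuda = glob.glob(os.path.join(extensions_dir, 'cuda', '*.cu'))

    extension = CppExtension
    define_macros = []

    # Key change: gate on CUDA_HOME only, so the CUDA sources are compiled
    # whenever the CUDA toolkit is installed, even if no GPU is visible at
    # build time (e.g. inside `docker build`).
    if CUDA_HOME is not None:
        extension = CUDAExtension
        sources += source_cuda
        define_macros += [('WITH_CUDA', None)]

    return [
        extension(
            'my_package._C',  # hypothetical extension name
            sources,
            include_dirs=[extensions_dir],
            define_macros=define_macros,
        )
    ]
```

Gating on `CUDA_HOME` rather than `torch.cuda.is_available()` is what makes the Docker workflow work: the CUDA toolkit is present in the image during `docker build`, but no GPU device is exposed until the container is later run with nvidia-docker.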