Commit e67f96cd authored by Nicolas Ballas

Small fix according to the pull request comments

parent 69bfdf7a
@@ -36,6 +36,11 @@ TODO: Give examples for how to use these things! They are pretty complicated.
     that it requires CUDA >= 5.0, scikits.cuda >= 0.5.0 and PyCUDA to run.
 - :func:`conv3d_fft <theano.sandbox.cuda.fftconv.conv3d_fft>`
     This is the same as conv2d_fft but with 3d data instead.
+    You can enable it by setting THEANO_FLAGS to
+    'optimizer_including=conv3d_fft:convgrad3d_fft:convtransp3d_fft'
+    in your environement. This is not enabled by default because it
+    has some restrictions on input and uses more memory. Also note
+    that it requires CUDA >= 5.0, scikits.cuda >= 0.5.0 and PyCUDA to run.
 - :func:`conv3D <theano.tensor.nnet.Conv3D.conv3D>`. Doesn't work on the GPU.
 - :func:`conv3d2d <theano.tensor.nnet.conv3d2d.conv3d>`
     Another conv3d implementation that uses the conv2d with data reshaping.
......
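The flag added in the hunk above is read from the environment before Theano is imported. A minimal sketch of enabling it from a Python script (setting `os.environ` before the import is equivalent to exporting THEANO_FLAGS in the shell; the flag value is copied from the diff, everything else here is illustrative):

```python
import os

# Must be set before `import theano`: Theano reads THEANO_FLAGS at import time.
# The flag value comes verbatim from the documentation change above; actually
# running the FFT path also needs CUDA >= 5.0, scikits.cuda >= 0.5.0 and PyCUDA.
os.environ["THEANO_FLAGS"] = (
    "optimizer_including=conv3d_fft:convgrad3d_fft:convtransp3d_fft"
)

# import theano  # would now pick up the conv3d_fft optimizers
```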
@@ -80,9 +80,6 @@ def register_opt(*tags, **kwargs):
         return local_opt
     return f
 # register local_track_shape_i at this level too
 # to make multi-level lift of shape work.
 register_opt()(theano.tensor.opt.local_track_shape_i)
@@ -1121,6 +1118,7 @@ def local_gpu_softmax_with_bias(node):
 #### Convolution, maxpooling
 from theano.tensor.nnet import conv
 @register_opt()
 @local_optimizer([gpu_from_host, conv.ConvOp])
 def local_gpu_conv(node):
......