Commit 344f5adb authored by Ian Goodfellow

moved convolution gradient to the new interface

Parent 2040b54a
@@ -801,10 +801,9 @@ class ConvOp(OpenMPOp):
         # mimic what happens inside theano.grad: get the input gradient
         # of the final cost wrt all variables involved.
-        tmp_gmap = theano.gradient.grad_sources_inputs(
-            [(node, gz)], [inputs, kerns])
-        return [tmp_gmap[inputs], tmp_gmap[kerns]]
+        return theano.gradient.grad(cost=None,
+                known_grads={node: gz}, wrt=[inputs, kerns])
         if self.dx not in (1, 2) or self.dy not in (1, 2):
             raise NotImplementedError(
......
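The new interface starts backpropagation from a precomputed upstream gradient (`known_grads={node: gz}`) instead of a scalar cost. A minimal pure-Python sketch of that idea, using a hypothetical scalar stand-in for the convolution rather than Theano itself:

```python
# Sketch of the known_grads idea: begin the chain rule from a
# precomputed upstream gradient gz = d(cost)/d(node), with no cost given.
def grad_from_known(gz, inputs, kerns):
    # Suppose node = inputs * kerns (a scalar stand-in for the convolution).
    # Then d(node)/d(inputs) = kerns and d(node)/d(kerns) = inputs,
    # so the chain rule gives the gradients wrt both variables:
    d_inputs = gz * kerns
    d_kerns = gz * inputs
    return d_inputs, d_kerns

print(grad_from_known(2.0, 3.0, 4.0))  # (8.0, 6.0)
```

This mirrors what `theano.gradient.grad(cost=None, known_grads=..., wrt=...)` does symbolically: the old `grad_sources_inputs` helper mimicked this inside the op, while the new call delegates it to the library.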