Commit b0256d0f authored by Yang Zhang, committed by Gokul

Add grad_scale op

Add a new grad_scale op to scale or inverse gradient in backpropagation.
Parent 5b7edbde
...@@ -2062,3 +2062,34 @@ def grad_clip(x, lower_bound, upper_bound):
    """
    return GradClip(lower_bound, upper_bound)(x)
class GradScale(ViewOp):
    def __init__(self, multiplier):
        self.multiplier = multiplier

    def grad(self, args, g_outs):
        return [self.multiplier * g_out for g_out in g_outs]
def grad_scale(x, multiplier):
    """
    This op scales or reverses the gradient during backpropagation.

    :param x: the variable whose gradient we want to scale
    :param multiplier: scale factor applied to the gradient

    :examples:

        x = theano.tensor.fscalar()
        fx = theano.tensor.sin(x)
        fp = theano.tensor.grad(fx, wrt=x)
        fprime = theano.function([x], fp)
        print(fprime(2))  # -0.416
        f_inverse = grad_scale(fx, -1.)
        fpp = theano.tensor.grad(f_inverse, wrt=x)
        fpprime = theano.function([x], fpp)
        print(fpprime(2))  # 0.416
    """
    return GradScale(multiplier)(x)
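The numbers in the docstring example can be sanity-checked without Theano. Since `grad_scale` simply multiplies the incoming gradient by `multiplier` in the backward pass, the derivative of `sin` at `x = 2` (which is `cos(2)`) scaled by `-1.` should just flip sign; a minimal sketch in plain Python:

```python
import math

# d/dx sin(x) = cos(x); at x = 2 this is about -0.416
base_grad = math.cos(2.0)
print(round(base_grad, 3))    # -0.416

# grad_scale(fx, -1.) multiplies the gradient by -1 in backprop,
# reversing its sign without changing the forward value.
scaled_grad = -1.0 * base_grad
print(round(scaled_grad, 3))  # 0.416
```

A negative multiplier like this is the gradient-reversal pattern; a fractional multiplier instead damps the gradient flowing into `x` without touching the forward computation.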