Commit c2247cd1 authored by Nicholas Leonard

rerenamed hypograph -> subgraph_grad

Parent c1124b59
@@ -1564,7 +1564,7 @@ Gradient / Differentiation
:rtype: variable or list of variables (matching `wrt`)
:returns: gradients of the cost with respect to each of the `wrt` terms
-.. function:: hypograd(wrt, end, start=None, cost=None, details=False)
+.. function:: subgraph_grad(wrt, end, start=None, cost=None, details=False)
With respect to `wrt`, computes gradients of cost and/or from existing
`start` gradients, up to the `end` variables of a symbolic digraph.
@@ -1578,13 +1578,13 @@ Gradient / Differentiation
non-differentiable process could be approximated by user-defined
formula, which could be calculated using the gradients of a cost
with respect to samples (0s and 1s). These gradients are obtained
-by performing a hypograd from the `cost` or previously known gradients
+by performing a subgraph_grad from the `cost` or previously known gradients
(`start`) up to the outputs of the stochastic process (`end`).
A dictionary mapping gradients obtained from the user-defined
differentiation of the process, to variables, could then be fed into
-another hypograd as `start` with any other `cost` (e.g. weight decay).
+another subgraph_grad as `start` with any other `cost` (e.g. weight decay).
-In an MLP, we could use hypograd to iteratively backpropagate:
+In an MLP, we could use subgraph_grad to iteratively backpropagate:
>>> x, t = theano.tensor.fvector('x'), theano.tensor.fvector('t')
>>> w1 = theano.shared(np.random.randn(3,4))
>>> w2 = theano.shared(np.random.randn(4,2))
@@ -1601,7 +1601,7 @@ Gradient / Differentiation
>>> next_grad = None
>>> param_grads = []
>>> for i in xrange(2):
->>> param_grad, next_grad = theano.hypograd(
+>>> param_grad, next_grad = theano.subgraph_grad(
>>> wrt=params[i], end=grad_ends[i],
>>> start=next_grad, cost=costs[i]
>>> )
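The chain-rule split that the doctest's loop performs can be sketched in plain NumPy (a hypothetical illustration with the same shapes as the doctest above, not Theano code): stage 1 takes gradients of the cost down to the `end` variable, and stage 2 resumes from that `start` gradient.

```python
import numpy as np

# Hypothetical NumPy sketch of the gradient staging done by subgraph_grad.
# Shapes mirror the doctest above (x: 3, w1: 3x4, w2: 4x2).
rng = np.random.RandomState(0)
x = rng.randn(3)
t = rng.randn(2)
w1 = rng.randn(3, 4)
w2 = rng.randn(4, 2)

# Forward pass of the two-layer MLP.
h1 = np.tanh(x.dot(w1))          # first hidden layer (the `end` variable)
h2 = np.tanh(h1.dot(w2))         # output layer
cost = ((h2 - t) ** 2).sum()

# Stage 1: gradients of the cost w.r.t. w2, stopping at h1
# (analogous to subgraph_grad(wrt=[w2], end=[h1], cost=cost)).
d_z2 = 2.0 * (h2 - t) * (1.0 - h2 ** 2)   # backprop through tanh
g_w2 = np.outer(h1, d_z2)
g_h1 = w2.dot(d_z2)              # the "next_grad" handed to stage 2

# Stage 2: resume from the h1 gradient and continue down to w1
# (analogous to subgraph_grad(wrt=[w1], end=[x], start={h1: g_h1})).
d_z1 = g_h1 * (1.0 - h1 ** 2)
g_w1 = np.outer(x, d_z1)

# The staged result matches a finite-difference check on w1[0, 0].
eps = 1e-6
w1p = w1.copy()
w1p[0, 0] += eps
cost_p = ((np.tanh(np.tanh(x.dot(w1p)).dot(w2)) - t) ** 2).sum()
print(abs((cost_p - cost) / eps - g_w1[0, 0]) < 1e-4)  # prints True
```

The key point is that `g_h1` plays the role of the dictionary passed as `start`: stage 2 never needs to see the cost, only the gradient arriving at the subgraph boundary.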
......
@@ -79,7 +79,7 @@ from theano.updates import Updates, OrderedUpdates
#we don't import by default as we don't want to force having scipy installed.
#import sparse
-from theano.gradient import Rop, Lop, grad, hypograd
+from theano.gradient import Rop, Lop, grad, subgraph_grad
if config.device.startswith('gpu') or config.init_gpu_device.startswith('gpu'):
import theano.sandbox.cuda
......
@@ -544,7 +544,7 @@ def grad(cost, wrt, consider_constant=None,
rval, = rval
return rval
-def hypograd(wrt, end, start=None, cost=None, details=False):
+def subgraph_grad(wrt, end, start=None, cost=None, details=False):
'''
With respect to `wrt`, computes gradients of cost and/or from existing
`start` gradients, up to the `end` variables of a symbolic digraph.
@@ -558,11 +558,11 @@ def hypograd(wrt, end, start=None, cost=None, details=False):
non-differentiable process could be approximated by user-defined
formula, which could be calculated using the gradients of a cost
with respect to samples (0s and 1s). These gradients are obtained
-by performing a hypograd from the `cost` or previously known gradients
+by performing a subgraph_grad from the `cost` or previously known gradients
(`start`) up to the outputs of the stochastic process (`end`).
A dictionary mapping gradients obtained from the user-defined
differentiation of the process, to variables, could then be fed into
-another hypograd as `start` with any other `cost` (e.g. weight decay).
+another subgraph_grad as `start` with any other `cost` (e.g. weight decay).
:type wrt : List of Variables.
Gradients are computed with respect to `wrt`.
......
@@ -555,10 +555,10 @@ def test_disconnected_cost_grad():
return
raise AssertionError("A disconnected gradient has been ignored.")
-def test_hypograd():
+def test_subgraph_grad():
# Tests that the grad method with no known_grads
-# matches what happens if you use successive hypograds
+# matches what happens if you use successive subgraph_grads
x = theano.tensor.fvector('x')
t = theano.tensor.fvector('t')
@@ -588,7 +588,7 @@ def test_hypograd():
next_grad = None
param_grads = []
for i in xrange(2):
-    param_grad, next_grad = theano.hypograd(
+    param_grad, next_grad = theano.subgraph_grad(
wrt=params[i], end=grad_ends[i],
start=next_grad, cost=costs[i]
)
......