Commit e6fe58c3 authored by Olivier Breuleux

fixed README and gradient doc

Parent 0ebe2132
 THEANO
 Documentation et al is in Trac:
-http://lgcm:8000/testenv/wiki/WikiStart
+http://lgcm.iro.umontreal.ca:8000/theano/wiki/WikiStart
 The lisa twiki is deprecated for documenting Theano.
...
@@ -27,27 +27,14 @@ def grad_sources_inputs(sources, graph_inputs):
     calling L{Op.grad}(...) when it is provided by an L{Op}, and at least one of the
     outputs of the L{Op} has an associated gradient.
-    The L{Op.grad}(...) functions may be called in several ways (for the
-    convenience of the L{Op} implementer) depending on the number of inputs and
-    outputs.
-    If there is one input and one output::
+    The L{Op.grad}(...) functions are called as such:
         op.grad(op.inputs[0], grad(op.outputs[0]))
-    If there are several inputs and one output::
-        op.grad(op.inputs, grad(op.outputs[0]))
-    If there is one input and several outputs::
-        op.grad(op.inputs[0], [grad(o) for o in op.outputs])
-    If there are multiple inputs and outputs::
-        op.grad(op.inputs, [grad(o) for o in op.outputs])
     This function expects the L{Op.grad}(...) function to return the gradient
-    expression [results] associated with the inputs of the L{Op}. If the L{Op} has a
-    single input, it should return a single result; if the L{Op} has multiple
-    inputs, it should return a list of results corresponding to the gradients in
-    the same order as the inputs.
+    expression [results] associated with the inputs of the L{Op}. The L{Op} should
+    return a list of results corresponding to the gradients in the same order
+    as the inputs. If it has a single input it should return a list or tuple
+    of length 1.
     For each input wrt to which an L{Op} is not differentiable, it should return
     None instead of a result instance.
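The return-value contract described in this hunk's new docstring can be sketched in plain Python. This is not real Theano code; the `Mul` and `Floor` classes are hypothetical stand-ins for an `L{Op}`, used only to show the shape of what `grad` receives and must return:

```python
# A minimal sketch (plain Python, not the real Theano classes) of the grad
# contract described above: L{Op.grad} receives the op's inputs and the
# gradients of its outputs, and returns a list of input gradients in the
# same order as the inputs, with None for non-differentiable inputs.

class Mul:
    """Hypothetical op computing x * y."""
    def grad(self, inputs, output_grads):
        x, y = inputs
        (g,) = output_grads       # single output -> one output gradient
        return [y * g, x * g]     # d(xy)/dx = y, d(xy)/dy = x

class Floor:
    """Hypothetical op; not differentiable wrt its input."""
    def grad(self, inputs, output_grads):
        return [None]             # None marks a non-differentiable input

assert Mul().grad([2.0, 5.0], [1.0]) == [5.0, 2.0]
assert Floor().grad([2.7], [1.0]) == [None]
```

Even the single-input `Floor` returns a length-1 list rather than a bare result, matching the convention the commit standardizes on.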
@@ -78,9 +65,6 @@ def grad_sources_inputs(sources, graph_inputs):
         #if all output gradients are None, continue
         if all(map(lambda x:x is None, g_outputs)): continue
-        # output_arg = _unpack_result(g_outputs)
-        # input_arg = _unpack_result(op.inputs)
         output_arg = g_outputs
         input_arg = op.inputs
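The guard kept in this hunk skips an op entirely when none of its outputs received a gradient, since there is then nothing to backpropagate. A small standalone sketch of the same check (the `map`/`lambda` form in the diff is equivalent to a generator expression):

```python
# Sketch of the "all output gradients are None" guard: an op whose outputs
# received no gradient contributes nothing and is skipped.
def should_skip(g_outputs):
    return all(g is None for g in g_outputs)

assert should_skip([None, None]) is True
assert should_skip([None, 1.0]) is False
```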
@@ -90,8 +74,6 @@ def grad_sources_inputs(sources, graph_inputs):
         except AttributeError:
             dinputs = []
-        # input_arg = [input in dinputs and input.copy() or input for input in input_arg]
         new_input_arg = []
         for input in input_arg:
             if input in dinputs:
...
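The loop this hunk keeps (in place of the deleted `and`/`or` one-liner) copies any input that the op destroys in place, so the gradient computation sees the original values. A standalone sketch, assuming the behavior suggested by the variable names in the diff and using lists as stand-ins for graph results:

```python
# Sketch of the copy-before-grad loop: inputs the op destroys in place
# (collected in `dinputs`) are replaced by copies; untouched inputs are
# passed through as-is. Names follow the diff; the list values are
# hypothetical stand-ins for graph results.
a, b = [1.0, 2.0], [3.0, 4.0]
dinputs = [a]                 # inputs the op destroys
input_arg = [a, b]

new_input_arg = []
for inp in input_arg:
    if inp in dinputs:
        new_input_arg.append(inp.copy())   # protect destroyed inputs
    else:
        new_input_arg.append(inp)

assert new_input_arg[0] == a and new_input_arg[0] is not a  # copied
assert new_input_arg[1] is b                                # untouched
```

The explicit loop also avoids the classic pitfall of the deleted `x and y or z` idiom, which silently misbehaves when `y` is falsy.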