Commit 2d4eca23 authored by Frederic

Updated OpFromGraph docstring.

Parent 082fa2c2
@@ -5,19 +5,13 @@ from theano.gof import ops_with_inner_function
 class OpFromGraph(gof.Op):
-    """
-    This create an L{Op} from a list of input variables and a list of output
+    """This create an `Op` from a list of input variables and a list of output
     variables.
-    The signature is the same as the signature of L{FunctionFactory}
-    and/or function and the resulting L{Op}'s perform will do the same
-    operation as::
+    The signature is similar to theano.function() and the resulting
+    `Op` perform will do the same operation as::
       function(inputs, outputs, **kwargs)
-    Take note that the following options, if provided, must take the
-    value(s) listed below:
-      unpack_single = False
-      borrow_outputs = False
     OpFromGraph takes an additional input, grad_depth. If grad_depth
     is n, OpFromGraph will make special Ops for gradients up to the
@@ -32,6 +26,15 @@ class OpFromGraph(gof.Op):
     # op behaves like a normal theano op
     e2 = op(x, y, z) + op(z, y, x)
     fn = function([x, y, z], [e2])
+    TODO: - examples
+        - support shared var
+        - __hash__, __eq__ otherwise won't merge
+        - c_code() to remove the double overhead?
+        - move call to function to make_thunk().
+        - opt to unfold it, work inplace on inputs
+        - move grad stuff from __init__ to grad()
     """
     def __init__(self, inputs, outputs, grad_depth=1, **kwargs):
 ...
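The idea the docstring describes — packaging an expression graph, given as a list of input variables and a list of output variables, into a single reusable op — can be sketched without Theano itself. The following is a minimal, framework-free analogue: `Var`, `Apply`, and `OpFromGraphSketch` are illustrative names invented here, not Theano's actual API, and unlike the real `OpFromGraph` this sketch applies the op to concrete values rather than building a new symbolic node.

```python
class Var:
    """A symbolic placeholder in the expression graph."""
    def __init__(self, name):
        self.name = name

class Apply:
    """An expression node: a function applied to graph arguments."""
    def __init__(self, fn, args):
        self.fn = fn
        self.args = args

def evaluate(node, env):
    """Resolve a graph node against a mapping of Var -> concrete value."""
    if isinstance(node, Var):
        return env[node]
    if isinstance(node, Apply):
        return node.fn(*(evaluate(a, env) for a in node.args))
    return node  # a plain constant

class OpFromGraphSketch:
    """Turn (inputs, outputs) into a callable op, echoing
    function(inputs, outputs, **kwargs) from the docstring."""
    def __init__(self, inputs, outputs):
        self.inputs = inputs
        self.outputs = outputs

    def __call__(self, *values):
        env = dict(zip(self.inputs, values))
        return [evaluate(o, env) for o in self.outputs]

# Build a small graph, e = x + y * z, then wrap it as an op.
x, y, z = Var("x"), Var("y"), Var("z")
e = Apply(lambda a, b: a + b, [x, Apply(lambda a, b: a * b, [y, z])])
op = OpFromGraphSketch([x, y, z], [e])

# The op behaves like a normal function of its three inputs.
print(op(1.0, 2.0, 3.0))  # [7.0], i.e. 1 + 2*3
```

The real `OpFromGraph` additionally returns a symbolic variable when applied, so calls can be composed into larger graphs (as in `op(x, y, z) + op(z, y, x)` above); this sketch only captures the "one op from a whole subgraph" packaging step.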