Commit 0bde7a3c authored by Frederic

Updated class doc of OpFromGraph

Parent a576be90
...@@ -6,24 +6,16 @@ from theano.gof import ops_with_inner_function
class OpFromGraph(gof.Op):
    """This creates an `Op` from lists of input and output variables.

    The signature is similar to theano.function() and the resulting
    `Op`'s perform will do the same operation as::

        orig_function(inputs, outputs, **kwargs)

    TODO:
        - examples for a multi-layer mlp. where?
        - __hash__, __eq__, otherwise ops won't merge; try
          gof.opt.is_same_graph_with_merge(op1.new_outputs, op2.new_outputs)
        - c_code() to remove the double overhead?
        - opt to unfold it, work inplace on inputs
        - grad(): make it support DisconnectedType and the new interface
...@@ -31,10 +23,42 @@ class OpFromGraph(gof.Op):
        - add test with constant as input or inside the inner graph.
        - Add support for the GPU? Probably just need an opt to remove
          the transfers.
        - Add support to pickle this Op.
        - Add support/test with a random generator.

    :note:
        - We support shared variables in the inner graph. This is automatic
          and invisible to the user. They can be inputs to the node or live
          in the inner graph.
        - We support unused inputs. This is needed for the grad.
    Example 1:

    .. code-block:: python

        from theano import function, OpFromGraph, tensor
        x, y, z = tensor.scalars('xyz')
        e = x + y * z
        op = OpFromGraph([x, y, z], [e])
        # op behaves like a normal theano op
        e2 = op(x, y, z) + op(z, y, x)
        fn = function([x, y, z], [e2])
    Example 2, with a shared variable:

    .. code-block:: python

        import numpy
        import theano
        from theano import config, function, OpFromGraph, tensor
        x, y, z = tensor.scalars('xyz')
        s = theano.shared(numpy.random.rand(2, 2).astype(config.floatX))
        e = x + y * z + s
        op = OpFromGraph([x, y, z], [e])
        # op behaves like a normal theano op
        e2 = op(x, y, z) + op(z, y, x)
        fn = function([x, y, z], [e2])
""" """
    def __init__(self, inputs, outputs, **kwargs):
......
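The core idea of the updated docstring is that a subgraph, once wrapped, behaves like a single reusable op. A minimal pure-Python sketch of that idea (no Theano required; `FuncFromGraph` is a hypothetical stand-in for illustration, not part of Theano's API):

```python
# Hypothetical sketch: wrap a "graph" (here just a plain function of its
# inputs) so it can be reused as a single callable op, mirroring how
# OpFromGraph([x, y, z], [e]) yields an op callable on new inputs.
class FuncFromGraph:
    def __init__(self, n_inputs, fn):
        self.n_inputs = n_inputs
        self.fn = fn

    def __call__(self, *args):
        # Like a Theano op, it takes a fixed number of inputs.
        assert len(args) == self.n_inputs
        return self.fn(*args)

# Analogue of op = OpFromGraph([x, y, z], [e]) with e = x + y * z:
op = FuncFromGraph(3, lambda x, y, z: x + y * z)
# Analogue of e2 = op(x, y, z) + op(z, y, x), evaluated on concrete numbers:
e2 = op(1, 2, 3) + op(3, 2, 1)  # (1 + 2*3) + (3 + 2*1) = 12
```

This only illustrates the reuse pattern; the real OpFromGraph additionally compiles the inner graph and participates in Theano's graph optimizations.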