Commit bd24526f authored by Olivier Delalleau

Typo fixes in doc

Parent 835c39f4
@@ -4,23 +4,23 @@
 OpFromGraph
 ===========
-This page descripbe :class:`theano.OpFromGraph
-<theano.compile.builders.OpFromGraph>`. an Op that allow to
+This page describes :class:`theano.OpFromGraph
+<theano.compile.builders.OpFromGraph>`, an Op that allows to
 encapsulate a Theano graph in an op.
 This can be used to encapsulate some functionality in one block. It is
-useful to scale Theano compilation for regular bigger graph when we
+useful to scale Theano compilation for regular bigger graphs when we
 reuse that encapsulated fonctionality with different inputs many
 times. Due to this encapsulation, it can make Theano compilation phase
-faster for graph with many nodes.
+faster for graphs with many nodes.
-Using this for small graph isn't recommanded as it disable
-optimization between what is inside the encapsulation and outside it.
+Using this for small graphs is not recommended as it disables
+optimizations between what is inside the encapsulation and outside of it.
 .. note:
-This wasn't used widely up to now. If you have any
-questions/comments don't contact us on the mailing list.
+This was not used widely up to now. If you have any
+questions/comments do not hesitate to contact us on the mailing list.
@@ -6,10 +6,10 @@ from theano.gof import ops_with_inner_function
 class OpFromGraph(gof.Op):
-    """This create an `Op` from inputs and outputs list of variables.
+    """This creates an `Op` from inputs and outputs lists of variables.
     The signature is similar to theano.function() and the resulting
-    `Op` perform will do the same operation as::
+    `Op`'s perform will do the same operation as::
         orig_function(inputs, outputs, **kwargs)
@@ -19,13 +19,13 @@ class OpFromGraph(gof.Op):
     - c_code() to remove the double overhead?
     - opt to unfold it, work inplace on inputs
     - grad() make it support DisconnectedType and the new interface
-    - check how it work with updates.
+    - check how it works with updates.
     - add test with constant as input or inside the inner graph.
     - Add support for the GPU? Probably just need an opt to remove transfer
     - Add support to pickle this Op.
     - Add support/test with random generator
     :note:
-    - We support shared variable in the inner graph. This is automatic and
+    - We support shared variables in the inner graph. This is automatic and
       invisible to the user. They can be as input to the node or in the
       inner graph.
     - We support unused inputs. This is needed for the grad.