Commit 7c91cce4 authored by Pascal Lamblin

merge

Parent 567ef16c
...@@ -208,7 +208,7 @@ for this somewhere in the future.
exposed to the end user. You can use it to test out optimizations,
etc. if you are comfortable with it, but it is recommended to use
the function/Module frontends and to interface optimizations with
:ref:`optdb <optdb>` (we'll see how to do that soon).

Local optimization
...@@ -240,12 +240,11 @@ The local version of the above code would be the following:
The definition of transform is the inner loop of the global optimizer,
where the node is given as argument. If no changes are to be made,
``False`` must be returned. Else, a list of what to replace the node's
outputs with must be returned.

In order to apply the local optimizer we must use it in conjunction
with a :ref:`navigator`. Basically, a :ref:`navigator` is a global
optimizer that loops through all nodes in the graph (or a well-defined
subset of them) and applies one or several local optimizers on them.
...@@ -256,37 +255,40 @@ subset of them) and applies one or several local optimizers on them.
>>> e = gof.Env([x, y, z], [a])
>>> e
[add(z, mul(div(mul(y, x), y), div(z, x)))]
>>> simplify = gof.TopoOptimizer(local_simplify)
>>> simplify.optimize(e)
>>> e
[add(z, mul(x, div(z, x)))]
TODO: test this.
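To make the looping concrete, here is a toy sketch in plain Python of what a navigator does: it walks every node of an expression and applies a local rewrite to each. The tuple-based expressions and the names ``local_mul_one`` and ``topo_apply`` are illustrative inventions, not Theano's API:

```python
# Toy sketch (not Theano's actual API): a "local optimizer" inspects one
# node and either returns False (no change) or a replacement; a "navigator"
# walks every node in the graph and applies the local optimizer to each.
# Expressions are modeled as nested tuples: (op, arg1, arg2) or leaves.

def local_mul_one(node):
    """Local rule: replace mul(x, 1) or mul(1, x) by x."""
    if isinstance(node, tuple) and node[0] == "mul":
        _, a, b = node
        if b == 1:
            return a
        if a == 1:
            return b
    return False  # no change, mirroring the LocalOptimizer convention

def topo_apply(node, local_opt):
    """Navigator sketch: recurse into inputs, then try the rule on the node."""
    if isinstance(node, tuple):
        node = (node[0],) + tuple(topo_apply(i, local_opt) for i in node[1:])
    replacement = local_opt(node)
    return node if replacement is False else replacement

expr = ("add", "z", ("mul", ("mul", "x", 1), 1))
print(topo_apply(expr, local_mul_one))  # ('add', 'z', 'x')
```

Note that the rewrite applies bottom-up: the inner ``mul(x, 1)`` is simplified first, which exposes the outer one to the same rule.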
OpSub, OpRemove, PatternSub
+++++++++++++++++++++++++++

Theano defines some shortcuts to make LocalOptimizers:
.. function:: OpSub(op1, op2)

   Replaces all uses of *op1* by *op2*. In other words, the outputs of
   all :ref:`apply` nodes involving *op1* are replaced by the outputs of
   Apply nodes involving *op2*, where their inputs are the same.

.. function:: OpRemove(op)

   Removes all uses of *op* in the following way: if ``y = op(x)`` then
   ``y`` is replaced by ``x``. *op* must have as many outputs as it has
   inputs. The first output becomes the first input, the second output
   becomes the second input, and so on.

.. function:: PatternSub(pattern1, pattern2)

   Replaces all occurrences of the first pattern by the second pattern.
   See :api:`theano.gof.opt.PatternSub`.
.. code-block:: python

    from theano.gof.opt import OpSub, OpRemove, PatternSub

    # Replacing add by mul (this is not recommended for primarily
    # mathematical reasons):
    add_to_mul = OpSub(add, mul)
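As an illustration of what pattern-based substitution does, here is a self-contained toy sketch. Nested tuples stand in for graph nodes, and ``match``, ``substitute`` and ``pattern_sub`` are hypothetical names, not Theano's implementation:

```python
# Toy sketch of pattern substitution (PatternSub-style). Patterns are
# nested tuples; strings starting with "?" are pattern variables that
# bind to subexpressions.

def match(pattern, expr, bindings):
    """Try to unify pattern with expr, recording variable bindings."""
    if isinstance(pattern, str) and pattern.startswith("?"):
        bindings.setdefault(pattern, expr)
        return bindings[pattern] == expr  # repeated variables must agree
    if isinstance(pattern, tuple) and isinstance(expr, tuple):
        return len(pattern) == len(expr) and all(
            match(p, e, bindings) for p, e in zip(pattern, expr))
    return pattern == expr

def substitute(template, bindings):
    """Build the replacement expression from the bound variables."""
    if isinstance(template, str) and template.startswith("?"):
        return bindings[template]
    if isinstance(template, tuple):
        return tuple(substitute(t, bindings) for t in template)
    return template

def pattern_sub(pattern, template, expr):
    """Rewrite expr (and its subexpressions) wherever pattern matches."""
    if isinstance(expr, tuple):
        expr = tuple(pattern_sub(pattern, template, e) for e in expr)
    bindings = {}
    if match(pattern, expr, bindings):
        return substitute(template, bindings)
    return expr

# div(div(?x, ?y), ?z) -> div(?x, mul(?y, ?z))
rule = (("div", ("div", "?x", "?y"), "?z"), ("div", "?x", ("mul", "?y", "?z")))
print(pattern_sub(*rule, ("div", ("div", "a", "b"), "c")))
# ('div', 'a', ('mul', 'b', 'c'))
```

The same shape — a source pattern with variables and a destination pattern reusing them — is what the real ``PatternSub(pattern1, pattern2)`` expresses.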
...@@ -304,16 +306,16 @@ Theano defines some shortcuts to make LocalOptimizers:
.. note::

   ``OpSub``, ``OpRemove`` and ``PatternSub`` produce local optimizers, which
   means that everything we said previously about local optimizers
   applies: they need to be wrapped in a Navigator, etc.
When an optimization can be naturally expressed using ``OpSub``, ``OpRemove``
or ``PatternSub``, it is highly recommended to use them.
WRITEME: more about using PatternSub (syntax for the patterns, how to
use constraints, etc. - there's some decent doc at
:api:`theano.gof.opt.PatternSub` for those interested)
...@@ -346,10 +348,12 @@ optimizations.
Definition of optdb
-------------------
optdb is an object which is an instance of
:api:`theano.gof.SequenceDB <theano.gof.optdb.SequenceDB>`,
itself a subclass of :api:`theano.gof.DB <theano.gof.optdb.DB>`.
There exist (for now) two types of DB, SequenceDB and EquilibriumDB.
When given an appropriate Query, DB objects build an Optimizer matching
the query.

A SequenceDB contains Optimizer or DB objects. Each of them has a
name, an arbitrary number of tags and an integer representing their
...@@ -368,10 +372,10 @@ well and the LocalOptimizers they return will be put in their places
(note that as of yet no DB can produce LocalOptimizer objects, so this
is a moot point).

Theano contains one principal DB object, :api:`theano.gof.optdb`, which
contains all of Theano's optimizers with proper tags. It is
recommended to insert new Optimizers in it. As mentioned previously,
optdb is a SequenceDB, so, at the top level, Theano applies a sequence
of global optimizations to the computation graphs.
...@@ -380,27 +384,37 @@ Query
A Query is built by the following call:

::

    theano.gof.Query(include, require = None, exclude = None, subquery = None)

.. attribute:: include

   A set of tags (a tag being a string) such that every
   optimization obtained through this Query must have **one** of the tags
   listed. This field is required and basically acts as a starting point
   for the search.

.. attribute:: require

   A set of tags such that every optimization obtained
   through this Query must have **all** of these tags.

.. attribute:: exclude

   A set of tags such that every optimization obtained
   through this Query must have **none** of these tags.

.. attribute:: subquery

   optdb can contain sub-databases; subquery is a
   dictionary mapping the name of a sub-database to a special Query.
   If no subquery is given for a sub-database, the original Query will be
   used again.
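The include/require/exclude rules above can be sketched as a plain tag filter. This is an illustration of the selection semantics only; the function ``select`` and the sample optimization names are invented, not part of Theano:

```python
# Toy sketch of Query tag semantics: each optimization is described by a
# name and a set of tags. An optimization is kept when it has at least
# one "include" tag, all "require" tags, and no "exclude" tag.

def select(optimizations, include, require=None, exclude=None):
    require = set(require or ())
    exclude = set(exclude or ())
    return [name for name, tags in optimizations.items()
            if set(include) & tags        # at least one include tag
            and require <= tags           # all require tags
            and not (exclude & tags)]     # no exclude tag

# Hypothetical optimizations with tags, for illustration only.
opts = {
    "merge":         {"fast_run", "fast_compile"},
    "constant_fold": {"fast_run", "canonicalize"},
    "inplace_add":   {"fast_run", "inplace"},
}
print(select(opts, include={"fast_run"}, exclude={"inplace"}))
# ['merge', 'constant_fold']
```

This mirrors how a mode like FAST_RUN can ask for everything tagged ``'fast_run'`` while excluding, say, in-place optimizations.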

Furthermore, a Query object includes three methods, ``including``,
``requiring`` and ``excluding``, which each produce a new Query object
with include, require and exclude sets refined to contain the new tags.

Examples
...@@ -439,8 +453,9 @@ it to ``optdb`` as follows:
Once this is done, the FAST_RUN mode will automatically include your
optimization (since you gave it the 'fast_run' tag). Of course,
already-compiled functions will see no change. The 'order' parameter
(what it means and how to choose it) will be explained in
:ref:`optdb-structure` below.

Registering a LocalOptimizer
...@@ -456,21 +471,23 @@ Theano defines two EquilibriumDBs where you can put local
optimizations:
.. function:: canonicalize

   This contains optimizations that aim to *simplify* the graph:

   * Replace rare or esoteric operations with their equivalents using
     elementary operations.

   * Order operations in a canonical way (any sequence of
     multiplications and divisions can be rewritten to contain at most
     one division, for example; ``x*x`` can be rewritten ``x**2``; etc.)

   * Fold constants (``Constant(2)*Constant(2)`` becomes ``Constant(4)``)
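The constant-folding bullet can be sketched concretely. This toy version works on nested tuples rather than Theano graphs, and ``fold_constants`` is an invented name for illustration:

```python
# Toy constant-folding sketch: recursively replace operations whose
# inputs are all constants by the computed value, mirroring the
# Constant(2)*Constant(2) -> Constant(4) canonicalization above.
import operator

OPS = {"add": operator.add, "mul": operator.mul}

def fold_constants(node):
    """Fold constant subexpressions bottom-up; leave the rest intact."""
    if not isinstance(node, tuple):
        return node
    op, *args = node
    args = [fold_constants(a) for a in args]
    if op in OPS and all(isinstance(a, (int, float)) for a in args):
        return OPS[op](*args)   # every input is a constant: compute it
    return (op, *args)

print(fold_constants(("mul", ("add", 1, 1), ("mul", 2, 2))))  # 8
```

Folding bottom-up matters: simplifying the inner ``add(1, 1)`` and ``mul(2, 2)`` first is what makes the outer multiplication foldable.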
.. function:: specialize

   This contains optimizations that aim to *specialize* the graph:

   * Replace a combination of operations with a special operation that
     does the same thing (but better).
...@@ -480,12 +497,14 @@ For each group, all optimizations of the group that are selected by
the Query will be applied on the graph over and over again until none
of them is applicable, so keep that in mind when designing it: check
carefully that your optimization leads to a fixpoint (a point where it
cannot apply anymore), at which point it returns ``False`` to indicate its
job is done. Also be careful not to undo the work of another local
optimizer in the group, because then the graph will oscillate between
two or more states and nothing will get done.
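The fixpoint behaviour described above can be sketched as a small driver loop. The names ``until_fixpoint`` and ``strip_double_neg`` are illustrative, not Theano's implementation:

```python
# Toy sketch of an EquilibriumDB-style loop: reapply a local rewrite
# until it reports no change (returns False), with a step limit as a
# guard against rules that oscillate instead of converging.

def until_fixpoint(rewrite, expr, max_steps=100):
    """Apply `rewrite` repeatedly until it returns False (fixpoint)."""
    for _ in range(max_steps):
        result = rewrite(expr)
        if result is False:
            return expr
        expr = result
    raise RuntimeError("no fixpoint: rewrites may be undoing each other")

def strip_double_neg(expr):
    """neg(neg(x)) -> x; returns False once nothing is left to simplify."""
    if isinstance(expr, tuple) and expr[0] == "neg" \
            and isinstance(expr[1], tuple) and expr[1][0] == "neg":
        return expr[1][1]
    return False

print(until_fixpoint(strip_double_neg, ("neg", ("neg", ("neg", ("neg", "x"))))))
# prints: x
```

A pair of rules that each undo the other would never return ``False`` here, which is exactly the oscillation the paragraph warns about.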
.. _optdb-structure:

optdb structure
---------------
...
...@@ -37,7 +37,7 @@ Use Theano's high order Ops when applicable
===========================================
Theano provides some generic Op classes which allow you to generate a
lot of Ops with little effort. For instance, Elemwise can be used to
make :term:`elementwise` operations easily whereas DimShuffle can be
used to make transpose-like transformations. These higher order Ops
are mostly Tensor-related, as this is Theano's specialty. An exposé of
...
...@@ -6,7 +6,9 @@ Using theano.function
=====================
This page is about :api:`theano.function
<theano.compile.function_module.function>`, the interface for compiling
graphs into callable objects.

The signature for this function is:
...@@ -34,16 +36,16 @@ The ``inputs`` argument to ``theano.function`` is a list, containing the ``Varia
.. class:: In

   .. method:: __init__(variable, name=None, value=None, update=None, mutable=False)

      ``variable``: a Variable instance. This will be assigned a value
      before running the function, not computed from its owner.

      ``name``: Any type. (If ``autoname_input==True``, defaults to
      ``variable.name``). If ``name`` is a valid Python identifier, this input
      can be set by ``kwarg``, and its value can be accessed by
      ``self.<name>``. The default value is ``None``.

      ``value``: literal or Container. This is the default value of
      the Input. The default value of this parameter is ``None``.
...@@ -52,10 +54,10 @@ The ``inputs`` argument to ``theano.function`` is a list, containing the ``Varia
      ``None``, indicating that no update is to be done.

      ``mutable``: Bool (requires value). If ``True``, permit the
      compiled function to modify the Python object being used as the
      default value. The default value is ``False``.

      ``autoname``: Bool. If set to ``True``, if ``name`` is ``None`` and
      the Variable has a name, it will be taken as the input's
      name. If autoname is set to ``False``, the name is the exact
      value passed as the name parameter (possibly ``None``).
...@@ -68,7 +70,7 @@ A non-None `value` argument makes an In() instance an optional parameter
of the compiled function. For example, in the following code we are
defining an arity-2 function ``inc``.
>>> u, x, s = T.scalars('u', 'x', 's')
>>> inc = function([u, In(x, value=3), In(s, update=(s+x*u), value=10.0)], [])

Since we provided a ``value`` for ``s`` and ``x``, we can call it with just a value for ``u`` like this:
...