Commit 81953c26 authored by James Bergstra

fixing rst bugs

Parent 88df8436
@@ -14,7 +14,7 @@ also good for you if you are interested in getting more under the hood with
 Theano itself.
 Before tackling this tutorial, it is highly recommended to read the
-:ref:`basictutorial`.
+:ref:`tutorial`.
 The first few pages will walk you through the definition of a new :ref:`type`,
 ``double``, and a basic arithmetic :ref:`operations <op>` on that Type. We
...
@@ -26,9 +26,9 @@ Global and local optimizations
 First, let's lay out the way optimizations work in Theano. There are
 two types of optimizations: *global* optimizations and *local*
-optimizations. A global optimization takes an :ref:`env` object (an
+optimizations. A global optimization takes an ``Env`` object (an
 Env is a wrapper around a whole computation graph, you can see its
-:ref:`documentation <env>` for more details) and navigates through it
+:class:`documentation <Env>` for more details) and navigates through it
 in a suitable way, replacing some Variables by others in the process. A
 local optimization, on the other hand, is defined as a function on a
 *single* :ref:`apply` node and must return either ``False`` (to mean that
@@ -52,26 +52,28 @@ Global optimization
 A global optimization (or optimizer) is an object which defines the following
 methods:
-.. function:: apply(env)
+.. class:: Optimizer
+
+    .. method:: apply(env)
 This method takes an Env object which contains the computation graph
 and does modifications in line with what the optimization is meant
 to do. This is of the main method of the optimizer.
-.. function:: add_requirements(env)
+    .. method:: add_requirements(env)
 This method takes an Env object and adds :ref:`features
-<envfeature>` to it. These features are "plugins" that are needed
+<libdoc_gof_envfeature>` to it. These features are "plugins" that are needed
 for the ``apply`` method to do its job properly.
-.. function:: optimize(env)
+    .. method:: optimize(env)
 This is the interface function called by Theano.
 *Default:* this is defined by Optimizer as ``add_requirement(env);
 apply(env)``.
-See the section about :ref:`env` to understand how to define these
+See the section about :class:`Env` to understand how to define these
 methods.
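The three-method interface documented in this hunk can be sketched as a minimal pure-Python class. This is only an illustration of the contract (``optimize`` defaulting to ``add_requirements`` followed by ``apply``); the class names and the dict-based ``env`` here are stand-ins, not Theano's actual implementation:

```python
class Optimizer:
    """Minimal sketch of the optimizer interface described above
    (illustrative only; not Theano's real Optimizer class)."""

    def add_requirements(self, env):
        # Attach any graph features that apply() relies on; the base
        # class needs nothing.
        pass

    def apply(self, env):
        # Subclasses do the actual graph rewriting here.
        raise NotImplementedError

    def optimize(self, env):
        # Default entry point, as documented: install requirements,
        # then transform the graph.
        self.add_requirements(env)
        self.apply(env)


class NoOpOptimizer(Optimizer):
    """A trivial optimizer that records that it ran but rewrites nothing."""

    def apply(self, env):
        env.setdefault("log", []).append("no-op applied")
```

Calling ``NoOpOptimizer().optimize(env)`` on any mapping-like ``env`` exercises the default ``optimize`` chain.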
@@ -80,7 +82,9 @@ Local optimization
 A local optimization is an object which defines the following methods:
-.. function:: transform(node)
+.. class:: LocalOptimizer
+
+    .. method:: transform(node)
 This method takes an :ref:`apply` node and returns either ``False`` to
 signify that no changes are to be done or a list of Variables which
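The ``transform`` contract described in this hunk (return ``False`` when not applicable, otherwise a list of replacement Variables) can be sketched with toy objects. The ``Node`` tuple and the ``neg(neg(x)) -> x`` rewrite below are hypothetical illustrations, not Theano's real Apply/Variable classes:

```python
from collections import namedtuple

# Toy stand-in for an Apply node: an op name, input variables (plain
# strings here, or nested Nodes), and output variable names.
Node = namedtuple("Node", ["op", "inputs", "outputs"])


class RemoveDoubleNeg:
    """Toy local optimization implementing neg(neg(x)) -> x."""

    def transform(self, node):
        if node.op != "neg":
            return False                       # pattern does not apply
        inner = node.inputs[0]
        if getattr(inner, "op", None) == "neg":
            # Return the list of Variables to substitute for the
            # node's outputs: here, the innermost input directly.
            return [inner.inputs[0]]
        return False
```

Applied to a ``neg`` of a ``neg``, ``transform`` returns the inner variable; on any other node it returns ``False``, exactly the two outcomes the interface allows.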
@@ -138,7 +142,7 @@ simplification described above:
 requirements we might want to know about?
 Here's how it works: first, in ``add_requirements``, we add the
-``ReplaceValidate`` :ref:`envfeature` located in
+``ReplaceValidate`` :ref:`libdoc_gof_envfeature` located in
 :ref:`libdoc_gof_toolbox`. This feature adds the ``replace_validate``
 method to ``env``, which is an enhanced version of ``replace`` that
 does additional checks to ensure that we are not messing up the
@@ -147,9 +151,9 @@ another optimizer, ``extend`` will do nothing). In a nutshell,
 ``toolbox.ReplaceValidate`` grants access to ``env.replace_validate``,
 and ``env.replace_validate`` allows us to replace a Variable with
 another while respecting certain validation constraints. You can
-browse the list of :ref:`features <envfeaturelist>` and see if some of
+browse the list of :ref:`libdoc_gof_envfeaturelist` and see if some of
 them might be useful to write optimizations with. For example, as an
-exercise, try to rewrite Simplify using :ref:`nodefinder`. (Hint: you
+exercise, try to rewrite Simplify using :class:`NodeFinder`. (Hint: you
 want to use the method it publishes instead of the call to toposort!)
 Then, in ``apply`` we do the actual job of simplification. We start by
@@ -222,12 +226,12 @@ arithmetics that your Ops implement. Theano might provide facilities
 for this somewhere in the future.
 .. note::
-    :ref:`env` is a Theano structure intended for the optimization
+    :class:`Env` is a Theano structure intended for the optimization
     phase. It is used internally by function and Module and is rarely
     exposed to the end user. You can use it to test out optimizations,
     etc. if you are comfortable with it, but it is recommended to use
     the function/Module frontends and to interface optimizations with
-    :ref:`optdb <optdb>` (we'll see how to do that soon).
+    :class:`optdb` (we'll see how to do that soon).
 Local optimization
@@ -399,7 +403,7 @@ well and the LocalOptimizers they return will be put in their places
 (note that as of yet no DB can produce LocalOptimizer objects, so this
 is a moot point).
-Theano contains one principal DB object, :ref:`libdoc.gof.optdb`, which
+Theano contains one principal DB object, :class:`optdb`, which
 contains all of Theano's optimizers with proper tags. It is
 recommended to insert new Optimizers in it. As mentioned previously,
 optdb is a SequenceDB, so, at the top level, Theano applies a sequence
...
@@ -40,17 +40,17 @@ Step 1 - Create an Env
 ^^^^^^^^^^^^^^^^^^^^^^
 The subgraph given by the end user is wrapped in a structure called
-:ref:`env`. That structure defines several hooks on adding and
+*Env*. That structure defines several hooks on adding and
 removing (pruning) nodes as well as on modifying links between nodes
 (for example, modifying an input of an :ref:`apply` node) (see the
-article about :ref:`env` for more information).
+article about :ref:`libdoc_gof_env` for more information).
 Env provides a method to change the input of an Apply node from one
 Variable to another and a more high-level method to replace a Variable
 with another. This is the structure that :ref:`Optimizers
 <optimization>` work on.
-Some relevant :ref:`Features <envfeature>` are typically added to the
+Some relevant :ref:`Features <libdoc_gof_envfeature>` are typically added to the
 Env, namely to prevent any optimization from operating inplace on
 inputs declared as immutable.
@@ -58,19 +58,19 @@ inputs declared as immutable.
 Step 2 - Execute main Optimizer
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-Once the Env is made, an :ref:`optimizer <optimization>` is produced
-by the :ref:`function_mode` passed to ``function`` or to the Method/Module's
+Once the Env is made, an :term:`optimizer` is produced
+by the :term:`mode` passed to ``function`` or to the Method/Module's
 ``make`` (the Mode basically has two important fields, ``linker`` and
 ``optimizer``). That optimizer is applied on the Env using its
 optimize() method.
-The optimizer is typically obtained through :ref:`optdb <optdb>`.
+The optimizer is typically obtained through :attr:`optdb`.
 Step 3 - Execute linker to obtain a thunk
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-Once the computation graph is optimized, the :ref:`linker` is
+Once the computation graph is optimized, the :term:`linker` is
 extracted from the Mode. It is then called with the Env as argument to
 produce a ``thunk``, which is a function with no arguments that
 returns nothing. Along with the thunk, one list of input containers (a
...
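The thunk-plus-containers arrangement this step describes can be sketched in plain Python. This is a conceptual toy (dicts as containers, a fixed addition as the "compiled" computation), not what Theano's linkers actually emit:

```python
def make_thunk(input_containers, output_containers):
    """Build a no-argument function that reads its inputs from shared
    containers, computes, and writes the result into an output
    container. The computation here is fixed to out = a + b."""
    def thunk():
        a, b = (c["value"] for c in input_containers)
        output_containers[0]["value"] = a + b
        # A thunk takes no arguments and returns nothing.
    return thunk


# Containers are just mutable cells shared between caller and thunk.
inputs = [{"value": 3}, {"value": 4}]
outputs = [{"value": None}]
thunk = make_thunk(inputs, outputs)
thunk()
```

After the call, the caller reads the result out of ``outputs[0]`` rather than from a return value, which is the point of the container indirection.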
@@ -40,8 +40,7 @@ Theano provides some generic Op classes which allow you to generate a
 lot of Ops at a lesser effort. For instance, Elemwise can be used to
 make :term:`elementwise` operations easily whereas DimShuffle can be
 used to make transpose-like transformations. These higher order Ops
-are mostly Tensor-related, as this is Theano's specialty. An exposé of
-them can therefore be found in :ref:`tensoroptools`.
+are mostly Tensor-related, as this is Theano's specialty.
 .. _opchecklist:
...
@@ -251,7 +251,7 @@ attempt to clear up the confusion:
 there is actually only one Type in that set, therefore the subclass
 doesn't represent anything that one of its instances doesn't. In this
 case it is a singleton, a set with one element. However, the
-:api:`TensorType`
+:class:`TensorType`
 class in Theano (which is a subclass of Type)
 represents a set of types of tensors
 parametrized by their data type or number of dimensions. We could say
...
@@ -7,8 +7,8 @@ Glossary of terminology
 Apply
 Instances of :class:`Apply` represent the application of an :term:`Op`
-to some input :term:`Variable`s to produce some output
-:term:`Variable`s. They are like the application of a [symbolic]
+to some input :term:`Variable` (or variables) to produce some output
+:term:`Variable` (or variables). They are like the application of a [symbolic]
 mathematical function to some [symbolic] inputs.
 Broadcasting
@@ -33,26 +33,18 @@ Glossary of terminology
 Elementwise
-An elementwise operation ``f`` on two matrices ``M`` and ``N``
+An elementwise operation ``f`` on two tensor variables ``M`` and ``N``
 is one such that:
-``f(M, N)[i, j] = f(M[i, j], N[i, j])``
+``f(M, N)[i, j] == f(M[i, j], N[i, j])``
 In other words, each element of an input matrix is combined
 with the corresponding element of the other(s). There are no
 dependencies between elements whose ``[i, j]`` coordinates do
 not correspond, so an elementwise operation is like a scalar
-operation generalized along several dimensions.
-There exist unary, binary, ternary, etc. elementwise
-operations and they can work on scalars, vectors, matrices,
-etc. as long as all the inputs have the same dimensions or can
-be :term:`broadcasted <broadcasting>` to the same dimensions.
-Examples of elementwise operations in Theano: ``add, sub, mul,
-div, neg, inv, log, exp, sin, cos, tan`` and many
-others. These operations are all instances of :api:`Elemwise
-<theano.tensor.elemwise.Elemwise>`.
+operation generalized along several dimensions. Elementwise
+operations are defined for tensors of different numbers of dimensions by
+:term:`broadcasting` the smaller ones.
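The defining identity in this glossary entry, and the broadcasting rule it mentions, are easy to check concretely with NumPy (shown purely to illustrate the semantics; ``Elemwise`` is the symbolic counterpart):

```python
import numpy as np

# Check f(M, N)[i, j] == f(M[i, j], N[i, j]) for f = addition.
M = np.array([[1, 2], [3, 4]])
N = np.array([[10, 20], [30, 40]])
R = M + N          # elementwise: no cross-coordinate dependencies

# Broadcasting: a smaller operand participates by being (virtually)
# repeated along its missing dimension.
row = np.array([100, 200])
B = M + row
```

Here ``R[0, 1]`` equals ``M[0, 1] + N[0, 1]``, and ``row`` is applied to each row of ``M`` in turn.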
 Expression
 See :term:`Apply`
@@ -102,7 +94,7 @@ Glossary of terminology
 Mode
 An object providing an :term:`optimizer` and a :term:`linker` that is
-passed to :term:`theano.funcion`. It parametrizes how an expression
+passed to :term:`theano.function`. It parametrizes how an expression
 graph is converted to a callable object.
 Op
@@ -118,12 +110,15 @@ Glossary of terminology
 Optimizer
 An instance of :class:`Optimizer`, which has the capacity to provide
-:term:`optimization`s.
+an :term:`optimization` (or optimizations).
 Optimization
 A :term:`graph` transformation applied by an :term:`optimizer` during
 the compilation of a :term:`graph` by :term:`theano.function`.
+Pure
+An :term:`Op` is *pure* if it has no :term:`destructive` side-effects.
 Storage
 The memory that is used to store the value of a Variable. In most
 cases storage is internal to a compiled function, but in some cases
@@ -134,7 +129,7 @@ Glossary of terminology
 theano.function
 The interface for Theano's compilation from symbolic expression graphs
-to callable objects. See :func:`function.function'.
+to callable objects. See :func:`function.function`.
 Type
 The ``.type`` of a
...
@@ -60,7 +60,6 @@ Community
 tutorial/index
 library/index
 extending/index
-indexes/index
 glossary
 links
 internal/index
...
@@ -122,7 +122,7 @@ array(10.0)
 Advanced: Sharing Storage Between Functions
 -------------------------------------------
-``value`` can be a :api:`theano.gof.link.Container` as well as a literal.
+``value`` can be a :class:`Container` as well as a literal.
 This permits linking a value of a Variable in one function to the value of a Variable in another function.
 By using a ``Container`` as a value we can implement shared variables between functions.
...
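The storage-sharing idea in this hunk can be sketched in plain Python: two independent callables that both close over one mutable cell see each other's updates. The ``Container`` class below is a toy stand-in, not ``theano.gof.link.Container`` itself:

```python
class Container:
    """Toy mutable cell standing in for a Theano value Container."""
    def __init__(self, value):
        self.value = value


def make_incrementer(container):
    # A "function" whose internal state lives in the shared container.
    def inc():
        container.value += 1
        return container.value
    return inc


def make_reader(container):
    # A second, separate "function" sharing the same storage.
    def read():
        return container.value
    return read


shared = Container(0)
inc = make_incrementer(shared)
read = make_reader(shared)
```

Calling ``inc()`` changes what ``read()`` returns, because both refer to the same storage rather than holding private copies.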
@@ -26,8 +26,10 @@ environment variable 'THEANO_DEFAULT_MODE', which can in turn be overridden by
 setting ``theano.compile.mode.default_mode`` directly, which can in turn be
 overridden by passing the keyword argument to ``theano.function``.
-For a finer level of control over which optimizations are applied, and whether
-C or python implementations are used, read :api:`compile.mode.Mode`.
+.. TODO::
+
+    For a finer level of control over which optimizations are applied, and whether
+    C or Python implementations are used, read.... what exactly?
 Reference
...
@@ -175,7 +175,7 @@ Using Inheritance
 A friendlier way to use Module is to implement your functionality as a
 subclass of Module:
-.. literalinclude:: ../examples/module/accumulator.py
+.. literalinclude:: ../../examples/module/accumulator.py
 This is just like the previous example except slightly fancier.
...
@@ -4,3 +4,17 @@
 ================================================
 :mod:`gof` -- Theano Internals [doc TODO]
 ================================================
+
+.. module:: gof
+   :platform: Unix, Windows
+   :synopsis: Theano Internals
+.. moduleauthor:: LISA
+
+.. toctree::
+    :maxdepth: 1
+
+    env
+    toolbox
@@ -7,20 +7,24 @@
 TensorType
 ==========
-.. class:: TensorType
+.. class:: TensorType(Type)
+
+    .. attribute:: broadcastable
+    .. attribute:: ndim
+    .. attribute:: dtype
-.. _libdoc_tensor_variable
+.. _libdoc_tensor_variable:
 TensorVariable
 ==============
+.. class:: TensorVariable(_tensory_py_operators)
-.. _libdoc_tensor_constant
-TensorConstant
-==============
+.. class:: TensorConstant(_tensory_py_operators)
+.. class:: TensorSharedVariable(_tensory_py_operators)
 .. _libdoc_tensor_creation:
@@ -362,9 +366,12 @@ Bit-wise
 The bitwise operators possess this interface:
 :Parameter: *a* - symbolic Tensor of integer type.
 :Parameter: *b* - symbolic Tensor of integer type.
-.. note:: The bit-wise not (invert) does not have this second parameter.
-:Return type: symbolic Tensor
+.. note::
+    The bit-wise not (invert) takes only one parameter.
+:Return type: symbolic Tensor with ``int8`` dtype.
 .. function:: and_(a, b)
@@ -382,13 +389,13 @@ The bitwise operators possess this interface:
 Returns a variable representing the result of the bitwise not.
-Here is an example using the bit-wise and_:
+Here is an example using the bit-wise ``and_`` via the ``&`` operator:
 .. code-block:: python
     import theano.tensor as T
     x,y = T.imatrices('x','y')
-    z = T.and_(x,y)
+    z = x & y
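The NumPy analogue (assuming NumPy is available) shows what such a symbolic graph computes once compiled: ``&`` applies bitwise AND elementwise over integer matrices.

```python
import numpy as np

# Integer matrices, as T.imatrices would hold at run time.
a = np.array([[12, 10], [7, 1]], dtype=np.int32)
b = np.array([[10, 6], [5, 3]], dtype=np.int32)

# `&` is bitwise AND, applied element by element.
z = a & b
```

For example ``12 & 10`` is ``0b1100 & 0b1010 = 0b1000 = 8``, computed independently in each cell.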
 Mathematical
...
 .. _libdoc_tensor:
 ==================================================
 :mod:`tensor` -- Types and Ops for Symbolic numpy
 ==================================================
...
@@ -109,9 +109,10 @@ Reference
 .. method:: updates()
 :returns: a list of all the (state, new_state) update pairs from the
-random variables it has returned. This can be a convenient shortcut
-to enumerating all the random variables in a large graph in the
-``update`` paramter of function.
+random variables it has returned.
+
+This can be a convenient shortcut to enumerating all the random
+variables in a large graph in the ``update`` parameter of function.
 .. method:: seed(meta_seed)
...
@@ -18,6 +18,9 @@ of Theano. Let's import that subpackage under a handy name. I like
 If that worked you're ready for the tutorial, otherwise check your
 installation (see :ref:`install`).
+Throughout the tutorial, bear in mind that there is a :ref:`glossary` to help
+you out.
 .. toctree::
 numpy
...
-.. tutorial_loadsave:
+.. _tutorial_loadsave:
 ==================
 Loading and Saving
 ==================
...
@@ -76,7 +76,7 @@ x
 InplaceDimShuffle{x,x}.0
 Note that the second input is not 2 as we would have expected. This is
-because 2 was first :ref:`broadcasted <broadcasting>` to a matrix of
+because 2 was first :term:`broadcasted <broadcasting>` to a matrix of
 same shape as x. This is done by using the op ``DimShuffle`` :
 >>> type(y.owner.inputs[1])
...
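Broadcasting a scalar up to a matrix, which Theano expresses symbolically with ``DimShuffle{x,x}``, looks like this in NumPy (shown only to illustrate the run-time effect):

```python
import numpy as np

x = np.ones((2, 3))
# Adding a scalar: the 2 is first broadcast to x's shape, conceptually
# inserting the two missing dimensions (the 'x','x' in DimShuffle
# terms) and then repeating the value along them.
y = x + 2
```

Every element of ``y`` is ``3.0``, as if ``2`` had been expanded into a full ``(2, 3)`` matrix before the elementwise add.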