testgroup / pytensor · Commits

Commit 81953c26
authored Jan 18, 2010 by James Bergstra
fixing rst bugs
parent 88df8436
Showing 17 changed files with 89 additions and 63 deletions (+89 -63)
doc/extending/index.txt                      +1  -1
doc/extending/optimization.txt               +18 -14
doc/extending/pipeline.txt                   +7  -7
doc/extending/tips.txt                       +1  -2
doc/extending/type.txt                       +1  -1
doc/glossary.txt                             +13 -18
doc/index.txt                                +0  -1
doc/library/compile/io.txt                   +1  -1
doc/library/compile/mode.txt                 +4  -2
doc/library/compile/module.txt               +1  -1
doc/library/gof/index.txt                    +14 -0
doc/library/tensor/basic.txt                 +17 -10
doc/library/tensor/index.txt                 +1  -0
doc/library/tensor/shared_randomstreams.txt  +4  -3
doc/tutorial/index.txt                       +3  -0
doc/tutorial/loading_and_saving.txt          +2  -1
doc/tutorial/symbolic_graphs.txt             +1  -1
doc/extending/index.txt

@@ -14,7 +14,7 @@ also good for you if you are interested in getting more under the hood with
 Theano itself.
 Before tackling this tutorial, it is highly recommended to read the
-:ref:`basic tutorial`.
+:ref:`tutorial`.
 The first few pages will walk you through the definition of a new :ref:`type`,
 ``double``, and a basic arithmetic :ref:`operations <op>` on that Type. We
doc/extending/optimization.txt

@@ -26,9 +26,9 @@ Global and local optimizations
 First, let's lay out the way optimizations work in Theano. There are
 two types of optimizations: *global* optimizations and *local*
-optimizations. A global optimization takes an :ref:`env` object (an
+optimizations. A global optimization takes an ``Env`` object (an
 Env is a wrapper around a whole computation graph, you can see its
-:ref:`documentation <env>` for more details) and navigates through it
+:class:`documentation <Env>` for more details) and navigates through it
 in a suitable way, replacing some Variables by others in the process. A
 local optimization, on the other hand, is defined as a function on a
 *single* :ref:`apply` node and must return either ``False`` (to mean that
@@ -52,26 +52,28 @@ Global optimization
 A global optimization (or optimizer) is an object which defines the following
 methods:
-.. function:: apply(env)
+.. class:: Optimizer
+
+   .. method:: apply(env)
 This method takes an Env object which contains the computation graph
 and does modifications in line with what the optimization is meant
 to do. This is of the main method of the optimizer.
-.. function:: add_requirements(env)
+   .. method:: add_requirements(env)
 This method takes an Env object and adds :ref:`features
-<envfeature>` to it. These features are "plugins" that are needed
+<libdoc_gof_envfeature>` to it. These features are "plugins" that are needed
 for the ``apply`` method to do its job properly.
-.. function:: optimize(env)
+   .. method:: optimize(env)
 This is the interface function called by Theano.
 *Default:* this is defined by Optimizer as ``add_requirement(env);
 apply(env)``.
-See the section about :ref:`env` to understand how to define these
+See the section about :class:`Env` to understand how to define these
 methods.
@@ -80,7 +82,9 @@ Local optimization
 A local optimization is an object which defines the following methods:
-.. function:: transform(node)
+.. class:: LocalOptimizer
+
+   .. method:: transform(node)
 This method takes an :ref:`apply` node and returns either ``False`` to
 signify that no changes are to be done or a list of Variables which
@@ -138,7 +142,7 @@ simplification described above:
 requirements we might want to know about?
 Here's how it works: first, in ``add_requirements``, we add the
-``ReplaceValidate`` :ref:`envfeature` located in
+``ReplaceValidate`` :ref:`libdoc_gof_envfeature` located in
 :ref:`libdoc_gof_toolbox`. This feature adds the ``replace_validate``
 method to ``env``, which is an enhanced version of ``replace`` that
 does additional checks to ensure that we are not messing up the
@@ -147,9 +151,9 @@ another optimizer, ``extend`` will do nothing). In a nutshell,
 ``toolbox.ReplaceValidate`` grants access to ``env.replace_validate``,
 and ``env.replace_validate`` allows us to replace a Variable with
 another while respecting certain validation constraints. You can
-browse the list of :ref:`features <envfeaturelist>` and see if some of
+browse the list of :ref:`libdoc_gof_envfeaturelist` and see if some of
 them might be useful to write optimizations with. For example, as an
-exercise, try to rewrite Simplify using :ref:`nodefinder`. (Hint: you
+exercise, try to rewrite Simplify using :class:`NodeFinder`. (Hint: you
 want to use the method it publishes instead of the call to toposort!)
 Then, in ``apply`` we do the actual job of simplification. We start by
@@ -222,12 +226,12 @@ arithmetics that your Ops implement. Theano might provide facilities
 for this somewhere in the future.
 .. note::
-   :ref:`env` is a Theano structure intended for the optimization
+   :class:`Env` is a Theano structure intended for the optimization
    phase. It is used internally by function and Module and is rarely
    exposed to the end user. You can use it to test out optimizations,
    etc. if you are comfortable with it, but it is recommended to use
    the function/Module frontends and to interface optimizations with
-   :ref:`optdb <optdb>` (we'll see how to do that soon).
+   :class:`optdb` (we'll see how to do that soon).
 Local optimization
@@ -399,7 +403,7 @@ well and the LocalOptimizers they return will be put in their places
 (note that as of yet no DB can produce LocalOptimizer objects, so this
 is a moot point).
-Theano contains one principal DB object, :ref:`libdoc.gof.optdb`, which
+Theano contains one principal DB object, :class:`optdb`, which
 contains all of Theano's optimizers with proper tags. It is
 recommended to insert new Optimizers in it. As mentioned previously,
 optdb is a SequenceDB, so, at the top level, Theano applies a sequence
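The Optimizer protocol this hunk documents (``apply``, ``add_requirements``, ``optimize``) can be sketched in plain Python. This is a minimal illustration only: the ``Env`` class, the string-based "graph", and the ``RemoveNoOps`` optimizer below are stand-ins invented for the example, not Theano's real classes.

```python
class Env:
    """Stand-in for Theano's graph wrapper (not the real Env)."""
    def __init__(self, nodes):
        self.nodes = list(nodes)
        self.features = []

    def extend(self, feature):
        # Mirrors env.extend: adding the same feature twice does nothing.
        if feature not in self.features:
            self.features.append(feature)


class Optimizer:
    def add_requirements(self, env):
        # Subclasses register the Features their `apply` relies on.
        pass

    def apply(self, env):
        # Subclasses modify the graph held by `env` in place.
        raise NotImplementedError

    def optimize(self, env):
        # The entry point: requirements first, then the rewrite,
        # matching the documented default add_requirements(env); apply(env).
        self.add_requirements(env)
        self.apply(env)


class RemoveNoOps(Optimizer):
    """Toy global optimization: drop nodes tagged as no-ops."""
    def add_requirements(self, env):
        env.extend("ReplaceValidate")

    def apply(self, env):
        env.nodes = [n for n in env.nodes if n != "noop"]


env = Env(["add", "noop", "mul"])
RemoveNoOps().optimize(env)
print(env.nodes)     # -> ['add', 'mul']
print(env.features)  # -> ['ReplaceValidate']
```

Calling ``optimize`` rather than ``apply`` directly is the point of the default: it guarantees the required features are installed before the rewrite runs.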
doc/extending/pipeline.txt

@@ -40,17 +40,17 @@ Step 1 - Create an Env
 ^^^^^^^^^^^^^^^^^^^^^^
 The subgraph given by the end user is wrapped in a structure called
-:ref:`env`. That structure defines several hooks on adding and
+*Env*. That structure defines several hooks on adding and
 removing (pruning) nodes as well as on modifying links between nodes
 (for example, modifying an input of an :ref:`apply` node) (see the
-article about :ref:`env` for more information).
+article about :ref:`libdoc_gof_env` for more information).
 Env provides a method to change the input of an Apply node from one
 Variable to another and a more high-level method to replace a Variable
 with another. This is the structure that :ref:`Optimizers
 <optimization>` work on.
-Some relevant :ref:`Features <envfeature>` are typically added to the
+Some relevant :ref:`Features <libdoc_gof_envfeature>` are typically added to the
 Env, namely to prevent any optimization from operating inplace on
 inputs declared as immutable.
@@ -58,19 +58,19 @@ inputs declared as immutable.
 Step 2 - Execute main Optimizer
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-Once the Env is made, an :ref:`optimizer <optimization>` is produced
-by the :ref:`function_mode` passed to ``function`` or to the Method/Module's
+Once the Env is made, an :term:`optimizer` is produced
+by the :term:`mode` passed to ``function`` or to the Method/Module's
 ``make`` (the Mode basically has two important fields, ``linker`` and
 ``optimizer``). That optimizer is applied on the Env using its
 optimize() method.
-The optimizer is typically obtained through :ref:`optdb <optdb>`.
+The optimizer is typically obtained through :attr:`optdb`.
 Step 3 - Execute linker to obtain a thunk
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-Once the computation graph is optimized, the :ref:`linker` is
+Once the computation graph is optimized, the :term:`linker` is
 extracted from the Mode. It is then called with the Env as argument to
 produce a ``thunk``, which is a function with no arguments that
 returns nothing. Along with the thunk, one list of input containers (a
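The three pipeline steps in this file (wrap the graph in an Env, run the Mode's optimizer, have the Mode's linker produce a thunk over input/output containers) can be mimicked with plain-Python stand-ins. None of the names below are Theano's real classes; the "graph" is just a nested tuple, used only to show the control flow.

```python
class Container:
    """Holds the storage for one input or output value."""
    def __init__(self, value=None):
        self.value = value

def make_env(expr):
    # Step 1: wrap the user's subgraph (here a nested tuple) in an env.
    return {"graph": expr}

def optimizer(env):
    # Step 2: a single rewrite applied in place, x + 0 -> x.
    op, a, b = env["graph"]
    if op == "add" and b == 0:
        env["graph"] = a

def linker(env, inputs, outputs):
    # Step 3: produce a thunk: a zero-argument callable that reads the
    # input containers and writes its results into the output containers.
    def thunk():
        graph = env["graph"]
        value = inputs[0].value if graph == "x" else None
        outputs[0].value = value
    return thunk

env = make_env(("add", "x", 0))
optimizer(env)                    # the Mode's optimizer
inp, out = [Container()], [Container()]
thunk = linker(env, inp, out)     # the Mode's linker
inp[0].value = 41
thunk()
print(out[0].value)  # -> 41
```

The container indirection is the important part: the caller communicates with the thunk only by filling input containers and reading output containers, which is how the real pipeline decouples compiled code from Python values.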
doc/extending/tips.txt

@@ -40,8 +40,7 @@ Theano provides some generic Op classes which allow you to generate a
 lot of Ops at a lesser effort. For instance, Elemwise can be used to
 make :term:`elementwise` operations easily whereas DimShuffle can be
 used to make transpose-like transformations. These higher order Ops
-are mostly Tensor-related, as this is Theano's specialty. An exposé of
-them can therefore be found in :ref:`tensoroptools`.
+are mostly Tensor-related, as this is Theano's specialty.
 .. _opchecklist:
doc/extending/type.txt

@@ -251,7 +251,7 @@ attempt to clear up the confusion:
 there is actually only one Type in that set, therefore the subclass
 doesn't represent anything that one of its instances doesn't. In this
 case it is a singleton, a set with one element. However, the
-:api:`TensorType` class in Theano (which is a subclass of Type)
+:class:`TensorType` class in Theano (which is a subclass of Type)
 represents a set of types of tensors
 parametrized by their data type or number of dimensions. We could say
doc/glossary.txt

@@ -7,8 +7,8 @@ Glossary of terminology
 Apply
 Instances of :class:`Apply` represent the application of an :term:`Op`
-to some input :term:`Variable`s to produce some output
-:term:`Variable`s. They are like the application of a [symbolic]
+to some input :term:`Variable` (or variables) to produce some output
+:term:`Variable` (or variables). They are like the application of a [symbolic]
 mathematical function to some [symbolic] inputs.
 Broadcasting
@@ -33,26 +33,18 @@ Glossary of terminology
 Elementwise
-An elementwise operation ``f`` on two matrices ``M`` and ``N``
+An elementwise operation ``f`` on two tensor variables ``M`` and ``N``
 is one such that:
-``f(M, N)[i, j] = f(M[i, j], N[i, j])``
+``f(M, N)[i, j] == f(M[i, j], N[i, j])``
 In other words, each element of an input matrix is combined
 with the corresponding element of the other(s). There are no
 dependencies between elements whose ``[i, j]`` coordinates do
 not correspond, so an elementwise operation is like a scalar
-operation generalized along several dimensions.
-There exist unary, binary, ternary, etc. elementwise
-operations and they can work on scalars, vectors, matrices,
-etc. as long as all the inputs have the same dimensions or can
-be :term:`broadcasted <broadcasting>` to the same dimensions.
-Examples of elementwise operations in Theano: ``add, sub, mul,
-div, neg, inv, log, exp, sin, cos, tan`` and many
-others. These operations are all instances of :api:`Elemwise
-<theano.tensor.elemwise.Elemwise>`.
+operation generalized along several dimensions. Elementwise
+operations are defined for tensors of different numbers of dimensions by
+:term:`broadcasting` the smaller ones.
 Expression
 See :term:`Apply`
@@ -102,7 +94,7 @@ Glossary of terminology
 Mode
 An object providing an :term:`optimizer` and a :term:`linker` that is
-passed to :term:`theano.funcion`. It parametrizes how an expression
+passed to :term:`theano.function`. It parametrizes how an expression
 graph is converted to a callable object.
 Op
@@ -118,12 +110,15 @@ Glossary of terminology
 Optimizer
 An instance of :class:`Optimizer`, which has the capacity to provide
-:term:`optimization`s.
+an :term:`optimization` (or optimizations).
+Optimization
+A :term:`graph` transformation applied by an :term:`optimizer` during
+the compilation of a :term:`graph` by :term:`theano.function`.
 Pure
 An :term:`Op` is *pure* if it has no :term:`destructive` side-effects.
 Storage
 The memory that is used to store the value of a Variable. In most
 cases storage is internal to a compiled function, but in some cases
@@ -134,7 +129,7 @@ Glossary of terminology
 theano.function
 The interface for Theano's compilation from symbolic expression graphs
-to callable objects. See :func:`function.function'.
+to callable objects. See :func:`function.function`.
 Type
 The ``.type`` of a
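The elementwise identity and the broadcasting behaviour in the rewritten glossary entry can be checked directly with NumPy, which follows the same broadcasting rules Theano uses. This is only an illustration of the definition, not code from the commit.

```python
import numpy as np

M = np.array([[1.0, 2.0], [3.0, 4.0]])
N = np.array([[10.0, 20.0], [30.0, 40.0]])

# The glossary identity: f(M, N)[i, j] == f(M[i, j], N[i, j])
# for an elementwise f such as addition.
F = M + N
assert all(F[i, j] == M[i, j] + N[i, j]
           for i in range(2) for j in range(2))

# Broadcasting: the smaller operand is (virtually) repeated along the
# missing axis, so a 1-d row combines with every row of the matrix.
row = np.array([100.0, 200.0])
shifted = M + row  # [[101, 202], [103, 204]]
```

No element of ``F`` depends on any coordinate other than its own ``[i, j]``, which is exactly what makes the operation "a scalar operation generalized along several dimensions".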
doc/index.txt

@@ -60,7 +60,6 @@ Community
 tutorial/index
 library/index
 extending/index
-indexes/index
 glossary
 links
 internal/index
doc/library/compile/io.txt

@@ -122,7 +122,7 @@ array(10.0)
 Advanced: Sharing Storage Between Functions
 -------------------------------------------
-``value`` can be a :api:`theano.gof.link.Container` as well as a literal.
+``value`` can be a :class:`Container` as well as a literal.
 This permits linking a value of a Variable in one function to the value of a Variable in another function.
 By using a ``Container`` as a value we can implement shared variables between functions.
doc/library/compile/mode.txt

@@ -26,8 +26,10 @@ environment variable 'THEANO_DEFAULT_MODE', which can in turn be overridden by
 setting ``theano.compile.mode.default_mode`` directly, which can in turn be
 overridden by passing the keyword argument to ``theano.function``.
-For a finer level of control over which optimizations are applied, and whether
-C or python implementations are used, read :api:`compile.mode.Mode`.
+.. TODO::
+
+For a finer level of control over which optimizations are applied, and whether
+C or Python implementations are used, read.... what exactly?
 Reference
doc/library/compile/module.txt

@@ -175,7 +175,7 @@ Using Inheritance
 A friendlier way to use Module is to implement your functionality as a
 subclass of Module:
-.. literalinclude:: ../examples/module/accumulator.py
+.. literalinclude:: ../../examples/module/accumulator.py
 This is just like the previous example except slightly fancier.
doc/library/gof/index.txt

@@ -4,3 +4,17 @@
 ================================================
 :mod:`gof` -- Theano Internals [doc TODO]
 ================================================
+
+.. module:: gof
+   :platform: Unix, Windows
+   :synopsis: Theano Internals
+.. moduleauthor:: LISA
+
+.. toctree::
+    :maxdepth: 1
+
+    env
+    toolbox
doc/library/tensor/basic.txt

@@ -7,20 +7,24 @@
 TensorType
 ==========
-.. class:: TensorType
+.. class:: TensorType(Type)
 .. attribute:: broadcastable
-.. _libdoc_tensor_variable
+.. attribute:: ndim
+
+.. attribute:: dtype
+
+.. _libdoc_tensor_variable:
 TensorVariable
 ==============
 .. class:: TensorVariable(_tensory_py_operators)
-.. _libdoc_tensor_constant
 TensorConstant
 ==============
 .. class:: TensorConstant(_tensory_py_operators)
+
+.. class:: TensorSharedVariable(_tensory_py_operators)
 .. _libdoc_tensor_creation:
@@ -362,9 +366,12 @@ Bit-wise
 The bitwise operators possess this interface:
 :Parameter: *a* - symbolic Tensor of integer type.
 :Parameter: *b* - symbolic Tensor of integer type.
-.. note:: The bit-wise not (invert) does not have this second parameter.
-:Return type: symbolic Tensor
+.. note::
+    The bit-wise not (invert) takes only one parameter.
+:Return type: symbolic Tensor with ``int8`` dtype.
 .. function:: and_(a, b)
@@ -382,13 +389,13 @@ The bitwise operators possess this interface:
 Returns a variable representing the result of the bitwise not.
-Here is an example using the bit-wise and_:
+Here is an example using the bit-wise ``and_`` via the ``&`` operator:
 .. code-block:: python
     import theano.tensor as T
     x,y = T.imatrices('x','y')
-    z = T.and_(x,y)
+    z = x & y
 Mathematical
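The updated example replaces ``T.and_(x, y)`` with the equivalent operator form ``x & y``. The elementwise bit-wise semantics are the same as NumPy's integer arrays, which gives a quick way to sanity-check the result without compiling a Theano function (this NumPy snippet is an illustration, not part of the commit):

```python
import numpy as np

# Two small integer "matrices", as in the doc's imatrices example.
x = np.array([[0b1100, 0b1010]], dtype=np.int32)
y = np.array([[0b1010, 0b0110]], dtype=np.int32)

# ``&`` applies the bit-wise and elementwise, like Theano's and_.
z = x & y
print(z)  # -> [[8 2]]   (0b1000 and 0b0010)
```

The operator form and the function form compute the same thing; ``&`` simply reads more naturally in expressions.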
doc/library/tensor/index.txt

+.. _libdoc_tensor:
 ==================================================
 :mod:`tensor` -- Types and Ops for Symbolic numpy
 ==================================================
doc/library/tensor/shared_randomstreams.txt

@@ -109,9 +109,10 @@ Reference
 .. method:: updates()
-:returns: a list of all the (state, new_state) update pairs from the
-random variables it has returned. This can be a convenient shortcut
-to enumerating all the random variables in a large graph in the
-``update`` paramter of function.
+:returns: a list of all the (state, new_state) update pairs from the
+random variables it has returned.
+This can be a convenient shortcut to enumerating all the random
+variables in a large graph in the ``update`` parameter of function.
 .. method:: seed(meta_seed)
doc/tutorial/index.txt

@@ -18,6 +18,9 @@ of Theano. Let's import that subpackage under a handy name. I like
 If that worked you're ready for the tutorial, otherwise check your
 installation (see :ref:`install`).
+Throughout the tutorial, bear in mind that there is a :ref:`glossary` to help
+you out.
 .. toctree::
 numpy
doc/tutorial/loading_and_saving.txt

-.. tutorial_loadsave:
+.. _tutorial_loadsave:
 ==================
 Loading and Saving
 ==================
doc/tutorial/symbolic_graphs.txt

@@ -76,7 +76,7 @@ x
 InplaceDimShuffle{x,x}.0
 Note that the second input is not 2 as we would have expected. This is
-because 2 was first :ref:`broadcasted <broadcasting>` to a matrix of
+because 2 was first :term:`broadcasted <broadcasting>` to a matrix of
 same shape as x. This is done by using the op ``DimShuffle`` :
 >>> type(y.owner.inputs[1])
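The broadcasting behaviour this hunk describes, the constant 2 being promoted to the shape of ``x`` before the elementwise op runs, follows the same rules as NumPy, so it can be illustrated without Theano (this snippet is an illustration, not part of the commit):

```python
import numpy as np

# The scalar 2 is broadcasted (Theano inserts a DimShuffle{x,x} for this)
# to x's matrix shape before the elementwise multiply.
x = np.array([[1, 2], [3, 4]], dtype=np.int32)
y = x * 2  # 2 behaves as if it were [[2, 2], [2, 2]]

assert y.shape == x.shape  # the result keeps x's shape
```

In the Theano graph the broadcast is explicit: the second input of the multiply node is the DimShuffled constant, which is why ``y.owner.inputs[1]`` is not the literal 2.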