testgroup / pytensor / Commits

Commit f761867c, authored Jan 18, 2010 by James Bergstra
merge
Parents: e427e884, fec1d238

Showing 6 changed files with 106 additions and 43 deletions (+106 -43).
doc/glossary.txt               +98  -33
doc/library/tensor/basic.txt    +0   -5
doc/library/tensor/signal.txt   +3   -0
doc/tutorial/adding.txt         +1   -1
doc/tutorial/examples.txt       +1   -1
doc/tutorial/modes.txt          +3   -3
doc/glossary.txt

@@ -6,9 +6,12 @@ Glossary of terminology

 .. glossary::

     Apply
-        WRITEME
+        Instances of :class:`Apply` represent the application of an :term:`Op`
+        to some input :term:`Variable`\ s to produce some output
+        :term:`Variable`\ s. They are like the application of a [symbolic]
+        mathematical function to some [symbolic] inputs.

-    broadcasting
+    Broadcasting
         Broadcasting is a mechanism which allows tensors with
         different numbers of dimensions to be added or multiplied
         together by (virtually) replicating the smaller tensor along

@@ -44,13 +47,17 @@ Glossary of terminology

     * `SciPy documentation about numpy's broadcasting <http://www.scipy.org/EricsBroadcastingDoc>`_
     * `OnLamp article about numpy's broadcasting <http://www.onlamp.com/pub/a/python/2000/09/27/numerically.html>`_

-    constant
-        WRITEME
+    Constant
+        A variable with an immutable value. For example, when you type
+
+        >>> x = tensor.ivector()
+        >>> y = x + 3
+
+        then a `constant` is created to represent the ``3`` in the graph.
+
+        See also: :class:`gof.Constant`

-    dynamic
-        WRITEME
-
-    elementwise
+    Elementwise
         An elementwise operation ``f`` on two matrices ``M`` and ``N``
         is one such that:

@@ -72,45 +79,103 @@ Glossary of terminology

     others. These operations are all instances of :api:`Elemwise
     <theano.tensor.elemwise.Elemwise>`.

-    graph
-        WRITEME
+    Expression Graph
+        A directed, acyclic set of connected :term:`Variable` and
+        :term:`Apply` nodes that express a symbolic functional relationship
+        between variables. You use Theano by defining expression graphs, and
+        then compiling them with :term:`theano.function`.
+
+        See also :term:`Variable`, :term:`Op`, :term:`Apply`, and
+        :term:`Type`, or read more about :ref:`tutorial_graphstructures`.

-    inplace
-        WRITEME
+    Destructive
+        An :term:`Op` is destructive (of particular input[s]) if its
+        computation requires that one or more inputs be overwritten or
+        otherwise invalidated. For example, :term:`inplace` Ops are
+        destructive. Destructive Ops can sometimes be faster than
+        non-destructive alternatives. Theano encourages users not to put
+        destructive Ops into graphs that are given to :term:`theano.function`,
+        but instead to trust the optimizations to insert destructive Ops
+        judiciously.
+
+        Destructive Ops are indicated via a ``destroy_map`` Op attribute.
+        (See :class:`gof.Op`.)

-    merge
-        WRITEME
+    Graph
+        See :term:`expression graph`.

-    op
-        WRITEME
+    Inplace
+        Inplace computations are computations that destroy their inputs as a
+        side-effect. For example, if you iterate over a matrix and double
+        every element, this is an inplace operation because when you are done,
+        the original input has been overwritten.

-    pure
-        WRITEME
+    Merge
+        A simple optimization in which redundant :term:`Apply` nodes are
+        combined. For example, in ``function([x, y], [(x+y)*2, (x+y)*3])`` the
+        merge optimization will ensure that ``x`` and ``y`` are only added once.

-    static
-        WRITEME
+    Op
+        The ``.op`` of an :term:`Apply`, together with its symbolic inputs,
+        fully determines what kind of computation will be carried out for that
+        ``Apply`` at run-time. Mathematical functions such as addition
+        (``T.add``) and indexing ``x[i]`` are Ops in Theano. Much of the
+        library documentation is devoted to describing the various Ops that
+        are provided with Theano, but you can add more.
+
+        See also :term:`Variable`, :term:`Type`, and :term:`Apply`,
+        or read more about :ref:`tutorial_graphstructures`.

-    type
-        See :ref:`tensortypes` or :ref:`type`.
+    Expression
+        See :term:`Apply`.
+
+    Storage
+        The memory that is used to store the value of a Variable. In most
+        cases storage is internal to a compiled function, but in some cases
+        (such as a :term:`constant` or a :term:`shared variable`) the storage
+        is not internal.
+
+    Shared Variable
+        A :term:`Variable` whose value may be shared between multiple
+        functions. See :func:`shared` and
+        :func:`theano.function <function.function>`.
+
+    theano.function
+        The interface for Theano's compilation from symbolic expression graphs
+        to callable objects. See :func:`function.function`.
+
+    Type
+        An instance that inherits from :class:`Type`, used as the ``.type``
+        attribute of a :term:`Variable`. The ``.type`` of a :term:`Variable`
+        indicates what kinds of values might be computed for it in a
+        compiled graph.
+
+        See also :term:`Variable`, :term:`Op`, and :term:`Apply`,
+        or read more about :ref:`tutorial_graphstructures`.

     Variable
-        A :ref:`Variable` is the main data structure you work with when
-        using Theano. The symbolic inputs that you operate on are
-        Variables and what you get from applying various operations to
-        these inputs are also Variables. For example, when I type
+        The main data structure you work with when using Theano.
+        For example,

         >>> x = theano.tensor.ivector()
         >>> y = -x**2

-        ``x`` and ``y`` are both Variables, i.e. instances of the
-        :api:`Variable <theano.gof.graph.Variable>` class. The
-        :term:`Type` of both ``x`` and ``y`` is
-        ``theano.tensor.ivector``.
-
-        For more information, see: :ref:`variable`.
+        ``x`` and ``y`` are both `Variables`, i.e. instances of the
+        :class:`Variable` class.
+
+        See also :term:`Type`, :term:`Op`, and :term:`Apply`,
+        or read more about :ref:`tutorial_graphstructures`.

-    view
-        WRITEME
+    View
+        Some Tensor Ops (such as Subtensor and Transpose) can be computed in
+        constant time by simply re-indexing their inputs. The outputs from
+        [the Apply instances from] such Ops are called `Views` because their
+        storage might be aliased to the storage of other variables (the inputs
+        of the Apply). It is important for Theano to know which Variables are
+        views of which other ones in order to introduce :term:`Destructive`
+        Ops correctly.
+
+        View Ops are indicated via a ``view_map`` Op attribute.
+        (See :class:`gof.Op`.)
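The Broadcasting and Elementwise entries above can be sketched in plain Python. This is only an illustration of the two properties the glossary describes; nested lists stand in for tensors, and neither function is part of Theano's actual API:

```python
# Elementwise: an operation f such that f(M, N)[i][j] == f(M[i][j], N[i][j]).
def elementwise(f, M, N):
    """Apply the scalar function f to corresponding entries of M and N."""
    return [[f(m, n) for m, n in zip(row_m, row_n)]
            for row_m, row_n in zip(M, N)]

# Broadcasting: (virtually) replicate the smaller operand -- here a
# length-n vector v -- along the rows of an m-by-n matrix M.
def broadcast_add(M, v):
    """Add v to every row of M, as if v had been tiled to M's shape."""
    return [[m + b for m, b in zip(row, v)] for row in M]

S = elementwise(lambda a, b: a + b, [[1, 2], [3, 4]], [[10, 20], [30, 40]])
assert S == [[11, 22], [33, 44]]

R = broadcast_add([[1, 2], [3, 4]], [10, 20])  # v is added to each row
assert R == [[11, 22], [13, 24]]
```

Theano's broadcasting rules follow NumPy's, which the glossary's two links describe in detail.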
doc/library/tensor/basic.txt

@@ -485,11 +485,6 @@ Linear Algebra

-Fourier Transforms
-==================
-
-[James has some code for this, but hasn't gotten it into the source tree yet.]
-
 Gradient / Differentiation
 ==========================
doc/library/tensor/signal.txt

@@ -17,3 +17,6 @@ TODO: Give examples for how to use these things! They are pretty complicated.

 .. function:: downsample2D(*todo)

 .. function:: fft(*todo)
+
+    [James has some code for this, but hasn't gotten it into the source tree yet.]
doc/tutorial/adding.txt

@@ -110,7 +110,7 @@ and giving ``z`` as output:

 >>> f = function([x, y], z)

-The first argument to :ref:`function <libdoc_compile_function>` is a list of Variables
+The first argument to :func:`function <function.function>` is a list of Variables
 that will be provided as inputs to the function. The second argument
 is a single Variable *or* a list of Variables. For either case, the second
 argument is what we want to see as output when we apply the function.
doc/tutorial/examples.txt

@@ -214,7 +214,7 @@ internal state, and returns the old state value.

 >>> accumulator = function([inc], state, updates=[(state, state+inc)])

 This code introduces a few new concepts. The ``shared`` function constructs
-so-called *shared variables*. These are hybrid symbolic and non-symbolic
+so-called :term:`shared variables`. These are hybrid symbolic and non-symbolic
 variables. Shared variables can be used in symbolic expressions just like
 the objects returned by ``dmatrices(...)`` but they also have a ``.value``
 property that defines the value taken by this symbolic variable in *all* the
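The accumulator in the hunk above returns the *old* state and then applies the ``(state, state+inc)`` update. Its semantics can be mimicked in plain Python; the ``SharedVariable`` class and ``make_accumulator`` helper here are hypothetical stand-ins, not Theano's API:

```python
class SharedVariable:
    """Minimal stand-in for a Theano shared variable: a mutable .value box."""
    def __init__(self, value):
        self.value = value

def make_accumulator(state):
    """Build a callable with the semantics of
    function([inc], state, updates=[(state, state + inc)])."""
    def accumulator(inc):
        old = state.value                 # the output: state *before* updating
        state.value = state.value + inc   # the (state, state + inc) update pair
        return old
    return accumulator

state = SharedVariable(0)
accumulator = make_accumulator(state)
assert accumulator(1) == 0    # returns the old state, then sets state to 1
assert accumulator(10) == 1   # returns 1, then sets state to 11
assert state.value == 11
```

As in Theano, the same ``state`` object could be handed to several such functions, which is what makes shared variables "shared".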
doc/tutorial/modes.txt

@@ -8,7 +8,7 @@ Using different compiling modes

 Mode
 ====

-Everytime :ref:`theano.function <libdoc_compile_function>` is called
+Every time :func:`theano.function <function.function>` is called
 the symbolic relationships between the input and output Theano *variables*
 are optimized and compiled. The way this compilation occurs
 is controlled by the value of the ``mode`` parameter.

@@ -25,7 +25,7 @@ The default mode is typically ``FAST_RUN``, but it can be controlled via

 the environment variable ``THEANO_DEFAULT_MODE``, which can in turn be
 overridden by setting `theano.compile.mode.default_mode` directly,
 which can in turn be overridden by passing the keyword argument to
-:ref:`theano.function <libdoc_compile_function>`.
+:func:`theano.function <function.function>`.

 ================= =============================================================== ===============================================================================
 short name        Full constructor                                                What does it do?

@@ -91,7 +91,7 @@ ProfileMode

 Beside checking for errors, another important task is to profile your
 code. For this Theano uses a special mode called ProfileMode which has
-to be passed as an argument to :ref:`theano.function <libdoc_compile_function>`.
+to be passed as an argument to :func:`theano.function <function.function>`.
 Using the ProfileMode is a three-step process.

 Creating a ProfileMode Instance