testgroup / pytensor · Commit 1aa76646
Authored Feb 01, 2016 by Francesco Visin

Merge graphstructures from tutorial into extending

Parent: eba65e5e
Showing 6 changed files with 4 additions and 188 deletions (+4 −188)
doc/extending/graphstructures.txt (+0 −0)
doc/extending/pics/symbolic_graph_opt.png (+0 −0)
doc/extending/pics/symbolic_graph_unopt.png (+0 −0)
doc/glossary.txt (+4 −4)
doc/tutorial/index.txt (+0 −1)
doc/tutorial/symbolic_graphs.txt (+0 −183)
doc/extending/graphstructures.txt @ 1aa76646 (diff collapsed)
doc/tutorial/pics/symbolic_graph_opt.png → doc/extending/pics/symbolic_graph_opt.png @ 1aa76646 (file moved)
doc/tutorial/pics/symbolic_graph_unopt.png → doc/extending/pics/symbolic_graph_unopt.png @ 1aa76646 (file moved)
doc/glossary.txt @ 1aa76646
@@ -63,7 +63,7 @@ Glossary
         then compiling them with :term:`theano.function`.

         See also :term:`Variable`, :term:`Op`, :term:`Apply`, and
-        :term:`Type`, or read more about :ref:`tutorial_graphstructures`.
+        :term:`Type`, or read more about :ref:`graphstructures`.

     Destructive
         An :term:`Op` is destructive (of particular input[s]) if its

@@ -108,7 +108,7 @@ Glossary
         are provided with Theano, but you can add more.

         See also :term:`Variable`, :term:`Type`, and :term:`Apply`,
-        or read more about :ref:`tutorial_graphstructures`.
+        or read more about :ref:`graphstructures`.

     Optimizer
         An instance of :class:`Optimizer`, which has the capacity to provide

@@ -141,7 +141,7 @@ Glossary
         ``.type`` attribute of a :term:`Variable`.

         See also :term:`Variable`, :term:`Op`, and :term:`Apply`,
-        or read more about :ref:`tutorial_graphstructures`.
+        or read more about :ref:`graphstructures`.

     Variable
         The main data structure you work with when using Theano.

@@ -153,7 +153,7 @@ Glossary
         ``x`` and ``y`` are both `Variables`, i.e. instances of the :class:`Variable` class.

         See also :term:`Type`, :term:`Op`, and :term:`Apply`,
-        or read more about :ref:`tutorial_graphstructures`.
+        or read more about :ref:`graphstructures`.

     View
         Some Tensor Ops (such as Subtensor and Transpose) can be computed in
doc/tutorial/index.txt @ 1aa76646

@@ -56,7 +56,6 @@ Advanced configuration and debugging
 .. toctree::

     modes
-    symbolic_graphs
     printing_drawing
     debug_faq
     nan_tutorial
doc/tutorial/symbolic_graphs.txt deleted (100644 → 0) @ eba65e5e
.. _tutorial_graphstructures:

================
Graph Structures
================

Theano Graphs
=============
Debugging or profiling Theano code is not simple if you do not know
what goes on under the hood. This chapter introduces the required
minimum of Theano's inner workings. For more detail see :ref:`extending`.
The first step in writing Theano code is to write down all mathematical
relations using symbolic placeholders (**variables**). When writing
these expressions you use operations such as ``+``, ``-``, ``**``,
``sum()``, and ``tanh()``. All of these are represented internally as
**ops**. An *op* represents a certain computation on some type of
inputs, producing some type of output. You can think of it as a
*function definition* in most programming languages.
Internally, Theano builds a graph structure composed of interconnected
**variable** nodes, **op** nodes, and **apply** nodes. An *apply* node
represents the application of an *op* to some *variables*. It is
important to distinguish between the definition of a computation,
represented by an *op*, and its application to some actual data,
represented by an *apply* node. For more detail about these building
blocks refer to :ref:`variable`, :ref:`op`, and :ref:`apply`. Here is
an example of a graph:
**Code**
.. testcode::

    import theano.tensor as T

    x = T.dmatrix('x')
    y = T.dmatrix('y')
    z = x + y
**Diagram**

.. _tutorial-graphfigure:

.. figure:: apply.png
    :align: center

    Interaction between instances of Apply (blue), Variable (red), Op (green),
    and Type (purple).

    .. # COMMENT
       WARNING: hyper-links and ref's seem to break the PDF build when placed
       into this figure caption.
       Arrows in this figure represent references to the
       Python objects pointed at. The blue
       box is an :ref:`Apply` node. Red boxes are :ref:`Variable` nodes. Green
       circles are :ref:`Ops <op>`. Purple boxes are :ref:`Types <type>`.
The graph can be traversed starting from the outputs (the result of some
computation) down to the inputs using the ``owner`` field.
Take for example the following code:
>>> import theano
>>> x = theano.tensor.dmatrix('x')
>>> y = x * 2.
If you enter ``type(y.owner)`` you get ``<class 'theano.gof.graph.Apply'>``,
which is the apply node that connects the op and the inputs to get this
output. You can now print the name of the op that is applied to get
*y*:
>>> y.owner.op.name
'Elemwise{mul,no_inplace}'
Hence, an elementwise multiplication is used to compute *y*. This
multiplication is done between the inputs:
>>> len(y.owner.inputs)
2
>>> y.owner.inputs[0]
x
>>> y.owner.inputs[1]
DimShuffle{x,x}.0
Note that the second input is not ``2`` as we might have expected. This is
because ``2`` was first :term:`broadcasted <broadcasting>` to a matrix of
the same shape as *x*, using the ``DimShuffle`` op:
>>> type(y.owner.inputs[1])
<class 'theano.tensor.var.TensorVariable'>
>>> type(y.owner.inputs[1].owner)
<class 'theano.gof.graph.Apply'>
>>> y.owner.inputs[1].owner.op # doctest: +SKIP
<theano.tensor.elemwise.DimShuffle object at 0x106fcaf10>
>>> y.owner.inputs[1].owner.inputs
[TensorConstant{2.0}]
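The same ``owner``-based traversal can be sketched with plain Python classes. The following is a simplified, hypothetical model of the structures involved (the class names mirror the concepts above, not Theano's actual ``theano.gof`` API):

```python
# A simplified, hypothetical model of Theano's graph structures (these
# are NOT the real theano.gof classes): a Variable remembers the Apply
# node that produced it via `owner`, and an Apply node points back at
# its inputs, so the graph can be walked from outputs to inputs.

class Op:
    """The definition of a computation (e.g. elementwise multiplication)."""
    def __init__(self, name):
        self.name = name

class Variable:
    """A node in the graph; `owner` is None for graph inputs."""
    def __init__(self, name, owner=None):
        self.name = name
        self.owner = owner

class Apply:
    """The application of an Op to specific input Variables."""
    def __init__(self, op, inputs, output_name):
        self.op = op
        self.inputs = inputs
        self.output = Variable(output_name, owner=self)

def ancestors(var):
    """Collect variable names walking from an output back to the inputs."""
    names = [var.name]
    if var.owner is not None:
        for inp in var.owner.inputs:
            names.extend(ancestors(inp))
    return names

# Build the graph for y = x * 2.0 by hand, as Theano would internally.
x = Variable("x")
two = Variable("2.0")
mul = Apply(Op("mul"), [x, two], "y")
print(ancestors(mul.output))  # ['y', 'x', '2.0']
```

Note how the traversal stops at ``x`` and ``2.0`` because their ``owner`` is ``None``, exactly as ``y.owner.inputs[0]`` bottoms out at the input matrix in the doctest above.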
Starting from this graph structure it is easier to understand how
*automatic differentiation* proceeds and how the symbolic relations
can be *optimized* for performance or stability.
Automatic Differentiation
=========================
Given the graph structure, computing automatic differentiation is
simple. The only thing :func:`tensor.grad` has to do is traverse the
graph from the outputs back towards the inputs through all *apply*
nodes (*apply* nodes are those that define which computations the
graph performs). For each such *apply* node, its *op* defines how to
compute the *gradient* of the node's outputs with respect to its
inputs. Note that if an *op* does not provide this information, the
*gradient* is assumed to be undefined.
Using the
`chain rule <http://en.wikipedia.org/wiki/Chain_rule>`_,
these gradients can be composed to obtain the expression for the
*gradient* of the graph's output with respect to the graph's inputs.
A later section of this tutorial examines :ref:`differentiation<tutcomputinggrads>`
in greater detail.
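The backwards traversal and chain-rule composition can be sketched in a few lines of plain Python. This is a hypothetical stand-in over a tuple-based expression tree, not Theano's actual :func:`tensor.grad` machinery:

```python
# A minimal reverse-mode differentiation sketch over a tuple-based
# expression tree (a hypothetical stand-in, not Theano's tensor.grad):
# each op kind contributes its local gradient, and grad() composes them
# with the chain rule while traversing from the output backwards.

def value(node):
    """Evaluate the expression tree."""
    if node[0] == "var":
        return node[2]
    if node[0] == "add":
        return value(node[1]) + value(node[2])
    if node[0] == "mul":
        return value(node[1]) * value(node[2])
    raise ValueError("unknown op: " + node[0])

def grad(node, wrt, upstream=1.0):
    """Accumulate d(output)/d(wrt), traversing apply nodes backwards."""
    kind = node[0]
    if kind == "var":
        return upstream if node[1] == wrt else 0.0
    if kind == "add":   # d(a + b) = da + db
        return grad(node[1], wrt, upstream) + grad(node[2], wrt, upstream)
    if kind == "mul":   # chain rule: d(a * b) = b*da + a*db
        a, b = node[1], node[2]
        return (grad(a, wrt, upstream * value(b)) +
                grad(b, wrt, upstream * value(a)))
    # an op with no gradient rule means the gradient is undefined
    raise ValueError("op with no gradient defined: " + kind)

# y = x*x + x at x = 3.0, so dy/dx = 2x + 1 = 7
x = ("var", "x", 3.0)
y = ("add", ("mul", x, x), x)
print(grad(y, "x"))  # 7.0
```

The ``raise`` in the last branch mirrors the point made above: if an op does not define its local gradient, differentiation through it is simply undefined.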
Optimizations
=============
When compiling a Theano function, what you give to
:func:`theano.function <function.function>` is actually a graph
(starting from the output variables you can traverse the graph back to
the input variables). While this graph structure shows how to compute
the output from the input, it also offers the possibility of improving
the way this computation is carried out. Optimizations in Theano work
by identifying certain patterns in the graph and replacing them with
specialized patterns that produce the same results but are faster or
more stable. Optimizations can also detect identical subgraphs to
ensure that the same values are not computed twice, or reformulate
parts of the graph for GPU-specific execution. For example, one
(simple) optimization that Theano uses is to replace the pattern
:math:`\frac{xy}{y}` by :math:`x`.
Further information about the optimization :ref:`process<optimization>`
and the specific :ref:`optimizations<optimizations>` that are applied
is available in the library documentation and on the documentation's
entry page.
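To make the pattern-replacement idea concrete, here is a toy rewriter (a sketch in plain Python over nested tuples, not Theano's actual ``Optimizer`` framework) that performs exactly the :math:`\frac{xy}{y} \rightarrow x` rewrite mentioned above:

```python
# Toy graph rewriter (hypothetical, not Theano's Optimizer classes):
# expressions are nested ("op", left, right) tuples, and simplify()
# replaces the pattern (x * y) / y with x, recursing into
# subexpressions first so nested matches are also found.

def simplify(expr):
    if not isinstance(expr, tuple):
        return expr                      # a leaf variable or constant
    op, a, b = expr
    a, b = simplify(a), simplify(b)      # rewrite children first
    if op == "div" and isinstance(a, tuple) and a[0] == "mul":
        _, x, y = a
        if y == b:                       # matched (x * y) / y  ->  x
            return x
    return (op, a, b)

# ((a * b) / b) + c  simplifies to  a + c
expr = ("add", ("div", ("mul", "a", "b"), "b"), "c")
print(simplify(expr))  # ('add', 'a', 'c')
```

Theano's real optimizer works on Apply/Variable graphs rather than tuples, and applies many such local rewrites repeatedly until the graph stabilizes, but the match-and-substitute structure is the same.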
**Example**
Symbolic programming involves a change of paradigm: it will become clearer
as we apply it. Consider the following example of optimization:
>>> import theano
>>> a = theano.tensor.vector("a") # declare symbolic variable
>>> b = a + a ** 10 # build symbolic expression
>>> f = theano.function([a], b) # compile function
>>> print(f([0, 1, 2])) # prints `array([0,2,1026])`
[ 0. 2. 1026.]
>>> theano.printing.pydotprint(b, outfile="./pics/symbolic_graph_unopt.png", var_with_name_simple=True) # doctest: +SKIP
The output file is available at ./pics/symbolic_graph_unopt.png
>>> theano.printing.pydotprint(f, outfile="./pics/symbolic_graph_opt.png", var_with_name_simple=True) # doctest: +SKIP
The output file is available at ./pics/symbolic_graph_opt.png
.. |g1| image:: ./pics/symbolic_graph_unopt.png
    :width: 500 px

.. |g2| image:: ./pics/symbolic_graph_opt.png
    :width: 500 px
We used :func:`theano.printing.pydotprint` to visualize the optimized graph
(right), which is much more compact than the unoptimized graph (left).
====================================================== =====================================================
Unoptimized graph Optimized graph
====================================================== =====================================================
|g1| |g2|
====================================================== =====================================================