testgroup / pytensor · Commit 261bfadf
authored Oct 21, 2011 by nouiz
Merge pull request #137 from pascanur/op_documentations
Op documentation
Parents: 3b4d5e68 668a1cee
Showing 2 changed files with 62 additions and 38 deletions
doc/cifarSC2011/extending_theano.txt (+27, -25)
doc/extending/op.txt (+35, -13)
doc/cifarSC2011/extending_theano.txt
...
@@ -10,8 +10,8 @@ Theano graphs
- Theano works with symbolic graphs
- Those graphs are bipartite graphs (graphs with 2 types of nodes)
- The 2 types are Apply nodes and Variable nodes
- Apply nodes have a link to the Op they execute

Inputs and Outputs are lists of Theano variables
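The bipartite structure above can be sketched in plain Python. These are hypothetical minimal classes, not Theano's actual implementation (the real ``theano.gof.Variable`` and ``theano.gof.Apply`` carry much more state):

```python
# Hypothetical minimal sketch of the two node types in a Theano-style graph.

class Variable:
    def __init__(self, name=None):
        self.name = name
        self.owner = None  # the Apply node that produced this variable, if any

class Apply:
    def __init__(self, op, inputs, outputs):
        self.op = op          # the Op this node executes
        self.inputs = inputs  # list of Variable
        self.outputs = outputs
        for out in outputs:
            out.owner = self  # link each output back to its Apply node

# Build a tiny graph for z = add(x, y)
x, y, z = Variable("x"), Variable("y"), Variable("z")
node = Apply("add", [x, y], [z])
```

Walking from ``z`` through ``z.owner`` reaches the Apply node, whose ``inputs`` lead back to other Variables: the two node types strictly alternate.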
...
@@ -50,33 +50,35 @@ Op contract
.. ../extending/op.txt

There are 2 mandatory methods that one needs to implement. The first one
is :func:`make_node`. The second one describes the computations that are
required at run time. Currently there are 2 different possibilities:
implement :func:`perform` and/or :func:`c_code <Op.c_code>` (and other
related :ref:`c methods <cop>`), or the :func:`make_thunk` method.
``perform`` lets you easily wrap an existing python function into Theano.
``c_code`` and the related methods let the op generate c code that Theano
will compile and link. On the other hand, the ``make_thunk`` method will
be called only once during compilation and should generate a ``thunk``: a
standalone function that, when called, performs the wanted computations.
This is useful if you want to generate code and compile it yourself. For
example, this allows you to use PyCUDA to compile GPU code.

There are also 2 methods that are highly recommended to be implemented.
They are needed in order to merge duplicate computations involving your
op. So if you do not want Theano to execute your op multiple times with
the same inputs, do implement them. Those methods are :func:`__eq__` and
:func:`__hash__`.

The :func:`infer_shape` method allows Theano to infer the shape of a
variable somewhere in the middle of the computational graph without
actually computing the outputs (when possible). This is helpful if one
only needs the shape of the output instead of the actual outputs.

The :func:`grad` method is required if you want to differentiate some
cost whose expression includes your op.

:func:`__str__` is useful to generate a better name for your op when
printing.

:func:`R_op` is needed if you want theano.tensor.Rop to work with your op.
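To see why ``__eq__`` and ``__hash__`` enable the merge optimization, here is a plain-Python sketch (an assumption for illustration, not Theano's real merge pass): two op instances constructed with the same parameters compare equal, so a cache keyed on (op, inputs) computes the result only once.

```python
# Sketch of duplicate-computation merging via __eq__/__hash__
# (hypothetical cache, not Theano's actual optimizer).

class ScalarMul:
    def __init__(self, factor):
        self.factor = factor

    def __eq__(self, other):
        return type(self) == type(other) and self.factor == other.factor

    def __hash__(self):
        return hash((type(self), self.factor))

    def perform(self, x):
        return self.factor * x

calls = 0
cache = {}

def apply_op(op, x):
    global calls
    key = (op, x)  # works because ScalarMul hashes by its parameters
    if key not in cache:
        calls += 1
        cache[key] = op.perform(x)
    return cache[key]

a = apply_op(ScalarMul(3), 10)
b = apply_op(ScalarMul(3), 10)  # a distinct instance, but equal: merged
```

Without ``__eq__``/``__hash__``, the two ``ScalarMul(3)`` instances would hash by identity, the cache would miss, and the computation would run twice.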
...
doc/extending/op.txt
...
@@ -142,13 +142,14 @@ following methods:
Optional.

This function is needed for the shape optimization. ``shapes`` is a
list with one tuple for each input of the Apply node (which corresponds
to the inputs of the op). Each tuple contains 1 element for each
dimension of the corresponding input. The value is the shape (number of
elements) along the corresponding dimension of that specific input.

While this might sound complicated, it is nothing more than the shape
of each input expressed as symbolic variables (one per dimension).

The function should return a list with one tuple for each output.
Each tuple should contain the corresponding output's shape.
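For an elementwise op, the contract is trivial: the output shape is the input shape. A minimal sketch, with plain ints standing in for the symbolic scalars Theano would actually pass:

```python
# Sketch of the infer_shape contract for a hypothetical elementwise op.
# In real Theano each shape entry is a symbolic scalar; ints keep the
# example self-contained.

def infer_shape_elemwise(node, shapes):
    # `shapes`: one tuple per input, one entry per dimension.
    # An elementwise op's single output has the shape of its first input.
    return [shapes[0]]

# e.g. a 2-input elementwise op applied to two (3, 4) matrices:
out_shapes = infer_shape_elemwise(None, [(3, 4), (3, 4)])
```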
...
@@ -161,9 +162,30 @@ following methods:
Optional.

This function implements the application of the R-operator to the
function represented by your op. Assume that function is :math:`f`,
with input :math:`x`; applying the R-operator means computing the
Jacobian of :math:`f` and right-multiplying it by :math:`v`, the
evaluation point, namely: :math:`\frac{\partial f}{\partial x} v`.

``inputs`` are the symbolic variables corresponding to the value of
the input where you want to evaluate the jacobian, and ``eval_points``
are the symbolic variables corresponding to the value you want to
right-multiply the jacobian with.

The same conventions as for the :func:`grad` method hold. If your op is
not differentiable, you can return None. Note that in contrast to
:func:`grad`, for :func:`R_op` you need to return the same number of
outputs as there are outputs of the op. You can think of it in the
following terms. You have all your inputs concatenated into a single
vector :math:`x`. You do the same with the evaluation points (which are
as many as the inputs and have the same shapes) and obtain another
vector :math:`v`. For each output, you reshape it into a vector,
compute the jacobian of that vector with respect to :math:`x`, and
multiply it by :math:`v`. As a last step you reshape each of these
vectors (which have the same shapes as the outputs) back to their
corresponding shapes and return them as the output of the :func:`R_op`
method.
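A concrete instance of :math:`\frac{\partial f}{\partial x} v`: for the elementwise function :math:`f(x) = x^2`, the Jacobian is :math:`\mathrm{diag}(2x)`, so right-multiplying it by :math:`v` gives :math:`2 x v` elementwise. A plain-Python sketch (hypothetical, not using Theano's symbolic machinery):

```python
# Sketch: the R-operator applied to the elementwise function f(x) = x**2.
# Its Jacobian is diag(2*x), so (df/dx) v is just 2*x*v elementwise.

def f(xs):
    return [x * x for x in xs]

def R_op(xs, vs):
    # one returned value per output of f, with that output's shape
    return [2 * x * v for x, v in zip(xs, vs)]

x = [1.0, 2.0, 3.0]
v = [0.5, 0.5, 0.5]
jvp = R_op(x, v)  # [1.0, 2.0, 3.0]
```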
.. attribute:: default_output
...
@@ -180,15 +202,15 @@ following methods:
Syntactic shortcut to make_node which returns the output
Variables of the Op.

*Default:* this is implemented in the parent class and you do not need
to change it.

.. function:: __str__()

*Default:* python default: module_path_to_your_class.CLASSNAME

This allows for better printing of the Op. If the Op is parameterizable,
it is highly recommended to implement this method and have it show the
values of the different parameters in the instance's name.

At a bare minimum, a new Op must define ``make_node`` and ``perform``, which have no defaults.
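As a small sketch of the ``__str__`` recommendation above (a hypothetical parameterized op, not one from Theano):

```python
# Sketch: a __str__ that includes the op's parameter values, so graph
# printouts name each instance meaningfully.

class Pow:
    def __init__(self, exponent):
        self.exponent = exponent

    def __str__(self):
        return "Pow{exponent=%s}" % self.exponent

print(Pow(2))  # prints: Pow{exponent=2}
```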
...