testgroup / pytensor · Commits · 8feaa75a

Commit 8feaa75a, authored Aug 14, 2015 by Iban Harlouchet
Parent: 1077f41d

    numpydoc for theano/compile/ops.py

Showing 1 changed file, theano/compile/ops.py, with 148 additions and 92 deletions.
"""This file contains auxiliary Ops, used during the compilation phase
"""
and Ops building class (:class:`FromFunctionOp`) and decorator
This file contains auxiliary Ops, used during the compilation phase and Ops
(:func:`as_op`) that help make new Ops more rapidly.
building class (:class:`FromFunctionOp`) and decorator (:func:`as_op`) that
help make new Ops more rapidly.
"""
"""
import
copy
import
copy
...
@@ -18,14 +19,19 @@ import numpy

 def register_view_op_c_code(type, code, version=()):
-    """ Tell ViewOp how to generate C code for a Theano Type
+    """
+    Tell ViewOp how to generate C code for a Theano Type.

-    :param type: A Theano type. It must be the Theano class itself and not an
-                 instance of the class.
-    :param code: C code that returns a view for the Theano type 'type'.
-                 Use %(iname)s and %(oname)s for the input and output C
-                 variable names respectively.
-    :param version: A number indicating the version of the code, for cache.
+    Parameters
+    ----------
+    type : Theano type
+        It must be the Theano class itself and not an instance of the class.
+    code : C code
+        Returns a view for the Theano type 'type'. Use %(iname)s and %(oname)s
+        for the input and output C variable names respectively.
+    version
+        A number indicating the version of the code, for cache.
+
     """
     ViewOp.c_code_and_version[type] = (code, version)
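Every `register_*_c_code` helper in this commit follows the same pattern: a per-Op class attribute `c_code_and_version` maps a Theano type *class* to a `(code, version)` pair that the Op's C-code generator later consults. A minimal pure-Python sketch of that pattern (the `ViewOpSketch` and `FakeType` names and the C snippet are illustrative placeholders, not Theano code):

```python
# Minimal sketch of the (type -> (code, version)) registration pattern
# used by register_view_op_c_code and friends. Not Theano code.

class ViewOpSketch:
    # Maps a type *class* (not an instance) to (C code template, version).
    c_code_and_version = {}

def register_view_op_c_code(type, code, version=()):
    """Tell the sketch ViewOp how to generate C code for a given type."""
    ViewOpSketch.c_code_and_version[type] = (code, version)

class FakeType:
    """Illustrative stand-in for a Theano Type class."""

register_view_op_c_code(
    FakeType,
    "%(oname)s = %(iname)s;",  # template: output is a view of the input
    version=(1,),
)

# At code-generation time the Op looks up the template and fills in
# the input/output C variable names.
code, version = ViewOpSketch.c_code_and_version[FakeType]
print(code % {"iname": "x", "oname": "y"})  # -> y = x;
```

Registering against the class rather than an instance is what lets subclasses and multiple instances of the same type share one C implementation.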
...
@@ -33,7 +39,9 @@ def register_view_op_c_code(type, code, version=()):

 class ViewOp(gof.Op):
     """
     Returns an inplace view of the input. Used internally by Theano.
     """
     view_map = {0: [0]}

     # Mapping from Type to C code (and version) to use.
     # In the C code, the name of the input variable is %(iname)s,
...
@@ -96,9 +104,9 @@ class OutputGuard(ViewOp):

     Only the AddDestroyHandler optimizer tries to insert them in the graph.

     This Op is declared as destructive while it is not destroying
-    anything.
-    It returns a view. This is used to prevent destruction of the output
+    anything. It returns a view. This is used to prevent destruction of
+    the output
     variables of a Theano function.

     There is a mechanism in Theano that should prevent this, but the use
     of OutputGuard adds a safeguard: it may be possible for some optimization
...
@@ -106,6 +114,7 @@ class OutputGuard(ViewOp):
     making in-place optimizations.

     TODO: find a current full explanation.
+
     """
     destroy_map = {0: [0]}
...
@@ -115,14 +124,19 @@ _output_guard = OutputGuard()

 def register_deep_copy_op_c_code(typ, code, version=()):
-    """ Tell DeepCopyOp how to generate C code for a Theano Type
+    """
+    Tell DeepCopyOp how to generate C code for a Theano Type.

-    :param typ: A Theano type. It must be the Theano class itself and not an
-                instance of the class.
-    :param code: C code that deep copies the Theano type 'typ'.
-                 Use %(iname)s and %(oname)s for the input and output C
-                 variable names respectively.
-    :param version: A number indicating the version of the code, for cache.
+    Parameters
+    ----------
+    typ : Theano type
+        It must be the Theano class itself and not an instance of the class.
+    code: C code
+        Deep copies the Theano type 'typ'. Use %(iname)s and %(oname)s for the
+        input and output C variable names respectively.
+    version
+        A number indicating the version of the code, for cache.
+
     """
     DeepCopyOp.c_code_and_version[typ] = (code, version)
...
@@ -189,15 +203,20 @@ deep_copy_op = DeepCopyOp()

 def register_shape_c_code(type, code, version=()):
-    """ Tell Shape Op how to generate C code for a Theano Type
+    """
+    Tell Shape Op how to generate C code for a Theano Type.

-    :param typ: A Theano type. It must be the Theano class itself and not an
-                instance of the class.
-    :param code: C code that return a vector representing the shape
-                 for the Theano type 'typ'.
-                 Use %(iname)s and %(oname)s for the input and output C
-                 variable names respectively.
-    :param version: A number indicating the version of the code, for cache.
+    Parameters
+    ----------
+    typ : Theano type
+        It must be the Theano class itself and not an instance of the class.
+    code : C code
+        Returns a vector representing the shape for the Theano type 'typ'.
+        Use %(iname)s and %(oname)s for the input and output C variable names
+        respectively.
+    version
+        A number indicating the version of the code, for cache.
+
     """
     Shape.c_code_and_version[type] = (code, version)
...
@@ -206,8 +225,12 @@ class Shape(gof.Op):
     """
     L{Op} to return the shape of a matrix.

-    @note: Non-differentiable.
+    Notes
+    -----
+    Non-differentiable.
+
     """
     _f16_ok = True

     # Mapping from Type to C code (and version) to use.
...
@@ -293,8 +316,12 @@ class Shape_i(gof.Op):
     """
     L{Op} to return the shape of a matrix.

-    @note: Non-differentiable.
+    Notes
+    -----
+    Non-differentiable.
+
     """
     _f16_ok = True

     # Mapping from Type to C code (and version) to use.
...
@@ -381,18 +408,24 @@ class Shape_i(gof.Op):

 def shape_i(var, i, fgraph=None):
-    """Equivalent of var.shape[i], but apply if possible the shape
-    feature optimization
+    """
+    Equivalent of var.shape[i], but apply if possible the shape feature
+    optimization.

     This is useful in optimization that need to get the shape. This
     remove the need of the following shape_feature optimization that
     convert it. So this speed up optimization and remove Equilibrium
     max iteration problems.

-    :param var: the variable we want to take the shape of
-    :param i: The shape dimensions we want
-    :param fgraph: optional. If var.fgraph do not exist, the fgraph that
-        have the shape_feature to introduce var in to get the optimized shape.
+    Parameters
+    ----------
+    var
+        The variable we want to take the shape of.
+    i
+        The shape dimensions we want
+    fgraph : optional
+        If var.fgraph do not exist, the fgraph that have the shape_feature to
+        introduce var in to get the optimized shape.
+
     """
     if fgraph is None and hasattr(var, 'fgraph'):
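The control flow the docstring describes — prefer the graph's shape feature, fall back to the equivalent of `var.shape[i]` — can be sketched in pure Python. The `Graph` and `Var` classes here are illustrative stand-ins for Theano's `FunctionGraph` and variables, and the callable shape feature is a simplification of the real `ShapeFeature` interface:

```python
# Sketch of shape_i's fallback logic; Graph/Var are illustrative
# stand-ins, not Theano classes.

class Graph:
    def __init__(self, shape_feature=None):
        self.shape_feature = shape_feature

class Var:
    def __init__(self, shape, fgraph=None):
        self.shape = shape
        self.fgraph = fgraph

def shape_i(var, i, fgraph=None):
    # Prefer the graph the variable already belongs to.
    if fgraph is None and getattr(var, "fgraph", None) is not None:
        fgraph = var.fgraph
    # If a shape feature is available, ask it for the optimized shape...
    if fgraph is not None and fgraph.shape_feature is not None:
        return fgraph.shape_feature(var)[i]
    # ...otherwise fall back to the equivalent of var.shape[i].
    return var.shape[i]

cached = lambda v: (3, 4)           # pretend the feature knows the shape
v = Var(shape=(3, 4), fgraph=Graph(shape_feature=cached))
print(shape_i(v, 1))                # -> 4
print(shape_i(Var(shape=(5,)), 0))  # -> 5  (fallback path)
```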
...
@@ -421,15 +454,20 @@ def shape_i(var, i, fgraph=None):

 def register_shape_i_c_code(typ, code, check_input, version=()):
-    """ Tell Shape_i how to generate C code for a Theano Type
+    """
+    Tell Shape_i how to generate C code for a Theano Type.

-    :param typ: A Theano type. It must be the Theano class itself and not
-                an instance of the class.
-    :param code: C code that gets the shape of dimensions %(i)s for the
-                 Theano type 'typ'.
-                 Use %(iname)s and %(oname)s for the input and output C
-                 variable names respectively.
-    :param version: A number indicating the version of the code, for cache.
+    Parameters
+    ----------
+    typ : Theano type
+        It must be the Theano class itself and not an instance of the class.
+    code : C code
+        Gets the shape of dimensions %(i)s for the Theano type 'typ'.
+        Use %(iname)s and %(oname)s for the input and output C variable names
+        respectively.
+    version
+        A number indicating the version of the code, for cache.
+
     """
     Shape_i.c_code_and_version[typ] = (code, check_input, version)
...
@@ -459,6 +497,7 @@ class FromFunctionOp(gof.Op):
     Also the gradient is undefined in the resulting op and Theano will
     raise an error if you attempt to get the gradient of a graph
     containing this op.
+
     """
     def __init__(self, fn, itypes, otypes, infer_shape):
...
@@ -519,29 +558,29 @@ class FromFunctionOp(gof.Op):

 def as_op(itypes, otypes, infer_shape=None):
     """
     Decorator that converts a function into a basic Theano op that will call
     the supplied function as its implementation.

     It takes an optional infer_shape parameter that should be a callable with
     this signature:

         def infer_shape(node, input_shapes):
             ...
             return output_shapes

-    Here `input_shapes` and `output_shapes` are lists of tuples that
-    represent the shape of the corresponding inputs/outputs.
+    Here `input_shapes` and `output_shapes` are lists of tuples that represent
+    the shape of the corresponding inputs/outputs.

-    This should not be used when performance is a concern since the
-    very basic nature of the resulting Op may interfere with certain
-    graph optimizations.
+    This should not be used when performance is a concern since the very basic
+    nature of the resulting Op may interfere with certain graph optimizations.

-    Example usage:
-
+    Examples
+    --------
     @as_op(itypes=[theano.tensor.fmatrix, theano.tensor.fmatrix],
            otypes=[theano.tensor.fmatrix])
     def numpy_dot(a, b):
         return numpy.dot(a, b)

     """
     if not isinstance(itypes, (list, tuple)):
         itypes = [itypes]
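The visible tail of this hunk shows `as_op` normalizing a single type into a list before building the op. A pure-Python analogue of the decorator's mechanics — normalize the declared types, attach them to the wrapped function — can be sketched as follows (the `"fmatrix"` strings and the hand-rolled matrix multiply are illustrative stand-ins, since the real decorator takes Theano type objects and the real example calls `numpy.dot`):

```python
# Illustrative pure-Python analogue of the as_op decorator's mechanics.
# Not the real implementation, which builds a FromFunctionOp.

def as_op(itypes, otypes, infer_shape=None):
    # Accept a bare type as shorthand for a one-element list, as the
    # real decorator does.
    if not isinstance(itypes, (list, tuple)):
        itypes = [itypes]
    if not isinstance(otypes, (list, tuple)):
        otypes = [otypes]

    def decorator(fn):
        # Attach the declared interface to the wrapped function.
        fn.itypes = list(itypes)
        fn.otypes = list(otypes)
        fn.infer_shape = infer_shape
        return fn
    return decorator

@as_op(itypes=["fmatrix", "fmatrix"], otypes="fmatrix")
def numpy_dot(a, b):
    # Stand-in body; the docstring's example returns numpy.dot(a, b).
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)]
            for row in a]

print(numpy_dot.itypes, numpy_dot.otypes)  # -> ['fmatrix', 'fmatrix'] ['fmatrix']
print(numpy_dot([[1, 2]], [[3], [4]]))     # -> [[11]]
```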
...
@@ -565,18 +604,19 @@ def as_op(itypes, otypes, infer_shape=None):

 def register_rebroadcast_c_code(typ, code, version=()):
-    """Tell Rebroadcast how to generate C code for a Theano Type
+    """
+    Tell Rebroadcast how to generate C code for a Theano Type.

-    :param typ: A Theano type. It must be the Theano class itself and not an
-                instance of the class.
-    :param code: C code that checks if the dimension %(axis)s is of
-                 shape 1 for the Theano type 'typ'. Use %(iname)s and
-                 %(oname)s for the input and output C variable names
-                 respectively, and %(axis)s for the axis that we need to
-                 check. This code is put in a loop for all axes.
-    :param version: A number indicating the version of the code, for cache.
+    typ : Theano type
+        It must be the Theano class itself and not an instance of the class.
+    code : C code
+        That checks if the dimension %(axis)s is of shape 1 for the Theano type
+        'typ'. Use %(iname)s and %(oname)s for the input and output C variable
+        names respectively, and %(axis)s for the axis that we need to check.
+        This code is put in a loop for all axes.
+    version
+        A number indicating the version of the code, for cache.
+
     """
     Rebroadcast.c_code_and_version[typ] = (code, version)
...
@@ -585,17 +625,23 @@ class Rebroadcast(gof.Op):
     """
     Change the input's broadcastable fields in some predetermined way.

-    :code:`Rebroadcast((0, True), (1, False))(x)` would make :code:`x`
-    broadcastable in axis 0 and not broadcastable in axis 1
+    See Also
+    --------
+    unbroadcast <theano.tensor.unbroadcast>
+    addbroadcast <theano.tensor.addbroadcast>
+    patternbroadcast <theano.tensor.patternbroadcast>

-    .. seealso::
-        :func:`unbroadcast <theano.tensor.unbroadcast>`
-        :func:`addbroadcast <theano.tensor.addbroadcast>`
-        :func:`patternbroadcast <theano.tensor.patternbroadcast>`
+    Notes
+    -----
+    Works inplace and works for CudaNdarrayType.

-    ..note: works inplace and works for CudaNdarrayType
+    Example
+    -------
+    `Rebroadcast((0, True), (1, False))(x)` would make `x` broadcastable in
+    axis 0 and not broadcastable in axis 1.
+
     """
     view_map = {0: [0]}
     _f16_ok = True

     # Mapping from Type to C code (and version) to use.
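The Rebroadcast contract in the docstring above — axes marked `True` must have length 1 (and become broadcastable), axes marked `False` become non-broadcastable — can be illustrated on plain tuples of broadcastable flags. This is a semantic sketch only, not Theano's implementation, and the per-axis length check mirrors what the registered C code does in its loop over axes:

```python
# Sketch of Rebroadcast((axis, flag), ...)'s effect on broadcastable
# flags. Illustrative only; not Theano code.

def rebroadcast(broadcastable, shape, *axis_pattern):
    """Return updated broadcastable flags for the given axis pattern."""
    new_flags = list(broadcastable)
    for axis, flag in axis_pattern:
        # An axis can only be made broadcastable if its length is 1,
        # which is the check the registered C code performs per axis.
        if flag and shape[axis] != 1:
            raise ValueError(
                "cannot make axis %d broadcastable: length %d != 1"
                % (axis, shape[axis]))
        new_flags[axis] = flag
    return tuple(new_flags)

# Analogue of Rebroadcast((0, True), (1, False))(x) for a (1, 5) tensor:
print(rebroadcast((False, False), (1, 5), (0, True), (1, False)))
# -> (True, False)
```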
...
@@ -717,17 +763,23 @@ class Rebroadcast(gof.Op):

 def register_specify_shape_c_code(typ, code, version=(),
                                   c_support_code_apply=None):
-    """ Tell SpecifyShape how to generate C code for a Theano Type
+    """
+    Tell SpecifyShape how to generate C code for a Theano Type.

-    :param typ: A Theano type. It must be the Theano class itself and
-                not an instance of the class.
-    :param code: C code that checks the shape and returns a view for
-                 the Theano type 'typ'. Use %(iname)s and %(oname)s
-                 for the input and output C variable names
-                 respectively. %(shape)s is the vector of shape of
-                 %(iname)s. Check that its length is good.
-    :param version: A number indicating the version of the code, for cache.
-    :param c_support_code_apply: extra code.
+    Parameters
+    ----------
+    typ : Theano type
+        It must be the Theano class itself and not an instance of the class.
+    code : C code
+        Checks the shape and returns a view for the Theano type 'typ'.
+        Use %(iname)s and %(oname)s for the input and output C variable names
+        respectively. %(shape)s is the vector of shape of %(iname)s.
+        Check that its length is good.
+    version
+        A number indicating the version of the code, for cache.
+    c_support_code_apply
+        Extra code.
+
     """
     SpecifyShape.c_code_and_version[typ] = (code, version,
                                             c_support_code_apply)
...
@@ -742,12 +794,16 @@ class SpecifyShape(gof.Op):
     the case most of the time if we only take the shape of the output.
     Maybe there are other optimizations that will mess with this.

-    @note: Maybe in the future we will never do the assert!
-    @note: We currently don't support specifying partial shape information.
+    Notes
+    -----
+    Maybe in the future we will never do the assert!

-    @todo: test this op with sparse and cuda ndarray.
-           Do C code for them too.
+    We currently don't support specifying partial shape information.
+
+    TODO : test this op with sparse and cuda ndarray. Do C code for them too.
+
     """
     view_map = {0: [0]}

     # Mapping from Type to C code (and version) to use.
     # In the C code, the name of the input variable is %(iname)s,
...
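SpecifyShape's runtime behavior — assert that the value has the declared shape, then return the value itself as a view (hence the `view_map = {0: [0]}` above) — can be sketched without Theano. The `Tensor` namedtuple is an illustrative stand-in for a real tensor value:

```python
# Sketch of SpecifyShape's runtime contract: assert the declared shape,
# return the input unchanged (a "view"). Illustrative, not Theano code.
from collections import namedtuple

Tensor = namedtuple("Tensor", ["shape", "data"])

def specify_shape(value, shape):
    # The Op asserts the full shape at runtime; partial shape
    # information is not supported, per the docstring's Notes.
    if tuple(value.shape) != tuple(shape):
        raise AssertionError(
            "SpecifyShape: got shape %r, expected %r"
            % (tuple(value.shape), tuple(shape)))
    return value

t = Tensor(shape=(2, 3), data=None)
print(specify_shape(t, (2, 3)) is t)  # -> True: same object, view-like
```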