testgroup / pytensor · Commits

Commit a266b4c1
authored April 13, 2010 by Frederic Bastien
apply rebroadcast opt during graph construction in the fcts unbroadcast and addbroadcast.

This is done so we don't insert Rebroadcast ops that do nothing once those optimizations have been applied. This happens with GpuJoin.
Parent: 3a3a18f8
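At the call site, the effect is visible at graph-construction time: a Rebroadcast that would be a no-op is now folded away immediately instead of lingering in the graph. A minimal sketch, assuming addbroadcast/unbroadcast are importable from theano.tensor as in this commit's tests:

    import theano.tensor as T

    x = T.matrix()  # broadcastable pattern (False, False)

    # Unbroadcasting an axis that is already non-broadcastable used to
    # insert a do-nothing Rebroadcast node; the opt now runs eagerly,
    # so the very same variable comes back.
    assert T.unbroadcast(x, 0) is x

    # addbroadcast genuinely changes the broadcastable pattern here,
    # so a Rebroadcast node is still inserted.
    assert T.addbroadcast(x, 0) is not x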
Showing 3 changed files with 74 additions and 3 deletions:

theano/tensor/basic.py              +8  -3
theano/tensor/opt.py                +24 -0
theano/tensor/tests/test_basic.py   +42 -0
theano/tensor/basic.py

@@ -2682,15 +2682,20 @@ class Rebroadcast(Op):
 def addbroadcast(x, *axes):
     """
     Make the input broadcastable in the specified axes.
+
+    We apply the opt here so we don't pollute the graph, especially
+    during the gpu optimization.
     """
-    return Rebroadcast(*[(axis, True) for axis in axes])(x)
+    rval = Rebroadcast(*[(axis, True) for axis in axes])(x)
+    return theano.tensor.opt.apply_rebroadcast_opt(rval)
 
 def unbroadcast(x, *axes):
     """
     Make the input impossible to broadcast in the specified axes.
+
+    We apply the opt here so we don't pollute the graph, especially
+    during the gpu optimization.
     """
-    return Rebroadcast(*[(axis, False) for axis in axes])(x)
+    rval = Rebroadcast(*[(axis, False) for axis in axes])(x)
+    return theano.tensor.opt.apply_rebroadcast_opt(rval)
 
 class Join(Op):
theano/tensor/opt.py

@@ -806,6 +806,30 @@ def local_rebroadcast_lift(node):
             rval = [T.Rebroadcast(*axis.items())(iinput)]
     return rval
 
+
+def apply_rebroadcast_opt(rval):
+    """
+    Apply the optimizations local_useless_rebroadcast and
+    local_rebroadcast_lift as many times as required.
+
+    :param rval: a Variable
+    :return: a Variable. The same one if no optimization can be applied.
+    """
+    changed = True
+    while changed and rval.owner:
+        changed = False
+        rval2 = theano.tensor.opt.local_useless_rebroadcast.transform(
+            rval.owner)
+        if rval2:
+            assert len(rval2) == 1
+            rval = rval2[0]
+            changed = True
+        if rval.owner:
+            rval2 = theano.tensor.opt.local_rebroadcast_lift.transform(
+                rval.owner)
+            if rval2:
+                assert len(rval2) == 1
+                rval = rval2[0]
+                changed = True
+    return rval
+
+
 ##################
 # Reshape opts #
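The loop in apply_rebroadcast_opt is a fixed-point iteration: keep firing the two local optimizations until neither changes the variable. The same pattern in isolation, as a sketch over a hypothetical list of rewrite callables that follow the .transform() contract used above (return None, or a one-element list holding the replacement variable):

    def apply_until_fixed_point(var, rewrites):
        # Repeatedly apply local rewrites to `var` until none of them
        # fires anymore; this generalizes the body of
        # apply_rebroadcast_opt to any number of rewrites.
        changed = True
        while changed and var.owner:
            changed = False
            for rewrite in rewrites:
                if var.owner:
                    replacement = rewrite(var.owner)
                    if replacement:
                        assert len(replacement) == 1
                        var = replacement[0]
                        changed = True
        return var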
theano/tensor/tests/test_basic.py

@@ -2507,6 +2507,48 @@ def test_autocast():
     finally:
         ac.__exit__()
 
+
+def test_unbroadcast_addbroadcast():
+    """
+    Test that the unbroadcast fct does not insert unneeded Rebroadcast
+    ops and that consecutive Rebroadcast ops are fused.
+    """
+    x = matrix()
+    assert unbroadcast(x, 0) is x
+    assert unbroadcast(x, 1) is x
+    assert unbroadcast(x, 1, 0) is x
+    assert unbroadcast(x, 0, 1) is x
+
+    assert addbroadcast(x, 0) is not x
+    assert addbroadcast(x, 1) is not x
+    assert addbroadcast(x, 1, 0).owner.inputs[0] is x
+
+    assert unbroadcast(addbroadcast(x, 0), 0) is x
+    assert addbroadcast(unbroadcast(x, 0), 0) is not x
+
+    x = row()
+    assert unbroadcast(x, 0) is not x
+    assert unbroadcast(x, 1) is x
+    assert unbroadcast(x, 1, 0) is not x
+    assert unbroadcast(x, 0, 1) is not x
+
+    assert addbroadcast(x, 0) is x
+    assert addbroadcast(x, 1).owner.inputs[0] is x
+    assert addbroadcast(x, 1, 0).owner.inputs[0] is x
+    assert addbroadcast(x, 0, 1).owner.inputs[0] is x
+
+    assert unbroadcast(addbroadcast(x, 1), 1) is x
+    assert addbroadcast(unbroadcast(x, 1), 1) is not x
+    # the first unbroadcast removes the broadcast flag, so the second
+    # one should not insert another Rebroadcast
+    assert unbroadcast(unbroadcast(x, 0), 0).owner.inputs[0] is x
+
+    # test that consecutive Rebroadcast ops are fused
+    x = TensorType(dtype='float64', broadcastable=(True, True))()
+    assert unbroadcast(unbroadcast(x, 1), 0).owner.inputs[0] is x
+    assert addbroadcast(unbroadcast(x, 1), 0).owner.inputs[0] is x
+    assert addbroadcast(unbroadcast(x, 0), 0) is x
+
+
 if __name__ == '__main__':
     if 1:
         unittest.main()