Commit c5847af9 authored by Pascal Lamblin

Use shape_of instead of explicitly inserting Shape_i

Otherwise, these Shape_i nodes are not always optimized away, and they prevent local_shape_to_shape_i from being applied. Instead, we manually apply the equivalent of local_shape_to_shape_i by using shape_of[...] directly.
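The idea behind the change can be sketched outside of Theano: rather than constructing a fresh `Shape_i`-like node for `x.shape[i]` on every request, look the entry up in the shape cache that the feature already maintains, so every consumer shares one canonical node. This is a minimal illustrative sketch; `ShapeCache`, `make_shape_i`, and `register` are hypothetical names, not Theano's real API.

```python
def make_shape_i(x, i):
    # Stand-in for Shape_i(i)(x): builds a *new* node object on every call,
    # which a later optimization pass would then have to merge or remove.
    return ("Shape_i", i, id(x))

class ShapeCache:
    """Toy analogue of a shape feature's shape_of mapping."""

    def __init__(self):
        self.shape_of = {}  # variable -> tuple of symbolic shape entries

    def register(self, x, ndim):
        # Cache one canonical entry per (variable, dimension) pair.
        self.shape_of[x] = tuple(("shape_of", x, i) for i in range(ndim))

    def shape_entry_old(self, x, i):
        # Old behaviour: insert a brand-new node each time.
        return make_shape_i(x, i)

    def shape_entry_new(self, x, i):
        # New behaviour: reuse the cached entry directly.
        return self.shape_of[x][i]

cache = ShapeCache()
cache.register("x", 2)
# The new lookup returns the identical cached object on repeated calls,
# while the old path produces distinct equal-valued nodes each time.
```

The payoff is that repeated requests for the same dimension are identical by construction, so no separate merging optimization is needed.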
Parent 450dfcb2
@@ -924,7 +924,7 @@ class ShapeFeature(object):
             # worst case, we loop over shape_of and replace things
             raise NotImplementedError(s_i)
-        # s_i is x.shape[i], we change it to Shape_i.
+        # s_i is x.shape[i] for some x, we change it to shape_of[x][i]
         if (s_i.owner and
                 isinstance(s_i.owner.op, Subtensor) and
                 s_i.owner.inputs[0].owner and
@@ -940,9 +940,13 @@ class ShapeFeature(object):
                 idx = idx[0]
             try:
                 i = get_scalar_constant_value(idx)
-                s_i = Shape_i(i)(s_i.owner.inputs[0].owner.inputs[0])
             except NotScalarConstantError:
                 pass
+            else:
+                # Executed only if no exception was raised
+                x = s_i.owner.inputs[0].owner.inputs[0]
+                # x should already have been imported, and should be in shape_of.
+                s_i = self.shape_of[x][i]
         if s_i.type.dtype[:3] in ('int', 'uint'):
             if getattr(s_i.type, 'ndim', 0):
......
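The second hunk relies on Python's try/except/else: the `else` branch runs only when the `try` body raised nothing, so the cached-shape lookup happens only for constant indices, and symbolic indices fall through untouched. A standalone illustration of that control flow (the helper names `constant_index` and `lookup` are hypothetical, not from the Theano source):

```python
def constant_index(idx):
    # Stand-in for get_scalar_constant_value: raises when idx is not
    # a scalar constant.
    if not isinstance(idx, int):
        raise ValueError("not a scalar constant")
    return idx

def lookup(shape_of, x, idx, original):
    result = original
    try:
        i = constant_index(idx)
    except ValueError:
        pass  # symbolic index: keep the original expression untouched
    else:
        # Runs only if constant_index did not raise.
        result = shape_of[x][i]
    return result

shapes = {"x": ("s0", "s1")}
print(lookup(shapes, "x", 1, "orig"))      # constant index -> cached entry "s1"
print(lookup(shapes, "x", "sym", "orig"))  # symbolic index -> "orig" unchanged
```

Moving the assignment out of the `try` body and into `else` also narrows the exception scope: a `NotScalarConstantError`-style failure can now only come from the constant extraction itself, not from the lookup.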