Commit d91fac31 authored by Pascal Lamblin

Update comment.

Parent 41976981
@@ -3381,14 +3381,12 @@ class Join(Op):
         if len(as_tensor_variable_args) == 1:
             bcastable = list(as_tensor_variable_args[0].type.broadcastable)
         else:
-            # When the axis is fixed, the broadcastable dimensions remain, except
-            # for the axis dimension.
-            # All concatenated elements must also have the same broadcastable
-            # dimensions.
-            # initialize bcastable all false, and then fill in some trues with
-            # the loops -- a dimension should be broadcastable if at least one
-            # of the inputs is broadcastable on that dimension (see
-            # justification below)
+            # When the axis is fixed, a dimension should be
+            # broadcastable if at least one of the inputs is
+            # broadcastable on that dimension (see justification below),
+            # except for the axis dimension.
+            # Initialize bcastable all false, and then fill in some trues with
+            # the loops.
             bcastable = [False] * len(as_tensor_variable_args[0].type.broadcastable)
             ndim = len(bcastable)
             if isinstance(axis, int):
...
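The rule the updated comment describes can be sketched in plain Python, outside of Theano. This is an illustrative helper (the function name `join_broadcastable` is invented here, not part of Theano's API): an output dimension is marked broadcastable if at least one input is broadcastable on that dimension, except for the join axis itself, which is never broadcastable after concatenation.

```python
def join_broadcastable(axis, patterns):
    # patterns: one broadcastable pattern (tuple of bools) per input,
    # all with the same number of dimensions, as in Theano tensor types.
    # Hypothetical sketch of the rule in the commit's comment, not
    # Theano's actual implementation.
    ndim = len(patterns[0])
    # Initialize bcastable all false, then fill in trues with the loop:
    # a dimension is broadcastable if at least one input is.
    bcastable = [False] * ndim
    for pattern in patterns:
        for i, b in enumerate(pattern):
            if b:
                bcastable[i] = True
    # The axis dimension is never broadcastable in the joined result.
    bcastable[axis] = False
    return bcastable
```

For example, joining a (1, n) input with an (m, 1) input along axis 0 yields a pattern that is broadcastable only on dimension 1.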