Commit ff79c173 authored by James Bergstra

commented out a useless print in tensor.nnet

Parent 0d989978
@@ -415,7 +415,7 @@ def local_softmax_with_bias(node):
     non_vectors = []
     for x_in in x.owner.inputs:
         if list(x_in.type.broadcastable) == [True, False]:
-            print isinstance(x_in.owner.op, tensor.DimShuffle)
+            # print isinstance(x_in.owner.op, tensor.DimShuffle)
             #since specialization comes relatively late in optimization,
             # we don't want to put in extra DimShuffles un-necessarily.
             if x_in.owner and isinstance(x_in.owner.op, tensor.DimShuffle)\
...
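For context on the condition being checked: in Theano, a variable whose `broadcastable` pattern is `[True, False]` behaves like a 1×N row that is replicated along the first axis, which is the bias shape this optimization looks for. A minimal sketch of the analogous broadcasting behavior, written with numpy rather than Theano purely for illustration:

```python
import numpy as np

# A Theano variable with broadcastable == [True, False] acts like a
# shape-(1, N) row that broadcasts along axis 0. The numpy analogue:
bias = np.array([[1.0, 2.0, 3.0]])  # shape (1, 3): "row" pattern
x = np.zeros((4, 3))                # a (4, 3) matrix of pre-softmax values

# The bias row is implicitly replicated across all 4 rows of x.
out = x + bias
```

Here `out` has shape `(4, 3)` with every row equal to the bias row, which is why the optimization can fold such an input in as the bias term without inserting an extra DimShuffle.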