Commit 77b676e3 authored by Frederic

Remove an old init parameter that had already been removed in the past. A new one was recently added, so there was a conflict.
parent 4cf97863
@@ -869,7 +869,7 @@ class CrossentropySoftmax1HotWithBiasDx (gof.Op):
         # typically we should not need the gradient w.r.t. dy).
         y_idx_range = tensor.arange(y_idx.shape[0])
         g_dy = tensor.sum(
-            g_dx * tensor.AdvancedIncSubtensor((y_idx_range, y_idx))(
+            g_dx * tensor.AdvancedIncSubtensor()(
                 sm, tensor.fill(dy, -1), y_idx_range, y_idx),
             axis=1)
         g_sm = dy.dimshuffle(0, 'x') * g_dx
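The change above moves the index arguments out of the `AdvancedIncSubtensor` constructor and into the call, where they are passed alongside `sm` and the fill value. As a rough illustration (not Theano code, and the helper name here is hypothetical), the op's numerical effect is an unbuffered increment of `sm` at positions `(i, y_idx[i])`, which NumPy expresses with `np.add.at`:

```python
import numpy as np

def advanced_inc_subtensor(sm, vals, rows, cols):
    # Sketch of what AdvancedIncSubtensor()(sm, vals, rows, cols)
    # computes: increment sm at the fancy-index positions by vals.
    out = sm.copy()
    np.add.at(out, (rows, cols), vals)  # unbuffered in-place add
    return out

sm = np.array([[0.2, 0.5, 0.3],
               [0.1, 0.6, 0.3]])
y_idx = np.array([1, 2])
rows = np.arange(y_idx.shape[0])

# tensor.fill(dy, -1) amounts to adding -1 at each (i, y_idx[i]).
result = advanced_inc_subtensor(sm, -1.0, rows, y_idx)
# result[0, 1] == 0.5 - 1.0 == -0.5
# result[1, 2] == 0.3 - 1.0 == -0.7
```

Passing the indices at call time (rather than fixing them in the op instance) is what resolves the conflict with the removed `__init__` parameter.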