Commit 4b7f6f87 authored by James Bergstra

deprecated using a non-scalar cost with tensor.grad()

Parent 24d00ecd
@@ -2857,6 +2857,11 @@ def grad(cost, wrt, g_cost=None, consider_constant=[], warn_type=False):
     if not isinstance(cost, TensorVariable):
         raise TypeError('In tensor.grad(), cost argument should be a TensorVariable.', cost)
+    if cost.type.ndim:
+        _warn('the passing of a non-scalar cost to theano.tensor.grad() is deprecated.'
+                ' Use the lower-level '
+                'theano.gradient if you really want to do this')
     if g_cost is None:
         g_cost = ones_like(cost)
     inputs = gof.graph.inputs([cost])
...
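The change adds a deprecation warning when `cost.type.ndim` is nonzero, i.e. when the cost is not a scalar. A minimal sketch of the same check pattern, using plain `warnings` and hypothetical stand-in classes (`FakeType`, `FakeTensorVariable`) in place of Theano's real `TensorVariable`:

```python
import warnings

class FakeType:
    """Hypothetical stand-in for a Theano type; only carries ndim."""
    def __init__(self, ndim):
        self.ndim = ndim

class FakeTensorVariable:
    """Hypothetical stand-in for theano.tensor.TensorVariable."""
    def __init__(self, ndim):
        self.type = FakeType(ndim)

def check_scalar_cost(cost):
    # Mirrors the added check: a nonzero ndim means the cost is not
    # a scalar, which this commit deprecates for tensor.grad().
    if cost.type.ndim:
        warnings.warn(
            'the passing of a non-scalar cost to theano.tensor.grad()'
            ' is deprecated. Use the lower-level theano.gradient'
            ' if you really want to do this',
            DeprecationWarning)
        return True  # warned: non-scalar cost
    return False     # scalar cost, no warning
```

A scalar cost (`ndim == 0`) passes silently; any tensor cost (`ndim > 0`) triggers the warning, matching the condition `if cost.type.ndim:` in the diff.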