Commit 8af3f78f authored by Frederic

Update the doc on tensor.grad to tell we use the new return type.

Parent 3b33800c
@@ -259,9 +259,8 @@ def grad(cost, wrt, g_cost=None, consider_constant=None, warn_type=False,
:return: symbolic expression of gradient of `cost` with respect to `wrt`.
If an element of `wrt` is not differentiable with respect
to the output, then a zero variable is returned.
If `wrt` is a list/tuple longer than 1, a list will be returned.
-        DEPRECATION: In Theano 0.5, grad will return an object of the same
-        type as `wrt`: a list/tuple or TensorVariable in all cases.
+        It returns an object of the same type as `wrt`: a list/tuple
+        or TensorVariable in all cases.
This function is a wrapper around the more general function
`theano.gradient.grad_sources_inputs`.
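The return-type contract documented above (a single gradient for a single `wrt` variable, a list/tuple of gradients for a list/tuple `wrt`) can be sketched with a toy numeric `grad`. This is an illustrative stand-in using finite differences, not Theano's symbolic implementation; the function name and signature here are hypothetical.

```python
def grad(cost_fn, wrt, eps=1e-6):
    """Toy finite-difference gradient illustrating the return-type
    contract: the result has the same type as `wrt`.

    NOTE: illustrative sketch only, not Theano's symbolic grad.
    """
    if isinstance(wrt, (list, tuple)):
        # list/tuple input -> list/tuple of partial derivatives
        point = list(wrt)
        grads = []
        for i in range(len(point)):
            bumped = list(point)
            bumped[i] += eps
            grads.append((cost_fn(*bumped) - cost_fn(*point)) / eps)
        return type(wrt)(grads)
    # single input -> single gradient value
    return (cost_fn(wrt + eps) - cost_fn(wrt)) / eps
```

For example, `grad(lambda x, y: x**2 + y**2, [1.0, 2.0])` returns a list of two partial derivatives, while `grad(lambda x: x**2, 3.0)` returns a single value, mirroring the documented behavior.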