Commit e97d5b9f authored by Frederic

fix typo following code review.

Parent 06ff55bb
......@@ -28,8 +28,8 @@ AddConfigVar('cast_policy',
 # python 2.* define int / int to return int and int // int to return int.
 # python 3* define int / int to return float and int // int to return int.
-# numpy 1.6.1 do as the python 2.*. I think we should not change it faster
-# then numpy. When we will do the transition, we should create an int_warn
+# numpy 1.6.1 behaves as python 2.*. I think we should not change it faster
+# than numpy. When we will do the transition, we should create an int_warn
 # and floatX_warn option.
AddConfigVar('int_division',
"What to do when one computes x / y, where both x and y are of "
......
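The comment being edited above hinges on the difference between the two division operators. A minimal illustration of the Python 3 semantics it describes (`/` is true division, `//` is floor division):

```python
# Python 3 semantics, as described in the comment:
# int / int returns a float (true division),
# int // int returns an int (floor division).
q_true = 7 / 2    # 3.5
q_floor = 7 // 2  # 3
print(q_true, q_floor)
```

Under Python 2 (and, per the comment, numpy 1.6.1), `7 / 2` would instead return the int `3`, which is exactly the behavioral gap the `int_division` config option is meant to manage.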
......@@ -259,8 +259,8 @@ def grad(cost, wrt, g_cost=None, consider_constant=None, warn_type=False,
     :return: symbolic expression of gradient of `cost` with respect to `wrt`.
              If an element of `wrt` is not differentiable with respect
              to the output, then a zero variable is returned.
-             It return an object of same type as `wrt`: a list/tuple
-             or TensorVariable in all case.
+             It returns an object of same type as `wrt`: a list/tuple
+             or TensorVariable in all cases.
              This function is a wrapper around the more general function
              `theano.gradient.grad_sources_inputs``.
......