Commit 14a0070c authored by Frederic

doc fix following review.

Parent 11d1efbc
...@@ -6,7 +6,7 @@

.. moduleauthor:: LISA

Normally you should not call these Ops directly! Theano should automatically transform CPU ops to their GPU equivalents, so this list is just useful to let people know what is implemented on the GPU.

Basic Op
========

...
...@@ -94,12 +94,12 @@ of symbolic differentiation).

Computing the Jacobian
======================

Theano implements the :func:`theano.gradient.jacobian` macro, which does
everything needed to compute the Jacobian. The following text explains how
to do it manually.

In order to manually compute the Jacobian of some function ``y`` with
respect to some parameter ``x``, we need to use ``scan``. What we do is
loop over the entries in ``y`` and compute the gradient of ``y[i]`` with
respect to ``x``.
...@@ -138,13 +138,13 @@ matrix, which corresponds to the Jacobian.

Computing the Hessian
=====================

Theano implements the :func:`theano.gradient.hessian` macro, which does
everything needed to compute the Hessian. The following text explains how
to do it manually.

You can compute the Hessian manually in the same way as the Jacobian. The
only difference is that now, instead of computing the Jacobian of some
expression ``y``, we compute the Jacobian of ``T.grad(cost, x)``, where
``cost`` is some scalar.
...