Commit 37ef1949 authored by Frederic

Link the tutorial on the jacobian/hessian to the implementation in theano.gradient.

Parent 5dc2f764
......@@ -94,9 +94,14 @@ of symbolic differentiation).
Computing the Jacobian
======================
In order to compute the Jacobian of some function ``y`` with respect to some
parameter ``x`` we need to use the ``scan``. What we do is to loop over the
entries in ``y`` and compute the gradient of ``y[i]`` with respect to ``x``.
Theano implements the :func:`theano.gradient.jacobian` macro, which does
everything needed to compute the Jacobian. The following text explains
how to do it manually.
In order to manually compute the Jacobian of some function ``y`` with
respect to some parameter ``x`` we need to use ``scan``. What we
do is loop over the entries of ``y`` and compute the gradient of
``y[i]`` with respect to ``x``.
.. note::
......@@ -129,10 +134,15 @@ matrix, which corresponds to the Jacobian.
seems possible. The reason is that ``y_i`` will not be a function of
``x`` anymore, while ``y[i]`` still is.
Computing the Hessian
=====================
Similar to computing the Jacobian we can also compute the Hessian. The only
Theano implements the :func:`theano.gradient.hessian` macro, which does
everything needed to compute the Hessian. The following text explains
how to do it manually.
You can compute the Hessian manually in the same way as the Jacobian. The only
difference is that now, instead of computing the Jacobian of some expression
``y``, we compute the Jacobian of ``T.grad(cost, x)``, where ``cost`` is some
scalar.
......