Commit 02f5d851 authored by Frederic Bastien

small typo fix.

Parent: a712caa2
@@ -163,7 +163,7 @@ In general, for any **scalar** expression ``s``, ``T.grad(s, w)`` provides
 the theano expression for computing :math:`\frac{\partial s}{\partial w}`. In
 this way Theano can be used for doing **efficient** symbolic differentiation
 (as
-the expression return by ``TT.grad`` will be optimized during compilation) even for
+the expression return by ``T.grad`` will be optimized during compilation) even for
 function with many inputs. ( see `automatic differentiation <http://en.wikipedia.org/wiki/Automatic_differentiation>`_ for a description
 of symbolic differentiation).
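For context, the corrected identifier ``T.grad`` is Theano's symbolic differentiation entry point, as the doc passage describes. A minimal sketch of the behavior in question, using Theano's public API (the variable names are illustrative, not from the patched file):

    import theano
    import theano.tensor as T

    # A scalar expression s = x**2 and its symbolic gradient ds/dx.
    x = T.dscalar('x')
    s = x ** 2
    ds_dx = T.grad(s, x)  # returns a symbolic expression, not a value

    # Compilation optimizes the gradient graph before it runs.
    f = theano.function([x], ds_dx)
    print(f(4.0))  # prints 8.0, since d(x**2)/dx = 2x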