Commit 2660f285 authored by Frederic Bastien

give a little bit more info about theano gradient in the doc.

Parent 93f46dfd
......@@ -8,7 +8,7 @@ arrays efficiently. Theano features:
* **tight integration with numpy** -- Use `numpy.ndarray` in Theano-compiled functions.
* **transparent use of a GPU** -- Perform data-intensive calculations up to 140x faster than with CPU (float32 only).
* **symbolic differentiation** -- Let Theano do your derivatives.
* **efficient symbolic differentiation** -- Theano does your derivatives for functions with one or many inputs.
* **speed and stability optimizations** -- Get the right answer for ``log(1+x)`` even when ``x`` is really tiny.
* **dynamic C code generation** -- Evaluate expressions faster.
* **extensive unit-testing and self-verification** -- Detect and diagnose many types of mistake.
......
......@@ -144,6 +144,8 @@ array([[ 0.25 , 0.19661193],
The resulting function computes the gradient of its first argument
with respect to the second. In this way, Theano can be used for
`automatic differentiation <http://en.wikipedia.org/wiki/Automatic_differentiation>`_.
Contrary to what that page says, Theano performs efficient symbolic differentiation
even for functions with many inputs.
.. note::
......