Commit 0f1c5ac3 authored by Frédéric Bastien

Merge pull request #2107 from vdumoulin/docfix

keep_dims -> keepdims in theano.tensor.nnet documentation
@@ -119,14 +119,14 @@
 :note: this inserts a particular op. But this op doesn't yet
     implement the Rop for Hessian-free optimization. If you want that,
     implement this equivalent code, which has the Rop implemented:
-    ``exp(x)/exp(x).sum(1, keep_dims=True)``. Theano should
+    ``exp(x)/exp(x).sum(1, keepdims=True)``. Theano should
     optimize this by inserting the softmax op itself. The code of
     the softmax op is more numerically stable by using this code:
     .. code-block:: python
-        e_x = exp(x - x.max(axis=1, keep_dims=True))
-        out = e_x / e_x.sum(axis=1, keep_dims=True)
+        e_x = exp(x - x.max(axis=1, keepdims=True))
+        out = e_x / e_x.sum(axis=1, keepdims=True)
     Example of use:
...
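The docstring being patched describes the max-subtraction trick for a numerically stable softmax. A minimal NumPy sketch of that recipe (the `stable_softmax` helper name is chosen here for illustration and is not part of Theano's API):

```python
import numpy as np

def stable_softmax(x):
    # Subtracting the row-wise max before exponentiating keeps exp()
    # from overflowing; the result is mathematically unchanged because
    # the constant factor cancels in the normalization.
    e_x = np.exp(x - x.max(axis=1, keepdims=True))
    return e_x / e_x.sum(axis=1, keepdims=True)

# A naive exp(x)/exp(x).sum() would overflow on the second row (exp(1000)),
# while the stable version handles it without warnings.
x = np.array([[1.0, 2.0, 3.0],
              [1000.0, 1000.0, 1000.0]])
print(stable_softmax(x))
```

Each output row sums to 1, and the second row comes out as a uniform distribution, which is what the overflowing naive formula would fail to produce.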