Commit f1a4191c authored by abergeron

Merge pull request #4250 from gokul-uf/fix_softmax_doc

Fixes grammatical errors in theano.nnet.softmax doc
@@ -120,12 +120,8 @@
 The softmax function will, when applied to a matrix, compute the softmax values row-wise.

-    :note: this insert a particular op. But this op don't yet
-        implement the Rop for hessian free. If you want that,
-        this equivalent code that have the Rop implemented
-        ``exp(x)/exp(x).sum(1, keepdims=True)``. Theano should
-        optimize this by inserting the softmax op itself. The code of
-        the softmax op is more numeriacaly stable by using this code:
+    :note: this supports hessian free as well. The code of
+        the softmax op is more numerically stable because it uses this code:

     .. code-block:: python
...
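The note above refers to the numerical-stability trick used inside the softmax op: rather than evaluating ``exp(x)/exp(x).sum(1, keepdims=True)`` directly, the stable formulation subtracts each row's maximum before exponentiating. A minimal NumPy sketch of that row-wise computation (``softmax_rowwise`` is a hypothetical helper name, not part of Theano's API):

```python
import numpy as np

def softmax_rowwise(x):
    # Subtracting the row max before exponentiating avoids overflow in
    # exp() for large inputs; the result is mathematically identical to
    # exp(x) / exp(x).sum(1, keepdims=True).
    z = x - x.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

x = np.array([[1.0, 2.0, 3.0],
              [1000.0, 1001.0, 1002.0]])
s = softmax_rowwise(x)
# Each row sums to 1; the naive formula would overflow to inf on the
# second row, since exp(1000) is not representable in float64.
print(s)
```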