Commit 21a5a3c7 authored by Frederic

Expand doc for softmax.

Parent aad5341a
......@@ -116,6 +116,20 @@
The softmax function will, when applied to a matrix, compute the softmax values row-wise.

:note: This inserts a particular op. However, this op does not yet
    implement the Rop needed for Hessian-free optimization. If you need
    that, use the equivalent code
    ``exp(x)/exp(x).sum(1, keepdims=True)``, which has the Rop
    implemented. Theano should optimize it by inserting the softmax op
    itself. The softmax op is more numerically stable, as it uses this
    code:
    .. code-block:: python

        e_x = exp(x - x.max(axis=1, keepdims=True))
        out = e_x / e_x.sum(axis=1, keepdims=True)
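The stability claim can be checked numerically. The sketch below uses NumPy in place of Theano symbolic variables (an assumption for illustration; ``keepdims`` is the NumPy spelling as well): the direct formula overflows for large inputs, while the max-subtracted version gives the same result on ordinary inputs and still works on large ones.

.. code-block:: python

    import numpy as np

    def softmax_naive(x):
        # Direct formula exp(x) / sum(exp(x)): exp overflows for large x
        e_x = np.exp(x)
        return e_x / e_x.sum(axis=1, keepdims=True)

    def softmax_stable(x):
        # Subtract the row-wise max first: exp never sees a positive
        # argument, so it cannot overflow, and since the shift cancels
        # in the ratio the result is mathematically identical
        e_x = np.exp(x - x.max(axis=1, keepdims=True))
        return e_x / e_x.sum(axis=1, keepdims=True)

    x = np.array([[1.0, 2.0, 3.0]])
    print(np.allclose(softmax_naive(x), softmax_stable(x)))   # both agree

    big = np.array([[1000.0, 1001.0, 1002.0]])
    with np.errstate(over="ignore", invalid="ignore"):
        print(np.isnan(softmax_naive(big)).any())             # naive breaks
    print(np.allclose(softmax_stable(big).sum(axis=1), 1.0))  # stable is fine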
    Example of use:

    .. code-block:: python

        x, y, b = T.dvectors('x', 'y', 'b')
......