Commit 238c3d62 authored by Gokula Krishnan

fixes grammatical errors

Parent 08963819
@@ -120,12 +120,12 @@
 The softmax function will, when applied to a matrix, compute the softmax values row-wise.
-:note: this insert a particular op. But this op don't yet
-    implement the Rop for hessian free. If you want that, implement
-    this equivalent code that have the Rop implemented
-    ``exp(x)/exp(x).sum(1, keepdims=True)``. Theano should
+:note: this uses a particular operation. But this method doesn't yet
+    implement the Row Operation for hessian free. If you want that, you can use
+    this equivalent code that has implemented the Row Operation
+    ``exp(x)/exp(x).sum(1, keepdims=True)``. Theano would
 optimize this by inserting the softmax op itself. The code of
-the softmax op is more numeriacaly stable by using this code:
+the softmax op is more numerically stable because it uses this code:
 .. code-block:: python
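To illustrate the numerical-stability point the changed docstring makes, here is a minimal NumPy sketch (the function name `softmax` and the use of NumPy are assumptions for illustration, not the actual Theano op's code): subtracting each row's maximum before exponentiating leaves the result unchanged, because softmax is invariant to a per-row shift, but it prevents `exp` from overflowing on large inputs, unlike the plain `exp(x)/exp(x).sum(1, keepdims=True)` form.

```python
import numpy as np

def softmax(x):
    # Row-wise softmax, written in the numerically stable form.
    # Subtracting the per-row max shifts every row so its largest
    # entry is 0; exp() then never overflows, and the ratio is
    # identical to exp(x) / exp(x).sum(1, keepdims=True).
    e_x = np.exp(x - x.max(axis=1, keepdims=True))
    return e_x / e_x.sum(axis=1, keepdims=True)
```

With a row like `[1000.0, 1000.0, 1000.0]`, the naive form produces `inf / inf = nan`, while the shifted form returns a valid uniform distribution.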