Commit 66b6d812 authored by Martin Thoma

theano/tensor/nnet/nnet.py: Wrote docstring for Softmax

Parent 52a98808
@@ -385,7 +385,11 @@ softmax_grad = SoftmaxGrad()
 class Softmax(gof.Op):
     """
-    WRITEME
+    Softmax activation function
+    :math:`\\varphi(\\mathbf{x})_j =
+    \\frac{e^{\\mathbf{x}_j}}{\\sum_{k=1}^K e^{\\mathbf{x}_k}}`
+    where :math:`K` is the total number of neurons in the layer. This
+    activation function gets applied row-wise.
     """
     nin = 1
......
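The row-wise softmax described in the new docstring can be sketched in plain NumPy (an illustration of the formula only, not Theano's actual implementation, which compiles a symbolic graph):

```python
import numpy as np

def softmax(x):
    # Subtract the row-wise max for numerical stability;
    # this cancels out and does not change the result.
    z = x - x.max(axis=-1, keepdims=True)
    e = np.exp(z)
    # Normalize each row so its entries sum to 1.
    return e / e.sum(axis=-1, keepdims=True)

x = np.array([[1.0, 2.0, 3.0],
              [0.0, 0.0, 0.0]])
print(softmax(x).sum(axis=1))  # each row sums to 1
```

Here each row of `x` is treated as the activations of one layer of `K` neurons, matching the row-wise application stated in the docstring.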