Commit c8537817 authored by Frédéric Bastien

Merge pull request #2794 from MartinThoma/master

theano/tensor/nnet/nnet.py: Wrote docstring for Softmax
@@ -385,7 +385,11 @@ softmax_grad = SoftmaxGrad()
 class Softmax(gof.Op):
     """
-    WRITEME
+    Softmax activation function
+    :math:`\\varphi(\\mathbf{x})_j =
+    \\frac{e^{\mathbf{x}_j}}{\sum_{k=1}^K e^{\mathbf{x}_k}}`
+    where :math:`K` is the total number of neurons in the layer. This
+    activation function gets applied row-wise.
     """
     nin = 1
...
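Not part of the commit itself, but the row-wise formula in the new docstring can be sketched in plain NumPy (the function name `softmax_rows` and the max-subtraction stabilization are illustrative assumptions, not Theano's implementation):

```python
import numpy as np

def softmax_rows(x):
    # Illustrative sketch, not Theano's Op: subtract each row's max
    # before exponentiating for numerical stability (does not change
    # the result, since softmax is shift-invariant per row).
    z = x - x.max(axis=1, keepdims=True)
    e = np.exp(z)
    # Normalize each row so it sums to 1, matching the docstring's
    # row-wise definition with K = number of columns.
    return e / e.sum(axis=1, keepdims=True)

x = np.array([[1.0, 2.0, 3.0],
              [0.0, 0.0, 0.0]])
p = softmax_rows(x)
print(p.sum(axis=1))  # each row sums to 1
```

A uniform input row (all zeros) maps to a uniform distribution, which is a quick sanity check of the normalization.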