Commit 0debd861 authored by vipulraheja, committed by Arnaud Bergeron

Fix reference numbering for docs

parent 36cd242a
......@@ -2423,7 +2423,7 @@ def h_softmax(x, batch_size, n_outputs, n_classes, n_outputs_per_class,
def elu(x, alpha=1):
"""
-    Compute the element-wise exponential linear activation function.
+    Compute the element-wise exponential linear activation function [2]_.
.. versionadded:: 0.8.0
......@@ -2441,7 +2441,7 @@ def elu(x, alpha=1):
References
-----
-    .. [1] Djork-Arne Clevert, Thomas Unterthiner, Sepp Hochreiter
+    .. [2] Djork-Arne Clevert, Thomas Unterthiner, Sepp Hochreiter
"Fast and Accurate Deep Network Learning by
Exponential Linear Units (ELUs)" <http://arxiv.org/abs/1511.07289>`.
"""
......@@ -2449,7 +2449,7 @@ def elu(x, alpha=1):
def selu(x):
"""Compute the element-wise Scaled Exponential Linear unit.
"""Compute the element-wise Scaled Exponential Linear unit [3]_.
.. versionadded:: 0.9.0
......@@ -2465,7 +2465,7 @@ def selu(x):
References
----------
-    .. Klambauer G, Unterthiner T, Mayr A, Hochreiter S.
+    .. [3] Klambauer G, Unterthiner T, Mayr A, Hochreiter S.
"Self-Normalizing Neural Networks" <https://arxiv.org/abs/1706.02515>
"""
alpha = 1.6732632423543772848170429916717
......
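For context, the two activations whose docstrings this commit touches can be sketched in plain NumPy. This is not the Theano implementation from the diff, just a minimal reference sketch following the cited papers; the `alpha` constant matches the value visible in the `selu` hunk above, and the `scale` constant is the standard SELU value from Klambauer et al.

```python
import numpy as np

def elu(x, alpha=1.0):
    # ELU (Clevert et al., 2015): x for x > 0, alpha * (exp(x) - 1) otherwise.
    # np.expm1 computes exp(x) - 1 accurately for small x.
    x = np.asarray(x, dtype=float)
    return np.where(x > 0, x, alpha * np.expm1(x))

def selu(x):
    # SELU (Klambauer et al., 2017): scale * elu(x, alpha) with fixed
    # constants chosen so activations self-normalize across layers.
    alpha = 1.6732632423543772848170429916717
    scale = 1.0507009873554804934193349852946
    return scale * elu(x, alpha)
```

Note that for positive inputs both functions are linear (ELU is the identity; SELU multiplies by `scale`), which is what makes the fixed-point normalization argument in the SELU paper work.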