Commit 91d0e7dc authored by Iban Harlouchet, committed by Arnaud Bergeron

testcode for tensor/nnet/nnet.txt

Parent 270fd908
@@ -42,8 +42,10 @@
 Example:

-.. code-block:: python
+.. testcode::

     import theano.tensor as T
     x, y, b = T.dvectors('x', 'y', 'b')
     W = T.dmatrix('W')
     y = T.nnet.sigmoid(T.dot(W, x) + b)
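For reference, the symbolic graph in the hunk above computes an elementwise logistic sigmoid of `W.x + b`. A minimal NumPy sketch of the same computation (NumPy and the concrete values here are only for illustration; they are not part of the patch):

```python
import numpy as np

def sigmoid(z):
    # elementwise logistic function, the same math as T.nnet.sigmoid
    return 1.0 / (1.0 + np.exp(-z))

W = np.eye(2)            # example weight matrix
x = np.zeros(2)          # example input vector
b = np.zeros(2)          # example bias vector
y = sigmoid(W.dot(x) + b)
# sigmoid(0) == 0.5 for every element
```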
@@ -102,7 +104,7 @@
 .. note:: The underlying code will return an exact 0 if an element of x is too small.

-.. code-block:: python
+.. testcode::

     x, y, b = T.dvectors('x', 'y', 'b')
     W = T.dmatrix('W')
@@ -124,14 +126,14 @@
 optimize this by inserting the softmax op itself. The code of
 the softmax op is more numerically stable because it uses this code:

-.. code-block:: python
+.. testcode::

     e_x = exp(x - x.max(axis=1, keepdims=True))
     out = e_x / e_x.sum(axis=1, keepdims=True)

 Example of use:

-.. code-block:: python
+.. testcode::

     x, y, b = T.dvectors('x', 'y', 'b')
     W = T.dmatrix('W')
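The stability trick in the hunk above (subtracting the row-wise maximum before exponentiating) can be checked numerically. A NumPy sketch of the same two lines (NumPy stands in for the Theano op here, purely for illustration):

```python
import numpy as np

def softmax(x):
    # subtract the row-wise max before exponentiating so exp() never
    # overflows; softmax is invariant to this shift, and each row is
    # then normalized to sum to 1
    e_x = np.exp(x - x.max(axis=1, keepdims=True))
    return e_x / e_x.sum(axis=1, keepdims=True)

x = np.array([[1000.0, 1000.0],   # naive exp(1000) would overflow
              [1.0, 2.0]])
out = softmax(x)
# rows sum to 1; the first row is [0.5, 0.5] despite the huge inputs
```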
@@ -155,7 +157,7 @@
 to the binary cross-entropy (note that this assumes that x will
 contain values between 0 and 1):

-.. code-block:: python
+.. testcode::

     x, y, b = T.dvectors('x', 'y', 'b')
     W = T.dmatrix('W')
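The binary cross-entropy mentioned in the hunk above is the elementwise quantity `-t*log(o) - (1-t)*log(1-o)`, defined only when the predictions lie strictly between 0 and 1. A NumPy sketch of that formula (names and values are illustrative, not taken from the patch):

```python
import numpy as np

def binary_crossentropy(output, target):
    # elementwise -t*log(o) - (1-t)*log(1-o); output must lie in (0, 1)
    return -(target * np.log(output)
             + (1.0 - target) * np.log(1.0 - output))

p = np.array([0.9, 0.1])  # predicted probabilities
t = np.array([1.0, 0.0])  # binary targets
loss = binary_crossentropy(p, t)
# both elements equal -log(0.9): each prediction gives the true
# class probability 0.9
```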
@@ -191,7 +193,7 @@
 correct class (which is typically the training criterion in
 classification settings).

-.. code-block:: python
+.. testcode::

     y = T.nnet.softmax(T.dot(W, x) + b)
     cost = T.nnet.categorical_crossentropy(y, o)
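As the surrounding text notes, with one-hot targets the categorical cross-entropy reduces to the negative log-probability of the correct class. A NumPy sketch of that per-row quantity (illustrative only; the patch itself only changes directive names):

```python
import numpy as np

def categorical_crossentropy(coding, true):
    # per-row -sum(t * log(p)); with one-hot targets this picks out
    # -log of the probability assigned to the correct class
    return -(true * np.log(coding)).sum(axis=1)

y = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.8, 0.1]])   # predicted class distributions
o = np.array([[1.0, 0.0, 0.0],   # one-hot target: class 0
              [0.0, 1.0, 0.0]])  # one-hot target: class 1
cost = categorical_crossentropy(y, o)
# cost == [-log(0.7), -log(0.8)]
```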