Commit 08aa59ef authored by Iban Harlouchet, committed by Arnaud Bergeron

testcode for doc/extending/unittest.txt

Parent 42c0e2f3
@@ -39,7 +39,9 @@ A unittest is a subclass of ``unittest.TestCase``, with member
 functions with names that start with the string ``test``. For
 example:
 
-.. code-block:: python
+.. testcode::
+
+    import unittest
 
     class MyTestCase(unittest.TestCase):
         def test0(self):
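Expanded into a complete, runnable file, the excerpt above might look like the following sketch (the class and method names follow the excerpt; the suite is run programmatically here so the result can be inspected, instead of via ``unittest.main()``):

```python
import unittest

class MyTestCase(unittest.TestCase):
    def test0(self):
        # Every method whose name starts with ``test`` is collected
        # and run automatically by the unittest machinery.
        self.assertEqual(2 + 2, 4)

# Load and run the suite in-process rather than calling
# unittest.main(), which would exit the interpreter.
suite = unittest.TestLoader().loadTestsFromTestCase(MyTestCase)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```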
@@ -115,7 +117,7 @@ built-in unittest module uses metaclasses to know about all the
 them all, printing '.' for passed tests, and a stack trace for
 exceptions. The standard footer code in theano's test files is:
 
-.. code-block:: python
+.. testcode::
 
     if __name__ == '__main__':
         unittest.main()
@@ -134,7 +136,7 @@ To run all the tests in one or more ``TestCase`` subclasses:
 To run just a single ``MyTestCase`` member test function called ``test0``:
 
-.. code-block:: python
+.. testcode::
 
     MyTestCase('test0').debug()
@@ -186,6 +188,7 @@ Example:
 .. code-block:: python
 
+    import unittest
     class TestTensorDot(unittest.TestCase):
         def test_validity(self):
            # do stuff
@@ -201,7 +204,9 @@ functionality which is shared amongst all test methods in the test
 case (i.e initializing data, parameters, seeding random number
 generators -- more on this later)
 
-.. code-block:: python
+.. testcode:: writeUnitest
+
+    import unittest
 
     class TestTensorDot(unittest.TestCase):
         def setUp(self):
@@ -238,7 +243,7 @@ Example:
 Avoid hard-coding variables, as in the following case:
 
-.. code-block:: python
+.. testcode:: writeUnitest
 
     self.assertTrue(numpy.all(f(self.avals,self.bvals)==numpy.array([[25,25,30,28],[21,18,14,25]])))
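One way to avoid such frozen magic numbers is to compute the expected value from the same inputs with an independent reference implementation. A sketch, with a plain-Python matrix product standing in for both the reference and the function under test (names here are illustrative only):

```python
def ref_dot(a, b):
    # Plain-Python matrix product, used as the trusted reference
    # answer instead of a hard-coded array of constants.
    return [[sum(x * y for x, y in zip(row, col))
             for col in zip(*b)] for row in a]

avals = [[1, 2], [3, 4]]
bvals = [[5, 6], [7, 8]]

expected = ref_dot(avals, bvals)
# Stand-in for the compiled function being tested.
actual = ref_dot(avals, bvals)
assert actual == expected
```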
@@ -275,6 +280,8 @@ Example:
 .. code-block:: python
 
+    import unittest
+
     class TestTensorDot(unittest.TestCase):
         ...
         def test_3D_dot_fail(self):
@@ -300,7 +307,9 @@ Example:
 .. code-block:: python
 
-    f = T.function([a,b],[c],mode='FAST_RUN')
+    from theano import function
+
+    f = function([a,b],[c],mode='FAST_RUN')
 
 Whenever possible, unit tests should omit this parameter. Leaving
 out the mode will ensure that unit tests use the default mode.
@@ -334,7 +343,7 @@ another (i.e always pass or always fail).
 Instead of using ``numpy.random.seed`` to do this, we encourage users to
 do the following:
 
-.. code-block:: python
+.. testcode::
 
     from theano.tests import unittest_tools
@@ -367,7 +376,9 @@ machine) can simply set ``config.unittests.rseed`` to 'random' (see
 Similarly, to provide a seed to numpy.random.RandomState, simply use:
 
-.. code-block:: python
+.. testcode::
+
+    import numpy
 
     rng = numpy.random.RandomState(unittest_tools.fetch_seed())
     # OR providing an explicit seed
@@ -413,7 +424,9 @@ at point ``x`` is approximated as:
 Here is the prototype for the verify_grad function.
 
->>> def verify_grad(fun, pt, n_tests=2, rng=None, eps=1.0e-7, abs_tol=0.0001, rel_tol=0.0001):
+.. code-block:: python
+
+    def verify_grad(fun, pt, n_tests=2, rng=None, eps=1.0e-7, abs_tol=0.0001, rel_tol=0.0001):
 
 ``verify_grad`` raises an Exception if the difference between the analytic gradient and
 numerical gradient (computed through the Finite Difference Method) of a random
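The check that ``verify_grad`` performs can be sketched in pure numpy: compare an analytic gradient against a finite-difference approximation at a point, one coordinate at a time. This is a simplified, forward-difference sketch of the idea, not Theano's actual implementation:

```python
import numpy

def numeric_grad(f, x, eps=1.0e-7):
    # Forward finite differences: perturb one coordinate at a time
    # and measure the change in f.
    g = numpy.zeros_like(x)
    fx = f(x)
    for i in range(x.size):
        xp = x.copy()
        xp[i] += eps
        g[i] = (f(xp) - fx) / eps
    return g

def check_grad(f, grad_f, x, abs_tol=1e-4, rel_tol=1e-4):
    # Raise if analytic and numeric gradients disagree, mirroring
    # the contract of verify_grad (simplified).
    num = numeric_grad(f, x)
    ana = grad_f(x)
    rel = numpy.abs(num - ana) / numpy.maximum(
        numpy.abs(num) + numpy.abs(ana), 1e-8)
    if not ((numpy.abs(num - ana) < abs_tol) | (rel < rel_tol)).all():
        raise ValueError('gradient mismatch')

# Example: f(x) = sum(x**2) has analytic gradient 2*x.
x = numpy.random.RandomState(42).uniform(size=5)
check_grad(lambda v: (v ** 2).sum(), lambda v: 2 * v, x)
```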
@@ -445,7 +458,7 @@ In the general case, you can define ``fun`` as you want, as long as it
 takes as inputs Theano symbolic variables and returns a single Theano
 symbolic variable:
 
-.. code-block:: python
+.. testcode::
 
     def test_verify_exprgrad():
         def fun(x,y,z):
@@ -460,7 +473,7 @@ symbolic variable:
 Here is an example showing how to use ``verify_grad`` on an Op instance:
 
-.. code-block:: python
+.. testcode::
 
     def test_flatten_outdimNone():
         # Testing gradient w.r.t. all inputs of an op (in this example the op
@@ -474,7 +487,7 @@ an Op's inputs. This is useful in particular when the gradient w.r.t. some of
 the inputs cannot be computed by finite difference (e.g. for discrete inputs),
 which would cause ``verify_grad`` to crash.
 
-.. code-block:: python
+.. testcode::
 
     def test_crossentropy_softmax_grad():
         op = tensor.nnet.crossentropy_softmax_argmax_1hot_with_bias
@@ -511,7 +524,12 @@ this is common, two helper functions exist to make your lives easier:
 Here is an example of ``makeTester`` generating testcases for the Dot
 product op:
 
-.. code-block:: python
+.. testcode::
+
+    from numpy import dot
+    from numpy.random import rand
+    from theano.tensor.tests.test_basic import makeTester
 
     DotTester = makeTester(name = 'DotTester',
                            op = dot,
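The idea behind ``makeTester`` can be sketched without Theano: a factory that, given an op and a dictionary of good input/output cases, builds a ``unittest.TestCase`` subclass with one generated test method per case. This is a simplified stand-in, not the real makeTester signature:

```python
import unittest

def make_tester(name, op, good):
    # Build a TestCase subclass with one test method per named case,
    # each checking op(*inputs) == expected (simplified stand-in).
    def make_test(inputs, expected):
        def test(self):
            self.assertEqual(op(*inputs), expected)
        return test
    methods = {'test_' + case: make_test(inputs, expected)
               for case, (inputs, expected) in good.items()}
    return type(name, (unittest.TestCase,), methods)

def dot2(a, b):
    # Toy "op": dot product of two 2-vectors.
    return a[0] * b[0] + a[1] * b[1]

# Generated tester with a single named case.
DotTester = make_tester('DotTester', dot2,
                        good={'basic': (([1, 2], [3, 4]), 11)})

suite = unittest.TestLoader().loadTestsFromTestCase(DotTester)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```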