Commit 07661f2a authored by David Warde-Farley

Merge pull request #647 from nouiz/doc

Doc
Try it!

.. code-block:: python

    import numpy
    import theano

    x = theano.tensor.matrix()
    f = theano.function([x], DoubleOp()(x))
    inp = numpy.random.rand(5, 4)
    out = f(inp)
    assert numpy.allclose(inp * 2, out)
    print inp
    print out

How to test it
--------------

Theano has some functions to simplify testing. These help test the
``infer_shape``, ``grad`` and ``R_op`` methods. Put the following code
in a file and execute it with the ``nosetests`` program.

Basic tests
===========

Basic tests simply use the Op and check that it returns the right
answer. If you detect an error, you must raise an exception. You can
use the ``assert`` keyword to automatically raise an
``AssertionError``.

.. code-block:: python

    import numpy

    import theano
    from theano import config
    from theano.tests import unittest_tools as utt


    class test_Double(utt.InferShapeTester):

        def setUp(self):
            super(test_Double, self).setUp()
            self.op_class = DoubleOp
            self.op = DoubleOp()

        def test_basic(self):
            x = theano.tensor.matrix()
            f = theano.function([x], self.op(x))
            inp = numpy.asarray(numpy.random.rand(5, 4),
                                dtype=config.floatX)
            out = f(inp)
            # Compare the computed result to the expected value.
            assert numpy.allclose(inp * 2, out)

Testing the infer_shape
=======================

When a class inherits from the ``InferShapeTester`` class, it gets the
``self._compile_and_check`` method that tests the Op's ``infer_shape``
method. It tests that the Op gets optimized out of the graph if only
the shape of the output is needed and not the output itself.
Additionally, it checks that the optimized graph computes the correct
shape, by comparing it to the actual shape of the computed output.

``self._compile_and_check`` compiles a Theano function. It takes as
parameters the lists of input and output Theano variables, as would be
provided to ``theano.function``, and a list of real values to pass to
the compiled function (don't use shapes whose dimensions are all
equal, e.g. ``(3, 3)``, as they can easily hide errors). It also takes
the Op class to verify that no Op of that type appears in the
shape-optimized graph.

If there is an error, the function raises an exception. If you want to
see it fail, you can implement an incorrect ``infer_shape``.

.. code-block:: python

        def test_infer_shape(self):
            x = theano.tensor.matrix()
            self._compile_and_check(
                [x],  # theano.function inputs
                [self.op(x)],  # theano.function outputs
                # Always use a non-square matrix!
                # inputs data
                [numpy.asarray(numpy.random.rand(5, 4),
                               dtype=config.floatX)],
                # Op that should be removed from the graph.
                self.op_class)

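
To see why square shapes can mask bugs, consider a hypothetical
``infer_shape`` that mistakenly reverses the dimensions: it returns
the correct answer for every square input, so only a non-square input
exposes the error. A minimal sketch (``buggy_infer_shape`` is an
illustrative name, not a Theano function):

.. code-block:: python

    def buggy_infer_shape(input_shape):
        # Hypothetical bug: the output dimensions are swapped.
        return input_shape[::-1]

    # On a square matrix the bug is invisible:
    assert buggy_infer_shape((3, 3)) == (3, 3)
    # A non-square matrix exposes it immediately:
    assert buggy_infer_shape((5, 4)) == (4, 5)  # expected (5, 4)
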
Testing the gradient
====================

The function :ref:`verify_grad <validating_grad>`
verifies the gradient of an Op or Theano graph. It compares the
analytic (symbolically computed) gradient and the numeric gradient
(computed through the Finite Difference Method).

If there is an error, the function raises an exception. If you want to
see it fail, you can implement an incorrect gradient (for instance, by
removing the multiplication by 2).

.. code-block:: python

        def test_grad(self):
            theano.tests.unittest_tools.verify_grad(self.op,
                                                    [numpy.random.rand(5, 7, 2)])

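
For intuition, the finite-difference check behind ``verify_grad`` can
be sketched in plain numpy. Assuming the doubled op, the analytic
gradient of ``sum(2 * x)`` is 2 everywhere, and a central-difference
estimate should agree with it (``numeric_grad`` here is an
illustrative helper, not the Theano implementation):

.. code-block:: python

    import numpy

    def f(x):
        # The function implemented by DoubleOp.
        return 2 * x

    def numeric_grad(f, x, eps=1e-6):
        # Central finite differences, one input element at a time.
        g = numpy.zeros_like(x)
        flat = x.ravel()      # view into x, so writes perturb x
        gflat = g.ravel()
        for i in range(flat.size):
            orig = flat[i]
            flat[i] = orig + eps
            f_plus = f(x).sum()
            flat[i] = orig - eps
            f_minus = f(x).sum()
            flat[i] = orig
            gflat[i] = (f_plus - f_minus) / (2 * eps)
        return g

    x = numpy.random.rand(5, 7, 2)
    # Analytic gradient of sum(2 * x) w.r.t. x is 2 everywhere.
    analytic = 2 * numpy.ones_like(x)
    assert numpy.allclose(numeric_grad(f, x), analytic, atol=1e-4)
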
Testing the Rop
===============

The class :class:`RopLop_checker` provides the functions
:func:`RopLop_checker.check_mat_rop_lop`,
:func:`RopLop_checker.check_rop_lop` and
:func:`RopLop_checker.check_nondiff_rop`, which allow you to test the
implementation of the Rop method of an Op.

To verify the Rop method of the DoubleOp, you can use this:

.. code-block:: python

    import numpy
    import theano.tests
    from theano.tests.test_rop import RopLop_checker

    class test_DoubleRop(RopLop_checker):
        def setUp(self):
            super(test_DoubleRop, self).setUp()

        def test_double_rop(self):
            self.check_rop_lop(DoubleRop()(self.x), self.in_shape)

You can run it with ``nosetests`` like any other Theano test, or run
it like this in a Python shell:

.. code-block:: python

    t = test_DoubleRop("test_double_rop")
    t.setUp()
    t.test_double_rop()

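
For intuition about what ``check_rop_lop`` verifies: the R-operator
maps a perturbation ``v`` of the input to the resulting perturbation
of the output (a Jacobian-times-vector product). For an op that
doubles its input, the Jacobian is ``2 * I``, so the R-op is simply
``2 * v``. A plain-numpy sketch (not Theano code; ``double_rop`` is an
illustrative name):

.. code-block:: python

    import numpy

    def double(x):
        return 2 * x

    def double_rop(x, v):
        # Jacobian of x -> 2 * x is 2 * I, so J v = 2 * v.
        return 2 * v

    x = numpy.random.rand(5, 7, 2)
    v = numpy.random.rand(5, 7, 2)
    # The R-op must match the numeric directional derivative.
    eps = 1e-6
    numeric = (double(x + eps * v) - double(x - eps * v)) / (2 * eps)
    assert numpy.allclose(double_rop(x, v), numeric)
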
The same commit also updates ``InferShapeTester._compile_and_check``
in ``theano/tests/unittest_tools.py`` to accept an ``excluding``
parameter:

.. code-block:: python

        def _compile_and_check(self, inputs, outputs, numeric_inputs, cls,
                               excluding=None):
            """This tests the infer_shape method only"""
            mode = self.mode
            if excluding:
                mode = mode.excluding(*excluding)