Commit 506dec50 authored by Razvan Pascanu

Merge pull request #204 from nouiz/rop_doc

Added doc about R_op.
@@ -133,6 +133,15 @@ Op example
    def grad(self, inputs, output_grads):
        return [output_grads[0] * 2]

    def R_op(self, inputs, eval_points):
        # R_op can receive None as eval_points.
        # That means there is no differentiable path through that input.
        # If this implies that you cannot compute some outputs,
        # return None for those.
        if eval_points[0] is None:
            return eval_points
        return self.grad(inputs, eval_points)
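To see why ``R_op`` can simply reuse ``grad`` here: DoubleOp computes ``y = 2*x``, so its Jacobian is ``2*I``, and the R-operator (the Jacobian-times-vector product ``J @ v``) is just ``2*v``, the same linear map the gradient applies to ``output_grads``. A minimal NumPy sketch of this reasoning (``rop_double`` is a hypothetical helper for illustration, not part of Theano):

```python
import numpy as np

def rop_double(v):
    """R-operator of y = 2*x: returns J @ v, where J = 2*I.

    Mirrors DoubleOp.R_op: a None eval point means there is no
    differentiable path through that input, so return None.
    """
    if v is None:
        return None
    return 2.0 * v

v = np.array([0.5, -1.0, 2.0])
print(rop_double(v))       # same as (2 * np.eye(3)) @ v
print(rop_double(None))    # no differentiable path -> None
```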
Test it!
.. code-block:: python
@@ -165,6 +174,34 @@ To verify the grad method of the DoubleOp, you can use this:
If nothing happens, then it works! If you want to see it fail, you can
implement a wrong gradient (for instance removing the multiplication by 2).
Testing the Rop
---------------
The functions :func:`RopLop_checker.check_mat_rop_lop`,
:func:`RopLop_checker.check_rop_lop` and :func:`RopLop_checker.check_nondiff_rop` allow you to test the implementation of the Rop of a function.
To verify the Rop method of the DoubleOp, you can use this:
.. code-block:: python
    import numpy
    import theano.tests
    from theano.tensor.tests.test_rop import RopLop_checker

    class test_Double(RopLop_checker):
        def setUp(self):
            super(test_Double, self).setUp()

        def test_double_rop(self):
            self.check_rop_lop(DoubleOp()(self.x), self.in_shape)
You can run it with `nosetests` like any other test in Theano, or you can run it directly in a Python shell:
.. code-block:: python
    t = test_Double("test_double_rop")
    t.setUp()
    t.test_double_rop()
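What ``check_rop_lop`` verifies can be pictured numerically: the result of the R-operator must agree with the Jacobian-times-vector product computed by other means, such as materializing the full Jacobian. A NumPy illustration for the ``y = 2*x`` case (the variable names are for illustration only, not Theano API):

```python
import numpy as np

n = 4
v = np.array([1.0, 0.0, -2.0, 3.0])   # the "eval point" direction

# The full Jacobian of y = 2*x is 2*I ...
J = 2.0 * np.eye(n)

# ... so the reference Jacobian-vector product is J @ v,
jvp_reference = J @ v

# while R_op computes the same product directly, never building J.
jvp_rop = 2.0 * v

assert np.allclose(jvp_rop, jvp_reference)
print("Rop matches J @ v")
```

Avoiding the explicit Jacobian is the whole point of ``R_op``: for large inputs ``J`` is quadratic in size, while the directional product stays linear.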
Exercises 8
-----------