Commit 38f3f8c5 authored by Frederic

Add code to test the example

Parent 5cfd421a
@@ -32,7 +32,6 @@ you should check the strides and alignment.
.. code-block:: python

    class Fibby(theano.Op):
        """
        An arbitrarily generalized Fibonacci sequence
        """
@@ -239,3 +238,26 @@ argument for parameter ``node``. It tests using
function ``get_constant_value``, which determines if a
Variable (``x``) is guaranteed to be a constant, and if so, what constant.
Test the optimization
=====================

Here is some code that tests that the optimization is applied only when needed.
.. code-block:: python

    import numpy

    import theano
    import theano.tensor as T
    from theano import function

    # Test that the optimization is not applied when it is not needed.
    x = T.dvector()
    f = function([x], fibby(x))
    # theano.printing.debugprint(f)
    f(numpy.random.rand(5))
    topo = f.maker.fgraph.toposort()
    assert len(topo) == 1
    assert isinstance(topo[0].op, Fibby)

    # Test that the optimization is applied when it is needed.
    f_zero = function([], fibby(T.zeros([5])))
    # theano.printing.debugprint(f_zero)
    f_zero()
    topo = f_zero.maker.fgraph.toposort()
    assert len(topo) == 1
    assert isinstance(topo[0].op, theano.compile.ops.DeepCopyOp)
@@ -950,6 +950,22 @@ class T_fibby(unittest.TestCase):
except NotScalarConstantError:
pass
# Test that the optimization is not applied when it is not needed.
x = T.dvector()
f = function([x], fibby(x))
#theano.printing.debugprint(f)
f(numpy.random.rand(5))
topo = f.maker.fgraph.toposort()
assert len(topo) == 1
assert isinstance(topo[0].op, Fibby)
# Test that the optimization is applied when it is needed.
f_zero = function([], fibby(T.zeros([5])))
#theano.printing.debugprint(f_zero)
f_zero()
topo = f_zero.maker.fgraph.toposort()
assert len(topo) == 1
assert isinstance(topo[0].op, theano.compile.ops.DeepCopyOp)
class T_graphstructures(unittest.TestCase):