Commit d861fd61 authored by Frederic

Fix doc and associated test about get_scalar_constant_value

fixes gh-1300
parent 97fff534
@@ -208,7 +208,7 @@ TODO: talk about OPTIMIZATION STAGES

 .. code-block:: python

-    from theano.tensor.opt import get_constant_value
+    from theano.tensor.opt import get_scalar_constant_value, NotScalarConstantError

     # Remove any fibby(zeros(...))
     @theano.tensor.opt.register_specialize
@@ -217,9 +217,9 @@ TODO: talk about OPTIMIZATION STAGES
         if node.op == fibby:
             x = node.inputs[0]
             try:
-                if numpy.all(0 == get_constant_value(x)):
+                if numpy.all(0 == get_scalar_constant_value(x)):
                     return [x]
-            except TypeError:
+            except NotScalarConstantError:
                 pass

 The ``register_specialize`` decorator is what activates our optimization, and
...
@@ -928,6 +928,8 @@ class T_fibby(unittest.TestCase):
         fibby = Fibby()

+        from theano.tensor.opt import (get_scalar_constant_value,
+                                       NotScalarConstantError)
         # Remove any fibby(zeros(...))
         @theano.tensor.opt.register_specialize
@@ -938,7 +940,7 @@ class T_fibby(unittest.TestCase):
             try:
                 if numpy.all(0 == get_scalar_constant_value(x)):
                     return [x]
-            except TypeError:
+            except NotScalarConstantError:
                 pass
...
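The pattern this patch standardizes on — probe a graph input for a scalar constant, and bail out on a dedicated exception instead of a generic ``TypeError`` — can be sketched in plain Python. The ``NotScalarConstantError`` class and ``get_scalar_constant_value`` function below are simplified stand-ins that only mimic the shape of the Theano API, not the real implementations:

```python
# Simplified stand-ins mimicking the Theano API (not the real implementations).
class NotScalarConstantError(Exception):
    """Raised when a value cannot be reduced to a scalar constant."""


def get_scalar_constant_value(v):
    # Accept plain Python scalars; anything else is not a scalar constant.
    if isinstance(v, (int, float)) and not isinstance(v, bool):
        return v
    raise NotScalarConstantError(v)


def simplify_fibby_of_zeros(x):
    # Mirror of the optimization in the diff: fibby(zeros(...)) -> zeros(...).
    # A dedicated exception lets us distinguish "not a constant" (skip the
    # rewrite) from real programming errors, which still propagate.
    try:
        if get_scalar_constant_value(x) == 0:
            return x        # the optimization fires: forward the input
    except NotScalarConstantError:
        pass                # not a known scalar constant: leave the graph alone
    return None             # no rewrite applies
```

For example, ``simplify_fibby_of_zeros(0)`` returns the input unchanged, while a non-constant argument such as a list falls through the ``except`` clause and returns ``None``, so the surrounding graph is left untouched.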