Commit d861fd61 authored by Frederic

Fix doc and associated test about get_scalar_constant_value

fixes gh-1300
Parent 97fff534
@@ -208,7 +208,7 @@ TODO: talk about OPTIMIZATION STAGES
 .. code-block:: python
-    from theano.tensor.opt import get_constant_value
+    from theano.tensor.opt import get_scalar_constant_value, NotScalarConstantError
     # Remove any fibby(zeros(...))
     @theano.tensor.opt.register_specialize
@@ -217,9 +217,9 @@ TODO: talk about OPTIMIZATION STAGES
         if node.op == fibby:
             x = node.inputs[0]
             try:
-                if numpy.all(0 == get_constant_value(x)):
+                if numpy.all(0 == get_scalar_constant_value(x)):
                     return [x]
-            except TypeError:
+            except NotScalarConstantError:
                 pass
The ``register_specialize`` decorator is what activates our optimization, and
...
@@ -928,6 +928,8 @@ class T_fibby(unittest.TestCase):
         fibby = Fibby()
+        from theano.tensor.opt import (get_scalar_constant_value,
+                                       NotScalarConstantError)
         # Remove any fibby(zeros(...))
         @theano.tensor.opt.register_specialize
@@ -938,7 +940,7 @@ class T_fibby(unittest.TestCase):
             try:
                 if numpy.all(0 == get_scalar_constant_value(x)):
                     return [x]
-            except TypeError:
+            except NotScalarConstantError:
                 pass
...
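The substance of this commit is swapping the old `get_constant_value` / `except TypeError` pair for `get_scalar_constant_value` / `except NotScalarConstantError`. A minimal, self-contained sketch of that exception-based pattern follows; the `get_scalar_constant_value` body here is a hypothetical stand-in (real Theano walks the computation graph), kept only to show the control flow the diff relies on:

```python
class NotScalarConstantError(Exception):
    """Raised when a variable cannot be reduced to a scalar constant."""


def get_scalar_constant_value(var):
    # Hypothetical stand-in: Theano's real function inspects the graph
    # behind `var`. Here we accept plain Python numbers and reject
    # everything else, which is enough to exercise the pattern.
    if isinstance(var, (int, float)):
        return var
    raise NotScalarConstantError(var)


def simplify(x):
    # Mirrors the optimization in the diff: rewrite fibby(zeros(...)) -> x
    # when the input is a known scalar zero, otherwise leave it alone.
    try:
        if get_scalar_constant_value(x) == 0:
            return "removed"
    except NotScalarConstantError:
        # Not a constant: decline to rewrite, exactly like the `pass`
        # branch in the optimization above.
        pass
    return "kept"
```

With the real Theano functions the `try`/`except NotScalarConstantError` block behaves the same way: the rewrite fires for literal zeros and silently declines for symbolic inputs, which is why catching the generic `TypeError` was the wrong contract.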