HINT: Re-running with most Theano optimization disabled could give you a back-trace of when this node was created. This can be done by setting the Theano flag 'optimizer=fast_compile'. If that does not work, Theano optimization can be disabled with 'optimizer=None'.
HINT: Use the Theano flag 'exception_verbosity=high' for a debugprint of this apply node.
Arguably the most useful information is approximately half-way through
the error message, where the kind of error is displayed along with its
...

the faulty line, while ``exception_verbosity=high`` will display a
debugprint of the apply node. Using these hints, the end of the error
message becomes:

.. code-block:: none

    Backtrace when the node is created:
      File "test0.py", line 8, in <module>
...

following example. Here, we use ``exception_verbosity=high`` and

...

``optimizer=None`` would and it could therefore be used instead of test values.
.. testcode:: testvalue

    import numpy
    import theano
...
Running the above code generates the following error message:
.. testoutput:: testvalue

    Traceback (most recent call last):
      File "test1.py", line 31, in <module>
...
If the above is not informative enough, by instrumenting the code ever
so slightly, we can get Theano to reveal the exact source of the error.
.. code-block:: python

    # enable on-the-fly graph computations
    theano.config.compute_test_value = 'warn'
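Conceptually, ``compute_test_value`` evaluates every operation on small test
inputs while the graph is being built, so a shape error surfaces at the line
that creates the faulty node rather than deep inside the compiled function.
The idea can be sketched with plain numpy (the ``eager_dot`` helper and the
shapes below are illustrative, not Theano API):

.. code-block:: python

    import numpy as np

    def eager_dot(x_test, y_test):
        # Emulate test-value propagation: run the operation on the test
        # values immediately, so an incompatible shape raises here, at
        # the line that builds the node, rather than later at run time.
        return np.dot(x_test, y_test)

    x = np.ones((2, 3))
    y = np.ones((3, 4))
    z = eager_dot(x, y)  # shapes align: (2, 3) . (3, 4) -> (2, 4)

    caught = None
    try:
        eager_dot(x, np.ones((2, 4)))  # (2, 3) . (2, 4) is invalid
    except ValueError as err:
        caught = err
    print('caught at graph-construction time:', caught)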
...

of error can thus be identified with much more precision and much earlier in
the compilation pipeline. For example, running the above code yields the
following error message, which properly identifies *line 24* as the culprit.
.. code-block:: none

    Traceback (most recent call last):
      File "test2.py", line 24, in <module>
...

can be achieved as follows:

.. code-block:: python

    f(0)  # log(0) * 0 = -inf * 0 = NaN
.. testoutput:: compiled
    :options: +NORMALIZE_WHITESPACE

    *** NaN detected ***
    Elemwise{Composite{(log(i0) * i0)}} [@A] ''
     |x [@B]
    Inputs : [array(0.0)]
    Outputs: [array(nan)]
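The monitor shown above inspects each node's inputs and outputs as the
function executes. Its core idea can be sketched in plain numpy (the
``detect_nan`` wrapper below is illustrative, not the actual Theano
monitoring API):

.. code-block:: python

    import numpy as np

    def detect_nan(fn, *inputs):
        # Run the computation, then inspect inputs and outputs for NaN,
        # mimicking what a per-node NaN monitor reports.
        with np.errstate(divide='ignore', invalid='ignore'):
            out = fn(*inputs)
        if any(np.isnan(np.asarray(v)).any() for v in list(inputs) + [out]):
            print('*** NaN detected ***')
            print('Inputs :', [np.asarray(v) for v in inputs])
            print('Outputs:', np.asarray(out))
        return out

    result = detect_nan(lambda x: np.log(x) * x, 0.0)  # log(0) * 0 -> nan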
To help understand what is happening in your graph, you can
disable the ``local_elemwise_fusion`` and all ``inplace``
...

the execution of the node can garbage collect its inputs that aren't
needed anymore by the Theano function. This can be done with the Theano
flag:
.. code-block:: python

    allow_gc=False
.. TODO: documentation for link.WrapLinkerMany
...

Consider this example script ("ex.py"):
.. testcode::

    import theano
    import numpy
    import theano.tensor as T

    a = T.dmatrix('a')
    b = T.dmatrix('b')

    f = theano.function([a, b], [a * b])

    # matrices chosen so dimensions are unsuitable for multiplication
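The mismatch the script is being set up to trigger follows the elementwise
broadcasting rule: each pair of dimensions must match or one of them must
be 1. A plain-numpy preview of the failure (the shapes here are
illustrative):

.. code-block:: python

    import numpy as np

    # shapes (3, 4) and (4, 3) cannot broadcast elementwise
    a = np.ones((3, 4))
    b = np.ones((4, 3))

    caught = None
    try:
        c = a * b
    except ValueError as err:
        caught = err
    print('shape mismatch:', caught)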