-HINT: Re-running with most Theano optimization disabled could give you a back-traces when this node was created. This can be done with by setting the Theano flags optimizer=fast_compile
+HINT: Re-running with most Theano optimization disabled could give you a back-traces when this node was created. This can be done with by setting the Theano flags 'optimizer=fast_compile'. If that does not work, Theano optimization can be disabled with 'optimizer=None'.
 HINT: Use the Theano flag 'exception_verbosity=high' for a debugprint of this apply node.

 Arguably the most useful information is approximately half-way through
...
@@ -66,9 +66,10 @@ caused the error, as well as the input types, shapes, strides and
 scalar values.

 The two hints can also be helpful when debugging. Using the theano flag
-``optimizer=fast_compile`` can in some cases tell you the faulty line,
-while ``exception_verbosity=high`` will display a debugprint of the
-apply node. Using these hints, the end of the error message becomes :
+``optimizer=fast_compile`` or ``optimizer=None`` can often tell you
+the faulty line, while ``exception_verbosity=high`` will display a
+debugprint of the apply node. Using these hints, the end of the error
+message becomes:
 .. code-block:: bash
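As a concrete sketch of how these flags are supplied on the command line (the script name is only illustrative), both hints name entries of the `THEANO_FLAGS` environment variable, which takes a comma-separated list of `flag=value` pairs and is read when Theano is imported:

```shell
# Enable both debugging hints for the next run; THEANO_FLAGS must be set
# before the script imports Theano, so export it first.
export THEANO_FLAGS='optimizer=fast_compile,exception_verbosity=high'

# Confirm the value the script will see.
echo "$THEANO_FLAGS"

# Then re-run the failing script, e.g.:  python test2.py
```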
...
@@ -84,8 +85,8 @@ apply node. Using these hints, the end of the error message becomes :
-HINT: Re-running with most Theano optimization disabled could give you a back-traces when this node was created. This can be done with by setting the Theano flags optimizer=fast_compile
+HINT: Re-running with most Theano optimization disabled could give you a back-traces when this node was created. This can be done with by setting the Theano flags 'optimizer=fast_compile'. If that does not work, Theano optimization can be disabled with 'optimizer=None'.
 If the above is not informative enough, by instrumenting the code ever
 so slightly, we can get Theano to reveal the exact source of the error.
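Before reaching for heavier instrumentation, the same debugging flags can also be enabled from inside the script itself. This is a minimal stdlib-only sketch (Theano is deliberately not imported here); it relies only on the fact that Theano reads `THEANO_FLAGS` from the environment at import time:

```python
import os

# Set the debugging flags before `import theano` would run: Theano parses
# THEANO_FLAGS when the module is first imported, so later changes to the
# environment variable have no effect on an already-imported Theano.
os.environ['THEANO_FLAGS'] = 'optimizer=fast_compile,exception_verbosity=high'

# Show the configuration the subsequent `import theano` would pick up.
print(os.environ['THEANO_FLAGS'])
```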
...
@@ -182,13 +190,13 @@ following error message, which properly identifies *line 24* as the culprit.
 Traceback (most recent call last):
   File "test2.py", line 24, in <module>
     h1 = T.dot(x, func_of_W1)
-  File "/data/lisa/exp/jeasebas/Theano/theano/tensor/basic.py", line 4734, in dot
+  File "PATH_TO_THEANO/theano/tensor/basic.py", line 4734, in dot
     return _dot(a, b)
-  File "/data/lisa/exp/jeasebas/Theano/theano/gof/op.py", line 545, in __call__
+  File "PATH_TO_THEANO/theano/gof/op.py", line 545, in __call__
     required = thunk()
-  File "/data/lisa/exp/jeasebas/Theano/theano/gof/op.py", line 752, in rval
+  File "PATH_TO_THEANO/theano/gof/op.py", line 752, in rval
     r = p(n, [x[0] for x in i], o)
-  File "/data/lisa/exp/jeasebas/Theano/theano/tensor/basic.py", line 4554, in perform
+  File "PATH_TO_THEANO/theano/tensor/basic.py", line 4554, in perform