HINT: Re-running with most Theano optimization disabled could give you a back-traces when this node was created. This can be done with by setting the Theano flags 'optimizer=fast_compile'. If that does not work, Theano optimization can be disabled with 'optimizer=None'.
HINT: Use the Theano flag 'exception_verbosity=high' for a debugprint of this apply node.
Arguably the most useful information is approximately half-way through
the error message, where the kind of error is displayed along with its
cause (`ValueError: Input dimension mis-match. (input[0].shape[0] = 3,
input[1].shape[0] = 2)`).
Below it, some other information is given, such as the apply node that
caused the error, as well as the input types, shapes, strides and
scalar values.
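The dimension mismatch reported above is the same kind of shape check NumPy performs when broadcasting elementwise operations. As a minimal NumPy-only sketch (independent of Theano), adding two vectors whose leading dimensions are 3 and 2, as in the message above, triggers an analogous error:

```python
import numpy

a = numpy.zeros(3)  # plays the role of input[0], with shape[0] = 3
b = numpy.zeros(2)  # plays the role of input[1], with shape[0] = 2

try:
    a + b  # elementwise add requires matching (or broadcastable) shapes
except ValueError as e:
    print("ValueError:", e)
```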
The two hints can also be helpful when debugging. Using the Theano flag
``optimizer=fast_compile`` or ``optimizer=None`` can often point to
the faulty line, while ``exception_verbosity=high`` will display a
debugprint of the apply node. Using these hints, the offending line of
user code can usually be identified.
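Theano flags can be set for a single run through the ``THEANO_FLAGS``
environment variable (or persistently in ``~/.theanorc``). The script name
below is a placeholder for the failing program:

```shell
# Use the cheaper optimizer and verbose exceptions for one run.
# 'your_script.py' is a placeholder for the failing program.
THEANO_FLAGS='optimizer=fast_compile,exception_verbosity=high' python your_script.py

# If that does not work, disable Theano optimization entirely:
THEANO_FLAGS='optimizer=None' python your_script.py
```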
If the above is not informative enough, by instrumenting the code ever
so slightly, we can get Theano to reveal the exact source of the error.
.. code-block:: python

    ...

This allows Theano to evaluate symbolic expressions on-the-fly (by
calling the ``perform`` method of each op), as they are being defined. Sources
of error can thus be identified with much more precision and much earlier in
the compilation pipeline. For example, running the above code yields the
following error message, which properly identifies *line 24* as the culprit.
.. code-block:: bash

    Traceback (most recent call last):
      File "test2.py", line 24, in <module>
        h1 = T.dot(x, func_of_W1)
      File "PATH_TO_THEANO/theano/tensor/basic.py", line 4734, in dot
        return _dot(a, b)
      File "PATH_TO_THEANO/theano/gof/op.py", line 545, in __call__
        required = thunk()
      File "PATH_TO_THEANO/theano/gof/op.py", line 752, in rval
        r = p(n, [x[0] for x in i], o)
      File "PATH_TO_THEANO/theano/tensor/basic.py", line 4554, in perform
        z[0] = numpy.asarray(numpy.dot(x, y))
    ValueError: matrices are not aligned
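The final ``ValueError`` comes straight from NumPy: ``perform`` for the dot
op ultimately calls ``numpy.dot``, which rejects the product because the
inner dimensions disagree. Assuming the example's ``x`` has shape (5, 10)
and ``func_of_W1`` has shape (20, 10), the failure can be reproduced with
NumPy alone:

```python
import numpy

x = numpy.random.rand(5, 10)            # assumed shape of the input matrix x
func_of_W1 = numpy.random.rand(20, 10)  # assumed shape of the transformed W1

try:
    numpy.dot(x, func_of_W1)  # inner dimensions 10 and 20 do not match
except ValueError as e:
    print("ValueError:", e)
```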
The ``compute_test_value`` mechanism works as follows: