Commit 671c57f0 authored by James Bergstra

docs

Parent ec4c1888
.. _debugmode:

=================
:mod:`debugmode`
=================

.. module:: debugmode
   :platform: Unix, Windows
   :synopsis: defines DebugMode
.. moduleauthor:: LISA
Guide
=====
The DebugMode evaluation mode (available via ``mode='DEBUG_MODE'``,
If any problem is detected, DebugMode will raise an exception according to
what went wrong, either at call time (``f(5)``) or compile time
(``f = theano.function(x, 10*x, mode='DEBUG_MODE')``). These exceptions
should *not* be ignored; talk to your local Theano guru or email the
users list if you cannot make the exception go away.
If you instantiate DebugMode using the constructor ``compile.DebugMode``
rather than the keyword ``DEBUG_MODE`` you can configure its behaviour via
constructor arguments.
Reference
==========
.. class:: DebugMode(Mode)

   Evaluation Mode that detects internal Theano errors.

   This mode catches several kinds of internal error:

   - inconsistent ``c_code`` and ``perform`` implementations (see `BadCLinkerOutput`)
   - a variable replacing another when their runtime values don't match; this is a symptom of
     an incorrect optimization step, or a faulty Op implementation (raises `BadOptimization`)
   - stochastic optimization ordering (raises `StochasticOrder`)
   - incomplete ``destroy_map`` specification (raises `BadDestroyMap`)
   - an Op that returns an illegal value not matching the output Variable Type (raises
     `InvalidValueError`)

   Each of these exceptions inherits from the more generic `DebugModeError`.

   If there are no internal errors, this mode behaves like FAST_RUN or FAST_COMPILE, but it takes
   a little longer and uses more memory.

   If there is an internal error, this mode raises a `DebugModeError` exception.

   :remark: The work of debugging is implemented by the `_Maker`, `_Linker`, and
      `_VariableEquivalenceTracker` classes.
.. attribute:: stability_patience = config.THEANO_DEBUGMODE_PATIENCE

   When checking for the stability of optimization, recompile the graph this many times.
   Default 10.

.. attribute:: check_c_code = config.THEANO_DEBUGMODE_CHECK_C

   Should we evaluate (and check) the `c_code` implementations?
   ``True`` -> yes, ``False`` -> no.
   Default yes.
.. attribute:: check_py_code = config.THEANO_DEBUGMODE_CHECK_PY

   Should we evaluate (and check) the `perform` implementations?
   ``True`` -> yes, ``False`` -> no.
   Default yes.

.. attribute:: check_isfinite = config.THEANO_DEBUGMODE_CHECK_FINITE

   Should we check for (and complain about) ``NaN``/``Inf`` ndarray elements?
   ``True`` -> yes, ``False`` -> no.
   Default yes.

.. attribute:: require_matching_strides = config.THEANO_DEBUGMODE_CHECK_STRIDES

   Check for (and complain about) Ops whose Python and C
   outputs are ndarrays with different strides. (This can catch bugs, but
   is generally overly strict.)
   0 -> no check, 1 -> warn, 2 -> err.
   Default warn.

.. method:: __init__(self, optimizer='fast_run', stability_patience=None, check_c_code=None, check_py_code=None, check_isfinite=None, require_matching_strides=None, linker=None)

   Initialize member variables.

   If any of these arguments (except ``optimizer``) is not None, it overrides the class default.
   The ``linker`` argument is not used; it is accepted only so that ``Mode.requiring()`` and some
   other functions also work with DebugMode.
The keyword version of DebugMode (which you get by using ``mode='DEBUG_MODE'``)
is quite strict, and can raise several different Exception types.
The following are DebugMode exceptions you might encounter:
.. class:: DebugModeError

   This is a generic error. All the other exceptions inherit from this one.
   This error is typically not raised directly.
   However, you can use ``except DebugModeError: ...`` to catch any of the more
   specific types of Exception.
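The catch-all behaviour can be illustrated with plain Python classes that mirror the hierarchy. These are hypothetical stand-ins for illustration, not Theano's actual definitions:

```python
# Hypothetical stand-ins mirroring DebugMode's exception hierarchy.
class DebugModeError(Exception):
    """Generic base: every specific DebugMode failure inherits from this."""

class BadCLinkerOutput(DebugModeError):
    """perform() and c_code() disagreed."""

class InvalidValueError(DebugModeError):
    """An Op produced a value its output Type rejects."""

def run_checked(thunk):
    """Run a computation; report which DebugMode failure (if any) occurred."""
    try:
        thunk()
    except DebugModeError as e:
        # A single handler catches every specific DebugMode failure.
        return type(e).__name__
    return 'ok'
```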
.. class:: BadCLinkerOutput

   This exception means that the Python (``perform``) and C (``c_code``)
   implementations of an Op didn't compute the same thing, as they are
   supposed to.
   The problem might be a bug in either ``perform`` or ``c_code`` (or both).
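The consistency check behind this exception can be sketched in plain Python: evaluate both implementations on the same inputs and compare the outputs elementwise within a tolerance. The names ``py_impl`` and ``c_impl`` are hypothetical stand-ins:

```python
class BadCLinkerOutput(Exception):
    """Stand-in for debugmode's BadCLinkerOutput."""

def check_impls(py_impl, c_impl, inputs, rtol=1e-8):
    """Run both implementations and raise if their outputs diverge."""
    py_out = py_impl(*inputs)
    c_out = c_impl(*inputs)
    for i, (a, b) in enumerate(zip(py_out, c_out)):
        # Relative comparison, with an absolute floor for tiny values.
        if abs(a - b) > rtol * max(abs(a), abs(b), 1.0):
            raise BadCLinkerOutput(
                'element %d: perform gave %r, c_code gave %r' % (i, a, b))
    return py_out
```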
.. class:: BadOptimization

   This exception indicates that an Optimization replaced one variable (say V1)
   with another one (say V2), but at runtime the values for V1 and V2 were
   different. This is something that optimizations are not supposed to do.

   It can be tricky to identify the one true cause of an optimization error, but
   this exception provides a lot of guidance. Most of the time, the
   exception object will indicate which optimization was at fault.
   The exception object also contains information such as a snapshot of the
   before/after graph where the optimization introduced the error.
.. class:: BadDestroyMap

   This happens when an Op's ``perform()`` or ``c_code()`` modifies an input that it wasn't
   supposed to. If either the ``perform`` or ``c_code`` implementation of an Op
   might modify any input, it has to advertise that fact via the ``destroy_map``
   attribute.

   For detailed documentation on the ``destroy_map`` attribute, see :ref:`inplace`.
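The check can be sketched in plain Python: snapshot every input, run the Op, and complain if an input changed without being declared in ``destroy_map``. The helper names here are hypothetical:

```python
import copy

class BadDestroyMap(Exception):
    """Stand-in for debugmode's BadDestroyMap."""

def check_destroy_map(op, inputs, destroy_map=None):
    """Run op(*inputs) and verify it only modified declared inputs.

    destroy_map maps output index -> list of input indices the Op is
    allowed to destroy, mirroring the attribute described above.
    """
    allowed = set()
    for destroyed in (destroy_map or {}).values():
        allowed.update(destroyed)
    snapshots = [copy.deepcopy(i) for i in inputs]
    out = op(*inputs)
    for idx, (before, after) in enumerate(zip(snapshots, inputs)):
        if before != after and idx not in allowed:
            raise BadDestroyMap('input %d modified but not declared' % idx)
    return out
```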
.. class:: BadViewMap

   This happens when an Op's ``perform()`` or ``c_code()`` creates an alias or alias-like
   dependency between an input and an output, and it didn't warn the
   optimization system via the ``view_map`` attribute.

   For detailed documentation on the ``view_map`` attribute, see :ref:`views`.
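A simplified sketch of the idea: an undeclared alias can be detected by checking object identity between outputs and inputs (real aliasing detection also has to cover partial views such as array slices; the names here are hypothetical):

```python
class BadViewMap(Exception):
    """Stand-in for debugmode's BadViewMap."""

def check_view_map(op, inputs, view_map=None):
    """Run op(*inputs) and verify outputs only alias declared inputs.

    view_map maps output index -> list of input indices the output is
    allowed to alias, mirroring the attribute described above.
    """
    outputs = op(*inputs)
    declared = view_map or {}
    for o_idx, out in enumerate(outputs):
        for i_idx, inp in enumerate(inputs):
            if out is inp and i_idx not in declared.get(o_idx, []):
                raise BadViewMap(
                    'output %d aliases input %d undeclared' % (o_idx, i_idx))
    return outputs
```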
.. class:: StochasticOrder

   This happens when an optimization does not perform the same graph operations
   in the same order when run several times in a row. This can happen if any
   steps are ordered by ``id(object)`` somehow, such as via the default object
   hash function. A stochastic optimization invalidates the pattern of work
   whereby we debug in DEBUG_MODE and then run the full-size jobs in FAST_RUN.
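The stability check can be sketched in plain Python: re-run the optimization several times (mirroring ``stability_patience``) and raise if the resulting order ever differs. The fix on the optimization side is to order by a stable key, never by ``id()``. The node representation here is a hypothetical stand-in:

```python
class StochasticOrder(Exception):
    """Stand-in for debugmode's StochasticOrder."""

def check_stable_order(optimize, graph, patience=10):
    """Re-optimize `patience` times and raise if the order ever differs."""
    reference = optimize(list(graph))
    for _ in range(patience - 1):
        if optimize(list(graph)) != reference:
            raise StochasticOrder('optimization order is not deterministic')
    return reference

def stable_opt(nodes):
    # Deterministic: order nodes by a stable attribute, never by id(object).
    return sorted(nodes, key=lambda n: n['name'])
```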
.. class:: InvalidValueError

   This happens when some Op's ``perform`` or ``c_code`` implementation computes
   an output that is invalid with respect to the type of the corresponding output
   variable, for example returning a complex-valued ndarray for a ``dscalar``
   Type.

   This can also be triggered when floating-point values such as NaN and Inf are
   introduced into the computation. The exception indicates which Op created the
   first NaN. These floating-point values can be allowed by passing the
   ``check_isfinite=False`` argument to DebugMode.
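The value check can be sketched as follows: validate each output against its expected type and, when finiteness checking is enabled, reject NaN/Inf. This is a pure-Python stand-in for illustration, not Theano's actual validation:

```python
import math

class InvalidValueError(Exception):
    """Stand-in for debugmode's InvalidValueError."""

def check_value(value, expected_type=float, check_isfinite=True):
    """Validate an Op output against the Type of its output Variable."""
    if not isinstance(value, expected_type):
        # e.g. a complex value where a dscalar-like float was expected.
        raise InvalidValueError('expected %s, got %r' % (expected_type, value))
    if check_isfinite and (math.isnan(value) or math.isinf(value)):
        raise InvalidValueError('non-finite value %r' % value)
    return value
```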
   :maxdepth: 1

   function
   io
   mode
   module
   debugmode
   profilemode
There are also some top-level imports that you might find more convenient:
.. module:: theano
   :platform: Unix, Windows
   :synopsis: Theano top-level import
.. moduleauthor:: LISA

.. function:: function(...)

   Alias for :func:`function.function`

.. class:: Param

   Alias for :class:`function.Param`
class Param(object):
    def __init__(self, variable, default=None, name=None, mutable=False, strict=False,
                 implicit=None):
        """
        :param variable: A variable in an expression graph to use as a compiled-function parameter.
        :param default: The default value to use at call-time (can also be a Container where
            the function will find a value at call-time.)
        """