Commit 97e12dae authored by Frederic

Added the local_log_softmax to the list of optimization in the doc

Tell how to get the full list.
Parent b7924531
@@ -16,22 +16,22 @@ The descriptions are brief and point to further reading.
If you would like to add an additional optimization, refer to
:ref:`optimization` in the guide to extending Theano.
.. note::
    This list is partial.

    The print_summary method allows several OpDBs and optimizers to list
    the optimizations they execute, which makes it possible to keep an
    up-to-date list::

        python -c 'import theano; theano.compile.FAST_RUN.optimizer.print_summary()'
        python -c 'import theano; theano.compile.FAST_COMPILE.optimizer.print_summary()'
========================================================= ========= ============ =============
Optimization                                              FAST_RUN  FAST_COMPILE Stabilization
========================================================= ========= ============ =============
:term:`merge`                                             x         x
:term:`constant folding<constant folding>`                x         x
:term:`shape promotion<shape promotion>`                  x
:term:`fill cut<fill cut>`                                x
:term:`inc_subtensor srlz.<inc_subtensor serialization>`  x
@@ -53,7 +53,8 @@ Optimization FAST_RUN FAST_COMPILE
:term:`inplace_random`                                    x
:term:`elemwise fusion`                                   x
:term:`GPU transfer`                                      x
:term:`local_log_softmax`                                 x                      x
========================================================= ========= ============ =============
.. glossary::
@@ -252,5 +253,8 @@ Optimization FAST_RUN FAST_COMPILE
    See :func:`theano.sandbox.cuda.opt.*`.
    local_log_softmax
        This is a stabilization optimization. Due to rounding errors, the
        softmax probability of one value can become 0. Taking the log of 0
        generates -inf, which will likely produce NaN later. This
        optimization computes a numerically more accurate answer.
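The instability this optimization works around can be sketched in plain
NumPy (an illustration of the numerical issue, not Theano's actual
implementation): computing ``log(softmax(x))`` in two steps underflows when
one softmax probability rounds to 0, while a fused log-softmax stays finite.

```python
import numpy as np

# Inputs far enough apart that one softmax probability underflows to 0.
x = np.array([0.0, 1000.0])
shifted = x - x.max()                    # shift by the max to avoid overflow

# Naive two-step version: probs[0] rounds to 0, so its log is -inf.
probs = np.exp(shifted) / np.exp(shifted).sum()
with np.errstate(divide='ignore'):       # silence the log(0) warning
    naive = np.log(probs)                # [-inf, 0.0]

# Fused version: log_softmax(x) = (x - max) - log(sum(exp(x - max))).
stable = shifted - np.log(np.exp(shifted).sum())   # [-1000.0, 0.0]
```

The fused form never materializes the underflowed probability, which is why
a graph rewrite from ``log(softmax(x))`` to a single log-softmax op improves
stability.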