Commit b3965fab authored by Pascal Lamblin

Small reformulations in doc.

Parent ebfa6163
Shape inference problem
=======================
Theano propagates shape information in the graph. Sometimes this
can lead to errors. For example:
.. code-block:: python

    # ...
    # |Shape_i{1} [@55959184] '' 0
    # | |<TensorType(float64, matrix)> [@55583888]
    print f(xv, yv)  # Does NOT raise an error, though it should.
    # [8, 4]

    f = theano.function([x, y], z)  # Does not take the shape.
    theano.printing.debugprint(f)
    # Join [@44540496] '' 0
    # |0 [@44540432]
    # ...
    f(xv, yv)
    # Raises a dimension mismatch error.
As you see, when you ask for the shape of some computation (the join in the
example), we sometimes compute an inferred shape directly, without executing
the computation itself (there is no join in the first output of debugprint).
This makes the computation of the shape faster, but it can hide errors. In
the example, the shape of the join is inferred from the first Theano
variable in the join only, not from the others.
This can probably happen with many other ops, such as elemwise and dot.
Indeed, to make some optimizations (for speed or stability, for instance),
Theano can assume that the computation is correct and consistent
in the first place; this is the case here.
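The hidden-error mechanism above can be sketched without Theano. The
``inferred_join_shape`` helper below is a hypothetical illustration of the
shortcut: it sums the lengths along the join axis but reads every other
dimension from the first input only, so it happily reports ``(8, 4)`` for
inputs that a real join must reject. NumPy's ``concatenate`` stands in for
``join``, and the input shapes are assumptions chosen to match the
``[8, 4]`` output in the example.

```python
import numpy as np

# Hypothetical inputs whose column counts disagree: a real join must fail.
xv = np.ones((4, 4))
yv = np.ones((4, 5))

def inferred_join_shape(a, b, axis=0):
    """Mimic the shortcut: sum the lengths along `axis`, but copy every
    other dimension from the FIRST input -- the second is never checked."""
    shape = list(a.shape)
    shape[axis] = a.shape[axis] + b.shape[axis]
    return tuple(shape)

print(inferred_join_shape(xv, yv))  # (8, 4): no error, the mismatch is hidden

# Performing the join for real exposes the dimension mismatch:
try:
    np.concatenate([xv, yv], axis=0)
except ValueError as err:
    print("join failed:", err)
```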
You can detect those problems by running the code without this
optimization, with the Theano flag
`optimizer_excluding=local_shape_to_shape_i`. You can also get the
same effect by running in the mode FAST_COMPILE (it will not apply this
optimization, nor most other optimizations) or DEBUG_MODE (it will test
before and after all optimizations, which is much slower).
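Concretely, the flags from the paragraph above can be passed through the
``THEANO_FLAGS`` environment variable; ``my_script.py`` is a placeholder for
your own program.

```shell
# Disable only the shape-inference shortcut:
THEANO_FLAGS=optimizer_excluding=local_shape_to_shape_i python my_script.py

# Or change the compilation mode entirely:
THEANO_FLAGS=mode=FAST_COMPILE python my_script.py  # skips most optimizations
THEANO_FLAGS=mode=DEBUG_MODE python my_script.py    # checks everything; much slower
```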
Specifying exact shape
======================