Commit be622607 authored by Frederic

Update doc about printing.

Parent bc8fc0f1
@@ -12,18 +12,18 @@
Guide
======
Printing during execution
-------------------------
Intermediate values in a computation cannot be printed in
the normal python way with the print statement, because Theano has no *statements*.
Instead there is the :class:`Print` Op.
>>> x = T.dvector()
>>> hello_world_op = printing.Print('hello world')
>>> printed_x = hello_world_op(x)
>>> f = function([x], printed_x)
>>> f([1, 2, 3])
>>> # output: "hello world __str__ = [ 1. 2. 3.]"
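The pass-through behaviour above can be sketched in plain Python (a hypothetical stand-in for illustration only; the real :class:`Print` is a symbolic Op spliced into the computation graph, not a host-language function, and it takes further options not shown here):

```python
# Pure-Python sketch of Print's semantics: an identity function with a
# printing side effect. Not Theano code.
def make_print_op(message):
    """Return an identity function that prints `message` and its argument."""
    def op(x):
        print('%s __str__ = %s' % (message, x))
        return x  # the value flows through unchanged
    return op

hello_world_op = make_print_op('hello world')
result = hello_world_op([1.0, 2.0, 3.0])
# prints: hello world __str__ = [1.0, 2.0, 3.0]
```

The key design point this mirrors is that printing does not alter the computation: the Op returns its input untouched, so it can be inserted anywhere in a graph without changing results.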
If you print more than one thing in a function like `f`, they will not
@@ -39,15 +39,15 @@ Printing graphs
---------------
Theano provides two functions (:func:`theano.pp` and
:func:`theano.printing.debugprint`) to print a graph to the terminal before or after
compilation. These two functions print expression graphs in different ways:
:func:`pp` is more compact and math-like, :func:`debugprint` is more verbose.
Theano also provides :func:`theano.printing.pydotprint` that creates a png image of the function.
1) The first is :func:`theano.pp`.
>>> x = T.dscalar('x')
>>> y = x ** 2
>>> gy = T.grad(y, x)
>>> pp(gy) # print out the gradient prior to optimization
'((fill((x ** 2), 1.0) * 2) * (x ** (2 - 1)))'
@@ -71,56 +71,63 @@ iteration number or other kinds of information in the name.
To make graphs legible, :func:`pp` hides some Ops that are actually in the graph. For example,
automatic DimShuffles are not shown.
2) The second function to print a graph is :func:`theano.printing.debugprint`
>>> theano.printing.debugprint(f.maker.fgraph.outputs[0])
Elemwise{mul,no_inplace} [@A] ''
 |TensorConstant{2.0} [@B]
 |x [@C]
Each line printed represents a Variable in the graph.
The line ``|x [@C]`` means the variable named ``x`` with debugprint identifier
``[@C]`` is an input of the Elemwise. If you accidentally have two variables called ``x`` in
your graph, their different debugprint identifiers will be your clue.
The line ``|TensorConstant{2.0} [@B]`` means that there is a constant 2.0
with this debugprint identifier.
The line ``Elemwise{mul,no_inplace} [@A] ''`` is indented less than
the other ones, because it means there is a variable computed by multiplying
the other (more indented) ones together.
The ``|`` symbols are only there to make large graphs easier to read. They
group together the inputs to a node.
Sometimes, you'll see a Variable but not the inputs underneath. That can
happen when that Variable has already been printed. Where else has it been
printed? Look for its debugprint identifier using the Find feature of your text
editor.
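The identifier scheme can be sketched in plain Python (an illustrative toy over tuples, not Theano's actual implementation; names like ``mini_debugprint`` are invented for this sketch):

```python
# Toy depth-first printer mimicking debugprint's output shape:
# fresh nodes get identifiers [@A], [@B], ...; a node encountered a
# second time is printed by identifier only, without its inputs.
import string

def mini_debugprint(node, ids=None, prefix='', lines=None):
    """node is a tuple: (name, *input_nodes). Returns the printed lines."""
    if ids is None:
        ids = {}
    if lines is None:
        lines = []
    name, inputs = node[0], node[1:]
    if id(node) not in ids:
        ids[id(node)] = '[@%s]' % string.ascii_uppercase[len(ids)]
        lines.append('%s%s %s' % (prefix, name, ids[id(node)]))
        for inp in inputs:
            mini_debugprint(inp, ids, prefix + ' |', lines)
    else:
        # already printed once: show only the identifier, skip the inputs
        lines.append('%s%s %s' % (prefix, name, ids[id(node)]))
    return lines

x = ('x',)
two = ('TensorConstant{2}',)
pow_node = ('Elemwise{pow}', x, two)
graph = ('Elemwise{add}', pow_node, pow_node)  # pow_node is shared
print('\n'.join(mini_debugprint(graph)))
```

Running this prints the shared ``Elemwise{pow}`` subtree in full the first time and as a bare ``[@B]`` reference the second time, which is exactly the re-encounter behaviour described above.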
>>> theano.printing.debugprint(gy)
>>> theano.printing.debugprint(gy)
Elemwise{mul} [@A] ''
 |Elemwise{mul} [@B] ''
 | |Elemwise{second,no_inplace} [@C] ''
 | | |Elemwise{pow,no_inplace} [@D] ''
 | | | |x [@E]
 | | | |TensorConstant{2} [@F]
 | | |TensorConstant{1.0} [@G]
 | |TensorConstant{2} [@F]
 |Elemwise{pow} [@H] ''
 | |x [@E]
 | |Elemwise{sub} [@I] ''
 | | |TensorConstant{2} [@F]
 | | |InplaceDimShuffle{} [@J] ''
 | | | |TensorConstant{1} [@K]
>>> theano.printing.debugprint(gy, depth=2)
Elemwise{mul} [@A] ''
 |Elemwise{mul} [@B] ''
 |Elemwise{pow} [@C] ''
If the depth parameter is provided, it limits the number of levels that are
shown.
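The effect of the depth parameter can be sketched as follows (a toy traversal over tuples, not Theano's code; ``limited_print`` is an invented name):

```python
# Toy sketch of depth-limited graph printing: depth=-1 means no limit,
# as with debugprint's default.
def limited_print(node, depth=-1, level=0, lines=None):
    """node is a tuple: (name, *input_nodes). Returns the printed lines."""
    if lines is None:
        lines = []
    if depth != -1 and level >= depth:
        return lines  # stop once `depth` levels have been emitted
    lines.append('  ' * level + node[0])
    for inp in node[1:]:
        limited_print(inp, depth, level + 1, lines)
    return lines

# same shape as the gradient graph above
x = ('x',)
graph = ('Elemwise{mul}',
         ('Elemwise{mul}', ('Elemwise{pow}', x, ('2',)), ('2',)),
         ('Elemwise{pow}', x, ('Elemwise{sub}', ('2',), ('1',))))
print('\n'.join(limited_print(graph, depth=2)))
```

With ``depth=2`` only the root and its direct inputs appear (three lines, as in the debugprint example above); with the default ``depth=-1`` the whole graph is printed.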
3) The function :func:`theano.printing.pydotprint` will print a compiled theano function to a png file.
In the image, Apply nodes (the applications of ops) are shown as boxes and variables are shown as ovals.
The number at the end of each label indicates graph position.
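The box-and-oval layout can be sketched by building a Graphviz DOT string by hand (an illustrative toy under the assumption that the rendering is left to Graphviz; :func:`pydotprint` itself goes through the pydot library, and ``to_dot`` is an invented helper):

```python
# Toy sketch: emit a DOT description where Apply nodes (op applications)
# are boxes and variables are ovals, with edges for inputs and outputs.
def to_dot(applies):
    """applies: list of (op_name, input_var_names, output_var_name)."""
    lines = ['digraph G {']
    variables = set()
    for i, (op, inputs, output) in enumerate(applies):
        apply_id = 'apply%d' % i
        lines.append('  %s [label="%s", shape=box];' % (apply_id, op))
        for v in inputs:
            variables.add(v)
            lines.append('  %s -> %s;' % (v, apply_id))
        variables.add(output)
        lines.append('  %s -> %s;' % (apply_id, output))
    for v in sorted(variables):
        lines.append('  %s [shape=oval];' % v)
    lines.append('}')
    return '\n'.join(lines)

dot = to_dot([('pow', ['x', 'two'], 'y')])
print(dot)
```

Feeding such a string to the ``dot`` command-line tool (``dot -Tpng``) produces the kind of png image described here.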
@@ -170,10 +177,13 @@ Reference
running the function will print the value that `x` takes in the graph.
.. autofunction:: theano.printing.debugprint

.. autofunction:: theano.printing.pp(*args)

.. autofunction:: theano.printing.pydotprint
@@ -187,7 +187,7 @@ Theano provides two functions (:func:`theano.pp` and
:func:`theano.printing.debugprint`) to print a graph to the terminal before or after
compilation. These two functions print expression graphs in different ways:
:func:`pp` is more compact and math-like, :func:`debugprint` is more verbose.
Theano also provides :func:`theano.printing.pydotprint` that creates a png image of the function.
You can read about them in :ref:`libdoc_printing`.
@@ -30,7 +30,7 @@ _logger = logging.getLogger("theano.printing")
def debugprint(obj, depth=-1, print_type=False,
file=None, ids='CHAR', stop_on_name=False):
"""Print a computation graph as text to stdout or a file.
:type obj: Variable, Apply, or Function instance
:param obj: symbolic thing to print
@@ -56,12 +56,12 @@ def debugprint(obj, depth=-1, print_type=False,
The first part of the text identifies whether it is an input
(if a name or type is printed) or the output of some Apply (in which case
the Op is printed).
The second part of the text is an identifier of the Variable.
If print_type is True, we add a part containing the type of the Variable.
If a Variable is encountered multiple times in the depth-first search,
it is only printed recursively the first time. Later, just the Variable
identifier is printed.
If an Apply has multiple outputs, then a '.N' suffix will be appended
to the Apply's identifier, to indicate which output a line corresponds to.
@@ -461,7 +461,9 @@ pprint.assign(lambda pstate, r: hasattr(pstate, 'target')
LeafPrinter())
pp = pprint
"""
Print a math-like expression to the terminal.
"""
# colors not used: orange, amber#FFBF00, purple, pink,
# used by default: green, blue, grey, red