Commit fa5f3ed9 authored by Christof Angermueller

Replace pydotprint_variables by pydotprint and minor changes

Parent 81e6bba0
...@@ -304,6 +304,8 @@ Consider the following logistic regression model:
>>> train = theano.function(inputs=[x,y], outputs=[prediction, xent], updates=[[w, w-0.01*gw], [b, b-0.01*gb]], name = "train")
>>> predict = theano.function(inputs=[x], outputs=prediction, name = "predict")
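For orientation, `prediction` is the standard logistic regression decision rule. A minimal pure-Python sketch of the same computation (the helper names here are illustrative, not part of Theano):

```python
import math

def sigmoid(z):
    # Logistic sigmoid: 1 / (1 + exp(-z))
    return 1.0 / (1.0 + math.exp(-z))

def predict_row(x, w, b):
    # Mirrors the Theano `prediction` graph for a single example:
    # p = sigmoid(dot(x, w) + b); classify as 1 when p > 0.5.
    p = sigmoid(sum(xi * wi for xi, wi in zip(x, w)) + b)
    return p > 0.5

print(predict_row([1.0, -2.0], [0.5, 0.25], 0.1))  # -> True (dot product is 0, bias 0.1 > 0)
```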
We will now make use of Theano's printing features to compare the unoptimized
graph (``prediction``) to the optimized graph (``predict``).
- Pretty Printing
...@@ -314,6 +316,8 @@ TensorConstant{0.5})'
- Debug Print - Debug Print
The graph before optimization:
>>> theano.printing.debugprint(prediction)  # doctest: +NORMALIZE_WHITESPACE
Elemwise{gt,no_inplace} [@A] ''
 |Elemwise{true_div,no_inplace} [@B] ''
...@@ -333,6 +337,8 @@ TensorConstant{0.5})'
 |DimShuffle{x} [@O] ''
   |TensorConstant{0.5} [@P]
The graph after optimization:
>>> theano.printing.debugprint(predict)  # doctest: +NORMALIZE_WHITESPACE
Elemwise{Composite{GT(scalar_sigmoid((-((-i0) - i1))), i2)}} [@A] '' 4
 |CGemv{inplace} [@B] '' 3
...@@ -352,18 +358,24 @@ TensorConstant{0.5})'
- Picture Printing of Graphs
``pydotprint`` requires graphviz and pydot.
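Since these are optional dependencies, it can be useful to check for them before calling ``pydotprint``. A small sketch using only the standard library (`graphviz_tools_available` is a hypothetical helper, not a Theano function):

```python
import importlib.util

def graphviz_tools_available():
    # pydotprint needs the pydot Python package plus the graphviz
    # binaries; checking for the package is a cheap first gate.
    return importlib.util.find_spec("pydot") is not None

if not graphviz_tools_available():
    print("Install pydot and graphviz to use theano.printing.pydotprint")
```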
The graph before optimization:
>>> theano.printing.pydotprint(prediction, outfile="pics/logreg_pydotprint_prediction.png", var_with_name_simple=True)
The output file is available at pics/logreg_pydotprint_prediction.png
.. image:: ./pics/logreg_pydotprint_prediction.png
   :width: 800 px
The graph after optimization:
>>> theano.printing.pydotprint(predict, outfile="pics/logreg_pydotprint_predict.png", var_with_name_simple=True)
The output file is available at pics/logreg_pydotprint_predict.png
.. image:: ./pics/logreg_pydotprint_predict.png
   :width: 800 px
The optimized training graph:
>>> theano.printing.pydotprint(train, outfile="pics/logreg_pydotprint_train.png", var_with_name_simple=True)
The output file is available at pics/logreg_pydotprint_train.png
......
...@@ -5,5 +5,5 @@ f = theano.function([a], b) # compile function
print f([0,1,2])
# prints `array([0,2,1026])`
-theano.printing.pydotprint_variables(b, outfile="pics/f_unoptimized.png", var_with_name_simple=True)
+theano.printing.pydotprint(b, outfile="pics/f_unoptimized.png", var_with_name_simple=True)
theano.printing.pydotprint(f, outfile="pics/f_optimized.png", var_with_name_simple=True)
...@@ -159,12 +159,14 @@ as we apply it. Consider the following example of optimization:
>>> f = theano.function([a], b)  # compile function
>>> print f([0, 1, 2])  # prints `array([0,2,1026])`
[ 0. 2. 1026.]
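The printed values are consistent with the tutorial's expression ``b = a + a ** 10``, which is defined earlier in the document, outside this hunk (so treating it as that expression is an assumption here). A quick pure-Python check:

```python
def graph_value(a):
    # Assumed expression for the graph: b = a + a ** 10
    # (0 -> 0, 1 -> 1 + 1 = 2, 2 -> 2 + 1024 = 1026, matching the printed array)
    return a + a ** 10

print([graph_value(v) for v in [0, 1, 2]])  # [0, 2, 1026]
```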
>>> theano.printing.pydotprint(b, outfile="./pics/symbolic_graph_unopt.png", var_with_name_simple=True)
The output file is available at ./pics/symbolic_graph_unopt.png
>>> theano.printing.pydotprint(f, outfile="./pics/symbolic_graph_opt.png", var_with_name_simple=True)
The output file is available at ./pics/symbolic_graph_opt.png
-.. |g1| image:: ../hpcs2011_tutorial/pics/f_unoptimized.png
-   :width: 300 px
+.. |g1| image:: ./pics/symbolic_graph_unopt.png
+   :width: 500 px
.. |g2| image:: ./pics/symbolic_graph_opt.png
   :width: 500 px
......