Commit 315c2e86 authored by Olivier Breuleux

expanded on the advanced tutorial

Parent 848a8ffa
@@ -28,8 +28,6 @@ Glossary of terminology
to know, for any operation which supports broadcasting, which
dimensions will need to be broadcasted. When applicable, this
information is given in the :term:`Type` of a :term:`Result`.
For more information, see the article about broadcasting_.
See also:
@@ -64,6 +62,9 @@ Glossary of terminology
graph
WRITEME
inplace
WRITEME
op
WRITEME
@@ -86,8 +87,9 @@ Glossary of terminology
type
WRITEME
view
WRITEME
.. _broadcasting: concepts/broadcasting.html
=========
Example 1
=========
.. rubric:: Contents

.. toctree::
    :maxdepth: 2

    type
==========================
Making the ``double`` type
==========================
WRITEME
=========
Example 2
=========
.. rubric:: Contents

.. toctree::
    :maxdepth: 2

    type
==========================
Making the ``cons`` type
==========================
WRITEME
@@ -10,17 +10,69 @@ Before tackling this tutorial, it is highly recommended to read the
The advanced tutorial is meant to give the reader a greater
understanding of the building blocks of Theano. It contains two
examples which cover most of the conceptual space associated with
:ref:`type` and :ref:`op` and then expands on other important matters
such as optimization.

This tutorial should be of most use to users who want to extend Theano
with custom types and operations related to these types. Users who
want to extend Theano with new operations on tensors should check the
:ref:`tensoroptutorial`, but it is a good idea to read this tutorial
as well since it probably provides better grounding for the many
concepts at work here.
---------------------------------------
`Example 1`_
    Making a basic arithmetic system on doubles

`Example 2`_
    Making a higher-level type: ``cons`` (pair)

`Views and inplace operations`_
    A guide to making Ops that return a :term:`view` on their inputs or
    operate :term:`inplace` on them.

`Graph optimization`_
    A guide to the different ways of defining new custom optimizations
    to simplify the computation graph and/or improve its numerical
    stability or other desirable properties.

`Tips`_
    Tips and tricks about writing types, ops and optimizations. This
    page is a good reference - check it and come back to it!

`Wrapping up`_
    A guide to what to look at next.
---------------------------------------
.. rubric:: Contents

.. toctree::
    :maxdepth: 2

    ex1/index
    ex2/index
    inplace
    optimization
    tips
    wrapup
.. _Example 1: ex1/index.html
.. _Example 2: ex2/index.html
.. _Views and inplace operations: inplace.html
.. _Graph optimization: optimization.html
.. _Tips: tips.html
.. _Wrapping up: wrapup.html
============================
Views and inplace operations
============================
WRITEME
==================
Graph optimization
==================
WRITEME
====
Tips
====
Don't define new Ops unless you have to
=======================================
It is usually not very useful to define Ops that can be easily
implemented using other already existing Ops. For example, instead of
writing a "sum_square_difference" Op, you should probably just write a
simple function:
.. code-block:: python

    from theano import tensor as T

    def sum_square_difference(a, b):
        return T.sum((a - b)**2)
Even without taking Theano's optimizations into account, it is likely
to work just as well as a custom implementation. It also supports all
data types, tensors of all dimensions as well as broadcasting, whereas
a custom implementation would probably only bother to support
contiguous vectors/matrices of doubles...
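The broadcasting claim is easy to check with plain numpy, which follows the same broadcasting rules as Theano's tensors (a quick illustration with hypothetical values, not part of the original tutorial):

```python
import numpy as np

# The composed expression sum((a - b)**2) supports broadcasting
# for free: subtracting a row vector from a matrix broadcasts the
# vector across every row before squaring and summing.
a = np.arange(6.0).reshape(2, 3)   # [[0, 1, 2], [3, 4, 5]]
b = np.ones(3)                     # broadcast against both rows

result = ((a - b) ** 2).sum()
# (a - b) is [[-1, 0, 1], [2, 3, 4]]; the squares sum to 31.0
```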
Use Theano's high order Ops when applicable
===========================================
Theano provides some generic Op classes which allow you to generate
many Ops with little effort. For instance, Elemwise can be used to
make :term:`elementwise` operations easily, whereas DimShuffle can be
used to make transpose-like transformations. These higher-order Ops
are mostly Tensor-related, as this is Theano's specialty. An exposé of
them can therefore be found in :ref:`tensoroptools`.
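As a rough illustration of what "elementwise" means here (plain numpy below, not Theano's actual Elemwise API): an elementwise Op applies a scalar formula independently to every element of its input, preserving the shape.

```python
import numpy as np

# Elementwise behavior sketched with a numpy ufunc expression:
# the scalar formula 1 / (1 + exp(-x)) is applied to each element.
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

x = np.array([[0.0, 100.0], [-100.0, 0.0]])
y = sigmoid(x)
# y[0, 0] == 0.5 and y has the same shape as x: (2, 2)
```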
.. _opchecklist:
Op Checklist
============
Use this list to make sure you haven't forgotten anything when
defining a new Op. It might not be exhaustive but it covers a lot of
common mistakes.
WRITEME
===========
Wrapping up
===========
WRITEME
@@ -262,6 +262,13 @@ initialize a state with a matrix of zeros:
# [ 0. 0. 0. 0. 0.]]
Nesting Modules
===============
WRITEME
**Next:** `Tools`_

.. _Tools: tools.html
=========
Tutorials
=========
.. toctree::
    :maxdepth: 2

    basic/index
    advanced/index
    tensorop
    tensoroptools
.. _tensoroptutorial:
===============================
How to make a new Op on tensors
===============================
This tutorial aims to explain how to create a new operation operating
on numpy's ndarrays and using Theano's Tensor type. It is optional but
recommended to go through the :ref:`advtutorial` beforehand, which
explains in more detail the purpose of each of the methods you will
define here.
The operation we will implement will be multiplication of two matrices
of doubles. Of course, this operation already exists in Theano, but so
do all simple operations and a tutorial works better when all concepts
are kept as simple as possible. We will proceed in steps: the first
step is to implement the Op in Python using numpy's multiplication
operator. In the second step, we will extend our Op to (optionally)
operate inplace on its inputs. In the third step, which is the most
difficult, we will give our Op a solid C implementation.
Implementing a new Op in Python
===============================
This is actually very simple to do. You are required to define two
methods: one to create the :ref:`apply` node every time your Op is
applied to some inputs, declaring the outputs in the process, and
another to operate on the inputs. There is also one optional method
you may define which will compute the gradient of your Op.
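The shape of these two methods can be sketched as follows. This is a structural sketch only: ``Apply`` below is a minimal stand-in so the skeleton runs with numpy alone, whereas a real Op subclasses Theano's Op class and returns a real Apply node carrying proper Tensor types.

```python
import numpy as np

# Minimal stand-in for Theano's Apply node, so this sketch is
# self-contained; it only records the op, its inputs and outputs.
class Apply:
    def __init__(self, op, inputs, outputs):
        self.op, self.inputs, self.outputs = op, inputs, outputs

class MatrixMul:
    """Sketch of the two required methods of an Op."""

    def make_node(self, x, y):
        # Declare one output; in Theano this placeholder would
        # instead be a Result with a Tensor type matching the inputs.
        return Apply(self, [x, y], [None])

    def perform(self, node, inputs, output_storage):
        # Operate on the inputs: plain numpy matrix multiplication.
        # output_storage is a list of one-element lists to fill in.
        x, y = inputs
        output_storage[0][0] = np.dot(x, y)

# Exercising perform directly with numpy arrays:
op = MatrixMul()
out = [[None]]
op.perform(None, [np.ones((2, 3)), np.ones((3, 2))], out)
# out[0][0] is a 2x2 matrix filled with 3.0
```

The separation matters: ``make_node`` runs at graph-construction time and only declares types, while ``perform`` runs at execution time on concrete ndarrays.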
Extending the Op to work inplace
================================
WRITEME
Writing a C implementation
==========================
WRITEME
What's next
===========
Theano provides several special Ops that can make your job
easier. Check the :ref:`tensoroptools` to see if you can leverage them
to do what you need.
It is highly recommended that you read the :ref:`opchecklist` before
making any new Op. This can save you a lot of trouble.
.. _tensoroptools:
===============
Tensor Op Tools
===============
WRITEME - describe how to use Elemwise here