Commit 4b131ce7 authored by Arnaud Bergeron

Add mentions of __props__

Reflow paragraphs. Remove flops() since it's not really important.
Parent 13c43940
...@@ -42,25 +42,22 @@ Inputs and Outputs are lists of Theano variables.
how to make a quality contribution.
Op Structure
============
This is an overview of the methods you typically have to implement to
make a new op. It does not provide extensive coverage of all the
possibilities you may encounter or need. For that, refer to
:ref:`op_contract`.
.. code-block:: python
    import theano

    class MyOp(theano.Op):
        __props__ = ()

        def make_node(self, *inputs):
            pass
        # Python implementation:
...@@ -72,11 +69,13 @@ Op Contract
            # ...
            pass

        # Other implementations (pycuda, ...):
        def make_thunk(self, node, storage_map, _, _2):
            pass

        # optional:
        check_input = True

        def __init__(self, ...):
            pass
...@@ -89,43 +88,45 @@ Op Contract
        def infer_shape(node, (i0_shapes, ...)):
            pass
.. ../extending/op.txt
There are two mandatory methods that one needs to implement. The
first one is :func:`make_node`. The second one describes the
computations that are required to be done at run time. Currently there
are two different possibilities: implement the :func:`perform` and/or
:func:`c_code <Op.c_code>` methods (and other related :ref:`c methods
<cop>`), or the :func:`make_thunk` method. ``perform`` makes it easy
to wrap an existing Python function into Theano. ``c_code`` and the
related methods allow the op to generate C code that will be compiled
and linked by Theano. On the other hand, ``make_thunk`` will be called
only once during compilation and should generate a ``thunk``: a
standalone function that, when called, will do the wanted
computations. This is useful if you want to generate code and compile
it yourself. For example, this allows you to use PyCUDA to compile GPU
code.
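As a sketch of the ``perform`` route, here is a plain-Python
illustration (the ``DoubleOpPerform`` class is hypothetical and stands
in for a real Op subclass): results are written into the per-output
storage cells, not returned.

```python
class DoubleOpPerform(object):
    """Hypothetical sketch of perform() for an op doubling its input."""

    def perform(self, node, inputs, output_storage):
        # inputs holds one concrete value per input variable; with a
        # real Theano Op this would typically be a numpy array.
        (x,) = inputs
        # output_storage holds one single-element cell per output;
        # the result must be stored in the cell, not returned.
        (z,) = output_storage
        z[0] = x * 2
```

With a real Op, Theano supplies ``node`` and the storage cells itself;
the pattern of unpacking inputs and assigning into ``z[0]`` is the
part that carries over.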
The :attr:`__props__` attribute makes Op generate appropriate
:func:`__eq__` and :func:`__hash__` methods for your Op. It must be a
tuple that lists the properties that influence how the computation is
performed (usually these are the ones you set in
:func:`__init__`). If you don't have any properties, then you should
set this attribute to the empty tuple ``()``.
:func:`__eq__` and :func:`__hash__` will be used by the optimization
phase to merge nodes that are doing an equivalent computation (same
inputs, same operation). It is especially important that two Ops that
compare equal (have the same values for all the properties listed in
``__props__`` and the same type) compute the same thing when presented
with the same inputs.
Note that this attribute will also generate a suitable
:func:`__str__` method for your Op. You may override this default
with a custom one if you want another format for the output.
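To illustrate the contract (this is not Theano's actual generated
code), here is a plain-Python sketch of the ``__eq__``/``__hash__``/
``__str__`` behavior that ``__props__`` gives you; the ``ScalarShift``
class and its ``offset`` property are hypothetical:

```python
class ScalarShift(object):
    """Hypothetical op-like class with one property listed in __props__."""

    __props__ = ('offset',)

    def __init__(self, offset):
        self.offset = offset

    def _prop_values(self):
        # Collect the current values of all listed properties.
        return tuple(getattr(self, p) for p in self.__props__)

    def __eq__(self, other):
        # Equal iff same class and same property values.
        return type(self) == type(other) and \
            self._prop_values() == other._prop_values()

    def __hash__(self):
        # Consistent with __eq__: equal objects hash equally.
        return hash((type(self), self._prop_values()))

    def __str__(self):
        props = ', '.join('%s=%r' % (p, getattr(self, p))
                          for p in self.__props__)
        return '%s{%s}' % (type(self).__name__, props)
```

Two instances built with the same ``offset`` compare equal and hash
equally, so the optimizer could merge nodes applying them to the same
inputs.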
The :func:`infer_shape` method makes it possible to infer the shape of
some variable in the middle of the computational graph without
actually computing the outputs (when possible). This can be helpful if
one only needs the shape of the output instead of the actual outputs.
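For an elementwise op, shape inference is trivial: each output has the
shape of its input. A minimal sketch (the class name is hypothetical;
no Theano import is needed to show the shape bookkeeping):

```python
class ElemwiseShapes(object):
    """Hypothetical sketch of infer_shape for an elementwise op."""

    def infer_shape(self, node, input_shapes):
        # input_shapes is a list with one shape tuple per input;
        # the method must return one shape per output. For an
        # elementwise op the output shape equals the input shape.
        return input_shapes
```

Ops whose output shape differs from their inputs' (e.g. reductions)
would instead build and return new shape tuples here.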
The :func:`grad` method is required if you want to differentiate some
cost whose expression includes your op.
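As a sketch of what ``grad`` must compute, consider an op computing
``y = 2 * x``: by the chain rule, ``dC/dx = 2 * dC/dy``. The function
below is a hypothetical stand-in that operates on plain numbers; in a
real ``Op.grad`` the same arithmetic would build a symbolic Theano
expression instead.

```python
def double_op_grad(inputs, output_grads):
    """Hypothetical sketch: gradient of y = 2 * x."""
    (x,) = inputs            # the op's inputs (unused here)
    (gy,) = output_grads     # dC/dy, the gradient flowing in
    # Chain rule: dC/dx = dy/dx * dC/dy = 2 * dC/dy.
    # grad must return one gradient term per input, as a list.
    return [2 * gy]
```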
...@@ -135,8 +136,9 @@ string representation of your op.
The :func:`R_op` method is needed if you want ``theano.tensor.Rop`` to
work with your op.
The optional boolean :attr:`check_input` attribute is used to specify
if you want the types used in your op to check their inputs in their
c_code. It can be used to speed up compilation, reduce overhead
(particularly for scalars) and reduce the number of generated C files.
Op Example
...@@ -147,15 +149,7 @@ Op Example
    import theano

    class DoubleOp(theano.Op):
        __props__ = ()

        def make_node(self, x):
            x = theano.tensor.as_tensor_variable(x)
            return theano.Apply(self, [x], [x.type()])
...@@ -327,24 +321,27 @@ For instance, to verify the Rop method of the DoubleOp, you can use this:
Testing GPU Ops
---------------
Ops to be executed on the GPU should inherit from
``theano.sandbox.cuda.GpuOp`` and not ``theano.Op``. This allows
Theano to distinguish them. Currently, we use this to test if the
NVIDIA driver works correctly with our sum reduction code on the GPU.
Running Your Tests
==================
To perform your tests, you may use any one of the three following
methods:
theano-nose
-----------
The method of choice to conduct tests is to run the file
``theano-nose``. In a regular Theano installation, it will be on the
operating system's path and directly accessible from any
folder. Otherwise, it can be found in the ``Theano/bin``
folder. The following command lines may be used for the corresponding
purposes:
* ``theano-nose --theano``: Run every test found in Theano's path.
...@@ -352,23 +349,25 @@ lines may be used for the corresponding purposes:
* ``theano-nose test_file.py``: Run every test found in the file *test_file.py*.
The following are particularly useful for development purposes since
they call for particular classes or even for particular tests:
* ``theano-nose test_file.py:test_DoubleRop``: Run every test found inside the class *test_DoubleRop*.
* ``theano-nose test_file.py:test_DoubleRop.test_double_op``: Run only the test *test_double_op*
  in the class *test_DoubleRop*.
Help with the use and functionalities of ``theano-nose`` may be
obtained by running it with the command line parameter ``--help``
(``-h``).
nosetests
---------
The command ``nosetests`` can also be used. Although it lacks the
useful functionalities that ``theano-nose`` provides, ``nosetests``
can be called similarly to ``theano-nose`` from any folder in Python's
path like so:
``nosetests [suffix similar to the above]``.
...@@ -378,9 +377,10 @@ More documentation on ``nosetests`` is available here:
In-file
-------
One may also add a block of code similar to the following at the end
of the file containing a specific test of interest and run the
file. In this example, the test *test_double_op* in the class
*test_DoubleRop* would be performed.
.. code-block:: python
...@@ -407,7 +407,8 @@ Modify and execute to compute: x * y.
Modify and execute the example to return two outputs: x + y and x - y.
You can omit the Rop functions. Try to implement the testing apparatus
described above.
(Notice that Theano's current *elemwise fusion* optimization is
only applicable to computations involving a single output. Hence, to gain
...@@ -453,6 +454,7 @@ signature:
It converts the Python function to a callable object that takes as
inputs Theano variables that were declared.
as_op Example
-------------