Commit 9501ca28 authored by Frederic

Added some text on how to extend Theano.

Parent 3ae40423
@@ -48,6 +48,37 @@ Op contract
    def R_op(self, inputs, eval_points):
    def infer_shape(node, (i0_shapes, ...))
.. ../extending/op.txt
There are two mandatory functions. The first is :func:`make_node`. The
second is the one that performs, or tells Theano how to perform, the
computation at run time. Currently you have several possibilities:
implement :func:`perform` and/or :func:`c_code <Op.c_code>` (and the
other related :ref:`c functions <cop>`), or the :func:`make_thunk`
function. ``perform`` allows you to easily wrap an existing Python
function in Theano. ``c_code`` and its related functions allow your op
to generate C code that Theano compiles and links against. The
``make_thunk`` function is called during compilation and should return
a ``thunk``: a function that, when called, does the wanted computation.
This is useful if you want to generate code and compile it yourself;
for example, it allows you to use PyCUDA to compile GPU code.
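The ``perform`` contract can be sketched as follows. This is a minimal, standalone illustration using plain Python and NumPy stand-ins: the ``DoubleOp`` name is hypothetical, the class does not subclass the real ``theano.gof.Op``, and ``make_node`` is omitted so the runtime contract stands out.

```python
import numpy as np

class DoubleOp:
    """Hypothetical op that doubles its input, showing the
    perform(node, inputs, output_storage) calling convention."""

    def perform(self, node, inputs, output_storage):
        # inputs: a list of raw (e.g. numpy) values, one per input variable
        x, = inputs
        # output_storage: a list of one-element lists; write each result
        # into the corresponding cell so Theano can reuse the storage
        output_storage[0][0] = np.asarray(x) * 2

op = DoubleOp()
out = [[None]]
op.perform(None, [np.array([1.0, 2.0])], out)
print(out[0][0])  # [2. 4.]
```

In a real op, Theano calls ``perform`` itself with the values of the inputs of the ``Apply`` node built by ``make_node``.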
There are two more mandatory (or at least highly suggested) functions:
:func:`__eq__` and :func:`__hash__`. They are needed for a basic
optimization that merges duplicate computations in a Theano function.
So if you don't want Theano to do your computation multiple times for
no good reason, implement them!
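A plain-Python sketch of why these two methods matter: the merge optimization can only recognize two nodes as duplicates if their ops compare equal and hash equal. The ``ScaleOp`` name and its ``factor`` parameter are hypothetical, used only to show the pattern for a parametrized op.

```python
class ScaleOp:
    """Hypothetical parametrized op: two instances with the same factor
    represent the same computation and should compare equal, so the
    merge optimization can collapse duplicate nodes into one."""

    def __init__(self, factor):
        self.factor = factor

    def __eq__(self, other):
        # Equal iff same op class and same parameters
        return type(self) == type(other) and self.factor == other.factor

    def __hash__(self):
        # Must be consistent with __eq__ (equal ops -> equal hashes)
        return hash((type(self), self.factor))

a, b = ScaleOp(2.0), ScaleOp(2.0)
print(a == b, hash(a) == hash(b))   # True True -> mergeable
print(ScaleOp(2.0) == ScaleOp(3.0)) # False -> kept distinct
```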
The :func:`infer_shape` method enables some very interesting
optimizations, such as skipping your op's computation entirely when
only the shape of its output is needed.
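The ``infer_shape`` contract can be sketched like this: given the (symbolic) shapes of the inputs, return one shape per output, without computing any values. This standalone sketch reuses the hypothetical elementwise ``DoubleOp`` with plain tuples standing in for symbolic shapes.

```python
class DoubleOp:
    """Hypothetical elementwise op: the output has the same shape
    as the input, so infer_shape just passes the shape through."""

    def infer_shape(self, node, input_shapes):
        # input_shapes: one shape (tuple) per input variable;
        # return a list with one shape per output variable
        return [input_shapes[0]]

shapes = DoubleOp().infer_shape(None, [(3, 4)])
print(shapes)  # [(3, 4)]
```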
The :func:`grad` method is needed if you want differentiation to
work with your op.
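The ``grad`` contract, sketched for the hypothetical ``DoubleOp``: given the inputs and the gradient of the cost with respect to each output, return the gradient with respect to each input (the chain rule). In real Theano code these arguments are symbolic variables; plain numbers stand in here so the sketch runs on its own.

```python
class DoubleOp:
    """Hypothetical op computing y = 2*x, so dy/dx = 2."""

    def grad(self, inputs, output_gradients):
        # output_gradients: gradient of the cost w.r.t. each output
        g_out, = output_gradients
        # chain rule: d(cost)/dx = d(cost)/dy * dy/dx = g_out * 2
        return [2 * g_out]

grads = DoubleOp().grad([5.0], [1.0])
print(grads)  # [2.0]
```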
The :func:`__str__` method is useful for nicer printing of your op.
The :func:`R_op` method is needed if you want ``theano.tensor.Rop`` to work with your op.
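The ``R_op`` contract, sketched for the same hypothetical ``DoubleOp``: given the inputs and a set of evaluation points (directions), return the directional derivative of each output, i.e. the Jacobian times the direction. For y = 2*x the Jacobian is the constant 2, so the result is simply twice the direction. Plain numbers stand in for symbolic variables.

```python
class DoubleOp:
    """Hypothetical op computing y = 2*x; its Jacobian is 2, so
    R_op returns 2 * v for a direction v."""

    def R_op(self, inputs, eval_points):
        # eval_points: one direction per input (None if not needed)
        v, = eval_points
        return [2 * v]

rop = DoubleOp().R_op([5.0], [3.0])
print(rop)  # [6.0]
```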
Op example
----------