Commit 5f9a1384 authored by james@X40

doc: added to Op contract

Parent eab932a1
@@ -31,12 +31,15 @@ Op's contract
An Op is any object which defines the following methods:
- **make_node(*inputs)**
- This method is responsible for creating output Results of a suitable Type
to serve as the outputs of this Op's application. It should put these
outputs into an Apply instance, and return the Apply instance.
- This important function creates an Apply node representing the
application of the Op on the inputs provided. If the Op cannot be
applied to these inputs, it must raise an appropriate
exception.
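As a sketch of this contract (using minimal hypothetical stand-ins for Result and Apply rather than Theano's real classes), make_node might look like:

```python
# Hypothetical stand-ins for Theano's Result and Apply, for illustration only.
class Result:
    def __init__(self, type_name):
        self.type = type_name
        self.owner = None  # set to the Apply node that produces this Result

class Apply:
    def __init__(self, op, inputs, outputs):
        self.op, self.inputs, self.outputs = op, inputs, outputs
        for out in outputs:
            out.owner = self

class DoubleOp:
    """Sketch of an Op computing y = 2 * x."""
    def make_node(self, x):
        # Raise if the Op cannot be applied to these inputs.
        if x.type != 'float':
            raise TypeError('DoubleOp requires a float input')
        # Create an output Result of a suitable Type, wrap everything
        # in an Apply instance, and return the Apply instance.
        out = Result('float')
        return Apply(self, [x], [out])

node = DoubleOp().make_node(Result('float'))
print(len(node.outputs))  # -> 1
```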
- **__call__(*inputs)**
@@ -52,20 +55,54 @@ An Op is any object which defines the following methods:
method, the inputs are a list of references to data to operate on
and output_storage is a list of storage cells where the results of
the computation must be put.
- This function must be deterministic in its inputs. That is to say, if it is
evaluated once on inputs A and returns outputs B, then whenever inputs C, equal
to A, are presented again, outputs equal to B must be returned again.
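A minimal sketch of perform(), again using plain Python values in place of the node and data references Theano actually passes (hypothetical example):

```python
class DoubleOp:
    """Sketch of perform() for y = 2 * x.

    output_storage is a list of one-element cells; the result is
    written into the cell rather than returned.
    """
    def perform(self, node, inputs, output_storage):
        x, = inputs
        # Fully determined by the inputs: equal inputs must always
        # yield equal outputs.
        output_storage[0][0] = 2 * x

storage = [[None]]
DoubleOp().perform(None, [21], storage)
print(storage[0][0])  # -> 42
```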
- **__eq__(self, other)**
- Returning True here is a promise to the optimization system that the other
Op will produce exactly the same graph effects (from perform) as this one, given
identical inputs. This means it will produce the same output values, it
will destroy the same inputs (same destroy_map), and will alias outputs to
the same inputs (same view_map).
- **__hash__(self)**
- If two Op instances compare equal, then they MUST return the same hash
value.
- Equally important, this hash value must not change during the lifetime of
self.
- **__ne__(self, other)**
- Recommended (and the default): define as (not (self == other))
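The three methods above can be sketched for an Op parametrized by a constant (a hypothetical ScaleOp, not part of Theano):

```python
class ScaleOp:
    """Hypothetical Op computing y = factor * x.

    Two ScaleOps with the same factor produce exactly the same graph
    effects, so they compare equal and must hash equally; the hash is
    stable because `factor` is fixed at construction.
    """
    def __init__(self, factor):
        self.factor = factor

    def __eq__(self, other):
        # A promise to the optimization system: equal Ops behave
        # identically (same outputs, same destroy_map, same view_map).
        return type(other) is ScaleOp and self.factor == other.factor

    def __hash__(self):
        # Equal instances MUST hash equally, and the value must not
        # change during the lifetime of self.
        return hash(('ScaleOp', self.factor))

    def __ne__(self, other):
        # The recommended default.
        return not (self == other)

print(ScaleOp(2) == ScaleOp(2))  # -> True
print(ScaleOp(2) != ScaleOp(3))  # -> True
```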
- **grad(inputs, output_gradients)** *Optional*
- If the Op you are defining is differentiable, you can define its
gradient symbolically in this method.
- Both the inputs and output_gradients will be Results. This function must
return a list containing one Result (or None) for each input.
Each returned Result represents the gradient wrt that input given the
symbolic gradients wrt each output.
- If the output is not differentiable with respect to any inputs, then this
function should be defined to return [None for i in inputs].
- If this method is not defined, then Theano assumes it has been forgotten.
Symbolic differentiation will fail on a graph that includes this Op.
- For more information on the use of this method, see ``grad``.
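The shape of the grad() contract can be sketched as follows, using plain numbers in place of symbolic Results (both Ops here are hypothetical illustrations):

```python
class DoubleOp:
    """Sketch of grad() for y = 2 * x."""
    def grad(self, inputs, output_gradients):
        # One entry per input: d(2*x)/dx = 2, scaled by the
        # gradient wrt the output.
        (g_y,) = output_gradients
        return [2 * g_y]

class StepOp:
    """An Op that is not differentiable wrt any input."""
    def grad(self, inputs, output_gradients):
        return [None for i in inputs]

print(DoubleOp().grad([3], [1.0]))  # -> [2.0]
```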
For each method, the *default* is what the Op class defines for you.
For more details, including the interface for providing a C implementation of
perform(), refer to the documentation for :ref:`op`.