Commit 3e2b234f authored by Frederic

Remove detail about as_op and make it clear this isn't recommended.

Parent 19cb8885
@@ -418,10 +418,14 @@ have to be jointly optimized explicitly in the code.)
as_op
=====
as_op is a python decorator that converts a python function into a
basic Theano op that will call the supplied function during execution.

This isn't the recommended way to build an op, but it allows quick
implementation.

It takes an optional `infer_shape` parameter that must have this
signature:
.. code-block:: python
@@ -434,34 +438,20 @@ as_op
.. note::
    If the `infer_shape` parameter is not provided, shape-related
    optimizations will not work with this op. For example,
    `your_op(inputs, ...).shape` will require the op to be executed just
    to get the shape.
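The expected signature can be sketched in plain Python (no Theano needed; `my_infer_shape` and the tuple shapes below are purely illustrative, not part of Theano's API):

```python
# Hypothetical infer_shape callable for an elementwise op: it receives
# the Apply node and a list with one shape tuple per input, and must
# return a list with one shape tuple per output.
def my_infer_shape(node, input_shapes):
    ashp, bshp = input_shapes
    # An elementwise op preserves the (common) input shape.
    return [ashp]

print(my_infer_shape(None, [(3, 4), (3, 4)]))  # -> [(3, 4)]
```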
.. note::
    As no grad is defined, this means you won't be able to
    differentiate paths that include this op.
.. note::
    It converts the python function into a callable object that takes as
    inputs the Theano variables that were declared.
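To make that note concrete, here is a toy, pure-Python stand-in for what such a decorator does. This is NOT Theano's implementation; `toy_as_op` and `ToyOp` are invented names for a sketch of the idea: wrap a function together with its declared input/output types and optional `infer_shape`.

```python
def toy_as_op(itypes, otypes, infer_shape=None):
    """Toy illustration: return a decorator that wraps `fn`
    in a minimal op-like callable object."""
    def decorator(fn):
        class ToyOp:
            def __init__(self):
                self.itypes = itypes
                self.otypes = otypes
                self.infer_shape = infer_shape

            def __call__(self, *inputs):
                # A real Theano op would build a graph node here;
                # the toy version just calls the wrapped function.
                return fn(*inputs)
        return ToyOp()
    return decorator

@toy_as_op(itypes=['fmatrix', 'fmatrix'], otypes=['fmatrix'])
def add(a, b):
    return a + b

print(add(1, 2))  # -> 3
```

The point of the sketch is only that the decorated name is no longer a plain function but an object carrying type declarations, which is why it can participate in graph construction.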
as_op Example
-------------
...
@@ -453,6 +453,38 @@ class T_extending(unittest.TestCase):
        simplify = gof.TopoOptimizer(local_simplify)
        simplify.optimize(e)
    def test_as_op(self):
        import theano
        import numpy
        from theano.compile.ops import as_op

        def infer_shape_numpy_dot(node, input_shapes):
            ashp, bshp = input_shapes
            return [ashp[:-1] + bshp[-1:]]

        @as_op(itypes=[theano.tensor.fmatrix, theano.tensor.fmatrix],
               otypes=[theano.tensor.fmatrix],
               infer_shape=infer_shape_numpy_dot)
        def numpy_dot(a, b):
            return numpy.dot(a, b)

        def infer_shape_numpy_add_sub(node, input_shapes):
            ashp, bshp = input_shapes
            # Both inputs should have the same shape, so we just
            # return one of them.
            return [ashp]

        @as_op(itypes=[theano.tensor.fmatrix, theano.tensor.fmatrix],
               otypes=[theano.tensor.fmatrix],
               infer_shape=infer_shape_numpy_add_sub)
        def numpy_add(a, b):
            return numpy.add(a, b)

        @as_op(itypes=[theano.tensor.fmatrix, theano.tensor.fmatrix],
               otypes=[theano.tensor.fmatrix],
               infer_shape=infer_shape_numpy_add_sub)
        def numpy_sub(a, b):
            return numpy.subtract(a, b)
class T_introduction(unittest.TestCase):
...
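As a quick NumPy-only sanity check (outside Theano), the shape helper used above for the dot product can be compared against the shape NumPy actually produces:

```python
import numpy

def infer_shape_numpy_dot(node, input_shapes):
    # input_shapes holds one shape tuple per input;
    # (m, k) dot (k, n) -> (m, n).
    ashp, bshp = input_shapes
    return [ashp[:-1] + bshp[-1:]]

a = numpy.ones((3, 4), dtype='float32')
b = numpy.ones((4, 5), dtype='float32')
predicted, = infer_shape_numpy_dot(None, [a.shape, b.shape])
print(predicted == numpy.dot(a, b).shape)  # -> True
```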