testgroup / pytensor · Commit a96d5716

Merge pull request #611 from dwf/default_argument_fix

Default argument fix

Authored April 18, 2012 by lamblin
Parents: 578f4836, f5dce590

Showing 38 changed files with 341 additions and 162 deletions.
NEWS.txt (+7, -1)
doc/library/compile/function.txt (+7, -7)
doc/library/tensor/basic.txt (+0, -0)
doc/proposals/graphical_models.txt (+3, -3)
doc/proposals/pfunc.txt (+36, -36)
doc/sandbox/debugging_with_stepmode.txt (+7, -4)
theano/compile/debugmode.py (+9, -4)
theano/compile/function.py (+6, -2)
theano/compile/mode.py (+5, -1)
theano/compile/module.py (+22, -10)
theano/compile/pfunc.py (+5, -1)
theano/compile/profilemode.py (+7, -3)
theano/gof/cc.py (+17, -5)
theano/gof/cmodule.py (+8, -1)
theano/gof/env.py (+19, -8)
theano/gof/graph.py (+6, -2)
theano/gof/link.py (+10, -3)
theano/gof/opt.py (+6, -2)
theano/gof/tests/test_destroyhandler.py (+13, -4)
theano/gof/tests/test_opt.py (+3, -1)
theano/gof/vm.py (+3, -1)
theano/misc/pycuda_example.py (+15, -7)
theano/printing.py (+9, -3)
theano/sandbox/cuda/basic_ops.py (+3, -1)
theano/sandbox/cuda/elemwise.py (+3, -1)
theano/sandbox/cuda/tests/test_bench_loopfusion.py (+6, -1)
theano/sandbox/debug.py (+19, -11)
theano/sandbox/symbolic_module.py (+3, -1)
theano/sandbox/test_rng_mrg.py (+3, -1)
theano/sandbox/theano_object.py (+16, -14)
theano/scan_module/scan_utils.py (+8, -1)
theano/tensor/basic.py (+17, -7)
theano/tensor/deprecated/rmodule.py (+7, -5)
theano/tensor/elemwise.py (+3, -1)
theano/tensor/nnet/tests/test_sigm.py (+3, -1)
theano/tensor/tests/test_basic.py (+18, -4)
theano/tensor/tests/test_blas.py (+3, -1)
theano/tensor/tests/test_merge.py (+6, -3)
NEWS.txt

@@ -15,7 +15,13 @@ Bug fixes
     Note: set_subtensor(x[slice[,...]], new_value) was working correctly
     in all case as well as inc_subtensor(*, *).
     Note2: If your code have this behavior, we print a warning by default.
     (Frederic B.)
+ * Fixed an issue whereby config values were used as default arguments,
+   with those defaults then stuck at old values if the config variables were
+   changed during program execution. (David W-F)
+ * Fixed many subtle bugs involving mutable default arguments which may have
+   led to unexpected behaviour, such as objects sharing instance variables
+   they were not supposed to share. (David W-F)

 Documentation
 * Added in the tutorial documentation on how to extend Theano.
...
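The second NEWS entry above describes a classic Python pitfall: a mutable default argument is created once, when the `def` statement executes, and that single object is then reused by every call that omits the argument. A minimal standalone sketch of the bug and of the `None`-sentinel fix this commit applies throughout the codebase (hypothetical function names, not from the commit itself):

```python
def append_bad(item, acc=[]):
    # BUG: the same list object is reused on every call that omits `acc`.
    acc.append(item)
    return acc

def append_good(item, acc=None):
    # FIX: the None sentinel gives each call its own fresh list.
    if acc is None:
        acc = []
    acc.append(item)
    return acc

print(append_bad(1), append_bad(2))    # the shared list grows: [1, 2] [1, 2]
print(append_good(1), append_good(2))  # independent lists: [1] [2]
```

Both calls to `append_bad` return the very same list object, which is why the first print shows `[1, 2]` twice.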
doc/library/compile/function.txt

@@ -46,7 +46,7 @@ Reference
 .. method:: __init__(variable, borrow=False)

     Initialize attributes from arguments.

 .. class:: Param
...
@@ -57,21 +57,21 @@ Reference
     A variable in an expression graph to use as a compiled-function parameter

     .. attribute:: default

         The default value to use at call-time (can also be a Container where
         the function will find a value at call-time.)

     .. attribute:: name

         A string to identify an argument for this parameter in keyword arguments.

     .. attribute:: mutable

         ``True`` means the compiled-function is allowed to modify this
         argument. ``False`` means it is not allowed.

     .. attribute:: strict

         If ``False``, a function argument may be copied or cast to match the type
         required by the parameter `variable`. If ``True``, a function argument
         must exactly match the type required by `variable`.
...
@@ -81,7 +81,7 @@ Reference
     Initialize object attributes.

-.. function:: function(inputs, outputs, mode=None, updates=[], givens=[], accept_inplace=False, name=None)
+.. function:: function(inputs, outputs, mode=None, updates=None, givens=None, accept_inplace=False, name=None)

     Return a callable object that will calculate `outputs` from `inputs`.
...
doc/library/tensor/basic.txt

(Diff collapsed.)
doc/proposals/graphical_models.txt

@@ -37,7 +37,7 @@ In this way, we could express something like Logistic Regression like this:
     def sample(self, n):
         """[Symbolically] draw a sample of size n"""

-    def density(self, pt, givens={}):
+    def density(self, pt, givens=None):
         """Conditional Density/Probability of P(self=pt)

         Implicitly conditioned on knowing the values of all variables
...
@@ -48,7 +48,7 @@ In this way, we could express something like Logistic Regression like this:
     def mode(self):
         """Return expression of the most likely value of this distribution"""

 We would really like to integrate out certain variables sometimes...

 An RBM could be expressed like this:
...
@@ -71,7 +71,7 @@ An RBM could be expressed like this:
     RBM.hidden.energy(h)  # an expression for the free energy
     v_given_h = RBM.visible.conditional(h)  # a random variable

 Rather than program all the training algorithms into an RBM module,
 the idea would be to express the relationship between RBM variables so that we
 could automatically recognize how to do Gibbs sampling, gradient descent on Free
 Energy, etc.
...
doc/proposals/pfunc.txt

@@ -13,7 +13,7 @@ changes are proposed to make function-construction calls more
 readable and intuitive, and to make it easier to share values between
 functions.

 The strategy is to

 - introduce a new kind of ``Variable`` (``SharedVariable``) that has a container
   associated with it, and can allow multiple functions to share a value.
...
@@ -59,17 +59,17 @@ The proposal is for two new ways of creating a *shared* variable:
     def __init__(self, name, type, value, strict):
         """
         :param name: The name for this variable (see `Variable`).
         :param type: The type for this variable (see `Variable`).
         :param value: A value to associate with this variable (a new container will be created).
         :param strict: True -> assignments to .value will not be cast or copied, so they must
             have the correct type.
         :param container: The container to use for this variable. Illegal to pass this as well
             as a value.

         For more user-friendly constructor, see `shared`
         """
...
@@ -79,23 +79,23 @@ The proposal is for two new ways of creating a *shared* variable:
     value = property(...)
     """Read/write the non-symbolic value associated with this SharedVariable.

     If the SharedVariable is shared, changes to this value will be visible to all functions using
     this SharedVariable. If this SharedVariable is not shared, a change will not be visible to
     functions that were created before the change.
     """

     def shared(value, name=None, strict=False, **kwargs):
         """Return a SharedVariable Variable, initialized with a copy or reference of `value`.

         This function iterates over constructor functions (see `shared_constructor`) to find a
         suitable SharedVariable subclass.

         :note:
             By passing kwargs, you effectively limit the set of potential constructors to those that
             can accept those kwargs.
         """
...
@@ -149,25 +149,25 @@ Corner cases and exotic examples can be found in the tests.
 .. code-block:: python

-    def pfunc(params, outputs, mode=None, givens={}, updates=[])
+    def pfunc(params, outputs, mode=None, givens=None, updates=None)
         """Function-constructor for graphs with shared variables.

         :type params: list of either Variable or Param instances.
         :param params: function parameters, these are not allowed to be shared
             variables

         :type outputs: list of Variables or Out instances
         :param outputs: expressions to compute

         :param mode: compilation mode

         :type updates: iterable over pairs (shared_variable, new_expression). List, tuple or dict.
         :param updates: update the values for SharedVariable inputs according to these expressions

         :rtype: theano.compile.Function
         :returns: a callable object that will compute the outputs (given the inputs)
             and update the implicit function arguments according to the `updates`.
         """
...
@@ -177,20 +177,20 @@ Corner cases and exotic examples can be found in the tests.
     def __init__(self, variable, default=None, mutable=False, strict=False):
         """
         :param variable: A node in an expression graph to set with each function call.
         :param default: The default value to use at call-time (can also be a Container where
             the function will find a value at call-time.)
         :param name: A string to identify this parameter from function kwargs.
         :param mutable: True -> function is allowed to modify this argument.
         :param strict: False -> function arguments may be copied or cast to match the
             type required by the parameter `variable`. True -> function arguments must exactly match the type
             required by `variable`.
         :param implicit: see help(theano.io.In)
         """

 Note that if some update value is not a variable, it will be cast into
...
@@ -210,40 +210,40 @@ simple one.
     import numpy, theano
     from pfunc import pfunc
     from sharedvalue import shared
     from theano import tensor
     from theano.tensor.nnet import sigmoid

     class NNet(object):

         def __init__(self,
                 input = tensor.dvector('input'),
                 target = tensor.dvector('target'),
                 n_input=1, n_hidden=1, n_output=1, lr=1e-3, **kw):
             super(NNet, self).__init__(**kw)

             self.input = input
             self.target = target
             self.lr = shared(lr, 'learning_rate')
             self.w1 = shared(numpy.zeros((n_hidden, n_input)), 'w1')
             self.w2 = shared(numpy.zeros((n_output, n_hidden)), 'w2')

             self.hidden = sigmoid(tensor.dot(self.w1, self.input))
             self.output = tensor.dot(self.w2, self.hidden)
             self.cost = tensor.sum((self.output - self.target)**2)

             self.sgd_updates = {
                 self.w1: self.w1 - self.lr * tensor.grad(self.cost, self.w1),
                 self.w2: self.w2 - self.lr * tensor.grad(self.cost, self.w2)}

             self.sgd_step = pfunc(
                 params = [self.input, self.target],
                 outputs = [self.output, self.cost],
                 updates = self.sgd_updates)

             self.compute_output = pfunc([self.input], self.output)

             self.output_from_hidden = pfunc([self.hidden], self.output)
doc/sandbox/debugging_with_stepmode.txt

@@ -17,8 +17,11 @@ purpose of it is to hack it to investigate what your own particular program is d
     predefined_optimizers)

 class StepMode(Mode):

-    def __init__(self, linker=config.linker, optimizer=config.optimizer):
+    def __init__(self, linker=None, optimizer=None):
+        if linker is None:
+            linker = config.linker
+        if optimizer is None:
+            optimizer = config.optimizer

         def blah(i, node, th):
             # This function will be run for each node in your compiled program.
             # here you can inspect all the values as they are computed,
...
@@ -43,14 +46,14 @@ purpose of it is to hack it to investigate what your own particular program is d
             if i == 39:
                 print 'this node is weird...', th.outputs[0][0]

         self.provided_linker = linker
         self.provided_optimizer = optimizer
         if isinstance(linker, basestring) or linker is None:
             linker = predefined_linkers[linker]
         self.linker = WrapLinkerMany([linker], [blah])
         if isinstance(optimizer, basestring) or optimizer is None:
             optimizer = predefined_optimizers[optimizer]
         self._optimizer = optimizer
...
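The StepMode change illustrates the other bug class named in NEWS.txt: a default like `linker=config.linker` is evaluated once, when the `def` statement runs, so any later change to the config object is silently ignored. A standalone sketch with a stand-in config object (hypothetical names, not Theano's actual config machinery):

```python
class Config(object):
    linker = 'c|py'

config = Config()

def make_mode_frozen(linker=config.linker):
    # BUG: the default captured config.linker at definition time.
    return linker

def make_mode_live(linker=None):
    # FIX: the None sentinel defers the config lookup to call time.
    if linker is None:
        linker = config.linker
    return linker

config.linker = 'py'          # user changes the config at runtime
print(make_mode_frozen())     # still 'c|py' - the stale default
print(make_mode_live())       # 'py' - reflects the current config
```

This is why the commit rewrites every `linker=config.linker, optimizer=config.optimizer` signature to `None` defaults resolved inside the function body.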
theano/compile/debugmode.py

@@ -504,9 +504,9 @@ def char_from_number(number):
-def debugprint(r, prefix='', depth=-1, done=None, print_type=False,
-               file=sys.stdout, print_destroy_map=False, print_view_map=False,
-               order=[], ids='CHAR', stop_on_name=False, prefix_child=None):
+def debugprint(r, prefix='', depth=-1, done=None, print_type=False,
+               file=sys.stdout, print_destroy_map=False, print_view_map=False,
+               order=None, ids='CHAR', stop_on_name=False, prefix_child=None):
     """Print the graph leading to `r` to given depth.

     :param r: Variable instance
...
@@ -531,6 +531,9 @@ def debugprint(r, prefix='', depth=-1, done=None, print_type=False,
     if depth == 0:
         return
+    if order is None:
+        order = []
     if done is None:
         done = dict()
...
@@ -1417,7 +1420,9 @@ class _Linker(gof.link.LocalLinker):
         self.env = None
         self.maker = maker

-    def accept(self, env, no_recycling=[]):
+    def accept(self, env, no_recycling=None):
+        if no_recycling is None:
+            no_recycling = []
         if self.env is not None and self.env is not env:
             assert type(self) is _Linker
             return type(self)(self.env, self.maker).accept(env, no_recycling)
...
theano/compile/function.py

@@ -11,7 +11,7 @@ from profiling import ProfileStats
 from pfunc import pfunc
 from numpy import any  #for to work in python 2.4

-def function(inputs, outputs=None, mode=None, updates=[], givens=[],
+def function(inputs, outputs=None, mode=None, updates=None, givens=None,
              no_default_updates=False, accept_inplace=False, name=None,
              rebuild_strict=True, allow_input_downcast=None, profile=None,
              on_unused_input='raise'):
...
@@ -80,7 +80,11 @@ def function(inputs, outputs=None, mode=None, updates=[], givens=[],
     """
     #tuple are used in some tests, as we accepted them in the past
     #I prefer to allow it as they act the same as list for what they are used.
+    if updates is None:
+        updates = []
+    if givens is None:
+        givens = []
     if not isinstance(inputs, (list, tuple)):
         raise Exception("Inputs variable of a Theano function should be contained in a list, even when there is a single input.")

     # compute some features of the arguments:
...
theano/compile/mode.py

@@ -245,7 +245,11 @@ class Mode(object):
     predefined_modes.
     """

-    def __init__(self, linker=config.linker, optimizer=config.optimizer):
+    def __init__(self, linker=None, optimizer=None):
+        if linker is None:
+            linker = config.linker
+        if optimizer is None:
+            optimizer = config.optimizer
         self.__setstate__((linker, optimizer))

     #self.provided_optimizer - typically the `optimizer` arg.  But if the `optimizer` arg is
     # keyword corresponding to a predefined Query, then this stores the query
...
theano/compile/module.py

@@ -241,7 +241,7 @@ class Method(Component):
     function call.
     """

     outputs = None
     """function outputs (see `compile.function`)"""

     updates = {}
...
@@ -260,10 +260,10 @@ class Method(Component):
     """

     mode = None
     """This will override the Module compilation mode for this Method"""

-    def __init__(self, inputs, outputs, updates={}, mode=None):
+    def __init__(self, inputs, outputs, updates=None, mode=None):
         """Initialize attributes

         :param inputs: value for `Method.inputs`
...
@@ -283,6 +283,8 @@ class Method(Component):
         :type mode: None or any mode accepted by `compile.function`
         """
+        if updates is None:
+            updates = {}
         super(Method, self).__init__()
         self.inputs = inputs
         self.outputs = outputs
...
@@ -339,7 +341,7 @@ class Method(Component):
         """
         return None

     def build(self, mode, memo, allocate_all=False):
         """Compile a function for this Method.

         :param allocate_all: if True, storage will be
...
@@ -573,7 +575,7 @@ class Composite(Component):
         """
         raise NotImplementedError

     def flat_components(self, include_self=False):
         """
         Generator that yields each component in a flattened hierarchy
         of composites and components. If include_self is True, the
...
@@ -589,7 +591,7 @@ class Composite(Component):
         else:
             yield component

-    def flat_components_map(self, include_self=False, path=[]):
+    def flat_components_map(self, include_self=False, path=None):
         """
         Generator that yields (path, component) pairs in a flattened
         hierarchy of composites and components, where path is a
...
@@ -600,6 +602,8 @@ class Composite(Component):
         If include_self is True, the list will include the Composite
         instances, else it will only yield the list of leaves.
         """
+        if path is None:
+            path = []
         if include_self:
             yield path, self
         for name, component in self.components_map():
...
@@ -758,7 +762,9 @@ class ComponentList(Composite):
             member.name = '%s.%i' % (name, i)

-    def default_initialize(self, init={}, **kwinit):
+    def default_initialize(self, init=None, **kwinit):
+        if init is None:
+            init = {}
         for k, initv in dict(init, **kwinit).iteritems():
             self[k] = initv
...
@@ -788,7 +794,9 @@ class ComponentDictInstance(ComponentDictInstanceNoInit):
     ComponentDictInstance is meant to be instantiated by ComponentDict.
     """

-    def initialize(self, init={}, **kwinit):
+    def initialize(self, init=None, **kwinit):
+        if init is None:
+            init = {}
         for k, initv in dict(init, **kwinit).iteritems():
             self[k] = initv
...
@@ -797,7 +805,9 @@ class ComponentDictInstance(ComponentDictInstanceNoInit):
 class ComponentDict(Composite):
     InstanceType = ComponentDictInstance  # Type used by build() to make the instance

-    def __init__(self, components={}, **kwcomponents):
+    def __init__(self, components=None, **kwcomponents):
+        if components is None:
+            components = {}
         super(ComponentDict, self).__init__()
         components = dict(components, **kwcomponents)
         for val in components.itervalues():
...
@@ -1077,10 +1087,12 @@ class Module(ComponentDict):
         memo[self] = inst
         return inst

-    def _instance_initialize(self, inst, init={}, **kwinit):
+    def _instance_initialize(self, inst, init=None, **kwinit):
         """
         Default initialization method.
         """
+        if init is None:
+            init = {}
         for name, value in chain(init.iteritems(), kwinit.iteritems()):
             inst[name] = value
...
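The `updates={}` and `init={}` defaults removed in module.py are exactly the "objects sharing instance variables they were not supposed to share" bug from the NEWS entry: every instance constructed without an explicit argument received the same dict object. A minimal sketch (hypothetical classes, not the Theano `Method` itself):

```python
class MethodBad(object):
    def __init__(self, updates={}):
        # BUG: every instance built without `updates` shares this one dict.
        self.updates = updates

class MethodGood(object):
    def __init__(self, updates=None):
        # FIX: each instance gets its own dict when none is passed.
        if updates is None:
            updates = {}
        self.updates = updates

a, b = MethodBad(), MethodBad()
a.updates['w'] = 1     # silently visible through b as well
print(b.updates)       # {'w': 1}

c, d = MethodGood(), MethodGood()
c.updates['w'] = 1
print(d.updates)       # {}
```

Mutating `a.updates` changes `b.updates` because both attributes are the same object; the `None`-sentinel version keeps the instances independent.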
theano/compile/pfunc.py

@@ -322,7 +322,7 @@ class Param(object):
         self.implicit = implicit

-def pfunc(params, outputs=None, mode=None, updates=[], givens=[],
+def pfunc(params, outputs=None, mode=None, updates=None, givens=None,
          no_default_updates=False, accept_inplace=False, name=None,
          rebuild_strict=True, allow_input_downcast=None,
          profile=None, on_unused_input='raise'):
...
@@ -405,6 +405,10 @@ def pfunc(params, outputs=None, mode=None, updates=[], givens=[],
     # Then it clones the outputs and the update expressions. This rebuilds a computation graph
     # from the inputs and the givens.
     #
+    if updates is None:
+        updates = []
+    if givens is None:
+        givens = []
     if profile is None:
         profile = config.profile
     # profile -> True or False
...
theano/compile/profilemode.py

@@ -82,9 +82,13 @@ class Profile_Maker(FunctionMaker):
         return ret

 class ProfileMode(Mode):
-    def __init__(self, linker=config.linker, optimizer=config.optimizer):
+    def __init__(self, linker=None, optimizer=None):
+        if linker is None:
+            linker = config.linker
+        if optimizer is None:
+            optimizer = config.optimizer
         message = ""
         profile_stats = {}
         self.__setstate__((linker,
                            optimizer,
                            message,
...
theano/gof/cc.py

@@ -402,8 +402,10 @@ class CLinker(link.Linker):
     def __init__(self):
         self.env = None

-    def accept(self, env, no_recycling=[]):
+    def accept(self, env, no_recycling=None):
         """WRITEME"""
+        if no_recycling is None:
+            no_recycling = []
         if self.env is not None and self.env is not env:
             return type(self)().accept(env, no_recycling)
             #raise Exception("Cannot accept from a Linker that is already"
...
@@ -987,12 +989,18 @@ class CLinker(link.Linker):
             )

     @staticmethod
-    def cmodule_key_(env, no_recycling, compile_args=[], libraries=[],
-                     header_dirs=[], insert_config_md5=True):
+    def cmodule_key_(env, no_recycling, compile_args=None, libraries=None,
+                     header_dirs=None, insert_config_md5=True):
         """
         Do the actual computation of cmodule_key in a static method
         to allow it to be reused in scalar.Composite.__eq__
         """
+        if compile_args is None:
+            compile_args = []
+        if libraries is None:
+            libraries = []
+        if header_dirs is None:
+            header_dirs = []
         order = list(env.toposort())

         #set of variables that have been computed by nodes we have
         # seen 'so far' in the loop below
...
@@ -1381,7 +1389,9 @@ class OpWiseCLinker(link.LocalLinker):
         self.nice_errors = nice_errors
         self.allow_gc = allow_gc

-    def accept(self, env, no_recycling=[]):
+    def accept(self, env, no_recycling=None):
+        if no_recycling is None:
+            no_recycling = []
         if self.env is not None and self.env is not env:
             return type(self)(self.fallback_on_perform).accept(env,
                                                                no_recycling)
...
@@ -1519,7 +1529,9 @@ class DualLinker(link.Linker):
         self.env = None
         self.checker = checker

-    def accept(self, env, no_recycling=[]):
+    def accept(self, env, no_recycling=None):
+        if no_recycling is None:
+            no_recycling = []
         if self.env is not None and self.env is not env:
             return type(self)(self.checker).accept(env, no_recycling)
             # raise Exception("Cannot accept from a Linker that is already "
...
theano/gof/cmodule.py

@@ -1411,7 +1411,8 @@ class GCC_compiler(object):
     @staticmethod
     def compile_str(module_name, src_code, location=None,
-                    include_dirs=[], lib_dirs=[], libs=[], preargs=[]):
+                    include_dirs=None, lib_dirs=None, libs=None,
+                    preargs=None):
         """
         :param module_name: string (this has been embedded in the src_code
...
@@ -1435,6 +1436,12 @@ class GCC_compiler(object):
         """
         #TODO: Do not do the dlimport in this function
+        if include_dirs is None:
+            include_dirs = []
+        if lib_dirs is None:
+            lib_dirs = []
+        if libs is None:
+            libs = []
         if preargs is None:
             preargs = []
         else:
...
theano/gof/env.py

@@ -80,18 +80,24 @@ class Env(utils.object2):
     ### Special ###
     # TODO: document which things that features can do to the env
-    def __init__(self, inputs, outputs, features=[]):
+    def __init__(self, inputs, outputs, features=None):
         """
-        Create an Env which operates on the subgraph bound by the inputs and outputs sets.
+        Create an Env which operates on the subgraph bound by the inputs and
+        outputs sets.

-        This class keeps a pointer to the inputs and outputs, and also modifies them.
+        This class keeps a pointer to the inputs and outputs, and also modifies
+        them.

-        #TODO: document what variables are[not] set in the env when a feature is added via the
-        constructor. How constructed is the env?
+        #TODO: document what variables are[not] set in the env when a feature
+        is added via the constructor. How constructed is the env?
         """
+        if features is None:
+            features = []
+
+        # XXX: Unless I'm missing something (but there's no documentation,
+        # so I probably am) this should be a set.
         self._features = []

         # All nodes in the subgraph defined by inputs and outputs are cached in nodes
...
@@ -109,8 +115,10 @@ class Env(utils.object2):
         for input in self.inputs:
             if input.owner is not None:
-                raise ValueError("One of the provided inputs is the output of an already existing node. " \
-                    "If that is okay, either discard that input's owner or use graph.clone.")
+                raise ValueError("One of the provided inputs is the output of"
+                                 "an already existing node. "
+                                 "If that is okay, either discard that "
+                                 "input's owner or use graph.clone.")
             self.__setup_r__(input)
             self.variables.add(input)
...
@@ -432,6 +440,9 @@ class Env(utils.object2):
     ### features ###

+    # XXX: This is terribly named. The "extend" method of a list
+    # takes a sequence, and since this is a kind of container you
+    # would expect it to do similarly.
     def extend(self, feature):
         """WRITEME
         Adds a feature to this env. The feature may define one
...
theano/gof/graph.py

@@ -675,9 +675,11 @@ def general_toposort(r_out, deps, debug_print = False):
     return rlist

-def io_toposort(i, o, orderings={}):
+def io_toposort(i, o, orderings=None):
     """WRITEME
     """
+    if orderings is None:
+        orderings = {}
     #the inputs are used only here in the function that decides what 'predecessors' to explore
     iset = set(i)
     def deps(obj):
...
@@ -701,7 +703,7 @@ default_node_formatter = lambda op, argstrings: "%s(%s)" % (op.op,
                                                             ", ".join(argstrings))

-def is_same_graph(var1, var2, givens={}, debug=False):
+def is_same_graph(var1, var2, givens=None, debug=False):
     """
     Return True iff Variables `var1` and `var2` perform the same computation.
...
@@ -740,6 +742,8 @@ def is_same_graph(var1, var2, givens={}, debug=False):
     ====== ====== ====== ======

     """
+    if givens is None:
+        givens = {}
     # Lazy import.
     global equal_computations, is_same_graph_with_merge
     if equal_computations is None:
         from theano.gof.opt import is_same_graph_with_merge
...
theano/gof/link.py

@@ -299,7 +299,8 @@ def map_storage(env, order, input_storage, output_storage):
     return input_storage, output_storage, storage_map

-def streamline(env, thunks, order, post_thunk_old_storage = None, no_recycling = [],
-               profiler = None, nice_errors = True):
+def streamline(env, thunks, order, post_thunk_old_storage = None,
+               no_recycling = None, profiler = None, nice_errors = True):
     """WRITEME

     :param env:
...
@@ -320,6 +321,8 @@ def streamline(env, thunks, order, post_thunk_old_storage = None, no_recycling =
     :param nice_errors: run in such a way that the double-traceback is printed. This costs a
     bit of performance in the inner python loop.
     """
+    if no_recycling is None:
+        no_recycling = []
     if profiler is not None:
         raise NotImplementedError()
...
@@ -419,7 +422,7 @@ class PerformLinker(LocalLinker):
         self.env = None
         self.allow_gc = allow_gc

-    def accept(self, env, no_recycling=[]):
+    def accept(self, env, no_recycling=None):
         """
         :param env: a PerformLinker can have accepted one Env instance at a time.
...
@@ -427,6 +430,8 @@ class PerformLinker(LocalLinker):
         :returns: self (TODO: WHY? Who calls this function?)
         """
+        if no_recycling is None:
+            no_recycling = []
         if self.env is not None and self.env is not env:
             return type(self)().accept(env, no_recycling)
             #raise Exception("Cannot accept from a Linker that is already tied to another Env.")
...
@@ -548,7 +553,7 @@ class WrapLinker(Linker):
         self.linkers = linkers
         self.wrapper = wrapper

-    def accept(self, env, no_recycling=[]):
+    def accept(self, env, no_recycling=None):
         """
         @type env: gof.Env
         @param env: the env which we will link
...
@@ -560,6 +565,8 @@ class WrapLinker(Linker):
         the computation to avoid reusing it.
         """
+        if no_recycling is None:
+            no_recycling = []
         if self.env is not None and self.env is not env:
             return type(self)(self.linkers, self.wrapper).accept(env, no_recycling)
...
theano/gof/opt.py

@@ -356,12 +356,14 @@ class MergeOptimizer(Optimizer):
 merge_optimizer = MergeOptimizer()

-def is_same_graph_with_merge(var1, var2, givens={}):
+def is_same_graph_with_merge(var1, var2, givens=None):
     """
     Merge-based implementation of `theano.gof.graph.is_same_graph`.

     See help on `theano.gof.graph.is_same_graph` for additional documentation.
     """
+    if givens is None:
+        givens = {}
     # Copy variables since the MergeOptimizer will modify them.
     copied = copy.deepcopy([var1, var2, givens])
     vars = copied[0:2]
...
@@ -483,7 +485,9 @@ class LocalOptimizer(object):
 class FromFunctionLocalOptimizer(LocalOptimizer):
     """WRITEME"""
-    def __init__(self, fn, tracks=[]):
+    def __init__(self, fn, tracks=None):
+        if tracks is None:
+            tracks = []
         self.transform = fn
         self._tracks = tracks

     def tracks(self):
...
theano/gof/tests/test_destroyhandler.py

@@ -40,9 +40,18 @@ def MyValue(data):
 class MyOp(Op):

-    def __init__(self, nin, name, vmap={}, dmap={}, nout=1,
-                 destroyhandler_tolerate_same=[],
-                 destroyhandler_tolerate_aliased=[]):
+    def __init__(self, nin, name, vmap=None, dmap=None, nout=1,
+                 destroyhandler_tolerate_same=None,
+                 destroyhandler_tolerate_aliased=None):
+        if vmap is None:
+            vmap = {}
+        if dmap is None:
+            dmap = {}
+        if destroyhandler_tolerate_same is None:
+            destroyhandler_tolerate_same = []
+        if destroyhandler_tolerate_aliased is None:
+            destroyhandler_tolerate_aliased = []
         self.nin = nin
         self.nout = nout
         self.name = name
...
@@ -50,7 +59,7 @@ class MyOp(Op):
         self.view_map = vmap
         self.destroyhandler_tolerate_same = destroyhandler_tolerate_same
         self.destroyhandler_tolerate_aliased = destroyhandler_tolerate_aliased

     def make_node(self, *inputs):
         assert len(inputs) == self.nin
         inputs = map(as_variable, inputs)
...
theano/gof/tests/test_opt.py

@@ -31,8 +31,10 @@ def MyVariable(name):
 class MyOp(Op):

-    def __init__(self, name, dmap={}, x=None):
+    def __init__(self, name, dmap=None, x=None):
         self.name = name
+        if dmap is None:
+            dmap = {}
         self.destroy_map = dmap
         self.x = x
...
theano/gof/vm.py

@@ -429,7 +429,7 @@ class VM_Linker(link.LocalLinker):
         self.callback = callback
         self.updated_vars = {}

-    def accept(self, env, no_recycling=[]):
+    def accept(self, env, no_recycling=None):
         """
         :param env: a PerformLinker can have accepted one Env instance
         at a time.
...
@@ -438,6 +438,8 @@ class VM_Linker(link.LocalLinker):
         :returns: self (TODO: WHY? Who calls this function?)
         """
+        if no_recycling is None:
+            no_recycling = []
         if self.env is not None and self.env is not env:
             return type(self)().accept(env, no_recycling)
         self.env = env
...
theano/misc/pycuda_example.py

@@ -54,7 +54,9 @@ def theano_parse_c_arg(c_arg):
     """
 class TheanoElementwiseKernel(pycuda.elementwise.ElementwiseKernel):
     def __init__(self, arguments, operation,
-                 name="kernel", keep=False, options=[], **kwargs):
+                 name="kernel", keep=False, options=None, **kwargs):
+        if options is None:
+            options = []
         if isinstance(arguments, basestring):
             arguments = [theano_parse_c_arg(arg)
                          for arg in arguments.split(",")]
...
@@ -88,10 +90,12 @@ class PycudaElemwiseKernelOp(GpuOp):
     nin = property(lambda self: self.scalar_op.nin)
     nout = property(lambda self: self.scalar_op.nout)

-    def __init__(self, scalar_op, inplace_pattern={}, name=None):
+    def __init__(self, scalar_op, inplace_pattern=None, name=None):
+        if inplace_pattern is None:
+            inplace_pattern = {}
         self.name = name
         self.scalar_op = scalar_op
-        self.inplace_pattern = None
+        self.inplace_pattern = inplace_pattern

     def __str__(self):
         if self.name is None:
...
@@ -172,10 +176,12 @@ class PycudaElemwiseSourceModuleOp(GpuOp):
     nin = property(lambda self: self.scalar_op.nin)
     nout = property(lambda self: self.scalar_op.nout)

-    def __init__(self, scalar_op, inplace_pattern={}, name=None):
+    def __init__(self, scalar_op, inplace_pattern=None, name=None):
+        if inplace_pattern is None:
+            inplace_pattern = {}
         self.name = name
         self.scalar_op = scalar_op
-        self.inplace_pattern = None
+        self.inplace_pattern = inplace_pattern

     def __str__(self):
         if self.name is None:
...
@@ -264,10 +270,12 @@ class PycudaElemwiseSourceModuleMakeThunkOp(Op):
     nin = property(lambda self: self.scalar_op.nin)
     nout = property(lambda self: self.scalar_op.nout)

-    def __init__(self, scalar_op, inplace_pattern={}, name=None):
+    def __init__(self, scalar_op, inplace_pattern=None, name=None):
+        if inplace_pattern is None:
+            inplace_pattern = {}
         self.name = name
         self.scalar_op = scalar_op
-        self.inplace_pattern = None
+        self.inplace_pattern = inplace_pattern

     def __str__(self):
         if self.name is None:
...
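The `pycuda_example.py` hunks fix two distinct problems at once: the shared `{}` default, and the fact that the constructor assigned `None` instead of the `inplace_pattern` argument, silently discarding it. A simplified sketch (`OpBuggy`/`OpFixed` are stand-ins, not the real Theano classes):

```python
class OpBuggy(object):
    def __init__(self, inplace_pattern={}):      # one dict shared by all instances...
        self.inplace_pattern = None              # ...and the argument was discarded anyway

class OpFixed(object):
    def __init__(self, inplace_pattern=None):
        if inplace_pattern is None:
            inplace_pattern = {}                 # private dict per instance
        self.inplace_pattern = inplace_pattern   # actually keep what was passed

a = OpBuggy({0: 0})
print(a.inplace_pattern)   # None -- the caller's pattern was silently lost

b = OpFixed({0: 0})
c = OpFixed()
c.inplace_pattern[1] = 1
print(b.inplace_pattern)   # {0: 0}
print(c.inplace_pattern)   # {1: 1} -- no cross-instance sharing
```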
theano/printing.py

@@ -170,14 +170,18 @@ class Print(Op):
 class PrinterState(gof.utils.scratchpad):

-    def __init__(self, props={}, **more_props):
+    def __init__(self, props=None, **more_props):
+        if props is None:
+            props = {}
         if isinstance(props, gof.utils.scratchpad):
             self.__update__(props)
         else:
             self.__dict__.update(props)
         self.__dict__.update(more_props)

-    def clone(self, props={}, **more_props):
+    def clone(self, props=None, **more_props):
+        if props is None:
+            props = {}
         return PrinterState(self, **dict(props, **more_props))
...
@@ -359,8 +363,10 @@ class PPrinter:
             cp.assign(condition, printer)
         return cp

-    def process_graph(self, inputs, outputs, updates={},
+    def process_graph(self, inputs, outputs, updates=None,
                       display_inputs=False):
+        if updates is None:
+            updates = {}
         if not isinstance(inputs, (list, tuple)):
             inputs = [inputs]
         if not isinstance(outputs, (list, tuple)):
...
theano/sandbox/cuda/basic_ops.py

@@ -130,10 +130,12 @@ class GpuElemwise(GpuOp):
     nin = property(lambda self: self.scalar_op.nin)
     nout = property(lambda self: self.scalar_op.nout)

-    def __init__(self, scalar_op, inplace_pattern={}, sync=None):
+    def __init__(self, scalar_op, inplace_pattern=None, sync=None):
         #TODO-- this looks like a bug-- either we should use the sync argument
         # or get rid of it, we shouldn't let the client think they can control
         #sync when they can't
+        if inplace_pattern is None:
+            inplace_pattern = {}
         sync = config.gpuelemwise.sync

         self.scalar_op = scalar_op
...
theano/sandbox/cuda/elemwise.py

@@ -39,11 +39,13 @@ class NaiveAlgo(object):
     #cache_version = ()
     cache_version = (15, verbose)

-    def __init__(self, scalar_op, sync=True, inplace_pattern={}):
+    def __init__(self, scalar_op, sync=True, inplace_pattern=None):
         """
         :param scalar_op: the scalar operation to execute on each element.
         :param sync: if True, will wait after the kernel launch and check for error call.
         """
+        if inplace_pattern is None:
+            inplace_pattern = {}
         try:
             code = scalar_op.c_support_code_apply(None, "nodename")
             if code:
...
theano/sandbox/cuda/tests/test_bench_loopfusion.py

@@ -54,9 +54,14 @@ class Kouh2008(object):
         _logger.debug('output dtype %s' % output.dtype)

     @classmethod
-    def new_expbounds(cls, rng, x_list, n_out, dtype=None, params=[], updates=[], exponent_range=(1.0, 3.0)):
+    def new_expbounds(cls, rng, x_list, n_out, dtype=None, params=None,
+                      updates=None, exponent_range=(1.0, 3.0)):
         """
         """
+        if params is None:
+            params = []
+        if updates is None:
+            updates = []
         if dtype is None:
             dtype = x_list[0].dtype
         n_terms = len(x_list)
...
theano/sandbox/debug.py

@@ -10,12 +10,16 @@ class DebugLinker(gof.WrapLinker):
     def __init__(self,
                  linkers,
-                 debug_pre=[],
-                 debug_post=[],
+                 debug_pre=None,
+                 debug_post=None,
                  copy_originals=False,
                  check_types=True,
                  compare_variables=True,
                  compare_fn=(lambda x, y: x == y)):
+        if debug_pre is None:
+            debug_pre = []
+        if debug_post is None:
+            debug_post = []
         gof.WrapLinker.__init__(self,
                                 linkers=linkers,
                                 wrapper=self.wrapper)
...
@@ -23,7 +27,7 @@ class DebugLinker(gof.WrapLinker):
         self.env = None
         self.compare_fn = compare_fn
         self.copy_originals = copy_originals
         if check_types not in [None, True]:
             self.check_types = check_types
...
@@ -42,10 +46,12 @@ class DebugLinker(gof.WrapLinker):
         if compare_variables is not None:
             self.debug_post.append(self.compare_variables)

-    def accept(self, env, no_recycling=[]):
+    def accept(self, env, no_recycling=None):
+        if no_recycling is None:
+            no_recycling = []
         return gof.WrapLinker.accept(self,
                                      env=env,
                                      no_recycling=no_recycling)

     def store_value(self, i, node, *thunks):
         th1 = thunks[0]
...
@@ -165,7 +171,9 @@ def numpy_compare(a, b, tolerance = 1e-6):
     return a == b

-def numpy_debug_linker(pre, post=[]):
+def numpy_debug_linker(pre, post=None):
+    if post is None:
+        post = []
     return DebugLinker([gof.OpWiseCLinker],
                        pre,
                        post,
...
theano/sandbox/symbolic_module.py

@@ -96,10 +96,12 @@ def compile_fn(f, path_locals, common_inputs):
         updated = []
     return compiled_f, updated

-def compile(smod, initial_values={}):
+def compile(smod, initial_values=None):
     """
     :type values: dictionary Variable -> value
     """
+    if initial_values is None:
+        initial_values = {}
     def sym_items(mod):
         for k in mod.__dict__:
             if k in ['__module__', 'build_graph', '__doc__']:
...
theano/sandbox/test_rng_mrg.py

@@ -281,8 +281,10 @@ def test_consistency_GPU_parallel():
     samples = numpy.array(samples).flatten()
     assert(numpy.allclose(samples, java_samples))

-def basictest(f, steps, sample_size, prefix="", allow_01=False, inputs=[],
+def basictest(f, steps, sample_size, prefix="", allow_01=False, inputs=None,
               target_avg=0.5, target_std=None, mean_rtol=0.01):
+    if inputs is None:
+        inputs = []
     dt = 0.0
     avg_std = 0.0
...
theano/sandbox/theano_object.py

@@ -25,7 +25,7 @@ class symbolic_fn_callable(object):
     class.

     .. code-block:: python

         class T(TheanoObject):
             @symbolic_fn
             def add(self, x):
...
@@ -33,7 +33,7 @@ class symbolic_fn_callable(object):
                 add_outputs = ...
                 add_updates = ...
                 return RVal(add_outputs, add_updates)

         t = T()
         t.add.outputs(5)          # returns `add_outputs` from when `x=theano_type(5)`
         t.add.updates(5)          # returns `add_updates` from when `x=theano_type(5)`
         t.add.theano_function(5)  # returns the `Function` compiled when `x=theano_type(5)`
...
@@ -48,7 +48,7 @@ class symbolic_fn_callable(object):
         """Silly method to work with symbolic_fn.__get__"""
         self.o_self = o_self
         return self

     def run_symbolic(self, *args, **kwargs):
         return self.o_self._get_method_impl(self.fn, self.o_self, args, kwargs, mode=self.mode)
...
@@ -70,7 +70,7 @@ class symbolic_fn(object):
     def __init__(self, fn, mode=None):
         self.fn = fn
         self.callable = symbolic_fn_callable(fn, mode)

     def __get__(self, o_self, o_cls):
         return self.callable.on(o_self)
...
@@ -91,16 +91,18 @@ class RVal(object):
     """A Return-Value object for a `symbolic_fn` """

     outputs = []
     """The method will compute values for the variables in this list"""

     updates = {}
     """The method will update module variables in this dictionary

     For items ``(k,v)`` in this dictionary, ``k`` must be a `symbolic_member` of some module.
     On each call to this compiled function, the value of ``k`` will be replaced with the
     computed value of the Variable ``v``.
     """
-    def __init__(self, outputs, updates={}):
+    def __init__(self, outputs, updates=None):
+        if updates is None:
+            updates = {}
         self.outputs = outputs
         assert type(updates) is dict
         self.updates = updates
...
@@ -111,19 +113,19 @@ class TheanoObject(object):
     This class provides support for symbolic_fn class attributes.
     These will be compiled on demand so that they can be used just like normal (non-symbolic)
     methods.

     The symbolic functions in a TheanoObject can share member variables that have been created
     using the `symbolic_member` method.

     :note: Other variables (ones not created using ``self.symbolic_member``) referred to in the
     body of a symbolic function will *not* be shared between symbolic functions, or between
     symbolic functions and this class. These other variables will be locked away in the
     closure of a symbolic function when that function is compiled.

     :warning: It is not recommended for code to interleave
     (a) changes to non-symbolic instance variables with
     (b) calls to symbolic functions that use those instance variables.

     A symbolic function may be
     compiled multiple times because it must be compiled for each set of argument types.
     Each time the function is compiled, the values of non-symbolic variables will be locked
...
@@ -179,7 +181,7 @@ class TheanoObject(object):
         # construct In instances for the symbolic_member instances that can automatically be
         # included here.
         module_inputs = [theano.compile.io.In(
             variable=v,
             value=v._theanoclass_container,
             mutable=(v in rval.updates),
             update=rval.updates.get(v, None))
...
@@ -210,7 +212,7 @@ class TheanoObject(object):
         v = tensor.lscalar(name)
         v._theanoclass_container = \
                 theano.gof.Container(v,
                                      storage=[theano._asarray(ival, dtype='int64')],
                                      readonly=False)
         assert not hasattr(v, 'set')
...
@@ -222,5 +224,5 @@ class TheanoObject(object):
         return v
theano/scan_module/scan_utils.py  View file @ a96d5716
...
@@ -454,7 +454,7 @@ def infer_shape(outs, inputs, input_shapes):
 class Validator(object):
-    def __init__(self, valid=[], invalid=[], valid_equivalent={}):
+    def __init__(self, valid=None, invalid=None, valid_equivalent=None):
         '''
         Check if variables can be expressed without using variables in invalid.
...
@@ -462,6 +462,13 @@ class Validator(object):
         variables to valid ones that can be used instead.
         '''
+        if valid is None:
+            valid = []
+        if invalid is None:
+            invalid = []
+        if valid_equivalent is None:
+            valid_equivalent = {}
+
         # Nodes that are valid to have in the graph computing outputs
         self.valid = set(valid)
...
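The pattern applied throughout this commit exists because Python evaluates a default argument expression once, when the `def` statement runs, so a mutable default like `valid=[]` is a single object shared by every call that omits the argument. A minimal sketch of the bug and of the `None`-sentinel fix (hypothetical `bad_append`/`good_append` helpers, not part of Theano):

```python
def bad_append(item, dest=[]):
    # `dest=[]` is evaluated ONCE, at def time; every call that
    # omits `dest` appends to the same shared list.
    dest.append(item)
    return dest

def good_append(item, dest=None):
    # None sentinel: a fresh list is created at call time.
    if dest is None:
        dest = []
    dest.append(item)
    return dest

print(bad_append(1))   # [1]
print(bad_append(2))   # [1, 2] -- state leaked from the previous call
print(good_append(1))  # [1]
print(good_append(2))  # [2]
```

This is why the fix is mechanical: change the default to `None`, then materialize the empty list/dict inside the function body.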
theano/tensor/basic.py  View file @ a96d5716
...
@@ -2662,22 +2662,28 @@ def zeros_like(model, dtype=None):
     return fill(model, constant(0.0, dtype=dtype))


-def zeros(shape, dtype=config.floatX):
+def zeros(shape, dtype=None):
     """
     Create a Tensor filled with zeros, closer to Numpy's syntax than ``alloc``.
     """
+    if dtype is None:
+        dtype = config.floatX
     return alloc(numpy.array(0, dtype=dtype), *shape)


-def ones(shape, dtype=config.floatX):
+def ones(shape, dtype=None):
     """
     Create a Tensor filled with ones, closer to Numpy's syntax than ``alloc``.
     """
+    if dtype is None:
+        dtype = config.floatX
     return alloc(numpy.array(1, dtype=dtype), *shape)


 class Eye(gof.Op):
-    def __init__(self, dtype=config.floatX):
+    def __init__(self, dtype=None):
+        if dtype is None:
+            dtype = config.floatX
         self.dtype = dtype

     def make_node(self, n, m, k):
...
@@ -2702,8 +2708,10 @@ class Eye(gof.Op):
         return hash(self.dtype) ^ hash(type(self))


-def eye(n, m=None, k=0, dtype=config.floatX):
-    if m == None:
+def eye(n, m=None, k=0, dtype=None):
+    if dtype is None:
+        dtype = config.floatX
+    if m is None:
         m = n
     localop = Eye(dtype)
     return localop(n, m, k)
...
@@ -3080,7 +3088,7 @@ def var(input, axis=None):
     """
     input_ndim = input.type.ndim
-    if axis == None:
+    if axis is None:
         axis = range(input_ndim)
     if isinstance(axis, int):
         axis = [axis]
...
@@ -4081,7 +4089,9 @@ class IncSubtensor(Op):
     """
     def __init__(self, idx_list, inplace=False, set_instead_of_inc=False,
-                 destroyhandler_tolerate_aliased=[]):
+                 destroyhandler_tolerate_aliased=None):
+        if destroyhandler_tolerate_aliased is None:
+            destroyhandler_tolerate_aliased = []
         self.idx_list = map(Subtensor.convert, idx_list)
         self.inplace = inplace
         if inplace:
...
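The `dtype=config.floatX` defaults replaced above have a second problem beyond mutability: because the default expression is evaluated when the `def` statement runs, the value of `config.floatX` is frozen at import time, and a user who changes the setting afterwards is silently ignored. A sketch of the difference, using a hypothetical `FakeConfig` stand-in rather than the real `theano.config`:

```python
class FakeConfig(object):
    """Hypothetical stand-in for theano.config; `floatX` here is an
    ordinary attribute, not the real Theano configuration system."""
    floatX = 'float64'

config = FakeConfig()

def zeros_frozen(shape, dtype=config.floatX):
    # config.floatX was read once, when this def statement executed
    return (shape, dtype)

def zeros_live(shape, dtype=None):
    if dtype is None:
        dtype = config.floatX  # read the *current* setting at call time
    return (shape, dtype)

config.floatX = 'float32'    # the user changes the setting after import
print(zeros_frozen((2, 2)))  # ((2, 2), 'float64') -- stale import-time value
print(zeros_live((2, 2)))    # ((2, 2), 'float32') -- honours the change
```

With the `None` sentinel, the setting is looked up on every call, which is the behaviour the docstrings imply.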
theano/tensor/deprecated/rmodule.py  View file @ a96d5716
...
@@ -9,7 +9,7 @@ else:
 import numpy
 from copy import copy
 from theano.compile import (SymbolicInputKit, SymbolicInput,
         Module, module, Method, Member, In, Component)
 from theano.gof import Container
 from theano.gof.python25 import deque
...
@@ -20,7 +20,7 @@ class KitComponent(Component):
     """
     Represents a SymbolicInputKit (see io.py).
     """
     def __init__(self, kit):
         super(KitComponent, self).__init__()
         self.kit = kit
...
@@ -88,7 +88,9 @@ rk = RandomKit('rk', 0xBAD5EED)
 class RModule(Module):
     """Module providing random number streams in Theano graphs."""
-    def __init__(self, components={}, **kwcomponents):
+    def __init__(self, components=None, **kwcomponents):
+        if components is None:
+            components = {}
         super(RModule, self).__init__(components, **kwcomponents)
         self.random = RandomKit('rkit')
         self._rkit = KitComponent(self.random)
...
@@ -104,8 +106,8 @@ class RModule(Module):
         if recursive:
             #Here, we recurse through all the components (inst2) contained in (inst)
             #and seeds each subcomponent that is an RModule
             for path, c in self.flat_components_map(True):
                 if isinstance(c, RModule):
                     inst2 = inst
...
theano/tensor/elemwise.py  View file @ a96d5716
...
@@ -430,7 +430,7 @@ class Elemwise(Op):
     Elemwise(log)(rand(3, 4, 5))
     """
-    def __init__(self, scalar_op, inplace_pattern={}, name=None,
+    def __init__(self, scalar_op, inplace_pattern=None, name=None,
                  nfunc_spec=None):
         """
         Usage: Elemwise(scalar_op, inplace_pattern = {})
...
@@ -451,6 +451,8 @@ class Elemwise(Op):
         NOTE: as of now, the sign of the nout field is ignored (some work
         needs to be done to resize the destinations when needed).
         """
+        if inplace_pattern is None:
+            inplace_pattern = {}
         self.name = name
         self.scalar_op = scalar_op
         self.inplace_pattern = inplace_pattern
...
theano/tensor/nnet/tests/test_sigm.py  View file @ a96d5716
...
@@ -31,7 +31,7 @@ class T_softplus(unittest.TestCase):
 class T_sigmoid_opts(unittest.TestCase):

-    def get_mode(self, excluding=[]):
+    def get_mode(self, excluding=None):
         """
         Return appropriate mode for the tests.
...
@@ -41,6 +41,8 @@ class T_sigmoid_opts(unittest.TestCase):
         set to 'FAST_COMPILE' (in which case it is replaced by the 'FAST_RUN'
         mode), without the optimizations specified in `excluding`.
         """
+        if excluding is None:
+            excluding = []
         m = theano.config.mode
         if m == 'FAST_COMPILE':
             mode = theano.compile.mode.get_mode('FAST_RUN')
...
theano/tensor/tests/test_basic.py  View file @ a96d5716
...
@@ -173,9 +173,19 @@ def safe_make_node(op, *inputs):
     return node.owner

-def makeTester(name, op, expected, checks={}, good={}, bad_build={},
-               bad_runtime={}, grad={}, mode=None, grad_rtol=None,
-               eps=1e-10, skip=False):
+def makeTester(name, op, expected, checks=None, good=None, bad_build=None,
+               bad_runtime=None, grad=None, mode=None, grad_rtol=None,
+               eps=1e-10, skip=False):
+    if checks is None:
+        checks = {}
+    if good is None:
+        good = {}
+    if bad_build is None:
+        bad_build = {}
+    if bad_runtime is None:
+        bad_runtime = {}
+    if grad is None:
+        grad = {}
     if grad is True:
         grad = good
...
@@ -400,7 +410,9 @@ def rand_of_dtype(shape, dtype):
         raise TypeError()

-def makeBroadcastTester(op, expected, checks={}, name=None, **kwargs):
+def makeBroadcastTester(op, expected, checks=None, name=None, **kwargs):
+    if checks is None:
+        checks = {}
     if name is None:
         name = str(op)
     # Here we ensure the test name matches the name of the variable defined in
...
@@ -575,10 +587,12 @@ MulInplaceTester = makeBroadcastTester(op = inplace.mul_inplace,
                                        inplace = True)

-def copymod(dct, without=[], **kwargs):
+def copymod(dct, without=None, **kwargs):
     """Return dct but with the keys named by args removed, and with
     kwargs added.
     """
+    if without is None:
+        without = []
     rval = copy(dct)
     for a in without:
         if a in rval:
...
theano/tensor/tests/test_blas.py  View file @ a96d5716
...
@@ -1427,7 +1427,9 @@ class TestGer(TestCase, unittest_tools.TestOptimizationMixin):
         self.ger_destructive = ger_destructive
         self.gemm = gemm_no_inplace

-    def function(self, inputs, outputs, updates={}):
+    def function(self, inputs, outputs, updates=None):
+        if updates is None:
+            updates = {}
         return theano.function(inputs, outputs, self.mode, updates=updates)

     def b(self, bval):
...
theano/tensor/tests/test_merge.py  View file @ a96d5716
...
@@ -21,11 +21,13 @@ class MyType(Type):
 class MyOp(Op):

-    def __init__(self, name, dmap={}, x=None):
+    def __init__(self, name, dmap=None, x=None):
+        if dmap is None:
+            dmap = {}
         self.name = name
         self.destroy_map = dmap
         self.x = x

     def make_node(self, *inputs):
         inputs = map(as_variable, inputs)
         for input in inputs:
...
@@ -41,7 +43,8 @@ class MyOp(Op):
         return self.name

     def __eq__(self, other):
-        return self is other or isinstance(other, MyOp) and self.x is not None and self.x == other.x
+        return (self is other or isinstance(other, MyOp)
+                and self.x is not None and self.x == other.x)

     def __hash__(self):
         if self.x is not None:
...
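The `__eq__` overload in the hunk above is also the reason this commit replaces comparisons like `m == None` with `m is None`: `==` dispatches to the left operand's `__eq__`, which a class may override to return something other than a bool (Theano variables build a symbolic comparison node, for example), whereas `is` always performs a plain identity test. A sketch with a hypothetical `Sym` class, not real Theano code:

```python
class Sym(object):
    """Hypothetical stand-in for an object that overloads `==`
    (much as symbolic variables do, returning a graph node)."""
    def __eq__(self, other):
        return Sym()  # returns an (always truthy) object, not True/False

s = Sym()
print(bool(s == None))  # True  -- dispatched to Sym.__eq__, misleading
print(s is None)        # False -- identity test, cannot be overloaded
```

Using `is None` therefore keeps the sentinel check correct even for argument types with custom equality.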