Commit 1211db88
Authored Aug 19, 2015 by abergeron

Merge pull request #3302 from harlouci/numpydoc_scan_module

Numpydoc scan_module

Parents: 690d3628, 02b6e413
Showing 6 changed files with 235 additions and 173 deletions.
Changed files:

- theano/scan_module/__init__.py (+2, -3)
- theano/scan_module/scan.py (+34, -34)
- theano/scan_module/scan_op.py (+68, -45)
- theano/scan_module/scan_opt.py (+46, -13)
- theano/scan_module/scan_utils.py (+0, -0)
- theano/scan_module/scan_views.py (+85, -78)
theano/scan_module/__init__.py
"""
"""
This module provides the Scan Op
This module provides the Scan Op
.
Scanning is a general form of recurrence, which can be used for looping.
Scanning is a general form of recurrence, which can be used for looping.
The idea is that you *scan* a function along some input sequence, producing
The idea is that you *scan* a function along some input sequence, producing
...
@@ -26,9 +26,8 @@ the symbolic graph.
...
@@ -26,9 +26,8 @@ the symbolic graph.
The Scan Op should typically be used by calling any of the following
The Scan Op should typically be used by calling any of the following
functions: ``scan()``, ``map()``, ``reduce()``, ``foldl()``,
functions: ``scan()``, ``map()``, ``reduce()``, ``foldl()``,
``foldr()``.
``foldr()``.
"""
"""
__docformat__
=
'restructedtext en'
__docformat__
=
'restructedtext en'
__authors__
=
(
"Razvan Pascanu "
__authors__
=
(
"Razvan Pascanu "
"Frederic Bastien "
"Frederic Bastien "
...
...
theano/scan_module/scan.py
"""
"""
This module provides the Scan Op
This module provides the Scan Op
.
Scanning is a general form of recurrence, which can be used for looping.
Scanning is a general form of recurrence, which can be used for looping.
The idea is that you *scan* a function along some input sequence, producing
The idea is that you *scan* a function along some input sequence, producing
...
@@ -32,6 +32,7 @@ host at each step
...
@@ -32,6 +32,7 @@ host at each step
The Scan Op should typically be used by calling any of the following
The Scan Op should typically be used by calling any of the following
functions: ``scan()``, ``map()``, ``reduce()``, ``foldl()``,
functions: ``scan()``, ``map()``, ``reduce()``, ``foldl()``,
``foldr()``.
``foldr()``.
"""
"""
__docformat__
=
'restructedtext en'
__docformat__
=
'restructedtext en'
__authors__
=
(
"Razvan Pascanu "
__authors__
=
(
"Razvan Pascanu "
...
@@ -84,7 +85,9 @@ def scan(fn,

    This function constructs and applies a Scan op to the provided
    arguments.

    Parameters
    ----------
    fn
        ``fn`` is a function that describes the operations involved in one
        step of ``scan``. ``fn`` should construct variables describing the
        output of one iteration step. It should expect as input theano
    ...
@@ -175,7 +178,7 @@ def scan(fn,

    number of steps ) is still required even though a condition is
    passed (and it is used to allocate memory if needed).

    sequences
        ``sequences`` is the list of Theano variables or dictionaries
        describing the sequences ``scan`` has to iterate over. If a
        sequence is given as wrapped in a dictionary, then a set of optional
    ...
@@ -193,8 +196,7 @@ def scan(fn,

    Any Theano variable in the list ``sequences`` is automatically
    wrapped into a dictionary where ``taps`` is set to ``[0]``

    outputs_info
        ``outputs_info`` is the list of Theano variables or dictionaries
        describing the initial state of the outputs computed
        recurrently. When this initial states are given as dictionary
    ...
@@ -252,15 +254,13 @@ def scan(fn,

    raised (because there is no convention on how scan should map
    the provided information to the outputs of ``fn``)

    non_sequences
        ``non_sequences`` is the list of arguments that are passed to
        ``fn`` at each steps. One can opt to exclude variable
        used in ``fn`` from this list as long as they are part of the
        computational graph, though for clarity we encourage not to do so.

    n_steps
        ``n_steps`` is the number of steps to iterate given as an int
        or Theano scalar. If any of the input sequences do not have
        enough elements, scan will raise an error. If the *value is 0* the
    ...
@@ -270,8 +270,7 @@ def scan(fn,

    in time. If n_steps is not provided, ``scan`` will figure
    out the amount of steps it should run given its input sequences.

    truncate_gradient
        ``truncate_gradient`` is the number of steps to use in truncated
        BPTT. If you compute gradients through a scan op, they are
        computed using backpropagation through time. By providing a
    ...
@@ -279,16 +278,14 @@ def scan(fn,

    of classical BPTT, where you go for only ``truncate_gradient``
    number of steps back in time.

    go_backwards
        ``go_backwards`` is a flag indicating if ``scan`` should go
        backwards through the sequences. If you think of each sequence
        as indexed by time, making this flag True would mean that
        ``scan`` goes back in time, namely that for any sequence it
        starts from the end and goes towards 0.

    name
        When profiling ``scan``, it is crucial to provide a name for any
        instance of ``scan``. The profiler will produce an overall
        profile of your code as well as profiles for the computation of
    ...
@@ -296,7 +293,7 @@ def scan(fn,

    appears in those profiles and can greatly help to disambiguate
    information.

    mode
        It is recommended to leave this argument to None, especially
        when profiling ``scan`` (otherwise the results are not going to
        be accurate). If you prefer the computations of one step of
    ...
@@ -305,7 +302,7 @@ def scan(fn,

    loop are done (see ``theano.function`` for details about
    possible values and their meaning).

    profile
        Flag or string. If true, or different from the empty string, a
        profile object will be created and attached to the inner graph of
        scan. In case ``profile`` is True, the profile object will have the
    ...
@@ -314,25 +311,27 @@ def scan(fn,

    inner graph with the new cvm linker ( with default modes,
    other linkers this argument is useless)

    allow_gc
        Set the value of allow gc for the internal graph of scan. If
        set to None, this will use the value of config.scan.allow_gc.

    strict
        If true, all the shared variables used in ``fn`` must be provided as a
        part of ``non_sequences`` or ``sequences``.

    Returns
    -------
    tuple
        Tuple of the form (outputs, updates); ``outputs`` is either a
        Theano variable or a list of Theano variables representing the
        outputs of ``scan`` (in the same order as in ``outputs_info``).
        ``updates`` is a subclass of dictionary specifying the update rules for
        all shared variables used in scan.
        This dictionary should be passed to ``theano.function`` when you compile
        your function. The change compared to a normal dictionary is that we
        validate that keys are SharedVariable and addition of those dictionary
        are validated to be consistent.
    """
    # General observation : this code is executed only once, at creation
    # of the computational graph, so we don't yet need to be smart about
    ...

@@ -344,9 +343,10 @@ def scan(fn,

    # check if inputs are just single variables instead of lists
    def wrap_into_list(x):
        """
        Wrap the input into a list if it is not already a list.
        """
        if x is None:
            return []
        elif not isinstance(x, (list, tuple)):
    ...
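The remaining branches of ``wrap_into_list`` are elided in the diff. A self-contained stand-alone version, assuming the truncated branches wrap a single value and normalize tuples (an assumption based on the visible branches, not a quote of the full source), would be:

```python
def wrap_into_list(x):
    """
    Wrap the input into a list if it is not already a list.
    """
    if x is None:
        return []
    elif not isinstance(x, (list, tuple)):
        return [x]        # assumed: a single variable becomes a one-element list
    else:
        return list(x)    # assumed: tuples become lists, lists are copied
```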
@@ -534,7 +534,7 @@ def scan(fn,

    if len(lengths_vec) == 0:
        # ^ No information about the number of steps
        raise ValueError('No information about the number of steps '
                         'provided. Either provide a value for '
                         'n_steps argument of scan or provide an input '
                         'sequence')
    ...
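The multi-line error message above relies on Python's implicit concatenation of adjacent string literals, which joins the pieces at compile time without any ``+``:

```python
# Adjacent string literals inside parentheses are concatenated at compile
# time, producing one long message from several short source lines.
msg = ('No information about the number of steps '
       'provided. Either provide a value for '
       'n_steps argument of scan or provide an input '
       'sequence')
print(msg)
```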
theano/scan_module/scan_op.py
"""
"""
This module provides the Scan Op
This module provides the Scan Op
.
See scan.py for details on scan
See scan.py for details on scan
.
Memory reuse in scan
Memory reuse in scan
...
@@ -44,6 +44,7 @@ relies on the following elements to work properly :
...
@@ -44,6 +44,7 @@ relies on the following elements to work properly :
the outputs are stored as they are computed which means that, if the buffer
the outputs are stored as they are computed which means that, if the buffer
is too small, computing an output can overwrite an input that is still
is too small, computing an output can overwrite an input that is still
needed to compute another output.
needed to compute another output.
"""
"""
from
__future__
import
print_function
from
__future__
import
print_function
...
@@ -96,35 +97,43 @@ AddConfigVar('scan.allow_output_prealloc',

class Scan(PureOp):
    """
    Parameters
    ----------
    inputs
        Inputs of the inner function of scan.
    outputs
        Outputs of the inner function of scan.
    info
        Dictionary containing different properties of the scan op (like number
        of different types of arguments, name, mode, if it should run on GPU or
        not, etc.).
    typeConstructor
        Function that constructs an equivalent to Theano TensorType.

    Notes
    -----
    ``typeConstructor`` had been added to refactor how
    Theano deals with the GPU. If it runs on the GPU, scan needs
    to construct certain outputs (those who reside in the GPU
    memory) as the GPU-specific type. However we can not import
    gpu code in this file (as it is in sandbox, and not available
    on each machine) so the workaround is that the GPU
    optimization passes to the constructor of this class a
    function that is able to construct a GPU type. This way the
    class Scan does not need to be aware of the details for the
    GPU, it just constructs any tensor using this function (which
    by default constructs normal tensors).
    """

    def __init__(self,
                 inputs,
                 outputs,
                 info,
                 typeConstructor=None,
                 ):
        if 'gpua' not in info:
            info['gpua'] = False
        # adding properties into self
    ...
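The ``typeConstructor`` note above describes a plain dependency-injection pattern: the class builds its outputs through a callable that defaults to the CPU constructor, so a GPU-aware optimizer can swap in its own constructor without this file ever importing GPU code. A minimal sketch of that pattern, with hypothetical names rather than Theano's actual types:

```python
def cpu_tensor(dtype, shape):
    # Default constructor: stands in for building a normal TensorType.
    return {"device": "cpu", "dtype": dtype, "shape": shape}


class ScanLike(object):
    def __init__(self, typeConstructor=None):
        # Fall back to the default (CPU) constructor when none is injected.
        self.typeConstructor = typeConstructor or cpu_tensor

    def make_output(self, dtype, shape):
        # The class never needs to know which backend it targets.
        return self.typeConstructor(dtype, shape)


def gpu_tensor(dtype, shape):
    # A GPU-aware optimizer could inject this constructor instead.
    return {"device": "gpu", "dtype": dtype, "shape": shape}


print(ScanLike().make_output("float32", (3,))["device"])
print(ScanLike(typeConstructor=gpu_tensor).make_output("float32", (3,))["device"])
```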
@@ -228,8 +237,10 @@ class Scan(PureOp):

        self.var_mappings = self.get_oinp_iinp_iout_oout_mappings()

    def validate_inner_graph(self):
        """
        Perform some elementary validations on the inner graph to ensure
        that it is coherent.
        """
        # For every recurrent output, iterate over the associated inner
    ...
@@ -323,6 +334,7 @@ class Scan(PureOp):

        inner_X_out - the variable representing the new value of X after
                      executing one step of scan (i.e. outputs given by
                      the inner function)
        """
        assert numpy.all(isinstance(i, gof.Variable) for i in inputs)

        # Check that the number of inputs to the Scan node corresponds to
    ...
@@ -391,10 +403,12 @@ class Scan(PureOp):

        )

    def format(var, as_var):
        """
        This functions ensures that ``out`` has the same dtype as
        ``inp`` as well as calling filter_variable to make sure they are
        both TensorType or CudaNdarrayType. It internally deals with the
        corner case where inp.ndim + 1 = out.ndim
        """
        if not hasattr(var, 'dtype'):
            return var
    ...
@@ -686,24 +700,31 @@ class Scan(PureOp):

    def make_thunk(self, node, storage_map, compute_map, no_recycling):
        """
        Parameters
        ----------
        node
            Something previously returned by self.make_node.
        storage_map
            dict variable -> one-element-list where a computed
            value for this variable may be found.
        compute_map
            dict variable -> one-element-list where a boolean
            value will be found. The boolean indicates whether the
            variable's storage_map container contains a valid value (True)
            or if it has not been computed yet (False).
        no_recycling
            List of variables for which it is forbidden to reuse memory
            allocated by a previous call.

        Notes
        -----
        If the thunk consults the storage_map on every call, it is safe
        for it to ignore the no_recycling argument, because elements of the
        no_recycling list will have a value of None in the storage map. If
        the thunk can potentially cache return values (like CLinker does),
        then it must not do so for variables in the no_recycling list.
        """
        # Before building the thunk, validate that the inner graph is
    ...
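The ``storage_map``/``compute_map`` convention above (one-element lists acting as mutable storage cells, plus a boolean validity flag) can be illustrated with a toy thunk. This is a sketch of the calling convention only, not Theano's actual thunk machinery:

```python
# Each variable maps to a one-element list: a mutable storage cell.
storage_map = {"x": [2.0], "y": [None]}
# compute_map flags whether the corresponding cell holds a valid value.
compute_map = {"x": [True], "y": [False]}


def thunk():
    # Read inputs from the storage map on every call; per the note above,
    # this makes it safe to ignore no_recycling, since recycled cells are
    # reset to None before the call.
    storage_map["y"][0] = storage_map["x"][0] * 3.0
    compute_map["y"][0] = True


thunk()
print(storage_map["y"][0], compute_map["y"][0])  # 6.0 True
```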
@@ -1531,7 +1552,8 @@ class Scan(PureOp):

        return connection_pattern

    def get_oinp_iinp_iout_oout_mappings(self):
        """
        Compute and return dictionary mappings between the inputs and
        outputs of the inner function and the inputs and outputs of the Scan
        node in the outer graph.
        ...

@@ -1541,7 +1563,8 @@ class Scan(PureOp):

        the values are individual integer indices. In dictionaries
        representing mappings to inner variables, the values are sequences of
        indices because multiple inner variables can be associated with the
        same state.
        """
        # Lists for outer variables contain individual indices, lists for
        # inner variables contain sequences of indices because many inner
    ...
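The mapping convention described above — scalar indices when the value is an outer variable, sequences of indices when it is an inner variable — can be illustrated with a toy pair of dictionaries (hypothetical values, not Theano's actual mapping structures):

```python
# Hypothetical scan with one outer sequence feeding two inner taps.
outer_inp_from_inner_inp = {0: 0, 1: 0}   # inner -> outer: scalar indices
inner_inp_from_outer_inp = {0: [0, 1]}    # outer -> inner: lists of indices

# Both inner inputs (the two taps) resolve back to the same outer input 0.
print([outer_inp_from_inner_inp[i] for i in inner_inp_from_outer_inp[0]])
```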
theano/scan_module/scan_opt.py
"""
"""
This module provides optimizations for scan
This module provides optimizations for scan
.
The Optimization provided in this file:
The Optimization provided in this file:
local opt: remove_constants_and_unused_inputs_scan,
local opt: remove_constants_and_unused_inputs_scan,
...
@@ -48,9 +48,8 @@ scan_eqopt2 -> They are all global optimizer. (in2out convert local to global).
...
@@ -48,9 +48,8 @@ scan_eqopt2 -> They are all global optimizer. (in2out convert local to global).
in2out(scan_merge_inouts),
in2out(scan_merge_inouts),
ScanSaveMem,
ScanSaveMem,
in2out(remove_constants_and_unused_inputs_scan3)
in2out(remove_constants_and_unused_inputs_scan3)
"""
"""
__docformat__
=
'restructedtext en'
__docformat__
=
'restructedtext en'
__authors__
=
(
"Razvan Pascanu "
__authors__
=
(
"Razvan Pascanu "
"Frederic Bastien "
"Frederic Bastien "
...
@@ -104,7 +103,7 @@ def info(*msg):

@gof.local_optimizer([scan_op.Scan])
def remove_constants_and_unused_inputs_scan(node):
    """
    Move constants into the inner graph, and remove unused inputs.

    Constants that are in the outer graph are represented by a free symbolic
    ...

@@ -112,7 +111,8 @@ def remove_constants_and_unused_inputs_scan(node):

    constant-folding can happen in the inner graph.
    This is applied only on sequences and non-sequences,
    not on initial states.
    """
    if not isinstance(node.op, scan_op.Scan):
        return False
    op = node.op
    ...
@@ -214,7 +214,9 @@ class PushOutNonSeqScan(gof.Optimizer):

    """
    A global optimizer for pushing out the variables inside the scan that
    are not used by the scan.
    """

    def __init__(self):
        gof.Optimizer.__init__(self)
    ...

@@ -233,6 +235,7 @@ class PushOutNonSeqScan(gof.Optimizer):

        By default they are not ordered for efficiency reasons. Take care
        and make sure of changing them with their Ordered counterparts if you
        need to iterate over these variables.
        """
        # this flag tells if there was any change during the last iterations
        clean_inputs, clean_outputs = scan_utils.reconstruct_graph(
    ...
@@ -410,7 +413,9 @@ class PushOutSeqScan(gof.Optimizer):

    """
    A global optimizer for pushing out the input variables that are not being
    used inside the scan and provided in the sequences.
    """

    def __init__(self):
        gof.Optimizer.__init__(self)
    ...

@@ -429,6 +434,7 @@ class PushOutSeqScan(gof.Optimizer):

        By default they are not ordered for efficiency reasons. Take care
        and make sure of changing them to Ordered versions if you need to
        iterate over those variables.
        """
        # this flag tells if there was any change during the last iterations
        clean_inputs, clean_outputs = scan_utils.reconstruct_graph(
    ...
@@ -653,7 +659,9 @@ class PushOutScanOutput(gof.Optimizer):

    """
    This is an optimization that can push operations performed
    at the end of the inner graph of scan to outside of scan.
    """

    def __init__(self):
        gof.Optimizer.__init__(self)
    ...

@@ -701,8 +709,8 @@ class PushOutScanOutput(gof.Optimizer):

        The Dot product is pushed out of the scan and its inputs are
        now the original matrix and a new matrix obtained by
        concatenating the vectors into a matrix.
        """
        # Ensure that the output of the Dot is used in the outer
        # graph to avoid apply the optimization needlessly
        dot_out_nitsot_idx = args.inner_out_nit_sot.index(nd.out)
    ...

@@ -715,6 +723,7 @@ class PushOutScanOutput(gof.Optimizer):

        non-sequence input to scan and that the other input is a
        vector and either an sequence input to scan or the result
        of computation in the inner function of scan.
        """
        valid_inputs = False
        idx_matrix_input = -1
    ...

@@ -863,6 +872,7 @@ class PushOutScanOutput(gof.Optimizer):

        nit_sot output has only one client and that client is a Subtensor
        instance that takes only the last step (last element along the first
        axis).
        """
        idx = scan_args.inner_out_sit_sot.index(var)
        outer_var = scan_args.outer_out_sit_sot[idx]
    ...
@@ -988,7 +998,11 @@ class ScanInplaceOptimizer(Optimizer):

class ScanInplaceOptimizer(Optimizer):
    """
    Graph optimizer for Scan (makes it run inplace).
    """

    def __init__(self, typeConstructor=None, gpu_flag=False, gpua_flag=False):
        Optimizer.__init__(self)
        self.typeConstructor = typeConstructor
    ...

@@ -1052,7 +1066,11 @@ class ScanInplaceOptimizer(Optimizer):

class ScanSaveMem(gof.Optimizer):
    """
    Graph Optimizer that reduces scan memory consumption.
    """

    def __init__(self):
        gof.Optimizer.__init__(self)
    ...
@@ -1604,7 +1622,11 @@ class ScanSaveMem(gof.Optimizer):

class ScanMerge(gof.Optimizer):
    """
    Graph Optimizer that merges different scan ops.
    """

    def add_requirements(self, fgraph):
        fgraph.attach_feature(gof.toolbox.ReplaceValidate())
    ...

@@ -1783,6 +1805,7 @@ class ScanMerge(gof.Optimizer):

        over the same number of steps, have the same condition (if any),
        have the same value for truncate_gradient, and have the same mode.
        Questionable, we should also consider profile ?
        """
        rep = set_nodes[0]
        if rep.op.as_while != node.op.as_while:
    ...
@@ -1852,13 +1875,19 @@ class ScanMerge(gof.Optimizer):

def has_duplicates(l):
    """
    Returns true if l has any duplicates (according to __eq__).
    """
    return len(set(l)) < len(l)


def make_equiv(lo, li):
    """
    Builds a dictionary of equivalences between inner inputs based on
    the equivalence of their corresponding outer inputs.
    """
    seeno = OrderedDict()
    left = []
    right = []
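The body of ``make_equiv`` is truncated in the diff. A self-contained sketch of both helpers, under the assumption that the truncated loop pairs each inner input whose outer input was already seen with the inner input first associated with that outer input (an assumption, not a quote of the full source):

```python
from collections import OrderedDict


def has_duplicates(l):
    """True if l has any duplicates (according to __eq__/__hash__)."""
    return len(set(l)) < len(l)


def make_equiv(lo, li):
    """
    Sketch: build equivalences between inner inputs (li) whose
    corresponding outer inputs (lo) compare equal.
    """
    seeno = OrderedDict()  # outer input -> first inner input seen for it
    left = []              # inner inputs duplicating an earlier outer input
    right = []             # the inner input each duplicate is equivalent to
    for o, i in zip(lo, li):
        if o in seeno:
            left.append(i)
            right.append(seeno[o])
        else:
            seeno[o] = i
    return OrderedDict(zip(left, right))


# Outer input 'a' appears twice, so inner 'z' is equivalent to inner 'x'.
print(dict(make_equiv(['a', 'b', 'a'], ['x', 'y', 'z'])))
```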
@@ -2034,7 +2063,11 @@ def scan_merge_inouts(node):

class PushOutDot1(gof.Optimizer):
    """
    Graph optimizer for Scan(makes it run inplace).
    """

    def __init__(self):
        Optimizer.__init__(self)
    ...
theano/scan_module/scan_utils.py

Diff collapsed in the original view (+0, -0).
theano/scan_module/scan_views.py
"""
-This module provides syntax shortcut for the Scan Op
+This module provides syntax shortcut for the Scan Op.

-See scan.py for details on scan
+See scan.py for details on scan.
"""
__docformat__ = 'restructedtext en'
__authors__ = ("Razvan Pascanu "
               "Frederic Bastien "
...
@@ -37,26 +37,27 @@ def map(fn,
    """
    Similar behaviour as python's map.

-    :param fn: The function that ``map`` applies at each iteration step
-               (see ``scan`` for more info).
-    :param sequences: List of sequences over which ``map`` iterates
-                      (see ``scan`` for more info).
-    :param non_sequences: List of arguments passed to ``fn``. ``map`` will
-                          not iterate over these arguments (see ``scan`` for
-                          more info).
-    :param truncate_gradient: See ``scan``.
-    :param go_backwards: Boolean value that decides the direction of
-                         iteration. True means that sequences are parsed
-                         from the end towards the begining, while False
-                         is the other way around.
-    :param mode: See ``scan``.
-    :param name: See ``scan``.
+    Parameters
+    ----------
+    fn
+        The function that ``map`` applies at each iteration step
+        (see ``scan`` for more info).
+    sequences
+        List of sequences over which ``map`` iterates
+        (see ``scan`` for more info).
+    non_sequences
+        List of arguments passed to ``fn``. ``map`` will not iterate over
+        these arguments (see ``scan`` for more info).
+    truncate_gradient
+        See ``scan``.
+    go_backwards : bool
+        Decides the direction of iteration. True means that sequences are parsed
+        from the end towards the begining, while False is the other way around.
+    mode
+        See ``scan``.
+    name
+        See ``scan``.
    """
    return scan(fn=fn,
                sequences=sequences,
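For reference, the semantics this docstring documents can be sketched without Theano. `map_sketch` below is a hypothetical plain-Python model, not the symbolic implementation: it applies `fn` to each element (plus the fixed `non_sequences`) and collects the results, with `go_backwards` only reversing the iteration order.

```python
def map_sketch(fn, sequence, non_sequences=(), go_backwards=False):
    """Plain-Python model of what theano.map computes (illustrative only)."""
    # go_backwards flips the direction of iteration, nothing else.
    seq = reversed(sequence) if go_backwards else sequence
    # map does not iterate over non_sequences; they are passed unchanged
    # to fn at every step.
    return [fn(x, *non_sequences) for x in seq]
```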
...
@@ -77,29 +78,31 @@ def reduce(fn,
           mode=None,
           name=None):
    """
-    Similar behaviour as python's reduce
+    Similar behaviour as python's reduce.

-    :param fn: The function that ``reduce`` applies at each iteration step
-               (see ``scan`` for more info).
-    :param sequences: List of sequences over which ``reduce`` iterates
-                      (see ``scan`` for more info)
-    :param outputs_info: List of dictionaries describing the outputs of
-                         reduce (see ``scan`` for more info).
-    :param non_sequences: List of arguments passed to ``fn``. ``reduce`` will
-                          not iterate over these arguments (see ``scan`` for
-                          more info).
-    :param go_backwards: Boolean value that decides the direction of
-                         iteration. True means that sequences are parsed
-                         from the end towards the begining, while False
-                         is the other way around.
-    :param mode: See ``scan``.
-    :param name: See ``scan``.
+    Parameters
+    ----------
+    fn
+        The function that ``reduce`` applies at each iteration step
+        (see ``scan`` for more info).
+    sequences
+        List of sequences over which ``reduce`` iterates
+        (see ``scan`` for more info).
+    outputs_info
+        List of dictionaries describing the outputs of
+        reduce (see ``scan`` for more info).
+    non_sequences
+        List of arguments passed to ``fn``. ``reduce`` will
+        not iterate over these arguments (see ``scan`` for
+        more info).
+    go_backwards : bool
+        Decides the direction of iteration. True means that sequences are parsed
+        from the end towards the begining, while False is the other way around.
+    mode
+        See ``scan``.
+    name
+        See ``scan``.
    """
    rval = scan(fn=fn,
                sequences=sequences,
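The difference from `map` is that `reduce` threads an accumulator through the iterations and returns only the final state. The sketch below is a hypothetical plain-Python model of that behaviour (the argument order mirrors scan's convention of passing the sequence element before the prior result; the real wrapper is symbolic and may differ in detail).

```python
def reduce_sketch(fn, sequence, outputs_info, go_backwards=False):
    """Plain-Python model of theano.reduce (illustrative only)."""
    # outputs_info supplies the initial accumulator state.
    acc = outputs_info
    seq = reversed(sequence) if go_backwards else sequence
    for x in seq:
        # Scan passes the current sequence element first, then the
        # previous output.
        acc = fn(x, acc)
    # Unlike map, only the last state is returned.
    return acc
```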
...
@@ -123,25 +126,27 @@ def foldl(fn,
          mode=None,
          name=None):
    """
-    Similar behaviour as haskell's foldl
+    Similar behaviour as haskell's foldl.

-    :param fn: The function that ``foldl`` applies at each iteration step
-               (see ``scan`` for more info).
-    :param sequences: List of sequences over which ``foldl`` iterates
-                      (see ``scan`` for more info)
-    :param outputs_info: List of dictionaries describing the outputs of
-                         reduce (see ``scan`` for more info).
-    :param non_sequences: List of arguments passed to `fn`. ``foldl`` will
-                          not iterate over these arguments (see ``scan`` for
-                          more info).
-    :param mode: See ``scan``.
-    :param name: See ``scan``.
+    Parameters
+    ----------
+    fn
+        The function that ``foldl`` applies at each iteration step
+        (see ``scan`` for more info).
+    sequences
+        List of sequences over which ``foldl`` iterates
+        (see ``scan`` for more info).
+    outputs_info
+        List of dictionaries describing the outputs of reduce
+        (see ``scan`` for more info).
+    non_sequences
+        List of arguments passed to `fn`. ``foldl`` will not iterate over
+        these arguments (see ``scan`` for more info).
+    mode
+        See ``scan``.
+    name
+        See ``scan``.
    """
    return reduce(fn=fn,
                  sequences=sequences,
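Since `foldl` is implemented as a call to `reduce`, it is a left fold: the sequence is consumed front to back, so `foldl(f, [x0, x1, x2], a)` computes `f(x2, f(x1, f(x0, a)))`. A hypothetical plain-Python sketch of that semantics:

```python
def foldl_sketch(fn, sequence, acc):
    """Left fold: consume the sequence front-to-back (illustrative only)."""
    for x in sequence:
        acc = fn(x, acc)
    return acc
```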
...
@@ -160,25 +165,27 @@ def foldr(fn,
          mode=None,
          name=None):
    """
-    Similar behaviour as haskell' foldr
+    Similar behaviour as haskell' foldr.

-    :param fn: The function that ``foldr`` applies at each iteration step
-               (see ``scan`` for more info).
-    :param sequences: List of sequences over which ``foldr`` iterates
-                      (see ``scan`` for more info)
-    :param outputs_info: List of dictionaries describing the outputs of
-                         reduce (see ``scan`` for more info).
-    :param non_sequences: List of arguments passed to `fn`. ``foldr`` will
-                          not iterate over these arguments (see ``scan`` for
-                          more info).
-    :param mode: See ``scan``.
-    :param name: See ``scan``.
+    Parameters
+    ----------
+    fn
+        The function that ``foldr`` applies at each iteration step
+        (see ``scan`` for more info).
+    sequences
+        List of sequences over which ``foldr`` iterates
+        (see ``scan`` for more info).
+    outputs_info
+        List of dictionaries describing the outputs of reduce
+        (see ``scan`` for more info).
+    non_sequences
+        List of arguments passed to `fn`. ``foldr`` will not iterate over these
+        arguments (see ``scan`` for more info).
+    mode
+        See ``scan``.
+    name
+        See ``scan``.
    """
    return reduce(fn=fn,
                  sequences=sequences,
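`foldr` differs from `foldl` only in direction: it is a `reduce` over the reversed sequence, i.e. a right fold. A hypothetical plain-Python sketch:

```python
def foldr_sketch(fn, sequence, acc):
    """Right fold: consume the sequence back-to-front, i.e. reduce with
    go_backwards=True (illustrative only)."""
    for x in reversed(sequence):
        acc = fn(x, acc)
    return acc
```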
...