testgroup / pytensor · Commits · 690d3628

Commit 690d3628, authored Aug 19, 2015 by abergeron

Merge pull request #3301 from harlouci/numpydoc_compile

Numpydoc compile

Parents: e8ecd0fc, bed6f019

Showing 13 changed files with 1226 additions and 863 deletions (+1226 −863)
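The theme of this commit is converting old `:param:`/`:type:`-style docstrings into numpydoc sections (`Parameters`, `Returns`, `Notes`). The pattern can be sketched on a small hypothetical function (not part of the diff itself):

```python
def scale(x, factor=2):
    """
    Multiply a value by a constant factor.

    Parameters
    ----------
    x : float
        The value to scale.
    factor : float
        The multiplier (default 2).

    Returns
    -------
    float
        The scaled value.
    """
    return x * factor
```

The section headers and dashed underlines are what numpydoc (and Sphinx with the numpydoc extension) parses into structured documentation; that is the transformation applied file by file below.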
theano/compile/builders.py          +14  −10
theano/compile/debugmode.py         +298 −149
theano/compile/function.py          +68  −75
theano/compile/function_module.py   +159 −106
theano/compile/io.py                +103 −112
theano/compile/mode.py              +40  −22
theano/compile/monitormode.py       +22  −22
theano/compile/nanguardmode.py      +22  −11
theano/compile/ops.py               +151 −95
theano/compile/pfunc.py             +135 −145
theano/compile/profilemode.py       +46  −24
theano/compile/profiling.py         +93  −40
theano/compile/sharedvalue.py       +75  −52
theano/compile/builders.py
@@ -10,10 +10,11 @@ from functools import reduce
 class OpFromGraph(gof.Op):
-    """This creates an `Op` from inputs and outputs lists of variables.
+    """
+    This creates an `Op` from inputs and outputs lists of variables.
     The signature is similar to theano.function() and the resulting
-    `Op`'s perform will do the same operation as::
+    `Op`'s perform will do the same operation as:
         orig_function(inputs, outputs, **kwargs)
@@ -31,11 +32,15 @@ class OpFromGraph(gof.Op):
     - Add support to pickle this Op.
     - Add support/test with random generator
-    :note:
-        - We support shared variables in the inner graph. This is automatic and
-          invisible to the user. They can be as input to the node or in the
-          inner graph.
-        - We support unused inputs. This is needed for the grad.
+    Notes
+    -----
+    - We support shared variables in the inner graph. This is automatic and
+      invisible to the user. They can be as input to the node or in the
+      inner graph.
+    - We support unused inputs. This is needed for the grad.
+
+    Examples
+    --------
     Example 1:
@@ -49,8 +54,6 @@ class OpFromGraph(gof.Op):
     e2 = op(x, y, z) + op(z, y, x)
     fn = function([x, y, z], [e2])
     Example 2 with shared variable:
     .. code-block:: python
@@ -139,7 +142,8 @@ class OpFromGraph(gof.Op):
     def connection_pattern(self, node):
         """
-        Return connection pattern of subfgraph defined by inputs and outputs
+        Return connection pattern of subfgraph defined by inputs and outputs.
         """
         return io_connection_pattern(self.new_inputs, self.new_outputs)
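`io_connection_pattern` computes, for each (input, output) pair, whether the output depends on that input. A minimal pure-Python stand-in makes the idea concrete; the tuple/dict graph encoding here is hypothetical, not Theano's Variable/Apply API:

```python
def connection_pattern(inputs, outputs, parents):
    """For each input i and output o, True iff o is reachable from i.

    parents maps a node name to the node names it is computed from.
    """
    def depends_on(node, inp):
        if node == inp:
            return True
        return any(depends_on(p, inp) for p in parents.get(node, ()))
    return [[depends_on(o, i) for o in outputs] for i in inputs]

# z = x + y, w = y * 2
parents = {"z": ("x", "y"), "w": ("y",)}
pattern = connection_pattern(["x", "y"], ["z", "w"], parents)
```

Here `pattern[i][o]` mirrors the boolean matrix shape that `connection_pattern` returns: row per input, column per output.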
theano/compile/debugmode.py
浏览文件 @
690d3628
"""Provides `DebugMode`, an evaluation mode for debugging theano internals.
"""
Provides `DebugMode`, an evaluation mode for debugging theano internals.
:
TODO: add support for IfElse Op, LazyLinker, PureOp, etc.
TODO: add support for IfElse Op, LazyLinker, PureOp, etc.
"""
"""
from
__future__
import
print_function
from
__future__
import
print_function
...
@@ -123,7 +124,11 @@ _logger.addFilter(NoDuplicateOptWarningFilter())
 #
 ########################
 class DebugModeError(Exception):
-    """Generic Exception raised to indicate an internal theano problem"""
+    """
+    Generic Exception raised to indicate an internal theano problem.
+    """
     pass
@@ -135,21 +140,30 @@ class BadThunkOutput(DebugModeError):
     do not agree, or if one of these methods do not give the same result
     when called twice with the same inputs (but different memory layouts
     for the output).
     """
     r = None
-    """The `Variable` instance for which conflicting values were computed"""
+    """
+    The `Variable` instance for which conflicting values were computed.
+    """
     thunk1 = ''
     val1 = None
-    """The value computed by `thunk1`"""
+    """
+    The value computed by `thunk1`.
+    """
     thunk2 = ''
     val2 = None
-    """The value computed by `thunk2`"""
+    """
+    The value computed by `thunk2`.
+    """

     def __init__(self, r, thunk1, val1, thunk2, val2, inputs_val=()):
-        """Initialize members"""
         super(BadThunkOutput, self).__init__()
         self.r = r
         self.thunk1 = thunk1
@@ -159,16 +173,22 @@ class BadThunkOutput(DebugModeError):
         self.inputs_val = inputs_val

     def offending_op(self):
-        """Return the Op class whose c_code and perform
-        implementations didn't match"""
+        """
+        Return the Op class whose c_code and perform implementations
+        didn't match.
+        """
         return type(self.r.owner.op)

     def __str__(self):
         return self.str_diagnostic()

     def str_diagnostic(self):
-        """Return a pretty multiline string representating the cause
-        of the exception"""
+        """
+        Return a pretty multiline string representing the cause of
+        the exception.
+        """
         sio = StringIO()
         print("BadThunkOutput", file=sio)
         print(" Apply :", self.r.owner, file=sio)
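`str_diagnostic` builds its report by printing into a `StringIO` buffer, as the hunk above shows. A self-contained sketch of the same pattern, with simplified fields rather than the full exception class:

```python
from io import StringIO

def str_diagnostic(apply_repr, val1, val2):
    # Accumulate a multiline report in a string buffer, as DebugMode's
    # exception classes do, then return it as one string.
    sio = StringIO()
    print("BadThunkOutput", file=sio)
    print("  Apply :", apply_repr, file=sio)
    print("  val1  :", val1, file=sio)
    print("  val2  :", val2, file=sio)
    return sio.getvalue()
```

Using `print(..., file=sio)` keeps the formatting code identical whether the destination is a buffer or stderr.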
@@ -202,41 +222,61 @@ class BadThunkOutput(DebugModeError):
 class BadOptimization(DebugModeError):
-    """Exception: some variable and its substitute take different
-    runtime values.
+    """
+    Exception: some variable and its substitute take different runtime values.
     """
     new_r = None
-    """A `Variable` instance that took a different value from `old_r`,
-    but which replaced `old_r`."""
+    """
+    A `Variable` instance that took a different value from `old_r`,
+    but which replaced `old_r`.
+    """
     old_r = None
-    """A `Variable` instance that was replaced by `new_r`."""
+    """
+    A `Variable` instance that was replaced by `new_r`.
+    """
     old_r_val = None
-    """The value computed for `old_r`."""
+    """
+    The value computed for `old_r`.
+    """
     new_r_val = None
-    """The value computed for `new_r`."""
+    """
+    The value computed for `new_r`.
+    """
     reason = None
-    """An object that indicates why old_r was turned into new_r.
+    """
+    An object that indicates why old_r was turned into new_r.
     Convention is that this is the name of the optimization that
     requested the replacement.
     """
     old_graph = ""
-    """A multiline string representation of the graph leading to
-    old_r, at the time of the replacement."""
+    """
+    A multiline string representation of the graph leading to
+    old_r, at the time of the replacement.
+    """
     new_graph = ""
-    """A multiline string representation of the graph leading to
-    new_r, at the time of the replacement."""
+    """
+    A multiline string representation of the graph leading to
+    new_r, at the time of the replacement.
+    """

     def __init__(self, old_r, new_r, old_r_val, new_r_val, reason,
                  old_graph, new_graph):
-        """Initialize members"""
         super(BadOptimization, self).__init__()
         self.old_r = old_r
         self.new_r = new_r
@@ -250,8 +290,11 @@ class BadOptimization(DebugModeError):
         return self.str_diagnostic()

     def str_diagnostic(self):
-        """Return a pretty multiline string representating the cause
-        of the exception"""
+        """
+        Return a pretty multiline string representating the cause
+        of the exception.
+        """
         sio = StringIO()
         val_str_len_limit = 800
         print("BadOptimization Error", super(BadOptimization,
@@ -340,8 +383,11 @@ class BadOptimization(DebugModeError):
 class BadDestroyMap(DebugModeError):
-    """Exception: Some perform() or c_code() modified an input that
-    wasn't in the destroy_map"""
+    """
+    Exception: Some perform() or c_code() modified an input that
+    wasn't in the destroy_map.
+    """
     def __init__(self, node, idx, old_val, new_val, perform):
         super(BadDestroyMap, self).__init__()
         self.node = node
@@ -395,8 +441,12 @@ class BadDestroyMap(DebugModeError):
 class BadViewMap(DebugModeError):
-    """Exception: Some perform() or c_code() created a memory alias
-    that wasn't in the view_map"""
+    """
+    Exception: Some perform() or c_code() created a memory alias
+    that wasn't in the view_map.
+    """
     def __init__(self, node, output_idx, out_storage,
                  in_alias_idx=None, out_alias_idx=None):
         super(BadViewMap, self).__init__()
@@ -426,7 +476,8 @@ class BadViewMap(DebugModeError):
 class StochasticOrder(DebugModeError):
-    """Exception: Repeated Optimizations of the same graph do not give
+    """
+    Exception: Repeated Optimizations of the same graph do not give
     identical results.
     The most common cause is that an Optimization iterates over some
@@ -440,8 +491,12 @@ class StochasticOrder(DebugModeError):
 class InvalidValueError(DebugModeError):
-    """Exception: some Op an output value that is inconsistent with
-    the Type of that output"""
+    """
+    Exception: some Op an output value that is inconsistent with
+    the Type of that output.
+    """
     def __init__(self, r, v, client_node=None, hint='none',
                  specific_hint='none'):
         super(InvalidValueError, self).__init__()
@@ -498,8 +553,11 @@ class InvalidValueError(DebugModeError):
 def char_from_number(number):
-    """ Converts number to string by rendering it in base 26 using
-    capital letters as digits """
+    """
+    Converts number to string by rendering it in base 26 using
+    capital letters as digits.
+    """
     base = 26
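The base-26 rendering described by this docstring can be sketched directly. This mirrors the documented behavior (capital letters as digits), not necessarily Theano's exact implementation:

```python
def char_from_number(number):
    # Render a non-negative integer in base 26 using capital letters:
    # 0 -> 'A', 25 -> 'Z', 26 -> 'BA', ...
    base = 26
    rval = ""
    if number == 0:
        rval = "A"
    while number != 0:
        number, remainder = divmod(number, base)
        rval = chr(ord("A") + remainder) + rval
    return rval
```

DebugMode uses such identifiers to give each variable a short stable label when printing graphs.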
@@ -523,31 +581,45 @@ def debugprint(r, prefix='', depth=-1, done=None, print_type=False,
                stop_on_name=False, prefix_child=None,
                scan_ops=None, profile=None,
                scan_inner_to_outer_inputs=None):
-    """Print the graph leading to `r` to given depth.
-    :param r: Variable instance
-    :param prefix: prefix to each line (typically some number of spaces)
-    :param depth: maximum recursion depth (Default -1 for unlimited).
-    :param done: dict of Apply instances that have already been printed
-                 and their associated printed ids
-    :param print_type: whether to print the Variable type after the other infos
-    :param file: file-like object to which to print
-    :param print_destroy_map: whether to print the op destroy_map after
-                              other info
-    :param print_view_map: whether to print the op view_map after other info
-    :param order: If not empty will print the index in the toposort.
-    :param ids: How do we print the identifier of the variable
-                id - print the python id value
-                int - print integer character
-                CHAR - print capital character
-                "" - don't print an identifier
-    :param stop_on_name: When True, if a node in the graph has a name,
-                         we don't print anything below it.
-    :param scan_ops: Scan ops in the graph will be added inside this list
-                     for later printing purposes.
-    :param scan_inner_to_outer_inputs: a dictionary mapping a scan ops
-        inner function inputs to the scan op inputs (outer inputs) for
-        printing purposes.
+    """
+    Print the graph leading to `r` to given depth.
+
+    Parameters
+    ----------
+    r
+        Variable instance.
+    prefix
+        Prefix to each line (typically some number of spaces).
+    depth
+        Maximum recursion depth (Default -1 for unlimited).
+    done
+        dict of Apply instances that have already been printed and their
+        associated printed ids.
+    print_type
+        Whether to print the Variable type after the other infos.
+    file
+        File-like object to which to print.
+    print_destroy_map
+        Whether to print the op destroy_map after other info.
+    print_view_map
+        Whether to print the op view_map after other info.
+    order
+        If not empty will print the index in the toposort.
+    ids
+        How do we print the identifier of the variable :
+        id - print the python id value,
+        int - print integer character,
+        CHAR - print capital character,
+        "" - don't print an identifier.
+    stop_on_name
+        When True, if a node in the graph has a name, we don't print anything
+        below it.
+    scan_ops
+        Scan ops in the graph will be added inside this list for later printing
+        purposes.
+    scan_inner_to_outer_inputs
+        A dictionary mapping a scan ops inner function inputs to the scan op
+        inputs (outer inputs) for printing purposes.
     """
     if depth == 0:
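The `prefix` and `depth` parameters documented above drive a simple recursive printer: each recursion level extends the prefix and decrements the depth, and `depth == 0` cuts the recursion off (so the default -1 never terminates it). A pure-Python sketch with a hypothetical tuple-based graph encoding (not Theano's Variable/Apply types):

```python
def debug_print(expr, prefix="", depth=-1, lines=None):
    # expr: ("op_name", child, child, ...) tuple, or a leaf variable name.
    if lines is None:
        lines = []
    if depth == 0:          # mirrors `if depth == 0:` in the hunk above
        return lines
    if isinstance(expr, tuple):
        lines.append(prefix + expr[0])
        for child in expr[1:]:
            debug_print(child, prefix + " |", depth - 1, lines)
    else:
        lines.append(prefix + expr)
    return lines
```

With `depth=1` only the root line is emitted, which is exactly how `debugprint` truncates deep graphs.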
@@ -712,17 +784,24 @@ def debugprint(r, prefix='', depth=-1, done=None, print_type=False,
 def _optcheck_fgraph(input_specs, output_specs, accept_inplace=False):
-    """Create an FunctionGraph for debugging.
-    :param input_specs: fgraph inputs
-    :type input_specs: WRITEME
-    :param output_specs: fgraph outputs
-    :type output_specs: WRITEME
-    :param accept_inplace: are inplace ops permitted in the original graph?
-    :type accept_inplace: Bool
-    :rtype: `FunctionGraph`
-    :returns: a new FunctionGraph with a cloned graph, with debugging
-              `Feature` instances already installed.
+    """
+    Create a FunctionGraph for debugging.
+
+    Parameters
+    ----------
+    input_specs: WRITEME
+        fgraph inputs.
+    output_specs: WRITEME
+        fgraph outputs.
+    accept_inplace : bool
+        Are inplace ops permitted in the original graph?
+
+    Returns
+    -------
+    FunctionGraph
+        A new FunctionGraph with a cloned graph, with debugging `Feature`
+        instances already installed.
     """
     orig_inputs = [spec.variable for spec in input_specs]
     updates = [spec.update for spec in input_specs if spec.update]
@@ -784,7 +863,8 @@ def check_eq(var, val1, val2):
 def _check_inputs(node, storage_map, r_vals, dr_vals, active_nodes,
                   clobber_dr_vals=True,
                   perform=None, warn_input_not_reused=True):
-    """Raise BadDestroyMap if necessary, update dr_vals
+    """
+    Raise BadDestroyMap if necessary, update dr_vals.
     Returns a list of output variables that actually worked inplace
     (their value is aliased to the value of at least one input).
...
@@ -871,10 +951,11 @@ def _check_viewmap(node, storage_map):
"""
"""
This functions raises a BadViewMap exception when it detects the
This functions raises a BadViewMap exception when it detects the
following:
following:
- output node storages aliased to input storage, with no declaration
- Output node storages aliased to input storage, with no declaration
in view_map
in view_map.
- if not aliased to an input, check if two outputs are aliased together
- If not aliased to an input, check if two outputs are aliased together
and used subsequently in the graph
and used subsequently in the graph.
"""
"""
for
oi
,
onode
in
enumerate
(
node
.
outputs
):
for
oi
,
onode
in
enumerate
(
node
.
outputs
):
...
@@ -937,14 +1018,24 @@ def _check_viewmap(node, storage_map):
...
@@ -937,14 +1018,24 @@ def _check_viewmap(node, storage_map):
def
_is_used_in_graph
(
var
):
def
_is_used_in_graph
(
var
):
"""
"""
Returns True if `var` is used by another node in the graph
Returns
-------
bool
True if `var` is used by another node in the graph.
"""
"""
return
not
(
var
.
clients
==
[(
'output'
,
1
)]
or
var
.
clients
==
[])
return
not
(
var
.
clients
==
[(
'output'
,
1
)]
or
var
.
clients
==
[])
def
_check_strides_match
(
a
,
b
,
warn_err
,
op
):
def
_check_strides_match
(
a
,
b
,
warn_err
,
op
):
"""
"""
param: warn_err: if 0, no warning, if 1 warning, if 2 error
Parameters
----------
warn_err
If 0, no warning, if 1 warning, if 2 error.
"""
"""
if
warn_err
==
0
:
if
warn_err
==
0
:
return
return
...
@@ -965,12 +1056,20 @@ def _check_strides_match(a, b, warn_err, op):
...
@@ -965,12 +1056,20 @@ def _check_strides_match(a, b, warn_err, op):
def
_lessbroken_deepcopy
(
a
):
def
_lessbroken_deepcopy
(
a
):
"""
"""
:param a: any object
Returns a copy of `a` that shares no internal storage with the original
Parameters
(a deep copy).
----------
This function handles numpy arrays specially, because copy.deepcopy()
a
called on a 0-d array will return a numpy scalar, not an array.
Any object
Returns
-------
object
A copy of `a` that shares no internal storage with the original
(a deep copy). This function handles numpy arrays specially, because
copy.deepcopy() called on a 0-d array will return a numpy scalar,
not an array.
"""
"""
# this exists because copy.deepcopy on numpy arrays is broken
# this exists because copy.deepcopy on numpy arrays is broken
# This logic is also in link.py
# This logic is also in link.py
...
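The contract described here — a copy sharing no internal storage with the original — is easy to check for plain Python containers; this sketch omits the numpy 0-d special case the comment mentions:

```python
import copy

def less_broken_deepcopy(a):
    # A plain deep copy: the result shares no internal storage with `a`.
    # (Theano's version additionally special-cases numpy arrays, because
    # copy.deepcopy on a 0-d array returns a numpy scalar, not an array.)
    return copy.deepcopy(a)
```

DebugMode relies on this to snapshot input values before running a thunk, so later mutation checks compare against untouched data.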
@@ -990,13 +1089,15 @@ def _lessbroken_deepcopy(a):
 def _find_bad_optimizations0(order, reasons, r_vals):
-    """Use a simple algorithm to find broken optimizations.
+    """
+    Use a simple algorithm to find broken optimizations.
+
     This algorithm is simple to understand, but sometimes when there's
     a problem it identifies the wrong optimization as the culprit.
     The problem stems from the fact that results are not evaluated in
     chronological order (looking at when they were introduced to the
     graph).
     """
     # iterate over variables looking for values that don't match the
     # values of the variables they replaced. This is the sign of a
...
@@ -1078,19 +1179,24 @@ def _find_bad_optimizations1(order, reasons, r_vals):
def
_find_bad_optimizations2
(
order
,
reasons
,
r_vals
):
def
_find_bad_optimizations2
(
order
,
reasons
,
r_vals
):
"""Use a simple algorithm to find broken optimizations.
"""
Use a simple algorithm to find broken optimizations.
This algorithm is simple to understand, but sometimes when there's
This algorithm is simple to understand, but sometimes when there's
a problem it identifies the wrong optimization as the culprit.
a problem it identifies the wrong optimization as the culprit.
The problem stems from the fact that results are not evaluated in
The problem stems from the fact that results are not evaluated in
chronological order (looking at when they were introduced to the
chronological order (looking at when they were introduced to the
graph).
graph).
"""
"""
checked_variables
=
set
()
checked_variables
=
set
()
def
check_variable_norec
(
new_r
):
def
check_variable_norec
(
new_r
):
"""Verify that `r` has the same value as the results it replaces """
"""
Verify that `r` has the same value as the results it replaces.
"""
for
reason
,
r
,
old_graph_str
,
new_graph_str
in
reasons
[
new_r
]:
for
reason
,
r
,
old_graph_str
,
new_graph_str
in
reasons
[
new_r
]:
new_r_val
=
r_vals
[
new_r
]
new_r_val
=
r_vals
[
new_r
]
r_val
=
r_vals
[
r
]
r_val
=
r_vals
[
r
]
...
@@ -1134,7 +1240,10 @@ _find_bad_optimizations = _find_bad_optimizations0
...
@@ -1134,7 +1240,10 @@ _find_bad_optimizations = _find_bad_optimizations0
def
_get_preallocated_maps
(
node
,
thunk
,
prealloc_modes
,
def_val
,
def
_get_preallocated_maps
(
node
,
thunk
,
prealloc_modes
,
def_val
,
storage_map
,
r_vals
,
dr_vals
,
perform
,
storage_map
,
r_vals
,
dr_vals
,
perform
,
active_order_set
,
inplace_outs
,
init_outputs
):
active_order_set
,
inplace_outs
,
init_outputs
):
'''Preallocate outputs in different memory layouts'''
"""
Preallocate outputs in different memory layouts.
"""
# To avoid circular imports
# To avoid circular imports
from
theano.tensor
import
TensorType
from
theano.tensor
import
TensorType
...
@@ -1357,7 +1466,10 @@ def _get_preallocated_maps(node, thunk, prealloc_modes, def_val,
...
@@ -1357,7 +1466,10 @@ def _get_preallocated_maps(node, thunk, prealloc_modes, def_val,
def
_check_preallocated_output
(
node
,
thunk
,
prealloc_modes
,
def_val
,
def
_check_preallocated_output
(
node
,
thunk
,
prealloc_modes
,
def_val
,
storage_map
,
r_vals
,
dr_vals
,
perform
,
storage_map
,
r_vals
,
dr_vals
,
perform
,
active_order_set
,
inplace_outs
,
init_outputs
):
active_order_set
,
inplace_outs
,
init_outputs
):
'''Try to apply thunk() on different output storages'''
"""
Try to apply thunk() on different output storages.
"""
# If node has an inner compiled Theano function with mode DebugMode,
# If node has an inner compiled Theano function with mode DebugMode,
# disable memory checks in that mode, since they were already run.
# disable memory checks in that mode, since they were already run.
...
@@ -1460,26 +1572,40 @@ def _check_preallocated_output(node, thunk, prealloc_modes, def_val,
 class _FunctionGraphEvent(object):
-    """A record of an event in the life of an FunctionGraph.
+    """
+    A record of an event in the life of an FunctionGraph.
     The __eq__ function is important here, as it is the basis for
     comparing optimization runs.
     """
     kind = ""
-    """One of 'import', 'change', 'prune'"""
+    """
+    One of 'import', 'change', 'prune'.
+    """
     node = None
-    """Either 'output' or an Apply instance"""
+    """
+    Either 'output' or an Apply instance.
+    """
     op = None
     """Either 'output' or an Op instance"""
     idx = None
-    """change events involve an position index of the input variable"""
+    """
+    Change events involve an position index of the input variable.
+    """
     reason = None
-    """change events sometimes have a reason"""
+    """
+    Change events sometimes have a reason.
+    """

     def __init__(self, kind, node, idx=None, reason=None):
         self.kind = kind
...
@@ -1522,8 +1648,11 @@ class _FunctionGraphEvent(object):
class
_VariableEquivalenceTracker
(
object
):
class
_VariableEquivalenceTracker
(
object
):
"""A FunctionGraph Feature that keeps tabs on an FunctionGraph and
"""
tries to detect problems."""
A FunctionGraph Feature that keeps tabs on an FunctionGraph and
tries to detect problems.
"""
fgraph
=
None
fgraph
=
None
"""WRITEME"""
"""WRITEME"""
...
@@ -1675,7 +1804,11 @@ class _DummyLinker(object):
...
@@ -1675,7 +1804,11 @@ class _DummyLinker(object):
class
_Linker
(
gof
.
link
.
LocalLinker
):
class
_Linker
(
gof
.
link
.
LocalLinker
):
"""Special debugging linker"""
"""
Special debugging linker.
"""
def
__init__
(
self
,
maker
,
schedule
=
None
):
def
__init__
(
self
,
maker
,
schedule
=
None
):
super
(
gof
.
LocalLinker
,
self
)
.
__init__
()
super
(
gof
.
LocalLinker
,
self
)
.
__init__
()
self
.
fgraph
=
None
self
.
fgraph
=
None
...
@@ -2236,11 +2369,39 @@ _NODEFAULT = ['NODEFAULT']
 class _Maker(FunctionMaker):  # inheritance buys a few helper functions
-    """Special debugging FunctionMaker
+    """
+    Special debugging FunctionMaker.
+
+    Parameters
+    ----------
+    inputs : list of SymbolicInput instances
+    outputs : list of SymbolicOutput instances
+        Outputs may also be a single Variable (not a list), in which case
+        the functions produced by FunctionMaker will return their output
+        value directly.
+    accept_inplace
+        True iff it is acceptable to have inplace operations in the graph from
+        the inputs to the outputs.
+    on_unused_input
+        What to do if a variable in the 'inputs' list is not used in the
+        graph. Possible values are 'raise', 'warn' and 'ignore'.
+    output_keys
+        If the outputs argument for theano.function was a list, then
+        output_keys is None. If the outputs argument was a dict, then
+        output_keys is a sorted list of the keys from that dict.
+
+    Notes
+    -----
+    The constructor sets TensorType.filter_checks_isfinite when
+    `mode.check_isfinite` is True.
     """
     verbose = 0
-    """Verbosity level of compile-time and run-time checks. (Default
-    0: silent)"""
+    """
+    Verbosity level of compile-time and run-time checks. (Default 0: silent).
+    """

     def __init__(self, inputs, outputs, optimizer, mode,
                  accept_inplace=False,
...
@@ -2248,33 +2409,6 @@ class _Maker(FunctionMaker): # inheritance buys a few helper functions
profile
=
None
,
profile
=
None
,
on_unused_input
=
None
,
on_unused_input
=
None
,
output_keys
=
None
):
output_keys
=
None
):
"""
:type inputs: a list of SymbolicInput instances
:type outputs: a list of SymbolicOutput instances outputs may
also be a single Variable (not a list), in
which case the functions produced by
FunctionMaker will return their output value
directly
:param accept_inplace: True iff it is acceptable to have
inplace operations in the graph from the inputs to
the outputs
:param on_unused_input: What to do if a variable in the
'inputs' list is not used in the
graph. Possible values are 'raise',
'warn', and 'ignore'.
:param output_keys: If the outputs argument for
theano.function was a list, then
output_keys is None. If the outputs
argument was a dict, then output_keys is a
sorted list of the keys from that dict.
:note: this function sets TensorType.filter_checks_isfinite
when `mode.check_isfinite` is True
"""
self
.
profile
=
profile
self
.
profile
=
profile
# Handle the case where inputs and/or outputs is a single
# Handle the case where inputs and/or outputs is a single
# Variable (not in a list)
# Variable (not in a list)
...
@@ -2395,11 +2529,15 @@ class _Maker(FunctionMaker): # inheritance buys a few helper functions
...
@@ -2395,11 +2529,15 @@ class _Maker(FunctionMaker): # inheritance buys a few helper functions
"""
"""
Create a function.
Create a function.
defaults -> a list matching the inputs list and providing default
Parameters
values if the default for an input is None, then that input
----------
is a required input. For an input with an update, the
defaults
default acts as initialization.
A list matching the inputs list and providing default values if the
trustme -> disables some exceptions, used internally
default for an input is None, then that input is a required input.
For an input with an update, the default acts as initialization.
trustme
Disables some exceptions, used internally.
"""
"""
if
defaults
is
None
:
if
defaults
is
None
:
defaults
=
[
None
]
*
len
(
self
.
inputs
)
defaults
=
[
None
]
*
len
(
self
.
inputs
)
...
@@ -2514,35 +2652,40 @@ copyreg.pickle(_Maker, _pickle_DebugMode_Maker)
...
@@ -2514,35 +2652,40 @@ copyreg.pickle(_Maker, _pickle_DebugMode_Maker)
class
DebugMode
(
Mode
):
class
class DebugMode(Mode):
    """
    Evaluation Mode that detects internal theano errors.

    This mode catches several kinds of internal error:

    - Inconsistent outputs when calling the same Op twice with the same
      inputs, for instance if c_code and perform implementations are
      inconsistent, or in case of incorrect handling of output memory
      (see `BadThunkOutput`).
    - A variable replacing another when their runtime values don't
      match. This is a symptom of an incorrect optimization step, or
      faulty Op implementation (raises `BadOptimization`).
    - Stochastic optimization ordering (raises `StochasticOrder`).
    - Incomplete `destroy_map` specification (raises `BadDestroyMap`).
    - An op that returns an illegal value not matching the output
      Variable Type (raises InvalidValueError).

    Each of these exceptions inherits from the more generic `DebugModeError`.

    If there are no internal errors, this mode behaves like FAST_RUN
    or FAST_COMPILE, but takes a little longer and uses more memory.

    Raises
    ------
    DebugModeError
        If there are internal errors.

    Notes
    -----
    The work of debugging is implemented by the `_Maker`, `_Linker`,
    and `_VariableEquivalenceTracker` classes.

    """
...
@@ -2551,22 +2694,26 @@ class DebugMode(Mode):
    """
    When checking for the stability of optimization, recompile the
    graph this many times.
    """

    check_c_code = config.DebugMode.check_c
    """
    Should we evaluate (and check) the `c_code` implementations?
    """

    check_py_code = config.DebugMode.check_py
    """
    Should we evaluate (and check) the `perform` implementations?
    Always checked if no `c_code`.
    """

    check_isfinite = config.DebugMode.check_finite
    """
    Should we check for (and complain about) NaN/Inf ndarray elements?
    """

    require_matching_strides = config.DebugMode.check_strides
...
@@ -2574,6 +2721,7 @@ class DebugMode(Mode):
    Should we check for (and complain about) Ops whose python and C
    outputs are ndarrays with different strides? (This can catch bugs,
    but is generally overly strict.) 0 no check, 1 warn, 2 err.
    """

    check_preallocated_output = config.DebugMode.check_preallocated_output
...
@@ -2584,13 +2732,15 @@ class DebugMode(Mode):
    "c_contiguous", "f_contiguous", "strided" (positive and negative
    strides), "wrong_size" (larger and smaller dimensions), and "ALL"
    (all of the above).
    """

    # This function will be used to create a FunctionMaker in
    # function_module.function
    def function_maker(self, i, o, m, *args, **kwargs):
        """
        Return an instance of `_Maker` which handles much of the debugging work.
        """
        assert m is self
        return _Maker(i, o, self.optimizer, self, *args, **kwargs)
...
@@ -2604,12 +2754,11 @@ class DebugMode(Mode):
                 check_preallocated_output=None,
                 require_matching_strides=None,
                 linker=_DummyLinker()):
        """
        If any of these arguments (except optimizer) is not None, it overrides
        the class default.
        The linker argument is not used. It is set there to allow
        Mode.requiring() and some other functions to work with DebugMode too.
        """
        if not isinstance(linker, _DummyLinker):
...
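The consistency checks DebugMode performs amount to running every available implementation of an op on the same inputs and comparing the results. A minimal pure-Python sketch of that idea (the names `debug_apply`, `py_add`, and `c_like_add` are illustrative stand-ins, not Theano internals):

```python
def debug_apply(implementations, inputs, rtol=1e-5):
    """Run every implementation of an op on the same inputs and check
    that they agree, mimicking DebugMode's BadThunkOutput check."""
    results = [impl(*inputs) for impl in implementations]
    reference = results[0]
    for r in results[1:]:
        # All implementations (e.g. perform and c_code) must produce
        # the same value, up to a small relative tolerance.
        if abs(r - reference) > rtol * max(abs(reference), 1.0):
            raise ValueError("inconsistent outputs: %r vs %r" % (r, reference))
    return reference

# Two "implementations" of the same op, standing in for perform and c_code.
py_add = lambda a, b: a + b
c_like_add = lambda a, b: b + a
```

When the implementations disagree, the sketch raises, just as DebugMode raises `BadThunkOutput` instead of silently returning one of the two values.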
theano/compile/function.py  View file @ 690d3628
"""Define the `function` function
"""
Define the `function` function.
"""
"""
import
six.moves.cPickle
as
pickle
import
six.moves.cPickle
as
pickle
import
logging
import
logging
...
@@ -23,8 +25,9 @@ def function_dump(filename, inputs, outputs=None, mode=None, updates=None,
...
@@ -23,8 +25,9 @@ def function_dump(filename, inputs, outputs=None, mode=None, updates=None,
no_default_updates
=
False
,
accept_inplace
=
False
,
name
=
None
,
no_default_updates
=
False
,
accept_inplace
=
False
,
name
=
None
,
rebuild_strict
=
True
,
allow_input_downcast
=
None
,
profile
=
None
,
rebuild_strict
=
True
,
allow_input_downcast
=
None
,
profile
=
None
,
on_unused_input
=
None
):
on_unused_input
=
None
):
"""This is helpful to make a reproducable case for problem during
"""
Theano compilation.
This is helpful to make a reproducable case for problem during Theano
compilation.
Ex:
Ex:
...
@@ -65,78 +68,67 @@ def function(inputs, outputs=None, mode=None, updates=None, givens=None,
...
@@ -65,78 +68,67 @@ def function(inputs, outputs=None, mode=None, updates=None, givens=None,
"""
"""
Return a callable object that will calculate `outputs` from `inputs`.
Return a callable object that will calculate `outputs` from `inputs`.
:type inputs: list of either Variable or Param instances.
Parameters
:param inputs: function parameters, these are not allowed to be shared
----------
variables
inputs : list of either Variable or Param instances.
Function parameters, these are not allowed to be shared variables.
:type outputs: list or dict of Variables or Out instances. If it is a
outputs : list or dict of Variables or Out instances.
dict, the keys must be strings
If it is a dict, the keys must be strings. Expressions to compute.
:param outputs: expressions to compute
mode : string or `Mode` instance.
Compilation mode.
:type mode: string or `Mode` instance.
updates : iterable over pairs (shared_variable, new_expression). List, tuple
:param mode: compilation mode
or OrderedDict.
Updates the values for SharedVariable inputs according to these
:type updates: iterable over pairs (shared_variable, new_expression).
expressions.
List, tuple or OrderedDict.
givens : iterable over pairs (Var1, Var2) of Variables. List, tuple or dict.
:param updates: update the values for SharedVariable inputs
The Var1 and Var2 in each pair must have the same Type.
according to these expressions
Specific substitutions to make in the computation graph (Var2 replaces
Var1).
:type givens: iterable over pairs (Var1, Var2) of Variables. List,
no_default_updates: either bool or list of Variables
tuple or dict. The Var1 and Var2 in each pair must
If True, do not perform any automatic update on Variables. If False
have the same Type.
(default), perform them all. Else, perform automatic updates on all
:param givens: specific substitutions to make in the computation
Variables that are neither in "updates" nor in "no_default_updates".
graph (Var2 replaces Var1).
name : str
An optional name for this function. The profile mode will print the time
:type no_default_updates: either bool or list of Variables
spent in this function.
:param no_default_updates: if True, do not perform any automatic
rebuild_strict : bool
update on Variables. If False (default), perform them
True (Default) is the safer and better tested setting, in which case
all. Else, perform automatic updates on all Variables that are
`givens` must substitute new variables with the same Type as the
neither in "updates" nor in "no_default_updates".
variables they replace.
False is a you-better-know-what-you-are-doing setting, that permits
:param name: an optional name for this function. The profile mode
`givens` to replace variables with new variables of any Type.
will print the time spent in this function.
The consequence of changing a Type is that all results depending on that
variable may have a different Type too (the graph is rebuilt from inputs
:param rebuild_strict: True (Default) is the safer and better
to outputs). If one of the new types does not make sense for one of the
tested setting, in which case `givens` must substitute new
Ops in the graph, an Exception will be raised.
variables with the same Type as the variables they replace.
allow_input_downcast: bool or None
False is a you-better-know-what-you-are-doing setting, that
True means that the values passed as inputs when calling the function
permits `givens` to replace variables with new variables of
can be silently downcasted to fit the dtype of the corresponding
any Type. The consequence of changing a Type is that all
Variable, which may lose precision. False means that it will only be
results depending on that variable may have a different Type
cast to a more general, or precise, type. None (default) is almost like
too (the graph is rebuilt from inputs to outputs). If one of
False, but allows downcasting of Python float scalars to floatX.
the new types does not make sense for one of the Ops in the
profile: None, True, or ProfileStats instance
graph, an Exception will be raised.
Accumulate profiling information into a given ProfileStats instance.
If argument is `True` then a new ProfileStats instance will be used.
:type allow_input_downcast: Boolean or None
This profiling object will be available via self.profile.
:param allow_input_downcast: True means that the values passed as
on_unused_input
inputs when calling the function can be silently downcasted to
What to do if a variable in the 'inputs' list is not used in the graph.
fit the dtype of the corresponding Variable, which may lose
Possible values are 'raise', 'warn', 'ignore' and None.
precision. False means that it will only be cast to a more
general, or precise, type. None (default) is almost like
Returns
False, but allows downcasting of Python float scalars to
-------
floatX.
Function instance
A callable object that will compute the outputs (given the inputs) and
:type profile: None, True, or ProfileStats instance
update the implicit function arguments according to the `updates`.
:param profile: accumulate profiling information into a given
ProfileStats instance. If argument is `True` then a new
Notes
ProfileStats instance will be used. This profiling object
-----
will be available via self.profile.
Regarding givens: Be careful to make sure that these
substitutions are independent--behaviour when Var1 of one pair
:param on_unused_input: What to do if a variable in the 'inputs'
appears in the graph leading to Var2 in another expression is
list is not used in the graph. Possible values are 'raise',
undefined. Replacements specified with givens are different
'warn', 'ignore' and None.
from optimizations in that Var2 is not expected to be
equivalent to Var1.
:rtype: Function instance
:returns: a callable object that will compute the outputs (given
the inputs) and update the implicit function arguments
according to the `updates`.
:note: Regarding givens: Be careful to make sure that these
substitutions are independent--behaviour when Var1 of one pair
appears in the graph leading to Var2 in another expression is
undefined. Replacements specified with givens are different
from optimizations in that Var2 is not expected to be
equivalent to Var1.
Internal documentation:
Internal documentation:
...
@@ -214,6 +206,7 @@ def function(inputs, outputs=None, mode=None, updates=None, givens=None,
...
@@ -214,6 +206,7 @@ def function(inputs, outputs=None, mode=None, updates=None, givens=None,
    was easier to develop the VM in Python and then translate it to C instead
    of just writing it in C from scratch.

    CVM stands for C Virtual Machine.
    """
    if isinstance(outputs, dict):
        output_items = list(outputs.items())
...
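The `updates` contract documented above, where each shared variable is replaced by the value of its update expression after every call, can be mimicked in plain Python. This sketch uses a dict as the shared state; the names `make_function`, `state`, and `step` are hypothetical, and this is not Theano's implementation:

```python
def make_function(shared, outputs, updates):
    """Mimic theano.function's update contract in plain Python:
    `shared` is a dict of state, `outputs` computes return values from it,
    and `updates` maps each state name to a function of the current state.
    After every call, all updates are applied."""
    def call():
        result = outputs(shared)
        # Compute every new value before writing any, so that all update
        # expressions see the pre-call state, as with a Theano updates dict.
        new_values = {name: expr(shared) for name, expr in updates.items()}
        shared.update(new_values)
        return result
    return call

state = {"count": 0}
step = make_function(state,
                     outputs=lambda s: s["count"] * 2,
                     updates={"count": lambda s: s["count"] + 1})
```

Calling `step` repeatedly shows the ordering: each call returns outputs computed from the pre-update state, then commits the updates.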
theano/compile/function_module.py  View file @ 690d3628
"""Driver of graph construction, optimization, and linking.
"""
Driver of graph construction, optimization, and linking.
"""
"""
from
__future__
import
print_function
from
__future__
import
print_function
...
@@ -31,13 +33,18 @@ __docformat__ = "restructuredtext en"
...
@@ -31,13 +33,18 @@ __docformat__ = "restructuredtext en"
class UnusedInputError(Exception):
    """
    A symbolic input passed to function is not needed.

    """
    pass


def alias_root(v):
    """
    Return the variable to which v is aliased by view_maps and destroy_maps.

    """
    if v.owner is None:
        return v
    vmap = getattr(v.owner.op, 'view_map', {})
...
@@ -56,8 +63,11 @@ def alias_root(v):
def view_tree_set(v, treeset):
    """
    Add to `treeset` all variables that are views of v, given that v is
    not a view.

    """
    treeset.add(v)
    for cl, v_input_pos_to_cl in v.clients:
        if cl == 'output':
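`alias_root` and `view_tree_set` both walk the view relationships in the graph: one upward to the underlying storage, the other downward to all views. A simplified sketch with plain dictionaries standing in for the `view_map` links (all names and structures here are illustrative):

```python
def alias_root(v, view_parent):
    """Follow view links upward until reaching a variable that is not
    itself a view; view_parent maps each view to the variable it views."""
    while v in view_parent:
        v = view_parent[v]
    return v

def view_tree_set(v, children):
    """Collect v and every variable that (transitively) views v;
    children maps a variable to the list of views made of it."""
    treeset = {v}
    for child in children.get(v, ()):
        treeset |= view_tree_set(child, children)
    return treeset
```

With `c` viewing `b` viewing `a`, the root of `c` is `a`, and the view tree of `a` contains all three variables.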
...
@@ -79,6 +89,7 @@ def infer_reuse_pattern(fgraph, outputs_to_disown):
    This list (or set) is also referred to as no_recycling sometimes,
    especially by linker code.

    """
    rval = set()
    for o in outputs_to_disown:
...
@@ -94,7 +105,10 @@ def fgraph_updated_vars(fgraph, expanded_inputs):
    Reconstruct the full "updates" dictionary, mapping from FunctionGraph
    input variables to the fgraph outputs that will replace their values.

    Returns
    -------
    dict variable -> variable

    """
    updated_vars = {}
    potential_values = list(fgraph.outputs)  # copy the list
...
@@ -111,7 +125,9 @@ class Supervisor:
    Listener for FunctionGraph events which makes sure that no
    operation overwrites the contents of protected Variables. The
    outputs of the FunctionGraph are protected by default.

    """

    def __init__(self, protected):
        self.protected = list(protected)
...
@@ -139,6 +155,7 @@ def std_fgraph(input_specs, output_specs, accept_inplace=False):
    The returned FunctionGraph is a clone of the graph between the provided
    inputs and outputs.

    """
    orig_inputs = [spec.variable for spec in input_specs]
    updates = [spec.update for spec in input_specs if spec.update]
...
@@ -173,7 +190,10 @@ std_fgraph.features = [gof.toolbox.PreserveNames]
class AliasedMemoryError(Exception):
    """
    Memory is aliased that should not be.

    """
    pass
...
@@ -190,19 +210,16 @@ class Function(object):
    Type of the functions returned by theano.function or
    theano.FunctionMaker.create.

    `Function` is the callable object that does computation. It has the
    storage of inputs and outputs, performs the packing and unpacking of
    inputs and return values. It implements the square-bracket indexing so
    that you can look up the value of a symbolic node.

    Functions are copyable via {{{fn.copy()}}} and {{{copy.copy(fn)}}}.
    When a function is copied, this instance is duplicated. Contrast with
    self.maker (instance of `FunctionMaker`) that is shared between copies.
    The meaning of copying a function is that the containers and their current
    values will all be duplicated. This requires that mutable inputs be
    copied, whereas immutable inputs may be shared between copies.

    A Function instance is hashable, on the basis of its memory
...
@@ -220,62 +237,93 @@ class Function(object):
    the good results if you pass a python or numpy scalar instead of a
    numpy tensor. C code should raise an error if you pass an object
    of the wrong type.

    Attributes
    ----------
    finder
    inv_finder

    """

    pickle_aliased_memory_strategy = 'warn'
    """
    How to deal with pickling finding aliased storage.

    Meaningful settings are: 'ignore', 'warn', 'raise'.

    If the value is 'warn', then a message will be printed to stderr
    if aliased storage is detected during pickle.dump.

    If the value is 'raise', then an AliasedMemoryError will be raised
    if aliased storage is detected during pickle.dump.
    """

    input_storage = None
    """
    List of Container instances.
    """

    output_storage = None
    """
    List of Container instances.
    """

    indices = None
    """
    List of (SymbolicInput|SymbolicInputKit, indices, [SymbolicInput,...]),
    one tuple for each input.

    The first tuple element is the SymbolicInput object for the corresponding
    function input.

    The second and third tuple elements are used only by Kits, which
    are deprecated.
    """

    defaults = None
    """
    List of 3-tuples, one 3-tuple for each input.

    Tuple element 0: Bool: Is this input required at each function call?
    Tuple element 1: Bool: Should this input's value be reverted after
        each call?
    Tuple element 2: Any: The value associated with this input.
    """

    unpack_single = None
    """
    Bool: for outputs lists of length 1, should the 0'th element be
    returned directly?
    """

    return_none = None
    """
    Bool: whether the function should return None or not.
    """

    maker = None
    """
    FunctionMaker instance.
    """

    fn = None
    """
    A function that evaluates the graph. Typically a linker's make_thunk
    method created this function.
    """

    finder = None
    """
    Dictionary mapping several kinds of things to containers.

    We set an entry in finder for:
...
@@ -286,21 +334,20 @@ returned directly?"""
    - the name of the input

    All entries map to the container or to DUPLICATE if an ambiguity
    is detected.
    """

    inv_finder = None
    """
    Dict. Reverse lookup of `finder`.

    It maps container -> SymbolicInput
    """

    def __init__(self, fn, input_storage, output_storage, indices, outputs,
                 defaults, unpack_single, return_none, output_keys, maker):
        """
        Initialize attributes. Create finder, inv_finder.
        """
        self.fn = fn
        self.input_storage = input_storage
        self.output_storage = output_storage
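The `finder` dictionary described in the attribute docstrings maps both an input's Variable and its name to the same container, collapsing clashing keys to a DUPLICATE marker. A sketch of that construction (the sentinel and the helper `build_finder` are hypothetical names, not Theano's internals):

```python
DUPLICATE = object()  # sentinel marking an ambiguous lookup key

def build_finder(containers):
    """Sketch of Function.finder: map each input's variable and name to
    its container, collapsing clashes to DUPLICATE as the docstring
    describes. `containers` is a list of (variable, name, container)."""
    finder = {}
    for variable, name, container in containers:
        for key in (variable, name):
            if key is None:
                continue
            # A key seen twice is ambiguous and can no longer be used
            # to look up a unique container.
            finder[key] = DUPLICATE if key in finder else container
    return finder
```

If two inputs share a name, looking the name up yields DUPLICATE, while each variable still resolves to its own container.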
...
@@ -875,13 +922,35 @@ NODEFAULT = ['NODEFAULT']
class FunctionMaker(object):
    """
    `FunctionMaker` is the class to `create` `Function` instances.

    This class has the fgraph, the optimizer, and the linker. When
    copying a `Function`, there is no need to duplicate the
    `FunctionMaker` instance. Deepcopy still copies both, which can
    result in re-compilation.

    Parameters
    ----------
    inputs : list of SymbolicInput instances
    outputs : list of SymbolicOutput instances
        Outputs may also be a single Variable (not a list), in which case the
        functions produced by FunctionMaker will return their output value
        directly.
    mode : Mode instance
        Telling FunctionMaker how to optimize and link. None means to use the
        `config.mode`.
    accept_inplace : bool
        True iff it is acceptable to have inplace operations in the graph from
        the inputs to the outputs.
    on_unused_input : {'raise', 'warn', 'ignore', None}
        What to do if a variable in the 'inputs' list is not used in the
        graph. Possible values are:
        - 'raise': raise an error
        - 'warn': log a warning
        - 'ignore': do not do anything
        - None: Use the value in the Theano flags on_unused_input.

    """

    @staticmethod
...
@@ -1101,29 +1170,6 @@ class FunctionMaker(object):
                 mode=None, accept_inplace=False, function_builder=Function,
                 profile=None, on_unused_input=None, fgraph=None,
                 output_keys=None):
        mode = theano.compile.mode.get_mode(mode)

        # figure out which profile object to use (if any)
...
@@ -1313,12 +1359,15 @@ class FunctionMaker(object):
"""
"""
Create a function.
Create a function.
input_storage -> a list matching the inputs list and providing
Parameters
default values if the default for an input is
----------
None, then that input is a required input. For an
input_storage
input with an update, the default acts as
A list matching the inputs list and providing default values if the
initialization.
default for an input is None, then that input is a required input.
trustme -> disables some exceptions, used internally
For an input with an update, the default acts as initialization.
trustme
Disables some exceptions, used internally.
"""
"""
if
input_storage
is
None
:
if
input_storage
is
None
:
...
@@ -1453,43 +1502,43 @@ def orig_function(inputs, outputs, mode=None, accept_inplace=False,
...
@@ -1453,43 +1502,43 @@ def orig_function(inputs, outputs, mode=None, accept_inplace=False,
"""
"""
Return a Function that will calculate the outputs from the inputs.
Return a Function that will calculate the outputs from the inputs.
:param inputs: list of `SymbolicInput` or `In` instances
Parameters
----------
:param outputs: a SymbolicOutput or a list of `SymbolicOutput` or
inputs : list of `SymbolicInput` or `In` instances
`Out` instances. The return value of the returned function
outputs : a SymbolicOutput or a list of `SymbolicOutput` or `Out` instances
will match the format of this argument (either the value
The return value of the returned function will match the format of this
itself or a list of one or more return values)
argument (either the value itself or a list of one or more return
values).
:param mode: a descriptive string or a Mode instance. (Default of None
mode : descriptive string or Mode instance
means to use `config.mode` (See below for descriptive string list).
Default of None means to use `config.mode` (see below for descriptive
string list).
:param name: an optional name for this fct. If used, the profile mode will
name : str
print the time spent in this fct.
An optional name for this fct. If used, the profile mode will print the
time spent in this fct.
accept_inplace : bool
True iff the graph can contain inplace operations prior to the
optimization phase (default is False).
profile : None or ProfileStats instance
on_unused_input : {'raise', 'warn', 'ignore', None}
What to do if a variable in the 'inputs' list is not used in the graph.
output_keys :
If the outputs were provided to theano.function as a list, then
output_keys is None. Otherwise, if outputs were provided as a dict,
output_keys is the sorted list of keys from the outputs.
Notes
-----
Currently, the library provides the following mode strings:
Currently, the library provides the following mode strings:
- FAST_RUN (default) (optimize without too much time)
- FAST_RUN (default) (optimize without too much time)
- FAST_COMPILE (minimal optimization)
- ProfileMode(deprecated): allow to print a profile mode with
mode.print_summary
- DebugMode: verify many internal conditions that are normally assumed
(slow)
:param accept_inplace: True iff the graph can contain inplace operations
prior to the optimization phase (default is False)
:param profile: None or ProfileStats instance
- FAST_COMPILE (minimal optimization)
:param on_unused_input: What to do if a variable in the 'inputs' list is
- ProfileMode(deprecated): allow to print a profile mode with
not used in the graph. Possible values are 'raise', 'warn', 'ignore'
mode.print_summary
and None
:param output_keys: If the outputs were provided to theano.function as a
- DebugMode: verify many internal conditions that are normally assumed
list, then output_keys is None. Otherwise, if outputs were provided
(slow)
as a dict, output_keys is the sorted list of keys from the outputs
"""
"""
...
@@ -1554,6 +1603,7 @@ def convert_function_input(input):
...
    - a tuple (name, (r,up), val) will be
      `In`(r, name=name, value=val, update=up, autoname=True)
    """
    if isinstance(input, (SymbolicInput, SymbolicInputKit)):
        return input
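The tuple convention shown above can be sketched as a small converter. This handles only the `(name, (r, up), val)` form from the excerpt, and it returns a plain dict mirroring the keyword arguments of the `In(...)` call rather than a real `In` instance:

```python
def convert_tuple_input(spec):
    """Sketch of the tuple convention for function inputs: the tuple
    (name, (r, up), val) maps to In(r, name=name, value=val, update=up,
    autoname=True). Returns the mapping as a plain dict."""
    name, (r, up), val = spec
    return {"variable": r, "name": name, "value": val,
            "update": up, "autoname": True}
```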
...
@@ -1615,7 +1665,10 @@ def convert_function_input(input):
def get_info_on_inputs(named_inputs, n_unnamed_inputs):
    """
    Return a human-readable description of named and un-named inputs.

    """
    n_named_inputs = len(named_inputs)

    def get_plural(n):
...
theano/compile/io.py  View file @ 690d3628
"""Define `SymbolicInput`, `SymbolicOutput`, `In`, `Out` """
"""
Define `SymbolicInput`, `SymbolicOutput`, `In`, `Out`.
"""
from
theano
import
gof
from
theano
import
gof
from
.sharedvalue
import
SharedVariable
from
.sharedvalue
import
SharedVariable
...
@@ -15,52 +17,43 @@ class SymbolicInput(object):
...
@@ -15,52 +17,43 @@ class SymbolicInput(object):
"""
"""
Represents a symbolic input for use with function or FunctionMaker.
Represents a symbolic input for use with function or FunctionMaker.
variable: a Variable instance.
Parameters
This will be assigned a value before running the function,
----------
not computed from its owner.
variable : a Variable instance
This will be assigned a value before running the function, not computed
name: Any type. (If autoname=True, defaults to variable.name).
from its owner.
If name is a valid Python identifier, this input can be set by
name : Any type
kwarg, and its value can be accessed by self.<name>.
If autoname=True, defaults to variable.name.
If name is a valid Python identifier, this input can be set by kwarg,
update: Variable instance (default: None)
and its value can be accessed by self.<name>.
value (see previous) will be replaced with this expression
update : Variable instance
variable after each function call. If update is None, the
Defaults to None. Value (see previous) will be replaced with this
expression variable after each function call. If update is None, the
update will be the default value of the input.
update will be the default value of the input.
mutable : bool
Defaults to False if update is None, True if update is not None.
True: permit the compiled function to modify the python object being
passed as the input.
False: do not permit the compiled function to modify the python object
being passed as the input.
strict : bool
Defaults to False.
True: means that the value you pass for this input must have exactly the
right type.
False: the value you pass for this input may be cast automatically to
the proper type.
allow_downcast : bool or None
Defaults to None. Only applies when `strict` is False.
True: the value you pass for this input can be silently downcasted to
fit the right type, which may lose precision.
False: the value will only be cast to a more general, or precise, type.
None: Almost like False, but allows downcast of Python floats to floatX.
autoname : bool
Defaults to True. See the name option.
implicit : bool
Defaults to False. See help(In). Note that 'None' is not allowed here,
since we are in the symbolic case.
mutable: Bool (default: False if update is None, True if update is
not None)
True: permit the compiled function to modify the python object
being passed as the input
False: do not permit the compiled function to modify the
python object being passed as the input.
strict: Bool (default: False)
True: means that the value you pass for this input must have
exactly the right type
False: the value you pass for this input may be cast
automatically to the proper type
allow_downcast: Bool or None (default: None)
Only applies when `strict` is False.
True: the value you pass for this input can be silently
downcasted to fit the right type, which may lose precision.
False: the value will only be cast to a more general, or
precise, type. None: Almost like False, but allows downcast
of Python floats to floatX.
autoname: Bool (default: True)
See the name option.
implicit: Bool (default: False)
See help(In). Note that 'None' is not allowed here, since we
are in the symbolic case.
"""
"""
def
__init__
(
self
,
variable
,
name
=
None
,
update
=
None
,
mutable
=
None
,
def
__init__
(
self
,
variable
,
name
=
None
,
update
=
None
,
mutable
=
None
,
...
@@ -105,19 +98,22 @@ class SymbolicInputKit(object):
...
@@ -105,19 +98,22 @@ class SymbolicInputKit(object):
     A SymbolicInputKit provides the distribute function in order to set or
     initialize several inputs from a single value. Specialized Kits should
     override it.
     """
 
     def __init__(self, name):
         if not isinstance(name, string_types):
-            raise TypeError('naem must be a string (got: %s)' % name)
+            raise TypeError('name must be a string (got: %s)' % name)
         self.name = name
         self.sinputs = []
         self.variables = []
 
     def add_input(self, sinput):
         """
-        Add a SymbolicInput to this SymbolicInputKit. It will be given the
-        next available index.
+        Add a SymbolicInput to this SymbolicInputKit.
+
+        It will be given the next available index.
+
         """
         self.sinputs.append(sinput)
         self.variables.append(sinput.variable)
...
@@ -127,18 +123,20 @@ class SymbolicInputKit(object):
         Given a list of indices corresponding to SymbolicInputs in this kit
         as well as a corresponding list of containers, initialize all the
         containers using the provided value.
         """
         raise NotImplementedError
 
     def complete(self, inputs):
         """
-        Given inputs (a list of Variable instances), checks through all
-        the SymbolicInputs in the kit and return a sorted list of
-        indices and a list of their corresponding SymbolicInputs such
-        that each of them represents some variable in the inputs list.
-
-        Not all the provided inputs will have a corresponding
-        SymbolicInput in the kit.
+        Given inputs (a list of Variable instances), checks through all the
+        SymbolicInputs in the kit and return a sorted list of indices and a list
+        of their corresponding SymbolicInputs such that each of them represents
+        some variable in the inputs list.
+
+        Not all the provided inputs will have a corresponding SymbolicInput in
+        the kit.
+
         """
         ret = []
         for input in inputs:
...
@@ -157,73 +155,62 @@ class In(SymbolicInput):
"""
"""
Represents a symbolic input for use with function or FunctionMaker.
Represents a symbolic input for use with function or FunctionMaker.
variable: a Variable instance.
Parameters
This will be assigned a value before running the function,
----------
not computed from its owner.
variable : a Variable instance
This will be assigned a value before running the function, not computed
name: Any type. (If autoname=True, defaults to variable.name).
from its owner.
If name is a valid Python identifier, this input can be set by
name : Any type
kwarg, and its value can be accessed by self.<name>.
If autoname=True, defaults to variable.name.
If name is a valid Python identifier, this input can be set by kwarg,
value: Any type.
and its value can be accessed by self.<name>.
value : Any type
The initial/default value for this input. If update is None,
The initial/default value for this input. If update is None,
this input acts just like an argument with a default value in
this input acts just like an argument with a default value in
Python. If update is not None, changes to this value will
Python. If update is not None, changes to this value will
"stick around", whether due to an update or a user's explicit
"stick around", whether due to an update or a user's explicit
action.
action.
update : Variable instance
update: Variable instance (default: None)
Defaults to None. Value (see previous) will be replaced with this
value (see previous) will be replaced with this expression
expression variable after each function call. If update is None, the
variable after each function call. If update is None, the
update will be the default value of the input.
update will be the default value of the input.
mutable : bool
mutable: Bool (default: False if update is None, True if update is
Defaults to False if update is None, True if update is not None.
not None)
True: permit the compiled function to modify the python object
True: permit the compiled function to modify the python object
being passed as the input
being passed as the input.
False: do not permit the compiled function to modify the
False: do not permit the compiled function to modify the
python object being passed as the input.
python object being passed as the input.
borrow : bool
borrow: Bool (default: take the same value as mutable)
Default : take the same value as mutable.
True: permit the output of the compiled function to be aliased
True: permit the output of the compiled function to be aliased
to the input
to the input.
False: do not permit any output to be aliased to the input.
False: do not permit any output to be aliased to the input
strict : bool
Defaults to False.
strict: Bool (default: False)
True: means that the value you pass for this input must have exactly
the right type.
True: means that the value you pass for this input must have
False: the value you pass for this input may be cast automatically to
exactly the right type
the proper type.
allow_downcast : bool or None
False: the value you pass for this input may be cast
Defaults to None. Only applies when `strict` is False.
automatically to the proper type
True: the value you pass for this input can be silently downcasted to
fit the right type, which may lose precision.
allow_downcast: Bool or None (default: None)
False: the value will only be cast to a more general, or precise, type.
Only applies when `strict` is False.
None: Almost like False, but allows downcast of Python floats to floatX.
autoname : bool
True: the value you pass for this input can be silently
Defaults to True. See the name option.
downcasted to fit the right type, which may lose precision.
implicit : bool or None
Defaults to None.
False: the value will only be cast to a more general, or
precise, type. None: Almost like False, but allows downcast
of Python floats to floatX.
autoname: Bool (default: True)
See the name option.
implicit: Bool or None (default: None)
True: This input is implicit in the sense that the user is not allowed
True: This input is implicit in the sense that the user is not allowed
to provide a value for it. Requires 'value' to be set.
to provide a value for it. Requires 'value' to be set.
False: The user can provide a value for this input. Be careful when
False: The user can provide a value for this input. Be careful when
'value' is a container, because providing an input value will
'value' is a container, because providing an input value will
overwrite the content of this container.
overwrite the content of this container.
None: Automatically choose between True or False depending on the
None: Automatically choose between True or False depending on the
situation. It will be set to False in all cases except if 'value'
situation. It will be set to False in all cases except if 'value' is a
is a container (so that there is less risk of accidentally
container (so that there is less risk of accidentally overwriting its
overwriting its content without being aware of it).
content without being aware of it).
"""
"""
# Note: the documentation above is duplicated in doc/topics/function.txt,
# Note: the documentation above is duplicated in doc/topics/function.txt,
# try to keep it synchronized.
# try to keep it synchronized.
...
@@ -273,10 +260,14 @@ class SymbolicOutput(object):
...
@@ -273,10 +260,14 @@ class SymbolicOutput(object):
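The value/update semantics documented above can be modeled in plain Python. This is a hypothetical sketch (the names `SimpleIn` and `make_function` are illustrative, not Theano's implementation): a missing argument falls back to the stored value, and the result of an `update` rule "sticks around" for the next call.

```python
class SimpleIn(object):
    """Hypothetical model of In: a default value plus an optional update rule."""
    def __init__(self, name, value=None, update=None):
        self.name = name
        self.value = value    # initial/default value
        self.update = update  # callable computing the next stored value


def make_function(inputs, body):
    """Build a callable whose missing arguments fall back to stored values.

    After each call, every input with an update rule gets its stored value
    replaced by update(result), so the change "sticks around".
    """
    def f(**kwargs):
        args = {i.name: kwargs.get(i.name, i.value) for i in inputs}
        result = body(**args)
        for i in inputs:
            if i.update is not None:
                i.value = i.update(result)
        return result
    return f


# State accumulates: each call adds x to the stored value of s.
s = SimpleIn('s', value=0, update=lambda result: result)
f = make_function([SimpleIn('x'), s], lambda x, s: x + s)
```

Calling `f(x=1)` twice returns 1 and then 2, because the update rewrites the stored default between calls.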
"""
"""
Represents a symbolic output for use with function or FunctionMaker.
Represents a symbolic output for use with function or FunctionMaker.
borrow: set this to True to indicate that a reference to
Parameters
function's internal storage may be returned. A value
----------
returned for this output might be clobbered by running
borrow : bool
the function again, but the function might be faster.
Set this to True to indicate that a reference to function's internal
storage may be returned. A value returned for this output might be
clobbered by running the function again, but the function might be
faster.
"""
"""
def
__init__
(
self
,
variable
,
borrow
=
False
):
def
__init__
(
self
,
variable
,
borrow
=
False
):
...
...
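The borrow trade-off described above can be modeled in plain Python (a hypothetical sketch, not Theano's machinery): with `borrow=True` the caller gets a reference to the function's internal buffer, which the next call clobbers in place; with `borrow=False` it gets a defensive copy.

```python
class ToyFunction(object):
    """Model of a compiled function whose output buffer is reused across calls."""
    def __init__(self, borrow=False):
        self.borrow = borrow
        self._buf = []  # internal storage, overwritten by every call

    def __call__(self, n):
        del self._buf[:]           # reuse internal storage in place
        self._buf.extend(range(n))
        # borrow=True: alias internal storage (fast); else defensive copy
        return self._buf if self.borrow else list(self._buf)


f_borrow = ToyFunction(borrow=True)
first = f_borrow(3)  # aliased to the internal buffer
f_borrow(2)          # clobbers `first` in place
```

After the second call, `first` no longer holds `[0, 1, 2]`; a `borrow=False` result would have survived.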
theano/compile/mode.py
View file @ 690d3628
"""WRITEME
"""
WRITEME
"""
"""
from
__future__
import
print_function
from
__future__
import
print_function
import
logging
import
logging
...
@@ -34,8 +36,9 @@ AddConfigVar('optimizer_requiring',
...
@@ -34,8 +36,9 @@ AddConfigVar('optimizer_requiring',
def
check_equal
(
x
,
y
):
def
check_equal
(
x
,
y
):
"""
"""
Returns True iff x[0] and y[0] are equal (checks the dtype and
Returns True iff x[0] and y[0] are equal (checks the dtype and shape if x
shape if x and y are numpy.ndarray instances). Used internally.
and y are numpy.ndarray instances). Used internally.
"""
"""
# I put the import here to allow using theano without scipy.
# I put the import here to allow using theano without scipy.
import
scipy.sparse
as
sp
import
scipy.sparse
as
sp
...
@@ -125,17 +128,19 @@ def register_optimizer(name, opt):
...
@@ -125,17 +128,19 @@ def register_optimizer(name, opt):
 class AddDestroyHandler(gof.Optimizer):
-    """This optimizer performs two important functions:
+    """
+    This optimizer performs two important functions:
+
     1) It has a 'requirement' of the destroyhandler. This means that the fgraph
     will include it as a feature for this optimization, and keep this feature
-    enabled for subsequent optimizations.  All optimizations that work inplace
+    enabled for subsequent optimizations. All optimizations that work inplace
     on any of their inputs must run *after* this optimization to ensure that
     the DestroyHandler has been included in the fgraph.
+
     2) It tries to replace each output with an Op that purports to destroy it
     (but it won't I promise). If this replacement succeeds it means that
     there is a bug in theano. It should not be possible to destroy outputs.
+
     """
     def apply(self, fgraph):
         for o in fgraph.outputs:
...
@@ -157,11 +162,13 @@ class AddDestroyHandler(gof.Optimizer):
 class AddNoOutputFromInplace(gof.Optimizer):
-    """This optimizer adds to the fgraph a feature that will prevent outputs
+    """
+    This optimizer adds to the fgraph a feature that will prevent outputs
     of a fgraph to be created by performing inplace operations on intermediary
     variables. This is useful when the outputs of the fgraph are preallocated
     to prevent useless copying of the data. Currently, scan preallocates its
     outputs
+
     """
     def add_requirements(self, fgraph):
         super(AddNoOutputFromInplace, self).add_requirements(fgraph)
...
@@ -169,10 +176,12 @@ class AddNoOutputFromInplace(gof.Optimizer):
 class PrintCurrentFunctionGraph(gof.Optimizer):
-    """This optimizer is for debugging.
+    """
+    This optimizer is for debugging.
+
     Toss it into the optimization pipeline to see the state of things at any
     given point.
+
     """
     def __init__(self, header):
         self.header = header
...
@@ -233,18 +242,23 @@ optdb.register('merge3', gof.MergeOptimizer(),
 class Mode(object):
     """
-    The Mode represents a way to optimize and then link a computation
-    graph.
-
-    * optimizer -> a structure of type Optimizer. An Optimizer may
-      simplify the math, put similar computations together, improve
-      numerical stability and various other improvements.
-    * linker -> a structure of type Linker. A Linker decides which
-      implementations to use (C or Python, for example) and how to
-      string them together to perform the computation.
-
-    See predefined_linkers, predefined_optimizers and also
-    predefined_modes.
+    The Mode represents a way to optimize and then link a computation graph.
+
+    Parameters
+    ----------
+    optimizer : a structure of type Optimizer
+        An Optimizer may simplify the math, put similar computations together,
+        improve numerical stability and various other improvements.
+    linker : a structure of type Linker
+        A Linker decides which implementations to use (C or Python, for example)
+        and how to string them together to perform the computation.
+
+    See Also
+    --------
+    predefined_linkers
+    predefined_optimizers
+    predefined_modes
+
     """
     def __init__(self, linker=None, optimizer='default'):
...
@@ -326,6 +340,7 @@ class Mode(object):
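The optimizer/linker split documented above can be sketched in plain Python (a toy model with illustrative names, not Theano's machinery): the optimizer rewrites an expression tree, and the linker chooses an implementation for each node and strings them together.

```python
# Toy expressions: ('add', a, b), ('mul', a, b), or a plain number.

def optimizer(expr):
    """Simplify the tree: x*1 -> x and x+0 -> x, applied recursively."""
    if not isinstance(expr, tuple):
        return expr
    op, a, b = expr[0], optimizer(expr[1]), optimizer(expr[2])
    if op == 'mul' and b == 1:
        return a
    if op == 'add' and b == 0:
        return a
    return (op, a, b)


def linker(expr):
    """Pick an implementation for each node and string them together."""
    if not isinstance(expr, tuple):
        return lambda: expr
    op, a, b = expr
    fa, fb = linker(a), linker(b)
    impl = {'add': lambda: fa() + fb(), 'mul': lambda: fa() * fb()}
    return impl[op]


class ToyMode(object):
    """Optimize, then link: the same split Mode describes."""
    def __init__(self, optimizer, linker):
        self.optimizer = optimizer
        self.linker = linker

    def compile(self, expr):
        return self.linker(self.optimizer(expr))


fn = ToyMode(optimizer, linker).compile(('add', ('mul', 3, 1), 0))
```

Here the optimizer collapses the whole tree to the constant 3 before the linker ever sees it, which is the point of running optimization first.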
         Keyword arguments can be provided for the linker,
         in which case its `clone` method will be called with these
         arguments.
+
         """
         new_linker = self.linker.clone(**link_kwargs)
         new_optimizer = self.provided_optimizer
...
@@ -412,7 +427,10 @@ def get_default_mode():
 def register_mode(name, mode):
-    """Add a `Mode` which can be referred to by `name` in `function`."""
+    """
+    Add a `Mode` which can be referred to by `name` in `function`.
+
+    """
     if name in predefined_modes:
         raise ValueError('Mode name already taken: %s' % name)
     predefined_modes[name] = mode
theano/compile/monitormode.py
View file @ 690d3628
...
@@ -8,7 +8,6 @@ from theano.compile.mode import Mode
 class MonitorMode(Mode):
     """
     `MonitorMode` is a debug mode to easily step through function execution.
...
@@ -19,28 +18,28 @@ class MonitorMode(Mode):
     A typical use case is to detect the introduction of NaN values in a graph.
     For an example of such a use case, see doc/tutorial/debug_faq.txt.
 
+    Parameters
+    ----------
+    pre_func
+        A function to call before executing a thunk, with arguments:
+        - the thunk index
+        - the Apply node
+        - the thunk to be called
+    post_func
+        A function to call after executing a thunk, with the same three
+        arguments as `pre_func`.
+    optimizer
+        The optimizer to use. One may use for instance 'fast_compile' to skip
+        optimizations.
+    linker
+        DO NOT USE. This mode uses its own linker. The parameter is needed to
+        allow selecting optimizers to use.
+
     """
     def __init__(self, pre_func=None, post_func=None,
                  optimizer='default', linker=None):
-        """
-        Constructor.
-
-        :param pre_func: A function to call before executing a thunk, with
-            arguments:
-            - the thunk index
-            - the Apply node
-            - the thunk to be called
-        :param post_func: A function to call after executing a thunk, with the
-            same three arguments as `pre_func`.
-        :param optimizer: The optimizer to use. One may use for instance
-            'fast_compile' to skip optimizations.
-        :param linker: DO NOT USE. This mode uses its own linker.
-            The parameter is needed to allow selecting optimizers to use.
-        """
         self.pre_func = pre_func
         self.post_func = post_func
         wrap_linker = theano.gof.WrapLinkerMany([theano.gof.OpWiseCLinker()],
...
@@ -67,6 +66,7 @@ class MonitorMode(Mode):
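The pre_func/post_func protocol can be illustrated with a plain-Python stand-in (the name `monitored_call` is hypothetical, not MonitorMode itself): every thunk is bracketed by the two hooks, which receive the thunk index, a node tag, and the thunk, matching the signature documented above.

```python
def monitored_call(thunks, pre_func=None, post_func=None):
    """Run a list of (node, thunk) pairs, calling the optional hooks
    around each thunk with (index, node, thunk) arguments."""
    for i, (node, fn) in enumerate(thunks):
        if pre_func is not None:
            pre_func(i, node, fn)
        fn()
        if post_func is not None:
            post_func(i, node, fn)


trace = []
thunks = [('add', lambda: trace.append('run add')),
          ('mul', lambda: trace.append('run mul'))]
monitored_call(
    thunks,
    pre_func=lambda i, node, fn: trace.append('pre %d %s' % (i, node)),
    post_func=lambda i, node, fn: trace.append('post %d %s' % (i, node)))
```

A NaN-detecting post_func would inspect the thunk's outputs at each step instead of appending to a trace.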
     def eval(self, i, node, fn):
         """
         The method that calls the thunk `fn`.
+
         """
         if self.pre_func is not None:
             self.pre_func(i, node, fn)
...
@@ -96,9 +96,9 @@ class MonitorMode(Mode):
"""
"""
Create a new instance of this Mode.
Create a new instance of this Mode.
Keyword arguments can be provided for the linker,
Keyword arguments can be provided for the linker,
but they will be
but they will be ignored, because ProfileMode needs
ignored, because ProfileMode needs to use its own linker.
to use its own linker.
"""
"""
new_mode
=
type
(
self
)(
pre_func
=
self
.
pre_func
,
new_mode
=
type
(
self
)(
pre_func
=
self
.
pre_func
,
post_func
=
self
.
post_func
,
post_func
=
self
.
post_func
,
...
...
theano/compile/nanguardmode.py
View file @ 690d3628
...
@@ -16,11 +16,14 @@ def flatten(l):
     Parameters
     ----------
-    l : List/tuple/other objects, might be nested.
+    l : list/tuple/other objects
+        Might be nested.
 
     Returns
     -------
-    A flattened list of objects
+    object
+        A flattened list of objects.
+
     """
     if isinstance(l, (list, tuple, collections.ValuesView)):
         rval = []
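A plain-Python sketch of the flatten behavior documented above (handling nested lists and tuples; the real function also accepts `collections.ValuesView` through the same branch):

```python
def flatten(l):
    """Return a flat list of the objects in l, which may be nested."""
    if isinstance(l, (list, tuple)):
        rval = []
        for elem in l:
            rval.extend(flatten(elem))  # recurse into nested containers
        return rval
    # Non-container: wrap in a singleton list.
    return [l]
```

For example, `flatten([1, (2, [3, 4]), 5])` yields `[1, 2, 3, 4, 5]`.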
...
@@ -53,6 +56,7 @@ def contains_nan(arr):
     This approach is faster and more memory efficient than the obvious
     alternative, calling `np.any(np.isnan(ndarray))`, which requires the
     construction of a boolean array with the same shape as the input array.
+
     """
     if isinstance(arr, theano.gof.type.CDataType._cdata_type):
         return False
...
@@ -81,6 +85,7 @@ def contains_inf(arr):
     This approach is more memory efficient than the obvious alternative,
     calling `np.any(np.isinf(ndarray))`, which requires the construction of a
     boolean array with the same shape as the input array.
+
     """
     if isinstance(arr, theano.gof.type.CDataType._cdata_type):
         return False
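The docstrings above refer to detecting NaN/Inf without materializing a whole boolean mask. A minimal sketch of the underlying ideas (not the nanguardmode implementation itself): NaN propagates through reductions and is the only value for which `x != x`, so a single pass suffices.

```python
import math


def contains_nan(values):
    """True iff any element is NaN.

    Uses the IEEE-754 property that NaN != NaN, so no per-element
    boolean array needs to be built."""
    return any(v != v for v in values)


def contains_inf(values):
    """True iff any element is +inf or -inf."""
    return any(math.isinf(v) for v in values)
```

With NumPy, the equivalent reduction trick is to test `np.isnan(np.min(arr))`, since NaN propagates through `min`.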
...
@@ -97,14 +102,16 @@ class NanGuardMode(Mode):
     Parameters
     ----------
     nan_is_error : bool
-        If True, raise an error anytime a NaN is encountered
-    inf_is_error: bool
+        If True, raise an error anytime a NaN is encountered.
+    inf_is_error : bool
         If True, raise an error anytime an Inf is encountered. Note that some
         pylearn2 modules currently use np.inf as a default value (e.g.
         mlp.max_pool) and these will cause an error if inf_is_error is True.
-    big_is_error: bool
+    big_is_error : bool
         If True, raise an error when a value greater than 1e10 is encountered.
+
     """
     def __init__(self, nan_is_error, inf_is_error, big_is_error=True):
         if cuda.cuda_available:
             self.guard_input = cuda.fvector('nan_guard')
...
@@ -135,12 +142,13 @@ class NanGuardMode(Mode):
...
@@ -135,12 +142,13 @@ class NanGuardMode(Mode):
         var : numpy.ndarray
             The value to be checked.
         nd : theano.gof.Apply
-            The Apply node being executed
+            The Apply node being executed.
         f : callable
-            The thunk for the apply node
+            The thunk for the apply node.
         is_input : bool
             If True, `var` is an input to `nd`.
             If False, it is an output.
+
         """
         error = False
         if nan_is_error:
...
@@ -193,15 +201,18 @@ def nan_check(i, node, fn):
     def nan_check(i, node, fn):
         """
-        Runs `fn` while checking its inputs and outputs for NaNs / Infs
+        Runs `fn` while checking its inputs and outputs for NaNs / Infs.
 
         Parameters
         ----------
-        i : currently ignored (TODO: determine why it is here or remove)
+        i :
+            Currently ignored.
+            TODO: determine why it is here or remove).
         node : theano.gof.Apply
-            The Apply node currently being executed
+            The Apply node currently being executed.
         fn : callable
-            The thunk to execute for this Apply node
+            The thunk to execute for this Apply node.
+
         """
         inputs = fn.inputs
         # TODO: figure out why individual inputs are themselves lists
...
theano/compile/ops.py
View file @ 690d3628
"""This file contains auxiliary Ops, used during the compilation phase
"""
and Ops building class (:class:`FromFunctionOp`) and decorator
This file contains auxiliary Ops, used during the compilation phase and Ops
(:func:`as_op`) that help make new Ops more rapidly.
building class (:class:`FromFunctionOp`) and decorator (:func:`as_op`) that
help make new Ops more rapidly.
"""
"""
import
copy
import
copy
...
@@ -18,14 +19,19 @@ import numpy
...
@@ -18,14 +19,19 @@ import numpy
 def register_view_op_c_code(type, code, version=()):
-    """ Tell ViewOp how to generate C code for a Theano Type
+    """
+    Tell ViewOp how to generate C code for a Theano Type.
 
-    :param type: A Theano type. It must be the Theano class itself and not an
-        instance of the class.
-    :param code: C code that returns a view for the Theano type 'type'.
-        Use %(iname)s and %(oname)s for the input and output C
-        variable names respectively.
-    :param version: A number indicating the version of the code, for cache.
+    Parameters
+    ----------
+    type : Theano type
+        It must be the Theano class itself and not an instance of the class.
+    code : C code
+        Returns a view for the Theano type 'type'. Use %(iname)s and %(oname)s
+        for the input and output C variable names respectively.
+    version
+        A number indicating the version of the code, for cache.
+
     """
     ViewOp.c_code_and_version[type] = (code, version)
...
@@ -33,7 +39,9 @@ def register_view_op_c_code(type, code, version=()):
 class ViewOp(gof.Op):
     """
     Returns an inplace view of the input. Used internally by Theano.
+
     """
     view_map = {0: [0]}
     # Mapping from Type to C code (and version) to use.
     # In the C code, the name of the input variable is %(iname)s,
...
@@ -96,9 +104,9 @@ class OutputGuard(ViewOp):
     Only the AddDestroyHandler optimizer tries to insert them in the graph.
 
-    This Op is declared as destructive while it is not destroying
-    anything. It returns a view. This is used to prevent destruction of
-    the output variables of a Theano function.
+    This Op is declared as destructive while it is not destroying anything.
+    It returns a view. This is used to prevent destruction of the output
+    variables of a Theano function.
 
     There is a mechanism in Theano that should prevent this, but the use
     of OutputGuard adds a safeguard: it may be possible for some optimization
...
@@ -106,6 +114,7 @@ class OutputGuard(ViewOp):
     making in-place optimizations.
 
     TODO: find a current full explanation.
+
     """
     destroy_map = {0: [0]}
...
@@ -115,14 +124,19 @@ _output_guard = OutputGuard()
 def register_deep_copy_op_c_code(typ, code, version=()):
-    """ Tell DeepCopyOp how to generate C code for a Theano Type
+    """
+    Tell DeepCopyOp how to generate C code for a Theano Type.
 
-    :param typ: A Theano type. It must be the Theano class itself and not an
-        instance of the class.
-    :param code: C code that deep copies the Theano type 'typ'.
-        Use %(iname)s and %(oname)s for the input and output C
-        variable names respectively.
-    :param version: A number indicating the version of the code, for cache.
+    Parameters
+    ----------
+    typ : Theano type
+        It must be the Theano class itself and not an instance of the class.
+    code: C code
+        Deep copies the Theano type 'typ'. Use %(iname)s and %(oname)s for the
+        input and output C variable names respectively.
+    version
+        A number indicating the version of the code, for cache.
+
     """
     DeepCopyOp.c_code_and_version[typ] = (code, version)
...
@@ -189,15 +203,20 @@ deep_copy_op = DeepCopyOp()
 def register_shape_c_code(type, code, version=()):
-    """ Tell Shape Op how to generate C code for a Theano Type
+    """
+    Tell Shape Op how to generate C code for a Theano Type.
 
-    :param typ: A Theano type. It must be the Theano class itself and not an
-        instance of the class.
-    :param code: C code that return a vector representing the shape
-        for the Theano type 'typ'.
-        Use %(iname)s and %(oname)s for the input and output C
-        variable names respectively.
-    :param version: A number indicating the version of the code, for cache.
+    Parameters
+    ----------
+    typ : Theano type
+        It must be the Theano class itself and not an instance of the class.
+    code : C code
+        Returns a vector representing the shape for the Theano type 'typ'.
+        Use %(iname)s and %(oname)s for the input and output C variable names
+        respectively.
+    version
+        A number indicating the version of the code, for cache.
+
     """
     Shape.c_code_and_version[type] = (code, version)
...
@@ -206,8 +225,12 @@ class Shape(gof.Op):
"""
"""
L{Op} to return the shape of a matrix.
L{Op} to return the shape of a matrix.
@note: Non-differentiable.
Notes
-----
Non-differentiable.
"""
"""
_f16_ok
=
True
_f16_ok
=
True
# Mapping from Type to C code (and version) to use.
# Mapping from Type to C code (and version) to use.
...
@@ -293,8 +316,12 @@ class Shape_i(gof.Op):
...
@@ -293,8 +316,12 @@ class Shape_i(gof.Op):
"""
"""
L{Op} to return the shape of a matrix.
L{Op} to return the shape of a matrix.
@note: Non-differentiable.
Notes
-----
Non-differentiable.
"""
"""
_f16_ok
=
True
_f16_ok
=
True
# Mapping from Type to C code (and version) to use.
# Mapping from Type to C code (and version) to use.
...
@@ -381,18 +408,24 @@ def shape_i(var, i, fgraph=None):
 def shape_i(var, i, fgraph=None):
-    """Equivalent of var.shape[i], but apply if possible the shape
-    feature optimization
+    """
+    Equivalent of var.shape[i], but apply if possible the shape feature
+    optimization.

     This is useful in optimization that need to get the shape. This
     remove the need of the following shape_feature optimization that
     convert it. So this speed up optimization and remove Equilibrium
     max iteration problems.

-    :param var: the variable we want to take the shape of
-    :param i: The shape dimensions we want
-    :param fgraph: optional. If var.fgraph do not exist, the fgraph that
-        have the shape_feature to introduce var in to get the optimized shape.
+    Parameters
+    ----------
+    var
+        The variable we want to take the shape of.
+    i
+        The shape dimensions we want
+    fgraph : optional
+        If var.fgraph do not exist, the fgraph that have the shape_feature to
+        introduce var in to get the optimized shape.
+
     """
     if fgraph is None and hasattr(var, 'fgraph'):
...
@@ -421,15 +454,20 @@ def shape_i(var, i, fgraph=None):
 def register_shape_i_c_code(typ, code, check_input, version=()):
-    """ Tell Shape_i how to generate C code for a Theano Type
-
-    :param typ: A Theano type. It must be the Theano class itself and not
-        an instance of the class.
-    :param code: C code that gets the shape of dimensions %(i)s for the
-        Theano type 'typ'.
-        Use %(iname)s and %(oname)s for the input and output C
-        variable names respectively.
-    :param version: A number indicating the version of the code, for cache.
+    """
+    Tell Shape_i how to generate C code for a Theano Type.
+
+    Parameters
+    ----------
+    typ : Theano type
+        It must be the Theano class itself and not an instance of the class.
+    code : C code
+        Gets the shape of dimensions %(i)s for the Theano type 'typ'.
+        Use %(iname)s and %(oname)s for the input and output C variable names
+        respectively.
+    version
+        A number indicating the version of the code, for cache.
+
     """
     Shape_i.c_code_and_version[typ] = (code, check_input, version)
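The registration helpers in this file all follow the same pattern: a class-level dict maps a Theano type class to a tuple holding the C code template and a cache version. A minimal pure-Python sketch of that pattern (the `ShapeLikeOp`, `register_c_code`, and `FloatVectorType` names are hypothetical stand-ins, not Theano API):

```python
# Sketch of the register_*_c_code pattern: a class-level dict maps a
# type class (not an instance) to (code_template, version) so the C-code
# cache can be invalidated when the template changes.
class ShapeLikeOp(object):
    c_code_and_version = {}


def register_c_code(typ, code, version=()):
    # typ must be the type class itself, not an instance of the class
    assert isinstance(typ, type)
    ShapeLikeOp.c_code_and_version[typ] = (code, version)


class FloatVectorType(object):  # hypothetical stand-in for a Theano type
    pass


register_c_code(FloatVectorType,
                "%(oname)s = shape_of(%(iname)s);", version=(1,))

code, version = ShapeLikeOp.c_code_and_version[FloatVectorType]
```

The `%(iname)s`/`%(oname)s` placeholders mirror the substitution keys the docstrings above describe; the op later fills them in with the actual C variable names.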
...
@@ -459,6 +497,7 @@ class FromFunctionOp(gof.Op):
     Also the gradient is undefined in the resulting op and Theano will
     raise an error if you attempt to get the gradient of a graph
     containing this op.
+
     """
     def __init__(self, fn, itypes, otypes, infer_shape):
...
@@ -519,29 +558,29 @@ class FromFunctionOp(gof.Op):
 def as_op(itypes, otypes, infer_shape=None):
     """
     Decorator that converts a function into a basic Theano op that will call
     the supplied function as its implementation.

     It takes an optional infer_shape parameter that should be a callable with
     this signature:

         def infer_shape(node, input_shapes):
             ...
             return output_shapes

     Here `input_shapes` and `output_shapes` are lists of tuples that represent
     the shape of the corresponding inputs/outputs.

-    This should not be used when performance is a concern since the
-    very basic nature of the resulting Op may interfere with certain
-    graph optimizations.
+    This should not be used when performance is a concern since the very basic
+    nature of the resulting Op may interfere with certain graph optimizations.

-    Example usage:
+    Examples
+    --------
+    @as_op(itypes=[theano.tensor.fmatrix, theano.tensor.fmatrix],
+           otypes=[theano.tensor.fmatrix])
+    def numpy_dot(a, b):
+        return numpy.dot(a, b)

-    @as_op(itypes=[theano.tensor.fmatrix, theano.tensor.fmatrix],
-           otypes=[theano.tensor.fmatrix])
-    def numpy_dot(a, b):
-        return numpy.dot(a, b)
     """
     if not isinstance(itypes, (list, tuple)):
         itypes = [itypes]
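The first thing `as_op` does, visible in the hunk above, is normalize a single type into a one-element list. A toy stand-in illustrating that shape of the decorator (this does not build a real Theano Op; `as_op_sketch` and the string type names are hypothetical, for illustration only):

```python
# Toy stand-in for the as_op decorator: normalize the declared
# input/output types to lists and record them on the wrapper.
def as_op_sketch(itypes, otypes, infer_shape=None):
    if not isinstance(itypes, (list, tuple)):
        itypes = [itypes]
    if not isinstance(otypes, (list, tuple)):
        otypes = [otypes]

    def decorator(fn):
        def wrapper(*args):
            # a real Op would type-check each input here
            assert len(args) == len(itypes), "wrong number of inputs"
            return fn(*args)
        wrapper.itypes = list(itypes)
        wrapper.otypes = list(otypes)
        wrapper.infer_shape = infer_shape
        return wrapper
    return decorator


@as_op_sketch(itypes=["scalar", "scalar"], otypes="scalar")
def add_sketch(a, b):
    return a + b
```

Note that `otypes` was passed as a bare string and comes out as a one-element list, which is exactly the normalization the real `as_op` performs on its `itypes`/`otypes` arguments.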
...
@@ -565,18 +604,19 @@ def as_op(itypes, otypes, infer_shape=None):
 def register_rebroadcast_c_code(typ, code, version=()):
-    """Tell Rebroadcast how to generate C code for a Theano Type
-
-    :param typ: A Theano type. It must be the Theano class itself and not an
-        instance of the class.
-    :param code: C code that checks if the dimension %(axis)s is of
-        shape 1 for the Theano type 'typ'. Use %(iname)s and
-        %(oname)s for the input and output C variable names
-        respectively, and %(axis)s for the axis that we need to
-        check. This code is put in a loop for all axes.
-    :param version: A number indicating the version of the code, for cache.
+    """
+    Tell Rebroadcast how to generate C code for a Theano Type.
+
+    typ : Theano type
+        It must be the Theano class itself and not an instance of the class.
+    code : C code
+        That checks if the dimension %(axis)s is of shape 1 for the Theano type
+        'typ'. Use %(iname)s and %(oname)s for the input and output C variable
+        names respectively, and %(axis)s for the axis that we need to check.
+        This code is put in a loop for all axes.
+    version
+        A number indicating the version of the code, for cache.
+
     """
     Rebroadcast.c_code_and_version[typ] = (code, version)
...
@@ -585,17 +625,23 @@ class Rebroadcast(gof.Op):
     """
     Change the input's broadcastable fields in some predetermined way.

-    :code:`Rebroadcast((0, True), (1, False))(x)` would make :code:`x`
-    broadcastable in axis 0 and not broadcastable in axis 1
-
-    .. seealso::
-
-        :func:`unbroadcast <theano.tensor.unbroadcast>`
-        :func:`addbroadcast <theano.tensor.addbroadcast>`
-        :func:`patternbroadcast <theano.tensor.patternbroadcast>`
-
-    ..note: works inplace and works for CudaNdarrayType
+    See Also
+    --------
+    unbroadcast <theano.tensor.unbroadcast>
+    addbroadcast <theano.tensor.addbroadcast>
+    patternbroadcast <theano.tensor.patternbroadcast>
+
+    Notes
+    -----
+    Works inplace and works for CudaNdarrayType.
+
+    Example
+    -------
+    `Rebroadcast((0, True), (1, False))(x)` would make `x` broadcastable in
+    axis 0 and not broadcastable in axis 1.
+
     """
     view_map = {0: [0]}
     _f16_ok = True

     # Mapping from Type to C code (and version) to use.
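The effect of Rebroadcast on the broadcastable pattern of its input can be sketched on plain tuples of bools: each `(axis, flag)` pair overrides that axis's flag. The `rebroadcast_pattern` helper below is hypothetical, not Theano API; it only models the flag bookkeeping, not the runtime shape-1 check:

```python
# Sketch of what Rebroadcast does to its input's broadcastable pattern:
# each (axis, flag) pair overrides the broadcastable flag of that axis.
def rebroadcast_pattern(axis_flags, broadcastable):
    new = list(broadcastable)
    for axis, flag in axis_flags:
        if axis >= len(new):
            raise ValueError("axis out of range for this pattern")
        new[axis] = flag
    return tuple(new)


# Rebroadcast((0, True), (1, False)) applied to a plain matrix pattern:
result = rebroadcast_pattern([(0, True), (1, False)], (False, False))
```

This mirrors the docstring's example: the result is broadcastable in axis 0 and not broadcastable in axis 1.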
...
@@ -717,17 +763,23 @@ class Rebroadcast(gof.Op):
 def register_specify_shape_c_code(typ, code, version=(),
                                   c_support_code_apply=None):
-    """ Tell SpecifyShape how to generate C code for a Theano Type
-
-    :param typ: A Theano type. It must be the Theano class itself and
-        not an instance of the class.
-    :param code: C code that checks the shape and returns a view for
-        the Theano type 'typ'. Use %(iname)s and %(oname)s
-        for the input and output C variable names
-        respectively. %(shape)s is the vector of shape of
-        %(iname)s. Check that its length is good.
-    :param version: A number indicating the version of the code, for cache.
-    :param c_support_code_apply: extra code.
+    """
+    Tell SpecifyShape how to generate C code for a Theano Type.
+
+    Parameters
+    ----------
+    typ : Theano type
+        It must be the Theano class itself and not an instance of the class.
+    code : C code
+        Checks the shape and returns a view for the Theano type 'typ'.
+        Use %(iname)s and %(oname)s for the input and output C variable names
+        respectively. %(shape)s is the vector of shape of %(iname)s.
+        Check that its length is good.
+    version
+        A number indicating the version of the code, for cache.
+    c_support_code_apply
+        Extra code.
+
     """
     SpecifyShape.c_code_and_version[typ] = (code, version,
                                             c_support_code_apply)
...
@@ -742,12 +794,16 @@ class SpecifyShape(gof.Op):
     the case most of the time if we only take the shape of the output.
     Maybe there are other optimizations that will mess with this.

-    @note: Maybe in the future we will never do the assert!
-    @note: We currently don't support specifying partial shape information.
-
-    @todo: test this op with sparse and cuda ndarray.
-           Do C code for them too.
+    Notes
+    -----
+    Maybe in the future we will never do the assert!
+    We currently don't support specifying partial shape information.
+
+    TODO : test this op with sparse and cuda ndarray. Do C code for them too.
+
     """
     view_map = {0: [0]}

     # Mapping from Type to C code (and version) to use.
     # In the C code, the name of the input variable is %(iname)s,
...
theano/compile/pfunc.py — View file @ 690d3628

-"""Provide a simple user friendly API """
+"""
+Provide a simple user friendly API.
+
+"""
 from theano import config
 from six import iteritems
 from theano.compile import orig_function, In, Out
...
@@ -22,42 +25,35 @@ def rebuild_collect_shared(outputs,
                            no_default_updates=False,
                            ):
     """
-    Function that allows replacing subgraphs of a computational
-    graph.
+    Function that allows replacing subgraphs of a computational graph.

     It returns a set of dictionaries and lists which collect (partial?)
     different information about shared variables. This info is required by
     `pfunc`.

-    :type outputs: list of Theano Variables ( or Theano expressions)
-    :param outputs: list of Theano variables or expressions representing the
-                    outputs of the computational graph
-    :type inputs: list of Theano Variables ( or Theano expressions)
-    :param inputs: list of Theano variables or expressions representing the
-                   inputs of the computational graph (or None)
-    :type replace: dict
-    :param replace: dictionary describing which subgraphs should be
-                    replaced by what. orig_value => new_value
-    :type updates: dict
-    :param updates: dictionary describing updates expressions for shared
-                    variables
-    :type rebuild_strict: bool
-    :param rebuild_strict: flag, if true the type of all inputs should be
-                           the same as the for the current node
-    :type copy_inputs_over: bool
-    :param copy_inputs_over: flag; if False it will clone inputs
-    :type no_default_updates: either bool or list of Variables
-    :param no_default_updates: if True, do not perform any automatic update
-                               on Variables. If False (default), perform
-                               them all. Else, perform automatic updates
-                               on all Variables that are neither in
-                               "updates" nor in "no_default_updates".
+    Parameters
+    ----------
+    outputs : list of Theano Variables (or Theano expressions)
+        List of Theano variables or expressions representing the outputs of the
+        computational graph.
+    inputs : list of Theano Variables (or Theano expressions)
+        List of Theano variables or expressions representing the inputs of the
+        computational graph (or None).
+    replace : dict
+        Dictionary describing which subgraphs should be replaced by what.
+        orig_value => new_value
+    updates : dict
+        Dictionary describing updates expressions for shared variables.
+    rebuild_strict : bool
+        Flag, if true the type of all inputs should be the same as the one for
+        the current node.
+    copy_inputs_over : bool
+        Flag; if False it will clone inputs.
+    no_default_updates : either bool or list of Variables
+        If True, do not perform any automatic update on Variables.
+        If False (default), perform them all.
+        Else, perform automatic updates on all Variables that are neither in
+        "updates" nor in "no_default_updates".
+
     """
...
@@ -73,15 +69,15 @@ def rebuild_collect_shared(outputs,
     shared_inputs = []

     def clone_v_get_shared_updates(v, copy_inputs_over):
-        '''
-        Clones a variable and its inputs recursively until all are in
-        clone_d. Also appends all shared variables met along the way to
-        shared inputs, and their default_update (if applicable) to update_d
-        and update_expr.
+        """
+        Clones a variable and its inputs recursively until all are in clone_d.
+        Also appends all shared variables met along the way to shared inputs,
+        and their default_update (if applicable) to update_d and update_expr.

         v can have an fgraph attached to it, case in which we want to clone
-        constants ( to avoid having a constant belonging to two fgraphs)
-        '''
+        constants (to avoid having a constant belonging to two fgraphs).
+
+        """
         # this co-recurses with clone_a
         assert v is not None
         if v in clone_d:
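The memoized recursion that `clone_v_get_shared_updates` describes (clone each variable at most once, reusing entries from `clone_d`) can be sketched on a toy DAG of nodes. The `Node` class and `clone_recursive` helper below are hypothetical illustrations, not the Theano graph machinery:

```python
# Sketch of the memoized recursive cloning in clone_v_get_shared_updates:
# walk a toy DAG, cloning each node at most once via the clone_d dict.
class Node(object):
    def __init__(self, name, inputs=()):
        self.name = name
        self.inputs = list(inputs)


def clone_recursive(v, clone_d):
    if v in clone_d:          # already cloned: reuse the existing clone
        return clone_d[v]
    cloned_inputs = [clone_recursive(i, clone_d) for i in v.inputs]
    clone = Node(v.name, cloned_inputs)
    clone_d[v] = clone
    return clone


a = Node("a")
b = Node("b", [a])
c = Node("c", [a, b])          # 'a' is reachable twice: cloned only once
clone_d = {}
c2 = clone_recursive(c, clone_d)
```

The memoization matters because a variable reachable along several paths (like `a` above) must map to a single clone, otherwise the cloned graph would silently duplicate shared state.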
...
@@ -119,10 +115,11 @@ def rebuild_collect_shared(outputs,
         return clone_d.setdefault(v, v)

     def clone_a(a, copy_inputs_over):
-        '''
-        Clones a variable and its inputs recursively until all are in
-        clone_d. It occures with clone_v_get_shared_updates
-        '''
+        """
+        Clones a variable and its inputs recursively until all are in
+        clone_d. It occures with clone_v_get_shared_updates.
+
+        """
         if a is None:
             return None
         if a not in clone_d:
...
@@ -275,40 +272,43 @@ def rebuild_collect_shared(outputs,
 class Param(object):
+    """
+    Parameters
+    ----------
+    variable
+        A variable in an expression graph to use as a compiled-function
+        parameter.
+    default
+        The default value to use at call-time (can also be a Container where
+        the function will find a value at call-time).
+    name : str
+        A string to identify this parameter from function kwargs.
+    mutable : bool
+        True : function is allowed to modify this argument.
+    borrow
+        Whether the function is allowed to alias some output to this input.
+        Using None (default) means we re-use the same value as the `mutable`
+        flag. False: do not permit any output to be aliased to the input.
+    strict : bool
+        False : function arguments may be copied or cast to match the type
+        required by the parameter `variable`.
+        True : function arguments must exactly match the type required by
+        `variable`.
+    allow_downcast : bool or None
+        Only applies if `strict` is False.
+        True : allow assigned value to lose precision when cast during
+        assignment.
+        False : never allow precision loss.
+        None : only allow downcasting of a Python float to a scalar floatX.
+    implicit
+        See help(theano.io.In)
+
+    """
+
     def __init__(self, variable, default=None, name=None, mutable=False,
                  strict=False, allow_downcast=None, implicit=None,
                  borrow=None):
-        """
-        :param variable: A variable in an expression graph to use as a
-            compiled-function parameter
-        :param default: The default value to use at call-time (can
-            also be a Container where the function will find a value
-            at call-time.)
-        :param name: A string to identify this parameter from function kwargs.
-        :param mutable: True -> function is allowed to modify this argument.
-        :param borrow: Whether the function is allowed to alias some
-            output to this input. Using None (default) means we re-use
-            the same value as the `mutable` flag.
-            False: do not permit any output to be aliased to the input
-        :param strict: False -> function arguments may be copied or
-            cast to match the type required by the parameter
-            `variable`.
-            True -> function arguments must exactly match the type
-            required by `variable`.
-        :param allow_downcast: Only applies if `strict` is False.
-            True -> allow assigned value to lose precision when cast
-            during assignment.
-            False -> never allow precision loss.
-            None -> only allow downcasting of a Python float to a scalar floatX.
-        :param implicit: see help(theano.io.In)
-        """
         self.variable = variable
         self.default = default
         self.name = name
...
@@ -340,75 +340,61 @@ def pfunc(params, outputs=None, mode=None, updates=None, givens=None,
           no_default_updates=False, accept_inplace=False, name=None,
           rebuild_strict=True, allow_input_downcast=None,
           profile=None, on_unused_input=None, output_keys=None):
-    """Function-constructor for graphs with shared variables.
-
-    :type params: list of either Variable or Param instances.
-    :param params: function parameters, these are not allowed to be shared
-        variables
-    :type outputs: list of Variables or Out instances
-    :param outputs: expressions to compute
-    :type mode: string or `theano.compile.Mode` instance.
-    :param mode: compilation mode
-    :type updates: iterable over pairs (shared_variable,
-        new_expression). List, tuple or dict.
-    :param updates: update the values for SharedVariable inputs
-        according to these expressions
-    :type givens: iterable over pairs (Var1, Var2) of Variables. List,
-        tuple or dict. The Var1 and Var2 in each pair must have the
-        same Type.
-    :param givens: specific substitutions to make in the computation
-        graph (Var2 replaces Var1).
-    :type no_default_updates: either bool or list of Variables
-    :param no_default_updates: if True, do not perform any automatic
-        update on Variables. If False (default), perform them
-        all. Else, perform automatic updates on all Variables that are
-        neither in "updates" nor in "no_default_updates".
-    :type name: None or string
-    :param name: attaches a name to the profiling result of this function.
-    :type allow_input_downcast: Boolean
-    :param allow_input_downcast: True means that the values passed as
-        inputs when calling the function can be silently downcasted to
-        fit the dtype of the corresponding Variable, which may lose
-        precision. False means that it will only be cast to a more
-        general, or precise, type. None (default) is almost like
-        False, but allows downcasting of Python float scalars to
-        floatX.
-    :type profile: None, True, str, or ProfileStats instance
-    :param profile: accumulate profiling information into a given ProfileStats
-        instance. None is the default, and means to use the value of
-        config.profile.
-        If argument is `True` then a new ProfileStats instance will be
-        used. If argument is a string, a new ProfileStats instance will be created
-        with that string as its `message` attribute. This profiling object will be
-        available via self.profile.
-    :type on_unused_input: str
-    :param on_unused_input: What to do if a variable in the 'inputs' list
-        is not used in the graph. Possible values are 'raise', 'warn',
-        'ignore' and None.
-    :rtype: theano.compile.Function
-    :returns: a callable object that will compute the outputs (given
-        the inputs) and update the implicit function arguments
-        according to the `updates`.
-    :note: Regarding givens: Be careful to make sure that these
-        substitutions are independent--behaviour when Var1 of one pair
-        appears in the graph leading to Var2 in another expression is
-        undefined. Replacements specified with givens are different
-        from optimizations in that Var2 is not expected to be
-        equivalent to Var1.
+    """
+    Function-constructor for graphs with shared variables.
+
+    Parameters
+    ----------
+    params : list of either Variable or Param instances
+        Function parameters, these are not allowed to be shared variables.
+    outputs : list of Variables or Out instances
+        Expressions to compute.
+    mode : string or `theano.compile.Mode` instance
+        Compilation mode.
+    updates : iterable over pairs (shared_variable, new_expression). List, tuple or dict.
+        Update the values for SharedVariable inputs according to these
+        expressions
+    givens : iterable over pairs (Var1, Var2) of Variables. List, tuple or dict.
+        The Var1 and Var2 in each pair must have the same Type. Specific
+        substitutions to make in the computation graph (Var2 replaces Var1).
+    no_default_updates : either bool or list of Variables
+        If True, do not perform any automatic update on Variables.
+        If False (default), perform them all. Else, perform automatic updates
+        on all Variables that are neither in "updates" nor in
+        "no_default_updates".
+    name : None or string
+        Attaches a name to the profiling result of this function.
+    allow_input_downcast : bool
+        True means that the values passed as inputs when calling the function
+        can be silently downcasted to fit the dtype of the corresponding
+        Variable, which may lose precision. False means that it will only be cast to a more
+        general, or precise, type. None (default) is almost like
+        False, but allows downcasting of Python float scalars to
+        floatX.
+    profile : None, True, str, or ProfileStats instance
+        Accumulate profiling information into a given ProfileStats instance.
+        None is the default, and means to use the value of config.profile.
+        If argument is `True` then a new ProfileStats instance will be used.
+        If argument is a string, a new ProfileStats instance will be created
+        with that string as its `message` attribute. This profiling object will
+        be available via self.profile.
+    on_unused_input : {'raise', 'warn', 'ignore', None}
+        What to do if a variable in the 'inputs' list is not used in the graph.
+
+    Returns
+    -------
+    theano.compile.Function
+        A callable object that will compute the outputs (given the inputs) and
+        update the implicit function arguments according to the `updates`.
+
+    Notes
+    -----
+    Regarding givens: Be careful to make sure that these substitutions are
+    independent--behaviour when Var1 of one pair appears in the graph leading
+    to Var2 in another expression is undefined. Replacements specified with
+    givens are different from optimizations in that Var2 is not expected to be
+    equivalent to Var1.
+
     """
     #
     # This function works by cloning the graph (except for the
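The `no_default_updates` rule spelled out in the docstring (True suppresses all automatic updates, False performs them all, a list excludes just those variables, and an explicit entry in `updates` always takes precedence) reduces to a small predicate. The `applies_default_update` helper below is a hypothetical sketch of that rule, operating on plain strings instead of shared variables:

```python
# Sketch of the no_default_updates rule: decide whether a shared
# variable's default update should be applied automatically.
def applies_default_update(var, updates, no_default_updates):
    if var in updates:               # an explicit update always wins
        return False
    if no_default_updates is True:   # suppress every automatic update
        return False
    if no_default_updates is False:  # default: perform them all
        return True
    # otherwise it is a list of variables to exclude
    return var not in no_default_updates


updates = {"w": "w - lr * grad"}     # toy stand-in for an updates dict
```

So with `no_default_updates=False`, a variable like a random-number generator state still gets its default update unless it already appears in `updates`.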
...
@@ -547,13 +533,17 @@ def iter_over_pairs(pairs):
     """
     Return an iterator over pairs present in the 'pairs' input.

-    :type pairs: dictionary or iterable
-    :param pairs: The pairs to iterate upon. These may be stored either as
-        (key, value) items in a dictionary, or directly as pairs in any kind of
-        iterable structure
-
-    :rtype: iterable
-    :returns: an iterable yielding pairs
+    Parameters
+    ----------
+    pairs : dictionary or iterable
+        The pairs to iterate upon. These may be stored either as (key, value)
+        items in a dictionary, or directly as pairs in any kind of iterable
+        structure.
+
+    Returns
+    -------
+    iterable
+        An iterable yielding pairs.
+
     """
     if isinstance(pairs, dict):
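The behavior of `iter_over_pairs` is fully specified by its docstring, so it can be sketched directly (using Python 3's `dict.items()` in place of six's `iteritems`):

```python
# Direct sketch of iter_over_pairs: dicts yield their (key, value)
# items; any other iterable is assumed to already contain pairs.
def iter_over_pairs(pairs):
    if isinstance(pairs, dict):
        return iter(pairs.items())
    return iter(pairs)


as_dict = sorted(iter_over_pairs({"a": 1, "b": 2}))
as_list = list(iter_over_pairs([("a", 1), ("b", 2)]))
```

Both call forms yield the same `(key, value)` pairs, which is why `pfunc` can accept `updates` and `givens` as either a dict or a list of pairs.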
theano/compile/profilemode.py — View file @ 690d3628

...
@@ -122,7 +122,10 @@ class ProfileMode(Mode):
                 profile_stats))

     def function_maker(self, i, o, m, *args, **kwargs):
-        """Return an instance of `Profiler_Maker` which init the count"""
+        """
+        Return an instance of `Profiler_Maker` which init the count.
+
+        """
         assert m is self
         return Profile_Maker(i, o, self, *args, **kwargs)
...
@@ -147,7 +150,9 @@ class ProfileMode(Mode):
         self.profile_stats = profile_stats

         def profile_thunk(i, node, th):
-            """ Profile only the execution time
+            """
+            Profile only the execution time.
+
             """
             global run_cthunk
             if hasattr(th, 'cthunk'):
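The `profile_thunk` wrapper times each node's thunk and accumulates the elapsed time per node, flooring each measurement at a tiny epsilon (the `max(dt, 1e-14)` visible in a later hunk). A self-contained sketch of that idea, with a hypothetical `make_profiled_thunk` helper in place of the real closure:

```python
import time

# Sketch of the profile_thunk idea: wrap a thunk so each call adds its
# wall-clock duration (floored at a tiny epsilon) to apply_time[node].
apply_time = {}


def make_profiled_thunk(node, thunk):
    def profiled():
        t0 = time.time()
        thunk()
        dt = time.time() - t0
        apply_time[node] = apply_time.get(node, 0.0) + max(dt, 1e-14)
    return profiled


profiled = make_profiled_thunk("node_0", lambda: sum(range(1000)))
profiled()
profiled()
```

The epsilon floor ensures that a thunk which ran faster than the clock resolution still registers a nonzero time, so later per-op summaries never divide by zero.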
...
@@ -169,7 +174,9 @@ class ProfileMode(Mode):
                 self.apply_time[node] += max(dt, 1e-14)

         def profile_thunk2(i, node, th):
-            """ Profile the execution time and the memory size.
+            """
+            Profile the execution time and the memory size.
+
             """
             global run_cthunk
             if hasattr(th, 'cthunk'):
...
@@ -211,7 +218,8 @@ class ProfileMode(Mode):
         self.fn_time = 0

     def print_summary(self, **kwargs):
-        """ Print 3 summaries that show where time is spent. The first shows
+        """
+        Print 3 summaries that show where time is spent. The first shows
         an Apply-wise summary, the second an Op-wise summary and the
         third a type-Op-wise summary.
...
@@ -235,10 +243,13 @@ class ProfileMode(Mode):
         There is an hack with the Op-wise summary. Go see it if you
         want to know more.

-        :param kwargs: They are passed to print_summary_ expanded.
-                       Currently there is n_apply_to_print,
-                       n_ops_to_print and min_memory_size that are
-                       accepted.
+        Parameters
+        ----------
+        kwargs
+            They are passed to print_summary_ expanded. Currently there is
+            n_apply_to_print, n_ops_to_print and min_memory_size that are
+            accepted.
+
         """
         compile_time = sum([ps.compile_time for ps
                             in self.profile_stats.values()])
...
@@ -280,18 +291,23 @@ class ProfileMode(Mode):
             **kwargs)

     def print_diff_summary(self, other, **kwargs):
-        """ As print_summary, but print the difference on two different
+        """
+        As print_summary, but print the difference on two different
         profile mode.

         TODO: Also we don't print the Apply-wise summary as it don't
         work for now.
         TODO: make comparaison with gpu code.

-        :param other: the other instance of ProfileMode that we want
-                      to be compared to.
-        :param kwargs: They are passed to print_summary_ expanded.
-                       Currently there is n_apply_to_print, n_ops_to_print and
-                       min_memory_size that are accepted.
+        Parameters
+        ----------
+        other
+            The other instance of ProfileMode that we want to be compared to.
+        kwargs
+            They are passed to print_summary_ expanded.
+            Currently there is n_apply_to_print, n_ops_to_print and
+            min_memory_size that are accepted.
+
         """
         def diff_dict(a_time, b_time_):
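`print_diff_summary` compares two profiles via an inner `diff_dict` helper whose body is not shown in this hunk; a plausible sketch, under the assumption that it takes the per-key difference of two timing dicts with missing keys treated as zero:

```python
# Sketch of a diff_dict-style helper: per-key difference of two timing
# dicts, treating missing keys as 0. The real helper's exact semantics
# are not visible in the hunk above; this is an illustrative assumption.
def diff_dict(a_time, b_time):
    keys = set(a_time) | set(b_time)
    return dict((k, a_time.get(k, 0.0) - b_time.get(k, 0.0)) for k in keys)


delta = diff_dict({"dot": 2.5, "add": 0.5}, {"dot": 2.0})
```

A positive entry means the first profile spent more time on that key than the second, which is the comparison `print_diff_summary` reports.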
...
@@ -343,13 +359,18 @@ class ProfileMode(Mode):
                        min_memory_size=config.ProfileMode.min_memory_size,
                        ):
         """
-        do the actual printing of print_summary and print_diff_summary.
+        Do the actual printing of print_summary and print_diff_summary.

-        :param n_apply_to_print: the number of apply to print. Default 15.
-        :param n_ops_to_print: the number of ops to print. Default 20.
-        :param min_memory_size: Don't print memory profile of apply
-            whose outputs memory size is lower then that.
+        Parameters
+        ----------
+        n_apply_to_print
+            The number of apply to print. Default 15.
+        n_ops_to_print
+            The number of ops to print. Default 20.
+        min_memory_size
+            Don't print memory profile of apply whose outputs memory size is
+            lower than that.
+
         """
         print("ProfileMode is deprecated! Use the new profiler.")
...
@@ -700,9 +721,9 @@ Test them first, as they are not guaranteed to always provide a speedup.""")
         """
         Create a new instance of this Mode.

         Keyword arguments can be provided for the linker,
-        in which case its
-        `clone` method will be called with these arguments.
+        in which case its `clone` method will be called with these
+        arguments.
         """
         new_linker = self.linker.clone(**link_kwargs)
         new_optimizer = self.provided_optimizer
...
@@ -727,10 +748,11 @@ prof_mode_instance_to_print = [predefined_modes["PROFILE_MODE"]]
 def atexit_print_default_profile_mode():
-    """Print the summary of the predefined mode ProfileMode if used.
+    """
+    Print the summary of the predefined mode ProfileMode if used.

-    This all to have the summary printed at exit when
-    config.mode=ProfileMode
+    This all to have the summary printed at exit when config.mode=ProfileMode.
+
     """
     for prof_mode in prof_mode_instance_to_print:
         if prof_mode.local_time > 0:
...
theano/compile/profiling.py — View file @ 690d3628

-"""ProfileStats object for runtime and memory profiling.
+"""
+ProfileStats object for runtime and memory profiling.

 """
 from __future__ import print_function

 #
...
@@ -76,7 +78,9 @@ AddConfigVar('profiling.destination',
 def _atexit_print_fn():
-    """Print ProfileStat objects in _atexit_print_list to _atexit_print_file
+    """
+    Print ProfileStat objects in _atexit_print_list to _atexit_print_file.
+
     """
     to_sum = []
...
@@ -135,6 +139,16 @@ class ProfileStats(object):
     """
     Object to store runtime and memory profiling information for all of
     Theano's operations: compilation, optimization, execution.
+
+    Parameters
+    ----------
+    atexit_print : bool
+        True means that this object will be printed to stderr (using
+        .summary()) at the end of the program.
+    **kwargs : misc initializers
+        These should (but need not) match the names of the class vars declared
+        in this class.
+
     """
     #
...
@@ -212,12 +226,6 @@ class ProfileStats(object):
     # param is called flag_time_thunks because most other attributes with time
     # in the name are times *of* something, rather than configuration flags.
     def __init__(self, atexit_print=True, flag_time_thunks=None, **kwargs):
-        """
-        atexit_print - bool. True means that this object will be printed to
-            stderr (using .summary()) at the end of the program.
-        **kwargs - misc initializers. These should (but need not) match the
-            names of the class vars declared in this class.
-        """
         if (hasattr(theano, 'sandbox') and
                 hasattr(theano.sandbox, 'cuda') and
                 theano.sandbox.cuda.cuda_enabled):
...
@@ -250,7 +258,10 @@ class ProfileStats(object):
         _atexit_registered = True
 
     def class_time(self):
-        """dict op -> total time on thunks"""
+        """
+        dict op -> total time on thunks
+
+        """
         # timing is stored by node, we compute timing by class on demand
         rval = {}
         for node, t in iteritems(self.apply_time):
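The converted docstring above describes `class_time()` as building a dict that maps each op class to its total thunk time, aggregated on demand from per-node timings. A minimal standalone sketch of that aggregation; `FakeOp` and `FakeNode` are hypothetical stand-ins, not Theano classes:

```python
# Hypothetical stand-ins for a Theano op and apply node; only the
# op's class identity matters for the aggregation below.
class FakeOp(object):
    pass


class FakeNode(object):
    def __init__(self, op):
        self.op = op


def class_time(apply_time):
    """Aggregate {node: thunk time} into {op class: total time}."""
    rval = {}
    for node, t in apply_time.items():
        typ = type(node.op)
        rval[typ] = rval.get(typ, 0.0) + t
    return rval


# Two nodes sharing the same op class fold into a single entry.
apply_time = {FakeNode(FakeOp()): 0.5, FakeNode(FakeOp()): 0.25}
totals = class_time(apply_time)
```

The same pattern (a per-node dict collapsed by a key function on demand) underlies the `class_callcount`, `class_nodes`, and `op_*` variants below.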
...
@@ -260,7 +271,10 @@ class ProfileStats(object):
         return rval
 
     def class_callcount(self):
-        """dict op -> total number of thunk calls"""
+        """
+        dict op -> total number of thunk calls
+
+        """
         # timing is stored by node, we compute timing by class on demand
         rval = {}
         for node, count in iteritems(self.apply_callcount):
...
@@ -270,7 +284,10 @@ class ProfileStats(object):
         return rval
 
     def class_nodes(self):
-        """dict op -> total number of nodes"""
+        """
+        dict op -> total number of nodes
+
+        """
         # timing is stored by node, we compute timing by class on demand
         rval = {}
         for node, count in iteritems(self.apply_callcount):
...
@@ -280,7 +297,10 @@ class ProfileStats(object):
         return rval
 
     def class_impl(self):
-        """dict op -> total number of nodes"""
+        """
+        dict op -> total number of nodes
+
+        """
         # timing is stored by node, we compute timing by class on demand
         rval = {}
         for node in self.apply_callcount:
...
@@ -295,7 +315,10 @@ class ProfileStats(object):
         return rval
 
     def op_time(self):
-        """dict op -> total time on thunks"""
+        """
+        dict op -> total time on thunks
+
+        """
         # timing is stored by node, we compute timing by Op on demand
         rval = {}
         for node, t in iteritems(self.apply_time):
...
@@ -304,7 +327,10 @@ class ProfileStats(object):
         return rval
 
     def fill_node_total_time(self, node, total_times):
-        """node -> fill total time icluding its parents (returns nothing)"""
+        """
+        node -> fill total time icluding its parents (returns nothing)
+
+        """
         # timing is stored by node, we compute total time on demand
         total = self.apply_time[node]
         for parent in node.get_parents():
...
@@ -315,7 +341,10 @@ class ProfileStats(object):
         total_times[node] = total
 
     def compute_total_times(self):
-        """dict op -> total time icluding the time for parents"""
+        """
+        dict op -> total time icluding the time for parents
+
+        """
         rval = {}
         for node in self.apply_time:
             if node not in rval:
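`fill_node_total_time()` and `compute_total_times()`, whose docstrings are reformatted above, charge each node with its own thunk time plus the (memoized) totals of all its parents. A standalone sketch over a hypothetical three-node chain, not Theano's real apply nodes:

```python
# Memoized accumulation of a node's time plus all of its ancestors',
# mirroring the fill_node_total_time docstring (toy graph).
def fill_node_total_time(node, apply_time, parents, total_times):
    total = apply_time[node]
    for parent in parents.get(node, ()):
        if parent not in total_times:
            fill_node_total_time(parent, apply_time, parents, total_times)
        total += total_times[parent]
    total_times[node] = total


apply_time = {'a': 1.0, 'b': 2.0, 'c': 4.0}
parents = {'b': ['a'], 'c': ['b']}  # chain: a -> b -> c
total_times = {}
for node in apply_time:
    if node not in total_times:
        fill_node_total_time(node, apply_time, parents, total_times)
```

With the memo dict, each node's total is computed exactly once even when it is a shared ancestor of several nodes.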
...
@@ -323,7 +352,10 @@ class ProfileStats(object):
         return rval
 
     def op_callcount(self):
-        """dict op -> total number of thunk calls"""
+        """
+        dict op -> total number of thunk calls
+
+        """
         # timing is stored by node, we compute timing by Op on demand
         rval = {}
         for node, count in iteritems(self.apply_callcount):
...
@@ -332,7 +364,10 @@ class ProfileStats(object):
         return rval
 
     def op_nodes(self):
-        """dict op -> total number of nodes"""
+        """
+        dict op -> total number of nodes
+
+        """
         # timing is stored by node, we compute timing by Op on demand
         rval = {}
         for node, count in iteritems(self.apply_callcount):
...
@@ -341,7 +376,10 @@ class ProfileStats(object):
         return rval
 
     def op_impl(self):
-        """dict op -> 'C' or 'Py' depending how the op is implemented"""
+        """
+        dict op -> 'C' or 'Py' depending how the op is implemented
+
+        """
         # timing is stored by node, we compute timing by Op on demand
         rval = {}
         for node in self.apply_callcount:
...
@@ -711,21 +749,23 @@ class ProfileStats(object):
         def count_running_memory(order, fgraph, nodes_mem):
             """
-            Calculate memory with specific node order
+            Calculate memory with specific node order.
+
             Return a list including the following values
             1. node_memory_size
                 Sum of the size of all variables that actually allocate
-                memory (excluding views, and inplace);
+                memory (excluding views, and inplace).
             2. running_memory_size
-                The memory allocated after the current apply node
+                The memory allocated after the current apply node.
             3. running_max_memory_size
-                The maximum of running_memory_size during the function
+                The maximum of running_memory_size during the function.
             4. node_memory_saved_by_view
                 The sum of memory saved by returning view instead of new
-                allocation
+                allocation.
             5. node_memory_saved_by_inplace
                 The sum of memory saved by reusing the input instead of
-                new allocation
+                new allocation.
+
             """
             from theano.sandbox.cuda import CudaNdarrayType
             # Initial Mem info values [CPU, GPU]
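The running/peak quantities listed in the docstring above come from straightforward bookkeeping over an execution order: add each fresh allocation, release buffers once they are dead, and track the high-water mark. A toy sketch with invented sizes (real Theano additionally accounts for views and inplace reuse separately):

```python
def running_memory(order, out_size, frees_after):
    """Walk a node order and return the peak of running memory.

    order: list of node names; out_size[node]: bytes its output
    allocates; frees_after[node]: sizes releasable once node ran.
    (All names/sizes here are hypothetical illustration data.)
    """
    running = 0
    peak = 0
    for node in order:
        running += out_size[node]           # new allocation
        peak = max(peak, running)           # running_max_memory_size
        running -= sum(frees_after[node])   # inputs no longer needed
    return peak


peak = running_memory(
    order=['x', 'y', 'z'],
    out_size={'x': 100, 'y': 50, 'z': 10},
    frees_after={'x': [], 'y': [100], 'z': [50]},
)
```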
...
@@ -874,10 +914,14 @@ class ProfileStats(object):
         def min_memory_generator(executable_nodes, viewed_by, view_of):
             """
-            Generate all valid node order from node_list
-            and compute its memory peak.
+            Generate all valid node order from node_list and compute its
+            memory peak.
 
-            :param executable_nodes: Set of executable nodes
+            Parameters
+            ----------
+            executable_nodes
+                Set of executable nodes.
+
             """
             global mem_count, mem_bound, max_mem_count
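As the docstring above says, `min_memory_generator` enumerates every dependency-respecting node order and keeps the one with the lowest memory peak. A brute-force sketch of that search on a hypothetical three-node graph, using a crude liveness model rather than Theano's real accounting:

```python
from itertools import permutations


def valid_orders(nodes, deps):
    """Yield every ordering in which each node follows all its deps."""
    for order in permutations(nodes):
        pos = {n: i for i, n in enumerate(order)}
        if all(pos[d] < pos[n] for n in nodes for d in deps.get(n, ())):
            yield list(order)


def peak_for_order(order, deps, size):
    """Memory peak of one order: a node's output stays live until its
    last consumer has run (no consumer -> freed immediately)."""
    last_use = {}
    for n in order:
        consumers = [m for m in order if n in deps.get(m, ())]
        last_use[n] = (max(order.index(c) for c in consumers)
                       if consumers else order.index(n))
    running = peak = 0
    for i, n in enumerate(order):
        running += size[n]
        peak = max(peak, running)
        running -= sum(size[m] for m in order if last_use[m] == i)
    return peak


def min_peak(nodes, deps, size):
    return min(peak_for_order(o, deps, size)
               for o in valid_orders(nodes, deps))


# q consumes p's output; r is independent and large, so running r
# between p and q would hold both big buffers at once.
best = min_peak(['p', 'q', 'r'], {'q': ['p']},
                {'p': 100, 'q': 1, 'r': 100})
```

The factorial enumeration is only viable for tiny graphs, which is why the real code guards this mode behind a profiling flag.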
...
@@ -1255,9 +1299,13 @@ if False: # old code still to be ported from ProfileMode
         """
         Print a readable summary of the stats.
 
-        param: n_apply_to_print the number of apply to print. Default 15.
+        Parameters
+        ----------
+        n_apply_to_print
+            The number of apply to print. Default 15.
+        n_ops_to_print
+            The number of ops to print. Default 20.
 
-        param: n_ops_to_print the number of ops to print. Default 20.
         """
         local_time = sum(self.apply_time.values())
...
@@ -1483,11 +1531,13 @@ if False: # old code still to be ported from ProfileMode
         There is a hack with the Op-wise summary. Go see it if you want to know
         more.
 
-        :param n_apply_to_print: the number of apply to print. Default 15, or
-        n_ops_to_print flag.
+        Parameters
+        ----------
+        n_apply_to_print
+            The number of apply to print. Default 15, or n_ops_to_print flag.
+        n_ops_to_print
+            The number of ops to print. Default 20, or n_apply_to_print flag.
 
-        :param n_ops_to_print: the number of ops to print. Default 20, or
-        n_apply_to_print flag.
         """
         fct_call_time = self.mode.fct_call_time
         fct_call = self.mode.fct_call
...
@@ -1517,12 +1567,15 @@ if False: # old code still to be ported from ProfileMode
         now.
         TODO: make comparaison with gpu code.
 
-        :param other: the other instance of ProfileMode that we want to be
-        compared to.
+        Parameters
+        ----------
+        other
+            The other instance of ProfileMode that we want to be compared to.
+        n_apply_to_print
+            The number of apply to print. Default 15.
+        n_ops_to_print
+            The number of ops to print. Default 20.
 
-        :param n_apply_to_print: the number of apply to print. Default 15.
-
-        :param n_ops_to_print: the number of ops to print. Default 20.
         """
         def diff_dict(a_time, b_time_):
...
theano/compile/sharedvalue.py  (View file @ 690d3628)

-"""Provide a simple user friendly API to Theano-managed memory"""
+"""
+Provide a simple user friendly API to Theano-managed memory.
+"""
 # Standard imports
 import copy
 import logging
...
@@ -18,6 +21,32 @@ class SharedVariable(Variable):
     Variable that is (defaults to being) shared between functions that
     it appears in.
 
+    Parameters
+    ----------
+    name : str
+        The name for this variable (see `Variable`).
+    type : str
+        The type for this variable (see `Variable`).
+    value
+        A value to associate with this variable (a new container will be
+        created).
+    strict
+        True : assignments to .value will not be cast or copied, so they must
+        have the correct type.
+    allow_downcast
+        Only applies if `strict` is False.
+        True : allow assigned value to lose precision when cast during
+        assignment.
+        False : never allow precision loss.
+        None : only allow downcasting of a Python float to a scalar floatX.
+    container
+        The container to use for this variable. Illegal to pass this as well as
+        a value.
+
+    Notes
+    -----
+    For more user-friendly constructor, see `shared`.
+
     """
 
     # Container object
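The `strict`/`allow_downcast` rules spelled out in the new docstring can be illustrated with a toy assignment filter for an int-typed variable. This is a simplified model for illustration only, not Theano's real type filtering:

```python
def filter_value(value, strict=False, allow_downcast=None):
    """Toy model of assignment filtering for an int-typed shared
    variable, following the strict/allow_downcast rules above."""
    if isinstance(value, int):
        return value                      # already the right type
    if strict:
        # strict=True: assignments are never cast or copied
        raise TypeError('strict=True: value must already be an int')
    if isinstance(value, float):
        if allow_downcast:
            return int(value)             # precision may be lost
        raise TypeError('refusing to downcast float to int')
    raise TypeError('cannot convert %r' % (value,))
```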
...
@@ -36,29 +65,6 @@ class SharedVariable(Variable):
 
     def __init__(self, name, type, value, strict,
                  allow_downcast=None, container=None):
-        """
-        :param name: The name for this variable (see `Variable`).
-
-        :param type: The type for this variable (see `Variable`).
-
-        :param value: A value to associate with this variable (a new
-        container will be created).
-
-        :param strict: True -> assignments to .value will not be cast
-        or copied, so they must have the correct type.
-
-        :param allow_downcast: Only applies if `strict` is False.
-        True -> allow assigned value to lose precision when cast
-        during assignment.
-        False -> never allow precision loss.
-        None -> only allow downcasting of a Python float to a scalar floatX.
-
-        :param container: The container to use for this
-        variable. Illegal to pass this as well as a value.
-
-        :note: For more user-friendly constructor, see `shared`
-        """
         super(SharedVariable, self).__init__(type=type, name=name,
                                              owner=None, index=None)
...
@@ -79,18 +85,21 @@ class SharedVariable(Variable):
                                  allow_downcast=allow_downcast)
 
     def get_value(self, borrow=False, return_internal_type=False):
-        """Get the non-symbolic value associated with this SharedVariable.
+        """
+        Get the non-symbolic value associated with this SharedVariable.
 
-        :param borrow: True to permit returning of an object aliased
-        to internal memory.
-
-        :param return_internal_type: True to permit the returning of
-        an arbitrary type object used internally to store the
-        shared variable.
+        Parameters
+        ----------
+        borrow : bool
+            True to permit returning of an object aliased to internal memory.
+        return_internal_type : bool
+            True to permit the returning of an arbitrary type object used
+            internally to store the shared variable.
 
         Only with borrow=False and return_internal_type=True does this function
         guarantee that you actually get the internal object.
         But in that case, you may get different return types when using
         different compute devices.
+
         """
         if borrow:
@@ -99,14 +108,18 @@ class SharedVariable(Variable):
...
@@ -99,14 +108,18 @@ class SharedVariable(Variable):
return
copy
.
deepcopy
(
self
.
container
.
value
)
return
copy
.
deepcopy
(
self
.
container
.
value
)
def
set_value
(
self
,
new_value
,
borrow
=
False
):
def
set_value
(
self
,
new_value
,
borrow
=
False
):
"""Set the non-symbolic value associated with this SharedVariable.
"""
Set the non-symbolic value associated with this SharedVariable.
:param borrow:
Parameters
----------
borrow : bool
True to use the new_value directly, potentially creating problems
True to use the new_value directly, potentially creating problems
related to aliased memory.
related to aliased memory.
Changes to this value will be visible to all functions using
Changes to this value will be visible to all functions using
this SharedVariable.
this SharedVariable.
"""
"""
if
borrow
:
if
borrow
:
self
.
container
.
value
=
new_value
self
.
container
.
value
=
new_value
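The borrow semantics described for `get_value`/`set_value` come down to aliasing versus deep-copying the internal container's value. A standalone sketch; `Container` and the module-level helpers are hypothetical stand-ins for the real methods:

```python
import copy


class Container(object):
    """Hypothetical stand-in for the internal storage container."""
    def __init__(self, value):
        self.value = value


def get_value(container, borrow=False):
    # borrow=True returns the internal object itself (aliased memory)
    return container.value if borrow else copy.deepcopy(container.value)


def set_value(container, new_value, borrow=False):
    # borrow=True stores new_value directly instead of a private copy
    container.value = new_value if borrow else copy.deepcopy(new_value)


c = Container([1, 2, 3])
aliased = get_value(c, borrow=True)   # shares memory with the container
copied = get_value(c, borrow=False)   # safe private copy
aliased.append(4)                     # mutation is visible in the container
```

This is why borrow=True is faster but unsafe: any later mutation of the borrowed object silently changes what every function sees.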
...
@@ -114,15 +127,19 @@ class SharedVariable(Variable):
             self.container.value = copy.deepcopy(new_value)
 
     def zero(self, borrow=False):
-        """Set the values of a shared variable to 0.
+        """
+        Set the values of a shared variable to 0.
 
-        :param borrow:
-        True to modify the value of a shared variable directly by using
-        its previous value. Potentially this can cause problems
-        regarding to the aliased memory.
+        Parameters
+        ----------
+        borrow : bbol
+            True to modify the value of a shared variable directly by using
+            its previous value. Potentially this can cause problems regarding
+            to the aliased memory.
 
         Changes done with this function will be visible to all functions using
         this SharedVariable.
+
         """
         if borrow:
             self.container.value[...] = 0
...
@@ -183,7 +200,8 @@ def shared_constructor(ctor, remove=False):
 
 def shared(value, name=None, strict=False, allow_downcast=None, **kwargs):
-    """Return a SharedVariable Variable, initialized with a copy or
+    """
+    Return a SharedVariable Variable, initialized with a copy or
     reference of `value`.
 
     This function iterates over
...
@@ -196,23 +214,25 @@ def shared(value, name=None, strict=False, allow_downcast=None, **kwargs):
     ``theano.shared`` is a shortcut to this function.
 
-    :note: By passing kwargs, you effectively limit the set of
-    potential constructors to those that can accept those kwargs.
+    Notes
+    -----
+    By passing kwargs, you effectively limit the set of potential constructors
+    to those that can accept those kwargs.
 
-    :note:
-    Some shared variable have ``borrow`` as extra kwargs.
-    `See <http://deeplearning.net/software/theano/tutorial/aliasing.\
-    html#borrowing-when-creating-shared-variables>`_ for detail.
+    Some shared variable have ``borrow`` as extra kwargs.
+    `See <http://deeplearning.net/software/theano/tutorial/aliasing.\
+    html#borrowing-when-creating-shared-variables>`_ for details.
 
-    :note: Some shared variable have ``broadcastable`` as extra kwargs.
-    As shared variable shapes can change, all dimensions default
-    to not being broadcastable, even if ``value`` has a shape of 1
-    along some dimension. This parameter allows you to create
-    for example a `row` or `column` 2d
-    tensor.
+    Some shared variable have ``broadcastable`` as extra kwargs. As shared
+    variable shapes can change, all dimensions default to not being
+    broadcastable, even if ``value`` has a shape of 1 along some dimension.
+    This parameter allows you to create for example a `row` or `column` 2d
+    tensor.
 
     .. attribute:: constructors
 
     A list of shared variable constructors that will be tried in reverse
     order.
     """
...
@@ -251,6 +271,9 @@ shared.constructors = []
 
 @shared_constructor
 def generic_constructor(value, name=None, strict=False, allow_downcast=None):
-    """SharedVariable Constructor"""
+    """
+    SharedVariable Constructor.
+    """
     return SharedVariable(type=generic, value=value, name=name, strict=strict,
                           allow_downcast=allow_downcast)