testgroup / pytensor · Commit 214ef4cf

Rename GlobalOptimizer to GraphRewriter

Authored Jul 14, 2022 by Brandon T. Willard
Committed by Brandon T. Willard on Aug 17, 2022
Parent: 7ce7b0c2

Showing 9 changed files with 48 additions and 43 deletions (+48 −43)
aesara/compile/mode.py              +5  −5
aesara/graph/opt.py                 +16 −11
aesara/graph/optdb.py               +6  −6
aesara/ifelse.py                    +2  −2
aesara/scan/opt.py                  +3  −3
aesara/tensor/basic_opt.py          +5  −5
aesara/tensor/blas.py               +2  −2
doc/extending/graph_rewriting.rst   +8  −8
tests/graph/test_optdb.py           +1  −1
aesara/compile/mode.py

@@ -12,7 +12,7 @@ from aesara.configdefaults import config
 from aesara.graph.destroyhandler import DestroyHandler
 from aesara.graph.opt import (
     CheckStackTraceOptimization,
-    GlobalOptimizer,
+    GraphRewriter,
     MergeOptimizer,
     NavigatorOptimizer,
 )
@@ -106,13 +106,13 @@ predefined_optimizers = {
 def register_optimizer(name, opt):
-    """Add a `GlobalOptimizer` which can be referred to by `name` in `Mode`."""
+    """Add a `GraphRewriter` which can be referred to by `name` in `Mode`."""
     if name in predefined_optimizers:
         raise ValueError(f"Optimizer name already taken: {name}")
     predefined_optimizers[name] = opt


-class AddDestroyHandler(GlobalOptimizer):
+class AddDestroyHandler(GraphRewriter):
     """
     This optimizer performs two important functions:
@@ -145,7 +145,7 @@ class AddDestroyHandler(GlobalOptimizer):
         fgraph.attach_feature(DestroyHandler())


-class AddFeatureOptimizer(GlobalOptimizer):
+class AddFeatureOptimizer(GraphRewriter):
     """
     This optimizer adds a provided feature to the function graph.
     """
@@ -161,7 +161,7 @@ class AddFeatureOptimizer(GlobalOptimizer):
         pass


-class PrintCurrentFunctionGraph(GlobalOptimizer):
+class PrintCurrentFunctionGraph(GraphRewriter):
     """
     This optimizer is for debugging.
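The `register_optimizer` hunk above is a plain name-keyed registry that refuses duplicate names. A standalone sketch of that pattern, stripped of Aesara specifics (the registry dict and the registered value here are stand-ins, not the real `Mode` machinery):

```python
# Minimal name-keyed registry, mirroring the register_optimizer
# pattern in aesara/compile/mode.py above.
predefined_optimizers = {}


def register_optimizer(name, opt):
    """Register `opt` under `name`, refusing duplicate names."""
    if name in predefined_optimizers:
        raise ValueError(f"Optimizer name already taken: {name}")
    predefined_optimizers[name] = opt


# Registering a name twice raises, so lookups stay unambiguous.
register_optimizer("merge", object())
```

The duplicate check matters because these names are later used to look optimizers up by string; silently overwriting an entry would change compilation behavior at a distance.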
aesara/graph/opt.py

@@ -83,7 +83,7 @@ class Rewriter(abc.ABC):
         return id(self)


-class GlobalOptimizer(Rewriter):
+class GraphRewriter(Rewriter):
     """A optimizer that can be applied to a `FunctionGraph` in order to transform it.

     It can represent an optimization or, in general, any kind of transformation
@@ -96,7 +96,7 @@ class GlobalOptimizer(Rewriter):
         """Apply the optimization to a `FunctionGraph`.

         It may use all the methods defined by the `FunctionGraph`. If the
-        `GlobalOptimizer` needs to use a certain tool, such as an
+        `GraphRewriter` needs to use a certain tool, such as an
         `InstanceFinder`, it can do so in its `add_requirements` method.
         """
@@ -185,8 +185,8 @@ class LocalOptimizer(Rewriter):
         print(f"{' ' * level}{self.__class__.__name__} id={id(self)}", file=stream)


-class FromFunctionOptimizer(GlobalOptimizer):
-    """A `GlobalOptimizer` constructed from a given function."""
+class FromFunctionOptimizer(GraphRewriter):
+    """A `GraphRewriter` constructed from a given function."""

     def __init__(self, fn, requirements=()):
         self.fn = fn
@@ -225,8 +225,8 @@ def inplace_optimizer(f):
     return rval


-class SeqOptimizer(GlobalOptimizer, UserList):
-    """A `GlobalOptimizer` that applies a list of optimizers sequentially."""
+class SeqOptimizer(GraphRewriter, UserList):
+    """A `GraphRewriter` that applies a list of optimizers sequentially."""

     @staticmethod
     def warn(exc, self, optimizer):
@@ -258,7 +258,7 @@ class SeqOptimizer(GlobalOptimizer, UserList):
         self.failure_callback = failure_callback

     def apply(self, fgraph):
-        """Applies each `GlobalOptimizer` in ``self.data`` to `fgraph`."""
+        """Applies each `GraphRewriter` in ``self.data`` to `fgraph`."""
         l = []
         if fgraph.profile:
             validate_before = fgraph.profile.validate_time
@@ -670,7 +670,7 @@ class MergeFeature(Feature):
             self.noinput_nodes.add(node)


-class MergeOptimizer(GlobalOptimizer):
+class MergeOptimizer(GraphRewriter):
     r"""Merges parts of the graph that are identical and redundant.

     The basic principle is that if two `Apply`\s have `Op`\s that compare equal, and
@@ -1718,7 +1718,7 @@ class Updater(Feature):
         self.chin = None


-class NavigatorOptimizer(GlobalOptimizer):
+class NavigatorOptimizer(GraphRewriter):
     r"""An optimizer that applies a `LocalOptimizer` with considerations for the new nodes it creates.
@@ -2578,7 +2578,7 @@ class EquilibriumOptimizer(NavigatorOptimizer):
             + list(opt.final_optimizers)
             + list(opt.cleanup_optimizers)
         )
-        if o.print_profile.__code__ is not GlobalOptimizer.print_profile.__code__
+        if o.print_profile.__code__ is not GraphRewriter.print_profile.__code__
     ]
     if not gf_opts:
         return
@@ -3043,7 +3043,7 @@ class CheckStackTraceFeature(Feature):
         )


-class CheckStackTraceOptimization(GlobalOptimizer):
+class CheckStackTraceOptimization(GraphRewriter):
     """Optimizer that serves to add `CheckStackTraceOptimization` as a feature."""

     def add_requirements(self, fgraph):
@@ -3060,6 +3060,11 @@ DEPRECATED_NAMES = [
         "`LocalMetaOptimizerSkipAssertionError` is deprecated: use `MetaNodeRewriterSkip` instead.",
         MetaNodeRewriterSkip,
     ),
+    (
+        "GlobalOptimizer",
+        "`GlobalOptimizer` is deprecated: use `GraphRewriter` instead.",
+        GraphRewriter,
+    ),
 ]
aesara/graph/optdb.py

@@ -11,14 +11,14 @@ from aesara.misc.ordered_set import OrderedSet
 from aesara.utils import DefaultOrderedDict

-OptimizersType = Union[aesara_opt.GlobalOptimizer, aesara_opt.LocalOptimizer]
+OptimizersType = Union[aesara_opt.GraphRewriter, aesara_opt.LocalOptimizer]


 class OptimizationDatabase:
-    """A class that represents a collection/database of optimizations.
+    r"""A class that represents a collection/database of optimizations.

     These databases are used to logically organize collections of optimizers
-    (i.e. ``GlobalOptimizer``s and ``LocalOptimizer``).
+    (i.e. `GraphRewriter`\s and `LocalOptimizer`).
     """

     def __init__(self):
@@ -61,7 +61,7 @@ class OptimizationDatabase:
             optimizer,
             (
                 OptimizationDatabase,
-                aesara_opt.GlobalOptimizer,
+                aesara_opt.GraphRewriter,
                 aesara_opt.LocalOptimizer,
             ),
         ):
@@ -311,7 +311,7 @@ class EquilibriumDB(OptimizationDatabase):
     Notes
     -----
-    We can use `LocalOptimizer` and `GlobalOptimizer` since `EquilibriumOptimizer`
+    We can use `LocalOptimizer` and `GraphRewriter` since `EquilibriumOptimizer`
     supports both.

     It is probably not a good idea to have ignore_newtrees=False and
@@ -506,7 +506,7 @@ class LocalGroupDB(SequenceDB):
 class TopoDB(OptimizationDatabase):
-    """Generate a `GlobalOptimizer` of type TopoOptimizer."""
+    """Generate a `GraphRewriter` of type TopoOptimizer."""

     def __init__(
         self, db, order="in_to_out", ignore_newtrees=False, failure_callback=None
aesara/ifelse.py

@@ -22,7 +22,7 @@ from aesara.compile import optdb
 from aesara.configdefaults import config
 from aesara.graph.basic import Apply, Variable, clone_replace, is_in_ancestors
 from aesara.graph.op import _NoPythonOp
-from aesara.graph.opt import GlobalOptimizer, in2out, local_optimizer
+from aesara.graph.opt import GraphRewriter, in2out, local_optimizer
 from aesara.graph.type import HasDataType, HasShape
 from aesara.tensor.shape import Reshape, Shape, SpecifyShape, Unbroadcast
@@ -583,7 +583,7 @@ def cond_merge_ifs_false(fgraph, node):
     return op(*old_ins, return_list=True)


-class CondMerge(GlobalOptimizer):
+class CondMerge(GraphRewriter):
     """Graph Optimizer that merges different cond ops"""

     def add_requirements(self, fgraph):
aesara/scan/opt.py

@@ -28,7 +28,7 @@ from aesara.graph.destroyhandler import DestroyHandler
 from aesara.graph.features import ReplaceValidate
 from aesara.graph.fg import FunctionGraph
 from aesara.graph.op import compute_test_value
-from aesara.graph.opt import GlobalOptimizer, in2out, local_optimizer
+from aesara.graph.opt import GraphRewriter, in2out, local_optimizer
 from aesara.graph.optdb import EquilibriumDB, SequenceDB
 from aesara.graph.type import HasShape
 from aesara.graph.utils import InconsistencyError
@@ -919,7 +919,7 @@ def push_out_add_scan(fgraph, node):
     return False


-class ScanInplaceOptimizer(GlobalOptimizer):
+class ScanInplaceOptimizer(GraphRewriter):
     """Make `Scan`s perform in-place.

     This optimization attempts to make `Scan` compute its recurrent outputs inplace
@@ -1658,7 +1658,7 @@ def save_mem_new_scan(fgraph, node):
     return False


-class ScanMerge(GlobalOptimizer):
+class ScanMerge(GraphRewriter):
     r"""Graph optimizer that merges different scan ops.

     This optimization attempts to fuse distinct `Scan` `Op`s into a single `Scan` `Op`
aesara/tensor/basic_opt.py

@@ -27,7 +27,7 @@ from aesara.graph.features import AlreadyThere, Feature, ReplaceValidate
 from aesara.graph.fg import FunctionGraph
 from aesara.graph.op import compute_test_value, get_test_value
 from aesara.graph.opt import (
-    GlobalOptimizer,
+    GraphRewriter,
     OpRemove,
     check_chain,
     copy_stack_trace,
@@ -162,7 +162,7 @@ def broadcast_like(value, template, fgraph, dtype=None):
     return rval


-class InplaceElemwiseOptimizer(GlobalOptimizer):
+class InplaceElemwiseOptimizer(GraphRewriter):
     r"""
     This is parameterized so that it works for `Elemwise` `Op`\s.
     """
@@ -1443,7 +1443,7 @@ class ShapeFeature(Feature):
         return type(self)()


-class ShapeOptimizer(GlobalOptimizer):
+class ShapeOptimizer(GraphRewriter):
     """Optimizer that adds `ShapeFeature` as a feature."""

     def add_requirements(self, fgraph):
@@ -1453,7 +1453,7 @@ class ShapeOptimizer(GlobalOptimizer):
         pass


-class UnShapeOptimizer(GlobalOptimizer):
+class UnShapeOptimizer(GraphRewriter):
     """Optimizer that removes `ShapeFeature` as a feature."""

     def apply(self, fgraph):
@@ -3085,7 +3085,7 @@ def elemwise_max_input_fct(node):
 local_elemwise_fusion = local_elemwise_fusion_op(Elemwise, elemwise_max_input_fct)


-class FusionOptimizer(GlobalOptimizer):
+class FusionOptimizer(GraphRewriter):
     """Graph optimizer that simply runs local fusion operations.

     TODO: This is basically a `EquilibriumOptimizer`; we should just use that.
aesara/tensor/blas.py

@@ -147,7 +147,7 @@ from aesara.graph.features import ReplacementDidNotRemoveError, ReplaceValidate
 from aesara.graph.op import Op
 from aesara.graph.opt import (
     EquilibriumOptimizer,
-    GlobalOptimizer,
+    GraphRewriter,
     copy_stack_trace,
     in2out,
     local_optimizer,
@@ -1496,7 +1496,7 @@ def _gemm_from_node2(fgraph, node):
     return None, t1 - t0, 0, 0


-class GemmOptimizer(GlobalOptimizer):
+class GemmOptimizer(GraphRewriter):
     """Graph optimizer for inserting Gemm operations."""

     def __init__(self):
doc/extending/graph_rewriting.rst

@@ -39,10 +39,10 @@ we want to define are local.
 .. optimizer:

-Global optimization
--------------------
+Graph Rewriting
+---------------

-.. class:: GlobalOptimizer
+.. class:: GraphRewriter

 .. method:: apply(fgraph)
@@ -54,12 +54,12 @@ Global optimization
     This method takes a :class:`FunctionGraph` object and adds :ref:`features
     <libdoc_graph_fgraphfeature>` to it. These features are "plugins" that are needed
-    for the :meth:`GlobalOptimizer.apply` method to do its job properly.
+    for the :meth:`GraphRewriter.apply` method to do its job properly.

 .. method:: optimize(fgraph)

     This is the interface function called by Aesara. It calls
-    :meth:`GlobalOptimizer.apply` by default.
+    :meth:`GraphRewriter.apply` by default.

 Local optimization
@@ -101,10 +101,10 @@ simplification described above:
 .. testcode::

     import aesara
-    from aesara.graph.opt import GlobalOptimizer
+    from aesara.graph.opt import GraphRewriter
     from aesara.graph.features import ReplaceValidate

-    class Simplify(GlobalOptimizer):
+    class Simplify(GraphRewriter):
         def add_requirements(self, fgraph):
             fgraph.attach_feature(ReplaceValidate())
@@ -136,7 +136,7 @@ another while respecting certain validation constraints. As an
 exercise, try to rewrite :class:`Simplify` using :class:`NodeFinder`. (Hint: you
 want to use the method it publishes instead of the call to toposort)

-Then, in :meth:`GlobalOptimizer.apply` we do the actual job of simplification. We start by
+Then, in :meth:`GraphRewriter.apply` we do the actual job of simplification. We start by
 iterating through the graph in topological order. For each node
 encountered, we check if it's a ``div`` node. If not, we have nothing
 to do here. If so, we put in ``x``, ``y`` and ``z`` the numerator,
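The documentation hunks above describe the rewriter interface: `add_requirements(fgraph)` attaches features, `apply(fgraph)` does the rewriting, and `optimize(fgraph)` is the entry point that calls `apply` by default. Abstracted away from Aesara, that contract can be sketched as follows (every class here is an illustrative stand-in, not the library's actual implementation, and a plain list stands in for a `FunctionGraph`):

```python
from abc import ABC, abstractmethod


class GraphRewriter(ABC):
    """Illustrative stand-in for the interface described in the docs."""

    def add_requirements(self, fgraph):
        # Hook for attaching features to the graph; no-op by default.
        pass

    @abstractmethod
    def apply(self, fgraph):
        """Do the actual rewriting of `fgraph`."""

    def optimize(self, fgraph):
        # Entry point: install requirements first, then rewrite.
        self.add_requirements(fgraph)
        return self.apply(fgraph)


class CountNodes(GraphRewriter):
    """Trivial rewriter that just reports the node count."""

    def apply(self, fgraph):
        return len(fgraph)
```

Splitting `optimize` from `apply` lets subclasses override only the rewriting step while the framework controls setup, which is the design the renamed `GraphRewriter` keeps from `GlobalOptimizer`.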
tests/graph/test_optdb.py

@@ -10,7 +10,7 @@ from aesara.graph.optdb import (
 )


-class TestOpt(opt.GlobalOptimizer):
+class TestOpt(opt.GraphRewriter):
     name = "blah"

     def apply(self, fgraph):