testgroup / pytensor / Commits

Commit 1ddf666e, authored Jun 06, 2021 by Brandon T. Willard; committed by Brandon T. Willard on Jun 07, 2021.

Add missing documentation formatting and docstrings

Parent: f22d3165
Showing 30 changed files with 96 additions and 109 deletions (+96 / -109).
aesara/gpuarray/dnn.py                 +1  -5
aesara/graph/basic.py                  +0  -0
aesara/graph/fg.py                     +39 -49
aesara/graph/op.py                     +0  -0
aesara/graph/opt.py                    +0  -0
aesara/graph/type.py                   +6  -2
aesara/link/c/interface.py             +0  -0
aesara/sparse/opt.py                   +2  -2
aesara/tensor/basic_opt.py             +4  -4
aesara/tensor/extra_ops.py             +0  -0
aesara/tensor/subtensor.py             +1  -1
doc/conf.py                            +1  -1
doc/extending/extending_aesara.rst     +0  -0
doc/extending/graphstructures.rst      +1  -1
doc/extending/op.rst                   +0  -0
doc/extending/optimization.rst         +0  -0
doc/extending/pipeline.rst             +24 -25
doc/extending/tips.rst                 +17 -19
doc/extending/type.rst                 +0  -0
doc/extending/unittest.rst             +0  -0
doc/glossary.rst                       +0  -0
doc/index.rst                          +0  -0
doc/introduction.rst                   +0  -0
doc/library/config.rst                 +0  -0
doc/library/tensor/basic.rst           +0  -0
doc/library/tensor/nnet/basic.rst      +0  -0
doc/library/tensor/random/basic.rst    +0  -0
doc/sandbox/elemwise_compiler.rst      +0  -0
doc/sandbox/sandbox.rst                +0  -0
doc/tutorial/examples.rst              +0  -0
aesara/gpuarray/dnn.py
@@ -470,11 +470,7 @@ def get_precision(precision, inputs, for_grad=False):
 class DnnBase(_NoPythonExternalCOp):
-    """
-    Creates a handle for cudnn and pulls in the cudnn libraries and headers.
-
-    """
+    """An `Op` that creates a handle for cudnn and pulls in the cudnn libraries and headers."""

     # dnn does not know about broadcasting, so we do not need to assert
     # the input broadcasting pattern.
...
...
aesara/graph/basic.py: diff collapsed (not shown).
aesara/graph/fg.py
@@ -241,9 +241,10 @@ class FunctionGraph(MetaObject):
         Parameters
         ----------
-        var : Variable.
+        var : Variable
             The `Variable` to be updated.
         new_client : (Apply, int)
-            A `(node, i)` pair such that `node.inputs[i]` is `var`.
+            A ``(node, i)`` pair such that ``node.inputs[i]`` is `var`.

         """
         self.clients[var].append(new_client)
@@ -251,7 +252,7 @@ class FunctionGraph(MetaObject):
     def remove_client(
         self, var: Variable, client_to_remove: Tuple[Apply, int], reason: str = None
     ) -> None:
-        """Recursively removes clients of a variable.
+        """Recursively remove clients of a variable.

         This is the main method to remove variables or `Apply` nodes from
         a `FunctionGraph`.
@@ -265,7 +266,7 @@ class FunctionGraph(MetaObject):
         var : Variable
             The clients of `var` that will be removed.
         client_to_remove : pair of (Apply, int)
-            A `(node, i)` pair such that `node.inputs[i]` will no longer be
+            A ``(node, i)`` pair such that ``node.inputs[i]`` will no longer be
             `var` in this `FunctionGraph`.

         """
@@ -359,11 +360,11 @@ class FunctionGraph(MetaObject):
         reason: str = None,
         import_missing: bool = False,
     ) -> None:
-        """Recursively import everything between an `Apply` node and the `FunctionGraph`'s outputs.
+        """Recursively import everything between an ``Apply`` node and the ``FunctionGraph``'s outputs.

         Parameters
         ----------
-        apply_node : aesara.graph.basic.Apply
+        apply_node : Apply
             The node to be imported.
         check : bool
             Check that the inputs for the imported nodes are also present in
@@ -419,7 +420,7 @@ class FunctionGraph(MetaObject):
     def change_input(
         self,
-        node: Apply,
+        node: Union[Apply, str],
         i: int,
         new_var: Variable,
         reason: str = None,
@@ -435,15 +436,15 @@ class FunctionGraph(MetaObject):
         Parameters
         ----------
-        node : aesara.graph.basic.Apply or str
+        node
             The node for which an input is to be changed.  If the value is
             the string ``"output"`` then the ``self.outputs`` will be used
             instead of ``node.inputs``.
-        i : int
+        i
             The index in `node.inputs` that we want to change.
-        new_var : aesara.graph.basic.Variable
+        new_var
             The new variable to take the place of ``node.inputs[i]``.
-        import_missing : bool
+        import_missing
             Add missing inputs instead of raising an exception.

         """
         # TODO: ERROR HANDLING FOR LISTENERS (should it complete the change or revert it?)
@@ -494,15 +495,15 @@ class FunctionGraph(MetaObject):
         Parameters
         ----------
-        var : aesara.graph.basic.Variable
+        var
             The variable to be replaced.
-        new_var : aesara.graph.basic.Variable
+        new_var
             The variable to replace `var`.
-        reason : str
+        reason
             The name of the optimization or operation in progress.
-        verbose : bool
+        verbose
             Print `reason`, `var`, and `new_var`.
-        import_missing : bool
+        import_missing
             Import missing variables.

         """
@@ -548,12 +549,12 @@ class FunctionGraph(MetaObject):
     def replace_all(self, pairs: List[Tuple[Variable, Variable]], **kwargs) -> None:
-        """Replace variables in the ``FunctionGraph`` according to ``(var, new_var)`` pairs in a list."""
+        """Replace variables in the `FunctionGraph` according to ``(var, new_var)`` pairs in a list."""
         for var, new_var in pairs:
             self.replace(var, new_var, **kwargs)

     def attach_feature(self, feature: Feature) -> None:
-        """Add a ``graph.features.Feature`` to this function graph and trigger its on_attach callback."""
+        """Add a ``graph.features.Feature`` to this function graph and trigger its ``on_attach`` callback."""
         # Filter out literally identical `Feature`s
         if feature in self._features:
             return  # the feature is already present
@@ -579,10 +580,9 @@ class FunctionGraph(MetaObject):
         self._features.append(feature)

     def remove_feature(self, feature: Feature) -> None:
-        """
-        Removes the feature from the graph.
+        """Remove a feature from the graph.

-        Calls feature.on_detach(function_graph) if an on_detach method
+        Calls ``feature.on_detach(function_graph)`` if an ``on_detach`` method
         is defined.

         """
@@ -596,9 +596,9 @@ class FunctionGraph(MetaObject):
             detach(self)

     def execute_callbacks(self, name: str, *args, **kwargs) -> None:
-        """Execute callbacks
+        """Execute callbacks.

-        Calls `getattr(feature, name)(*args)` for each feature which has
+        Calls ``getattr(feature, name)(*args)`` for each feature which has
         a method called after name.

         """
@@ -619,8 +619,7 @@ class FunctionGraph(MetaObject):
     def collect_callbacks(self, name: str, *args) -> Dict[Feature, Any]:
         """Collects callbacks

-        Returns a dictionary d such that
-        `d[feature] == getattr(feature, name)(*args)`
+        Returns a dictionary d such that ``d[feature] == getattr(feature, name)(*args)``
         For each feature which has a method called after name.

         """
         d = {}
@@ -633,17 +632,17 @@ class FunctionGraph(MetaObject):
         return d

     def toposort(self) -> List[Apply]:
-        """Toposort
+        """Return a toposorted list of the nodes.

-        Return an ordering of the graph's Apply nodes such that
-        * All the nodes of the inputs of a node are before that node.
-        * Satisfies the orderings provided by each feature that has an
-          'orderings' method.
+        Return an ordering of the graph's ``Apply`` nodes such that:
+        * all the nodes of the inputs of a node are before that node and
+        * they satisfy the orderings provided by each feature that has an
+          ``orderings`` method.

-        If a feature has an 'orderings' method, it will be called with
-        this FunctionGraph as sole argument. It should return a dictionary of
-        `{node: predecessors}` where predecessors is a list of nodes that
+        If a feature has an ``orderings`` method, it will be called with
+        this `FunctionGraph` as sole argument. It should return a dictionary of
+        ``{node: predecessors}`` where predecessors is a list of nodes that
         should be computed before the key node.

         """
         if len(self.apply_nodes) < 2:
@@ -661,15 +660,15 @@ class FunctionGraph(MetaObject):
         return order

     def orderings(self) -> Dict[Apply, List[Apply]]:
-        """Return `dict` `d` s.t. `d[node]` is a list of nodes that must be evaluated before `node` itself can be evaluated.
+        """Return ``dict`` ``d`` s.t. ``d[node]`` is a list of nodes that must be evaluated before ``node`` itself can be evaluated.

-        This is used primarily by the destroy_handler feature to ensure that
+        This is used primarily by the ``destroy_handler`` feature to ensure that
         the clients of any destroyed inputs have already computed their
         outputs.

         Notes
         -----
-        This only calls the `orderings()` function on all features. It does not
+        This only calls the ``orderings()`` function on all features. It does not
         take care of computing the dependencies by itself.

         """
@@ -707,10 +706,7 @@ class FunctionGraph(MetaObject):
         return ords

     def check_integrity(self) -> None:
-        """
-        Call this for a diagnosis if things go awry.
-
-        """
+        """Check the integrity of nodes in the graph."""
         nodes = set(applys_between(self.inputs, self.outputs))
         if self.apply_nodes != nodes:
             missing = nodes.difference(self.apply_nodes)
@@ -763,10 +759,7 @@ class FunctionGraph(MetaObject):
         return f"FunctionGraph({', '.join(graph_as_string(self.inputs, self.outputs))})"

     def clone(self, check_integrity=True) -> "FunctionGraph":
-        """
-        Clone the graph and get a memo( a dict )that map old node to new node
-
-        """
+        """Clone the graph."""
         return self.clone_get_equiv(check_integrity)[0]

     def clone_get_equiv(
@@ -806,11 +799,8 @@ class FunctionGraph(MetaObject):
         return e, equiv

     def __getstate__(self):
-        """
-        This is needed as some features introduce instance methods.
-        This is not picklable.
-
-        """
+        # This is needed as some features introduce instance methods
+        # This is not picklable
         d = self.__dict__.copy()
         for feature in self._features:
             for attr in getattr(feature, "pickle_rm_attr", []):
...
...
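The `toposort`/`orderings` contract described in the docstrings above can be sketched generically. The following is an illustrative, stand-alone Python sketch of a Kahn-style sort that honors both data-flow edges and extra predecessor constraints from an `orderings`-style ``{node: [predecessors]}`` dict; the function name and data layout are assumptions for illustration, not Aesara's implementation.

```python
from collections import defaultdict, deque

def toposort_with_orderings(nodes, inputs_of, extra_orderings):
    # Collect every predecessor of each node: its graph inputs plus any
    # extra constraints supplied by an orderings-style dict.
    preds = defaultdict(set)
    for n in nodes:
        preds[n].update(p for p in inputs_of.get(n, []) if p in nodes)
        preds[n].update(p for p in extra_orderings.get(n, []) if p in nodes)
    # Kahn's algorithm: start with nodes that have no unsatisfied predecessors.
    ready = deque(n for n in nodes if not preds[n])
    order = []
    while ready:
        n = ready.popleft()
        order.append(n)
        for m in nodes:
            if n in preds[m]:
                preds[m].discard(n)
                if not preds[m]:
                    ready.append(m)
    if len(order) != len(nodes):
        raise ValueError("graph contains a cycle")
    return order

# "c" consumes "a" and "b"; an extra ordering forces "b" after "a".
print(toposort_with_orderings(["a", "b", "c"], {"c": ["a", "b"]}, {"b": ["a"]}))
# ['a', 'b', 'c']
```

As in the docstring, the extra orderings are merged with the data-flow dependencies before sorting; the real method only delegates to each feature's `orderings` and does not compute dependencies itself.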
aesara/graph/op.py: diff collapsed (not shown).
aesara/graph/opt.py: diff collapsed (not shown).
aesara/graph/type.py
@@ -33,11 +33,15 @@ class Type(MetaObject):
     """

-    # the type that will be created by a call to make_variable.
     Variable = Variable
+    """
+    The `Type` that will be created by a call to `Type.make_variable`.
+    """

-    # the type that will be created by a call to make_constant
     Constant = Constant
+    """
+    The `Type` that will be created by a call to `Type.make_constant`.
+    """

     @abstractmethod
     def filter(
...
...
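The `Variable`/`Constant` class attributes documented above act as pluggable factory products: `make_variable`/`make_constant` instantiate whatever class the attribute names, so a `Type` subclass can swap in its own variants. A minimal sketch of the pattern, with hypothetical stand-in classes rather than Aesara's actual `Type`/`Variable`:

```python
class Variable:
    def __init__(self, type):
        self.type = type

class Constant(Variable):
    def __init__(self, type, data):
        super().__init__(type)
        self.data = data

class Type:
    # These class attributes name the classes that the factory methods
    # instantiate, so a subclass can override what gets produced.
    Variable = Variable
    Constant = Constant

    def make_variable(self):
        return self.Variable(self)

    def make_constant(self, data):
        return self.Constant(self, data)

class MyVariable(Variable):
    pass

class MyType(Type):
    Variable = MyVariable  # subclass overrides the product of make_variable

print(type(MyType().make_variable()).__name__)  # MyVariable
```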
aesara/link/c/interface.py: diff collapsed (not shown).
aesara/sparse/opt.py
@@ -1845,10 +1845,10 @@ class SamplingDotCSR(_NoPythonCOp):
     multiplication.

     If we have the input of mixed dtype, we insert cast elemwise
-    in the graph to be able to call blas function as they don't
+    in the graph to be able to call BLAS function as they don't
     allow mixed dtype.

-    This op is used as an optimization for SamplingDot.
+    This `Op` is used as an optimization for `SamplingDot`.

     """
...
...
aesara/tensor/basic_opt.py
@@ -216,8 +216,8 @@ def broadcast_like(value, template, fgraph, dtype=None):
 class InplaceElemwiseOptimizer(GlobalOptimizer):
-    """
-    We parametrise it to make it work for Elemwise and GpuElemwise op.
+    r"""
+    This is parameterized so that it works for `Elemwise` and `GpuElemwise` `Op`\s.
     """

     def __init__(self, OP):
@@ -1469,7 +1469,7 @@ class ShapeFeature(features.Feature):
 class ShapeOptimizer(GlobalOptimizer):
-    """Optimizer that serves to add ShapeFeature as an fgraph feature."""
+    """Optimizer that adds `ShapeFeature` as a feature."""

     def add_requirements(self, fgraph):
         fgraph.attach_feature(ShapeFeature())
@@ -1479,7 +1479,7 @@ class ShapeOptimizer(GlobalOptimizer):
 class UnShapeOptimizer(GlobalOptimizer):
-    """Optimizer remove ShapeFeature as an fgraph feature."""
+    """Optimizer that removes `ShapeFeature` as a feature."""

     def apply(self, fgraph):
         for feature in fgraph._features:
...
...
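The `ShapeOptimizer`/`UnShapeOptimizer` pair above illustrates a general attach/remove-feature pattern on a function graph. A minimal stand-alone sketch of that pattern (hypothetical stand-in classes, not Aesara's `GlobalOptimizer` machinery):

```python
class FGraph:
    """Minimal stand-in for a graph that holds a list of features."""
    def __init__(self):
        self._features = []

    def attach_feature(self, feature):
        if feature in self._features:  # skip features already attached
            return
        self._features.append(feature)

    def remove_feature(self, feature):
        self._features.remove(feature)

class ShapeFeature:
    pass

class ShapeOptimizer:
    def add_requirements(self, fgraph):
        # Attach the feature this optimizer needs before running.
        fgraph.attach_feature(ShapeFeature())

class UnShapeOptimizer:
    def apply(self, fgraph):
        # Detach every ShapeFeature instance from the graph; iterate
        # over a copy because removal mutates the list.
        for feature in list(fgraph._features):
            if isinstance(feature, ShapeFeature):
                fgraph.remove_feature(feature)

fg = FGraph()
ShapeOptimizer().add_requirements(fg)
UnShapeOptimizer().apply(fg)
print(len(fg._features))  # 0
```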
aesara/tensor/extra_ops.py: diff collapsed (not shown).
aesara/tensor/subtensor.py
@@ -112,7 +112,7 @@ def indices_from_subtensor(
 def as_index_constant(a):
-    r"""Convert Python literals to Aesara constants--when possible--in Subtensor arguments.
+    r"""Convert Python literals to Aesara constants--when possible--in `Subtensor` arguments.

     This will leave `Variable`\s untouched.
     """
...
...
doc/conf.py
@@ -102,7 +102,7 @@ exclude_dirs = ["images", "scripts", "sandbox"]
 # The reST default role (used for this markup: `text`) to use for all
 # documents.
-# default_role = None
+default_role = "py:obj"

 # If true, '()' will be appended to :func: etc. cross-reference text.
 # add_function_parentheses = True
...
...
doc/extending/extending_aesara.rst: diff collapsed (not shown).
doc/extending/graphstructures.rst
@@ -91,7 +91,7 @@ output. You can now print the name of the op that is applied to get
 >>> y.owner.op.name
 'Elemwise{mul,no_inplace}'

-Hence, an elementwise multiplication is used to compute *y*. This
+Hence, an element-wise multiplication is used to compute *y*. This
 multiplication is done between the inputs:

 >>> len(y.owner.inputs)
...
...
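The `y.owner.op` / `y.owner.inputs` structure the doctest above inspects can be sketched in plain Python. These are hypothetical minimal classes illustrating the ownership links, not Aesara's actual graph classes:

```python
class Op:
    def __init__(self, name):
        self.name = name

class Apply:
    def __init__(self, op, inputs, output):
        self.op = op
        self.inputs = inputs
        output.owner = self  # the output variable records its producer

class Var:
    def __init__(self):
        self.owner = None  # None means the variable is a graph input

# y is produced by applying an element-wise multiplication to x and w.
x, w, y = Var(), Var(), Var()
Apply(Op("Elemwise{mul,no_inplace}"), [x, w], y)
print(y.owner.op.name)      # Elemwise{mul,no_inplace}
print(len(y.owner.inputs))  # 2
```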
doc/extending/op.rst: diff collapsed (not shown).
doc/extending/optimization.rst: diff collapsed (not shown).
doc/extending/pipeline.rst
@@ -24,10 +24,10 @@ in the :ref:`graphstructures` article.
 Compilation of the computation graph
 ------------------------------------

-Once the user has built a computation graph, she can use
-``aesara.function`` in order to make one or more functions that
+Once the user has built a computation graph, they can use
+:func:`aesara.function` in order to make one or more functions that
 operate on real data. function takes a list of input :ref:`Variables
-<variable>` as well as a list of output Variables that define a
+<variable>` as well as a list of output :class:`Variable`\s that define a
 precise subgraph corresponding to the function(s) we want to define,
 compile that subgraph and produce a callable.
@@ -35,32 +35,32 @@ Here is an overview of the various steps that are done with the
 computation graph in the compilation phase:

-Step 1 - Create a FunctionGraph
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+Step 1 - Create a :class:`FunctionGraph`
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

 The subgraph given by the end user is wrapped in a structure called
-*FunctionGraph*. That structure defines several hooks on adding and
+:class:`FunctionGraph`. That structure defines several hooks on adding and
 removing (pruning) nodes as well as on modifying links between nodes
 (for example, modifying an input of an :ref:`apply` node) (see the
 article about :ref:`libdoc_graph_fgraph` for more information).

-FunctionGraph provides a method to change the input of an Apply node from one
-Variable to another and a more high-level method to replace a Variable
+:class:`FunctionGraph` provides a method to change the input of an :class:`Apply`
+node from one :class:`Variable` to another and a more high-level method to replace a :class:`Variable`
 with another. This is the structure that :ref:`Optimizers
 <optimization>` work on.

 Some relevant :ref:`Features <libdoc_graph_fgraphfeature>` are typically added to the
-FunctionGraph, namely to prevent any optimization from operating inplace on
+:class:`FunctionGraph`, namely to prevent any optimization from operating inplace on
 inputs declared as immutable.

-Step 2 - Execute main Optimizer
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+Step 2 - Execute main :class:`Optimizer`
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

-Once the FunctionGraph is made, an :term:`optimizer` is produced by
-the :term:`mode` passed to ``function`` (the Mode basically has two
-important fields, ``linker`` and ``optimizer``). That optimizer is
-applied on the FunctionGraph using its optimize() method.
+Once the :class:`FunctionGraph` is made, an :term:`optimizer` is produced by
+the :term:`mode` passed to :func:`function` (the :class:`Mode` basically has two
+important fields, :attr:`linker` and :attr:`optimizer`). That optimizer is
+applied on the :class:`FunctionGraph` using its :meth:`Optimizer.optimize` method.

 The optimizer is typically obtained through :attr:`optdb`.
@@ -69,11 +69,10 @@ Step 3 - Execute linker to obtain a thunk
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

 Once the computation graph is optimized, the :term:`linker` is
-extracted from the Mode. It is then called with the FunctionGraph as
-argument to
-produce a ``thunk``, which is a function with no arguments that
+extracted from the :class:`Mode`. It is then called with the :class:`FunctionGraph` as
+argument to produce a ``thunk``, which is a function with no arguments that
 returns nothing. Along with the thunk, one list of input containers (a
-`aesara.link.basic.Container` is a sort of object that wraps another and does
+:class:`aesara.link.basic.Container` is a sort of object that wraps another and does
 type casting) and one list of output containers are produced,
 corresponding to the input and output :class:`Variable`\s as well as the updates
 defined for the inputs when applicable. To perform the computations,
@@ -83,18 +82,18 @@ where the thunk put them.
 Typically, the linker calls the ``toposort`` method in order to obtain
 a linear sequence of operations to perform. How they are linked
-together depends on the Linker used. The `CLinker` produces a single
-block of C code for the whole computation, whereas the `OpWiseCLinker`
+together depends on the Linker used. The :class:`CLinker` produces a single
+block of C code for the whole computation, whereas the :class:`OpWiseCLinker`
 produces one thunk for each individual operation and calls them in
 sequence.

 The linker is where some options take effect: the ``strict`` flag of
 an input makes the associated input container do type checking. The
-``borrow`` flag of an output, if False, adds the output to a
+``borrow`` flag of an output, if ``False``, adds the output to a
 ``no_recycling`` list, meaning that when the thunk is called the
 output containers will be cleared (if they stay there, as would be the
-case if ``borrow`` was True, the thunk would be allowed to reuse (or
-"recycle") the storage).
+case if ``borrow`` was True, the thunk would be allowed to reuse--or
+"recycle"--the storage).

 .. note::
@@ -119,6 +118,6 @@ Step 4 - Wrap the thunk in a pretty package
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

 The thunk returned by the linker along with input and output
-containers is unwieldy. ``function`` hides that complexity away so
+containers is unwieldy. :func:`aesara.function` hides that complexity away so
 that it can be used like a normal function with arguments and return
 values.
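The thunk-and-containers arrangement described in Step 3 can be sketched generically in plain Python. The `Container`/`make_thunk` names below are assumptions for illustration (loosely modeled on the `aesara.link.basic.Container` idea), not Aesara's linker API:

```python
class Container:
    """Wraps a value and applies a type cast on write."""
    def __init__(self, cast=lambda v: v):
        self.cast = cast
        self.value = None

    def store(self, value):
        self.value = self.cast(value)

def make_thunk(fn, input_containers, output_container):
    # The thunk takes no arguments and returns nothing: it reads its
    # inputs from containers and writes its result into another container.
    def thunk():
        output_container.store(fn(*(c.value for c in input_containers)))
    return thunk

# A "compiled function" stores arguments into the input containers,
# calls the thunk, then reads the result out of the output container.
x, y, out = Container(float), Container(float), Container()
add_thunk = make_thunk(lambda a, b: a + b, [x, y], out)
x.store(2)
y.store(3)
add_thunk()
print(out.value)  # 5.0
```

Note how the cast on the input containers plays the type-checking/casting role the text ascribes to containers, and how clearing `out.value` between calls would mimic the ``no_recycling`` behavior.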
doc/extending/tips.rst
====
Tips
====
@@ -8,15 +6,15 @@ Tips
 Reusing outputs
 ===============

-WRITEME
+.. todo:: Write this.

-Don't define new Ops unless you have to
-=======================================
+Don't define new :class:`Op`\s unless you have to
+=================================================

-It is usually not useful to define Ops that can be easily
-implemented using other already existing Ops. For example, instead of
-writing a "sum_square_difference" Op, you should probably just write a
+It is usually not useful to define :class:`Op`\s that can be easily
+implemented using other already existing :class:`Op`\s. For example, instead of
+writing a "sum_square_difference" :class:`Op`, you should probably just write a
 simple function:

 .. testcode::
@@ -33,23 +31,23 @@ a custom implementation would probably only bother to support
 contiguous vectors/matrices of doubles...

-Use Aesara's high order Ops when applicable
-===========================================
+Use Aesara's high order :class:`Op`\s when applicable
+=====================================================

-Aesara provides some generic Op classes which allow you to generate a
-lot of Ops at a lesser effort. For instance, Elemwise can be used to
-make :term:`elemwise` operations easily whereas DimShuffle can be
-used to make transpose-like transformations. These higher order Ops
-are mostly Tensor-related, as this is Aesara's specialty.
+Aesara provides some generic :class:`Op` classes which allow you to generate a
+lot of :class:`Op`\s at a lesser effort. For instance, :class:`Elemwise` can be used to
+make :term:`elemwise` operations easily, whereas :class:`DimShuffle` can be
+used to make transpose-like transformations. These higher order :class:`Op`\s
+are mostly tensor-related, as this is Aesara's specialty.

 .. _opchecklist:

-Op Checklist
-============
+:class:`Op` Checklist
+=====================

 Use this list to make sure you haven't forgotten anything when
-defining a new Op. It might not be exhaustive but it covers a lot of
+defining a new :class:`Op`. It might not be exhaustive but it covers a lot of
 common mistakes.

-WRITEME
+.. todo:: Write a list.
Diffs collapsed (not shown) for: doc/extending/type.rst, doc/extending/unittest.rst, doc/glossary.rst, doc/index.rst, doc/introduction.rst, doc/library/config.rst, doc/library/tensor/basic.rst, doc/library/tensor/nnet/basic.rst, doc/library/tensor/random/basic.rst, doc/sandbox/elemwise_compiler.rst, doc/sandbox/sandbox.rst, doc/tutorial/examples.rst.