Commit ec0e7537 authored by Frederic

Added link in the doc.

Parent 07d96153
@@ -1187,12 +1187,14 @@ Gradient / Differentiation
 :returns: gradients of the cost with respect to each of the `wrt` terms
+.. _R_op_list:
+
-R op
-====
+List of Implemented R op
+========================
-See the tutorial for the R op documentation.
+See the :ref:`gradient tutorial <tutcomputinggrads>` for the R op documentation.
 list of ops that support R-op:
 * with test [Most is tensor/tests/test_rop.py]
......
@@ -200,6 +200,7 @@ you need to do something similar to this:
 >>> f([[1,1],[1,1]], [[2,2,],[2,2]], [0,1])
 array([ 2., 2.])
+:ref:`List <R_op_list>` of Op that implement Rop.
 L-operator
 ----------
......
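For context on what the documentation hunks above describe: the R-operator (`Rop`) evaluates a Jacobian-vector product, i.e. the directional derivative of an op's output along a given direction, without materializing the full Jacobian. The following is a minimal NumPy sketch of that idea (not Theano's implementation); the function `f` and its hand-written `f_rop` are hypothetical illustrations chosen so the Jacobian is easy to state.

```python
import numpy as np

# Sketch of what an R-operator computes: for y = f(x), Rop(y, x, v)
# evaluates J_f(x) @ v, the Jacobian-vector product, without ever
# forming J_f.  Here f(x) = x**2 elementwise, so J_f(x) = diag(2*x)
# and the R-op reduces to the elementwise product 2*x*v.

def f(x):
    return x ** 2

def f_rop(x, v):
    # Directional derivative of f at x along v (elementwise case).
    return 2.0 * x * v

x = np.array([1.0, 2.0, 3.0])
v = np.array([1.0, 0.0, 1.0])
print(f_rop(x, v))  # -> [2. 0. 6.]
```

The companion L-operator mentioned in the second hunk is the transpose counterpart: it computes a vector-Jacobian product `v @ J_f(x)`, which for this diagonal Jacobian happens to coincide with the same elementwise formula.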