Commit 47980073 authored by abalkin, committed by Frederic

More explanation of what Eig.grad() should return.

Parent 8a51fa05
@@ -922,13 +922,21 @@ class Eig(Op):
     def grad(self, inputs, g_outputs):
         r"""The gradient function should return
-        .. math:: W\frac{\partial\,\mbox{eig}(X)[0]}
-                        {\partial X} +
-                  V\frac{\partial\,\mbox{eig}(X)[1]}
-                        {\partial X},
-        where [:math:`W`, :math:`V`] corresponds to ``g_outputs`` and
-        :math:`X` to ``inputs``.
+        .. math:: \sum_n\left(W_n\frac{\partial\,\lambda_n}
+                                      {\partial X} +
+                  \sum_k V_{nk}\frac{\partial\,\Psi_{nk}}
+                               {\partial X}\right),
+        where [:math:`W`, :math:`V`] corresponds to ``g_outputs``,
+        :math:`X` to ``inputs``, and :math:`(\lambda, \Psi)=\mbox{eig}(X)`.
+        .. math:: \frac{\partial\,\lambda_n}
+                       {\partial X_{ij}} = \Psi_{ni}\,\Psi_{nj}
+        .. math:: \frac{\partial\,\Psi_{ni}}
+                       {\partial X_{jk}} =
+            \left((X-\lambda_n)^{-1}(\Psi_n\otimes\Psi_n-1)\right)_{ij}\Psi_{nk}
         """
         return [grad_not_implemented(self, 0, x, "Work in progress.")]
...
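The eigenvalue identity added in this docstring, :math:`\partial\lambda_n/\partial X_{ij} = \Psi_{ni}\,\Psi_{nj}`, can be checked numerically. The sketch below is illustrative only and not part of the commit; it assumes a symmetric input :math:`X` (so `np.linalg.eigh` applies and the eigenvectors are orthonormal) with distinct eigenvalues, and compares the formula against a symmetric finite-difference perturbation of each entry.

```python
import numpy as np

# Check d(lambda_n)/d(X_ij) = Psi_ni * Psi_nj for a symmetric X with
# distinct eigenvalues (Psi_n is the n-th eigenvector).
rng = np.random.RandomState(0)
A = rng.randn(4, 4)
X = (A + A.T) / 2.0                  # symmetric test matrix

w, v = np.linalg.eigh(X)             # eigenvalues w, eigenvectors in columns of v
n = 2                                # eigenvalue to differentiate
psi = v[:, n]

analytic = np.outer(psi, psi)        # Psi_ni * Psi_nj

# Central finite differences along symmetric perturbations
# E = (e_i e_j^T + e_j e_i^T) / 2, so X +/- eps*E stays symmetric.
eps = 1e-6
numeric = np.zeros_like(X)
for i in range(4):
    for j in range(4):
        E = np.zeros_like(X)
        E[i, j] += 0.5
        E[j, i] += 0.5
        lam_plus = np.linalg.eigh(X + eps * E)[0][n]
        lam_minus = np.linalg.eigh(X - eps * E)[0][n]
        numeric[i, j] = (lam_plus - lam_minus) / (2 * eps)

assert np.allclose(analytic, numeric, atol=1e-5)
```

For a non-symmetric `X` (the general case `Eig` handles via `np.linalg.eig`) the left eigenvectors enter the formula as well, so this check covers only the symmetric special case stated in the docstring's notation.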