Commit 8ba7b59a authored by Jeremiah Lowin

docstring updates

Parent e99bee59
@@ -7114,25 +7114,28 @@ def dot(a, b):
""" """
Computes the dot product of two variables. For two matrices, this is Computes the dot product of two variables. For two matrices, this is
equivalent to matrix multiplication. For two vectors, this is the inner equivalent to matrix multiplication. For two vectors, this is the inner
product. When one variable is a scalar, it is like elementwise product. When one variable is a scalar, this is like elementwise
multiplication. For N dimensions, it is a sum product over the last axis multiplication. For N dimensions, this is a sum product over the last axis
of the first array and the second-to-last axis of the second array: of the first array and the second-to-last axis of the second array:
dot(a, b)[i,j,k,m] = sum(a[i,j,:] * b[k,:,m]) dot(a, b)[i,j,k,m] = sum(a[i,j,:] * b[k,:,m])
Note that this dot function will do one of three things, in this sequence: Note that this dot function does one of three things, in the following
sequence:
1. If either a or b is scalar, it returns the elementwise product 1. If either a or b is scalar, it returns the elementwise product
without calling the Dot op. without calling the Theano Dot op.
2. If either a or b has more than 2 dimensions, it calls the tensordot 2. If either a or b has more than 2 dimensions, it calls Theano's
function instead of the Dot op. Tensordot expresses high-dimensional tensordot function with appropriate axes. The tensordot function
dot products as matrix multiplication and is faster than using a expresses high-dimensional dot products in terms of 2D matrix
high-dimensional Dot op. multiplications, so it may be possible to futherize optimize for
performance.
3. Otherwise, calls the Dot op on a and b. 3. If both a and b have either 1 or 2 dimensions, it calls Theano's
Dot op on a and b.
:note: matrix-matrix products are sometimes optimized to Dot22 ops :note: matrix-matrix products are sometimes optimized to Dot22 ops.
(see tensor.blas)
:note: non matrix-matrix products (including matrix-vector :note: non matrix-matrix products (including matrix-vector
products) are handled by numpy. Ensure that you have linked numpy products) are handled by numpy. Ensure that you have linked numpy
......
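The three-way dispatch described in the docstring can be illustrated with NumPy stand-ins. This is a minimal sketch, not Theano's actual implementation: `dot_sketch` is a hypothetical helper name, and NumPy's `*`, `tensordot`, and `dot` play the roles of the elementwise product, Theano's `tensordot`, and the Dot op respectively.

```python
import numpy as np


def dot_sketch(a, b):
    """Hypothetical sketch of the three-case dispatch described above."""
    a, b = np.asarray(a), np.asarray(b)

    # Case 1: a scalar operand -> plain elementwise product, no Dot op.
    if a.ndim == 0 or b.ndim == 0:
        return a * b

    # Case 2: more than 2 dimensions -> sum product over the last axis
    # of `a` and the second-to-last axis of `b`, expressed as a tensordot.
    # (For a 1-D `b`, its only axis 0 is also its last axis.)
    if a.ndim > 2 or b.ndim > 2:
        return np.tensordot(a, b, axes=[[a.ndim - 1], [max(b.ndim - 2, 0)]])

    # Case 3: both operands are 1-D or 2-D -> ordinary matrix/inner product.
    return np.dot(a, b)
```

For 2-D inputs this reduces to matrix multiplication, and for an N-D/2-D pair the tensordot branch matches `np.dot`'s contraction rule stated in the docstring.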