- 18 Feb 2010, 6 commits
-
-
Committed by James Bergstra
specialization. It inserts a new op called PseudoGemm. This Op has the same signature as Gemm but does not work inplace. Another optimization comes later in the pipeline and swaps PseudoGemm for Gemm.
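The two-pass rewrite described in this commit can be sketched as follows. This is an illustrative mock-up, not Theano's actual optimizer API: the `Node` class, the `"Dot22Add"` pattern name, and both pass functions are invented for the example; only the op names `PseudoGemm` and `Gemm` come from the commit message.

```python
# Hypothetical sketch of a two-pass graph rewrite: an early pass inserts
# a non-inplace placeholder op, and a later pass swaps it for the
# destructive (inplace) version once it is safe to do so.

class Node:
    def __init__(self, op, inputs):
        self.op = op          # op name, e.g. "Gemm" or "PseudoGemm"
        self.inputs = inputs  # child Nodes or leaf values

def specialize(node):
    """Early pass: introduce the non-inplace placeholder op."""
    node.inputs = [specialize(i) if isinstance(i, Node) else i
                   for i in node.inputs]
    if node.op == "Dot22Add":      # hypothetical pattern being matched
        node.op = "PseudoGemm"     # same signature as Gemm, not inplace
    return node

def make_inplace(node):
    """Late pass: swap the placeholder for the inplace Gemm."""
    node.inputs = [make_inplace(i) if isinstance(i, Node) else i
                   for i in node.inputs]
    if node.op == "PseudoGemm":
        node.op = "Gemm"           # safe to destroy an input buffer now
    return node

g = Node("Dot22Add", [1, 2, 3])
g = specialize(g)
assert g.op == "PseudoGemm"
g = make_inplace(g)
assert g.op == "Gemm"
```

Splitting the rewrite this way keeps the specialization pass free of aliasing concerns; only the final pass has to reason about which buffers may be overwritten.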
-
Committed by James Bergstra
-
Committed by James Bergstra
An Env feature persists throughout the life of the env, so many optimizations can take advantage of the Shape analysis done by ShapeFeature.
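The idea of a feature that lives as long as its env can be sketched like this. The classes below are invented for illustration (they are not Theano's real `Env`/`ShapeFeature`); they only show the pattern the commit describes: attach the analysis once, then let every later optimization query the same cached shape information.

```python
# Illustrative sketch: an "env feature" is attached once and persists
# for the env's whole lifetime, so shape analysis is computed one time
# and shared by all subsequent optimizations.

class ShapeFeature:
    def __init__(self):
        self.shape_of = {}            # variable name -> inferred shape

    def on_attach(self, env):
        # seed the analysis from everything already in the env
        for var, shape in env.variables.items():
            self.shape_of[var] = shape

    def on_import(self, var, shape):
        # called as new variables appear during optimization
        self.shape_of[var] = shape

class Env:
    def __init__(self, variables):
        self.variables = dict(variables)
        self.features = []

    def attach_feature(self, feature):
        self.features.append(feature)
        feature.on_attach(self)

env = Env({"x": (3, 4), "y": (4, 5)})
sf = ShapeFeature()
env.attach_feature(sf)
# Any later optimization can reuse the same analysis:
assert sf.shape_of["x"] == (3, 4)
```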
-
Committed by James Bergstra
-
Committed by James Bergstra
-
Committed by James Bergstra
exact output from the RNGs; the float32-casted sampling bounds mess things up.
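The effect the commit alludes to is easy to demonstrate: casting a sampling bound to float32 can move the bound itself, so a test expecting bit-exact draws against a float64 reference fails. A minimal stdlib-only sketch (the bound value here is a made-up example, not one from Theano's tests):

```python
import struct

def as_float32(x):
    # round-trip a Python float through IEEE-754 single precision
    return struct.unpack('f', struct.pack('f', x))[0]

high = 0.99999999                  # hypothetical exclusive upper bound
# float32 has ~7 decimal digits of precision, so the cast rounds the
# bound all the way up to 1.0 -- the sampling interval itself changed.
assert as_float32(high) == 1.0
assert as_float32(high) != high
```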
-
- 17 Feb 2010, 23 commits
-
-
Committed by James Bergstra
phase.
-
Committed by James Bergstra
graphs.
-
Committed by James Bergstra
-
Committed by James Bergstra
-
Committed by James Bergstra
-
Committed by James Bergstra
-
Committed by James Bergstra
-
Committed by James Bergstra
-
Committed by James Bergstra
-
Committed by James Bergstra
-
Committed by James Bergstra
of infer_shape
-
Committed by James Bergstra
optimizer. Also, moved MakeVector to tensor.opt since it is only used internally by ShapeOpt.
-
Committed by James Bergstra
that is scheduled to run first, before the first merge.
-
Committed by James Bergstra
-
Committed by James Bergstra
-
Committed by James Bergstra
-
Committed by James Bergstra
-
Committed by James Bergstra
first when sorted.
-
Committed by James Bergstra
-
Committed by James Bergstra
-
Committed by James Bergstra
-
Committed by James Bergstra
-
Committed by James Bergstra
-
- 16 Feb 2010, 8 commits
-
-
Committed by James Bergstra
-
Committed by James Bergstra
Canonicalizer's get_num_denum function... this might break tests, but it was essential for getting rid of numerical instabilities in RBM free energy grad.
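The num/denum idea behind this change can be sketched as follows. Only the name `get_num_denum` comes from the commit; the expression encoding and the `cancel` helper are invented for illustration. The point is that flattening nested multiplications and divisions into numerator and denominator factor lists lets common factors cancel symbolically instead of being computed, which is one way such canonicalization can remove numerical instabilities.

```python
# Hedged sketch: collect numerator/denominator factors of a mul/div
# expression tree, then cancel shared factors symbolically.

def get_num_denum(expr):
    """expr is a nested tuple ('mul', a, b) / ('div', a, b), or a leaf."""
    if not isinstance(expr, tuple):
        return [expr], []
    op, a, b = expr
    na, da = get_num_denum(a)
    nb, db = get_num_denum(b)
    if op == "mul":
        return na + nb, da + db
    if op == "div":
        # dividing by b multiplies the denominator by b's numerator
        return na + db, da + nb
    raise ValueError(op)

def cancel(num, denum):
    """Remove factors that appear in both lists."""
    num, denum = list(num), list(denum)
    for f in list(num):
        if f in denum:
            num.remove(f)
            denum.remove(f)
    return num, denum

# (x / y) * (y / z): y cancels without ever being evaluated
num, den = get_num_denum(("mul", ("div", "x", "y"), ("div", "y", "z")))
assert cancel(num, den) == (["x"], ["z"])
```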
-
Committed by James Bergstra
-
Committed by James Bergstra
-
Committed by James Bergstra
-
Committed by James Bergstra
internally, to print results, apply nodes, and functions (with multiple outputs)
-
Committed by James Bergstra
indentation in big graphs... still a bit ugly though.
-
Committed by James Bergstra
-
- 13 Feb 2010, 1 commit
-
-
Committed by James Bergstra
-
- 17 Feb 2010, 2 commits
-
-
Committed by Razvan Pascanu
-
Committed by Pascal Lamblin
-