\begin{itemize}
\item EBLearn, Torch5: specialized libraries written by practitioners specifically for these tasks
\item numexpr: similar to Theano, a ``virtual machine'' for elementwise expressions, but with fewer features implemented
\end{itemize}
}
\frame{
...
...
\frame{
\frametitle{Benchmark Convolutional Network}
Convolutional Network: 256x256 image convolved with 6 7x7 filters, downsampled to 6x50x50, elementwise tanh, convolution with 16 6x7x7 filters, elementwise tanh, matrix multiply, elementwise softmax, then the same operations in reverse for the gradient
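The forward pass of this benchmark can be sketched in plain NumPy to make the shapes concrete. This is a toy stand-in, not Theano's implementation: the helper \texttt{conv\_valid}, the plain strided subsampling, and the random weights are illustrative assumptions.

```python
import numpy as np

def conv_valid(x, w):
    """'Valid' multi-channel 2-D convolution (cross-correlation, as in
    neural-net libraries). x: (in_ch, H, W), w: (out_ch, in_ch, kh, kw)."""
    out_ch, in_ch, kh, kw = w.shape
    H = x.shape[1] - kh + 1
    W = x.shape[2] - kw + 1
    out = np.zeros((out_ch, H, W))
    for i in range(kh):
        for j in range(kw):
            # accumulate every input channel at this filter offset
            out += np.tensordot(w[:, :, i, j], x[:, i:i + H, j:j + W],
                                axes=(1, 0))
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal((1, 256, 256))               # one 256x256 image
w1 = 0.01 * rng.standard_normal((6, 1, 7, 7))        # 6 filters of 7x7
w2 = 0.01 * rng.standard_normal((16, 6, 7, 7))       # 16 filters of 6x7x7
w3 = 0.01 * rng.standard_normal((16 * 44 * 44, 10))  # output weight matrix

h1 = conv_valid(x, w1)             # (6, 250, 250)
h1 = np.tanh(h1[:, ::5, ::5])      # downsample to (6, 50, 50), tanh
h2 = np.tanh(conv_valid(h1, w2))   # (16, 44, 44), tanh
logits = h2.reshape(1, -1) @ w3    # matrix multiply -> (1, 10)
p = np.exp(logits - logits.max())
p /= p.sum()                       # elementwise softmax over 10 outputs
```

The benchmark then runs the corresponding computations in reverse for the backward pass, which Theano derives automatically.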
\item There are even more if we include other languages
\end{itemize}
\item All of them implement a subset of the functionality of \texttt{numpy.ndarray} on the GPU
...
...
\item It {\bf works} and is {\bf used in the real world} by academic researchers \textit{and} industry
\end{itemize}
}
\frame{
\frametitle{Thanks}
\begin{itemize}
\item Thanks for attending this tutorial
\vfill
\item Thanks to the agencies that provided resources for this project: Calcul Qu\'ebec, CIFAR, Compute Canada, FQRNT, MITACS, NSERC, SciNet, SHARCNET, Ubisoft and WestGrid.