As someone who has previously used Theano / Lasagne, I noticed immediately when a Bengio-lab paper offered its accompanying code in TensorFlow. It’s possible that the tides are changing for Theano - and I want to make sure that my current Deep Learning Workshop repo remains relevant to what people believe they should learn…

## CNN MNIST using various Frameworks

The following is just a collection of code samples for solving MNIST with a CNN (all using roughly the same network structure) in different deep learning frameworks (without additional sugar layers) :

### Caffe

Caffe is configured via a plain-text .prototxt file, and then run on the command line with switches.

The following is from the main Caffe repo, with a companion tutorial :
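The linked code is not reproduced here, but to give a flavour of the .prototxt style, a single convolution layer is declared declaratively rather than in code. The snippet below is a sketch following the conventions of Caffe’s LeNet example (the parameter values are LeNet-style assumptions, not necessarily those of the linked example) :

```
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"          # takes the input blob "data"
  top: "conv1"            # produces the output blob "conv1"
  convolution_param {
    num_output: 20        # number of feature maps
    kernel_size: 5
    stride: 1
    weight_filler { type: "xavier" }
  }
}
```

The whole network is a chain of such blocks, and training is then driven by a separate solver .prototxt passed to the `caffe` binary on the command line.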

### Torch

When building a simple CNN for MNIST, note that Torch code is written in Lua rather than through a Python interface.

The following is from the main Torch demos repo :
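To show the Lua flavour, here is a sketch (not the demo repo’s exact code) of the convolutional front-end of such a network using Torch’s `nn` package :

```lua
require 'nn'

-- LeNet-style front-end for 1x28x28 MNIST images
local net = nn.Sequential()
net:add(nn.SpatialConvolution(1, 32, 5, 5))  -- 1 input plane -> 32 maps, 5x5 kernels : 28 -> 24
net:add(nn.ReLU())
net:add(nn.SpatialMaxPooling(2, 2, 2, 2))    -- 2x2 pooling, stride 2 : 24 -> 12
net:add(nn.View(32 * 12 * 12))               -- flatten for the fully-connected layer
net:add(nn.Linear(32 * 12 * 12, 10))
net:add(nn.LogSoftMax())
```

Apart from the `net:add(...)` method-call syntax and 1-based indexing, the structure is close to what the Python frameworks express.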

### CNTK

The following is from a detailed blog posting, which translates the TensorFlow MNIST CNN into CNTK, since the standard example that Microsoft gives in its CNTK tutorials focusses on a fully-connected network :

### PaddlePaddle

This proved more difficult to find than expected, because all the relevant Issues seem to be in Chinese…

The following is from a VGG16 CNN implementation within the main repo :

### MXNet

The following is from an MXNet blog posting - there’s a large model zoo too, but those examples have lots of common helper code factored out, whereas the blog does it straightforwardly :

### Plain Theano

When building a simple CNN for MNIST, raw Theano code typically defines some helper functions to make the process easier.

The following is from a Theano Tutorial repo by Alec Radford :

### Plain TensorFlow

When building a simple CNN for MNIST, raw TensorFlow code typically defines some helper functions to make the process easier.

The following is from ddigiorg’s repo :
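The helper-function pattern mentioned above (for both raw Theano and raw TensorFlow) is easy to illustrate without either framework installed. The sketch below is plain NumPy - not the repo’s actual code - showing what the typical helpers (weight initialisation, a ‘valid’ convolution, and 2x2 max-pooling) actually compute :

```python
import numpy as np

def init_weights(shape, stddev=0.1):
    """Helper: small-Gaussian weight initialisation (sketch: no truncation)."""
    return np.random.randn(*shape) * stddev

def conv2d(image, kernel):
    """Helper: naive 'valid' 2-D convolution (really cross-correlation,
    as in the deep learning frameworks) of an HxW image with a kxk kernel."""
    H, W = image.shape
    k, _ = kernel.shape
    out = np.zeros((H - k + 1, W - k + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + k, j:j + k] * kernel)
    return out

def max_pool_2x2(x):
    """Helper: 2x2 max-pooling with stride 2 (assumes even dimensions)."""
    H, W = x.shape
    return x.reshape(H // 2, 2, W // 2, 2).max(axis=(1, 3))

# A 28x28 MNIST-sized image through a 5x5 conv then pooling : 28 -> 24 -> 12
image = init_weights((28, 28))
kernel = init_weights((5, 5))
feature_map = max_pool_2x2(conv2d(image, kernel))
print(feature_map.shape)  # (12, 12)
```

With these factored out, the model definition itself shrinks to a few lines of layer-stacking - which is exactly the readability win the Theano and TensorFlow examples above are after.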