## CNN MNIST with sugar-coated TensorFlow

The following is just a collection of code samples for solving CNN MNIST (all using roughly the same network structure) using different layer-helpers on top of regular TensorFlow. In a review post (coming soon), I’ll figure out which one makes the most sense to me (as someone who has previously used Theano/Lasagne).

Currently the list is :

• Plain TensorFlow
• Keras
• TF-Slim
• tf.contrib.learn aka skflow
• PrettyTensor
• TFlearn
• TensorLayer
• SugarTensor

### Plain TensorFlow

When building a simple CNN for MNIST in raw TensorFlow, one typically writes a few helper functions to make the process less repetitive.

The following is from ddigiorg’s repo :

More interesting examples that use pure TensorFlow include Fast Style Transfer, and this thorough LSTM-based tutorial.

### Keras

This is a popular frontend - particularly since it can use either Theano or TensorFlow as a backend.

The flip side of this flexibility is that one loses the ability to dig into the backend’s explicit representation : so adding custom layers is (roughly speaking) no longer possible.

One can see from the `model = Sequential(); model.add( fn )` code that Keras has its own environment for formulating the computational graph, before handing it off to whichever backend is chosen.
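As a hedged sketch of that `Sequential` style (the layer arguments here are my own approximation, not the official Keras MNIST example) :

```python
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense, Dropout

# Keras builds its own graph representation layer-by-layer...
model = Sequential()
model.add(Conv2D(32, (5, 5), activation='relu', input_shape=(28, 28, 1)))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Conv2D(64, (5, 5), activation='relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Flatten())
model.add(Dense(1024, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(10, activation='softmax'))

# ...and only hands it to the backend at compile/fit time
model.compile(loss='categorical_crossentropy',
              optimizer='adam', metrics=['accuracy'])
```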

### TF-Slim

This looks like ‘just what the Doctor ordered’, until you look into the seq2seq code and find that the promising-looking library has quite a few stubbed functions, which simply execute `pass`.

Also noteworthy : the number of pretrained models available in the slim hierarchy.

### tf.contrib.learn aka skflow

Not to be confused with TFlearn.

Documentation at the main TensorFlow site.

Code from the skflow examples directory :

### PrettyTensor

The main code demonstrates the ‘Fluent’ style of function chaining, though there are some complaints about documentation.

Nice tutorial walk-through, with code in GitHub :

### TFlearn ( not to be confused with tf.contrib.learn )

Documentation all online in regular Python Sphinx format.

Complete example from main TFlearn GitHub repo :

### TensorLayer

Documentation all online in regular Python Sphinx format.

Partial example from the documentation itself :

### sugartensor

This jumped out because its author used it to reimplement Speech-to-Text-WaveNet.

Documentation is very limited (mainly read-the-code), but the code is on GitHub, and it makes the interesting choice of grafting its methods onto the TensorFlow tensor/variable structure itself, with the prefix sg_ to avoid polluting the namespace. This is somewhere between {clever, ingenious, convenient} and nasty (TBD). It also means that these sugar methods can be chained (like prettytensor, but without requiring special wrapping).

Code from main repository README :