This is part 6 of a series of tutorials in which we develop the mathematical and algorithmic underpinnings of deep neural networks from scratch and implement our own neural network library in Python, mimicking the TensorFlow API. Start with the first part: I: Computational Graphs.

TensorFlow

It is now time to say goodbye to our own toy library and switch to the real TensorFlow.

As we have already learned, TensorFlow works conceptually exactly like our implementation. So why not just stick with our own implementation? There are several reasons:

  1. TensorFlow is the product of years of effort in providing efficient implementations for all the algorithms relevant to our purposes. Fortunately, there are experts at Google whose everyday job is to optimize these implementations. We do not need to know all of these details. We only have to know what the algorithms do conceptually (which we do now) and how to call them.

  2. TensorFlow allows us to train our neural networks on the GPU (graphics processing unit), resulting in an enormous speedup through massive parallelization.

  3. Google is now building Tensor Processing Units (TPUs), integrated circuits designed specifically to run and train TensorFlow graphs, yielding yet another substantial speedup.

  4. TensorFlow comes pre-equipped with a lot of neural network architectures that would be cumbersome to build on our own.

  5. TensorFlow comes with a high-level API called Keras that allows us to build neural network architectures much more easily than by defining the computational graph by hand, as we have done up to now (a short sketch follows this list).
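
To give a flavor of the last point, here is a small sketch of what a two-layer classifier might look like in the Keras API. The layer sizes, activations and optimizer are illustrative choices and not taken from the earlier sections.

import tensorflow as tf

# A small two-layer classifier expressed with the Keras API.
# Layer sizes, activations and the optimizer are illustrative choices.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(2, activation='sigmoid', input_shape=(2,)),
    tf.keras.layers.Dense(2, activation='softmax')
])
model.compile(optimizer='sgd', loss='categorical_crossentropy')
# model.fit(X_data, c_data, epochs=100)  # train on your own points and one-hot labels

We will not use Keras in this post; the examples below stay close to the low-level graph API we built ourselves.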

So let’s get started. Installing TensorFlow is very easy.

pip install tensorflow

If we want GPU acceleration, we have to install the package tensorflow-gpu:

pip install tensorflow-gpu

In our code, we import it as follows:

import tensorflow as tf

Since the syntax we are used to from the previous sections mimics the TensorFlow syntax, we already know how to use TensorFlow. We only have to make the following changes:

  • Add tf. to the front of all our function calls and classes
  • Call session.run(tf.global_variables_initializer()) after building the graph
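
For example, a minimal graph that multiplies a variable by a constant would look as follows (a toy example for illustration, not taken from the original perceptron code):

import tensorflow as tf

# Build the graph, prefixing every class and function with tf.
a = tf.Variable(3.0)
b = tf.constant(2.0)
c = tf.multiply(a, b)

session = tf.Session()
# Variables must be explicitly initialized before running any operation that uses them
session.run(tf.global_variables_initializer())
print(session.run(c))  # prints 6.0
session.close()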

The rest is exactly the same. Let’s recreate the multi-layer perceptron from the previous section using TensorFlow:
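
Below is a sketch of such a network in the TensorFlow 1.x API. The generated point clouds, layer sizes, learning rate and number of training steps are stand-in values rather than the exact ones from the previous section.

import numpy as np
import tensorflow as tf

# Stand-in training data: two classes of 2-D points
red_points = np.random.randn(50, 2) - 2 * np.ones((50, 2))
blue_points = np.random.randn(50, 2) + 2 * np.ones((50, 2))
X_data = np.concatenate((blue_points, red_points))
c_data = [[1, 0]] * len(blue_points) + [[0, 1]] * len(red_points)

# Placeholders for the inputs and the one-hot class labels
X = tf.placeholder(dtype=tf.float64, shape=(None, 2))
c = tf.placeholder(dtype=tf.float64, shape=(None, 2))

# Hidden layer
W_hidden = tf.Variable(np.random.randn(2, 2))
b_hidden = tf.Variable(np.random.randn(2))
p_hidden = tf.sigmoid(tf.add(tf.matmul(X, W_hidden), b_hidden))

# Output layer
W_output = tf.Variable(np.random.randn(2, 2))
b_output = tf.Variable(np.random.randn(2))
p_output = tf.nn.softmax(tf.add(tf.matmul(p_hidden, W_output), b_output))

# Cross-entropy loss and a gradient descent training step
J = tf.negative(tf.reduce_sum(tf.reduce_sum(tf.multiply(c, tf.log(p_output)), axis=1)))
minimization_op = tf.train.GradientDescentOptimizer(learning_rate=0.01).minimize(J)

feed_dict = {X: X_data, c: c_data}

with tf.Session() as session:
    # Initialize all variables after building the graph
    session.run(tf.global_variables_initializer())
    for step in range(100):
        J_value = session.run(J, feed_dict)
        if step % 10 == 0:
            print("Step:", step, "Loss:", J_value)
        session.run(minimization_op, feed_dict)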


Update: TensorFlow 2 comes with an updated API. I haven't yet had the time to update this series of blog posts to that version. If you would like to migrate the example to TensorFlow 2, you can find information on how to do that here: https://www.tensorflow.org/guide/migrate.