Tuesday, November 13, 2018

Lessons on the Path to Righteousness and the Installation of TensorFlow on CentOS

TensorFlow is one of the open-source back ends for machine learning. With a Keras layer on top, it is one of the more popular machine-learning environments out there. Among other things, it supports both CPU and GPU computation on most operating systems.

As in so many things in life, a clever or lucky choice can achieve a goal with no effort, while a seemingly similar choice can result in weeks, years, or even decades of hell.

There are a number of surprises involved in installing these packages on your operating system of choice, and this note is intended to help you, dear reader, avoid shooting yourself in the foot, or in the head, as the case may be.

1. Never, never, never try to install from source, no matter who advises you to. It is perfectly possible to install from source on a bare-metal machine without any virtual environments, or you could just hit yourself with a large hammer for a few weeks. Who knew that there were so many different ways to install Python, or that there were so many Pythons? And that is just the tip of a very nasty set of icebergs.

2. So whenever you are given an opportunity to isolate yourself from the real world by using a virtual environment, whether in Python or anywhere else, take it. In particular, on Windows 10, a Python virtual environment plus a precompiled TensorFlow/Keras package will get you a CPU-only installation in an afternoon (a sketch of the commands appears after this list). For some of you, that is all you need, and you can move on.

3. For those of us in the Linux world, you now have to choose between a few specific versions of Ubuntu and everything else. Those of you who would compromise your integrity and have no aesthetic sense are welcome to use Ubuntu. Go, it is there for you.

4. For the rest of us, who might prefer an adult version of Linux, my operating system of choice is CentOS/RHEL 7.5, the most recent version as of this writing. I thought I had to compile from source, but this turns out not to be the case. One of the best paths through this jungle turns out to be the Docker (container) version, as follows.

5. Install Docker by registering as a free user of the Community Edition. Having registered, and having installed the preferred package from the preferred repository (the basic CentOS commands are sketched below), you are now able to pull and run container images that have been published to the Docker registry.

6. TensorFlow publishes new builds more or less every day, in a variety of flavors (CPU-only, GPU, and so on), and puts them out on the Docker registry under tags such as "latest" or "stable".

7. Using these magic words you can construct the name of the image you want to run. Docker pulls whichever of its layers are not already local, and, if you so specify, drops you into a shell inside the container. From there you can start Python, import TensorFlow and Keras, and you are off to the races with a CPU version of TensorFlow (sketched below).

8. Of course, at this point you are now using containers, and you will need to spend a day learning about container file systems and other nuances. It's not too bad, though.

9. For those of you who foolishly also want GPU acceleration, you have chosen a slightly more difficult path. You will have to install NVIDIA's variant of the "docker" runtime, distributed through GitHub, and you will also have to install the NVIDIA GPU driver on your Linux box (a bird of a different feather). But once you do, you can run one of the GPU containers from the list mentioned above (last sketch below).
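
For the virtual-environment route in item 2, the whole thing is roughly this. Treat it as a sketch, not gospel: the package names are the ones on PyPI as of this writing, and on Windows the activation script lives under Scripts\ rather than bin/.

    # create and enter an isolated Python environment
    python -m venv tf-env
    source tf-env/bin/activate        # on Windows: tf-env\Scripts\activate

    # precompiled, CPU-only TensorFlow plus Keras from PyPI
    pip install tensorflow keras

    # quick sanity check
    python -c "import tensorflow as tf; print(tf.__version__)"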
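
For item 5, the CentOS 7 recipe for Docker CE is essentially the one in Docker's own documentation; at the time of writing it looks roughly like this (the repository URL and package names may drift, so check their docs rather than trusting me).

    # add Docker's yum repository and install the Community Edition
    sudo yum install -y yum-utils
    sudo yum-config-manager --add-repo https://download.docker.com/linux/centos/docker-ce.repo
    sudo yum install -y docker-ce

    # start the daemon and have it come back after reboots
    sudo systemctl start docker
    sudo systemctl enable docker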
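
For items 6 and 7, pulling and running a CPU-only image looks roughly like this. The "latest" tag below is the one in use when I wrote this; check the tensorflow/tensorflow listing on the registry for the current set of tags.

    # fetch the CPU-only image and drop into a shell inside it
    docker pull tensorflow/tensorflow:latest
    docker run -it tensorflow/tensorflow:latest bash

    # then, inside the container, confirm that TensorFlow imports
    python -c "import tensorflow as tf; print(tf.__version__)"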
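
For item 9, once the NVIDIA driver and the nvidia-docker2 package (installed from NVIDIA's GitHub-hosted repositories, per their README) are in place, running a GPU-flavored image is roughly this. The --runtime=nvidia flag and the "latest-gpu" tag are as documented at the time of writing.

    # run a GPU-enabled image through NVIDIA's container runtime
    docker run --runtime=nvidia -it tensorflow/tensorflow:latest-gpu bash

    # inside the container, confirm that TensorFlow actually sees the GPU
    python -c "import tensorflow as tf; print(tf.test.is_gpu_available())"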

Good luck!
