Using Machine Learning to generate Shakespeare

After this tutorial, you will be able to create text based on previous Shakespeare using AI. Minimal programming/command line knowledge required.

A couple of months ago, I released a tutorial explaining the concepts of how Machine Learning and Artificial Intelligence work. Since then, I have given a TED Talk on Artificial Intelligence (which I will post on this blog soon), implemented multiple models, and gained a solid understanding of Torch and RNNs (Recurrent Neural Networks). If you have not yet read my previous tutorial, please do so -> machine-learning-and-neural-networks and then come back. I’ll wait.


Okay cool, now that you’re back with a basic understanding of AI and Machine Learning, let’s build a project using AI. This will be a multi-part tutorial. In this first part, I’ll show you how to use the Torch-RNN framework to create an AI model from the input of a text file.

What are we doing


Essentially, we will be providing an RNN framework (a type of neural network that learns patterns in sequences by feeding information from one step into the next) with a text file filled with Shakespeare’s writing. We will then let the framework train for hours and finally produce an output. This tutorial is divided into four steps:

  1. Setting up the framework
    1. Torch
    2. Graphics Card
    3. HDF5
  2. Inputting the Shakespeare
  3. TRAINING!
  4. Outputting original text

Disclaimer: This tutorial is not meant for those without lots of patience. You will encounter errors specific to your device. You are more than welcome to leave me a comment asking how to resolve them, but understand that this project will take you a long time. Figuring everything out (without clear instructions) took me three weeks, but with this tutorial, you can probably get it done in 3-5 days. Don’t get frustrated if things don’t work. Setting up the project (step one) is the hardest part, but after that, everything is worth it!

This tutorial is meant for everyone, but certain lines of code may not produce the same result on Windows. Because this is a Mac-oriented tutorial, I will provide alternative instructions in italics for Windows users at every step.

1. Setting up the Framework

A. Torch

We need to first install something called Torch. (Windows users use this guide http://torch.ch/docs/getting-started.html) The easiest way to do this is via command line, so open it up and enter this code:

git clone https://github.com/torch/distro.git ~/torch --recursive

cd ~/torch

bash install-deps

./install.sh

This also installs something called Lua, which is the programming language we will use in this tutorial. Now, type the following:

touch ~/.bash_profile; open ~/.bash_profile

and in the text editor that opens up, add these lines at the bottom:

# TORCH
export PATH=$PATH:/Users/<your user name>/torch/install/bin

All this does is let us use Torch from Terminal without having to navigate to a specific directory.
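Once you’ve saved and closed that file, it’s worth a quick sanity check that the PATH change worked. Here is a minimal check (you may need to open a new Terminal window instead of using source, and it assumes you filled in your actual user name above):

source ~/.bash_profile

which th

If which th prints a path ending in torch/install/bin/th, Torch is ready to go.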

B. Graphics Card (optional)

Do you have an NVIDIA graphics card? If you don’t know what this means, ignore this step and skip ahead to part C (HDF5).

I don’t, because I still use my old Mac after speeding it up (which you can read about here: themillibit.com/how-to-speed-up-your-computer), but if you do, then you can take advantage of the extra speed with one extra step. Download CUDA, NVIDIA’s GPU computing toolkit, and add these lines to the text editor that we opened up in the last step (you will need to run the open ~/.bash_profile line again):

# CUDA

export PATH=/Developer/NVIDIA/CUDA-7.5/bin:$PATH

export DYLD_LIBRARY_PATH=/Developer/NVIDIA/CUDA-7.5/lib:$DYLD_LIBRARY_PATH

Then in terminal, run this:

kextstat | grep -i cuda
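If that command prints a line mentioning CUDA, the driver is loaded. As an extra, optional check, nvcc is the compiler that ships with the CUDA toolkit, so the following should work if the install location we added to the PATH above is correct:

nvcc --version

It should print the CUDA release number (7.5 in this setup).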


C. HDF5

Okay cool, now we need to install the HDF5 library. Before installing it, let me explain how this works. At this point, you have everything you need to start training your AI engine and producing output; the only problem is that the AI framework doesn’t take raw text as input. Instead, it takes an “h5” file and a “json” file. To turn your text file into that “h5” input file, you need the HDF5 library.

Windows users, follow http://www.h5py.org/ to install HDF5

You can install the HDF5 library using a command line installer called Homebrew, which is useful for downloading libraries in general. So if you don’t have Homebrew, download it here: https://brew.sh/

Then, install HDF5 using homebrew with the following:

brew tap homebrew/science

brew install hdf5

Then, MOVE the torch folder to your home directory (if it isn’t there already). After that, get the Torch bindings for HDF5 with the following:

git clone git@github.com:deepmind/torch-hdf5.git

cd torch-hdf5

luarocks make hdf5-0-0.rockspec
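Before moving on, it’s worth confirming that the Torch bindings actually installed. Here is a quick check (luarocks is installed alongside Torch, so this should work from any directory):

luarocks list | grep -i hdf5

If an hdf5 entry shows up in the list, the binding is in place.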

Okay now, this is the last thing we need to do for HDF5. Basically, the following lines of code let you use the HDF5 library with Python, another language we will use for this project. (By the way, if you want to learn more about Python, you can check out my tutorial here, but it’s not necessary.)

sudo easy_install pip
sudo pip install h5py
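To make sure the Python side installed correctly, you can try importing it with a one-line check; if it prints a version number instead of an error, you’re set:

python -c "import h5py; print(h5py.__version__)"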

Okay, we’re almost done with this step. The final thing we want to do is download the repository of the Torch-RNN framework. This is different from Torch, which we installed earlier. Download the repository (for more about GitHub, check out my tutorial here) and unzip it into your home directory, where everything else has gone.
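To double-check that everything landed in the right place, list the folder’s contents (I’m assuming the unzipped folder is called torch-rnn; if you downloaded it as a zip, it may be called torch-rnn-master, in which case rename it):

ls ~/torch-rnn

You should see train.lua, sample.lua, and a scripts folder containing preprocess.py, among other files.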

 

2. Inputting the Shakespeare

Windows users also consider this

Okay so remember that GitHub repository we downloaded and put in our home directory? It comes with a sample file of Shakespeare! So now, all we need to do is take that text file, turn it into a JSON and H5 input file, and then train with it.

Objective: Turn our .txt of Shakespeare into BOTH a .json and an .h5 file in order to input them into the Torch-RNN framework. From inside your torch-rnn folder, run:

python scripts/preprocess.py \
  --input_txt data/tiny-shakespeare.txt \
  --output_h5 data/tiny_shakespeare.h5 \
  --output_json data/tiny_shakespeare.json

Okay so the above code does just that. It takes the text and converts it into a JSON and an H5 file. After entering the above code, the terminal should respond with something like the output below. If it doesn’t, it will most likely print an error. Google the error, or tell me in the comments, and try your best to get it resolved (an error usually means one of the setup steps wasn’t completed).

Total vocabulary size: 65
Total tokens in file: 1115394
Training size: 892316
Val size: 111539
Test size: 111539
Using dtype <type 'numpy.uint8'>
Yay, this means that an H5 and a JSON file have been generated in the data directory of your torch-rnn folder (which is in your home directory). Congratulations, we are now halfway there!
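If you’re curious what those two files actually contain, you can peek inside them from the terminal using the h5py and json libraries we installed earlier (a quick exploratory sketch; the exact key names may vary between versions of the framework):

cd ~/torch-rnn

python -c "import h5py; f = h5py.File('data/tiny_shakespeare.h5', 'r'); print(list(f.keys()))"

python -c "import json; print(list(json.load(open('data/tiny_shakespeare.json')).keys()))"

The H5 file should hold the text encoded as integer arrays (split into training, validation, and test sets), and the JSON file should hold the mapping between characters and those integers.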

3. Training

If you’ve made it this far, you’re a true AI warrior. All we need to do now is input those H5 and JSON files that we generated earlier into the Torch-RNN framework and let it train for a couple of hours.

th train.lua -input_h5 data/tiny_shakespeare.h5 -input_json data/tiny_shakespeare.json -gpu -1

If you have the CUDA GPU, remove the “-gpu -1” from the end of the line.

Windows users use:

th train.lua -input_h5 data/tiny_shakespeare.h5 -input_json data/tiny_shakespeare.json

Let me break this line down for you. The th signifies that the command is directed towards Torch, train.lua is the script that tells Torch we are about to train it on input files, -input_h5 [path to H5] provides the framework with the H5 input, and -input_json [path to JSON] provides the framework with the JSON input. The -gpu -1 signifies that you do not have the CUDA GPU that I mentioned briefly earlier. After running that line, you should see a message that looks similar to this:

Running with CUDA on GPU 0
Epoch 1.00 / 50, i = 1 / 17800, loss = 4.165219
Epoch 1.01 / 50, i = 2 / 17800, loss = 4.028401
Epoch 1.01 / 50, i = 3 / 17800, loss = 3.935344
This will run for 50 epochs and will probably take over an hour (depending on your computing speed and whether you are using a GPU). An epoch occurs when the machine learning framework completes one full pass over the data set. Basically, it will go through the JSON and H5 files 50 times, reinforcing and relearning the patterns in the text. Now go take a coffee break, go on a run, or do something else. This might take a while, and once it finishes, come back for step four.
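Once your first run works, you can experiment with train.lua’s other options; for example, a bigger network trained for fewer epochs might look like the line below. These flag names come from the torch-rnn documentation, so double-check them against the README in your copy of the repository, and drop the -gpu -1 if you are using CUDA:

th train.lua -input_h5 data/tiny_shakespeare.h5 -input_json data/tiny_shakespeare.json -rnn_size 256 -num_layers 2 -max_epochs 25 -gpu -1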

4. Generating some output

Just enter this line, replacing [CHARACTEROUTPUT] with the number of characters you want (2000 is a good start), and your output will be generated:

th sample.lua -checkpoint cv/checkpoint_10000.t7 -length [CHARACTEROUTPUT] -gpu -1

This will print the output to your terminal window. NOTE: If you have a more recent checkpoint than checkpoint_10000.t7, use that.

Windows users use: 

th sample.lua -checkpoint cv/checkpoint_10000.t7 -length 2000

 

Here are the flags, as detailed in the torch-rnn documentation. Basically, if you want more specific output, you can add any of these flags to the line of code above and the framework will take them into account:

  • -checkpoint: Path to a .t7 checkpoint file from train.lua
  • -length: The length of the generated text, in characters.
  • -start_text: You can optionally start off the generation process with a string; if this is provided the start text will be processed by the trained network before we start sampling. Without this flag, the first character is chosen randomly.
  • -sample: Set this to 1 to sample from the next-character distribution at each timestep; set to 0 to instead just pick the argmax at every timestep. Sampling tends to produce more interesting results.
  • -temperature: Softmax temperature to use when sampling; default is 1. Higher temperatures give noisier samples. Not used when using argmax sampling (sample set to 0).
  • -gpu: The ID of the GPU to use (zero-indexed). Default is 0. Set this to -1 to run in CPU-only mode.
  • -gpu_backend: The GPU backend to use; either cuda or opencl. Default is cuda.
  • -verbose: By default just the sampled text is printed to the console. Set this to 1 to also print some diagnostic information.
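For example, to generate 2,000 characters that start from a prompt and sample a little more conservatively, you could combine a few of those flags like this (again, drop the -gpu -1 if you are running on CUDA):

th sample.lua -checkpoint cv/checkpoint_10000.t7 -length 2000 -start_text "ROMEO:" -temperature 0.7 -gpu -1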

That’s it! You should now have generated your own Shakespeare! Here is the file I generated:

POLIXENENT:
Why! when far she’s deny upon my dead’st
Did me; I cheer the king, nothing nature!
Soud, at by down and being noble other
That I am speak me to my prayers.

MENENIUS:
Trear’ster should come, I will shall’t confise Grord no;
‘Tis title, he did, we must swelk’d were send
a sablot and mark true a very live!
The golden speak to enterate to of blood
time to our head, give like your guess; thereochlow’d
As you thou rare made that poor queen,
Bestance away of that good make our maident,
And queen, the and your soon confell you, women,
I joyness strumber’d from him to thy flower
in; whom more deneficers as doubting.

BRUTUS:
Marry, but those gone that was he most arms!

HENRY PERCY:
My seraces, one unjorman, bounds no beastly’s will.

Nurse:
I warrow down, give thee the land to Hended
Hath alreadon gentleman, you lesses; let me hear;
And shall be Manton; but allide; for makes my oan niggle
And do the wife and some. What which I none for!
Nothing funcellemanes work calnants that best thousands
On knowly ament was throve them appearta!
Scare for them he afters out, ‘matter as Warwick.

KING RICHARD II:
I desleazed to your ryor of good, they
restate shalphing yet heldofs! I; thou, combom,
Long’d and lads my nought in thy queen, dreabun,
As too begeter, roage to thy issives, alones
That holdiers to desible say on the prison,
I’ll a strong to lick an ools honest found.

First Solvignous, Lord:
By they are needy by then I am side
The tuist for the sweet bumber we mad my sight,
And bither received be bequestance will I, and so;
Outler, who now, Marcius, but thou wiltch or bear;
My by out his neishap of monather say
Will longer behalt.

CAPULET:
Comes stain’d the bent sovereignter of him.

ROMEO:
Now it your mind to me; but I say:
Nurse from his long stoody in they from the cad!
So dead Gloung say, for made not safed how courters,
But when not scape at that time advance; thou,
And I will spray dungs thy country: could Tush:
I down of the coldar the most bid undersc

Isn’t that awesome? You just successfully generated your own unique Shakespeare, totally original yet based on the patterns the real Shakespeare wrote in! AI is super cool, and I hope you take some time to experiment with it. Feel free to comment with any other projects you’ve completed with AI.