A.I., Machine Learning, and zombies

Ever since I got my first Commodore Vic20 (and then later the Commodore 64), I’ve been addicted to using the computer to make art. And while my college years were spent studying traditional painting and drawing, as soon as I graduated and moved to Los Angeles, I dove headfirst into using the computer to animate and paint.

Recently, the field of A.I. and Machine Learning has advanced by leaps and bounds, finally making these tools available to artists. I first started playing around with a web-based app called Pix2Pix, which let you sketch something and the computer would “flesh” it out.

My initial results were pure nightmare fuel.

pix2pix.png

Not too long after, I discovered a site called Artbreeder, which gave access to a set of pre-trained models. Mostly you could make your photos look like a Van Gogh, or like any other artist whose work the A.I. had been trained on. But I wanted to be able to train my own model. I wanted the computer to make new work based on my entire body of digital paintings.

That’s when I found RunwayML.

RunwayML’s aim was to make training new A.I. models accessible to artists, which is exactly what I was wishing for. And their site made the whole process super painless to hop into. Soon I was training models on my amoeba characters, some digital faces, and now, zombies.

My first experiment in the land of the A.I. undead was to grab all the photos of zombies I could find from The Walking Dead. I trained a model on them, and it started giving me some pretty creepy but awesome output. Soon I realized that I would have to fix some of its attempts and re-train it to include the newly altered imagery.

This idea turned out to work extremely well.

img000000000.jpg
img000000023.jpg

Then I decided to start re-painting the images to be a little sillier, more in my style. Those results were just as weird.

But a few days ago, I realized I had enough imagery of my silly, squiggly zombies to try training a model on those. And boy am I glad I did! Right away the computer started making things that approximated my wide-eyed, confused zombies. I just had to keep training it.

But then something went wrong. In Machine Learning, there is a measurement used to describe how close the computer’s attempts are to recreating the source imagery. This is called the FID score, or Fréchet Inception Distance. It’s basically a measure of how far off the generated images are from the original images in the training data-set, not pixel by pixel, but in terms of the overall look and statistics of the images, and lower is better.
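For the curious: RunwayML computes and reports this number for you, but roughly speaking, FID boils down to comparing the statistics of image features. Here’s a minimal Python sketch of that calculation, assuming you’ve already run the real and generated images through a pretrained feature extractor (the function and array names here are just placeholders for illustration, not anything from RunwayML):

```python
# A rough sketch of the Frechet Inception Distance calculation.
# Assumes `real_features` and `generated_features` are NumPy arrays of
# feature vectors (one row per image) pulled from a pretrained network;
# the names are placeholders for illustration.
import numpy as np
from scipy.linalg import sqrtm

def fid_score(real_features, generated_features):
    # Summarize each set of images as a mean and covariance of its features
    mu_r = real_features.mean(axis=0)
    mu_g = generated_features.mean(axis=0)
    sigma_r = np.cov(real_features, rowvar=False)
    sigma_g = np.cov(generated_features, rowvar=False)

    # Distance between the two means
    mean_term = np.sum((mu_r - mu_g) ** 2)

    # Matrix square root of the product of the covariances
    covmean = sqrtm(sigma_r @ sigma_g)
    if np.iscomplexobj(covmean):
        covmean = covmean.real  # discard tiny imaginary parts from rounding

    # Lower is better: 0 would mean the two sets look statistically identical
    return mean_term + np.trace(sigma_r + sigma_g - 2.0 * covmean)
```

The practical takeaway is that a lower FID means the generated zombies are statistically closer to the training zombies, and if the number starts creeping back up late in training, that’s one sign the model has been over-trained.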

And somewhere in my over-excitement, I had over-trained my zombie model.

So my current goal is to take the output from the stuff that worked and clean it up. Re-paint it all and feed it back to the computer, like I did with the Walking Dead zombies. My hope is that with a little time and care, I can train a little robot Petey to make very good silly squiggly zombies. Just like his dad.