Best of the Bots

A slew of paint colors named by a neural network, including such gems as “turdly” and “rose hork,” made it big last week, with mentions in Ars Technica, The AV Club, and even The Atlantic. But for the story straight from the source, check out lewisandquark.tumblr.com, the wacky neural network blog of optics researcher Janelle Shane. Her original post about paint colors went viral, but I got even more out of part 2, in which she writes about how she tweaked the algorithms to get better results. (Though “copper panty” is a marginal improvement at best.)

“May Picture” by Paul Klee, who as far as I know did not buy his paint from a neural network. Public domain, via the Metropolitan Museum of Art.

My introduction to the machine-generated text genre was the now-dormant King James Programming tumblr, featuring lines from a Markov chain trained on the King James Bible and some computer programming texts. Its most recent contribution was “37:29 The righteous shall inherit the land, and leave it for an inheritance unto the children of Gad according to the number of steps that is linear in b.” Timeless wisdom, to be sure. 
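For the curious, the machinery behind King James Programming is simple enough to sketch in a few lines: a Markov chain text generator records which words follow each short prefix in the training text, then repeatedly samples a next word from those statistics. Here is a minimal word-level version in Python; the tiny corpus string is a stand-in of my own, since the real model was trained on the King James Bible plus programming textbooks.

```python
import random
from collections import defaultdict

# Stand-in corpus; the real King James Programming model mixed the King James
# Bible with computer programming texts.
corpus = (
    "the righteous shall inherit the land and the number of steps "
    "shall be linear in the size of the input and the land shall be good"
)

def build_chain(text, order=2):
    """Map each `order`-word prefix to the list of words observed after it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        prefix = tuple(words[i:i + order])
        chain[prefix].append(words[i + order])
    return chain

def generate(chain, order=2, length=20):
    """Start from a random prefix and repeatedly sample a plausible next word."""
    prefix = random.choice(list(chain.keys()))
    output = list(prefix)
    for _ in range(length):
        followers = chain.get(tuple(output[-order:]))
        if not followers:
            break
        output.append(random.choice(followers))
    return " ".join(output)

print(generate(build_chain(corpus)))
```

With a corpus this small the output mostly parrots the input; the charm of King James Programming comes from feeding the chain two very different, very large texts.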

Neural networks and other machine learning techniques are hot right now. Google’s AlphaGo, which uses neural networks to decide which moves to play, is now consistently beating the best Go player in the world (read more about that at The Math Less Traveled by Brent Yorgey), so we humans have lost the edge in basically the last game we were still comparatively good at. I think it’s only prudent to keep an eye on what our robot overlords have in store for us as they take over more and more formerly human tasks.

If the paint colors, recipes, Irish tune names, and pickup lines at Lewis and Quark aren’t enough for you, there’s a lot more algorithmic creativity to choose from. At jamesoff.net, you can click until you find a recipe that actually sounds like food. (Let me know if you try grilled coffee, with its ingredients of milk, coffee, mayonnaise, and lamb chops.) High Noon GMT has something even better than Irish tune names: entire Irish tunes generated by computers. I’ve really enjoyed seeing experiments involving word2vec, which embeds words as vectors in a high-dimensional space (300 dimensions in the commonly used pretrained model). I first remember learning about it from Jordan Ellenberg’s blog, and just the other day a friend pointed me to a word2vec reinterpretation of Genesis I using only words that begin with “a.”
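If you want to poke at word vectors yourself, here is a minimal sketch using the gensim library and a pretrained word2vec file. Both are my choices for illustration, not anything prescribed by the posts above, and the Google News file name is an assumption; any word2vec-format file will do.

```python
from gensim.models import KeyedVectors

# Load pretrained 300-dimensional word2vec vectors. The file name below is the
# standard Google News release and is assumed to be downloaded locally.
vectors = KeyedVectors.load_word2vec_format(
    "GoogleNews-vectors-negative300.bin", binary=True
)

# Each word is a point in 300-dimensional space, so "related words" becomes a
# nearest-neighbor query.
print(vectors.most_similar("paint", topn=5))

# The classic vector-arithmetic party trick: king - man + woman lands near queen.
print(vectors.most_similar(positive=["king", "woman"], negative=["man"], topn=1))
```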

For bite-sized chunks of machine learning, check out this list of bots I follow on Twitter. I’m using “bot” as a sort of catch-all term for a lot of different kinds of computer-generated tweets. Census Americans tweets census data of randomly selected Americans. Symmetric Curves tweets beautiful randomly generated curves with radial symmetry. Picdescbot tweets the image descriptions it comes up with for random pictures from Wikimedia Commons (with varying degrees of success). All of the bots I follow inject a dose of randomness and usually some levity into my day. So I’d like to thank the bots for amusing me so much as they work toward world domination or their own line of Sherwin-Williams paints — whichever comes first.
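Since Symmetric Curves came up: one rough way to draw a random curve with k-fold radial symmetry is to build the radius function out of harmonics whose frequencies are multiples of k, so rotating by 2π/k leaves the curve unchanged. The sketch below (numpy and matplotlib) is just my illustration of that idea; I don’t know what the bot actually does under the hood.

```python
import numpy as np
import matplotlib.pyplot as plt

# Draw a random closed curve with k-fold rotational symmetry by using only
# harmonics that are multiples of k in the radius function r(theta).
rng = np.random.default_rng()
k = 6                                    # order of rotational symmetry
theta = np.linspace(0, 2 * np.pi, 2000)

r = np.ones_like(theta)
for harmonic in range(1, 4):
    amp = rng.uniform(0.1, 0.4)          # random amplitude for this harmonic
    phase = rng.uniform(0, 2 * np.pi)    # random phase shift
    r += amp * np.cos(harmonic * k * theta + phase)

plt.plot(r * np.cos(theta), r * np.sin(theta))
plt.axis("equal")
plt.axis("off")
plt.show()
```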


2 Responses to Best of the Bots

  1. Cat says:

    Excellent post! Very interesting and funny stuff. Another example would be Reddit’s subreddit known as “Subreddit Simulator,” which tends to produce some similarly entertaining content. Here’s a link: https://www.reddit.com/r/subredditsimulator

  2. Jordan Ellenberg says:

    I like algebraic topologist Eli Grigsby’s posts on neural nets at

    http://iamalearningcomputer.blogspot.com/

    I’ve seen a million “this is what a neural net does” posts, but somehow her perspective is an especially clear one for pure mathematicians like me!
