
Ranma sitting in the middle of chicken scratch
#expressosartkinda#fanart#anime#ranma saotome#doodle#i cant believe that humans looked at the beauty of earth#and decided to invent RNN#or even gradient descent#fuck that shit
How a neural net makes cookies
The other day I trained a neural net to generate the names of cookies, based on about 1,000 existing recipes. The resulting names (Quitterbread Bars, Hand Buttersacks, Low Fuzzy Feats, and more) were both delightfully weird and strangely plausible. People even invented delicious recipes for them. But given that I’ve trained neural networks to generate entire recipes before, why not have the neural network generate the entire thing, not just the title?
Well, this is why.
The first neural network I tried is textgenrnn, which I’ve used to generate things like new species of snakes, names for creepy shopping malls, and terrifying robotics teams. Given 1,000 cookie recipes from David Shields’s site, textgenrnn could do a recognizable recipe - but its titles and directions were a bit suspect.
Now, granted, it’s confused about other things, too. A memory approximately 40 characters long means that it doesn’t know how many times it has already added sugar (apparently its favorite ingredient). (Other algorithms, like GPT-2, have ways to zoom out.)
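That forgetting is easy to picture in plain Python. This is just an illustration of a 40-character context window, not anything from textgenrnn itself, and the recipe text is made up:

```python
# A model with a ~40-character memory conditions only on the tail
# end of the text it has generated so far.
WINDOW = 40  # rough context length described above

recipe = "1 cup sugar\n" * 6 + "2 eggs\n"  # six sugars so far
visible = recipe[-WINDOW:]                 # all the model can "see"

print(recipe.count("sugar"))   # 6 additions in the full recipe
print(visible.count("sugar"))  # only 3 are still in view
```

From inside the window, three of the sugar additions simply never happened - so why not add more sugar?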
I decided to see if textgenrnn would figure out recipe titles if it trained for longer. It generated the recipe above after it had seen the example recipes 3 times each. Below is what it did after another 3 passes through the dataset (6 in total). The title is... different. The recipe is more chaotic. It has at least moved on from its obsession with sugar.
After 3 more looks at the data (9 in total), things have gotten even worse, though according to the loss (the number it uses to track how closely it matches the training data), it thinks it’s doing better than ever. It seems to be freaking out in particular about the repeated + signs that some recipes use as decorative borders.
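For the curious, the loss these models report is typically an average cross-entropy: the negative log of the probability the network assigned to each character that actually came next in the training text. A toy calculation (the probabilities are invented for illustration) shows why memorizing easy patterns like the + borders drives the number down:

```python
import math

# Average negative log-probability assigned to the characters that
# actually came next - lower means a closer fit to the training text.
def cross_entropy(probs_of_true_chars):
    return -sum(math.log(p) for p in probs_of_true_chars) / len(probs_of_true_chars)

# A model that memorizes decorative "+++" borders can assign them
# very high probability...
confident = [0.9, 0.95, 0.9, 0.99]
uncertain = [0.3, 0.25, 0.4, 0.2]

print(cross_entropy(confident))  # low loss
print(cross_entropy(uncertain))  # much higher loss
```

...which is why a falling loss on the training text doesn’t guarantee that the sampled recipes look any better to a human.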
There are terms for these kinds of training disasters that sound more like warp engine malfunctions: “mode collapse” usually applies to image-generating algorithms, though, and “exploding gradients” usually is signaled by the neural net thinking it’s doing worse and worse, not better and better.
So I moved back to another algorithm I’ve used for a long time, char-rnn, which seems to do well at long texts like recipes.
The recipes are... better. I’ll give it that much.
Some of its ingredients are questionable, and its directions certainly are. But at least (despite a memory only 50 characters long) it has managed to do the beginning, middle, and end of a long recipe. It’s often fuzzy about how to end a recipe, since my rather messy dataset has recipes ending with sources, urls, and even ISBNs. Recipe notes are also highly variable and therefore confusing.
So what happened with textgenrnn? I mentioned my difficulties to the algorithm’s creator, Max Woolf, and he urged me to try again. There’s some randomness to the training process: sometimes textgenrnn degenerates into chaos during training, and sometimes it even pulls itself back together again. When it did well, its instructions even started to make sense. You could make the Butterstrange Bars below (almost). Given this amount of randomness, it’s nice when researchers report the aggregate results of several training runs.
Support AI Weirdness and get an Excellent (not excellent) recipe for “Mrs. Filled Cookies”.
A neural network designs Halloween costumes
It’s hard to come up with ideas for Halloween costumes, especially when it seems like all the good ones are taken. And don’t you hate showing up at a party only to discover that there’s *another* pajama cardinalfish?
I train neural networks, a type of machine learning algorithm, to write humor by giving them datasets that they have to teach themselves to mimic. They can sometimes do a surprisingly good job, coming up with a metal band called Chaosrug, a craft beer called Yamquak and another called The Fine Stranger (which now exists!), and a My Little Pony called Blue Cuss.
So, I wanted to find out if a neural network could help invent Halloween costumes. I couldn’t find a big enough dataset, so I crowdsourced it by asking readers to list awesome Halloween costumes. I got over 4,500 submissions.
The most popular submitted costumes are the classics (42 witches, 32 ghosts, 30 pirates, 22 Batmans, 21 cats (30 incl sexy cats), 19 vampires, and 17 each of pumpkins and sexy nurses). There are about 300 costumes with “sexy” in their names; some of the most eyebrow-raising include sexy anglerfish, sexy Dumbledore, sexy golden pheasant, sexy eyeball, sexy Mothra, Sexy poop emoji, Sexy Darth Vader, Sexy Ben Franklin, Sexy TARDIS, Sexy Cookie Monster, and Sexy DVORAK keyboard. In the “technical challenge” department, we have costumes like Invisible Pink Unicorn, Whale-frog, Glow Cloud, Lake Michigan, Toaster Oven, and Garnet.
All this is to say that humans are very creative, and this task was going to be tricky for a neural network. The sensible approach would be to use a neural network that actually knows what the words mean - there are such things, trained, for example, by reading all of Google News and figuring out which words are used in similar ways. There’s a fun demo of this here. It doesn’t have an entry for “Sexy_Gandalf”, but for “sexy” it suggests “saucy” and “sassy”, and for “Gandalf” it suggests “Frodo”, “Gollum”, and “Voldemort” - so you could use this approach to go from “Sexy Gandalf” to “Sassy Voldemort”.
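Under the hood, “used in similar ways” means the words end up with nearby vectors, and nearness is usually measured by cosine similarity. Here’s a stdlib-only caricature with hand-invented 3-number vectors (real embeddings like word2vec’s have hundreds of dimensions, and these values are entirely made up):

```python
import math

# Cosine similarity: 1.0 for vectors pointing the same way, 0 for
# unrelated directions.
def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy vectors, invented for illustration only.
vectors = {
    "sexy":      [0.9, 0.1, 0.0],
    "sassy":     [0.8, 0.2, 0.1],
    "gandalf":   [0.0, 0.9, 0.8],
    "voldemort": [0.1, 0.8, 0.9],
}

# "sassy" sits much closer to "sexy" than "gandalf" does:
print(cosine(vectors["sexy"], vectors["sassy"]))
print(cosine(vectors["sexy"], vectors["gandalf"]))
```

Swap in the nearest neighbors of each word and “Sexy Gandalf” becomes “Sassy Voldemort”.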
I wanted something a bit weirder, so I used a neural network that learns words from scratch, letter by letter, with no knowledge of their meaning: an open-source char-rnn neural network written in Torch. I simply dumped the 4,500 Halloween costumes on it and told the neural network to figure it out.
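A char-rnn carries a learned hidden state and is far more capable, but the letter-by-letter idea can be caricatured with a character bigram model: count which character tends to follow which, then sample one character at a time. This stdlib-only toy (the costume list is a made-up stand-in for the real dataset) produces exactly the kind of early-training letter salad you’ll see in a moment:

```python
import random
from collections import defaultdict

# Toy stand-in for letter-by-letter learning: record which character
# follows which, then sample new "costumes" from those counts.
names = ["sexy cat", "sexy pirate", "sexy pumpkin", "witch", "pirate"]

follows = defaultdict(list)
for name in names:
    padded = "^" + name + "$"        # ^ marks start, $ marks end
    for a, b in zip(padded, padded[1:]):
        follows[a].append(b)         # duplicates preserve frequency

def sample(rng, max_len=20):
    out, ch = [], "^"
    for _ in range(max_len):
        ch = rng.choice(follows[ch])
        if ch == "$":
            break
        out.append(ch)
    return "".join(out)

rng = random.Random(0)
print([sample(rng) for _ in range(5)])
```

Unlike this toy, the real network’s hidden state summarizes everything it has seen so far, which is how it eventually graduates from gibberish to coherent mashups.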
Early in the training process, I decided to check in to see how it was doing.
Sexy sexy Dombie Sexy Cat Sexy A stare Rowan Sexy RoR A the Rog Sexy Cot Sexy Purbie Lampire Poth Rat Sexy Por Man The Wombue Pombie Con A A Cat The Ran Spean Sexy Sexy Pon Sexy Dander Sexy Cat The Gull Wot Sexy Pot Hot
In retrospect, I should have expected this. With a dataset this varied, the words the neural network learns first are the most common ones.
I checked in a little later, and things had improved somewhat. (Omitted: numerous repetitions of “sexy nurse”.) Still, the only thing that makes sense is the word Sexy.
Sexy The Carding Ging Farbat of the Cower Sexy The Hirler A costume Sexy Menus Sexy Sure Frankenstein’s Denter A cardian of the Pirate Ging butter Sexy the Girl Pirate
By the time I checked on the neural network again, it was not only better, but astoundingly good. I hadn’t expected this. But the neural network had found its niche: costume mashups. These are actually comprehensible, if a bit hard to explain:
Punk Tree Disco Monster Spartan Gandalf Starfleet Shark A masked box Martian Devil Panda Clam Potato man Shark Cow Space Batman The shark knight Snape Scarecrow Gandalf the Good Witch Professor Panda Strawberry shark Vampire big bird Samurai Angel lady Garbage Pirate firefighter Fairy Batman
Other costumes were still a bit more random.
Aldonald the Goddess of the Chicken Celery Blue Frankenstein Dancing Bellyfish Dragon of Liberty A shark princess Statue of Witch Cupcake pants Bird Scientist Giant Two butter The Twin Spider Mermaid The Game of Nightmare Lightbare Share Bat The Rocky Monster Mario lander Spork Sand Statue of pizza The Spiding hood A card Convention Sailor Potter Shower Witch The Little Pond Spice of pokeman Bill of Liberty A spock Count Drunk Doll of Princess Petty fairy Pumpkin picard Statue of the Spice of the underworker
It still was fond of using made-up words, though. You’d be the only one at the party dressed as whatever these are.
Sparra A masked scorby-babbersy Scormboor Magic an of the foand tood-computer A barban The Gumbkin Scorbs Monster A cat loory Duck The Barboon Flatue doctor Sparrow Plapper Grankenstein The Spongebog Minional marty clown Count Vorror Rairol Mencoon A neaving hold Sexy Avical Ster of a balana Aly Huntle starber pirate
And it ended up producing a few like this.
Sports costume Sexy scare costume General Scare construct
The reason? Apparently someone decided to help out by entering an entire costume store’s inventory. (“What are you supposed to be?” “Oh, I’m Mens Deluxe IT Costume - Size Standard.”)
There were also some like this:
Rink Rater Ginsburg A winged boxer Ginsburg Bed ridingh in a box Buther Ginsburg Skeleton Ginsburg Zombie Fire Cith Bader Ginsburg
Because someone had entered about 50 variations on Ruth Bader Ginsburg puns (Ruth Tater Ginsburg, Sleuth Bader Ginsburg, Rock Paper Ginsburg).
It invented some awesome new superheroes/supervillains.
Glow Wonder Woman The Bunnizer Ladybog Light man Bearley Quinn Glad woman robot Werewolf super Pun Super of a bog Space Pants Barfer buster pirate Skull Skywolk lady Skynation the Goddess Fred of Lizard
And oh, the sexy costumes. Hundreds of sexy costumes, yet it never quite got the hang of it.
Sexy Scare Sexy the Pumpkin Saxy Pumpkins Sexy the Pirate Sexy Pumpkin Pirate Sexy Gumb Man Sexy barber Sexy Gargles Sexy humblebee Sexy The Gate Sexy Lamp Sexy Ducty monster Sexy conchpaper Sexy the Bumble Sexy the Super bass Pretty zombie Space Suit sexy Drangers Sexy the Spock
You bet there are bonus names - and oh please go read them because they are so good and it was so hard to decide which ones to fit into the main article. Includes the poop jokes. You’re welcome.
I’ve posted the entire dataset as open-source on GitHub.
And you can contribute more costumes, for a possible future neural net upgrade (no email address necessary).
New My Little Ponies, designed by neural network
The Kingdom of Equestria is inhabited by thousands of colorful, magical ponies, whose life cycle, socioeconomics, and biomechanics are best not investigated too closely. Their names are usually something like “Rainbow Dash” or “Diamond Tiara” or (my favorite because she’s totally a grad student pony) “Twilight Sparkle”.
Often the plot calls for crowd scenes (usually involving ponies in great peril), and I worry that some day the creators of My Little Pony will run out of names. In the spirit of being helpful, I decided to put a computer to the task of generating lots of new ponies.
I used a program called a character-level recurrent neural network (char-rnn), which looks at examples of text (Pokemon, or Harry Potter fan fiction, or even guinea pig names) and learns to imitate them. I gave the neural network more than 1,500 names from My Little Pony Friendship is Magic Wiki, and let it start learning.
Result: partial success.
It did come up with some pretty plausible-sounding ponies, certainly not as weird as some of the ponies that have already appeared on the show (such as Groucho Mark and Button Mash and Buzzard Hooffield).
Star Blueberry Sprinkle Cherry Bolt Berry Spy Sweet Glints Cheer Belle Sunferry Sunshine Star Sweet Bolt Cherry Curls Mint Flower Bright Seas Flight Star Plum Flower Sweet Suns Brash Clouds Cheery Breath Cloudy Daze Big Blue Brass Flare Blue Chile Coco Mane Neon Brush Strawberry Sun Sugar Top Cinnamon Mark Glowberry Amethyst Mist
The neural network also came up with some seriously tough-sounding ponies, ones that you would definitely want on your side when fighting giant killer cupcakes, or whatever the peril is this week.
Cold Sting Scarline Shoot Bolt Sunder Bright Dark Role Sob Dancer Sunsrot Masked Rock Roar Starlich Command Pony Deader Pony Flint Sting Steel Roller Dark Candy Scarphore Creep Well Prince Still Stare Rust Crack Colder Sanderlash Bitter Star
But the neural network’s results weren’t all successful. It also came up with some ponies that probably wouldn’t be on the A-team.
Dunder Dort Tardy Pony Flunderlane Flueberry Sherry Marina Doof Want Cone Starf Dad Star Star Flurtershy Starly Star Mr. Atple Pony Pony Packy Pack Pinky Swoll Apple Apple Dim McColt Free Sing Fail Poney Hoof Tasting Spar Dirky Flithers Arple Robbler Chest Star Barp Moon Mr. Wander
It also invented some ponies that are just plain weird.
Lilie Lice Billy Boon Wootson Mice Full Fish Crest Suns Sun Ramen Breek Smarky Hondsarors Blither Bon Persy Belly Pony String Heart Swinkleshine Flint Cream Star Sandlime Rocky Scooppony Piemonk String Punch Apple Stork Bunny Maze Lilac Ruster Winker-Moon Charmy Vine Swan Break Wags Pine Pearlicket Nandy Quark Firey Up Tracklewock Packin Flustershovel Aoetel Pakeecuand Tapshine Sugar Cloudsdalou Sandy Apple Mitten Splash Silvermice Butter Flash Agar Swirl Cheese Breeze
And a list of ponies you might want to avoid:
Clotter Raspberry Turd Blueberry Pants Benny Sweat Parpy Stink Blue Cuss Groan Rear Pony Lace Crunk Rade Slime Derdy Star Swill Brick Colona Pocky Mire Hoofed Snarch Apple Ronch Trowel Pony Smanky Hank Princess Sweat
Pony pictures created using General Zoi’s Pony Creator
#neural networks#char-rnn#my little pony#my little pony friendship is magic#mlp#ponies#flustershovel
Computer algorithms, invented by a computer
I train neural networks, which are a type of machine learning program that imitates the way human brains learn. (Technically, they’re artificial neural networks, to differentiate them from the biological sort). Unlike traditional computer programming, where a human programmer comes up with a long set of rules for a computer to follow, with neural networks, the computer learns by example and comes up with its own rules.
Most of us encounter neural networks every day - they power face recognition, automatic language translation, object recognition, and self-driving cars. The neural networks I train, though, are for more modest, and sillier purposes - inventing new paint colors (like Burf Pink and Stanky Bean) or new names for guinea pigs (Popchop and Fuzzable, for example).
Today’s experiment: computer algorithms.
Not the algorithms themselves, mind you - that sounds difficult. Just their names: 2,045 of them, from the Wikipedia list of computer algorithms (big thanks to Joshua Schachter for extracting the data). I gave them to a simple char-rnn neural network, thinking it would be very interesting to find out what one computer program decides to name another.
The results do sound pretty algorithmic - you might be able to get away with recommending these to a programmer who’s stuck on a problem, if you leave before they google it.
Moshwack algorithm Stardand's algorithm Super–Kelnic algorithm Soft sort Vandical time algorithm Moloning Go sort Hair mato-sort Speedated heeling tree Jusi tree Shamer Gorper's algorithm Protacons Spade optimization Wash problem Gore search Bollard method
These are a bit less plausible (especially the inevitable fart algorithms - for some reason, the word “fart” often comes up in this neural network’s results).
Farter search Prebabel strung parser Boonfus-(computer scearch Stani computer somplerity farter estimator Purparden argloximication Rendamical fimfering Pint blops Wolgaren farting Gimprach relucing Suav clopping Random damplestremptic ferchion Pand fassing Aromatic pashering contex algorithm Farter-hear srial fecty optimization
And these prove that basic and simple do not necessarily mean more believable. Don’t suggest using these. You will not sound smart.
Jashen computer statistication Computer algorithms Mathator sort Somperting problem Complexity computing Code forting algorithms Rare algorithm Lean mathing Coding Rarge problem Compater partimation Gerertic proplem Spreeer–Mate proplem Barse me