#char-rnn
Text
GAZA (PrR)(RNN) — Since this morning, 40 martyrs have ascended across the Gaza Strip due to IOF bombings.
The most prominent of these massacres was at the Abu Assi School, where 10 martyrs ascended and 20 were wounded, their heads cut off and their bodies charred.
Amidst a security incident, the IOF violently bombarded the western areas of Beit Lahia in the northern Gaza Strip, which has been besieged for the 42nd day, with artillery and airstrikes.
In the south, drones constantly hover over the eastern areas of Rafah, where Civil Defense crews recovered several martyrs and wounded. The Khan Younis Municipality also announced that water and sewage facilities and pumps, as well as waste collection and transportation, would stop operating due to the lack of required fuel.
The details of other attacks are as follows, ordered chronologically since dawn:
Bombing in Shuja'iyya at dawn, resulting in 3 martyrs from the Samara family and several wounded.
Artillery shelling northwest of Nusseirat. Bombing of Jabalia, coinciding with artillery shelling. Quadcopter firing in Jabalia Al-Nazla.
A martyr, Yousef Suwairki, as a result of bombing in Gaza City.
Bombings in Jabalia, Beit Lahia, and Nusseirat; shelling from vehicles and boats north and west of Nusseirat - around 7 AM.
Shelling east of Khan Younis and in Beit Lahia.
Explosive bombs dropped by drones in Jabalia and in Rafah, resulting in at least three wounded, and three martyrs in Rafah (including Mahmoud Al-Sarsak).
Bombing a group of citizens near Al-Aqqad School north of Rafah; bombing by warplanes south of Gaza City.
Three martyrs recovered in Beit Lahia (Omar Abu Al-Maza, Yousef Abu Rabie, Ahmed Hamouda) -
Five drone strikes north of Rafah, and continued shelling in Beit Lahia and Jabalia - 11 AM
Three martyrs east of Rafah due to shelling.
Two martyrs recovered in Rafah
A martyr in Al-Sabra, south of Gaza City - Saddam Asraf.
A martyr, Harith Shahwan, ascended in Gaza City.
A martyr, the child Abdulrahman Al-Sharif, by IOF bullets in northern Gaza.
A martyr in Rafah after the child Yassin Al-Shaer succumbed to his wounds.
Five martyrs as a result of a bombing in Rafah, including Saqer Sabah
Two women and a child wounded by a drone strike in Deir Al-Balah.
A martyr - Tamer Walid Ezz El-Din, north of Gaza City.
Four martyrs and wounded in the Arjani family home southwest of Khan Younis. Then, in the late afternoon, another home belonging to the Arjani family was bombed, resulting in several wounded.
A martyr - Osama Al-Hams, as a result of bombing of Rafah.
A martyr - Mustafa Qanoun, due to shelling west of Nusseirat.
Heavy artillery shelling, gunfire, and smoke in the Saftawi area of Jabalia, in addition to a powerful airstrike targeting a barbershop in the Ayoub family home. This strike resulted in six martyrs (Media 8).
Bombing and helicopter strikes in the middle and east of Rafah, and a bombing in Khan Younis.
Bombing Hamouda family home in Beit Lahia, artillery shelling in several areas of the city. Opening fire on homes near Kamal Adwan Hospital.
Two martyrs as a result of the IOF bombing the Abu Armana home west of Nusseirat.
#palestine#free palestine#gaza#free gaza#current events#jerusalem#yemen#tel aviv#israel#palestine news
6 notes
Text
@transgenderer oh wait i forgot char-rnn (2015) was a thing. example:
PANDARUS: Alas, I think he shall be come approached and the day When little srain would be attain'd into being never fed, And who is but a chain and subjects of his death, I should not sleep. Second Senator: They are away this miseries, produced upon my soul, Breaking and strongly should be buried, when I perish The earth and thoughts of many states. DUKE VINCENTIO: Well, your wit is in the care of side and that. Second Lord: They would be ruled after this chamber, and my fair nues begun out of the fact, to be conveyed, Whose noble souls I'll have the heart of the wars. Clown: Come, sir, I will make did behold your worship. VIOLA: I'll drink it.
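For anyone who hasn't seen how char-rnn actually works under the hood: it's a character-level recurrent net that predicts the next character and then samples from itself one character at a time. Here's a minimal sketch in the spirit of Karpathy's min-char-rnn — the weights below are random (untrained), so it produces noise rather than pseudo-Shakespeare; training would adjust them to imitate the corpus.

```python
# Minimal char-rnn sampling sketch: one-hot the last character, update a
# hidden state, softmax over the vocabulary, sample the next character.
import numpy as np

rng = np.random.default_rng(0)
text = "PANDARUS: Alas, I think he shall be come approached"
chars = sorted(set(text))
ix = {c: i for i, c in enumerate(chars)}
V, H = len(chars), 32  # vocab size, hidden size

# Randomly initialized weights -- an untrained net. Training (not shown)
# would fit these to the corpus via backprop through time.
Wxh = rng.normal(0, 0.01, (H, V))
Whh = rng.normal(0, 0.01, (H, H))
Why = rng.normal(0, 0.01, (V, H))

def sample(seed_char, n):
    h = np.zeros(H)
    out = seed_char
    for _ in range(n):
        x = np.zeros(V)
        x[ix[out[-1]]] = 1            # one-hot encode last character
        h = np.tanh(Wxh @ x + Whh @ h)  # recurrent state update
        p = np.exp(Why @ h)
        p /= p.sum()                   # softmax over next characters
        out += chars[rng.choice(V, p=p)]
    return out

print(sample("P", 40))
```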
2 notes
Text
If anyone’s interested, I trained a neural network on chapstick flavors because I had a dream that @lewisandquark did it and I was bored one day
Some of them were actually pretty good:
Cherry kiwi
Peach belle
Cocoa & cream
Berry blossom
Melon mellow
Rose cherry
Some didn’t work as well:
Sugar snakle
Cherry coof
Stranger cake
Peanut butter beer
Coconut crapie
Some were really weird:
Coconut lick calangus
Armid pit
Butt bear bears
Sticed reaming rout
Peppermint catered picklemong
Spf 25 with greamint tit butter
Some were just inappropriate:
Pink poon
Strawberry shit hitt
Coconut cream coot
Strawberry shite (It generated this one many times)
Banana butherfucker
For some reason, much like @lewisandquark‘s neural network generated candy hearts, my neural net also seemed to generate a lot of bear-related flavors:
Strawberry bear
Peppermint toof bears
Coconut with bear
Coconut & smone bear
Butter bear
Bummy bears
Berry bear
Coconut bear (unflavored)
Spf 15 with bear
I'm not really sure why, because the word bear only appeared like once in the data set (in a flavor called “gummy bear”). I guess the moral of the story here is that neural nets just really like bears.
#rnn#char-rnn#neural network#neural networks#i meant for this to be just like a quick little post but i spent way too much time on it oops#lol no one's even going to read this#it's just me and my neural nets shouting into the void here#long post#lol i actually did this a while ago and just now remembered it
25 notes
Quote
3:6 O my God, thou art my God: for that thou liest down in the synagiurlrrrt,bts.
KJV, via neural network
1 note
Text
sex blocking candubs
DATA (works) Relake, sir.
DATA Sir... I think we try to deneat anticeritives they control to I could be to be used.
Troi is shocked as they disappear in the ship.
BEVERLY No!
LAVELLE (works) There are reperion of a most chance. Your sex blocking candubs. A calvert experience but the people look up to the technorection is close. I would be back and stop it.
DATA (to Data) It's like Data.
2 notes
Photo
I’m training a neural network on the King James Version of the Bible and it just hit me with this.
1 note
Text
[OCR residue from a “HUMANDESIGN” chart image comparing the five Human Design energy types — Generator, Manifesting Generator, Manifestor, Projector, Reflector. The scanned text of the chart is unrecoverable.]
1 note
Text
reblogging as note to myself: teaspoop
What is teaspoons?
I’m training a neural network to generate cookbook recipes. It looks at a bunch of recipes and has to figure out completely from scratch - no knowledge of what English words even are - how to start generating more recipes like them. It has to start by figuring out which words are used in recipes. Here in a very early iteration, you can see the first somewhat intelligible word beginning to condense out of the network:
4 caam pruce 6 ½ Su ; cer 1 teaspoop sabter fraze er intve 1 lep wonuu s cap ter 3 tl spome. 2 teappoon terting peves, caare teatasing sad ond le heed an ted pabe un Mlse; blacoins d cut ond ma eroy phispuz bambed 1 . teas, &
It’s trying SO hard to spell teaspoon. Teaspoop. It’s hilarious. It gets it right every once in a while, apparently by sheer luck, but mostly it’s: ceaspoong, chappoon, conspean, deespoon, seaspooned, ceaspoon, tearpoon, teasoon, tertlespoon, teatpoon, teasposaslestoy, ndospoon, tuastoon, tbappoon, tabapoon, spouns, teappome, Geaspoon, leappoon, teampoon, tubrespoon… It reeallly wants to learn to spell teaspoon. There are a lot of almost-teaspoons beginning with c… maybe it’s a mixture of teaspoon and cup. There are a few others that might be a tablespoon attempt.
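To be clear about why it struggles like this: a character-level network never sees words at all. All it gets is a vocabulary of individual characters mapped to integers, and training pairs where each character predicts the next. A minimal sketch of that representation (the recipe string here is illustrative, not from my actual training data):

```python
# What a character-level network actually "sees": no words, just
# characters mapped to integers. This is why it has to rediscover
# spellings like "teaspoon" entirely from scratch.
recipe = "1 teaspoon salt; 2 cups flour"
vocab = sorted(set(recipe))
char_to_ix = {c: i for i, c in enumerate(vocab)}
encoded = [char_to_ix[c] for c in recipe]

# Training pairs: each character predicts the one after it.
pairs = list(zip(encoded[:-1], encoded[1:]))
print(vocab)
print(pairs[:5])
```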
Up next: pupper, corm, bukter, cabbes, choped, vingr…
2K notes
Text
How do you spell Cchannuukkahh?
The first night of Hanukkah is tonight. Or is it Chanukah? Xanike? Hanwcuh? Janice?
Well, being the very serious computer scientist I am, I realized that of course the only way to answer this question was to train a neural net. After all, now that we throw them at all sorts of problems that we really shouldn’t, let’s see if they can teach me the true meaning of Chanuqa.
There are 16 “not too weird” English spellings of this holiday, because you can sort of mix and match Ch vs. H at the start, number of n’s, number of k’s, and whether there’s an h at the end. But that’s simple enough that you’re not going to get anything interesting out of that. If you throw in Spanish spellings with a J at the start, French spellings that use ou for the second vowel, and a smattering of Yiddish transliterations like Khanike, you get a more interesting training set. But the magic really happens when you find shitposts. Because the truth is there’s nothing certain Jews like more than coming up with increasingly stupid spellings for the Festival of Lights. So Zhajnuquye? Ghanikehah? Xanixax? Hchkhkannuckwauaaha? All obviously in the training data.
I landed on a 51-word training set. This is generally too small for a neural net, but I discovered the online version of char-rnn (thanks, google), lowered some (or, well, a lot) of the parameters, and set it to train. It did spit back some of the training set, along with some combinations that made for genuinely weird-but-plausible spellings. But when I really cranked up the creativity parameter, there was some holiday magic.
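(For the curious: in most char-rnn implementations, the "creativity parameter" is the sampling temperature — the logits get divided by a temperature before the softmax. Low temperature makes the net play it safe and repeat its favorite spellings; high temperature flattens the distribution and gets you into Zhajnuquqxcahhah territory. The logit values below are made up for illustration.)

```python
# Temperature sampling: divide logits by a temperature before softmax.
# T < 1 sharpens the distribution; T > 1 flattens it ("more creative").
import numpy as np

def softmax_with_temperature(logits, temperature):
    z = np.asarray(logits, dtype=float) / temperature
    z -= z.max()  # subtract max for numerical stability
    p = np.exp(z)
    return p / p.sum()

logits = [2.0, 1.0, 0.1]  # hypothetical next-character scores
cool = softmax_with_temperature(logits, 0.5)  # sharper: top char dominates
hot = softmax_with_temperature(logits, 2.0)   # flatter: rare chars likelier
print(cool.round(3), hot.round(3))
```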
So, without further ado, a neural net’s spellings for tonight’s holiday: annukah Januqua Gannuko channuuckka Hokaniqxe Ghhajnuoka Khaghajnike Zhajnuquqxcahhah Janiqqxchkhkhhahkkannuquke Hoogh
Anyways, chag sameach! Happy Hoogh!
303 notes
Text
disconcision replied to your post “She is dressed, now, in a black vest thrown over a dark gray-green...”
did you switch to the 345M model at some point? this is disturbingly coherent
Yeah -- I switched @uploadedyudkowsky over to 345M shortly after its release, which conveniently coincided with a point when I was getting tired of curating the 117M output
It’s amusing to reflect on how much @uploadedyudkowsky has “improved” since I started the blog. Originally I was just doing word-level Markov chains, which are an old favorite of mine (first learned about them through Janusnode, which I first used back when it was called “McPoet,” sometime in the . . . early 2000s?).
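(If you've never played with them: a word-level Markov chain is about the simplest text generator there is. You record which word follows which in the corpus, then take a weighted random walk. A minimal sketch, with a toy corpus standing in for the Yudkowsky text:)

```python
# Word-level Markov chain: record successors for each word, then walk
# the chain by sampling a random successor at every step.
import random
from collections import defaultdict

def build_chain(text):
    words = text.split()
    chain = defaultdict(list)
    for a, b in zip(words[:-1], words[1:]):
        chain[a].append(b)  # duplicates encode transition frequency
    return chain

def generate(chain, start, n, seed=0):
    rng = random.Random(seed)
    out = [start]
    for _ in range(n):
        nexts = chain.get(out[-1])
        if not nexts:
            break  # dead end: word only appears at the end of the corpus
        out.append(rng.choice(nexts))
    return " ".join(out)

corpus = "the cat sat on the mat and the cat ran"
chain = build_chain(corpus)
print(generate(chain, "the", 8))
```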
When I heard about char-RNN, I started using that, which allowed for a lot more “creative” variability although I still had to laboriously hunt for funny-sounding phrases buried within larger swaths of mostly dull/gibberish output.
Then when I heard about people fine-tuning GPT-2, of course I had to try that, which got vastly better results -- now the curation challenge is less “find some phrase in the output that doesn’t suck” and more “I want to quote some gigantic 15-paragraph stretch but I’m worried people will just tl;dr unless I choose a smaller excerpt.”
And now it’s the 345M GPT-2. And I’ve established this sort of personal tradition where every time I hear about a new gee-whiz NLG method with freely available code, I’ll try it on this same Yudkowsky corpus and revive @uploadedyudkowsky. I’m sure I’ll do it again with the larger GPT-2s, and then whatever comes next.
The really funny thing here is that if you look over the blog, it looks like something “gaining more intelligence” dramatically and qualitatively over the course of a few years. But it’s a thing that I, a perpetual (if moderate) AI skeptic/pessimist, am doing to poke fun at Yudkowsky, whose whole deal is worrying about rapidly improving AI. So it feels almost like he’s getting the last laugh! It’s an extra, completely unintended/emergent joke on top of the basic joke underlying the blog.
(On a related and more serious note, I do think I’ve become less “AI skeptical” in certain ways -- which are still vague in my head and need more thought -- as a result of recent successes with the “transformer” architecture.
Like, a few years ago if you had told me we’d have GPT-2-level NLG in 2019 I imagine I would not have believed you. But what’s more, the same architecture that GPT-2 uses for NLG also enables some really incredible stuff in supervised NLU, via BERT, which can get you state-of-the-art results on p much any supervised task with a few epochs using one-size-fits-all hyperparameters. I was like “sounds fake but OK” when I read the paper, and then I tried it on some proprietary tasks from my day job and it just worked. Sometime I want to make an effortpost about the transformer architecture, because there’s something magic going on there, and none of the explainer posts out there do justice to the intuitive simplicity of the thing. [Very briefly: it’s a lot like a CNN in terms of using sparse filters, but the “shape” of each sparse filter is computed dynamically from the input via a function learned from the data])
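(A very compressed sketch of what I mean by a "sparse filter computed dynamically from the input": in self-attention, each position computes a weighting over all positions via learned query/key projections, and that weighting — the "shape of the filter" — depends on the input itself rather than being fixed like a CNN kernel. Dimensions and weights below are arbitrary; this is the bare mechanism, not GPT-2 or BERT.)

```python
# Scaled dot-product self-attention: the attention weights are computed
# from the input (via Q and K), so the "filter" changes per input.
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])  # input-dependent scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax per position
    return weights @ V  # each output is a weighted mix of all positions

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))  # 5 tokens, embedding dim 8
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)
```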
41 notes
Text
First Explorations in Neural Networks - Featuring D&D Monsters
So, as a keen fan of this blog, and as someone working in the technology field (but unsure of what exactly I want my career to be), I decided it might be fun and interesting to build a neural network. It’s probably a useful, transferable skill, too.
For me, as a total beginner, the biggest stumbling blocks in getting this working were:
Figuring out what software I needed to get installed
Installing it, and getting it to work (I’ve just got a new PC with an SSD and HDD for the first time, and I’m having a few issues with things installing in unexpected places...)
I installed, uninstalled, and reinstalled a whole bunch. In the end, I’ve got Python 3 installed, along with about 5-7 other things (e.g. scipy, numpy), and Anaconda, so that I can use the conda command prompt to run my code. I also installed Notepad++ in order to write my code - none of the tutorials actually mentioned this, assuming I’d already have this basic level of knowledge! And I’m using regular Notepad to put my raw data in.
Initially, I followed a couple of tutorials, building a simple network using keras.
Once I was confident in my setup and how to input scripts, run them, and create datafiles, I found a very simple Recurrent Neural Network on GitHub which uses Python and numpy.
I got excited. I created a dataset (a list of D&D creature types), and I ran my script.
It failed.
Turned out this was written in Python 2, and I had Python 3.
Irritated at my noob mistake, but undaunted, I googled the error messages and edited the script until it worked. And the world was my oyster!
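(In case anyone else hits the same wall: the exact errors depend on the script, but old Python 2 numpy scripts like that one usually need the same handful of edits to run under Python 3. A sketch of the three that come up constantly:)

```python
# Typical Python 2 -> Python 3 fixes for an old numpy script.

# Python 2: print 'loss:', loss   -> SyntaxError in Python 3
loss = 42.0
print('loss:', loss)              # print is a function now

# Python 2: xrange(100)           -> NameError in Python 3
total = sum(i for i in range(100))  # range replaces xrange

# Python 2: 1/2 == 0 (integer division) -> 0.5 in Python 3
half = 1 // 2                     # use // to keep the old behavior

print(total, half)
```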
I cleaned up my initial dataset and ran my code for around 150,000 iterations. This generated a bunch of D&D NPC names, demon/angel/deity names, names of creatures/monsters/races, and a bunch of words/names that really should never be used in a game by any sane individual! It enjoyed generating NSFW words; you can find those below the cut.
The Good
Krion (probably a young dragon)
Braender (elf)
Aryvir (elf)
Vodyitharnu (dragon)
Vorech (creature)
Shadran (morally dubious dwarf)
Shidlinor (innkeeper)
Uzicudache (roguish swashbuckler)
Uzifasleser (Uzicudache's younger brother)
Fion (tavern wench)
Vorgi'lle (orc)
Demora (some sort of demon woman)
Gara-sirgor
Aragir (lordly type)
Shadmorex (dracolich?)
Fricondrug (Viking type)
Ellit (bookish human)
Dracolax (a dragon that makes you poop)
Baladur
Gillon (probably a human blacksmith)
Parnau (rustic village in snow-covered mountains)
Barbasquex (idk but it sounds scary)
Gorgimoca
Nalkor beast
Pevitus
Parilith (she works in a temple of healing)
Grobizellan (sea captain)
Centifolk (sentient, intelligent centipedes)
Paphaspawn
Mezuma
Giant huspelver (bigger than your regular huspelver)
Fientyan Fig (wealthy, round man with a bald head)
Cengoin (dwarf whose father was Cengor)
Alane Marak (she owns a general goods store)
Caulin
Abbagon (lesser demon)
Beleriakes (elf city)
Crast Cracteroid (you don't trust him as far as you could throw him)
Phangoar
Deve Kromi (everyone knows someone like Deve)
Paviri (a deliciously-scented woman from a far-off land)
Omondworf (another Viking)
Bitor Wokolf (such Viking, wow)
Selanande Starticoria (sophisticated half-elf woman)
Feendeller (bustling town)
Deptina (cute dwarf woman)
Nagmorithelf
Aarafolke
Fire of Fyandri (clearly a massively OP spell)
Grelia
Membrion Thue
Chymex
Threx
Xoroter
Skallira
Myrilith
Romoru
Spawn of Larntaner
Modrias Thod (he’s so dull that he makes ditchwater seem interesting)
Bent Conad (they call him bent because of his twisted leg)
Pegasta (you know her as Peggy)
Oodathi (consortium of exotic traders)
Fith Hassau
Orriel
Murkdrake
The Bad
Speder ol pigen (basically me, trying to talk about a fast pigeon, but drunk)
Nightie Seirite (has quite a lot of pizzazz to it!)
Mammath
Shocker
Al
Grey Rag
Blond
Calf
Ankel
Dinoman
Drakonic Dragon (as opposed to all the other sorts of dragon)
Hug
Winter
Goes Spere
Gray Desk (interesting that it came up with both “grey [noun]” and “gray [noun]”)
Gooda
Hammy
Cansaur
Horgie Seal
Wimin (ah yes, the more evil of the two races on this plane...)
Gnomz (gnomes are fine, but the youth gnomes of today, styling themselves as “gnomz”, are a scourge across the realms)
Half-gin
Rug
Mo
Sleet
Alan
Mommy
Boghester
Nick
Ouzu lord
Dan
Z (”My new character is only known as Z. That’s all he can remember. He remembers nothing of his past; that’s why I’ve not written a backstory.”)
Bard (the most evil obv)
Cool Byarnie (probably vapes)
Bee List (it’s like Bee Movie, but a list)
Gezuss (it tried so hard to spell “Jesus”)
NSFW
Bongher
Bongman
Peadman
Bundong
Phukman
Wagrider
Bonker
Ass
Tenis (unsure if this rhymes with “penis” or “tennis”)
Tit
Choder
As Beata
Wine Dong (my personal favourite)
Bummy
Fisst
#d&d#dnd#d&d character#npc#dnd character#neural networks#neural network#neural net#rnn#char-rnn#ai#artificial intelligence
0 notes
Text
The neural network continues to be horrible at recipe names
I haven’t been messing with my rnns as much because of school, internships, etc. taking up a lot of my time, but I did restart trying to train my recipe title one again, and it seems to have reached its, um, anal stage...
peach barf sad poop cookies very cheese finger boop nookies poop bard groiner cookie cake poop cokey cake and man-merman pee cake frownies bartic bakened pinge peel cookies ram poop picnic caramel boopies poom chewies unicorn spoot pomp poop butt dick sand e-rics
(No, seriously. It generated all of those. I did not make any of those up.)
Bonus: "Name's Bond, James Bond. And you are?"
bakes, cake bakes
11 notes
Photo
This is my attempt to train char-rnn, a neural network thing probably most famous for the posts of @lewisandquark, to generate emoji. The training data was the set of SVG files from here: https://github.com/twitter/twemoji/tree/gh-pages/svg The results are mostly the equivalent of SVG noise (e.g., top two pictures) with a few individual emoji-like images that I like among them. I hope you enjoy these machine-generated glitch-moods.
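(Since SVG files are just XML text, preparing training data for a character-level model can be as simple as concatenating them with a separator the network can learn as an emoji boundary. A hypothetical sketch — the files and directory here are stand-ins, not the actual twemoji set:)

```python
# Flatten a folder of SVG files into one training text for char-rnn.
# A toy directory with two identical SVGs stands in for the real data.
import pathlib
import tempfile

svg = '<svg viewBox="0 0 36 36"><circle cx="18" cy="18" r="16"/></svg>'
d = pathlib.Path(tempfile.mkdtemp())
for name in ("a.svg", "b.svg"):
    (d / name).write_text(svg)

# Blank lines between files give the net a learnable emoji boundary.
corpus = "\n\n".join(p.read_text() for p in sorted(d.glob("*.svg")))
print(len(corpus))
```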
168 notes
Quote
But one voice for the fishes dreamed, and he sent it to his mother, that Heber the Jezreelites there was in Jericho; and he drave out his wife David in haste.
KJV, via neural network
1 note
Text
WTF TL;DR
Mostly inspired by @lewisandquark, we are running a recurrent neural net (currently using char-rnn) with the noble ambition of creating a new season of Star Trek: The Next Generation. So we’ll see how that goes.
We’ll be posting snippets from each training run, as well as asides on this strange computer brain toy that suddenly everyone can play with before it achieves sentience and kills us.
I’m totally new to playing with neural networks, so this is pretty much a learning experiment. Bonus points if this ends in a movie deal, though.
2 notes