Neural Networks: new text from existing prose, part 1

“Magis Humanis Quam Humanis” (“More human than human”)

Such was the motto of the Tyrell Corporation in the cult classic Blade Runner, directed by Ridley Scott back in 1982. In the snip from YouTube below (apologies for the quality), we hear the CEO of Tyrell Corp laying down the cold reality to Blade Runner and main character Rick Deckard, played by Harrison Ford. Blade Runners are a special police unit whose job it is to hunt down and “retire” Replicants who, despite their built-in four-year life span, have escaped their duties and gone rogue. There are several revisions of Replicants, referred to by their Nexus designation. Rachael, mentioned in the clip below and with whom Deckard develops a connection, was the first of the newest, experimental line – the first to be implanted with false memories and the ability to develop real emotions. This was done to make them more “stable”, thus increasing their usefulness and their predetermined life-span; perhaps they could even procreate. How’s that for an ethical and existential head-spin! And if that weren’t enough, debate still comes and goes as to whether Deckard himself was a Replicant.

Well, while not quite in the same league as Tyrell Corp, I thought I’d have a crack at an AI rig which has been doing the rounds lately, producing results ranging from the amusing, to the nonsensical, to the downright eerie. The module I’m referring to is “textgenrnn” by Max Woolf. Textgenrnn uses Google’s TensorFlow and the open-source Keras deep-learning frameworks, both brought together and controlled by a Python library, to create a recurrent neural network (trained via back-propagation) that learns the patterns of a text corpus from scratch and produces new, novel lines of text.

One of the great things about textgenrnn is that it “keeps things simple”: it makes sensible assumptions about most hyper-parameters and just lets you choose a few high-level directives, such as the number of epochs (full training passes over the corpus) and the temperature (how conservatively or creatively the network samples its output – low values play it safe, higher values get weird).
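To give a feel for how little ceremony is involved, here’s a minimal sketch of that workflow. The corpus filename neuromancer.txt is just my stand-in for whatever text file you feed it:

```python
from textgenrnn import textgenrnn

# Instantiate the model (it ships with pre-trained weights by default)
textgen = textgenrnn()

# new_model=True trains a fresh network on the corpus from scratch,
# rather than fine-tuning the bundled weights; num_epochs is the
# number of full passes over the text
textgen.train_from_file('neuromancer.txt', new_model=True, num_epochs=10)

# Temperature below 1.0 plays it safe; above 1.0 gets adventurous
textgen.generate(5, temperature=0.8)
```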

So, inspired both by Deckard and by one of my favourite sci-fi books, “Neuromancer” – the full text of which is available for free online with the permission of the author – I thought I’d try to create a network to write more like William Gibson than William Gibson, or, Magis Gibson Quam Gibson 🙂

I thought Neuromancer would be an interesting choice: not only is the full text available online, but grammatically (and thought-patternly, if that’s a word) it’s the most difficult book I’ve ever read. It’s written in a “stream of consciousness” style – a fancy way of saying thoughts aren’t always connected and you can go for pages and pages without so much as a punctuation mark. I wondered about this on-and-off for some time and finally realised it was intentional: this is a dystopian, man-machine-melded future where thoughts and actions are decoupled and happen quickly and concurrently – unknowable unless you know the algorithm, and confusing if you weren’t the one who pre-empted them. Plus, it starts with that line: “The sky above the port was the color of television, tuned to a dead channel.”

Out of interest, here’s the word-cloud for Neuromancer:

Anyway, I tried it first on my home MacBook, then ramped things up: I hired a fairly big NVIDIA CUDA GPU-enabled AWS instance and smashed through about 500 epochs before pulling the plug and seeing what it had to say. The top ten results are below. Some of them won’t make sense if you aren’t familiar with the book, but you can almost sense the way it’s tried to approximate English, within the strange context of Gibson’s prose.
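(For the curious, pulling samples back out of the trained network looks roughly like this – the three file names are what textgenrnn saves by default when training a new model, so treat them as assumptions:)

```python
from textgenrnn import textgenrnn

# Reload the model trained from scratch; a from-scratch model saves
# its weights, vocabulary and config under these default names
textgen = textgenrnn(weights_path='textgenrnn_weights.hdf5',
                     vocab_path='textgenrnn_vocab.json',
                     config_path='textgenrnn_config.json')

# Compare conservative vs. chaotic sampling side by side
for temp in (0.5, 0.8, 1.2):
    print('--- temperature {} ---'.format(temp))
    textgen.generate(3, temperature=temp)
```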

Eat our Hosaka was watching a sh*t. You ever say you were a scard, and the boy’s right eye was talling snore and fine the border.

Afteries. Maybe you got the matrix. The back hung here. Molly was her hand.

"What is this some way." The small human story. "I have you."

What do you get my goddam control ?

Step on the console of consciousness. Drink, somehow, a dozen body.

"Christ," Case said. "I hear your own this *ss horror," Case said, "I don’t have told him all". "Come on," she said, "a toxin construct."

I don’t have a neural design left.

"I don’t like this, that’s a week?" 

"No," the Flatline said, "She was quite conscious, Jane’s crumpled system".

What do you get my goddam control ?

So there you have it – an empty, naive neural network trying to make sense of, and find patterns in, what is linguistic chaos. I’m surprised it came up with anything at all!

I guess the next step is to try it on some “proper legit” literature: see how it goes on some other authors, then maybe throw a whole bunch together and see what kind of style it verges toward, or, who knows, whether it develops one unique to itself?

Stay tuned 🙂

Next: See Shakespeare and Hunter S. Thompson duke it out in part 2.
