The strange world of computer-generated novels
Welcome to National Novel Generation Month, where the algorithm is the author
Josh Dzieza, The Verge, November 25, 2014
It’s November and aspiring writers are plugging away at their novels for National Novel Writing Month, or NaNoWriMo, an annual event that encourages people to churn out a 50,000-word book on deadline. But a hundred or so people are taking a very different approach to the challenge, writing computer programs that will write their texts for them. It’s called NaNoGenMo, for National Novel Generation Month, and the results are a strange, often funny look at what automatic text generation can do.
The developer and artist Darius Kazemi started NaNoGenMo last year, when he tweeted out an off-the-cuff idea.
“I got a ton of people responding saying ‘Oh my god, I’d totally do that,’” Kazemi says. The next day, he opened up a repository on Github where people could post their projects.
Nick Montfort’s World Clock was the breakout hit of last year. A poet and professor of digital media at MIT, Montfort used 165 lines of Python code to arrange a new sequence of characters, locations, and actions for each minute in a day. He gave readings, and the book was later printed by the Harvard Book Store’s press. Still, Kazemi says reading an entire generated novel is more a feat of endurance than a testament to the quality of the story, which tends to be choppy, flat, or incoherent by the standards of human writing.
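The minute-by-minute structure the article describes can be sketched in a few lines of Python. This is not Montfort's actual code, and the character, location, and action pools below are made-up stand-ins for his vocabulary; it only illustrates the idea of generating one passage per minute of the day.

```python
import random

# Toy pools standing in for World Clock's vocabulary (hypothetical examples,
# not Montfort's actual data).
characters = ["a watchmaker", "a night nurse", "a cartographer", "a stowaway"]
locations = ["Reykjavik", "Lagos", "Osaka", "Montevideo"]
actions = ["checks a pocket watch", "reads an old letter",
           "stares out the window", "writes a single sentence"]

random.seed(0)  # reproducible output

def passage(hour, minute):
    """One short passage for a given minute of the day."""
    return "It is {:02d}:{:02d} in {}. {} {}.".format(
        hour, minute, random.choice(locations),
        random.choice(characters).capitalize(), random.choice(actions))

# One passage per minute yields 1,440 passages for a full day.
book = [passage(h, m) for h in range(24) for m in range(60)]
print(len(book))
print(book[0])
```

Even this toy version shows why the result reads as a book-length text: the clock supplies the structure, and the prose only has to fill each slot.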
“Even Nick expects you to maybe read a chapter of it or flip to a random page,” Kazemi says.
Narrative is one of the great challenges of artificial intelligence. Companies and researchers are working to create programs that can generate intelligible narratives, but most of them are restricted to short snippets of text. The company Narrative Science, for example, makes programs that take data from sporting events or financial reports, highlight the most significant information, and arrange it using templates pre-written by humans. It’s not the loveliest prose, but it’s fairly accurate and very fast.
NaNoGenMo, Kazemi says, “is more about doing something that is entertaining to yourself and possibly to other people.”
For last year’s NaNoGenMo Kazemi generated “Teens Wander Around a House.” He made a bunch of artificial intelligence agents and had them meander through a house at random, his program narrating their actions. When two characters ended up in a room together, he pulled dialogue from Twitter. One tweet could be a question — “What’s for dinner tomorrow?” — and the next, a statement that also contained the word “dinner” — “Dinner is my favorite meal of the day,” for example. “The result was a conversation that sort of stayed on topic but didn’t make much sense,” he says.
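The dialogue-chaining trick — each line sharing a word with the one before it — can be sketched as below. Kazemi pulled lines from live Twitter searches; here a small canned list (including the article's "dinner" example) stands in, and the stopword list is an assumption.

```python
import re

# A canned list stands in for live Twitter search results.
tweets = [
    "What's for dinner tomorrow?",
    "Dinner is my favorite meal of the day",
    "My day was exhausting honestly",
    "Honestly I just want to sleep",
]

# Hypothetical stopword list so lines chain on content words, not "the"/"of".
STOPWORDS = {"the", "a", "is", "my", "of", "for", "i", "was", "to"}

def words(text):
    """Content words of a line, lowercased, stopwords removed."""
    return set(re.findall(r"[a-z']+", text.lower())) - STOPWORDS

def chain(lines, length=3):
    """Start from the first line, then keep picking an unused line that
    shares at least one content word with the previous one."""
    convo = [lines[0]]
    for _ in range(length - 1):
        prev = words(convo[-1])
        candidates = [l for l in lines if l not in convo and prev & words(l)]
        if not candidates:
            break
        convo.append(candidates[0])
    return convo

for line in chain(tweets):
    print(line)
```

The output drifts exactly as Kazemi describes: "dinner" links the first two lines, "day" links the next, and the topic slowly wanders.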
This year he’s designing a program that interprets an online step-by-step guide to novel writing extremely literally. “It starts with ‘establish a day-to-day routine’ then ‘show the characters’ wants and dreams’ then ‘give them a call to action,’ all that stuff,” Kazemi says. “It reads like crap but it actually does have a forward sense of narrative.”
Another participant, Michelle Fullwood, made Twide and Twejudice: Pride and Prejudice, but with each word of dialogue replaced by a word used in a similar context on Twitter. The result is delightfully absurd, a normal-seeming Austen novel whose characters break out in almost-intelligible gobbledegook. For instance, here is Mr. Bennet telling Mrs. Bennet that plenty more wealthy young men will move to town for their daughters to marry.
“But I hope you willl get ovaaa it, whereby live to see manyy young snowmobilers ofthe four karat a yearrr comeeee into tje neighbourhood.”
And in an earlier version:
“But I hopee yiou willllll gget ovaaa itttttttttt , aand livee to seee meny peppy cyborgs ofv umpteen luft awhole mnth coem intoo tthe neighbourhood.”
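The substitution step can be sketched as a lookup into a table of context-similar words. Fullwood derived her substitutions from actual Twitter usage data; the hand-made table below is purely a stand-in for that model, and the words in it are invented for illustration.

```python
import random

# Stand-in table of "words seen in similar Twitter contexts" (hypothetical;
# Fullwood's project learned these from real Twitter data).
similar = {
    "hope": ["hopee", "hopeee"],
    "live": ["livee", "lyve"],
    "young": ["peppy", "youngg"],
    "men": ["cyborgs", "snowmobilers"],
}

random.seed(1)

def twitterize(sentence):
    """Swap each word for a context-similar Twitter word, if one is known."""
    return " ".join(random.choice(similar.get(w, [w]))
                    for w in sentence.split())

print(twitterize("hope to live and see young men"))
```

Words with no entry pass through unchanged, which is why the generated dialogue keeps Austen's sentence shapes while the vocabulary mutates.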
Liza Daly made her own version of the Voynich Manuscript, a 15th century codex written in an unknown script and illustrated with elaborate and perplexing diagrams. Daly wrote a program that took words from the codex, randomized them, and placed them on a page along with old alchemical and botanical images from the Internet Archive. The result is quite beautiful, and no more or less bewildering than the source codex.
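The text half of Daly's pipeline — randomize the codex's vocabulary and flow it onto pages — can be sketched as follows. The "words" below are a few tokens in the style of Voynich transliterations, standing in for the full transcription, and the image-placement step is omitted entirely.

```python
import random
import textwrap

# A few transliterated Voynich-style "words"; a full run would load the whole
# transcription plus Internet Archive illustrations.
source_words = ("qokeedy shedy chol daiin okaiin chedy qokain "
                "otedy shol cthy qokal dar aiin").split()

random.seed(42)

def make_page(words_per_page=9, line_width=30):
    """Randomly sample the source vocabulary and wrap it into one 'page'."""
    page_words = random.choices(source_words, k=words_per_page)
    return textwrap.fill(" ".join(page_words), width=line_width)

print(make_page())
```

Since the source text is already undeciphered, shuffling it loses nothing a reader could have recovered — which is the joke of the project.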
Then there’s Greg Borenstein’s Generated Detective, a noir comic. Borenstein’s program searches old detective novels on Project Gutenberg for sentences that include a series of words.
He then searches Flickr for each sentence plucked by the program, runs the resulting image through a manga app, and ends up with an eerily inscrutable noir story. Borenstein does the Flickr search himself, but he’s working on automating the whole process, as well as incorporating image recognition so that the program can add dialogue bubbles.
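The Gutenberg-mining step — find every sentence containing a chosen keyword — is straightforward to sketch. The snippet below stands in for a full detective novel, and the keywords are hypothetical; the Flickr and manga-filter stages aren't shown.

```python
import re

# A short snippet stands in for a full Project Gutenberg detective novel.
text = ("The detective lit a cigarette. Rain hammered the window. "
        "He had seen the photograph before. The photograph was a fake. "
        "Somebody was lying, and he knew it.")

def sentences_with(word, corpus):
    """Return every sentence in the corpus containing the target word."""
    sents = re.split(r"(?<=[.!?])\s+", corpus)
    return [s for s in sents if word.lower() in s.lower()]

# One keyword per comic panel, e.g. the noir staples below.
for keyword in ["photograph", "lying"]:
    for sent in sentences_with(keyword, text):
        print(keyword, "->", sent)
```

Each matched sentence becomes a panel caption, and the disjointedness Kazemi describes comes from the sentences being ripped out of unrelated novels.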
“The comics that come out give me chills sometimes,” Kazemi says. “It’s a very disjointed, dream-like narrative, like most NaNoGenMo narratives.”
Ultimately, Kazemi says, the point is to have fun, to flex your coding muscles a bit, and maybe leave you thinking about text a little differently. He points to the strange cadence of Definition Book, a program that parenthetically defines a word from an initial eight-word sentence, then defines a word from that definition, and so on, recursively, for 50,000 words. The first half of the book is all the beginnings of sentences and the second half is all the ends. “I’ve never thought about a text that way,” Kazemi says. “It sort of turned on a lightbulb in my head.”
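The recursive structure can be sketched with a tiny hand-made glossary standing in for a real dictionary (the entries below are invented). Because each definition nests parenthetically inside the previous one, every sentence-beginning piles up before the midpoint and every sentence-ending after it, which is what produces the split Kazemi describes.

```python
# A tiny hand-made glossary stands in for a real dictionary (hypothetical).
glossary = {
    "cat":    "a small domesticated feline",
    "small":  "little in size",
    "little": "not large in extent",
    "large":  "of more than usual size",
}

def expand(sentence, depth):
    """Insert a parenthetical definition after the first definable word,
    then recurse into that definition, nesting parentheses in the middle."""
    if depth == 0:
        return sentence
    for word in sentence.split():
        w = word.strip("().,").lower()
        if w in glossary:
            definition = expand(glossary[w], depth - 1)
            return sentence.replace(word, word + " (" + definition + ")", 1)
    return sentence

print(expand("the cat sat on the mat", 3))
```

Run to depth 50,000 words instead of depth 3, and the opening fragments fill the first half of the book while the closing parentheses and sentence endings fill the second.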
UPDATE: Borenstein has automated Generated Detective’s sentence and image selection, and has modified the script to pull sentences from different genres, including sci-fi, romance, and horror.