Author: Sam Hill
Translators: Tian Ao, Song Qingbo, Aileen, Long Muxue
Reply “novel” to this public account to download the complete new novel generated by the neural network~
Winter is coming…
Season 7 of “Game of Thrones” has ended, and the final six episodes of the series reportedly will not air until spring 2019.
However, the biggest winner of the show so far is probably the character Jon Snow, who not only came back to life in the show but also got engaged to Rose Leslie, the actress who played Ygritte ❤️
Enough of the celebrity romance; let’s talk about the original novel series, “A Song of Ice and Fire”.
The fifth book in the series, “A Dance with Dragons”, was published in 2011. Author George R.R. Martin has been working on the sixth book, “The Winds of Winter”, ever since, but its release date remains uncertain, and there is nothing fans can do to hurry him along.
Starved of new material, the producers of the TV series have had to invent their own plot since season 6 to satisfy impatient audiences.
Tired of waiting, full-stack software engineer Zack Thoutt trained a Recurrent Neural Network (RNN) to predict the plot of the unfinished sixth novel. He had the network learn from the text of the first five books of “A Song of Ice and Fire”, roughly 5,000 pages, and then continue the story.
Just give the neural network a character’s name as a seed word, and it can create an entire chapter!
“I am a die-hard fan of both the original books and the TV series of ‘Game of Thrones’,” Thoutt said.
A neural network is a type of machine learning algorithm loosely modeled on the human brain; the Recurrent Neural Network is a subclass that handles sequential data, such as text, especially well.
“Using a vanilla neural network, you can take an input dataset, pass it through the neural network, and get an output dataset.”
“To train these models, you need to know what the model’s outputs should be; these are usually referred to as labels or target variables. The neural network updates itself by comparing its outputs with the targets, so what it learns gradually aligns with the expected goal.”
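The input/label setup described in the quote above can be made concrete with a small sketch. This is not Thoutt's actual code; the `make_training_pairs` helper and the sample sentence are invented for illustration. It shows how next-word prediction turns raw text into (input, label) pairs: a window of words is the input, and the word that follows is the target variable.

```python
# Minimal sketch (not Thoutt's actual pipeline): building input/label
# pairs for next-word prediction, the supervised setup described above.

def make_training_pairs(text, window=4):
    """Slide a fixed-size window over the tokens; the words inside the
    window are the input, and the word that follows is the label."""
    tokens = text.lower().split()
    pairs = []
    for i in range(len(tokens) - window):
        inputs = tokens[i:i + window]   # what the network sees
        label = tokens[i + window]      # what it should predict
        pairs.append((inputs, label))
    return pairs

sample = "the king in the north the king in the north"
pairs = make_training_pairs(sample, window=3)
print(pairs[0])  # → (['the', 'king', 'in'], 'the')
```

During training, the network's prediction for each input window is compared against the label, and the error drives the weight updates.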
Thoutt is using a Long Short-Term Memory (LSTM) RNN, an architecture with a longer effective memory, which is key to training a network that can recall plot points from thousands of words back.
In theory, this memory should keep the neural network from repeating events that have already occurred, so the generated book can have a coherent plot rather than being a rehash of the already published volumes.
In this sense, the neural network is trying to write a true sequel, although it is clear that it will occasionally make mistakes.
For example, in some cases, it still writes about characters who have already died.
“It is trying to write a new book; a perfect model would take everything that happens in the book into account, rather than treating characters who died two books ago as still alive,”
Thoutt said,
“The reality is that this model is not there yet. If it were that good, authors would be in trouble. The model is trying hard to write a new book, considering everything, but it makes many mistakes because there is currently no technology available to train a text generator that can remember complex plots from millions of words.”
After feeding the 5,376 pages of the first five books into the neural network, Thoutt has already “created” five predicted chapters and published the project on GitHub.
“When I start creating a chapter, I give the neural network a seed word, a character’s name, and tell it how many words to generate after that,” Thoutt explained.
“That way, each generated chapter revolves around one character, just like the original. Beyond supplying that first seed word, no other intervention is needed.”
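The mechanism described above (supply one starting word, ask for N more) can be illustrated with a toy stand-in for the LSTM. This is purely a sketch: the real project uses a trained neural network, while here a simple bigram table just picks the most frequent next word, and the corpus text is made up.

```python
# Toy illustration of "prime word in, N words out". A bigram frequency
# table stands in for the LSTM; this is NOT how the real model works
# internally, only how the generation loop is driven.
from collections import Counter, defaultdict

def train_bigrams(text):
    """Count, for every word, which words follow it and how often."""
    tokens = text.lower().split()
    follows = defaultdict(Counter)
    for a, b in zip(tokens, tokens[1:]):
        follows[a][b] += 1
    return follows

def generate(follows, seed_word, n_words):
    """Start from a seed word (a character's name) and append the
    most likely next word n_words times."""
    out = [seed_word]
    for _ in range(n_words):
        nxt = follows.get(out[-1])
        if not nxt:
            break  # dead end: no word has ever followed this one
        out.append(nxt.most_common(1)[0][0])
    return " ".join(out)

corpus = "tyrion drank wine and tyrion drank wine and tyrion slept"
model = train_bigrams(corpus)
print(generate(model, "tyrion", 4))  # → tyrion drank wine and tyrion
```

The LSTM replaces the frequency lookup with a learned probability distribution over the whole vocabulary, conditioned on everything generated so far, but the outer loop is the same idea: seed, predict, append, repeat.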
The text generated by Thoutt’s trained network is surprisingly readable. Friendly reminder: what follows includes AI predictions and fan theories, so spoiler alert!!
For example, the neural network predicts that Sansa Stark is actually part of House Baratheon. The AI wrote:
“I fear Sansa Stark,” Ser Jamie reminded her. “She is a member of House Baratheon, your onion queen’s sons.”
“This was the first sentence it created, and I found it quite interesting,” Thoutt said. Sansa could actually be King Robert’s illegitimate daughter, raised by the Starks with her identity hidden, much like Jon Snow.
In the original books, the ‘Sons of the Harpy’ are actually the insurgents who oppose the Mother of Dragons in Meereen. As for what an “onion queen” might be, we will have to wait for the AI to write more chapters to find out.
This neural network also created a new character named Greenbeard:
“Aye, Pate.” the tall man raised a sword and beckoned him back and pushed the big steel throne to where the girl came forward. Greenbeard was waiting toward the gates, big blind bearded pimple with his fallen body scraped his finger from a ring of white apple. It was half-buried mad on honey of a dried brain, of two rangers, a heavy frey.
“This is obviously not perfect. The story is short and full of grammatical errors, but the network shows a basic command of English and mimics Martin’s writing style,” Thoutt said.
Not all predictions are completely off base.
According to the neural network’s predictions, Jaime will end up killing his lover and sister Cersei, Jon Snow will ride a dragon, and Varys will poison the Mother of Dragons.
All of these are theories that fans of the series have floated themselves:
Jaime killed Cersei and was cold and full of words, and Jon thought he was the wolf now, and white harbor……
“I believe this verifies that in ‘Game of Thrones’, everyone dies eventually. I didn’t give the neural network any of the fan speculation from the web, only these books,” Thoutt said.
The novels contain about 32,000 unique words, which makes training the neural network more challenging.
“Martin’s writing is very expressive, and the extra adverbs, titles, and fictional places make it harder to train the neural network.”
Moreover, the text of the five novels is actually a fairly small dataset. An ideal corpus for training a Recurrent Neural Network would be about 100 times larger, with only the vocabulary of a children’s book.
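The difficulty the article points to is vocabulary size: the unique-word count, not the page count. A quick sketch of how one might measure it (the sample sentence is invented, not from the books):

```python
# Sketch of measuring vocabulary size: the number of unique words,
# which is what makes a corpus hard for an RNN to model well.
import re

def vocabulary_size(text):
    # Lowercase and strip punctuation so "Winterfell," and "winterfell"
    # count as the same word.
    words = re.findall(r"[a-z']+", text.lower())
    return len(set(words))

print(vocabulary_size("The wind howled. The wind, cold and sharp, howled on."))  # → 7
```

Run over all five books, a count like this is where a figure such as 32,000 unique words would come from; every extra rare word is one more output class the network must learn from only a handful of examples.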
Thoutt has considered adding extra text to the dataset, such as the TV scripts, but that might dilute Martin’s distinctive style in the generated text.
Who knows if the mysterious Greenbeard will descend from the heavens at the beginning of season eight and seize the Iron Throne? ̄▽ ̄
Original link: https://motherboard.vice.com/en_us/article/evvq3n/game-of-thrones-winds-of-winter-neural-network
GitHub link, you can generate your own new chapters of Ice and Fire: https://github.com/zackthoutt/got-book-6