Sunspring by 32 Tesla K80 GPUs
Short film experiment by EndCue and Ross Goodwin uses a script generated with an LSTM neural network trained on sci-fi movie scripts:
To call the film above surreal would be a dramatic understatement. Watching it for the first time, I almost couldn’t believe what I was seeing—actors taking something without any objective meaning, and breathing semantic life into it with their emotion, inflection, and movement.
… As Modern English speakers, when we watch Shakespeare, we rely on actors to imbue the dialogue with meaning. And that’s exactly what happened in Sunspring, because the script itself has no objective meaning.
On watching the film, many of my friends did not realize that the action descriptions, as well as the dialogue, were computer-generated. After examining the computer's output, the production team made an effort to choose only action descriptions that could realistically be filmed, although the sequences themselves remained bizarre and surreal. The actors' and production team's interpretations and realizations of the computer's descriptions were a fascinating case of human-machine collaboration.
You can read more background on the project here.
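For the curious, here is a rough sketch of what a script generator along these lines can look like in code. It is a generic character-level LSTM in PyTorch, not the actual model Goodwin built: the corpus filename, network sizes, training schedule, and sampling prompt below are all illustrative assumptions.

```python
# Minimal sketch: train an LSTM on a text corpus of screenplays, then sample
# new "script" text from it one character at a time. Hyperparameters and the
# corpus file "scifi_scripts.txt" are hypothetical placeholders.
import torch
import torch.nn as nn

text = open("scifi_scripts.txt", encoding="utf-8").read()  # hypothetical corpus
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}
itos = {i: c for c, i in stoi.items()}
data = torch.tensor([stoi[c] for c in text], dtype=torch.long)

class CharLSTM(nn.Module):
    def __init__(self, vocab, embed=64, hidden=256, layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab, embed)
        self.lstm = nn.LSTM(embed, hidden, layers, batch_first=True)
        self.head = nn.Linear(hidden, vocab)

    def forward(self, x, state=None):
        out, state = self.lstm(self.embed(x), state)
        return self.head(out), state

model = CharLSTM(len(chars))
opt = torch.optim.Adam(model.parameters(), lr=3e-3)
loss_fn = nn.CrossEntropyLoss()
seq_len, batch = 128, 32

# Training loop: the model learns to predict each next character from the
# characters that came before it.
for step in range(2000):
    starts = torch.randint(0, len(data) - seq_len - 1, (batch,))
    x = torch.stack([data[s:s + seq_len] for s in starts])
    y = torch.stack([data[s + 1:s + seq_len + 1] for s in starts])
    logits, _ = model(x)
    loss = loss_fn(logits.reshape(-1, len(chars)), y.reshape(-1))
    opt.zero_grad(); loss.backward(); opt.step()

def sample(prompt="INT. SHIP - DAY\n", length=500, temperature=0.8):
    # Feed the prompt through the network, then repeatedly sample the next
    # character from the predicted distribution and feed it back in.
    model.eval()
    x = torch.tensor([[stoi.get(c, 0) for c in prompt]])
    out, state = [], None
    with torch.no_grad():
        logits, state = model(x, state)
        for _ in range(length):
            probs = torch.softmax(logits[0, -1] / temperature, dim=-1)
            nxt = torch.multinomial(probs, 1)
            out.append(itos[nxt.item()])
            logits, state = model(nxt.view(1, 1), state)
    return prompt + "".join(out)

print(sample())
```

The raw samples from a model like this read much like the Sunspring script: locally plausible screenplay formatting and dialogue rhythms, with no objective meaning holding the whole together.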