Dec 10, 2020
Picture it: Ian McKellen in a
wizard hat surrounded by a green screen. Is that what
artificial intelligence in the entertainment industry looks
like? It certainly is a step toward incorporating the powerful
modeling tools computers can use to generate scenery like the Mines
of Moria. But how much further might AI move beyond today's
standard movie-making VFX? Alexis Kirke and Richard take
listeners on a tour of the possibilities.
Listen and learn
Alexis Kirke is a senior
research fellow in the School of Humanities and Performing Arts,
within the Faculty of Arts, at the University of Plymouth, England.
He's a screenwriter and quantum/AI programmer and shares intriguing
AI ideas with listeners.
Examples of AI most familiar to listeners include the blending of artificial and natural scenes in Marvel franchise movies, or recreations of film stars no longer with us, like Princess Leia in a recent Star Wars film. Alexis Kirke sees the blending of artificial and human intelligence as much more than automation in the film industry. Rather, he sees creative potential in how humans and machines collaborate, making each other more powerful and inspiring.
In addition to contemplating
possibilities, he helps listeners understand the practical uses.
For example, adapting a TV program for someone with hearing
difficulties in a meaningful way is more possible than ever.
Creators have broken down all the sound elements in a piece and
rated each one by narrative importance, so a viewer with an
intelligent volume control could turn down sounds in inverse
relation to their importance, emphasizing only the most essential
ones.
The job of the production team is only to break these elements down; the individualization is up to the viewer's device. But of course, it goes far beyond this, and he and Richard talk about the extent to which a story itself might be alterable, along with scenery and actors. Listen in for an entertaining discussion about the movies.
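
The intelligent volume control described above can be sketched in a few lines of code. This is a minimal illustration only: the element names, the 0-to-1 importance scale, and the `reduction` parameter are all hypothetical assumptions, not any real broadcast standard or the system discussed in the episode.

```python
def mix_gains(elements, reduction):
    """Return a per-element volume gain for an accessible mix.

    elements:  dict mapping a sound element's name to its
               production-assigned narrative importance in [0, 1]
    reduction: how aggressively to suppress unimportant sounds
               (0 = no change, 1 = fully mute a zero-importance element)
    """
    gains = {}
    for name, importance in elements.items():
        # Attenuate in inverse relation to importance: the less a sound
        # matters to the story, the more it is turned down.
        gains[name] = 1.0 - reduction * (1.0 - importance)
    return gains


if __name__ == "__main__":
    # Hypothetical scene: dialogue is essential, background traffic is not.
    scene = {"dialogue": 1.0, "doorbell": 0.8, "street_noise": 0.2}
    print(mix_gains(scene, reduction=0.75))
```

The key design point matches the episode's description: the production team only rates the elements once, and each viewer's device applies its own `reduction` setting, so the same mix can be individualized without remastering.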