By IAN FAILES
Ollie Rankin is a veteran of the visual effects industry, with experience at studios including Method Studios, DNEG and MPC. In more recent years, he’s utilized his VFX and storytelling skills in producing original immersive entertainment content.
One interactive VR film Rankin directed, called Downloaded, was part of this year’s Venice Biennale. The VR experience, made through Rankin’s Pansensory Interactive and executive produced by Nancy Mott Basi from the Vancouver Economic Commission, has the viewer trapped inside a computer, thanks to a contraption that can digitize minds and simulate consciousness. Downloaded’s interactive nature means there are 40 different permutations to the story.
As with a number of visual effects professionals who segue into VR or AR, Rankin capitalized on his skills to incorporate what he calls ‘old school’ VFX tricks into the film, which combines live-action and real-time rendered CG. The director also wanted to explore the idea that different viewers can have different experiences in virtual reality. VFX Voice asked him about Downloaded’s VR filmmaking process.
VFX Voice: What were you trying to accomplish with Downloaded?
Ollie Rankin: The idea for Downloaded had been floating around in my head for a long time, because it’s a pretty inevitable science fiction trope of being able to digitize the human consciousness and transfer it to a computer. VR is the perfect medium to tell that story, to give somebody that first-person experience of being digitized, and being inside of a computer.
I have always been fascinated by the notion that VR is an isolating, anti-social medium. I’ve always thought that it has the potential to do the opposite, if it’s exercised correctly. Part of what I was attempting to achieve with Downloaded is that it should actually lead people to have social interactions with other people who have gone through that experience, because each of them has had a different permutation of the story, and it’s only by comparing notes with others who have been through the experience on a different trajectory that you can start to piece together the fuller story.
I thought it would be quite ironic to use VR as a medium for encouraging and facilitating these real-world conversations. I wanted a story that also carried that message about how technology can both bring us together and isolate us. So the story of Downloaded is essentially a story of how you’ve used technology to cut yourself off from a real-world relationship, but the resolution is to re-establish that relationship in the real world, to bring yourself back out into reality by creating a real-world connection with the live-action character. If that can, in turn, result in these people having these real-world conversations afterwards, then it’s achieved its ‘meta’ purpose as well.
VFX Voice: Can you describe what happens in the experience?
Rankin: The viewer is trapped inside the computer and there’s this conceit – they’re looking out through the screen. So rather than looking out through the webcam or anything else that you might imagine, we had the viewer looking out through a screen. It gives them a reason to focus in a particular direction for large portions of the experience.
It also frames their view of the live-action portion of the story. It means that we can have content on the screen, which serves a double purpose: because the user sees that content in reverse on the screen, it makes clear to them that they are inside the computer, looking out into the real world. And that content is also something they can interact with by solving a series of puzzles.
The rest of the CG world, the ‘cyberworld,’ as we call it, is an abstract representation of the internal workings of a computer. So it has abstract data structures and bigger processing mechanisms that are visualized as a digital city architecture.
VFX Voice: What approach did you take in filming the experience?
Rankin: What I wanted to create with Downloaded was this very, very photorealistic view, and so I realized the only way that we could really do that was by using some visual effects cheats. It’s quite common in visual effects to do things in two-and-a-half D, where we combine flat layers of elements in three dimensions.
So that was the approach that we decided to take to filming the live-action portion of it. We built the set on stage in a warehouse, and we used laser scanning and photogrammetry to create a high-resolution 3D representation of the set. We filmed the actress on a greenscreen. Part of the beauty of this idea of being trapped inside of a computer and looking out into the real world is that there’s a narrative justification for not allowing the viewer to get close enough to the live-action character for that illusion to break.
VFX Voice: Because you are delivering for VR, how did you bring all that footage in and output something that works in VR?
Rankin: It did start very much in a typical visual effects way. We gave the footage to an outsource company in India to do the greenscreen compositing. They used a combination of keying and rotoscoping to extract the actress from the background. Then the background was filled with solid green, and they delivered movie files back to us.
In the game engine, those movie files play back, and a plugin strips out the green background so that she’s keyed in real time. As she moves backwards and forwards, we hand-animated and keyframed the distance that the ‘card’ needs to be from the viewer in order for it to maintain the right scale and depth. So the viewer can move their head from side to side and see parallax.
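The two tricks Rankin describes – stripping the green background per pixel at playback time, and positioning a flat layer at the right distance so it holds correct scale and parallax – can be sketched roughly like this. This is a toy illustration only, not the actual game-engine plugin; the function names and the green-dominance threshold are assumptions:

```python
def chroma_key(pixel, threshold=1.3):
    """Return (r, g, b, alpha): transparent where green dominates.

    A pixel is treated as greenscreen background when its green
    channel clearly exceeds both its red and blue channels.
    """
    r, g, b = pixel
    if g > threshold * max(r, b, 1):
        return (r, g, b, 0)    # keyed out: background stripped
    return (r, g, b, 255)      # opaque: part of the foreground subject

def parallax_shift(card_distance, eye_offset):
    """Apparent sideways shift of a flat 2.5D 'card' as the viewer's
    head moves, using simple perspective: shift falls off inversely
    with the card's distance from the viewer."""
    return eye_offset / card_distance
```

Hand-keyframing the card distance, as described above, amounts to feeding `parallax_shift` a per-frame `card_distance` that tracks the actress, so the flat footage layer produces believable parallax when the viewer moves their head.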
VFX Voice: What is your plan with this experience – will it be available to download?
Rankin: What we’ve created so far is a five- to eight-minute experience, depending on how the user interacts and what narrative pathway they take. We learned an awful lot about designing interactive and immersive narratives, and learned a lot about how intuitive people find different aspects of the experience. I think of it more as a prototype than a finished product.
What I’m really looking to do is use this exposure of getting it into Venice and some other festivals – we’re also going to be screening it at the Vancouver International Film Festival – to raise money to develop a series.
It would be a five- or 10-episode series where each episode will be half an hour long, at minimum, but could take longer as people explore different parts of it, or as they replay different aspects of it. In future episodes, you’ll be able to travel to other computers, and there will be some other stories that are dealing with other aspects of contemporary technology and social issues. But you will then be able to delve into those in an interactive immersive way. You’ll be able to hack into different computer systems, you’ll be able to print yourself out into other locations.