What if, instead of just watching C-3PO, Boba Fett, or an Imperial Stormtrooper in a Star Wars movie, you could jump into the movie and be any one of them?
Narrative VR, the art of telling stories within virtual reality, is uncharted territory compared to VR gaming. Oculus Story Studio was founded to investigate the potential for animated films on the Oculus Rift, but when it comes to more traditional filmmaking in virtual reality, high-profile research and breakthroughs have been rare, and it’s been difficult to conceptualize what narrative VR might look like.
That was before Rob Bredow, head of ILMxLAB, the studio that sits at the intersection of Lucasfilm’s creative division, Industrial Light and Magic, and Skywalker Sound, delivered a talk at Oculus Connect 2 on Wednesday titled “The Force of Virtual Reality at Lucasfilm.”
Bredow demonstrated a short Star Wars film produced using all the advanced technology developed by ILMxLAB’s Advanced Development Group, the department that Bredow heads personally. And whether or not it was his intention, Bredow laid out the rudiments for a theory on how narrative VR could be structured, if you have the imagination to put the pieces together.
ILMxLAB has been operating for years, but only recently announced its existence to the world with a video introduction to the studio and its work.
“One group we don’t talk a lot about publicly is our little team called the Advanced Development Group,” Bredow said, “and this group has a very broad charter, which is to transform entertainment with real-time graphics.”
The film industry uses a time-consuming rendering process to create computer-generated images and environments. The advantage of this process is extremely high-fidelity images. The disadvantage is having to wait overnight to see what the shots look like before deciding whether you need to make any adjustments, at which point you need to compose the new footage, record it, and then wait another evening to render the new material.
“Video games, of course, have been leveraging real-time graphics for a long time,” Bredow said, “and they’ve completely transformed first-person shooters and all those kinds of things, and have been the mainstay for many years. But when you look at the filmmaking process, it’s made its way into previsualization, it’s made its way in some uses on set, but it hasn’t really transformed the filmmaking process.”
This is where the Advanced Development Group comes in. One of the techniques it has introduced into the filmmaking process, called “Realtime Look Development,” uses a shared materials system between filmmakers and virtual-reality experience creators.
Traditionally, to do something as simple as applying weathering to a speederbike, several different artists need to decide where the paint would chip, what the texture would look like, and how to apply the new textures to the model. Realtime LookDev means that an artist at Industrial Light and Magic, or ILMxLAB, can apply these textures in real time using Photoshop-like software.
Another application created by the Advanced Development Group, called V-Scout, allows filmmakers to explore a virtual set. The V-Scout demo Bredow shared with the audience at Oculus Connect 2 featured key elements from what has already become an iconic image of The Force Awakens, a wrecked Imperial Star Destroyer and a crashed Rebel Alliance X-Wing starfighter half-buried in the sand of the planet Jakku.
Using an iPad, the V-Scout user can navigate freely through this virtual environment, seeing what it looks like to stand next to the X-Wing’s cockpit, looking out at the Star Destroyer, or standing within the wreckage of the Star Destroyer, looking out at the X-Wing. The V-Scout user can adjust the camera to compose shots, and then bookmark those camera positions and associated settings to refer to later.
Bredow ended the V-Scout demo by snapping the camera to the top of the wrecked Star Destroyer’s bridge, right in between the two shield generators at the very top of the ship. And then Bredow invited a member of the audience to take the stage, slip on an Oculus Rift, and be transported to that same position on top of the Star Destroyer, looking out at the deserts of Jakku.
Bredow then showed a short Star Wars film created using a combination of real-time graphics technologies developed by ILMxLAB.
Rather than filming an actor in a motion-capture suit to generate a skeleton that is then used to guide the performance of a virtual character, ILMxLAB can shoot the actor and place their performance directly into a virtual set in real time.
A camera operator can also move around a motion-capture set, using an iPad as though it were a handheld camera, and have their camera position replicated within the virtual set.
The short film began with a squad of Imperial Stormtroopers patrolling a town on Tatooine. They informed their squad leader of reports that a pair of droids were trying to make contact with a rebel cell. The squad leader told his squad to spread out and look for the droids.
We cut to a shot of C-3PO and R2-D2 making their way out of a building in the town, C-3PO muttering about doom and gloom. In the background, we saw a starship trying to make a landing near the town, while taking fire from an Imperial AT-AT walker.
C-3PO radioed the starship that he was ready for pickup. The starship captain radioed back that he’d called in reinforcements. A pair of X-Wing fighters screamed through the sky over C-3PO and opened fire on the AT-AT.
As C-3PO and R2-D2 started walking out of the town toward the rebel starship setting down on the sand, we saw the familiar silhouette of Boba Fett, and he called out to the droids to stop. Then the scene ended. Bredow had just shown the audience a traditional, linear, narrative sequence like you might see in any Star Wars movie.
Using software like V-Scout, however, you could rewind the scene to the beginning, and then reposition the camera to see what else was taking place in the town, other than what you were shown the first time. Perhaps while the Stormtroopers were conferring with one another, instead of watching their exchange, you snapped to C-3PO and R2-D2.
Bredow rewound the short, used V-Scout to reposition the camera on the droids’ starting position within the small house, and then played the experience again. Now the audience saw the droids receiving a message from Princess Leia, telling the droids to hold tight, because pickup was on the way, before the droids made their way out of the building, and C-3PO began muttering doom and gloom.
Or, if you rewound the scene and snapped to Boba Fett’s perspective, you would see him taking on a pair of ruffians somewhere else in the town.
You could even, as the X-Wings screamed down over the town to open fire on the AT-AT, snap to the perspective of one of the Rebel pilots. Maybe you’d even be given control over the blasters, to decide when to shoot at the Imperial walker. And you could be doing all of this while wearing a virtual reality headset.
In Pulp Fiction you move between several main characters, out of chronological order, and by the end of the movie in your mind’s eye you can put all the pieces back together and see how the different stories flowed into one another.
Imagine watching Pulp Fiction in VR, and having the ability to snap between all the main characters, at any point, to see what they were doing in relation to one another over the course of the film.
Consider the 1993 film Gettysburg, and being able to snap back and forth between the perspectives of different officers in the Union and Confederate armies, to see how different parts of the battle unfolded simultaneously.
What was happening elsewhere during the scenes where Col. Joshua Lawrence Chamberlain (played by Jeff Daniels) and the 20th Maine held Little Round Top at the far end of the Union line?
Or consider the Star Wars short Bredow showed during his talk: imagine weaving side stories like Boba Fett’s into Star Wars movies, stories you could later go back and unravel through VR experiences.
All of this is entirely theoretical, of course. Putting together a quick demo the likes of which can be shown at Oculus Connect 2 is one thing. Trying to weave enough substantive side stories into a Star Wars movie for VR represents a tremendous amount of additional writing and production, and producing a two-hour, traditional Star Wars narrative film is already a Herculean task.
You can, however, get a small taste of what narrative VR might look like, yourself, by trying the Star Wars: The Force Awakens Immersive 360 Experience video unveiled on Facebook shortly before Bredow delivered his talk at Oculus Connect 2.
The last thing Bredow showed the audience during his narrative VR demo was what it might look like to ride around on a speederbike through the town with the Stormtroopers, C-3PO, R2-D2, and Boba Fett.
He showed that whoever was on the speederbike could ride through the streets, watching the narrative scenes unfold.
Now imagine the Star Wars: The Force Awakens Immersive 360 Experience but instead of flying around a crashed Imperial Star Destroyer on the planet Jakku, you’re flying through a village on Tatooine while Imperial Stormtroopers are chasing C-3PO and R2-D2.
And rather than being on a predetermined path, you’re deciding where the speeder goes. Maybe you ride out of the town entirely and park underneath the AT-AT, to watch as the Imperial walker pounds the rebel starship with blaster fire.
Or maybe you stay within the town, take a wrong turn, come to a dead end, and find yourself facing Boba Fett.
Screengrab via ILMVisualFX/YouTube