
The tools ILM built to make Rogue One are super interesting

Every time I watch or read something about how Rogue One was made, I come away more intrigued. And it’s not about how they made the film…it’s about the tools they built to help them make the film. A few weeks ago, I posted about the full-length story reel they made from bits of old movies so that director Gareth Edwards could determine the pacing:

There was no screenplay, there was just a story breakdown at that point, scene by scene. He got me to rip hundreds of movies and basically make ‘Rogue One’ using other films so that they could work out how much dialogue they actually needed in the film.

It’s very simple to have a line [in the script] that reads “Krennic’s shuttle descends to the planet”; now that takes maybe 2-3 seconds in other films, but if you look at any other ‘Star Wars’ film you realise that takes 45 seconds or a minute of screen time. So by making the whole film that way (I used a lot of the ‘Star Wars’ films, but also hundreds of other films too), it gave us a good idea of the timing.

In this video, we see a couple more tools the team used to facilitate the making of the film. The first is a VR video game of sorts that ILM built so that Edwards could move a virtual camera around in a virtual set to find just the right camera angles to capture the action, resulting in a process that was more flexible than traditional storyboarding.
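To make “move a virtual camera around in a virtual set” a bit more concrete, here’s a rough sketch, in Python, of the one piece of math every tool like this sits on: turning a camera position and a point to look at into the view matrix a renderer uses to draw that shot. This is standard graphics bookkeeping, not ILM’s code; the set, the renderer, and the VR plumbing are all assumed to live elsewhere.

```python
# A minimal look-at camera: the math underneath any "virtual camera" tool.
# Illustrative sketch only -- not ILM's implementation.
import numpy as np

def look_at(eye, target, up=(0.0, 1.0, 0.0)):
    """Return a 4x4 view matrix for a camera at `eye` pointed at `target`."""
    eye, target, up = (np.asarray(v, dtype=float) for v in (eye, target, up))
    forward = target - eye
    forward /= np.linalg.norm(forward)
    right = np.cross(forward, up)
    right /= np.linalg.norm(right)
    true_up = np.cross(right, forward)

    view = np.eye(4)
    view[0, :3] = right
    view[1, :3] = true_up
    view[2, :3] = -forward              # the camera looks down its own -Z axis
    view[:3, 3] = view[:3, :3] @ -eye   # shift the world so the camera sits at the origin
    return view

# Example: a low, street-level camera tilted up at something descending overhead.
print(look_at(eye=(2.0, 1.5, 10.0), target=(0.0, 30.0, -40.0)))
```

The appeal of wrapping that in a VR rig rather than a keyframe editor is that Edwards could hunt for the shot the way a camera operator would, by moving around in the space.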

The second tool took a complete digital model of Jedha City and rendered hundreds of street views from random camera positions within it. The filmmakers then looked through the results for interesting shots and found framings that felt more “natural” than something a digital effects artist might have composed on purpose: basically massively parallel location scouting.
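ILM hasn’t published that pipeline, but the shape of the technique is simple enough to sketch. Assuming a hypothetical render_view() standing in for whatever renderer sits on top of the Jedha City asset, the idea is just: sample a pile of random street-level camera poses, render them all, and let humans sift the results for keepers.

```python
# A sketch of "massively parallel location scouting", assuming a hypothetical
# render_view(camera) that can rasterize the digital city from any pose.
import random

def random_camera(city_bounds, eye_height=1.7):
    """Pick a random street-level camera pose inside the set's bounding box."""
    (xmin, xmax), (zmin, zmax) = city_bounds
    return {
        "position": (random.uniform(xmin, xmax), eye_height, random.uniform(zmin, zmax)),
        "heading_deg": random.uniform(0.0, 360.0),    # which way the camera faces
        "pitch_deg": random.uniform(-10.0, 20.0),     # a little tilt up or down
        "focal_mm": random.choice([24, 35, 50, 85]),  # common prime lenses
    }

def scout(render_view, city_bounds, n_views=500):
    """Render n_views random street views; return (camera, image) pairs for review."""
    cameras = [random_camera(city_bounds) for _ in range(n_views)]
    # Every render is independent, so this loop spreads trivially across a
    # render farm -- hence "massively parallel" scouting.
    return [(cam, render_view(cam)) for cam in cameras]

# Usage with a stand-in renderer, over a 1 km square patch of the set:
contact_sheet = scout(lambda cam: f"<frame at {cam['position']}>",
                      city_bounds=((-500, 500), (-500, 500)), n_views=10)
```

The curation step stays human; the computer’s only job is to produce far more candidate frames than any location scout could walk.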

Both are attempts to introduce more serendipity and possibility into a digital filmmaking process that sometimes feels a little stilted. I think animation studios like Pixar have been using these techniques for years, but it’s interesting to see them applied to live-action films like Rogue One.

Update: The Verge’s Bryan Bishop talked to Edwards and visual effects supervisor John Knoll and came away with more interesting details about how they used technology in filming Rogue One.

Typically, you’d have to storyboard these things, and that means you’re pulling from some default, subconscious idea in your head, probably based on another film you’ve seen, where you feel like it should be this shot. I find you get much better, more unique, shots when you are in a real environment, trying to find something that’s unfolding in front of you. You get inspired because of the light and shapes and things. It was like being in the real world more, and like the way we shot a lot of the rest of the movie.

I think if I ever do a big film again, and there’s a big digital set piece in it, whatever that is, I would definitely want to pre-animate it and then go in with a camera and try and film it like it was real.

Read the whole thing…the bit about the LED screens is fascinating. Prior to the 1980s, aside from some relatively minor editing tricks, effects in movies were mainly done during shooting. More recently, most of the production happens after the cameras stop rolling: extensively green-screened footage of the actors is combined with entire sets and worlds that are completely digital. With Rogue One, they tried to move some of that production back into the shooting phase in order to give the director more control over the scenes and the actors a more immersive environment in which to act. (via @sippey)