With “The Creator,” Gareth Edwards made an $80 million indie sci-fi thriller look like a $200 million blockbuster, shooting everything first and then reverse-engineering the VFX once the film was edited together in post. It was a far cry from the back-to-back “Godzilla” and “Rogue One” studio franchises; “The Creator” marked a return to the roots of his low-budget sci-fi debut “Monsters,” where he made the leap from VFX artist to director with a personal, guerrilla filmmaking style: shooting on location with a naturalistic look and staying very hands-on with camera and VFX.
Yet his sci-fi vision of a future war with AI (pairing John David Washington’s hardened ex-special forces agent with Madeleine Yuna Voyles’ 6-year-old, human-looking AI simulant) required a new mindset and a paradigm shift: combining the benefits of big- and low-budget filmmaking, but at far greater scope and scale.
Edwards shot in 80 locations throughout Southeast Asia with a small crew and natural lighting, serving as his own primary camera operator, with no mo-cap suits, markers, or green/blue screens. Fortunately, he could afford to hire ILM (“Rogue One”) because the budget was back-loaded. He would sit down with the designers and paint over the shots in 2D to create the sci-fi world on top. Later, he would hold regular review sessions with the 3D modelers.
“The story was always established, [Gareth] knew that there was going to be a tank or a robot, we always knew there were plates gathered for these beats,” ILM production VFX supervisor Jay Cooper told IndieWire. “The components that were different were, we didn’t know the size of the tank, we addressed that based on the footage that he came back with. We didn’t know exactly on set sort of where things were going to go, but we knew they were out there.
“In terms of our approach, a lot of time was spent with [production designer] James Clyne, where he would do an initial paint-over, or multiple initial paint-overs, to find the design based on the plates, which is exactly the converse of what would normally happen: You agree to something, then you go build some portion of it on set, and then you shoot it. And then you iterate. That meant that we weren’t tied into things that were built already that we would want to chuck later. But it also meant that we were sort of starting from scratch.”
Another major advantage of doing VFX in post was that ILM was familiar with all the framing cues, which saved a lot of time. “But also I think there’s an element of performance that Gareth wanted to try to keep grounded and as naturalistic as possible,” Cooper added.
The bulk of the VFX work was devoted to the AI-animated characters. These fell mostly into two types: the simulants that resemble humans, such as Voyles’ Alphie and Ken Watanabe’s soldier, Harun, and the robot army, with different-shaped heads inspired by ’80s Sony and Nintendo designs. The simulant design was the most creative, with the face mask, the circular-shaped mech on the side of the head, and the negative space through the neck and skull.
“We knew roughly where she was going once we had that tumbler through their head,” said Cooper. “We knew there would be some negative space under the neck. But everything else in terms of the detail work, where the contours of the simulant mechanics ended and the more human aspect showed up, that was developed as we started the production. And so we did a few different designs and we even got shots pretty close to finished.”
The robot army, though, was designed after the shoot. “And so we built a bunch of different robots, with different heads and different paint schemes,” Cooper continued. “But it did have this ’80s futurism feel to it, which was intentional. And we replaced anything that would read as human. So the idea is that they’re a metal body with normal clothing [designed by Weta Workshop].”
Edwards wanted the designs to start completely in 2D, and the design work went back and forth between the director and production designer Clyne (who started his career in VFX) and ILM. “We would pull and stretch and squash and remove pieces and create more negative space and reform,” Cooper said. “And that’s something that I hadn’t been a part of in terms of this process of going back and forth a number of times. But one of the things that’s original about the way Gareth works is he has one set of opinions in 2D and then wants to see it on a turntable. Then he’s like: ‘That doesn’t look good from this side. What if we did this or that?’ And then we tried to improve our designs and improve the look.”
ILM’s collaboration with Clyne on turning locations into sets began with his initial drawovers based on plates. “And then we would iterate repeatedly on that,” on-set VFX supervisor Andrew Roberts told IndieWire. “For the floating village, it was important that there were these towering environments that were really imposing. But I also like the idea of having small amounts of detail in this little ribbon along the edges, sort of to break up these larger shapes.”
For an action set piece on a wooden trestle bridge in Thailand, where Alphie defuses an explosive confrontation with a bomber droid, they shot in five different locations around the bridge, the village, and the far side, where Harun and the simulants are behind bunkers. Roberts shot extensive reference and photogrammetry and scanned the environment, then passed it all back to the ILM team.
As for the massive NOMAD orbital space defense station, it was at one point conceived as a ring that encompassed the entire Earth. It was intended to be imposing and scary, with lots of mobility but no escape. The third act aboard NOMAD was very CG-intensive, with a lot of rigid-body destruction, intercut with scenes shot on physical sets at Pinewood Studios in London. That’s also where the ILM StageCraft Volume is located, which was used for two scenes involving an airlock and a biosphere.
“In terms of how Gareth approached the week’s StageCraft shoot, it was very successful,” Cooper said. “If we didn’t have final shots, we at least knew what we were going to do. There are some elements of StageCraft that we would always have to come back to, because it was either stunt rigging or an explosion that we had to add. But in terms of the heavy lifting of the framing and the design and the storytelling, we came out of StageCraft with that established.”
For Edwards, StageCraft will never replace shooting in actual locations, but he found it beneficial for these particular sci-fi settings. “At the moment, the tricky part is you have to figure it out, all of it, and build it, and have it all there, prior to filming,” Edwards told IndieWire. “And so it’s not total, it’s more like building sets, but they just happen to be in the computer. But it was really exciting to be on it. It wasn’t very strange, but it messes with your head. You really think you’re in those locations. And when we split the set, they could change it to a different set at the push of a button. And I kept thinking I’d moved stages to a different place.
“I would definitely use LED screen technology again. I think it has big advantages,” Edwards added. “It’s really useful in situations where you couldn’t otherwise film. If there’s a place on Earth that you could travel to and shoot something very similar, then I would just try and do that first.”
“The Creator” is now in theaters.