Who better than Robert Legato, winner of three visual effects Oscars (for The Jungle Book, Hugo, and Titanic), to talk about the future of this influential part of the film industry? A recipient of the VES (Visual Effects Society) Award for Excellence in Innovation and Creativity, he has helped turn virtual production into photorealistic, live-action-style filmmaking for James Cameron, Martin Scorsese, and Jon Favreau. In fact, testing virtual reality with Favreau on “The Lion King” led that director to create “The Mandalorian” series for Disney Plus in collaboration with the effects studio Industrial Light & Magic (ILM). Together they built the StageCraft platform, which eliminated the time-consuming and costly need to shoot on location. On the stages in Manhattan Beach, near Los Angeles, actors perform inside walls and ceilings covered with giant LED screens, where the physical elements of the set are blended with digitally rendered imagery. This allows filmmakers to create complex and striking digital backgrounds in real time using software such as Epic’s Unreal Engine.
As a result of this revolutionary technology, stages covered with LED walls are popping up all over the world. ILM has three stages in Los Angeles, one at Pinewood outside London, and one in Vancouver. Weta and Unity have one in New Zealand, Warner Bros. has one in Leavesden, England, DNEG and Dimension Studios have one in London, and Trilith Studios has one in Atlanta. This technology has cut back, more than ever, on location shooting, the construction of physical sets, and crowd scenes, and Legato talks about the importance of LED-wall stages and their impact on the future of the film industry.
Another hot topic Legato takes up is the future of digital performance, as animation software continues to advance and step beyond the uncanny valley. This includes de-aging, which has finally matured, resulting in more believable and fresher-looking performances. Lola VFX, the industry's leading de-aging house and Marvel's go-to specialist, has mastered the technique of softening skin and reshaping features using 2D compositing, much like Photoshop. Meanwhile, the advent of deepfake technology is pioneering a new approach in which machine learning software analyzes and merges photos and videos of actors in their younger years to create a computer composite of their faces.
– After the success of The Mandalorian, how do you rate the place of virtual production and the impact of LED-wall stages?
We were working on something similar before, but it was never as easy with bluescreen tools. With the advent of LED walls, you can now pre-build your scene, light it, texture it, make all the necessary set-dressing decisions, and then shoot a sequence with those settings. The LED stage has made visualization easier. It doesn’t even have to be a visual effects scene anymore; it could be an office or a bar or a cave. That means that as it becomes cheaper and every stage is an LED stage, you can have a wider and deeper imagination and not be limited by the budget.
– Tell me about being able to previsualize everything in virtual reality, as in The Lion King, which was a photorealistic test of imitating live action.
On The Lion King, Caleb Deschanel and I had a camera operator, a grip with a dolly, a camera assistant, a focus puller, and someone holding the gimbal arm; it was a lot like live action, just with a much smaller crew. If you’re talking about the future, you take advantage of the advances in computing and tracking and a lot of other things, but I think we still like that chemistry of actors on set with the director. We will never completely get rid of analog tools, nor should we. Now you can make The Lion King or The Jungle Book in a way that is more photoreal, and you can shoot it in a shorter period of time, without the difficult and laborious work of shooting on real locations.
I always mention The Revenant as an example of a movie we want to see but that no one wants to remake! The reason is clear: it was very hard to make and physically exhausting for Leo, standing in the freezing cold, waiting all day to take advantage of only 20 minutes of usable light. He even said he wouldn’t do it again, yet he’s still highly respected for it and won an Oscar for the role. But you shouldn’t need to suffer that pain to show the same amount of initiative and creativity. A film shot on virtual sets can be just as visually powerful and impressive.
– And we are seeing that now with the continued evolution of virtual production.
LED walls give you the full access that, years ago, was a struggle with all-CG tools. That’s no longer a problem. You can set up a scene on a video wall, shoot real actors, move the camera wherever you want, and when you’re done, move on to another scene without ever leaving the studio, whatever kind of world you build, sci-fi or otherwise. You just do your normal filmmaking and increase your production value.
– But LED walls are not a one-size-fits-all solution. It’s a tailored approach that works for The Mandalorian but not for Dune, and I’ve heard negative feedback that the lighting isn’t realistic enough.
I don’t think it’s going to replace everything completely, but it opens a new way to shoot difficult scenes. I’m working on a film now where we’re shooting a swamp sequence in New Orleans. We stop work three or four times a day because of the lighting, and as soon as it starts raining in the middle of the day, all the equipment gets stuck in the mud. LED walls are about getting that same look of the scene while being able to shoot 10 hours a day instead of just 4. All of this comes down to money. If I can’t see the difference, and you can finish in three days a sequence that would take us two weeks to shoot on location, then with enough skill you can handle the lighting problems and everything else and give it light that doesn’t feel artificial.
– Real-time engines like Unreal Engine 5 have brought, and will keep bringing, all kinds of improvements in lighting.
Yes, and since you’re talking about the future, it’s not something that’s fully here yet; I’m moving in that direction. I don’t fully understand it, but this technology gives you infinitely more tools at your disposal to produce images, and the ability to do it in real time. If I were to teach a cinematographer how to go into a virtual reality environment and bring their assistant with them, and they light it and say, I want a 20K here and a 12-by-12 there, they start saying, damn! I can light this the way I could on a huge stage. The result is the same, but there is resistance and fear toward it.
We don’t have many Calebs, John Tolls, or Vittorio Storaros, but cinematographers can adapt to this technology. This is a bit like when synthesizers came along. Everybody hated the sound those machines made, and then suddenly a single person could sit at the instrument and play a sort of orchestra; if they’re talented enough, the music that comes out of it is amazing, and now we can do more without needing an 80-piece orchestra. All it takes is imagination and a computer, and you can reproduce something extraordinary.
But again, I must say that this is the future, and it’s about adaptability, and people resist change. They fear it will take away their jobs rather than improve their work, but the set designer is still building the sets, whether they’re physical and tangible or virtual on the wall. It is still the same work of art as before. The same is true for the cinematographer. They have to light the digital stage the way they would light a conventional stage. If they do their job well, they can get almost the same result by putting the lamps in exactly the same spots. But this time, the lights are virtual and not physically plugged into a wall outlet.
– Where do you see this technology in the next 5 years?
In the next 5 years, the companies that aren’t doing it well will get weeded out.
Of course, I don’t want to name names, but I’ve worked with a few recently that were still at the beginning of adapting to and coping with the technology. They can master the technical side, the hard part: keeping the stage from getting crowded, how to use the software, and all the technical issues. But what was missing in the middle was the art. I came in and redirected the artistic side of the problem, and only then did a good result come out. What separates the men from the boys is the artistic skill, the photography, the production value, and that sorting out may happen in the next 5 years. Almost every TV show, every big studio, every stage that starts up will have an LED wall stage. Think of the efficiency of being able to shoot 5 scenes with a guest actor with just one set change at the touch of a button. I could have Tom Hanks in my movie and only hire him for three days, which I couldn’t afford otherwise.
– Now let’s talk about the technology related to acting and performance: digital face replacement and de-aging.
Making digital face replacement look truly photorealistic is all about ray tracing and ambient lighting and things like that. What used to take people who really dedicated themselves to matching every line, and was traditionally limited, you can now do without limits. You can have any actor perform as another actor; it can be any character, any size. It will reach a point where the seams no longer show, even through the hair. You can have an actor play someone else without spending six hours in makeup. The artistry of the work and all those things will remain. It will become commonplace to add something on top of a person’s real performance and make it look photorealistic.
I don’t think there will be a moment when you can replace the actor with an artificial actor. But it can be one person who creates the character, and the combination of the two becomes the movie star. It can get to where a short guy plays an action star and no one notices. I can’t name names, but there are action stars who don’t really do a lot of their own action scenes; they’re replaced by their stunt doubles and some kind of face swapping. So I think we are going to have a blended movie star whose face and whose performance come from completely different places.
– Add to this that de-aging is getting better day by day. Look at what Lola did with Mark Hamill as Luke in the second-season finale of The Mandalorian. They integrated deepfake technology into their methods, drawing on images and footage from Return of the Jedi as part of the preparation.
Deepfakes are impressive technology coming from people who aren’t artists at all, and the results are much more natural and less expensive than what we saw in The Irishman (where ILM’s method was markerless and light-driven). That’s going to take over from that kind of digital facial work in the future: you won’t sit in a chair for five hours having things glued to your face, and the technology looks just as good.
It’s amazing in the hands of experts, and it will help make the work easier to do. You’ve seen that deepfake clip (made with machine learning) where Bill Hader is doing his impressions on a talk show and his face subtly turns into Arnold Schwarzenegger’s.
The lines are gradually blurring and merging, and those of us working in my field need to become more like filmmakers. The craft and the artistry stay the same, and it’s not going to take away anyone’s job.
Source: IndieWire