Published , by Kevin Tucker
Though the process has been slow, video games and movies have become largely comparable in visual effects quality, thanks mainly to film's adoption of real-time computer graphics rendering techniques that have been commonplace in gaming for years. Now, with the power of Unreal Engine at filmmakers' fingertips, the gap between games and film has closed even further. Shacknews had the chance to speak with Industrial Light and Magic's Rob Bredow about this shift, including the company's focus on Unreal technology and its use of real-time effects on Solo: A Star Wars Story.
"I think the real-time tools that we're leveraging [during] post-production and in pre-production and for our VR experiences, those tools are becoming a bigger and bigger part of the filmmaking process."
Bredow says that on Solo: A Star Wars Story, ILM layered traditional film effects with computer-rendered material at various stages of production.
"We had pre-rendered material that would mostly play, but then we could trigger it interactively, and then we could add interactive elements on top of it. So blaster fire was added interactively in real-time while we were playing back the full quality media, and we kept the quality high — no one knew what was real-time and what was pre-rendered because it was all seamlessly generated."
For more gaming and games industry videos, including convention coverage and in-depth developer interviews, be sure to follow both Shacknews and GamerHub.tv over on YouTube.