The Case for CGI in Filmmaking

Ryan Hu (11) | STAFF REPORTER

CGI, or computer-generated imagery, is one of the most controversial aspects of the modern film industry. Several directors, most famously Christopher Nolan, favour practical effects wherever possible and consider CGI a crutch. However, the widespread use of CGI in modern films is undeniable, and it will only increase as the technology improves.

Computer-generated imagery is exactly what it sounds like: computer-generated images or video, usually three-dimensional. Today, it has a wide range of applications, including flight simulators, video games, computer-aided design, and even in courts to recreate alleged events. 

While Vertigo (1958) is often cited as the first movie to use CGI, in its opening title sequence, computer-generated imagery only became widespread in film in the 1990s. Jurassic Park (1993) is widely credited with pushing CGI into the mainstream by featuring lifelike dinosaurs that would have been far harder to depict with practical effects.

As CGI technology advanced into the 2000s, it became possible to create live-action movies almost entirely through CGI. James Cameron’s Avatar (2009) used motion-capture technology to map actors’ performances onto computer-generated humanoid aliens, set against a fully computer-generated alien planet. Weta FX, the company that created most of the film’s CGI, went as far as calling the film “animated” because of how much of it was computer-generated.

Since then, the use of CGI has only increased. Most notably, movies in the Marvel Cinematic Universe use CGI and motion capture extensively. While this is reasonable for alien environments like those in Guardians of the Galaxy (2014), and for characters like the Hulk or Ant-Man, CGI has even been used to replace entire set pieces: the airport in Captain America: Civil War (2016), for instance, was almost entirely computer-generated.

However, in the last couple of years, Marvel’s once-touted CGI has seemingly been getting worse. Possible explanations include the disruption caused by COVID-19, as well as the studio’s VFX artists being overworked and underpaid, which led them to vote to unionise earlier this year. Whatever the cause, audience backlash has shown that over-reliance on CGI can make a whole movie look fake or unrealistic.

The Hobbit trilogy, for example, was widely criticised for its overuse of CGI, which made the films feel unrealistic, especially in action-heavy scenes. Its predecessor, The Lord of the Rings trilogy, never had this issue, despite also being set in a fantasy world of elves, orcs and magic.

This is because, where The Hobbit leaned on CGI for its settings and characters, The Lord of the Rings was shot on location across New Zealand and on physical sets. Even its computer-generated characters had their performances mapped to real actors, making them more believable to the audience. This sparing use of CGI made the trilogy more immersive, and allowed it to succeed where previous adaptations of The Lord of the Rings had failed.

However, one case where a CGI-heavy sequel holds up to the original is Blade Runner 2049 (2017). Its predecessor, Blade Runner (1982), used stunning practical effects to depict the flying cars, rainy streets and futuristic technology of a cyberpunk, future Los Angeles. An entire city street set was built for the movie, but it had to be shot in the dark, with rain and fog effects, to hide the edges of the set.

Blade Runner 2049, on the other hand, shares many of the same visuals as its practically-shot original, but uses CGI to expand its setting and world. While most of the movie was shot in real locations, CGI fills in the settings that would have been infeasible to construct. Furthermore, Blade Runner 2049 is driven by its actors and their performances rather than action or special effects, which means the CGI never distracts from the main experience.

The bottom line is that CGI is simply another tool available to directors. While it can be misused in many ways, it remains a vital element of modern filmmaking, and it can even elevate a movie when it complements practical effects rather than replacing them. CGI has not harmed the quality of modern films; that blame may lie with other factors. It has, however, greatly expanded cinema’s capabilities, pulling the film industry into the 21st century.