
Opinion


Since its inception, computer-generated imagery (CGI) has revolutionized cinema, with films such as “Avatar,” “Jurassic Park” and “Star Wars” taking advantage of the groundbreaking technology to bring stunning visuals and wacky characters to their legendary stories.

However, in recent years CGI has received significant backlash from moviegoers. Despite continued innovation in the technology, I believe that the quality of CGI is not what it used to be.

As CGI evolved with each passing decade, it slowly became a vital part of the filmmaking process. In the late 1980s and early 1990s, it was rare and even groundbreaking in the film world. Films such as “Jurassic Park” and “Terminator 2: Judgment Day” pushed the limits of visual storytelling with impressive CGI visuals, setting them apart from their competition.

However, what was even more impressive was the filmmakers’ ability to create and use CGI in movies in the first place.

The process of creating a CGI scene is anything but simple. According to Adobe, “Professional CGI animations often require a team of VFX, pre-visualization, lighting, animation, rotoscoping and compositing artists — among others.” Essentially, it takes time, dedication and care to craft compelling digital imagery.

In 2018, the average CGI cost per film was $33.7 million, a steep price for a process that is both lengthy and difficult. So, with large teams and massive budgets behind CGI characters and visuals, why do moviegoers criticize it so much in modern films?

Back in the 1990s, CGI was groundbreaking. Now, it is overused and applied to the smallest and least significant details of films. While CGI was exciting and new when it was first introduced, it has become cliché and tacky, either poorly executed or no longer used in moderation.

According to Animost, “The problem isn’t the technology. CGI has become more powerful than ever. But it’s often used as a shortcut.” Visual effects teams are stretched thin, often given smaller budgets, less time to work and more assignments to complete. Under conditions like those, it is clear why CGI appears to have gone downhill.

A prime example of a CGI backfire was Marvel’s “Ant-Man and the Wasp: Quantumania,” released in 2023. Fans criticized the film for its weightless, artificial and haphazard CGI sequences, a surprising failure from a studio once known for cutting-edge effects.

Regarding this film, Yahoo noted, “It’s a good reminder of the fact that bad CGI often isn’t the fault of the artists doing the bad CGI, but rather higher-ups who don’t give them enough time to do good CGI.” 

Despite these criticisms, CGI remains one of the most important tools in modern filmmaking. If studios want to meet and exceed fan expectations, executives need to prioritize realistic timelines, fair budgets and appropriate pay for the artists behind the technology.  

The technology itself is not the problem. The way it is being used is.  

Long is a staff writer for the Liberty Champion.
