Modern films, and 21st-century filmmaking as a discipline, are often criticised for being hollow, vapid, and grounded in special effects.
Cinema started life in the dying days of the 19th century and through the 20th grew to become associated with one thing above all else: storytelling. Modern cinema, by contrast, has been shaped by many developments, not least the advent of special effects.
Critics claim that this has cost the discipline something, that it’s not what it used to be; proponents claim that the technology means writers, directors, and actors are better equipped to tell stories than ever before.
This article will explore how much truth there is to both claims, but before I begin, a disclaimer: this doesn’t refer to every film released in the 21st century. The advent of indie cinema, in particular, has seen a revival of traditional and unusual techniques. Instead, when I talk about “modern films”, I mean high-grossing, high-budget Hollywood blockbusters driven by CGI and celebrity cameos.
Special effects: the end of an era or a door into the future?
There are different estimates as to when CGI first started appearing in mainstream films, but most people point to the 1970s as the decade in which it all began (with films including Westworld, Star Wars, and Alien).
Today, more than 30 years on, Hollywood blockbusters sometimes feel like technological exhibitions intended to showcase the best and brightest in computer animation rather than artful cinematic storytelling.
Franchises and brands most guilty of this include Transformers, Star Wars, Marvel, Pixar, and Disney; all household names. And while the critical response to these varies from very good to awful, there’s no denying the overarching trend towards a greater focus on the spectacular, delivered by computer animation.
The underlying question is this: does it really add that much? How much value do we get from wonderful special effects? We might remark to one another in the cinema that a certain shot or sequence is stunning, but does that make a good film?
There’s no perfect answer to any of these questions; obviously, we have to treat every case differently. CGI doesn’t necessarily mean a lack of focus on subtlety and narrative; the Lord of the Rings films, for instance, relied heavily on cutting-edge special effects and won 17 Academy Awards in the space of three short years. By the same token, though, special effects can’t redeem a flawed narrative; films like Avatar and Transformers have delivered the spectacular at something of a cost to narrative quality.
Cinema has been defined by technology
Let’s backtrack for a moment or two. Cinema through the ages has come to be defined by changes in filmmaking technology; in the late 1920s, silent films were replaced by talkies, and in the 40s and 50s, colour burst onto the screen. Why should special effects be treated any differently?
It’s a good question, but there is a fundamental difference: with sound and colour, you either have it or you don’t, and having it is arguably better than not. It’s a binary change, and once it’s adopted by the masses (at a reasonable cost), there’s no going back.
CGI is different. There is, in theory, no endpoint. CGI can continue to get better and better the more money and time we throw at it. Avatar, the film that famously took more than 15 years and $250 million to make, is testament to that. Every year, films are released with more and more impressive effects, and there appears to be no ceiling.
This is where the difference lies. Sound and colour were about bringing more clarity to the audience, about communicating a vision more effectively. Special effects can help us realise a fantasy land or a colossal spacecraft, but every year, a CGI artist will try to make any given image or project 10% better, more spectacular, or more “realistic”. A film, by contrast, can’t become any more “in sound” or any more “in colour”.
The case for old cinema
The argument, then, is that we should reject special effects in favour of traditional filmmaking: telling simple stories grounded in great writing and classical acting. When filmmakers had fewer tools at their disposal, they had to rely on script and performance alone; they were of a different breed to directors like Michael Bay and James Cameron.
And the critical consensus is clear: the writing and dialogue in Casablanca (1942) and Gone With The Wind (1939) are commonly regarded as among the best in film history. As far as actors are concerned, Bogart, Brando, Hepburn, and Streep top the all-time lists year in, year out.
There’s no right or wrong answer to such an absurdly broad question, but I think the reason critics and film buffs have such a reverential relationship with classical cinema is that there’s a certain romance about it. Films made 80 or 90 years ago feel so much more mystical; they belong to an age of Hollywood lost to time, something we can’t see or experience anymore. The directors and the stars, bar a few last vestiges, are all gone.
While the best films, new or old, are timeless, maybe we have to consider that the movies we love from decades ago are simply those that have endured. Lower-quality films don’t stand the test of time; they’re forgotten.
With that in mind, maybe the argument that older films are better, or more “pure” and “authentic” for their lack of special effects, isn’t right. Perhaps it’s just that the ones we look back on and remember are; those that survived are the cream of the crop. The bad and the average don’t last. I can’t foresee film critics in 2070 choosing to watch Transformers, while Lord of the Rings feels like it’ll stand the test of time.
There’s no right or wrong answer, and anybody who says “older films are better” isn’t right, because it’s such an impossibly broad thing to quantify. Nostalgia has a part to play, but I think the films of decades past weren’t better or worse en masse. They were just of their own time and place, and today, that carries a certain romance.