10 Questions to Get Film Lovers Arguing

By James Cateridge

If there’s anything that film fans, students and scholars share, it’s a love of animated debate about film – or a good argument. All of the following issues are bound to liven up a classroom discussion, spark off an engaging conversation over coffee or encourage a spirited dissection of the film just screened as the credits roll. None of them have a correct answer, so just get stuck in and be as opinionated as you like. Obviously it helps if you are well informed, which is where a certain For Dummies guide may be invaluable. . . .

Is cinema dead?

Critics and social commentators have been predicting the demise of cinema for several decades. In the 1980s, cinema audiences worldwide hit an all-time low and film’s days seemed numbered. But then something strange happened. Audience numbers began to recover and have been on the rise almost everywhere since. So what saved cinema? Here are two possible answers:

  • Multiplexes: Some people hate them because they’re soulless additions to shopping malls. But they’re clean, safe and accessible. They boast a level of choice above what most inner-city cinemas offered.

  • Videotapes and DVDs: Revenue from these formats effectively saved the Hollywood majors. But since the Internet subsequently ate DVD, all bets are off.

Has digital cinema destroyed realism?

The cinema always felt real to audiences because of its relationship with photography. Movie cameras used to capture reality through mechanical and chemical processes, and so people trusted that what they were seeing on-screen was ‘true’. Of course, you knew that all kinds of trickery went on to bend the rules, but as long as cinema used moving photographic images most viewers were happy to believe their eyes.

Now things are rather different. Digital cameras still use lenses to capture light, but they record it in very different ways, through algorithms and data. Also computers can create images themselves, to an increasingly ‘photorealistic’ level. Digital cinema is more like animation, which can be realistic or stylised, but humans always create it. Perhaps this idea is a more realistic view of what cinema has always done in any case. . . .

Are film critics still worth reading?

In the olden days before the Internet, audiences needed film critics to tell them what was on at the local cinema and whether it was any good.

The first part of that purpose has largely disappeared now that show times are so easy to find and on-demand services let you view movies any time, anywhere. As regards the second part, well (again thanks to the Internet), today everyone’s a critic. Blogs, customer reviews and Twitter hashtags mean that you can instantly find a dizzying array of opinion on any film you want to see.

With more and more available ways to spend your entertainment time, however, you do need some kind of filter to reduce your options and make choosing more manageable. Perhaps you can think of ‘official’ critics as skilled and trained curators of an imaginary media museum.

Do film stars matter anymore?

Today’s Hollywood press loves to speculate about which stars are the best or the worst ‘value’ based on calculating how much box-office return is made for every dollar spent on their salaries. This equation is a fairly blunt instrument, but the larger subtext is clear: stars are no longer worth the exorbitant salaries that they command. If you look at the biggest hits of cinema history, few of them are uniquely star-driven.

So why are stars still paid so much? Well, they produce loads of cheap publicity for one thing. Plus, they’re often vital in the early stages of a project, when producers are trying to convince studios or other investors to part with their cash. Also, without film stars, how would advertisers make you buy perfume or watches?

Is television the new cinema?

Director Steven Soderbergh’s recent decision to abandon film-making in favour of working in television raised a few eyebrows, but his argument made sense. Hollywood has fallen behind in terms of willingness to experiment and take risks. A decade ago, this career move would have seemed unthinkable, because directors started in the grunt work of TV and then moved up to the glamorous film world.

The line between cinema and television has only become more and more blurred: many people consume movies at home and yet fans can watch some event TV in cinemas (such as Doctor Who in the UK). Plus, the number of big-name film directors making TV shows has notably risen – for example, Martin Scorsese’s work on Boardwalk Empire (2010–14) or Jane Campion’s Top of the Lake (2013).

But whether today’s aspiring Soderberghs will be able to make their reputations purely on the small screen is another question.

Why not release films on all formats at the same time?

Until fairly recently, cinemas loudly defended their God-given right to show movies first and make all the other formats wait their turns. When Disney tried to shorten the time between cinema and DVD release for Alice in Wonderland (2010), cinema chains in the UK threatened to boycott the film. Within three short years, Ben Wheatley’s A Field in England (2013) was released in cinemas, on DVD and online simultaneously. So what changed?

The industry’s answer is unambiguous ‒ Internet piracy ‒ which played a significant role in shrinking the secondary release windows for blockbusters, particularly internationally. But for smaller, independent films, going out on all formats at once just makes sense. Film producers can make the most of limited marketing budgets, and audiences have a choice of viewing options to suit their lifestyles.

Will 3D stick around?

The ongoing debate around the use of 3D imaging in digital cinema provides a great excuse to think through the collisions of industry, technology and creativity that characterise cinema. Different versions of stereoscopic projection have been around since the birth of film, and film-makers have attempted several times to introduce the technology into the mainstream.

This time around, the enormous profits of Avatar (2009) turned the industry’s head, along with the relative ease of converting digitally shot films into 3D. Since Avatar, critical hostility and audience indifference have returned, although Gravity (2013) offered a ray of hope that film-makers can use the format inventively and reignite audience excitement.

Has Hollywood had its day?

Although no longer the largest and most productive film industry in the world (thanks to India), Hollywood remains the gold standard for international film exports. It has varying levels of presence in markets worldwide, but its old business model is seriously creaking.

The old system of market-dominating blockbusters made most sense when cinemas were the heart of the battleground. As the balance of power shifts towards home-entertainment formats, however, blockbusters have to compete with many other products and enjoy much less of an automatic advantage. Just think about Netflix, and the way it places big-budget comic-book movies up against indie or foreign films. And if India and China get their export acts together, Hollywood may be toast.

Are directors still auteurs?

In the 1950s, claiming that Hollywood directors were the brilliant authors of their films was a radical move, one that effectively paved the way for modern-day film studies. Simple, logical arguments against this idea exist, however, such as the collaborative nature of film-making, as well as more complex theoretical ones (such as the idea that films only exist at the point when a spectator views, or reads, them).

Yet the auteur theory spread into film journalism and everyday conversations about film, and even the industry took it up: witness the spate of ‘director’s cut’ DVD releases in the 1990s and 2000s. As foremost film scholar Richard Dyer suggests, auteur theory is film studies’ ‘greatest hit’. The theory may have fallen out of fashion with film theorists, but the vast majority of film buffs still believe that directors are gods ‒ and they can be difficult to argue down.

Why study films when you can just go out and make them?

These days, film-making equipment is cheaper and easier to use than ever before. With just an entry-level HD camera, a laptop and an Internet connection, you effectively have the means to shoot, edit, post-produce and distribute your masterpiece directly to the world. So why waste your time studying other films first?

Well, you may consider yourself to be a self-taught film-making genius, but you have to learn your craft from somewhere. You can choose to imitate other home videos uploaded to YouTube, or you could go back to the source: to the great films made by generations of talented film-makers. The style and technique of classical Hollywood films may feel remote and out of reach, but you can certainly learn and apply the basic elements of editing and structure. And you’re bound to find a film-maker, genre or movement that inspires you to create much better work of your own.