Oscars by the numbers: The people vs. Oscar vs. critics


Everyone likes to bemoan that the Academy gets it wrong, that critics don’t know what audiences like, that the masses have bad taste. But how different were the responses to the 2020 Oscar nominees really?

When the 2020 Academy Award nominations were announced in January, factions of Film Twitter had a minor melee over Joker’s year’s-high 11 nominations.

See!, tweet-shouted some, the Academy gets it: Joker is a masterpiece.

Yikes, tweet-griped others, this is stronger evidence than ever that the Academy has lost the plot when it comes to recognizing cinematic achievement.

We all know — or we should all know, and try to remember — that much of this righteous tweeting is in response to straw men. A couple of tweets do not a critical mass make. But the idea for this post, in which we'll compare 50 movies across their Academy Award nominations, critic scores and audience scores, similarly came from a couple of tweets.

When the Oscar nominations were announced and it felt like all the talk was about Joker, a user named Thomas Doherty, whose profile describes him as a "Brandeis American Studies professor who watches a lot of film and TV," tweeted that "the film-crit elite, who seem to have all gotten the same memo to slam JOKER, are missing how strongly it connected with audiences across the board."

Alissa Wilkinson, film critic at Vox, responded: "This take is getting a little tired, for a movie that won the Golden Lion, was programmed at every major film festival this fall, and has a 69% on RT and 59% on Metacritic. *Everyone* knows it connected with audiences, but the "critical elite" slamming Joker en masse isn't real."

Using Wilkinson’s evergreen reminder that Twitter is not reality as a jumping-off point, I thought it would be interesting to explore just how close (or far!) critical consensus was from “audiences” as well as from the Academy’s picks.

As it turns out, when averages are taken at this scale, there's not a ton of difference. The highest-scoring films in both the critic and audience categories received Oscar nominations, though there is a more valid argument to be made about the number and category of nominations relative to each film's quality. Among the snubbed films, Portrait of a Lady on Fire, Dark Waters, Spider-Man: Far From Home, John Wick 3 and Waves were beloved by critics and audiences alike. And as it turns out, neither critics nor audiences had all that much love for Joker after all; it sits solidly in the middle of the pack for both.

Before we present the data for your perusal, a few more caveats and conclusions.

Caveats and conclusions

I looked at 50 films: 35 that received at least one 2020 Academy Award nomination, and 15 that were among the "best" and "most popular" films of 2019, as aggregated by IMDb and affirmed by my own judgment as an entertainment editor.

The charts included below draw their critical score from an average of the Rotten Tomatoes Tomatometer and the Metacritic score (both also listed individually). You will notice that, almost across the board, Metacritic scores are lower. In both cases, one must acknowledge the shortcomings of aggregators: both outlets do their best to assemble a representative collection of critics and control for quality, but the breadth of included reviews (and Rotten Tomatoes' responsibility to assign a numeric value to reviews that don't provide their own) makes that hard.

Our audience scores come from the Rotten Tomatoes Audience Score and the IMDb User Score. As with the critics, Rotten Tomatoes showed higher scores in general; no IMDb User Score for any of the 50 films we surveyed was higher than 8.6 (adjusted to 86 for averaging purposes), which naturally lowered the collective audience average. Still, the average audience scores are frequently higher than the critical scores.
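The averaging described above is simple arithmetic, but the one wrinkle worth making explicit is the scale adjustment: IMDb scores run 0–10 while every other aggregator runs 0–100. A minimal sketch of the method (the audience figures here are hypothetical examples, not the article's actual data):

```python
def critic_average(tomatometer: int, metacritic: int) -> float:
    """Average the two critic aggregator scores (both on a 0-100 scale)."""
    return (tomatometer + metacritic) / 2

def audience_average(rt_audience: int, imdb_user: float) -> float:
    """Average the RT Audience Score (0-100) with the IMDb User Score
    (0-10), scaling the latter by 10 so both share the same scale."""
    return (rt_audience + imdb_user * 10) / 2

# Joker's critic scores as cited earlier in the article: RT 69, Metacritic 59.
print(critic_average(69, 59))      # → 64.0

# Hypothetical audience figures, purely for illustration.
print(audience_average(88, 8.4))
```

The same two-line computation underlies every row of the charts below; only the inputs change.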

There are two potential confounding variables here. First, critics are assigned movies to review — it’s literally their job. In most cases, we can assume audience members are only going to see movies they want to see in the first place, making their odds of seeing a movie they hate somewhat lower. Second, there is what I’m going to call the Yelp factor. Audience or user scores are submitted by the kind of people who are inclined to submit Rotten Tomatoes or IMDb reviews. I’m not going to draw any conclusions about such people — though we do know for a fact there is some overlap with the troll community — but a self-selecting group is never going to be truly representative of audiences.

In any case, there's also quantity. While not everyone who sees a movie logs into Rotten Tomatoes to share their opinion, enough people do that the gap between critic and user review counts can often run into the thousands, especially for a blockbuster release like Avengers: Endgame (504 critic reviews vs. 68,098 audience submissions). However, some films, particularly foreign films and documentaries, actually have more critic reviews than audience scores. If you haven't gotten it by now: this is all extremely imperfect.

Then there is perception, specifically on social media. It's the genesis of this project, and yet I drew a blank on how to measure the perception of critical response: I have no idea how to represent the impression that the "film-crit elite" slammed Joker when none of the review aggregators reflect that reality. (The same goes for the perception of audience response, which, like the perception of critical response, is inherently personal and shaped by your social feeds and IRL circles.) So I will leave that element of comparison to your own experience.

Dive into the data for yourself, presented below sorted three ways: by Oscar nominations, by average critical score and by average audience score.

Sorted by 2020 Academy Award nominations

Sorted by average critical score
