The Search for the Box Office Holy Grail

How do you predict a box office bomb versus a blockbuster triumph? It’s the oldest question in Hollywood, and anyone with a real answer holds the golden ticket. Realistically, nobody does; it can never be that simple, right? So what studios are stuck with is a mix of factors: critical reception, audience response, and good old-fashioned timing and marketing. But as it becomes easier to synthesize all these competing streams of data into a single picture, the question starts to look a bit more answerable. In this blog post I’m going to compare two methods of evaluating a movie’s quality and the degree to which that data can predict success at the box office.

The first tool I’ll take a look at is CinemaScore, run by an independent market research company based in Las Vegas. Operating since 1978, the company polls moviegoer reaction to theatrical releases with a very simple survey that gauges their overall impression of the movie (F to A+) and their likelihood of buying or renting it afterwards. CinemaScore prides itself on “establishing the simplest, most effective, and most reliable means of gauging audience response”. No tech, no frills, and most importantly, no critics: this is purely the immediate audience response.

[Image: CinemaScore ballot]

One look at recent CinemaScore results and one thing becomes clear: if a film grades below a B, something has gone horribly wrong. The vast majority of films land in the B- to A range, signifying that most moviegoers are either easily entertained or generally know which movies they’re going to like before they head to the theater. So the real appeal of CinemaScore is at the extremes: as Matt talked about in his presentation on online reviews, people often revert to the starkly negative or positive. If someone isn’t even willing to give a movie a B-, it might as well be an F. Receiving the dreaded “F” from CinemaScore has become a scarlet letter, an “honor” granted to only 19 films in the company’s entire history.

As an aside, CinemaScore slams some critically acclaimed movies: Darren Aronofsky’s “mother!”, released in 2017 with a fresh 69% rating on Rotten Tomatoes, is one of the films to receive that dreaded F. Another example: the 2012 crime thriller “Killing Them Softly”, a 74% on RT, also scored an F. But a bust is a bust, whether it’s 2009’s “Land of the Lost” (a C+) making $68 million on a $100 million budget or “mother!” (F) with $45 million on a $30 million budget. There are a lot of factors at play here, whether it’s poor marketing, lack of star power, or a movie that just plain stinks. Whatever the cause, though, the scores are good at predicting the “box office multiplier”: the total box office gross divided by the opening weekend gross, a measure of the film’s staying power with audiences. At the other extreme, films earning an A+, most recently Marvel’s “Black Panther”, are almost uniformly blockbuster hits, with high grosses and high multipliers.
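To make the multiplier arithmetic concrete, here’s a minimal Python sketch. The function name and the dollar figures are mine, purely for illustration; they aren’t taken from CinemaScore’s data.

```python
def box_office_multiplier(total_gross: float, opening_gross: float) -> float:
    """Total box office gross divided by opening-weekend gross:
    a rough measure of a film's staying power with audiences."""
    return total_gross / opening_gross

# Hypothetical figures in millions of USD, for illustration only.
front_loaded = box_office_multiplier(total_gross=60.0, opening_gross=30.0)    # 2.0x: audiences bailed quickly
word_of_mouth = box_office_multiplier(total_gross=210.0, opening_gross=60.0)  # 3.5x: strong legs after opening

print(f"front-loaded release: {front_loaded:.1f}x")
print(f"word-of-mouth hit:    {word_of_mouth:.1f}x")
```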


It should also be said that CinemaScore can help us look past the sometimes-reactionary backlash that hits some major films. “The Last Jedi” created controversy among Star Wars fans, reflected by a scathing 48% audience score on Rotten Tomatoes, but CinemaScore tells a different story, giving it a solid A rating. And it doesn’t take a lot of research to see that “The Last Jedi” made quite a bit at the box office. The people CinemaScore surveys may or may not be tuned in to the internet politics of movie criticism and fanboyism, and that insulation is probably helpful in the long run. CinemaScore’s methods may be rooted in the past, but it’s easy to see why it’s still looked to as the gold standard.

But how about some of the newer tools of the trade? Let’s take a look now at Metacritic, a review aggregator that collects both critic and audience reviews. Unlike Rotten Tomatoes, which assigns either a “fresh” or “rotten” rating to each critic review, Metacritic averages numerical scores for both users and critics, making it a closer analogue to CinemaScore. Metacritic did its own in-house study of critics’ ability to predict box office success, and its conclusion was eerily similar to CinemaScore’s.

Once again, the “box office multiplier” effect is very evident: the better a film is received critically, the better its multiplier and overall gross. The multipliers are also relatively consistent with the CinemaScore study, falling in the same 2 to 4.5 range and showing the same upward trend. One difference is at the very top: whereas an A+ CinemaScore carries a dramatic jump in the multiplier, the Metascore curve plateaus a bit earlier. Small sample size could certainly be the culprit there. Either way, critical consensus is clearly a major factor in audience behavior.
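As a rough illustration of how an analyst might lean on that trend, here’s a small sketch that maps a review score to an assumed multiplier in that 2 to 4.5 range and projects total gross from an opening weekend. The breakpoints are my own invention; neither Metacritic nor CinemaScore publishes a formula like this.

```python
def assumed_multiplier(score: float) -> float:
    """Map a 0-100 review score onto a rough staying-power multiplier.
    Breakpoints are illustrative assumptions, not published figures."""
    if score >= 90:
        return 4.5
    elif score >= 75:
        return 3.5
    elif score >= 60:
        return 3.0
    elif score >= 40:
        return 2.5
    return 2.0

def projected_total_gross(opening_gross: float, score: float) -> float:
    """Opening-weekend gross (in $M) scaled by the assumed multiplier."""
    return opening_gross * assumed_multiplier(score)

# A hypothetical $50M opening with a score of 80 projects to about $175M.
print(projected_total_gross(opening_gross=50.0, score=80))
```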


It is important to note the difference in distributions: whereas the majority of CinemaScore grades fall in the Bs or higher, Metacritic shows a far more normal distribution of ratings, with most films landing in the middle of the pack. It’s interesting, then, that with such different pools of data, Metacritic and CinemaScore reach basically the same conclusion: good movies make money. Hardly a revolutionary concept.
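Comparing the two distributions on one axis takes a small conversion step, since one scale is letter grades and the other runs 0 to 100. Here’s a minimal sketch assuming a numeric mapping of my own; CinemaScore doesn’t publish official numeric equivalents.

```python
# Assumed mapping from CinemaScore letter grades to a 0-100 scale.
GRADE_TO_SCORE = {
    "A+": 100, "A": 95, "A-": 90,
    "B+": 85,  "B": 80, "B-": 75,
    "C+": 70,  "C": 65, "C-": 60,
    "D+": 55,  "D": 50, "D-": 45,
    "F": 25,
}

def grades_to_scores(grades):
    """Convert a list of letter grades to numbers for comparison with Metascores."""
    return [GRADE_TO_SCORE[g] for g in grades]

# Illustrative grades only: note how the converted values cluster near the
# top of the scale, unlike the more spread-out Metascore distribution.
print(grades_to_scores(["A", "A-", "B+", "B", "B-", "C"]))
```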

But neither system is perfect, and predictions become hazier when audiences and critics disagree, which they often do. I mentioned some examples earlier, but how about one from a box office near you? Take “Annihilation”, released just this past weekend: well received by critics with a Metascore of 80, but rocking an unsightly CinemaScore of C. So far, the indicators point to a bomb, another win for the tried-and-true method of exit polling. The distributors seem to accept this conclusion: the film is only receiving an official theatrical release in the U.S., while international markets will see it exclusively on Netflix (Netflix’s ability to save potential “bombs” is another topic entirely, but I digress).
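For what it’s worth, that kind of critic/audience split can be flagged mechanically. A tiny sketch, where the threshold and the converted C-grade value are entirely my own assumptions:

```python
def critics_audience_split(metascore: float, audience_equiv: float,
                           threshold: float = 15.0) -> bool:
    """True when the critic score and the (numerically converted) audience
    grade diverge by at least the chosen threshold."""
    return abs(metascore - audience_equiv) >= threshold

# An "Annihilation"-style case: a Metascore of 80 against a C grade,
# converted to roughly 65 on the mapping sketched above.
print(critics_audience_split(metascore=80, audience_equiv=65))  # True
```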



The age of review aggregators has made it easier than ever to go into a movie with biases already entrenched. If the critics or the audience liked a movie this much, shouldn’t I like it too? It ties into the “wisdom of crowds” concept that we have talked about at length in class. Assuming a bit of trust in critics, it takes five minutes to check a movie’s Metascore or RT rating, a habit that can save you $15 and a valuable Friday night. It’s no surprise that those scores can, to an extent, predict box office potential. But CinemaScore’s enduring success is proof that Hollywood shouldn’t leave the old methods behind anytime soon. The digital methods of the present are just another tool in the search for the Holy Grail of box office prediction.

Sources:

http://www.metacritic.com/feature/film-quality-vs-box-office-grosses

https://www.cinemascore.com/press/article/id/16

https://www.theatlantic.com/entertainment/archive/2018/01/annihilation-paramount-netflix/551810/

6 comments

  1. Great post, Ray! I have very little familiarity with movie review systems other than Rotten Tomatoes, so this was all new to me. It’s really interesting how the two methods you discussed take different approaches to generating ratings, yet usually come to the same (if not very similar) conclusions. It will definitely be interesting to see how data-driven the movie review business becomes. I know Google for one has tested algorithms that can predict revenues based on search volumes and click ads, which makes me wonder what sort of impact that will have on the rating process down the road.

  2. As a huge movie person, I thought this post was awesome. The data behind the ratings and revenue was super interesting. I didn’t know a lot about websites like Metacritic because I always based my pre-movie judgements strictly on Rotten Tomatoes. I plan on seeing Annihilation this week and am very curious to compare my own view to the CinemaScore. I am a fan of Sci-Fi and have heard a lot of mixed reviews.

  3. Nice job with the detail in this post; it was very educational. I would love to hear more about how Netflix saves a bust. I thought it was interesting how most CinemaScore grades are above a B. Based on personal experience, it makes sense that most people know what they want when they decide to go out to a movie. When I choose to actually pay that $15-20 to see a movie, which also involves transit time and a trip to the concession counter, I am usually very certain it is something I will like. I leave the experimentation for my streaming services, where the cost is much lower and I can easily “bail” on a bad movie.

  4. Really cool post, Ray! Your post, in conjunction with @mpduplesmba’s presentation a few weeks ago, definitely has me questioning rating systems, and I will consider them more critically when I use sites like Rotten Tomatoes now. It leaves me wondering if there is any solution to this, considering the power of crowds and how much is related to subconscious associations with other people or biases we hold. I wonder if this is something that can be addressed through algorithms or more big data, but because it is so tied to humans, I feel like even that may be difficult. Your discussion of how these rating systems can be used to predict movie success is also really interesting, and it definitely shows the power data can hold in any industry or role.

  5. What I think is really interesting when it comes to this is bias, which was talked about in a presentation the other week. A lot of the time, when I see that a movie has a certain rating with certain reviews, I will go in expecting to either like it or dislike it. This is a problem with today’s review culture, which is ingrained in the technology that we have at our fingertips.
    Although it seems old-fashioned, CinemaScore seems to be more objective. With people voting right after they see the movie, it has the potential to be a completely objective rating system. However, this system will of course be biased because of people looking at reviews and such before going to spend $15 on a movie night.

  6. I find these tools helpful because they help me not waste my time on bad or average movies. Nothing is worse than reaching the end of a movie and feeling like that’s two hours of your life you will never get back. I personally love Metacritic (for games too).
