5 Ways to Fix Rotten Tomatoes (and Other Review Sites)

By: Marcus Varner | July 26, 2016 (Edited July 7, 2017)


The last time you considered going to the movies, how did you make your decisions?

If you're like most regular moviegoers, your decision-making process most likely included a review site like Rotten Tomatoes (RT), Metacritic, or IMDb. If you're like me, when you saw that a movie you had previously been interested in had scored something below 50% on RT, your interest probably began to wane.

Film review sites have, like other review sites, become undeniably influential. But are they really deserving of that influence?

This week, in the wake of the release of the Ghostbusters reboot, a certain article caught my eye: "'Ghostbusters' Is a Perfect Example Of How Internet Movie Reviews Are Broken."


Now, I've seen the movie. It's not a great movie. It's not even a good movie. But it was what article author Walt Hickey had to say about movie reviews at large that gave me pause. After pointing out discrepancies in the ratings for Ghostbusters by female reviewers versus male reviewers (see below), he blasted film review sites:

"It lays bare so, so much of what we're investigating when it comes to the provenance and reliability of internet ratings. Namely, they're inconsistent, easily manipulated and probably not worth half the stock we put in them."

While I don't agree with Hickey on all of his points about the movie, I do agree that RT and other movie review sites are in need of an overhaul if they're going to successfully lead moviegoers to the best movies.

Riffing off of Hickey's article, and based on my own observations, here are five ways that RT and other movie review sites can fix the movie review system. Of course, other review sites (looking at you, Yelp) can also benefit from these suggestions:

1. Standardize all reviews to numerical scores


All of the movie review sites let average-Joe users like you and me assign a single numerical score to movies. But things get messy when they try to incorporate reviews from professional critics.

Some movie reviewers give a numeric score to the movies they watch. Some assign letter grades. Others just go on for several paragraphs and trust in their audience to pick through what they liked and what they didn't. But how to effectively blend these reviews with the scores given by average users? Each site has its own way of tackling this issue.

Metacritic makes an admirable effort by translating critics' reviews into a score between 0 and 100. This allows them to be a bit more specific, but the process of picking a number is still fairly subjective.

IMDb sidesteps this challenge by leaving out professional critics' reviews altogether. This is a shame, because it excludes the weightier opinions of trained professionals, which could surely be of value to moviegoers.

RT uses what is perhaps the spottiest method: they simply put critics' reviews into one of two buckets, 'like' or 'didn't like'. A like counts as 100%; a dislike counts as 0%. If you've read enough reviews, you know that very few reviews are all good or all bad. They're usually something like, "Angelina Jolie's acting was amazing, but the editing could've used some work."

Here's the unfortunate truth: any attempt to assign a score to a critic's review by anyone other than the critic himself is going to be prone to subjectivity and bias.

For this reason, it seems only right that all critics assign their own numerical scores to their reviews, if only to get rid of the guessing games that movie review sites currently have to play. Implicit here is that, for this to work, all critics would need to abide by a single numerical scoring system. It can be 1-5, 1-10, or 1-100, but they all need to agree to stick to it.
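
To make the idea concrete, here's a minimal sketch of the kind of conversion a shared scale would replace. The letter-grade and star mappings below are my own assumptions for illustration; no site publishes an official table like this, which is exactly the problem.

    # A hypothetical normalization of common review formats onto a 0-100 scale.
    # The specific mappings are assumptions, not any site's real conversion table.
    LETTER_GRADES = {"A": 95, "A-": 91, "B+": 88, "B": 85, "B-": 80,
                     "C+": 77, "C": 75, "C-": 70, "D": 65, "F": 40}

    def normalize(review):
        """Convert a letter grade, a star rating out of 5, or an RT-style
        like/didn't-like into a single 0-100 score."""
        if isinstance(review, bool):                   # like / didn't like
            return 100.0 if review else 0.0
        if isinstance(review, str):                    # letter grade
            return float(LETTER_GRADES[review.upper()])
        return float(review) / 5 * 100                 # stars out of 5

    print(normalize("B+"))   # 88.0
    print(normalize(3.5))    # 70.0
    print(normalize(True))   # 100.0

Even a rough mapping like this shows how much judgment gets packed into every conversion, which is why the critics themselves should be the ones picking the numbers.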

But how then do you get every critic to agree to your scoring system? I'm glad you asked...

2. Standardize across review sites

One point of consternation for Hickey is the wildly inconsistent results from the major movie review sites, as illustrated in this graph:

[Graph: inconsistent ratings for the same movies across the major review sites]

I get it. If you're creating your own movie review site, you don't want to do things just like the other guy from whom you're trying to steal eyeballs. For this reason, of all the recommendations in this post, this one is least likely to be adopted. After all, RT, Metacritic, IMDb, and Fandango are all competitors to some degree or another. They want to differentiate themselves from one another, not join forces.

But this unorthodox move would have some huge benefits. First, it would greatly increase the odds of getting critics to participate in a collective system, since they wouldn't be bombarded with a multitude of different scoring schemes.

Second, it would make their scores far harder to assail. A basic principle of statistics is that the larger the sample size (here, the number of reviews), the smaller the sampling error and the more reliable the resulting average. If RT were to standardize and then share data with other sites like Fandango, IMDb, and Metacritic, it would make their numbers that much more trustworthy to users and shut down their naysayers.
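
For readers who want to see the sample-size point in action, here's a tiny simulation (with made-up ratings, not real data) showing how the uncertainty around an average rating shrinks as the pool of reviews grows:

    # Simulated ratings only: the point is that the standard error of the mean
    # shrinks roughly with the square root of the number of reviews.
    import random
    import statistics

    random.seed(42)
    TRUE_SENTIMENT = 6.5   # hypothetical "true" audience feeling, on a 1-10 scale

    for n in (10, 100, 1_000, 10_000):
        ratings = [min(10, max(1, random.gauss(TRUE_SENTIMENT, 2))) for _ in range(n)]
        avg = statistics.mean(ratings)
        std_err = statistics.stdev(ratings) / n ** 0.5
        print(f"{n:>6} reviews: average {avg:.2f}, give or take {std_err:.2f}")

With ten reviews the average can easily wander by half a point or more; with ten thousand reviews pooled across sites, the wiggle room all but disappears.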

3. Create aggregate scores from more specific criteria

As mentioned earlier, single-score reviews (where a reviewer assigns one number to an entire movie) are just problematic. For example, I just saw Star Trek Beyond. Some things about the movie were great. Others fell short. If I were to provide my own score, it would probably be a 7.5, but what would that mean to users who read the review?

Would they know that the production design was top-notch? Would they know that I docked the film 2.5 points because the story sagged in the middle and the editing was confusing at times?

"Ratings on the internet are inherently specious, and ratings aggregated from user reviews even more so," says Dickey (and I agree). "To distill a work of art down to a single number, you have to strip out an immense amount of meaning and context."

RT's binary review scoring is the worst offender in this regard. But all of the movie review sites could benefit from letting reviewers score movies based on various criteria, like:

  • Acting
  • Cinematography
  • Suspense
  • Humor
  • Drama
  • Action
  • Music

Individual scores for each criterion could then be rolled up into one composite score. This would provide a neat single number for users who don't care to delve into the nuances of cinema, while those who want a more thorough analysis would have the full breakdown available.
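
As a rough illustration of how that roll-up might work, here's a short sketch. The criteria weights and the sample scores are invented for the example; a real site would need to decide (and publish) its own weighting.

    # Hypothetical composite score built from per-criterion scores on a 1-10 scale.
    # Weights are assumptions for illustration, not any site's actual formula.
    CRITERIA_WEIGHTS = {
        "acting": 0.20, "cinematography": 0.15, "suspense": 0.10, "humor": 0.10,
        "drama": 0.15, "action": 0.15, "music": 0.15,
    }

    def composite_score(scores):
        """Weighted average of whichever criteria the reviewer chose to score."""
        total_weight = sum(CRITERIA_WEIGHTS[c] for c in scores)
        weighted_sum = sum(scores[c] * CRITERIA_WEIGHTS[c] for c in scores)
        return round(weighted_sum / total_weight, 1)

    # A review that loved the visuals but found the story and humor lacking:
    print(composite_score({"acting": 9, "cinematography": 9, "action": 7, "humor": 5}))  # 7.8

Users in a hurry see the single number; users who care can expand it into the per-criterion breakdown.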

4. Monitor for trolls


They're everywhere. No matter how much a movie is showered with praise by 99% of critics and moviegoers, there is always that person who will find a reason to hate the movie. Hickey even claims that male trolls purposely brought down the female-led Ghostbusters' score before the movie even hit theaters.

This is the dark side of the democracy that is consumer reviews. Free speech gives everyone a voice, even those people who are mean and perpetually unhappy. Unfortunately, instead of fostering a healthy review system, these trolls endanger it.

It behooves movie review sites, then, to put better monitoring and controls in place. If a reviewer consistently scores films low or has complaints filed against them by other reviewers, sites should take action, pushing potential trolls to become contributing citizens or banning them from participating.
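
What might that monitoring look like in practice? Here's a minimal sketch of one possible heuristic; the thresholds, the fields on the reviewer record, and the idea of comparing against a site-wide average are all assumptions, not anything RT or IMDb has described.

    # Flag reviewers whose history looks troll-like, for human moderators to review.
    from dataclasses import dataclass

    @dataclass
    class Reviewer:
        username: str
        ratings: list          # this reviewer's scores, on a 1-10 scale
        complaints: int        # complaints filed against them by other users

    def looks_like_troll(reviewer, site_average=6.5):
        if len(reviewer.ratings) < 10:
            return False       # too little history to judge
        personal_average = sum(reviewer.ratings) / len(reviewer.ratings)
        # Flag people whose scores sit far below the site-wide average,
        # or who rack up an unusual number of complaints.
        return (site_average - personal_average) > 3.0 or reviewer.complaints >= 5

    suspect = Reviewer("angry_moviegoer", [1, 1, 2, 1, 1, 2, 1, 1, 1, 2], 3)
    print(looks_like_troll(suspect))   # True: their average is about 1.3

A flag shouldn't be an automatic ban; it's a prompt for moderators to step in, warn, and only then remove repeat offenders.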

5. Customize scores for users


Much of Hickey's criticism of movie review sites revolves around the discrepancies between male reviews and female reviews. He shows that female-centric movies are often pounded by male reviewers, who tend to outnumber female reviewers, using these stats for Ghostbusters as an example:

  • Male reviewers of the film on IMDb outnumbered female reviewers almost five to one.
  • The average IMDb user rating for the film was 4.1 out of 10, which is pretty awful and puts it in line with B-movie fare like Fifty Shades of Grey and Stop! Or My Mom Will Shoot.
  • The average user rating amongst female reviewers was a strong 7.7. This would put Ghostbusters on par with Oscar-winners like Titanic, Argo, and The Blind Side.
  • Amongst male reviewers, the rating was a sad 3.6. This puts the film in the same class as Jaws 3-D and Superman IV: The Quest for Peace.

So which score was closest to the truth? The right answer, the one that Hickey might hate, is that both the male scores and female scores are legitimate. They each represent the distinct views of their demographics, which, in the case of a female-skewed picture like Ghostbusters, is bound to turn off male reviewers and hit all the right notes with female reviewers.

In trying to make improvements to movie review sites, it's futile to argue that reviewers' viewpoints or preferences are unfair. There's nothing you can do about taste or the tendencies of demographic groups.

And perhaps, taken in light of what people really expect from movie review sites, this argument is irrelevant. What do users expect? They expect information that helps them find movies they will love and avoid movies they won't. They don't need to know how a movie plays for the whole human race; they just need to know how it would play for someone like them.

For people interested in Ghostbusters, it should be clear that women are much more likely than men to enjoy it.

Instead of kicking against the pricks of human nature, movie review sites should take up the challenge of customizing their review scores by demographic and past browsing history. Retail sites have already figured out how to do this with a great deal of success.
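
As a sketch of the simplest possible version of this, here's what a demographic-aware score lookup could look like. The Ghostbusters numbers echo the IMDb figures quoted above; the data structure and fallback logic are my own assumptions.

    # Hypothetical demographic-aware scoring: show users the average rating from
    # reviewers most like them, rather than one blended number for everyone.
    RATINGS_BY_DEMOGRAPHIC = {
        "ghostbusters_2016": {"female": 7.7, "male": 3.6},
    }

    def personalized_score(movie_id, user_demographic):
        """Return the average from matching reviewers, or a blend if no match."""
        by_group = RATINGS_BY_DEMOGRAPHIC[movie_id]
        if user_demographic in by_group:
            return by_group[user_demographic]
        return round(sum(by_group.values()) / len(by_group), 1)

    print(personalized_score("ghostbusters_2016", "female"))   # 7.7
    print(personalized_score("ghostbusters_2016", "male"))     # 3.6

A real implementation would fold in more signals (age, past ratings, browsing history), but even this crude split tells a Ghostbusters-curious user more than the blended 4.1 does.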

Up For Review

Could RT and other movie review sites just continue to coast on their fame? Certainly. The voices currently criticizing their methods are still a tiny minority. But if movie review sites understand the influence they hold, they would do well to take their game to the next level and truly become the authorities of media recommendation.

Moreover, other types of review sites can learn some valuable lessons from the foibles of movie review sites. They should make the systems they use to score or evaluate companies as transparent and consistent as possible. And they should seek to make that information as customized to the individual consumer as possible.

