In the Battle of Memory vs. Movie Website, Netflix is Still the Champ


On Monday I posed the question: is your memory of a movie you’ve already seen the best predictor of “really like” movies? Based on Monday’s analysis, memory certainly comes out on top against IMDB and Rotten Tomatoes. Today I’m extending the analysis to Criticker, Movielens, and Netflix. By reconfiguring the data used in Monday’s post, you can also measure the relative effectiveness of each site. For example, let’s look again at IMDB.

Probability I Will “Really Like” Based on IMDB Recommendation
                    Recommended   Not Recommended   Point Spread
Seen Before            80.1%          69.2%          11 points
Never Seen Before      50.6%          33.6%          17 points

It’s not surprising that the probabilities are higher for movies seen before. After all, it wouldn’t make sense to rewatch the movies you wished you hadn’t seen the first time. But by looking at the gap between the probability for a recommended movie and for a non-recommended movie, you begin to see how effective each recommender is at sorting high-probability movies from low-probability ones. In this instance, the small 11-point spread for Seen Before movies suggests that IMDB only sorts these movies into small departures from average. The low probabilities for the Never Seen Before movies suggest that, without the benefit of the memory of a movie seen before, IMDB doesn’t do a very good job of identifying “really like” movies.
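The spread calculation is simple enough to sketch in a few lines of Python. The figures are the ones quoted in the IMDB table above; the function name is my own illustration, not anything from the websites discussed here.

```python
# Point spread between the "really like" probability of recommended and
# not-recommended movies, in percentage points.

def point_spread(p_recommended, p_not_recommended):
    """Return the gap between two probabilities in whole percentage points."""
    return round((p_recommended - p_not_recommended) * 100)

# IMDB figures from the table above
imdb_seen = point_spread(0.801, 0.692)        # 11 points
imdb_never_seen = point_spread(0.506, 0.336)  # 17 points
print(imdb_seen, imdb_never_seen)
```

The wider the spread, the better a site separates movies you will “really like” from movies you won’t.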

Rotten Tomatoes follows a similar pattern.

Probability I Will “Really Like” Based on Rotten Tomatoes Recommendation
                    Recommended   Not Recommended   Point Spread
Seen Before            80.5%          65.1%          15 points
Never Seen Before      49.8%          31.8%          18 points

Rotten Tomatoes is a little better than IMDB at sorting movies. The point spreads are a little broader. But, like IMDB, Rotten Tomatoes doesn’t effectively identify “really like” movies for the Never Seen Before group.

Theoretically, when we look at the same data for the remaining three sites, the Percentage Point Spread should be broader to reflect the more personalized nature of the ratings. Certainly, that is the case with Criticker.

Probability I Will “Really Like” Based on Criticker Recommendation
                    Recommended   Not Recommended   Point Spread
Seen Before            79.3%          56.4%          23 points
Never Seen Before      45.3%          18.9%          26 points

Like IMDB and Rotten Tomatoes, though, Criticker isn’t very effective at identifying “really like” movies for those movies in the Never Seen Before group.

When you review the results for Movielens, you can begin to see why I’m so high on it as a movie recommender.

Probability I Will “Really Like” Based on Movielens Recommendation
                    Recommended   Not Recommended   Point Spread
Seen Before            86.6%          59.6%          27 points
Never Seen Before      65.1%          22.3%          43 points

Unlike the three sites we’ve looked at so far, Movielens is a good predictor of “really like” movies for Never Seen Before movies. And, the spread of 43 points for the Never Seen Before movies is dramatically better than the three previous sites. It is a very effective sorter of movies.

Last, but certainly not least, here are the results for Netflix.

Probability I Will “Really Like” Based on Netflix Recommendation
                    Recommended   Not Recommended   Point Spread
Seen Before            89.8%          45.7%          44 points
Never Seen Before      65.7%          21.4%          44 points

What jumps off the page is that, for Netflix, there is no memory advantage. As expected, the Seen Before probabilities are higher. But the gap is an identical 44 points for Seen Before movies and movies Never Seen Before. Netflix is also the only site where you have a less than 50% chance that you will “really like” a movie you’ve already seen if it doesn’t recommend it.

“If memory serves me correctly, I ‘really liked’ this movie the first time I saw it.” That is an instinct worth following even if the movie websites suggest otherwise. But if Netflix doesn’t recommend it, you might think twice.

***

6/24/2016 Addendum

I’ve finalized my forecast for the last three movies on my June Prospect list. My optimism is turning to pessimism regarding my hopes that Independence Day: Resurgence and Free State of Jones would be “really like” movies. Unfavorable reviews from the critics and a less-than-enthusiastic response from audiences suggest that they could be disappointments. Of my five June prospects, Finding Dory seems to be the only safe bet for theater viewing, with Me Before You a possibility for female moviegoers. The IMDB gender split is pronounced for Me Before You, with female voters giving it an 8.1 rating and males a 7.3 rating. It is also one of those rare movies with more female IMDB voters than male.

When It Comes to Movie Rating Websites, There is Strength in Numbers.


If you can only use one website to help you select movies that you will “really like”, which should you choose? That’s a tougher question than you might think. Because I have used all five of the websites recommended here to select movies to watch, my data has been heavily influenced by their synergy. I have no data to suggest how effective using only one site would be. Here’s what I do have:

Probability I Will “Really Like”

Site              Recommendation Standard   With Other Sites   This Site Only
MovieLens         > 3.73                    70.2%              2.8%
Netflix           > 3.8                     69.9%              8.4%
Criticker         > 76                      66.4%              10.1%
IMDB              > 7.4                     64.1%              0.3%
Rotten Tomatoes   Certified Fresh           62.7%              4.3%

When MovieLens recommends a movie in combination with other websites, it produces the highest probability. When Criticker recommends a movie but the other four sites don’t, Criticker has the highest probability. Netflix is second in both groups. Which one is best is unclear. What is clear is that the three sites that recommend movies based on your personal taste in movies, MovieLens, Netflix, & Criticker, outperform the two sites that are based on third-party feedback, Rotten Tomatoes and IMDB. When Netflix, MovieLens, & Criticker recommend the same movie, there is an 89.9% chance I’ll “really like” it. When both IMDB & Rotten Tomatoes recommend the same movie, the probability is 75.8% I’ll “really like” it.

What also is clear is that if four websites are recommending that you don’t watch a movie and one is recommending that you do, the probability is that you won’t “really like” the movie no matter how good that one website is overall. The progression of probabilities in the example below gives some perspective of how combining websites works:

Websites Recommending a Movie               Probability I Will “Really Like”
None                                        3.9%
Netflix Only                                8.4%
Netflix & MovieLens Only                    31.9%
Netflix, MovieLens, & Criticker Only        50.9%
Netflix, MovieLens, Criticker & IMDB Only   71.1%
All Five                                    96.6%

Stated simply, your odds increase with each website that recommends a particular movie. If, for example, you were to use only Netflix for your movie recommendations, the probability of “really liking” a recommended movie might be 69.9%, but in reality it could be any of the probabilities in the table above except the 3.9% for no recommendations. You just wouldn’t know whether other websites had also recommended the movie.
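One way to picture how the combinations work is as a simple lookup keyed by the set of sites recommending a movie. The site names and probabilities come from the table above; the data structure is my own illustration, not how any of these websites actually work.

```python
# Probability of a "really like" outcome keyed by which sites recommend
# the movie, using the progression quoted in the table above.

combo_probability = {
    frozenset(): 0.039,
    frozenset({"Netflix"}): 0.084,
    frozenset({"Netflix", "MovieLens"}): 0.319,
    frozenset({"Netflix", "MovieLens", "Criticker"}): 0.509,
    frozenset({"Netflix", "MovieLens", "Criticker", "IMDB"}): 0.711,
    frozenset({"Netflix", "MovieLens", "Criticker", "IMDB",
               "Rotten Tomatoes"}): 0.966,
}

def really_like_probability(recommending_sites):
    """Look up the observed probability for a given set of recommenders."""
    return combo_probability[frozenset(recommending_sites)]

print(really_like_probability({"Netflix", "MovieLens"}))  # 0.319
```

Using a frozenset as the key makes the order in which you name the sites irrelevant, which matches the idea that it is the combination, not any single site, that drives the probability.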

So, if I had to choose one website, I’d choose Netflix-DVD if I were one of their 5,000,000 DVD subscribers. If I’m not already a subscriber I’d go with MovieLens. It would be a reluctant recommendation, though, because the strength in numbers provided by using multiple websites is just so compelling.

***

You’ll notice in the Top Ten Movies Available to Watch This Week that a number of movies on the list are available on Starz. I’m taking advantage of Comcast Watchathon Week, which provides free Starz, HBO, & Cinemax. Some of my highly rated movies that would ordinarily be unavailable are available for the short duration of this promotion. Bonus movies. Wahoo!!


Who is Your Favorite Actor or Actress? Criticker Knows.

Who is your favorite actor or actress? If you can’t wait for the next Leonardo DiCaprio or Jennifer Lawrence movie, does that make them your favorite actors? If you have rated on Criticker every movie you’ve ever seen, or in my case every movie seen in the last 15 years, the answer to these questions is just a click away.

Criticker has a number of neat tools on its website. One of my favorites is its Filmmaker List, which can be found by clicking the Explore button that appears along the top banner. You can rank Actors, as well as Directors or Screenwriters, using a variety of criteria. I like to rank actors based on the average rating I’ve given the movies they’ve appeared in. Once you’ve ranked them, you can click on a name and see which of that person’s movies you’ve seen and which ones you haven’t. You can also set a minimum number of movies for your rankings so that you get the most representative sample your number of ratings will allow.

For example, I have 1,999 movies rated in Criticker. If I set my minimum at 20 movies for Actors and rank them by average score, my top five favorite Actors are:

Actor                    Avg. Score   Movies Seen   Best Movie Not Seen
Tom Hanks                85.88        26            Cloud Atlas
Harrison Ford            83.50        24            The Conversation
Morgan Freeman           82.50        22            The Lego Movie
Philip Seymour Hoffman   81.18        22            Boogie Nights
Samuel L. Jackson        81.00        25            Kingsman: The Secret Service
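The ranking logic behind a Filmmaker List can be sketched in a few lines: group your ratings by actor, drop anyone below the minimum film count, and sort by average score. This is my own reconstruction of the idea, not Criticker’s actual code, and the sample ratings are made up rather than my real scores.

```python
# Sketch of a Filmmaker List-style ranking: average the ratings per actor,
# filter out actors below a minimum number of movies, sort by average.
from collections import defaultdict

def rank_actors(ratings, min_movies):
    """ratings: iterable of (actor, score) pairs.
    Returns a list of (actor, average_score, movie_count), best first."""
    by_actor = defaultdict(list)
    for actor, score in ratings:
        by_actor[actor].append(score)
    ranked = [
        (actor, sum(scores) / len(scores), len(scores))
        for actor, scores in by_actor.items()
        if len(scores) >= min_movies
    ]
    return sorted(ranked, key=lambda row: row[1], reverse=True)

sample = [("Tom Hanks", 90), ("Tom Hanks", 82), ("Jodie Foster", 95)]
print(rank_actors(sample, min_movies=2))  # Jodie Foster filtered out
```

The minimum-count filter is what keeps a single great movie from crowning someone your favorite actor of all time.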

Based on a 15 movie minimum, my favorite Actresses are:

Actress              Avg. Score   Movies Seen   Best Movie Not Seen
Kate Winslet         79.13        15            Hamlet (1996)
Scarlett Johansson   75.52        21            Hail, Caesar!
Judi Dench           74.22        17            Hamlet (1996)
Laura Linney         74.63        16            Mr. Holmes
Natalie Portman      74.35        17            Paris, je t’aime
Meryl Streep         74.28        18            Fantastic Mr. Fox

I included 6 on my Actress list because you just can’t leave Meryl Streep off of any list of the best actresses. Also, the Best Movie Not Seen is based on the highest predicted Criticker score among movies I haven’t seen, or haven’t seen in the last 15 years.

There are a couple of surprises on my lists. Samuel L. Jackson is one. I can’t say that I watch a particular movie because Samuel L. Jackson is in it. His ranking does, however, reflect how many quality movies he’s been in. Scarlett Johansson is another surprise. It’s amazing that I have seen 21 of her movies when she is only 31 years old.

There are favorite actors of mine who didn’t make the list, such as Paul Newman and Jodie Foster. In Paul Newman’s case, he didn’t meet the 20 minimum and his average movie score wasn’t high enough (79.32 in 19 movies). Jodie Foster would have been the highest on my list with an average score of 79.64 but I’ve only seen 11 of her movies, under the 15 movie minimum I set.

When you first go to the Filmmaker List, the default minimum for movies rated is 3. Under that criterion, my favorite actor of all time is Billy Dee Williams (95.67 across 3 movies): “I love Brian Piccolo,” and Lando Calrissian (Star Wars V & VI) as well.


Criticker: Whose Movie Recommendation Do You Trust?

Criticker is not as well known a movie site as Rotten Tomatoes or IMDB. Unlike those better known sites, Criticker evaluates movies based on your taste in movies. More accurately, it estimates the rating that you will probably give a movie based on the ratings of other Criticker users that have the most similar taste in movies to you.

A friend, let’s call him Jack, recommends a movie to you. You watch the movie and it is one of those movie experiences that reminds you why you enjoy watching movies. Another friend, let’s call her Jill, recommends a movie. You watch it and you have to prop up your eyelids with toothpicks to stay awake. If future recommendations from Jack and Jill follow the same pattern, you keep on watching movies recommended by Jack but stop watching movies recommended by Jill. You reach the conclusion that you and Jack have similar taste in movies and you and Jill have different taste in movies. In the end you trust the movie recommendations of Jack because you seem to really like the same movies. This is the basis for the Criticker website movie ratings.

Criticker has created a tool called the TCI (Taste Compatibility Index). It uses the index to identify moviegoers who statistically have the most similar taste in movies to you, and it aggregates the scores from those moviegoers to produce the probable rating, from 1 to 100, that you might give the movie you’re interested in watching.

Here’s the thing. No matter how similar Jack’s taste in movies is to yours, there will be times when Jack recommends a movie that you don’t like. If that happens you may begin to question whether Jack really does have the same taste in movies. If Jack recommended 10 movies to you and you really liked 8 of them, you can’t be sure that you will like 8 of the next 10 movies he recommends. It may be a random event that you like 8 of Jack’s recommendations. It could just as easily have been 5 or 6. If, on the other hand, Jack has recommended 100 movies and you really liked 80 of them, the chances that you will really like 8 of the next 10 movies he recommends are greater. The same is true with Criticker. The more movies that you rate on the website, the more confident you can be of the accuracy of the probable rating that Criticker provides for the movies you are interested in seeing.
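The 8-of-10 versus 80-of-100 intuition can be made concrete with a quick confidence-interval sketch. This is a standard normal approximation for a proportion, my own illustration rather than anything Criticker itself computes.

```python
# Why 80 of 100 is more trustworthy than 8 of 10: the uncertainty band
# around an observed hit rate shrinks as the sample grows.
import math

def hit_rate_interval(hits, n, z=1.96):
    """Approximate 95% confidence interval for a hit rate of hits/n."""
    p = hits / n
    half_width = z * math.sqrt(p * (1 - p) / n)
    return (max(0.0, p - half_width), min(1.0, p + half_width))

small = hit_rate_interval(8, 10)    # roughly (0.55, 1.00)
large = hit_rate_interval(80, 100)  # roughly (0.72, 0.88)
print(small, large)
```

Both friends have an 80% track record, but Jack-with-100-recommendations pins your true agreement rate into a much narrower band than Jack-with-10.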

To get started, use the link at the top of the page to go to the website. Set up an account. It’s free. Then start rating movies you’ve seen. Criticker asks you to rate movies on a 1 to 100 scale. If you ask me, that’s tough to do. For example, what criteria do you use to give one movie an 86 and another movie an 87? Unless you have established criteria to differentiate movies that finely, it’s almost impossible to do without sacrificing consistency in your ratings. In a future post, I’ll outline how I established criteria for a 100-point scale. For now, I would keep your scoring simple by rating movies on a 10-point scale and converting the score to a 100-point scale for Criticker. For example, if you rate a movie 8 out of 10 on IMDB, score it as an 80 for Criticker. If, when you were rating the movie for IMDB, you had difficulty deciding whether it was a 7 or an 8, you can rate it a 75 on Criticker. The important thing is to have a consistent set of scoring rules applied uniformly across all of your movies.
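The conversion rule above amounts to a one-line function. The function name is mine, purely for illustration; the half-step trick handles the “between a 7 and an 8” case.

```python
# Convert a 10-point rating to Criticker's 100-point scale,
# allowing half steps such as 7.5 -> 75 for borderline movies.

def to_criticker(rating_out_of_10):
    """Map a 1-10 rating (half points allowed) onto 10-100."""
    if not 1 <= rating_out_of_10 <= 10:
        raise ValueError("rating must be between 1 and 10")
    return round(rating_out_of_10 * 10)

print(to_criticker(8))    # 80
print(to_criticker(7.5))  # 75
```

Whatever scheme you pick, the point is consistency: the same movie quality should always land on the same number.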

Go ahead and get started. Pretty soon you’ll find that there are many people out there whose movie recommendations you can trust. Just remember that there is no one whose taste is exactly like yours.