Objectively Speaking, What Are The Top Six Movies From 1992 to 1997?

Now, I might admit that a Top Six list from a seemingly random six year period seems a little odd. There is a method to my Movie Madness.

As I’ve mentioned on more than one occasion, I’m building a twenty-five-year movie database from solely objective factors to better identify the movies most of us would “really like”. It’s a time-consuming process. If I’m uninterrupted by other priorities in my life, I can usually add a complete year to the database in a week and a half. There will always be interruptions, though, and I don’t expect to finish my project before mid-year 2018.

I’m a little impatient to get some useful information from my efforts, so I thought it might be fun to create an Objective Best Movie List for however many years I’ve completed. Six years are done, which gives me a list of the six best movies from that time frame. I should complete 1998 by the weekend, and after incorporating the new data into my algorithm I’ll be able to create a Top Seven list. Now that you have the picture, here’s the top six in ascending order.
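For the curious, here’s a rough sketch of how objective factors like these could be blended into a single score. The weights, the 0-to-1 scaling, and the function name are illustrative assumptions on my part, not the actual formula behind these rankings.

```python
# Illustrative only: a toy composite of the objective factors quoted below.
# The weights and scalings are assumptions, not the real ranking formula.

def objective_score(imdb_avg, rt_pct, cinemascore, major_noms, minor_noms):
    """Blend several objective signals into a single 0-1 score."""
    cinemascore_scale = {"A+": 1.0, "A": 0.9, "A-": 0.8, "B+": 0.7, "B": 0.6}
    imdb_part = imdb_avg / 10.0                   # IMDB average, 0-10 scale
    rt_part = rt_pct / 100.0                      # Rotten Tomatoes % fresh
    cs_part = cinemascore_scale.get(cinemascore, 0.5)
    oscar_part = min((2 * major_noms + minor_noms) / 20.0, 1.0)  # capped at 1
    return 0.25 * (imdb_part + rt_part + cs_part + oscar_part)  # equal weights

# Example, using the Schindler's List figures from the list below:
print(round(objective_score(8.9, 96, "A+", 4, 8), 3))  # about 0.91
```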

6. Sense and Sensibility (1995). IMDB Avg. 7.7, Certified Fresh 80%, CinemaScore A, Oscar- 3 Major nominations, 4 Minor

This was the first of a mid-1990s run of Jane Austen titles to make it to the big screen. Emma Thompson won the Oscar for Best Screenplay, making her the only person ever to win Oscars for both acting and screenwriting. The movie is also noteworthy for the breakthrough performance of Kate Winslet, who at age 20 earned the first of her seven Oscar nominations.

5. In the Name of the Father (1994). IMDB Avg. 8.1, Certified Fresh 94%, CinemaScore A, Oscar- 4 Major nominations, 3 Minor

This is the movie that will probably surprise many of you. This biopic of Gerry Conlon, who was wrongly imprisoned for an IRA bombing, was the second of Daniel Day-Lewis’ five Best Actor nominations. He lost 30 pounds in preparation for the role and spent his nights on the set in the prison cell designed for the movie.

4. Good Will Hunting (1997). IMDB Avg. 8.3, Certified Fresh 97%, CinemaScore A, Oscar- 4 Major nominations, 5 Minor

This movie is in my personal top ten. Two relatively unknown actors, Matt Damon and Ben Affleck, became stars overnight and won the Oscar for Best Screenplay as well. If either of them ever gets a Best Actor award, he’ll join Emma Thompson in that select group. In his fourth nominated performance, Robin Williams won his only Oscar, for Best Supporting Actor.

3. Toy Story (1995). IMDB Avg. 8.3, Certified Fresh 100%, CinemaScore A, Oscar- 1 Major nomination, 2 Minor

Toy Story’s ranking is driven by its 100% Fresh Rotten Tomatoes rating from 78 critics. While its Oscar performance is weaker than that of the other movies on the list, it should be noted that Toy Story was the first animated movie ever nominated for Best Screenplay. As the database grows, I expect the number of Oscar nominations and wins to become credible factors in these rankings. For now, receiving one Major and one Minor nomination has the same impact on the algorithm as it does for a movie like Titanic, which won eleven awards. This is probably the only movie of the six that appears out of place in the rankings.

2. Shawshank Redemption (1994). IMDB Avg. 9.3, Certified Fresh 91%, CinemaScore A, Oscar- 3 Major nominations, 4 Minor

Shawshank still ranks as IMDB’s top movie of all time. At some point, I’m going to write an article about movies that achieve cult status after only modest success at the box office; Shawshank would be one of those movies. After a pedestrian $28,341,469 domestic box-office gross, it became one of the highest-grossing video rentals of all time.

1. Schindler’s List (1994). IMDB Avg. 8.9, Certified Fresh 96%, CinemaScore A+, Oscar- 4 Major nominations, 8 Minor

Interestingly, this is the only movie on the list to win Best Picture. It is also the only one to earn an A+ from CinemaScore. Combine that with its twelve Oscar nominations and you can see why, objectively, it sits at the top of the list.

Objectivity improves as the data grows, and it should be fun to watch this list change as the database expands.

What do you think?


If You Want to Watch “Really Like” Movies, Don’t Count on IMDB.

Today’s post is for those of you who want to get your “geek” on. As regular readers of these pages are aware, IMDB is the least reliable indicator of whether I will “really like” a given movie. As you might also be aware, I am constantly making adjustments to my forecasting algorithm for “really like” movies. I follow the practice of establishing probabilities for the movies in my database, measuring how effective those probabilities are at selecting “really like” movies, and revising the model to improve on the results. When that’s done, I start the process all over, which brings me back to IMDB, the focus of today’s study.

My first step in measuring the effectiveness of IMDB at selecting “really like” movies is to rank the movies in the database by IMDB average rating and then divide the movies into ten groups of the same size. Here are my results:

IMDB Avg Rating Range    # of Movies    Probability I Will “Really Like”
> 8.1                        198                 64.6%
7.8 to 8.1                   198                 60.6%
7.7 to 7.8                   198                 64.6%
7.5 to 7.7                   198                 58.6%
7.4 to 7.5                   198                 55.1%
7.2 to 7.4                   198                 52.5%
7.0 to 7.2                   198                 42.4%
6.8 to 7.0                   198                 39.4%
6.5 to 6.8                   198                 35.4%
< 6.5                        197                 11.7%
All Movies                 1,979                 48.5%
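If you’d like to reproduce this kind of breakdown yourself, here’s a rough Python sketch. The data layout, a list of (rating, “really liked”) pairs, and the randomly generated sample are hypothetical stand-ins for my database.

```python
# A sketch of the decile analysis above: rank by IMDB average, split into
# ten equal groups, and report the share of "really like" movies in each.
import random

def decile_table(movies, groups=10):
    """movies: list of (imdb_avg, really_liked) pairs."""
    ranked = sorted(movies, key=lambda m: m[0], reverse=True)
    size = len(ranked) // groups
    rows = []
    for i in range(groups):
        # The last group absorbs any remainder (hence 197 vs. 198 above).
        chunk = ranked[i * size:] if i == groups - 1 else ranked[i * size:(i + 1) * size]
        liked = sum(1 for _, really_liked in chunk if really_liked)
        rows.append((chunk[0][0], chunk[-1][0], len(chunk), liked / len(chunk)))
    return rows

# Toy data: 1,979 movies whose "really like" odds rise with the rating.
random.seed(1)
sample = []
for _ in range(1979):
    rating = round(random.uniform(5.0, 9.3), 1)
    sample.append((rating, random.random() < (rating - 5.0) / 4.3))

for top, bottom, n, p in decile_table(sample):
    print(f"{bottom:.1f} to {top:.1f}  {n:4d}  {p:6.1%}")
```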

There seems to be a correlation between IMDB rating and the probability of “really like” movies in the group. The problem is that the results suggest IMDB does a better job identifying movies you won’t “really like” than movies you will. For example, when I went through the same exercise for Netflix and Movielens, the probabilities for the top 10% of the ratings were over 90% for each site, compared to 64.6% for IMDB.

With the graph displayed here, you can begin to picture the problem.

[Figure: IMDB Rating Graph]

The curve peaks at 7.4. On the low side, it looks more like a classic bell curve, with enough spread in the ratings to create significant probability differences between groups. On the high side, the highest-rated movie, Shawshank Redemption, sits at 9.2. The range between 7.4 and 9.2 is too narrow to create the kind of probability differences that would make IMDB a good predictor of “really like” movies. IMDB would probably work as a predictor if voters rated an average movie a 5.0; instead, an average movie lands in the low 7s.

So, what is a good average IMDB rating to use for “really like” movies? Let’s simplify the data from above:

IMDB Avg Rating Range    # of Movies    Probability I Will “Really Like”
> 7.7                        636                 62.7%
7.3 to 7.6                   502                 55.4%
< 7.2                        841                 33.7%
All Movies                 1,979                 48.5%

If we want to incrementally improve IMDB as a predictor of “really like” movies, we might set the bar at movies rated 7.7 or higher. I’m inclined to go in the opposite direction and use what IMDB does best: identifying which movies have a high probability of not being “really like” movies. By setting the IMDB recommendation threshold at 7.3, we identify better-than-average movies and rely on the other recommender websites to single out the “really like” movies.
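In code, that negative screen is about as simple as it sounds. A quick sketch, using the 7.3 floor from the table above and hypothetical titles:

```python
# Use IMDB only to rule movies out; everything that survives the floor
# goes on to the other recommender sites for "really like" screening.
IMDB_FLOOR = 7.3  # threshold from the simplified table above

def imdb_screen(candidates):
    """candidates: list of (title, imdb_avg) pairs."""
    return [title for title, imdb_avg in candidates if imdb_avg >= IMDB_FLOOR]

watchlist = [("Good Will Hunting", 8.3), ("Some Average Movie", 6.9)]
print(imdb_screen(watchlist))  # ['Good Will Hunting']
```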

IMDB is one of the most utilized movie sites in the world. It has a tremendous amount of useful information. But if you want to select movies that you will “really like”, don’t count on IMDB.