Criticker: Whose Movie Recommendations Do You Trust?

Criticker is not as well known a movie site as Rotten Tomatoes or IMDB. Unlike those better known sites, Criticker evaluates movies based on your taste in movies. More accurately, it estimates the rating that you will probably give a movie based on the ratings of other Criticker users that have the most similar taste in movies to you.

A friend, let’s call him Jack, recommends a movie to you. You watch the movie and it is one of those movie experiences that reminds you why you enjoy watching movies. Another friend, let’s call her Jill, recommends a movie. You watch it and you have to prop up your eyelids with toothpicks to stay awake. If future recommendations from Jack and Jill follow the same pattern, you keep on watching movies recommended by Jack but stop watching movies recommended by Jill. You reach the conclusion that you and Jack have similar taste in movies and you and Jill have different taste in movies. In the end you trust the movie recommendations of Jack because you seem to really like the same movies. This is the basis for the Criticker website movie ratings.

Criticker has created a tool called the TCI (Taste Compatibility Index). It uses the index to identify moviegoers who statistically have the most similar taste in movies to you and aggregates the scores from those moviegoers to produce the probable rating, from 1 to 100, that you might give the movie you’re interested in watching.

Here’s the thing. No matter how similar Jack’s taste in movies is to yours, there will be times when Jack recommends a movie that you don’t like. If that happens you may begin to question whether Jack really does have the same taste in movies. If Jack recommended 10 movies to you and you really liked 8 of them, you can’t be sure that you will like 8 of the next 10 movies he recommends. It may be a random event that you like 8 of Jack’s recommendations. It could just as easily have been 5 or 6. If, on the other hand, Jack has recommended 100 movies and you really liked 80 of them, the chances that you will really like 8 of the next 10 movies he recommends are greater. The same is true with Criticker. The more movies that you rate on the website, the more confident you can be of the accuracy of the probable rating that Criticker provides for the movies you are interested in seeing.
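
To put some numbers on that intuition, here’s a quick back-of-the-envelope confidence interval calculation (a standard normal approximation, nothing specific to Criticker’s actual math):

```python
import math

def approx_interval(liked, total, z=1.96):
    """95% confidence interval (normal approximation) for the true
    'really like' rate, given liked-out-of-total recommendations."""
    p = liked / total
    se = math.sqrt(p * (1 - p) / total)
    return (max(0.0, p - z * se), min(1.0, p + z * se))

# 8 of 10 liked: the interval is wide (roughly 0.55 to 1.00)
print(approx_interval(8, 10))
# 80 of 100 liked: same rate, much tighter (roughly 0.72 to 0.88)
print(approx_interval(80, 100))
```

Same 80% hit rate, but the 100-recommendation sample pins it down far more tightly – which is exactly why rating more movies makes Criticker’s estimates more trustworthy.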

To get started, use the link at the top of the page to go to the website. Set up an account. It’s free. Then start rating movies that you’ve seen. Criticker asks you to rate movies on a 1 to 100 scale. If you ask me, that’s tough to do. For example, what criteria do you use to give one movie an 86 and another movie an 87? Unless you have established criteria to differentiate movies that finely, it’s almost impossible to do without sacrificing consistency in your ratings. In a future post, I’ll outline how I established criteria for a 100 point scale. For now, I would keep your scoring simple by rating movies on a 10 point scale and converting the score to a 100 point scale for Criticker. For example, if you rate a movie 8 out of 10 on IMDB, score it as an 80 for Criticker. If, when you were rating the movie for IMDB, you had difficulty deciding whether it was a 7 or an 8, you can rate it a 75 on Criticker. The important thing is to have a consistent set of scoring rules that are applied uniformly across all of your movies.
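
For those who like to see the rule written down, the conversion I’m describing amounts to a one-liner (the function name and the half-point option are just my shorthand):

```python
def to_criticker(score10, half_point=False):
    """Map a 1-10 rating onto Criticker's 1-100 scale.
    half_point=True splits the difference for 'is it a 7 or an 8?'
    movies, turning a 7 into a 75."""
    return score10 * 10 + (5 if half_point else 0)

print(to_criticker(8))                    # 80
print(to_criticker(7, half_point=True))   # 75
```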

Go ahead and get started. Pretty soon you’ll find that there are many people out there whose movie recommendations you can trust. Just remember that there is no one whose taste is exactly like yours.

What Movie Are You?


This past weekend I watched Saturday Night Fever for the fourth time. Roger Ebert mentions in his Great Movies review of the film that it was Gene Siskel’s favorite movie of all-time, having seen it 17 times. I’m in the Siskel camp. It is one of my favorite movies of all-time as well. I watched it for the first time in a Chicago area theater when it first came out in 1977. I was in the first year of my new job, the first of 35 successful years with the same company. I was within a year of meeting my future wife, married 36+ years and still going strong. And, a little less than two years prior, I had left the middle-class New England town I grew up in and moved to the Chicago area. As it turned out, it was that momentous decision that shaped my entire adult life.

When I mention to others that Saturday Night Fever is a favorite of mine, a typical reaction is “I hate disco”. It is so much more than a disco movie. Disco is just its milieu. It is a movie about dreams and the barriers that get in the way of realizing those dreams. It is about being stuck in your current existence and coming to the realization that you won’t like the consequences of staying stuck. It is about breaking away and giving yourself a chance.

As I watched Saturday Night Fever that first time, I began to identify with the movie. I identified with Tony Manero’s yearning to create a bigger footprint in his life than he could in his Bay Ridge neighborhood. I recognized the emotional traps that were holding him back from pursuing his dream. I felt his relief when he finally decided to make the move to Manhattan, even though he had no job to go to. I was Saturday Night Fever without, of course, the disco dance king lifestyle.

In the next series of Posts, I will introduce movie recommender sites that try to answer the question “What Movie Are You” based on the movies that you “really like”. No site can identify all of the deep down personal reasons why a movie connects with you. Under my system, for example, there was only a 28.2% chance that I would “really like” Saturday Night Fever. But the movies that you do “really like” identify the types of movies that draw you in, and these sites effectively select quality movies within genres you enjoy watching. The sites are all different, using a variety of assumptions and methodologies. They are all just waiting for you to start rating the movies you’ve seen, both good and bad, so that they can get to know you.

In the meantime, consider sharing a comment on your reaction to this Post. Are there any movies that connect with you on a personal level? What Movie Are You?

Is There Something Rotten (Tomatoes) in Denmark?


With apologies to William Shakespeare and Hamlet, do corporate profit incentives have a corrupting influence on movie recommender websites? Movie ratings have become big business. Amazon bought IMDB in 1998 to promote Amazon products. There appears to be a synergy between the two that doesn’t seem to impact IMDB’s rating system. Netflix, on the other hand, began as a DVD mail-order business and is today a very different company. Netflix has become heavily invested in original entertainment content for its online streaming business and is using a recommender algorithm for that business that is different from its gold-standard algorithm used for the DVD business. Does the Netflix algorithm for its online streaming business better serve the interest of Netflix subscribers or Netflix profits? I’m sure Netflix would say that it serves both. I’m not so sure. This will be a topic of interest for me in future posts. The more immediate concern is Rotten Tomatoes.

It was announced on Feb. 17, 2016 that Rotten Tomatoes, along with the movie discovery site Flixster, was sold to Fandango. For those of you who are not familiar with Fandango, it is one of the two major online advance movie ticket sales sites; MovieTickets.com is the other. For a premium added to your ticket price, Fandango allows you to buy tickets in advance and print them at home, avoiding the big lines at the theater.

So, why should we be concerned? Let’s start with the perception that Rotten Tomatoes has become so influential that it makes or breaks movies before they are even released. Here are a couple of articles that express the growing concern film-makers have with Rotten Tomatoes scores: Rotten Tomatoes: One Filmmaker’s Critical Conundrum and Summer Box Office: How Movie Tracking Went Off the Rails. Whether it is true or not, the movie industry believes that the box office success or failure of a film is in the hands of 200 or so critics and the website that aggregates the results, Rotten Tomatoes.

This impact that Rotten Tomatoes has on the box office each week may be a driving force behind Fandango’s acquisition. In CNN Money’s article announcing the purchase, Fandango President Paul Yanover states, “Flixster and Rotten Tomatoes are invaluable resources for movie fans, and we look forward to growing these successful properties, driving more theatrical ticketing and super-serving consumers with all their movie needs.” Fandango makes money when more people go to the movies, particularly on opening weekends for well-reviewed movies, when lines are expected to be long. Rotten Tomatoes’ Certified Fresh designations drive opening weekend long lines. Logically, Fandango’s business interests would be better served by even more movies earning the Certified Fresh rating.

Am I being too cynical? Well, according to a study by Nate Silver’s FiveThirtyEight site, Fandango has done this before. FiveThirtyEight found that Fandango used some creative rounding to inflate its own movie ratings in the past. Has Fandango learned its lesson? It claims that Rotten Tomatoes will maintain its independence within the corporate structure. Maybe, but from my experience, corporate acquisitions are made to create profitable synergies – more Certified Fresh ratings, more moviegoers, more long lines for tickets, more “theatrical ticketing” in advance, more profits.

If you begin to “really like” fewer movies that are Certified Fresh on Rotten Tomatoes you might conclude that there may be something Rotten (Tomatoes) in Fandango…if not in Denmark.

 

Rotten Tomatoes, IMDB and the Wisdom of Crowds

In the Introduction of James Surowiecki’s The Wisdom of Crowds, the author writes that “under the right circumstances, groups are remarkably intelligent, and are often smarter than the smartest people in them”. This prescient book, written in 2004, was describing the crowd-sourcing, data driven world that we live in today. If you want information, you type a couple of words into Google and you find exactly what you were looking for on the first page of links. If you are visiting a new city and you’re looking for a good restaurant, you check Yelp to identify the highest rated restaurants. And, if you want to go to the movies, you check Rotten Tomatoes and IMDB to see which of the movies you are considering is the highest rated.

The “right circumstances” for groups to be intelligent, according to Surowiecki, are that the group has to be big enough and diverse, and that individual decisions within the group need to be made independently. Rotten Tomatoes is independent enough: most of the critic reviews are written prior to the release of the movie, without knowledge of how other critics are rating it. Diversity is an interesting question. They are all movie critics, after all, and most of them are men. Still, they certainly bring a diverse set of life experiences. So, diversity isn’t optimal but still exists. The biggest question mark is whether the group is big enough. Star Wars: The Force Awakens is the most reviewed movie I’ve come across on Rotten Tomatoes, with a little more than 335 critic reviews counted in the rating. My database average is 104 reviews. That is not a big sample size for statistical analysis. While, logically, movies rated Certified Fresh 95% should be better than Certified Fresh 75% movies, my data doesn’t support that.

            “Really Like”   Don’t “Really Like”   Total   % “Really Like”
CF > 88%         284                155             439        64.7%
CF < 88%         283                154             437        64.8%

There is virtually no difference between movies rated higher than Certified Fresh 88% and those less than Certified Fresh 88%. On the other hand, when you just look at Certified Fresh vs. Fresh vs. Rotten movies, the group allocates the movies intelligently.

     “Really Like”   Don’t “Really Like”   Total   % of Total Database   % “Really Like”
CF        567                309             876          44.6%               64.7%
F         324                399             723          36.9%               44.8%
R          91                272             363          18.5%               25.1%

It turns out that crowds of critics are pretty smart.

IMDB certainly meets the criteria for an intelligent group. It is big enough; Star Wars: The Force Awakens has over 450,000 votes, for example. While not as diverse demographically as one might like, it is much more diverse than a crowd of critics. And, moviegoers who vote on IMDB cast their vote independently (how influenced they are by other ratings is a subject for another day). When I rank the movies in my database by Avg. IMDB Rating and allocate them in groups identical to the Rotten Tomatoes table, you get the following results:

Avg. IMDB Rating   “Really Like”   Don’t “Really Like”   Total   % of Total Database   % “Really Like”
> 7.4                   552                324             876          44.6%               63.0%
6.7 to 7.4              361                362             723          36.9%               49.9%
< 6.7                    69                294             363          18.5%               19.0%

Crowds of moviegoers are pretty smart as well.

Let’s go one step further. What would these results look like for movies that Rotten Tomatoes rated Certified Fresh and IMDB rated 7.4 or higher:

“Really Like”   Don’t “Really Like”   Total   % of Total Database   % “Really Like”
     370                156             526          26.8%               70.3%

How about if Rotten Tomatoes rated the movie Rotten and IMDB had an average rating of 6.7 or less:

“Really Like”   Don’t “Really Like”   Total   % of Total Database   % “Really Like”
      24                193             217          11.1%               11.1%

This is the basis for my rating system. When you combine movie recommender systems together, you improve your chances of selecting movies that you will “really like” and avoiding movies you won’t “really like”. It turns out that crowds of critics and moviegoers are the wisest crowds of all.
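
The lift from combining the two crowds is easy to verify from the tables above:

```python
# Counts taken from the tables above (my 15-year movie database).
def rate(liked, total):
    """Percentage of movies 'really liked' within a group, one decimal."""
    return round(100 * liked / total, 1)

print(rate(567, 876))  # Certified Fresh alone: 64.7% "really like"
print(rate(552, 876))  # IMDB rating above 7.4 alone: 63.0%
print(rate(370, 526))  # both signals positive: 70.3%
print(rate(24, 217))   # both signals negative: 11.1%
```

Requiring both crowds to agree moves the “really like” rate from the mid-60s up to 70.3%, and agreement on the downside drops it all the way to 11.1%.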

 

Rotten Tomatoes: The Critics Aren’t Always Right, but Collectively They’re Not Often Wrong

I lived in Chicago from 1976 to 1980. During that time I discovered a little show on WTTW, the local PBS channel, called Sneak Previews. In the show, a couple of local film critics showed clips from recent movies and each gave their individual review of each movie. Those film critics, Gene Siskel and Roger Ebert, were in the early years of a show that, through more than 35 years, would go through a number of name changes, would eventually be syndicated to a nationwide audience, and would endure contract disputes, the death of Siskel, and the serious illness of its other originator, Ebert. People across the nation tuned in to find out if a movie they were thinking of seeing would get “two thumbs up”. Like Roman emperors at the coliseum, Siskel & Ebert could decide the box office fate of a movie with a thumbs up or a thumbs down. As a viewer, if a movie got a “two thumbs up” it landed on my mental list of movies I’d consider watching. If it got a “two thumbs down” it landed on my “don’t waste my time watching” list. But Siskel & Ebert were competitors from rival Chicago newspapers, and, not surprisingly, they didn’t always agree about a movie. Some movies got a split decision: Siskel would give a “thumbs up” and Ebert a “thumbs down”, or vice versa. This left me in the quandary of having to choose which critic to put my faith in since there was no consensus opinion.

This brings me to Rotten Tomatoes. With no disrespect intended to Siskel & Ebert, or any other critic, Rotten Tomatoes is the concept of “two thumbs up” on steroids. The website aggregates the opinions of critics from around the globe. Instead of giving a “thumbs up” or a “thumbs down”, critics label a movie as “Fresh” or “Rotten”. Instead of two critics, a widely distributed movie might garner up to 300 critic reviews. Rotten Tomatoes includes reviews only from critics who have been certified by film critic associations or writing guilds. In addition, they designate some of those critics as “top critics”, well-respected critics writing for newspapers or national magazines. In fact, Roger Ebert was one of those “top critics” before his death. If a given movie has been reviewed by at least 40 critics, including at least 5 “top critics”, and 75% of those critics designate the movie as “Fresh”, then the movie earns Rotten Tomatoes’ top designation of “Certified Fresh”. If less than 60% of the critics rate the movie as “Fresh”, then the movie is designated as “Rotten”. Movies in between, for the most part, are designated as “Fresh”.
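
Those rules amount to a simple classification. Here’s a sketch based on the criteria described above – a simplification for illustration, not Rotten Tomatoes’ actual code:

```python
def rt_designation(total_reviews, top_critic_reviews, pct_fresh):
    """Classify a movie under the Rotten Tomatoes rules described
    above (a simplified sketch of the published criteria)."""
    # Certified Fresh requires enough reviews AND a high Fresh percentage.
    if total_reviews >= 40 and top_critic_reviews >= 5 and pct_fresh >= 75:
        return "Certified Fresh"
    # Below 60% Fresh is Rotten.
    if pct_fresh < 60:
        return "Rotten"
    # Everything in between is plain Fresh.
    return "Fresh"

print(rt_designation(120, 12, 91))  # Certified Fresh
print(rt_designation(80, 10, 55))   # Rotten
print(rt_designation(30, 3, 82))    # Fresh -- too few reviews to certify
```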

I have a lot of respect for film critics. All of the other movie recommender websites that I use rely on feedback from moviegoers after they’ve seen the movie. Movie critics form their opinion, most of the time, before the movie has been released to the general public. They don’t know whether it will be a blockbuster at the box office or a flop. They rely on their expertise without the benefit of feedback from the viewing public. In my next article, I’ll get into how effective Rotten Tomatoes has been in leading me to movies that I “really like”. For now, I’ll just say it’s amazing how often good film critics get it right. Two Thumbs Up!

***

Beginning with this article, I am going to attempt to keep a regular schedule for my posts – two a week, Monday and Thursday. In addition, I plan on updating my movie lists each Wednesday. Look for my next article, Rotten Tomatoes, IMDB and The Wisdom of Crowds, to be posted March 10th.

What Shall I Watch Tonight?

The first day of each month is a big day in my obsessive quest to watch movies that I will “really like”. At the end of each month I recalibrate my probabilities and start the next month with a fresh Top Ten Movies to Watch list (see updated list). Here’s the rub: only one of those movies is available for me to watch tonight. It is therefore really a list of the Top Ten Movies to Watch Someday.

I’m adding a new list under my Movie Lists section, Top Ten Movies Available to Watch This Month. Technically, almost any movie I want to watch is available if I’m willing to pay for it. But I do have a budget, and a significant share of it is already spoken for. So, the movies available for me to watch in a given month are limited to “free” movies available from my cable company (Comcast), HBO, Showtime, Amazon Prime, Netflix, and Netflix DVD (2-a-month limit). My list is made up of the movies that are available to watch this month on these platforms plus two movies from Netflix DVD. I’ll generally use the Netflix DVDs to make a dent in my Someday list. There are also some miscellaneous streaming channels (Crackle, Tubi TV etc.) that I’ll use on occasion.

Which brings me back to the first day of the month. While each of these platforms will make some weekly additions and deletions to their available movies, there are wholesale changes on the first day of each month. The supply and demand curve for “What Shall I Watch Tonight?” can be radically altered on the first day of each month.

So, “What Shall I Watch Tonight?” I don’t know yet but check out the list. You’ll find it there.

 

IMDB…and the Oscar Goes To

On Sunday, the 2016 Academy Award for Best Picture will be announced. The pundits expect a close race among Spotlight, The Revenant,  and The Big Short, with Mad Max: Fury Road a possibility for an upset. Six weeks ago the Las Vegas odds makers had set the odds for each movie as follows:

  1. Spotlight                                    4:5
  2. The Revenant                           6:5
  3. The Big Short                            8:1
  4. The Martian                               8:1
  5. Mad Max: Fury Road              20:1
  6. Bridge of Spies                        30:1
  7. Room                                          40:1
  8. Brooklyn                                    50:1

To determine a winner, voters from the Academy membership, representing a variety of film disciplines, vote for the movie that represents the highest cinematic achievement of 2015. The discipline with the highest representation in the voting is acting. Actors make up 22% of the Academy voters and presumably have the greatest influence on the ultimate winner.

What if IMDB voters chose the Academy Award winner for Best Picture? While IMDB voters don’t represent a variety of film disciplines, they do represent different demographic perspectives. If each of these demographic slices of the IMDB voters chose the Best Picture winner, the results for each group would be:

  • Age Under 18                              The Revenant
  • Age 18 – 29                                   Room
  • Age 30 – 44                                   Room
  • Age 45+                                          Spotlight
  • Males                                              Room
  • Females                                         Room
  • United States                               Room, Spotlight (tie)
  • Non-United States                     Room

And, after combining the votes for all of these IMDB voter groups, the Oscar, in an upset, goes to Room.

The average IMDB ratings (as of February 22, 2016) for the eight nominees reflect a tight race:

  1. Room                                            8.3
  2. Mad Max: Fury Road                8.2
  3. The Revenant                             8.2
  4. Spotlight                                      8.2
  5. The Martian                                8.1
  6. The Big Short                             7.9
  7. Bridge of Spies                           7.7
  8. Brooklyn                                      7.6

Although the co-star of Room, Brie Larson, is the favorite to win Best Actress, I don’t believe Room will win Best Picture on Sunday. Academy voters and IMDB voters are very different. Just as actors will have the greatest influence over who wins the Oscar for Best Picture, there are demographic segments that have heavily influenced the IMDB vote for Best Picture. Here are the three primary groups influencing the IMDB vote, with their percentage of the aggregate IMDB vote for all eight movies displayed alongside:

  • Voters Aged 18 – 29             52% of total vote
  • Non-US Voters                     79% of total vote
  • Male voters                            84% of total vote

Although the IMDB voting for Room reflects a pretty strong consensus across almost all groups, the vote is dominated by young, male, non-US IMDB voters.

Sunday night, as you watch the Oscars, the lack of diversity among the Academy nominees will be the topic most commented on by Chris Rock, the emcee, the presenters, and the winners. But if you really want to know why a particular actor or actress didn’t get a nomination, or why a particular movie didn’t win IMDB Best Picture, check out who voted. It’s all there.


IMDB: The Ultimate Word of Mouth

Before the internet, one of the ways people decided what movies to watch was through “word of mouth”. Family, friends, neighbors, work associates etc. would talk about a movie they had seen recently that they really liked. If enough people mentioned the same movie it became a movie that you wanted to see as well.

Today, IMDB (Internet Movie DataBase) is the ultimate “word of mouth” source of feedback on a movie.  From its 60 million registered users, ratings are generated for over 3.4 million movies, TV shows, and episodes of TV shows. (Since this is a movie selection blog I’ll stick to the website’s movie benefits.) After completing the free IMDB registration, users can vote for a movie they’ve seen on a 1 to 10 scale, with 10 being the highest.  From all of the ratings, IMDB compiles an average rating for each movie, with some controls in place to prevent ballot stuffing. So, instead of getting “word of mouth” feedback from a few family and friends, IMDB provides you with feedback from movie watchers from around the globe. If, for example, your movie choice for the evening is between Saving Private Ryan and Life is Beautiful, IMDB provides you with feedback from over 835,000 people for Saving Private Ryan and over 315,000 for Life is Beautiful. But, here is where it gets a little bit tricky. Both of these World War II related movies have an average rating of 8.6. They are both great movies. Which movie will you enjoy more? It depends on how “average” you are.

Because of the volume and diversity of IMDB viewers, the average rating for a movie may not be a demographic fit for you. While the two movies being considered have the same average rating, the average rating for United States IMDB voters is 8.8 for Saving Private Ryan and 8.4 for Life is Beautiful. The average rating for female IMDB voters is 8.9 for Life is Beautiful and 8.1 for Saving Private Ryan. Having this information puts a new perspective on which movie you’d prefer to watch.

One of the first of the many useful features available on IMDB that you should become familiar with is the capability to look at a demographic split of the votes that go into a specific rating for a specific movie. This feature is not available directly from the IMDB phone app. It can only be accessed on the website. But, if you go to the bottom of the page for the movie that you pulled up on the phone app, there is a link to the website page for the movie. When you access the movie on the website it will provide you with the average rating for the movie. Right next to the average rating it will show you the number of votes the rating is based on, or how much “word of mouth” feedback you’re getting on this movie. If you click on the number of votes, a page opens up with all of the demographic data behind the feedback population. It tells you how women rated the movie vs. men. It tells you how different age groups rated the movie. It splits US and non-US voters. In a nutshell, it gives you the opportunity to see how the group most like you rated the movie. It’s also a good tool to use when you are trying to select a movie for a group of people to watch.

The Shiny Penny

You pull a handful of change out of your pocket and two coins catch your eye. The first is a tarnished old quarter that you can hardly identify as a quarter, and the second is a brand new shiny penny. You’ll probably get rid of that old tarnished quarter in the first transaction that comes along. You have an irrational fear that it’s not worth the same as the other quarters in your pocket. As for the shiny penny, you’ll probably use up all of the older pennies in your pocket before you give up that shiny new penny.

We have a tendency to select movies to watch on the same basis. We prefer to watch a mediocre shiny new movie rather than a much better tarnished older movie. We can’t resist the allure of the unknown experience of the new. Or, we fear that the new movie will come up in conversation at a social gathering and we’ll be left out of the discussion. Whatever the reason, it’s an emotional choice and, like many emotional choices, it comes with greater risk of regret.

If your goal is to spend a couple of hours totally engaged in a magical movie experience, then you need to be more selective in what you watch and you need to broaden your pool of movies to watch. The reality is that the number of shiny pennies that are magical movie experiences is limited. In any given year, there may only be a handful of new movies that are “wow” movies. There might be a dozen or two more that you’ll “really like”. The less systematic you are in your movie selection, the more unsatisfying movie experiences you’ll need to sit through before you find those dozen or so shiny new movies worth watching.

So, how do you improve your chances of picking movies you’ll “really like”? First, create a watchlist. Identify the movies you want to watch before you sit down on the couch and start scrolling through the list of movies available on Netflix, Amazon Prime, or On Demand. Second, set some criteria for the movies you’ll put on your watchlist. It can be as simple as targeting movies that sound interesting to you and are Certified Fresh by Rotten Tomatoes, or it can be as complex as the Bayesian probability approach that I use (more on my approach in a later post). Third, don’t limit yourself to recent movies; cast a wider net. Movie-making has been going on for over 100 years. If you really don’t like old movies, check out movies released since 2000, for example. Finally, include great movies that you’ve seen before in the pool of movies available for your watchlist.

If you click the link on the sidebar of this page titled My Top Ten Movies to Watch, you can see my current watchlist. These are movies that I either haven’t seen before or haven’t seen in the last 15 years. These are the movies with the highest probability that I will “really like”. The list includes newer movies that I’ve never seen before and older movies that I have seen before.

Every month I remove movies from my database that I haven’t watched in fifteen years. Those that meet my selection criteria, I’ll watch again. If it’s a movie that I’ve seen only once before, it will often feel like I’m watching it for the first time. And, if it’s one of those magical movies, I’m grateful that I valued that old tarnished quarter instead of the shiny new penny.

Will I “Really Like” this Blog?

I love baseball, movies and analyzing data. Since analyzing baseball data is a well-traveled path and analyzing data from baseball movies is too narrow a path, I am left with the intersection of movies and data analysis. Specifically, I analyze data generated by five Movie Ratings websites: IMDB, Rotten Tomatoes, Netflix-DVD, Movielens, and Criticker. There are other sites I could include and maybe will include in the future. For example, Metacritic is a fairly well known website but, for now, I’ve chosen not to use Metacritic because it is similar to Rotten Tomatoes with a less robust volume of movie ratings. I focused on these five sites because they are a good cross section of the methodologies I’ve come across that are used to rate movies.

Over the past few years I have built and maintained a database of all of the movies I have watched in the last 15 years. As of January 31, 2016, my database contains 1,957 movies. For each movie, I have entered the ratings provided by IMDB and Rotten Tomatoes, as well as the personalized ratings generated by Netflix-DVD, Movielens, and Criticker. If you are unfamiliar with these sites, the links at the top of the page will get you to the home page for each site. IMDB and Rotten Tomatoes don’t require any work on your part to see ratings. Criticker and Movielens base their ratings on the ratings you provide for the movies you’ve seen, so their value as movie guides requires some effort on your part. Netflix-DVD is also based on your ratings but has the additional requirement that you be a subscriber to their DVD service.

At about this point you are probably asking, “Why is he doing this?”  Initially, I wanted to test which website was the best at leading me to movies that I’d “really like”. Instead, I ended up with an algorithm, using all five websites, that provides me with the probability that I will “really like” a particular movie.  For example, the two Oscar nominated movies for Best Picture this year that I haven’t seen are Bridge of Spies and The Revenant. Based on my algorithm, there is a 98.1% chance that I will “really like” Bridge of Spies while there is only a 49.5% chance that I will “really like” The Revenant. I will watch both movies but it will be with the recognition that Bridge of Spies is close to a sure thing while The Revenant is a 50/50 proposition. Where these probabilities come from is a topic for another day.
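
To give a flavor of how probabilities like these can be built, here’s a bare-bones Bayes’ rule sketch. To be clear, the likelihood ratios below are made-up illustrations, not my actual site weightings:

```python
def bayes_update(prior, likelihood_ratios):
    """Fold one likelihood ratio per website into a prior probability
    of 'really liking' a movie. Each ratio is
    P(site's signal | really like) / P(site's signal | don't)."""
    odds = prior / (1 - prior)          # convert probability to odds
    for lr in likelihood_ratios:
        odds *= lr                      # each site's evidence multiplies the odds
    return odds / (1 + odds)            # convert back to a probability

# Made-up illustration: a 58% base rate plus one strong and one
# mildly positive site signal pushes the probability above 80%.
print(round(bayes_update(0.58, [3.0, 1.2]), 3))
```

A neutral signal (ratio of 1.0) leaves the probability unchanged, while a ratio below 1.0 pushes it down – which is how a site’s lukewarm rating can drag a promising movie toward 50/50 territory.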

I’ve been using some form of this algorithm to select the movies I watch over the last two years. A comparison of the last two years with the first two years used in my study would suggest I’ve been pretty successful. I watched 165 movies during the 1999 & 2000 calendar years and “really liked” 72 of them. Over the last two years, 2014 & 2015, I watched 182 movies and “really liked” 163. I’ve gone from “really liking” 44% of my movies in the first years of my study to 90% over the last two years.

I have a friend who, on occasion, will ask me what I thought of a particular movie. Mostly, I’ll tell him I “really liked” it. He then dismisses my recommendation by saying “but you like everything.” He’s right! I’ve reached the point where I “really like” 9 out of 10 movies I watch. It’s not, however, because I like everything. It’s because I’m able to identify those movies that I probably will “really like” and avoid watching those that I probably won’t.

As to the question posed in today’s title, “Will I ‘Really Like’ this Blog?” I’ll say this. If you frequently watch movies that you wish you hadn’t, you will “really like” this blog. You won’t have to build your own personal movie selection algorithm. You will, though, gain a better understanding of various movie websites and how they can help you pick a movie to watch that you will “really like”.