Why I Write.

Not enough of our big budget summer movie options measure up in quality to the Marvel or Star Wars franchises.

Everyone expects Avengers: Infinity War, which goes into wide release in the U.S. tomorrow, to dominate the box office for the next four weeks, until Disney’s other can’t-miss blockbuster, Solo: A Star Wars Story, takes over on May 25th. Disney moved Infinity War up a week from its original release date to give it one more week at the top before other big-budget competition begins to divide up fans of the big screen.

I will admit that I am excited about seeing Avengers: Infinity War. I grew up a Marvel comic book geek, and so far the MCU has successfully translated the humor and the humanity of the characters from the page to the screen. Too often, though, big-budget movies spend much of that money trying to convince us that we should “really like” the movie they created rather than creating the movie we will “really like”. When the expensive product doesn’t match the creative vision, studios plan advertising campaigns to induce the viewing public to bail them out.

As we crash headlong into another blockbuster season, I hope that the industry has more surprises for us this summer. I hope that there are more under-the-radar summer classics such as Hell or High Water or The Big Sick that overcome the hype of the big-budget ad campaigns to capture the attention of lovers of quality films. I hope that there are several of these movies and not just one or two. I hope that audiences reject the big-budget films that aren’t of the quality of the Marvel and Star Wars franchises. That is how the overall quality of the films available for us to see gets better. Movie producers make the movies they think people will go to see. If we go to the theater to see more “really like” movies, they will make more “really like” movies.

This is my mission. I want to warn you off of the over-hyped mediocrity of big-budget misfires and lead you to the gems that are hidden in plain sight. I do this not solely by telling you which movies I’ve seen and “really like”, but by consolidating and analyzing the data from the movies that you and other lovers of film have seen and “really like”. In this way, I hope to do my little part in improving the quality of what’s available for us to see and to point you toward the movies that other enthusiasts are identifying as ones you might “really like”.

This is why I write.

 

“Really Like” Movie Recommendations Are Even Better When You Exercise a Little Judgement

Last Saturday night my wife Pam and I watched 20th Century Women for our weekend movie night. If you’ve been following the Objective Top Twenty, you’ll note that this movie has been on the list for most of the year. We were pretty excited to see it. In the end, though, it wasn’t the movie we expected, and it left us feeling a little flat.

20th Century Women is a semi-autobiographical movie written and directed by Mike Mills, reminiscing about his teenage years in Santa Barbara, CA. The teenager at its center, Jamie, is raised by a single mother, played by Annette Bening, with the assistance of two other women in their social circle.

It is an intriguing movie with interesting characters. I wasn’t bored by it, but the movie didn’t quite connect with me. As an aside, I found it interesting that Greta Gerwig, who co-stars as one of the other female influences in the story, turned around after this movie and drew on her own teenage experience in Sacramento, CA. Gerwig wrote and directed a similar movie, the recently released and highly acclaimed Lady Bird. While Mills made the mother the focus of his movie, Gerwig centered hers on Lady Bird, the teenager. Perhaps 20th Century Women would have connected with me more effectively if it had focused on the teenager, Jamie. Punk rock also has a prominent place in 20th Century Women, a music genre that passed me by with hardly an acknowledgement of its existence.

I ended up rating this movie as a “like” but not a “really like” movie. The “really like” algorithm estimated that there was a 67% probability that I would “really like” 20th Century Women. Is this simply a case of the movie falling into the 33% probability that I wouldn’t “really like” it? Sure, but that doesn’t mean there weren’t warning signs that it might end up in the 33%.

Without getting into the mathematical weeds of the algorithm, suffice it to say that the probability that I will “really like” a movie is a blend of the objective data that goes into the Objective Top Twenty and subjective data from Netflix, Movielens, and Criticker, which is based on my personal taste in movies. If the data from the subjective sites is limited, my “really like” probability stays close to the objective data. On the other hand, if the subjective data is plentiful, its recommendation is very reliable and my “really like” probability lands close to the subjective recommendation.
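
For readers who like to see the mechanics, here is a minimal sketch of that blend. The model’s actual weights aren’t published, so the quintile weights below are illustrative placeholders only.

```python
# A minimal sketch of the blend described above, not the model's actual code.
# Higher credibility quintile = more trust in the personal-taste (subjective) data.
SUBJECTIVE_WEIGHT = {1: 0.10, 2: 0.30, 3: 0.50, 4: 0.65, 5: 0.80}  # placeholders

def really_like_probability(objective_prob, subjective_prob, quintile):
    """Blend the objective and personal-taste probabilities.

    Limited subjective data (low quintile) keeps the result near the objective
    probability; plentiful data pulls it toward the subjective recommendation.
    """
    w = SUBJECTIVE_WEIGHT[quintile]
    return w * subjective_prob + (1 - w) * objective_prob

# A Quintile 3 movie lands halfway between its two inputs, e.g. Nebraska below:
print(really_like_probability(0.691, 0.634, 3))  # ~0.663, the 66.3% in the table
```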

You might find this illustration helpful. The Credibility Quintile organizes the movies into five groups based on how reliable the subjective data is. Quintile 5 is very reliable data and Quintile 1 is not very reliable. The five movies listed all have close to the same probability that I will “really like” them but are in different quintiles.

| Movie | Credibility Quintile | Objective “Really Like” Probability | Subjective “Really Like” Probability | Probability I Will “Really Like” This Movie |
|---|---|---|---|---|
| Men of Honor | 5 | 63.4% | 69.0% | 67.2% |
| Far and Away | 4 | 61.6% | 69.6% | 66.6% |
| Nebraska | 3 | 69.1% | 63.4% | 66.3% |
| The Fabulous Baker Boys | 2 | 65.3% | 69.9% | 67.0% |
| 20th Century Women | 1 | 68.3% | 51.2% | 67.0% |

While all five movies have roughly the same overall probability, they aren’t equally reliable. Men of Honor is clearly a movie that, according to the highly reliable Quintile 5 data, I will like more than the rest of the world does, and the algorithm reflects that. The same could be said for Far and Away. The movie Nebraska, on the other hand, seems to be a movie that I would like less than the general public. Note that, as a Quintile 3 movie, its probability is halfway between the objective and the subjective probabilities.

It’s the last two movies that illustrate the point I want to make. The probability that I will “really like” The Fabulous Baker Boys is identical to that for 20th Century Women. Both movies are in below-average credibility quintiles. That is where the similarities end. When you look at the subjective probabilities for both movies, The Fabulous Baker Boys has a strong trend toward being a movie I will “really like”. Even without reliable data, it might be a movie worth taking a chance on. 20th Century Women is headed in the opposite direction, toward being a movie I probably wouldn’t “really like”. I should have caught that before watching the movie. It doesn’t mean I would have given up on the movie. It just means that I should have waited another cycle or two for more data to more reliably predict whether I would “really like” it or not.
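
If you wanted to turn that judgement call into an explicit rule, it might look something like the sketch below. The Quintile-2-or-lower cutoff and the 10-point gap are my own illustrative choices, not thresholds from the model.

```python
# One way to encode the warning sign described above.
def watch_now(objective_prob, subjective_prob, quintile, gap=0.10):
    """Return False when thin personal-taste data trends well below the objective data."""
    if quintile <= 2 and subjective_prob < objective_prob - gap:
        return False  # wait a cycle or two for more reliable data
    return True

print(watch_now(0.683, 0.512, quintile=1))  # 20th Century Women: False, wait
print(watch_now(0.653, 0.699, quintile=2))  # The Fabulous Baker Boys: True, worth a chance
```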

Algorithms are tools to help you analyze data. Using algorithms to make decisions requires the exercise of a little judgement.

 

 

“Really Like” Movie Experiences With My Family at Thanksgiving

Over the course of a typical Thanksgiving weekend, movies have become a part of our family experience. We watch them. We discuss them. For me, my family is my own private focus group. They challenge my ideas and generate new avenues of thought to explore.

This Thanksgiving was no different as my wife Pam and I flew into Seattle to visit with Meggie, Richie and Addie, our daughter, son-in-law and 4-month-old granddaughter. Our son Brendan and his girlfriend Kristen (a very loyal follower of this blog) flew in from Boston. And our youngest, Colin, made the trip up the coast from L.A. With our family scattered from coast to coast, these family gatherings are very special.

Movies aren’t the only topic of conversation, especially when Addie’s in the room, but they do surface from time to time. Richie and I had a conversation about my Objective Top Seven from the years 1992 to 1998 that was in my last post. While he thought Schindler’s List was good, he would never put it at number one. He liked movies that made him feel happy when they were over. Now, Scent of a Woman, that was a movie on my list he could get on board with. On the other hand, my son Brendan couldn’t understand why his favorite movie Braveheart wasn’t on the list.

My conversations with Richie and Brendan illustrate why I rank movies based on “really like” probabilities. What movies we like and why we like them are unique to our own experiences and tastes. Many of us watch a movie to boost our mood. Schindler’s List is not a mood booster. On the other hand, if we are in the mood to expose ourselves to a harsh reality of the human experience and have our emotions touched in a very different way, there are few movies as moving as Schindler’s List. I confess that, like Richie, I prefer the mood boost to the harsh reality of life. The movie Moonlight has been sitting on my Watch List for some time now, waiting for me to be in the mood to experience it.

Later in the weekend, Meggie and Colin watched The Big Sick with me on Amazon Prime. They were really excited to see it based on the enthusiastic recommendations from Pam and me, and from many of the other people in their lives. At the end of the movie, they indicated that they both liked it but expected more from a movie that everyone else had raved about. It gave me another interesting insight into why people “really like” some movies but not others. Your expectation for a movie can significantly shape your opinion of it. Watching a movie that others say you “gotta see” may set the bar so high that only the great movies will reach it. A movie that is merely really good has no shot.

That expectations shape your opinion of a movie is a truism. If I flip the scenario to movies that I’ve stumbled upon that became unexpected movie treasures, I can attest to a second truism: good movies that fly under the radar will be enjoyed more than they have any reason to be. One of my personal top fifty movies is the greatest baseball movie few people have seen, Bang the Drum Slowly. Fewer than 5,000 voters have rated it on IMDB. Released in 1973, it stars De Niro before he was “De Niro”. At the time it didn’t go totally unnoticed; the movie earned a Best Supporting Actor nomination for Vincent Gardenia. I only saw the movie because I went to a double feature at the drive-in. The second movie was one of those “gotta see” movies. Bang the Drum Slowly was the first. That’s the movie I fondly remember today, not the second feature.

Rating movies is not a science. Movie fans who rate movies on websites like IMDB don’t use a Pythagorean formula to arrive at the one correct answer. But it’s from those disparate, individual reasons for each rating that I try to tease out some understanding each week of which movies you will “really like”.

I am very thankful for the strong support and inspiration of my family at Thanksgiving and all of the other 364 days of the year.

 

Why Did “The Big Sick” Drop Out of the Objective Top Fifteen This Week?

This past Sunday my wife, Pam, and I went to see The Big Sick. The movie tells the story of the early relationship days of the two screenwriters, Emily Gordon and Kumail Nanjiani. In fact, Nanjiani plays himself in the movie. It is the authenticity of the story, told in a heartfelt and humorous way, that makes this film special.

On the following day, last weekend’s blockbuster, Dunkirk, moved into the second spot in the revised Objective Top Fifteen rankings. When a new movie comes on the list, another one exits. This week’s exiting movie, ironically, was The Big Sick. Wait! If The Big Sick is such a great movie, why isn’t it in my top fifteen for the year? Are all of the other movies on the list better movies? Maybe yes. Maybe no. You’ll have to determine that for yourselves. You see, the Objective Top Fifteen is your list, not mine.

I developed the Objective Top Ten, which became Fifteen at the beginning of July and will become Twenty at the beginning of October, to provide you with a ranking of 2017 widely released movies that are most likely to be “really like” movies. Because the ranking is based on objective benchmarks, my taste in movies has no influence on the list. The four benchmarks presently in use are: IMDB Avg. Rating, Rotten Tomatoes Rating, Cinemascore Rating, and Academy Award Nominations and Wins. A movie like Hidden Figures that meets all four benchmarks has the greatest statistical confidence in its “really like” status and earns the highest “really like” probability. A movie that meets three benchmarks has a greater “really like” probability than a movie that meets only two benchmarks. And so on.
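
If it helps to see the benchmark-counting idea in code, here is a bare-bones sketch. The IMDB cutoff is a placeholder of mine, the “A-” Cinemascore cutoff comes from the discussion of The Big Sick below, and how a benchmark count converts into a probability isn’t spelled out here, so that part is left out.

```python
# Count how many of the four objective benchmarks a movie meets.
# More benchmarks met = more statistical confidence = a higher "really like" probability.
from typing import Optional

def benchmarks_met(imdb: Optional[float], certified_fresh: bool,
                   cinemascore: Optional[str], oscar_noms_and_wins: int) -> int:
    met = 0
    if imdb is not None and imdb >= 7.2:        # placeholder IMDB threshold
        met += 1
    if certified_fresh:                         # Rotten Tomatoes benchmark
        met += 1
    if cinemascore in {"A+", "A", "A-"}:        # "at least an A-"
        met += 1
    if oscar_noms_and_wins > 0:                 # Academy Award benchmark
        met += 1
    return met

# Hidden Figures meets all four; The Big Sick, never surveyed by Cinemascore
# and not yet through an Oscar cycle, tops out at two.
print(benchmarks_met(8.0, True, None, 0))  # The Big Sick -> 2
```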

The important thing to note, though, is that this is not a list of the fifteen best movies of the year. It is a ranking of probabilities (with some tie breakers thrown in) that you’ll “really like” a movie. It is subject to data availability. The more positive data that’s available, the more statistical confidence, i.e. higher probability, the model has in the projection.

Which brings me back to The Big Sick. Cinemascore surveys the movies it considers “major releases”. The Big Sick probably didn’t have a big advertising budget. Instead, the producers of the film chose to roll the movie out gradually, beginning on June 23rd, to create some buzz and momentum behind it before putting it into wide release on July 14th. This is probably one of the reasons why Cinemascore didn’t survey The Big Sick. But, because The Big Sick is missing that third benchmark needed to develop a higher probability, it dropped out of the Top Fifteen. On the other hand, if it had earned at least an “A-” from Cinemascore, The Big Sick would be the #2 movie on the list based on the tie breakers.

And that is both the weakness and the strength of movie data: “major releases” have it. Smaller movies like The Big Sick don’t.

***

This weekend may be the end of the four-week run of Objective Top Fifteen breakthroughs. Atomic Blonde, the Charlize Theron spy thriller, has an outside chance of earning a spot on the list. As of this morning, it is borderline for both the IMDB and Rotten Tomatoes benchmarks. I’m also tracking Girls Trip, which earned a Certified Fresh rating from Rotten Tomatoes just in the last couple of days and has an “A+” in hand from Cinemascore. For now, it is just below the IMDB benchmark. We’ll see if that changes over the weekend.

 

 

This Is Turning Into a “Really Like” Summer at the Movies.

In case you haven’t noticed, we are in the midst of a pretty good run of high quality movies this summer. Since the first weekend in May, which serves as the unofficial beginning of the summer movie season, there have been at least ten movies that have a 7.2 or higher IMDB average rating and have a Certified Fresh rating on Rotten Tomatoes.

| May to July 2017 Wide Released Movies | IMDB Rating | Rotten Tomatoes Rating | Rotten Tomatoes % Fresh |
|---|---|---|---|
| Baby Driver | 8.4 | C. Fresh | 97% |
| Spider-Man: Homecoming | 8.2 | C. Fresh | 93% |
| Wonder Woman | 8.0 | C. Fresh | 92% |
| Guardians of the Galaxy Vol. 2 | 8.1 | C. Fresh | 81% |
| The Big Sick | 8.0 | C. Fresh | 97% |
| I, Daniel Blake | 7.9 | C. Fresh | 92% |
| A Ghost Story | 7.5 | C. Fresh | 87% |
| Okja | 7.7 | C. Fresh | 84% |
| The Beguiled | 7.3 | C. Fresh | 77% |
| The Hero | 7.3 | C. Fresh | 76% |

And if early indicators are accurate, War for the Planet of the Apes will join the list after this coming weekend. And, if the early buzz on social media holds up, Christopher Nolan’s new movie Dunkirk will join the list the following weekend.

This seems to me to be an unusually high number of quality movies for the summer so far, but I can’t tell you how unusual…yet. I’m working on a new long-term project: a database made up solely of objective “really like” movie indicators. It will include every movie finishing in the top 150 in box office receipts for each of the last 25 years. This database will give a better representation of the bad movies that are released each year as well as a more robust sample size.

For now, I can only compare this year’s quality to 1992 (the first of the 25 years in my new database). Because Rotten Tomatoes wasn’t launched until 1998, I’ve included movies that aren’t Certified Fresh but would otherwise qualify if there were enough critic reviews. Even with that allowance, there are only 3 movies released between May and July 1992 that meet the quality criteria I’m using for this summer.

| May to July 1992 Wide Released Movies | IMDB Rating | Rotten Tomatoes Rating | Rotten Tomatoes % Fresh |
|---|---|---|---|
| Night on Earth | 7.5 | Fresh | 73% |
| Enchanted April | 7.6 | Fresh | 83% |
| A League of Their Own | 7.2 | C. Fresh | 78% |
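
For what it’s worth, here is the screen behind both tables in miniature, with the pre-1998 allowance included. The function is my own illustration of the rule, and it approximates “would be Certified Fresh with enough reviews” by a plain Fresh rating.

```python
# The summer quality screen: 7.2+ IMDB average rating plus Certified Fresh,
# with pre-1998 movies allowed in on a plain Fresh rating (Rotten Tomatoes,
# and therefore Certified Fresh, didn't exist yet).
def qualifies(imdb, rt_rating, year):
    if imdb < 7.2:
        return False
    if rt_rating == "C. Fresh":
        return True
    return year < 1998 and rt_rating == "Fresh"

print(qualifies(8.4, "C. Fresh", 2017))  # Baby Driver -> True
print(qualifies(7.5, "Fresh", 1992))     # Night on Earth -> True
```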

I’ll also add that the IMDB average ratings tend to decline over time. It is probable that a few of this year’s movies will ultimately not meet the 7.2 IMDB rating minimum. But, with 7 of the 10 movies sitting with IMDB ratings at 7.7 or better, this year’s list should hold up pretty well over time.

***

As I mentioned above, War for the Planet of the Apes opens tomorrow. It is easy to overlook how good this franchise has been. Here are the “really like” indicators for the franchise, including a very early look at tomorrow’s entry.

| Movie | IMDB Rating | Rotten Tomatoes Rating | Rotten Tomatoes % Fresh | Cinemascore |
|---|---|---|---|---|
| Rise of the Planet of the Apes (2011) | 7.6 | C. Fresh | 81% | A- |
| Dawn of the Planet of the Apes (2014) | 7.6 | C. Fresh | 90% | A- |
| War for the Planet of the Apes (2017) | 9.1 | C. Fresh | 93% | ? |

Franchises tend to get tired after the first movie. From the critics’ perspective, this franchise appears to get better with each new movie. I expect to see War for the Planet of the Apes on the Objective Top Fifteen list on Monday.

What Am I Actually Going to Watch This Week? Netflix Helps Out with One of My Selections.

The core mission of this blog is to share ideas on how to select movies to watch that we’ll “really like”. I believe that there have been times when I’ve bogged down on how to build the “really like” model. I’d like to reorient the dialogue back to the primary mission of what “really like” movies I am going to watch and, more importantly, why.

Each Wednesday I publish the ten movies on my Watch List for the week. These movies usually represent the ten movies with the highest “really like” probability that are available to me to watch on platforms that I’ve already paid for. This includes cable and streaming channels I’m paying for and my Netflix DVD subscription. I rarely use a movie on demand service.

Now, ten movies is too many, even for the Mad Movie Man, to watch in a week. The ten-movie Watch List instead serves as a menu for the 3 or 4 movies I actually most want to watch during the week. So, how do I select those 3 or 4 movies?

The first and most basic question to answer is who, if anyone, I’m watching the movie with. Friday night is usually the night that my wife and I will sit down and watch a movie together. The rest of the week I’ll watch two or three movies by myself. So, right from the start, I have to find a movie that my wife and I will both enjoy. This week that movie is Hidden Figures, the 2016 Oscar-nominated film about the role three black female mathematicians played in John Glenn’s orbit of the Earth in the early 1960s.

This movie became available to Netflix DVD subscribers on Tuesday, May 9, and I received my Hidden Figures DVD that day. Something I’ve learned over the years is that Netflix ships DVDs on Monday for titles that become available on Tuesday. To take advantage of that, you have to time the return of your old DVD so that it arrives on the Saturday or Monday before the Tuesday release. This gives you the best chance of avoiding “long wait” queues.

I generally use Netflix DVD to see new movies that I don’t want to wait another 3 to 6 months to see or for old movies that I really want to see but aren’t available on my usual platforms.

As of the first quarter of 2017, Netflix reported that there are only 3.94 million subscribers to their DVD service. I am one of them. The DVD service is the only way that you can still access Netflix’s best-in-the-business 5-star system of rating movies. It is easily the most reliable predictor of how you’ll rate a movie or TV show. Unfortunately, Netflix streaming customers no longer have the benefit of the 5-star system; they have gone to a less granular “thumbs up” and “thumbs down” rating system. To be fair, I haven’t gathered any data on this new system yet, so I’ll reserve judgement as to its value. As for the DVD service, they will have me as a customer as long as they maintain their 5-star recommender system as one of the benefits of being a DVD subscriber.

The 5-star system is a critical assist in finding a movie for both my wife and me. Netflix allows you to set up profiles for other members of the family. After my wife and I watch a movie, she gives it a rating and I give it a rating, entered under our separate profiles. This allows a unique predicted rating for each of us based on our individual taste in movies. For example, Netflix predicts that I will rate Hidden Figures a 4.6 out of 5 and that my wife will rate it a 4.9. In other words, according to Netflix, this is a movie that both of us will not only “really like” but absolutely “love”.

Hidden Figures has a “really like” probability of 61.4%. Its Oscar Performance probability is 60.7%, based on its three nominations. Its probability based solely on the feedback from the recommender sites that I use is 69.1%. At this point in time, it is a Quintile 1 movie from a credibility standpoint, which means the 69.1% probability is based on a limited number of ratings. It’s not very credible yet. That’s why the 61.4% “really like” probability sits closer to the Oscar Performance probability of 60.7%. I would fully expect that, as more people see Hidden Figures and enter their ratings, the “really like” probability will move higher for this movie.
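
A quick back-of-the-envelope check, assuming the blend is a simple weighted average of those two components (the model’s actual weights aren’t published), shows just how little the recommender-site data counts for at this stage:

```python
# Implied weight on the recommender-site (subjective) probability, assuming a
# simple weighted average of the two components quoted above.
oscar_prob, subjective_prob, blended = 0.607, 0.691, 0.614
w_subjective = (blended - oscar_prob) / (subjective_prob - oscar_prob)
print(round(w_subjective, 2))  # ~0.08: Quintile 1 data barely moves the needle
```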

Friday Night Movie Night this week looks like a “really like” lock…thanks to Netflix DVD.

 

 

Can You Increase Your Odds of Having a “Really Like” Experience at the Movie Theater?

Last Friday, my wife and I were away from home visiting two different sets of friends. One group we met for lunch. The second group we were meeting in the evening. With some time to spare between visits, we decided to go to a movie. The end of April usually has slim pickings for “really like” movies at the theater. With the help of IMDB and Rotten Tomatoes, I was able to surface a couple of prospects but only one that both my wife and I might “really like”. We ended up seeing a terrific little movie, Gifted.

My experience got me thinking about the probabilities of seeing “really like” movies at the movie theater. These movies have the least data on which to base a decision, and yet I can’t recall too many movies that I’ve seen in the theater that I haven’t “really liked”. Was this reality or merely perception?

I created a subset of my database of movies that I’ve seen within 3 months of their release. Of the 1,998 movies in my database, 99 movies, or 5%, met the criteria. Of these 99 movies, I “really liked” 86% of them. For the whole database, I “really liked” 60% of the movies I’ve watched over the last 15 years. My average score for the 99 movies was 7.8 out of 10. For the remaining 1,899 movies my average score was 6.8 out of 10.

How do I explain this? My working theory is that when a movie comes with an additional cash outlay, i.e. theater tickets, I become a lot more selective in what I see. But how can I be more selective with less data? I think it’s by selecting safe movies, movies that I know I am going to like. When I went into the movie theater a couple of months ago to see Beauty and the Beast, I knew I was going to love it, and I did. Those are the types of movie selections I tend to reserve for the theater experience.

There are occasions, like last Friday, when it isn’t a specific movie drawing me to the theater but the movie theater experience itself. Can I improve my chances of selecting a “really like” movie in those instances?

Last week I mentioned in my article that I needed to better define what I need my “really like” probability model to do. One of the things it needs to do is provide better guidance for new releases. The current model has a gap when it comes to new releases. Because the data is scarce, most new releases will be Quintile 1 movies in the model. In other words, the indicators based on my taste in movies, i.e. Netflix, Movielens, and Criticker, barely factor into the “really like” probability.

A second gap in the model is that new releases haven’t been considered for Academy Awards yet. The model treats them as if they aren’t award-worthy, even though some of them will be Oscar-nominated.

I haven’t finalized a solution to these gaps, but I’m experimenting with one. As a substitute for the Oscar performance factor in my model, I’m considering a combined IMDB/Rotten Tomatoes probability factor. These two outputs are viable indicators of the quality of a new release. This factor would be used until the movie goes through the Oscar nomination process. At that time, it would convert to the Oscar performance factor.
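
In sketch form, the substitution I’m testing looks something like this. How the combined IMDB/Rotten Tomatoes probability is computed isn’t covered here, so it is passed in as a ready-made number, and the 0.72 is purely hypothetical.

```python
# Until a movie has been through an Oscar nomination cycle, a combined
# IMDB/Rotten Tomatoes factor stands in for the Oscar performance factor.
def objective_quality_factor(imdb_rt_prob, oscar_prob, been_through_oscar_cycle):
    if been_through_oscar_cycle:
        return oscar_prob      # established movies: Oscar performance factor
    return imdb_rt_prob        # new releases: IMDB/Rotten Tomatoes stand-in

# A 2017 new release keeps the stand-in until next winter's nominations.
print(objective_quality_factor(imdb_rt_prob=0.72, oscar_prob=None,
                               been_through_oscar_cycle=False))  # 0.72 (hypothetical)
```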

I’ve created a list of the 2017 new releases I’m tracking. You can find it on the sidebar with my Weekly Watch List movies. This list uses the new “really like” probability approach I’m testing for new releases. Check it out.

If you plan on going to the movies this weekend to see Guardians of the Galaxy Vol. 2, it is probably because you really liked the first one. Based on IMDB and Rotten Tomatoes, you shouldn’t be disappointed: it is Certified Fresh at 86% on Rotten Tomatoes and sits at 8.2 on IMDB.