The Art of Selecting “Really Like” Movies: Older, Never Before Seen

Last week I stated in my article that I could pretty much identify whether a movie has a good chance of being a “really like” movie within six months of its release. If you need any further evidence, here are the top ten movies I’ve never seen that are older than six months.

My Top Ten Never Seen Movie Prospects 
Never seen movies released more than six months ago

| Movie Title | Last Data Update | Release Date | Total # of Ratings | “Really Like” Probability |
|---|---|---|---|---|
| Hey, Boo: Harper Lee and ‘To Kill a Mockingbird’ | 2/4/2017 | 5/13/2011 | 97,940 | 51.7% |
| Incendies | 2/4/2017 | 4/22/2011 | 122,038 | 51.7% |
| Conjuring, The | 2/4/2017 | 7/19/2013 | 241,546 | 51.7% |
| Star Trek Beyond | 2/4/2017 | 7/22/2016 | 114,435 | 51.7% |
| Pride | 2/4/2017 | 9/26/2014 | 84,214 | 44.6% |
| Glen Campbell: I’ll Be Me | 2/9/2017 | 10/24/2014 | 105,751 | 44.6% |
| Splendor in the Grass | 2/5/2017 | 10/10/1961 | 246,065 | 42.1% |
| Father of the Bride | 2/5/2017 | 6/16/1950 | 467,569 | 42.1% |
| Imagine: John Lennon | 2/5/2017 | 10/7/1998 | 153,399 | 42.1% |
| Lorenzo’s Oil | 2/5/2017 | 1/29/1993 | 285,981 | 42.1% |

The older movies with a high “really like” probability have already been watched. Of the remaining movies, three are roughly 50/50 and the rest have the odds stacked against them. In other words, if I watch all ten movies I probably won’t “really like” half of them; then again, that also means I probably would “really like” the other half. The reality is that I won’t watch any of these ten movies as long as there are movies I’ve already seen with better odds. Is there a way to improve the odds for any of these ten movies?

You’ll note that all ten movies have probabilities based on less than 500,000 ratings. Will some of these movies improve their probabilities as they receive more ratings? Maybe. Maybe not. To explore this possibility further I divided my database into quintiles based on the total number of ratings. When I look at the quintile with the most ratings, the most credible quintile, it does provide results that define the optimal performance of my algorithm.
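
Out of curiosity, here is a minimal sketch of how that quintile split could be reproduced with pandas; the file name and column names (num_ratings, seen_before, really_liked) are hypothetical stand-ins for whatever the actual database uses:

```python
import pandas as pd

# movies: one row per movie I've rated, with a hypothetical schema:
#   num_ratings  - total ratings across the recommender sites
#   seen_before  - True if I had seen the movie more than once
#   really_liked - True if my rating met the "really like" threshold
movies = pd.read_csv("my_movie_database.csv")

# Split the database into five equal-sized buckets by total number of ratings.
# Quintile 1 has the fewest ratings, Quintile 5 the most.
movies["quintile"] = pd.qcut(movies["num_ratings"], q=5, labels=[1, 2, 3, 4, 5])

# Count of movies and share of "really like" movies in each quintile,
# split by whether the movie was a rewatch or a first-time viewing.
summary = (
    movies.groupby(["quintile", "seen_before"])["really_liked"]
    .agg(count="size", pct_really_liked="mean")
)
print(summary)
```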

Quintile 5

# Ratings Range > 2,872,053

| | # of Movies | # “Really Like” Movies | % “Really Like” Movies | Proj. Avg. Rating All Sites | My Avg Rating | My Rating to Proj. Rating Diff. |
|---|---|---|---|---|---|---|
| Movies Seen More than Once | 152 | 134 | 88% | 8.6 | 8.5 | -0.1 |
| Movies Seen Once | 246 | 119 | 48% | 7.5 | 6.9 | -0.7 |
| All Movies in Range | 398 | 253 | 64% | 7.9 | 7.5 | |

All of the movies in Quintile 5 have more than 2,872,053 ratings. My selection of movies that I had seen before is clearly better than my selection of movies I watched for the first time. That better selection is because the algorithm results led me to the better movies and my memory did some additional weeding. My takeaway: when considering movies I’ve never seen before, I should put my greatest trust in the algorithm if the movie falls in this quintile.

Let’s look at the next four quintiles.

Quintile 4

# Ratings Range 1,197,745 to 2,872,053

| | # of Movies | # “Really Like” Movies | % “Really Like” Movies | Proj. Avg. Rating All Sites | My Avg Rating | My Rating to Proj. Rating Diff. |
|---|---|---|---|---|---|---|
| Movies Seen More than Once | 107 | 85 | 79% | 8.3 | 8.3 | 0.1 |
| Movies Seen Once | 291 | 100 | 34% | 7.1 | 6.4 | -0.7 |
| All Movies in Range | 398 | 185 | 46% | 7.4 | 6.9 | |

Quintile 3

# Ratings Range 516,040 to 1,197,745

| | # of Movies | # “Really Like” Movies | % “Really Like” Movies | Proj. Avg. Rating All Sites | My Avg Rating | My Rating to Proj. Rating Diff. |
|---|---|---|---|---|---|---|
| Movies Seen More than Once | 122 | 93 | 76% | 7.8 | 8.0 | 0.2 |
| Movies Seen Once | 278 | 102 | 37% | 7.1 | 6.6 | -0.6 |
| All Movies in Range | 400 | 195 | 49% | 7.3 | 7.0 | |

Quintile 2

# Ratings Range 179,456 to 516,040

| | # of Movies | # “Really Like” Movies | % “Really Like” Movies | Proj. Avg. Rating All Sites | My Avg Rating | My Rating to Proj. Rating Diff. |
|---|---|---|---|---|---|---|
| Movies Seen More than Once | 66 | 46 | 70% | 7.4 | 7.5 | 0.2 |
| Movies Seen Once | 332 | 134 | 40% | 7.0 | 6.4 | -0.6 |
| All Movies in Range | 398 | 180 | 45% | 7.1 | 6.6 | |

Quintile 1

# Ratings Range < 179,456

| | # of Movies | # “Really Like” Movies | % “Really Like” Movies | Proj. Avg. Rating All Sites | My Avg Rating | My Rating to Proj. Rating Diff. |
|---|---|---|---|---|---|---|
| Movies Seen More than Once | 43 | 31 | 72% | 7.0 | 7.5 | 0.5 |
| Movies Seen Once | 355 | 136 | 38% | 6.9 | 6.2 | -0.7 |
| All Movies in Range | 398 | 167 | 42% | 6.9 | 6.4 | |

Look at how the algorithm’s projections progress as the quintiles get smaller. The gap between the movies seen more than once and those seen only once narrows as the number of ratings gets smaller. Notice that the difference between my ratings and the projected ratings for Movies Seen Once is fairly constant across all quintiles, either -0.6 or -0.7. For the Movies Seen More than Once, however, the difference grows more positive as the number of ratings gets smaller. This suggests that, for Movies Seen More than Once, the higher-than-expected ratings I give movies in Quintiles 1 and 2 are driven primarily by my memory of the movies rather than by the algorithm.

What does this mean for my top ten never-before-seen movies listed above? All ten are in Quintile 1 or 2. As they grow into the higher quintiles, some may emerge with higher “really like” probabilities. Certainly Star Trek Beyond, which is only seven months old, can be expected to grow into the higher quintiles. But what about Splendor in the Grass, which was released in 1961 and, at 55 years old, might not move into Quintile 3 for another 55 years?

This suggests that a secondary movie quality indicator is needed, one separate from the movie recommender sites already in use. It sounds like I’ve just added another project to my 2017 “really like” project list.

 

 

Create, Test, Analyze, and Recreate

Apple’s iPhone just turned 10 years old. Why has it been such a successful product? It might be because the product hasn’t stayed static. The latest version is the iPhone 7 Plus. As a product, it is constantly reinventing itself to improve its utility. It is always fresh. Apple, like most producers of successful products, probably follows a process whereby they:

  1. Create.
  2. Test what they’ve created.
  3. Analyze the results of their tests.
  4. Recreate.

They never dust off their hands and say, “My job is done.”

Now, I won’t be so presumptuous as to claim to have created something as revolutionary as the iPhone. But regardless of how small your creation is, its success requires you to follow the same steps outlined above.

My post last week outlined the testing process I put my algorithm through each year. This week I will provide some analysis and take some steps toward a recreation. The result of my test was that using my “really like” movie selection system significantly improved the overall quality of the movies I watch. On the negative side, the test showed that once you hit some optimal number of movies in a year, each additional movie you watch tends to be of diminishing quality as the remaining pool of “really like” movies shrinks.

A deeper dive into these results begins to clarify the key issues. Separating movies that I’ve seen at least twice from those that were new to me is revealing.

| | Seen More than Once (1999 to 2001) | Seen More than Once (2014 to 2016) | Seen Once (1999 to 2001) | Seen Once (2014 to 2016) |
|---|---|---|---|---|
| # of Movies | 43 | 168 | 231 | 158 |
| % of Total Movies in Timeframe | 15.7% | 51.5% | 84.3% | 48.5% |
| IMDB Avg Rating | 7.6 | 7.6 | 6.9 | 7.5 |
| My Avg Rating | 8.0 | 8.4 | 6.1 | 7.7 |
| % Difference | 5.2% | 10.1% | -12.0% | 2.0% |

There is so much interesting data here I don’t know where to start. Let’s start with the notion that the best opportunity for a “really like” movie experience is the “really like” movie you’ve already seen. The % Difference row shows how much My Avg Rating outperforms the IMDB Avg Rating in both timeframes for movies seen more than once. The fact that, from 1999 to 2001, I was able to watch movies that I “really liked” more than the average IMDB voter did, without the assistance of any movie recommender website, suggests that memory of a “really like” movie is a pretty reliable “really like” indicator. The 2014 to 2016 results suggest that my “really like” system can help prioritize the movies that memory tells you that you will “really like” seeing again.

The Seen Once columns clearly display the advantages of the “really like” movie selection system. It’s for the movies you’ve never seen that movie recommender websites are worth their weight in gold. With limited availability of movie websites from 1999 to 2001, my selection of new movies underperformed the IMDB Avg Rating by 12%, and those movies represented 84.3% of everything I watched during that timeframe. From 2014 to 2016, my “really like” movie selection system recognized that there is a limited supply of new “really like” movies. As a result, fewer than half of the movies I watched from 2014 through 2016 were movies I’d never seen before. Of the new movies I did watch, there was a significant improvement over the 1999 to 2001 timeframe in terms of quality, as represented by the IMDB Avg Rating, and my enjoyment of the movies, as represented by My Avg Rating.

Still, while the 2014 to 2016 new movies were significantly better than the new movies watched from 1999 to 2001, is it unrealistic to expect My Ratings to be better than IMDB by more than 2%? To gain some perspective on this question, I profiled the new movies I “really liked” in the 2014 to 2016 timeframe and contrasted them with the movies I didn’t “really like”.

Movies Seen Once, 2014 to 2016

| | “Really Liked” | Didn’t “Really Like” |
|---|---|---|
| # of Movies | 116 | 42 |
| % of Total Movies in Timeframe | 73.4% | 26.6% |
| IMDB Avg Rating | 7.6 | 7.5 |
| My Avg Rating | 8.1 | 6.3 |
| “Really Like” Probability | 82.8% | 80.7% |

The probability results for these movies suggest that I should “really like” between 80.7% and 82.8% of the movies in the sample. I actually “really liked” 73.4%, not too far off the probability expectations. The IMDB Avg Rating for the movies I didn’t “really like” is only a tick lower than the rating for the “really liked” movies. Similarly, the “Really Like” Probability is only a tick lower for the Didn’t “Really Like” movies. My conclusion is that there is some, but not much, opportunity to improve selection of new movies through a more disciplined approach. The better approach would be to favor “really like” movies that I’ve seen before and give new movies more time for their data to mature.

Based on my analysis, here is my action plan:

  1. Set separate probability standards for movies I’ve seen before and movies I’ve never seen.
  2. Incorporate the probability revisions into the algorithm.
  3. Set a minimum probability threshold for movies I’ve never seen before.
  4. When the supply of “really like” movies gets thin, only stretch for movies I’ve already seen and that memory tells me I “really liked”.
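
To make items 1 and 3 concrete, here is a minimal sketch in Python of how separate standards might be applied; the threshold values and field names are hypothetical placeholders, not the numbers the recalibrated algorithm will actually use:

```python
# Hypothetical probability thresholds: be pickier about movies never seen before,
# since first-time viewings consistently underperform their projections.
SEEN_BEFORE_THRESHOLD = 0.60   # placeholder value
NEVER_SEEN_THRESHOLD = 0.75    # placeholder value, the stricter minimum for new movies

def worth_watching(really_like_prob: float, seen_before: bool) -> bool:
    """Apply separate probability standards to seen and unseen movies."""
    threshold = SEEN_BEFORE_THRESHOLD if seen_before else NEVER_SEEN_THRESHOLD
    return really_like_prob >= threshold

# Example: a 65% movie makes the cut only if memory already vouches for it.
print(worth_watching(0.65, seen_before=True))    # True
print(worth_watching(0.65, seen_before=False))   # False
```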

Create, test, analyze and recreate.

 

In Pursuit of “Really Like” Movies, Playing Tag with Movielens Can Be Helpful

You may wonder where my list of Top Ten Movies Available To Watch This Week comes from. Or, you may be curious how far Mad Movie Man madness extends. Or, maybe you neither wonder nor are curious, but are instead just happy to have another place to go to for movie recommendations. Whether you have questions unanswered or questions unasked, today is your lucky day. This is the day that you discover how obsessive I can be in pursuit of movies that I will “really like”.

Before I plumb the depths of this madness, a few words about a Movielens tool that helps me organize the mania would be appropriate. Movielens has a tagging system. There are Community Tags that are available for everyone to use. For example, here are the Community Tags for There’s Something About Mary.

 

You can use these tags as an additional screen when searching for a movie to watch. The number next to the tag tells you how often the tag has been used for this movie. By looking at these tags you could conclude that this movie isn’t for everyone and probably wouldn’t meet the cringeworthy test. There is a plus sign next to each tag that allows you to agree or disagree with it. Also, if you thought the movie was “hilarious”, you can click on that tag and all of the movies tagged “hilarious” will come up. Try it on the list above.

If you want to keep track of all of the movies that you thought were “hilarious” there is a box where you can make “hilarious” one of your own personal tags for this movie. If you want to add a tag that isn’t listed for this movie, you can do that too. It is a very dynamic system for gaining additional insights into a movie and for helping you to organize your movies if you are so inclined.

Which brings me back to my manic inclinations. I keep a list of approximately 450 movies for which I’ve calculated “really like” probabilities, and I assign each of them the Movielens tag “reviewed”. This allows me to keep track of movies that I’ve already put on my list of 450. These movies stay on the list as long as they qualify as a recommended movie on at least one of the five movie websites I use. The data for each movie is refreshed every 90 days.

Intriguing movies I come across that aren’t on the list of 450 are tagged “prospect” in Movielens. Whenever I watch a movie from the list of 450, or it is removed from the list because one of the websites no longer recommends it, I replace it with a movie from the “prospect” list. Movielens allows you to go to Your Tags and sort the movies you’ve tagged. For example, I take the movies I’ve tagged “prospect” and Movielens sorts them by its recommendation score, from highest to lowest. The highest recommended “prospect” movie moves to the list of 450 to replace the movie removed from the list.
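
For anyone curious what that bookkeeping amounts to, here is a minimal sketch in Python; the tag names (“reviewed” and “prospect”) come from the workflow above, but the data structures, titles, and Movielens score field are hypothetical stand-ins, since the real work happens by hand on the Movielens site:

```python
# The ~450 "reviewed" movies, each carrying a calculated "really like" probability,
# and the "prospect" movies, each carrying a Movielens recommendation score.
reviewed = [
    {"title": "Incendies", "really_like_prob": 0.517},
    # ... roughly 450 entries in practice
]
prospects = [
    {"title": "Hypothetical Prospect A", "movielens_score": 4.1},
    {"title": "Hypothetical Prospect B", "movielens_score": 3.8},
]

def replace_movie(removed_title: str) -> None:
    """Drop a watched or no-longer-recommended movie and promote the top prospect."""
    global reviewed
    reviewed = [m for m in reviewed if m["title"] != removed_title]
    if prospects:
        # Sort prospects by Movielens recommendation score, highest first.
        prospects.sort(key=lambda m: m["movielens_score"], reverse=True)
        promoted = prospects.pop(0)
        # The promoted movie joins the list; its probability gets calculated later.
        reviewed.append({"title": promoted["title"], "really_like_prob": None})

replace_movie("Incendies")
```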

Each Wednesday, after reviewing which movies from the list of 450 are available to watch on the various movie outlets available to me, I rank them by their “really like” probabilities, with the top ten making my list.

Now you understand why I’m the “Mad” Movie Man and how playing tag with Movielens enables my madness.

 

 

 

Until That Next Special Movie Comes Along

Happy 4th of July to all of my visitors from the States and, to my friends to the North, Happy Canada Day which was celebrated on this past Saturday. It is a good day to watch Yankee Doodle Dandy, one of those special movie experiences I’m fond of.

This past weekend I watched another patriotic movie,  Courage Under Fire with Denzel Washington, Meg Ryan, and a young Matt Damon among others in a terrific cast. It was one of those special movies that I yearned for in my last post on July movie prospects. It was a July 1996 release that wasn’t nominated for an Academy Award (how it didn’t get an acting nomination among several powerful performances astounds me). It earned a 94 out of 100 score from me. I loved this movie. The feeling I get after watching a movie this good is why I watch so many movies. It is the promise that there are more movies out there to see that I will love that feeds my passion for movies.

As I was thinking about special movies the last few days, a question occurred to me. Can I use my rating system to find movies I’ll “love” rather than just “really like”? Of course I can. Any movie that earns a rating of 85 out of 100 or higher meets my definition of a movie I will “love”. An 85 also converts to a five star movie on Netflix. I can rank each of the movie rating websites that I use in my algorithm from highest rating to lowest. I then can take the top 10% of the rankings and calculate the probability that a movie in that top 10% would earn a score of 85 or higher. Regular readers of this blog shouldn’t be surprised by the results.

| Site | Top 10% Threshold | Actual % of My Database | Probability for “Love” Movie |
|---|---|---|---|
| Netflix | > 4.5 | 9.5% | 81.4% |
| Movielens | > 4.2 | 10.7% | 76.9% |
| Criticker | > 90 | 10.3% | 55.4% |
| IMDB | > 8.1 | 10.8% | 45.8% |
| Rotten Tomatoes | > Cert. Fresh 95% | 10.4% | 41.7% |

High Netflix and Movielens scores are the most reliable indicators of “love” movies. Here’s my problem: there are no movies that I haven’t seen in the last fifteen years that have a Netflix Best Guess of 4.5 or higher. There are fewer than 10 movies that I haven’t seen in the last fifteen years with a Movielens predicted score greater than 4.2. Here’s the kicker: the probability that I will “love” a movie with a Movielens predicted score of 4.2 or better that doesn’t also have a Netflix Best Guess greater than 4.5 is only 62%. It seems the chances of finding movies to “love” are significantly diminished without the strong support of Netflix.
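
For those who like to see the mechanics, here is a rough sketch of how those top-10% thresholds and “love” probabilities could be computed; the file name and column names are hypothetical stand-ins for my actual rating data, and Rotten Tomatoes is left out because its threshold (Certified Fresh) isn’t a simple numeric score:

```python
import pandas as pd

# ratings: one row per movie I've scored, with a hypothetical schema:
#   my_score - my rating on a 0-100 scale ("love" = 85 or higher)
#   netflix, movielens, criticker, imdb - each site's score for that movie
ratings = pd.read_csv("my_ratings.csv")
ratings["loved"] = ratings["my_score"] >= 85

for site in ["netflix", "movielens", "criticker", "imdb"]:
    # Top 10% threshold: the site score at the 90th percentile of my database.
    threshold = ratings[site].quantile(0.90)
    top_slice = ratings[ratings[site] >= threshold]
    # Probability of "love": share of that top slice I scored 85 or higher.
    prob_love = top_slice["loved"].mean()
    print(f"{site}: threshold {threshold:.1f}, P(love) = {prob_love:.1%}")
```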

On the 1st of each month Netflix Streaming and Amazon Prime shake up the movies that are available in their inventory. The July 1 shakeup has resulted in a couple of new movies being added to my list of the Top Ten “Really Like” Movies Available on Netflix or Amazon Prime. This list is actually mistitled. It should be the Top Ten “Love” Movies Available. Take a look at the list. Perhaps you haven’t seen one of these movies, or haven’t seen it in a while. It is your good fortune to be able to watch one of these movies the next time you are in the mood for a special movie experience.

As for me, I’m still hoping that one of the movies released this year rises to the top of my watch list and is able to captivate me. If it were easy to find movies that I will “love”, I would have named this blog Will I “Love” This Movie?. For now, I will continue to watch movies that I will “really like” until that next special movie comes along.

Cinemascore Is For Opening Weekend, but Beware of Grade Inflation

We have just endured another Presidential Primary season where every tea leaf was micro-analyzed and every phrase parsed to death. One of the primary tools of the political pundits is the exit poll. In key districts across the primary State, pollsters await voters as they exit the polling place to determine who the voters were pinning their hopes on to lead the free world at that very moment and why. The exit poll fills our insatiable desire for instant feedback for what we’re collectively thinking.

The movie industry has its own version of the exit poll, Cinemascore. In the pre-IMDB days of 1978, the movie industry had the same concerns with critics that they have with Rotten Tomatoes today. The industry felt critics had too much influence with the viewing public. Cinemascore filled this perceived need to balance the sway of critics by measuring the opening night reaction to a movie from moviegoers who were walking out of the theater. Like political exit polls, the theaters polled in the survey were specifically selected to provide a cross section, regionally and demographically, of the viewing public in the U.S. and Canada. Participants in the survey answer six questions about the movie they’ve just watched including the assignment of a grade from A to F.

By going to the website linked above you can view the average grade from the surveys given to recent major movie releases. You can also type in a movie title released after 1978 to see that movie’s average grade. With a paid subscription you can enter the website and presumably access results from the other five questions surveyed. Not all movies are surveyed, only those considered major releases.

How useful are these grades? Well, if you absolutely can’t wait to see a movie, but you can hold off until Saturday night, they can be quite useful. The survey sample is representative of moviegoers like you. I would expect that most attendees of an opening night movie have a high degree of interest in the movie, just like you. On the other hand, if your decision to attend an opening weekend movie is more casually made, Cinemascore could be deceiving.

The 24 recent movie releases currently displayed on the Cinemascore Home Page are ranked below by grade with the accompanying IMDB rating results:

Cinemascore Recent Movie Results

| Movie | Cinemascore | # IMDB Votes | IMDB Avg. Rating |
|---|---|---|---|
| ME BEFORE YOU | A | 6,217 | 7.9 |
| CAPTAIN AMERICA: CIVIL WAR | A | 229,127 | 8.3 |
| GOD’S NOT DEAD 2 | A | 3,809 | 3.3 |
| JUNGLE BOOK, THE | A | 84,945 | 7.8 |
| TEENAGE MUTANT NINJA TURTLES: OUT OF THE SHADOWS | A- | 8,013 | 6.5 |
| X-MEN: APOCALYPSE | A- | 102,507 | 7.4 |
| ALICE THROUGH THE LOOKING GLASS | A- | 12,479 | 6.4 |
| CONJURING 2, THE | A- | 7,238 | 8.4 |
| NOW YOU SEE ME 2 | A- | 3,696 | 7.2 |
| BARBERSHOP: THE NEXT CUT | A- | 2,099 | 6.1 |
| MY BIG FAT GREEK WEDDING 2 | A- | 8,083 | 6.2 |
| ANGRY BIRDS MOVIE, THE | B+ | 13,128 | 6.4 |
| WARCRAFT | B+ | 45,628 | 7.8 |
| HUNTSMAN: WINTER’S WAR, THE | B+ | 23,522 | 6.2 |
| MONEY MONSTER | B+ | 11,349 | 6.8 |
| MOTHER’S DAY | B+ | 3,643 | 5.4 |
| BATMAN V SUPERMAN: DAWN OF JUSTICE | B | 299,641 | 7.0 |
| KEANU | B | 7,507 | 6.6 |
| NEIGHBORS 2: SORORITY RISING | B | 16,437 | 6.1 |
| POPSTAR: NEVER STOP NEVER STOPPING | B | 2,815 | 7.4 |
| RATCHET AND CLANK | B | 1,778 | 6.1 |
| CRIMINAL | B- | 5,452 | 6.4 |
| NICE GUYS, THE | B- | 23,900 | 7.8 |
| DARKNESS, THE | C | 1,901 | 4.1 |

If an IMDB rating of 7.3 or higher is considered an above-average rating, then only a Cinemascore of A is solidly reinforced by the IMDB average ratings. Of the 7 movies receiving an A- grade, only X-Men: Apocalypse and The Conjuring 2 were considered above average by IMDB voters. A Cinemascore of A- may not translate favorably when the more general audience begins to view the film. If, on the other hand, you are really into Christian movies and you were really looking forward to God’s Not Dead 2, Cinemascore is going to be a better indicator of the quality of the movie than IMDB, whose voters may not be representative of your taste in movies.
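
If you want to run the same check yourself, a quick sketch along these lines would do it; the data below is just the A- rows hand-copied from the table above, and the 7.3 cutoff is the above-average line used in the text:

```python
# A- rows from the table above: (title, Cinemascore grade, IMDB average rating)
recent = [
    ("TEENAGE MUTANT NINJA TURTLES: OUT OF THE SHADOWS", "A-", 6.5),
    ("X-MEN: APOCALYPSE", "A-", 7.4),
    ("ALICE THROUGH THE LOOKING GLASS", "A-", 6.4),
    ("CONJURING 2, THE", "A-", 8.4),
    ("NOW YOU SEE ME 2", "A-", 7.2),
    ("BARBERSHOP: THE NEXT CUT", "A-", 6.1),
    ("MY BIG FAT GREEK WEDDING 2", "A-", 6.2),
]

ABOVE_AVERAGE = 7.3  # IMDB rating treated as "above average" in the text

# Which A- movies hold up once the broader IMDB audience weighs in?
holds_up = [title for title, grade, imdb in recent
            if grade == "A-" and imdb >= ABOVE_AVERAGE]
print(holds_up)  # ['X-MEN: APOCALYPSE', 'CONJURING 2, THE']
```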

Cinemascore was created before we had sites like IMDB. It still has its use for “must see” opening weekend moviegoers and movies for unique tastes. Once you get past opening weekend, however, IMDB is probably a better tool for word of mouth feedback.

***

6/17/2016

I’ve entered my final estimate for Finding Dory this morning. The early indicators are that this will be a critical and box office success. I’ve forecast an 85% “really like” probability for it.

Netflix Streaming: The Other Story

When you think about it, the story of Netflix is rather remarkable. They slayed one industry, video rental stores. They no longer exist. Since Netflix began streaming their entertainment properties in 2007, cable TV companies have been struggling to stay relevant. Many millennials have never been cable customers, preferring streaming options like Netflix, and more and more existing cable customers are cutting the cord. Now, Netflix is in a pitched battle with the producers of movie and television entertainment. On February 1, 2013, Netflix premiered House of Cards to rave reviews and signaled their intent to become the first worldwide streaming network of original content, for both TV and cinematic films. Their powerful rivals in this battle are responding (In response to Netflix’ announcement of 600 hours of original programming in 2016, HBO announced 600 hours of their own.) and the outcome is still in doubt, but, if track record means anything, don’t bet against Netflix.

In each of these Netflix-inspired industry revolutions, Netflix has had their finger on the pulse of consumer frustration. They understood how frustrating it was to do business with video stores: the inability to find movies that you would “really like”, the wasted expense when you didn’t get a chance to watch the movie you rented before it had to be returned, and the additional fees for returning the video late. Netflix had an answer for all of these DVD rental frustrations. Netflix understood the frustration of expensive cable bills that supported obscure channels that never got watched, and responded with an internet alternative. And now Netflix is using all of the data they’ve collected about what their customers enjoy watching to produce original content that their customers should enjoy watching.

So, how does all of this Netflix history impact our quest to find movies that we will “really like”? Put simply, the gold standard algorithm used by Netflix-DVD may be an endangered species. Netflix’ vision of creating a worldwide streaming network filled with their own content is being successfully executed, with 75 million subscribers in 190 countries. Of those 75 million subscribers, only 5 million are DVD subscribers. While the DVD business contributes profit to the bottom line of Netflix, it is a part of its past, not its future.

As for the algorithm that Netflix offered $1 million to improve upon, it no longer fits into their plans. Its value to the DVD business is that it helps subscribers find movies that they will “really like” and put them in a queue, so that even if the movies and shows you most want to watch are unavailable, the next DVD in the queue will be one that you will “really like”. On the streaming side, satisfying their customers requires a different strategy. Because of the cost to license movies and shows to stream, and because of the huge investment Netflix has made in original content, the library of entertainment that exists on Netflix is smaller. It was reported last week by Allflicks, a website that tracks what’s available to watch on Netflix, that in a little more than 2 years the number of shows and movies available to watch on Netflix has shrunk by 31.7%. The number of movies available to watch instantly went from 6,404 to 4,335 during that time. So if you go to streaming Netflix with a list of movies in mind that you will “really like”, you are bound to be somewhat disappointed, and Netflix doesn’t want you to be disappointed.

Netflix is a data behemoth. Not only do they collect the ratings that you give to each movie, they know what movies you’ve browsed, when you browsed them, and on what device you browsed them. They know if you started to watch a movie and stopped. From that data they have determined that if a typical viewer doesn’t find something to watch on Netflix within the first 5 minutes of browsing, they will go someplace else to find something to watch. They have created 76,897 unique ways to describe their content. They know which of those 76,897 descriptions will most appeal to you, organize the matching content into rows, and put those rows at the top of your screen so that you will choose to watch one of the movies or shows available on Netflix. They even know not to show you heavy movies like Schindler’s List on a Wednesday night when you’ve just gotten home from work. Yes, it is that creepy.

My recommendation is to use Netflix like any other home viewing entertainment option available. Know what you want to watch before you go there. If they have it, great. If not, go somewhere else to watch it.

 

 

What Shall I Watch Tonight?

The first day of each month is a big day in my obsessive quest to watch movies that I will “really like”. At the end of each month I recalibrate my probabilities and start the next month with a fresh Top Ten Movies to Watch list (see updated list). Here’s the rub: only one of those movies is available for me to watch tonight. It is therefore really a list of the Top Ten Movies to Watch Someday.

I’m adding a new list under my Movie Lists section, Top Ten Movies Available to Watch This Month. Technically, almost any movie I want to watch is available if I’m willing to pay for it. But I do have a budget, and movie watching already claims a significant share of it. So, the movies available for me to watch in a given month are limited to “free” movies available from my cable company (Comcast), HBO, Showtime, Amazon Prime, Netflix, and Netflix DVD (2-a-month limit). My list is made up of the movies that are available to watch this month on these platforms plus two movies from Netflix DVD. I’ll generally use the Netflix DVDs to make a dent in my Someday list. There are also some miscellaneous streaming channels (Crackle, Tubi TV, etc.) that I’ll use on occasion.

Which brings me back to the first day of the month. While each of these platforms will make some weekly additions and deletions to their available movies, there are wholesale changes on the first day of each month. The supply and demand curve for “What Shall I Watch Tonight?” can be radically altered on the first day of each month.

So, “What Shall I Watch Tonight?” I don’t know yet but check out the list. You’ll find it there.