Review System for Films
Greetings, Programs, and Happy New Year!
I know I said in my last post that I would be writing more regularly, though that post was from last September… In truth, though, I have been writing more regularly, just not here! I’ve been working on a short story for quite a while now that has slowly but surely blossomed into a novella, and after peer reviews and editing I am now actively looking into having it published, which is exciting! I’ll post more updates about that story here as I have them. In the meanwhile, though, a new year means new blog posts here! And what better way to start a new year (albeit in March) than looking back and reflecting on the year which just concluded?
Something you may have gathered from this blog or this website is that I really love films (and games, but that’s a post for another time). For the past several years, I have been iterating on a review system for films to cultivate my own rankings of the best films from a given year. What started as a side project to improve my analytical skills has become my own award show of sorts, and because I am proud of that system, I figured I would write about it here!
Before I introduce my system/criteria, allow me to explain how and why I developed it in the first place. I have often been asked (and have asked others) what their favorite movies from a given year are, but the most common response to that question is confusion amidst a struggle to remember what was even released that year. To resolve this for myself, I began writing down when I watched movies, how I watched them (in theaters or at home, digitally or on 35mm film), and my initial thoughts on them, accompanied by a rating out of five stars. I quickly found that I tended to rate films on the top half of the scale, so I adjusted the scale a bit and defined some criteria by which films could be rated on it. The biggest changes wrought by this shift are as follows:
A good movie - the standard enjoyable movie experience, as I define it - is now rated 3 stars. If I envision my rating system as a bell curve, most films should fall into this category, at the center of the scale.
4.5 stars is a technically perfect film, one that I cannot think of any way to improve. I can identify all of its different elements (cinematography, direction, editing, etc.), and all of them are perfectly intertwined to create a seamless experience. If, when thinking about a film, I can finish the sentence, “I loved it, except for ____”, the film does not belong in this category.
5 stars is reserved for the films that speak to my soul, the films I cannot stop thinking or talking about. Many “perfect” 4.5-star films are ones I stop thinking about shortly after watching or otherwise don’t carry with me into my day-to-day life; the 5-star rating is reserved for those which stay in my psyche long after the credits roll.
With this new system in place, I started thinking about how I could break down my analysis of craft and effectiveness further. This quickly snowballed into examining how award shows like the Oscars (the Academy Awards, if you insist on being formal) evaluate films and weighing the benefits and pitfalls of such systems. To explore this further, I decided to choose a few “nominees” at the end of each year and analyze them under a microscope, breaking each of my ratings down into the different components that make up the films themselves: story, acting, cinematography, etc. After rating each nominee (out of 10 possible points) in each category, I combined these scores and ranked the films by their totals to determine the “best” films of a given year. I sought to reduce the subjectivity of my reviews, though I still maintained an element of personal preference.
Hence, the first iteration of my review system was formed in 2022 with the following criteria:
Story
Acting
Cinematography
Creativity
Editing
Music
Sound
After rating each nominee in each category (out of 10 possible points), I decided to introduce a bit of subjectivity by weighting certain categories based on the elements of film that most contribute to my enjoyment. This is my review system, after all, so I decided to add a small dash of my personality to it in this way. If you choose to use these criteria, feel free to change or remove the weights altogether! That said, and with the knowledge that I prioritize story, creativity, and music the most, here are the categories with their weights from the 2022 year:
Story (x5)
Acting
Cinematography
Creativity (x3)
Editing
Music (x2)
Sound
With each category scored out of 10 and the weights summing to 14, these new weights brought the total to 140 possible points, so I rated all my nominees relative to that value. For this first year, the nominees were simply all the films released that year which I watched in some format, 13 in total. On this scale, I rated Babylon the highest (137 points), with a tie for second between The Batman and Everything Everywhere All at Once (135 points). I loved many of the films which scored in the lower half of the nominees, and I largely expected my list’s top results, so I took these as signs that my system was working relatively well.
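For anyone who likes to see the arithmetic spelled out, here is a quick Python sketch of how a weighted total like this comes together and how the nominees then get ranked. The ratings and film names below are made up purely for illustration (they are not my actual scores for any film); the point is just that each category’s rating out of 10 is multiplied by its weight and summed, with the weights adding up to 14 and therefore a maximum of 140 points.

```python
# A minimal sketch of the weighted scoring - the ratings here are hypothetical.
WEIGHTS = {
    "Story": 5,
    "Acting": 1,
    "Cinematography": 1,
    "Creativity": 3,
    "Editing": 1,
    "Music": 2,
    "Sound": 1,
}

MAX_SCORE = 10 * sum(WEIGHTS.values())  # 10 points per category x 14 total weight = 140


def weighted_score(ratings):
    """Combine per-category ratings (each out of 10) into one weighted total."""
    return sum(WEIGHTS[category] * rating for category, rating in ratings.items())


# Hypothetical ratings for two made-up nominees.
nominees = {
    "Film A": {"Story": 10, "Acting": 9, "Cinematography": 10,
               "Creativity": 10, "Editing": 9, "Music": 10, "Sound": 9},
    "Film B": {"Story": 9, "Acting": 10, "Cinematography": 9,
               "Creativity": 9, "Editing": 10, "Music": 9, "Sound": 10},
}

# Rank the nominees from highest to lowest weighted total.
for film in sorted(nominees, key=lambda f: weighted_score(nominees[f]), reverse=True):
    print(f"{film}: {weighted_score(nominees[film])}/{MAX_SCORE}")
```

If you change or remove the weights as I suggested above, the same arithmetic still works; the maximum score simply shifts with the sum of the weights.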
For those who are particularly interested in the big award shows each year (the Academy Awards, the BAFTAs, the Critics Choice Awards, etc.), you may be wondering how I can be confident in my system when its results don’t align with those of the award shows. For example, Everything Everywhere All at Once won Best Picture at the Oscars that year, yet it placed second on my list. I can address this in a few ways:
My system is at least partially subjective in its use of weights for certain categories. Many large award shows do not formally factor in categories at all, instead asking a body of voters to rank-order their preferences for each award. My system is a composite score built from the different categories and is thereby different from the scoring systems of the major award circuits.
Awards are not by any means proof of excellence! Although the major award circuits carry prestige and visibility, their outcomes are often more representative of an agenda than of excellence of craft. There are thousands of lists out there about the worst Oscar winners of all time, yet the films on those lists are still Oscar winners. An award from a major circuit goes a long way and is meaningful in various regards, but it is not a guaranteed marker of excellence. That fact leaves room for additional forms of review and scoring, with true excellence lying somewhere in the middle of all the available options.
After achieving relative success (in my own view) with my system, I carried the same methodology into the next year, maintaining the same categories and weights and again rating films out of a possible 140 points. For 2023, though, I had 18 nominees from the films I watched that year (enough to fill exactly one full page of notebook paper; I didn’t rate every film I saw from 2023, but I chose the 18 I deemed most “worthy” of being nominated). This time, Oppenheimer took home the top prize (140 points - I saw it 6 times in theaters and thought it was virtually perfect in every aspect), followed by Across the Spider-Verse (138 points) and a tie for third between Poor Things and John Wick: Chapter 4 (137 points).
Again, the top films were roughly what I expected, so I figured my system was working more or less as intended. These rankings were largely reflected in the major award circuits as well, with Oppenheimer and Poor Things winning big across the board, for whatever that’s worth. After rating these films, however, I added a sticky note to the sheet of rankings with ideas for improvements for the next year:
Cap # of considerations?
Add preference score (or just ranking)
Other Categories? Set/production design?
Should older movies be considered? Maybe yes and no? Should I do both just to see what happens?
That brings me to the list from this past year, where I made the changes I pondered in 2023. For this new year, I added two categories, which are (with their weights, where applicable):
Set/Production Design
Rewatchability (x0)
The Rewatchability category was added purely for my own curiosity, comparing the number of times I watched a movie throughout the year to its score; as such, it carries a multiplier of x0 to ensure it doesn’t otherwise impact the results. I also decided to include films from previous years which I watched this year in the 2024 list, with the intent of determining the best of all the films I saw this year. I restricted these older nominees to films I watched for the first time this past year (otherwise, watching Inception each year as I do would guarantee it a nomination every year - a feat which has its worth in the right review system but does not match the result I am ultimately aiming for with mine). As a whole, I limited each list to a maximum of 10 nominees: I nominated 8 films released this past year and 10 films from all years, then analyzed only the films across the two lists which I hadn’t seen before 2024.

This year, the nominees were judged against 150 possible points (a nicer number than 140, in my humble opinion), since the new Set/Production Design category adds another 10 weighted points while Rewatchability’s x0 weight adds none. Once more, the results were relatively as I expected: Dune: Part Two took the top prize (150 points), Nosferatu followed in second place (146 points), and Furiosa took the third spot (145 points). A film like Conclave - a film I loved 90% of but whose abrupt ending immensely disappointed me - scored low in the Story category due to that singular gripe while scoring highly in all the other categories. Despite those high scores, it ranked 10th of the 13 nominees (as an aside: I know I mentioned a maximum of 10 just a few sentences ago - the 13 is the aggregate of the two lists I made, one with films released exclusively in 2024 and another with films from any year which I had not seen prior to 2024), a fact that to my mind reflects the importance of the subjective weights added to specific categories.
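In the terms of that earlier sketch, this year’s tweak is just two new entries in the (again, purely illustrative) weights table - one that counts toward the total and one that deliberately doesn’t:

```python
# The 2024 weights: Set/Production Design adds another 10 weighted points
# (bringing the maximum from 140 to 150), while Rewatchability's x0 keeps it
# out of the total entirely - I only track it out of curiosity.
WEIGHTS_2024 = {
    "Story": 5,
    "Acting": 1,
    "Cinematography": 1,
    "Creativity": 3,
    "Editing": 1,
    "Music": 2,
    "Sound": 1,
    "Set/Production Design": 1,
    "Rewatchability": 0,
}

MAX_SCORE_2024 = 10 * sum(WEIGHTS_2024.values())
print(MAX_SCORE_2024)  # 150
```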
I have some thoughts on what to change for next year, but since I am generally satisfied with the current state of the system I have created, I wanted to share my work so far and hear your thoughts on it! Do you like this system? How do you think it could be improved in future iterations? What would you change about it if you were to adapt it for your personal use? As always, I would love to hear your thoughts!
And with that, the search for a publisher (and the grind of another project or two…) continues! I will be posting here more often with any updates I may have as well as other musings like this, so feel free to subscribe to my new email list (!!!) if that interests you! In any event, thank you for reading - I look forward to hearing your thoughts!
- James