“Sweet Christ this was good” – Analysing recommendations in movie reviews

The author of this post is Julia Mizerski, a B.A. student of English Studies at the University of Bonn, who has written an excellent term paper in the course “The Pragmatics of Digital Communication” (Dr. Stefanie Pohle) in the summer term 2018.

Have you ever watched a movie after reading a very convincing review? Or have you ever wondered how a reviewer manages to get you excited about a movie? Well, I did, and so I chose to explore how the comment “Sweet Christ this was good” on the movie “Brawl in Cell Block 99” (2017) can get me to watch it.

It isn’t really that hard to come up with a research question that combines one of your passions with linguistics (that is, if linguistics isn’t already your passion). After all, everything is connected to language somehow. As my other passion, besides linguistics of course, is movies, I wondered: Do reviews for different movie genres differ from each other? So, to be more precise, I tried to find out whether reviews of the drama “Professor Marston and the Wonder Women” (https://www.youtube.com/watch?v=r991pr4Fohk) differ from reviews of the action movie “Brawl in Cell Block 99” (https://www.youtube.com/watch?v=5hfAExhHTMM).

For this analysis, which was for a term paper in my linguistics class, I focussed on how reviewers recommend that the reader of their review watch a particular movie. In order to analyse these recommendations, I sorted the reviews into different categories. This was important, because not all recommendations look the same. While some are very direct (e.g. “watch it”), others can be very subtle (e.g. “it was so refreshing”).

The collection of reviews that I had a look at (which is called my “dataset”) was taken from “letterboxd” (https://letterboxd.com/), a website which lets its users rate and review movies. As I was looking for recommendations, I selected only positive reviews for the two movies I wanted to focus on. However, there were still too many, so I ended up selecting 50 reviews for my analysis by using a random number generator (https://www.random.org/). This is a tool that helps with the selection process by letting chance decide which cases to choose, so that you as a researcher don’t accidentally prefer one kind of review over others.
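If you want to try this kind of random selection yourself, here is a minimal sketch in Python. This is purely an illustration under made-up assumptions (a placeholder list of reviews, a fixed seed); the selection for my paper was actually done with random.org’s web interface.

```python
import random

# Hypothetical pool of positive reviews collected from letterboxd.
# In a real study these would be the actual review texts.
positive_reviews = [f"review {i}" for i in range(1, 301)]

random.seed(42)  # fixing the seed makes the selection reproducible
sample = random.sample(positive_reviews, 50)  # draw 50 reviews without repeats

print(len(sample))  # 50 reviews, each appearing only once
```

The point of `random.sample` (or random.org) is that chance, not the researcher, decides which reviews end up in the dataset.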

Working with my dataset looked like this: The categorization system I chose included different categories of recommendations. I went through all 50 reviews, looked for phrases that could be seen as recommendations, and put each recommendation in a fitting category of that system.

The most important recommendation strategies in my categorisation system were direct and indirect recommendations, each of which split up into smaller categories again. The direct category included examples that were, as the name already suggests, very direct: if somebody wrote “see this film” in their review, this would be the category for it. The indirect category, however, mostly included hints. An example of this would be the description of a movie as “incredible” in a review.

To show you an example, here is a phrase from one of the reviews I analysed: “This gem of a film flew under the radar in 2017”.

I decided to classify this phrase as a recommendation, and more precisely, as an indirect one, because the reviewer doesn’t tell the reader directly that they recommend the movie.

After sorting through all 50 reviews like this, I looked at how many recommendations fell into each of the categories in my system. What was surprising was that an overwhelming number of the recommendations fell into the indirect category, which mainly consisted of strong and mild hints.

Here are some of the recommendations I classified as strong hints: “absolutely phenomenal”, “a wonder, truly”, “what a great film”, “a satisfying, bloody good time”. The following recommendations, on the other hand, were seen as mild hints: “This is a breath of fresh air”, “I’m so invested”, “I am genuinely so overwhelmed”, “I could easily watch this again tonight”.

Now the difference between these categories might be hard to spot, so here comes a little more explanation. Strong hints were mostly positive words – words that can be seen as positive even in another context (e.g. “phenomenal”). Mild hints on the other hand refer to personal evaluations of the movie. Just because someone is “overwhelmed” by a movie, that doesn’t mean you would be, too. Still, the reviewer tells us that they think the movie is good, so it’s a hint, just not a strong one.
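To make the categorisation procedure a bit more concrete, here is a toy sketch of it as a keyword-based classifier. The keyword lists and the sample reviews are assumptions for illustration only; in the actual study every phrase was read and coded by hand, which is far more reliable than any keyword matching.

```python
from collections import Counter

# Toy keyword lists for each category (illustrative assumptions, not my full system)
DIRECT = ["watch it", "see this film"]
STRONG_HINTS = ["phenomenal", "great film", "incredible"]  # positive in any context
MILD_HINTS = ["breath of fresh air", "so invested", "overwhelmed"]  # personal evaluations

def classify(review: str) -> str:
    """Assign a review to the first matching recommendation category."""
    text = review.lower()
    if any(phrase in text for phrase in DIRECT):
        return "direct"
    if any(phrase in text for phrase in STRONG_HINTS):
        return "indirect: strong hint"
    if any(phrase in text for phrase in MILD_HINTS):
        return "indirect: mild hint"
    return "no recommendation found"

# A few made-up example reviews
reviews = [
    "see this film",
    "what a great film",
    "I am genuinely so overwhelmed",
]
counts = Counter(classify(r) for r in reviews)
print(counts)
```

Counting how many reviews land in each category, as the `Counter` does here, is essentially the step that produced the frequencies discussed in this post.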

So what did my analysis show? Well, mainly that people like to use strongly positive words to recommend a movie, no matter if it’s a drama or an action film. If someone tells you that a film is “great”, “amazing” or “wonderful”, they might want you to watch it. When it comes to mild hints, though, the reviews for the drama are quite far ahead: reviewers tell you that they loved the film and even use heart emojis. Reviewers of the action movie don’t do that very much – or at all.

Here’s the gist of it: People writing movie reviews don’t seem to be very comfortable giving direct recommendations like “Go see this fucking film”; instead, they use hints. And if you look very closely, you might even find differences in how people use hints to recommend specific movies. To come back to my initial question of how a reviewer manages to get me excited about a movie: a hint such as “Sweet Christ this was good” definitely does the job of a recommendation.
