Movie Music and Its Recognizability
The Setup
The first movie score I remember listening through in full, separate from the movie it was written for, was John Williams’ score to Harry Potter and the Sorcerer’s Stone. Since then, my interest in film scores and, subsequently, my personal collection have grown considerably, leading to my current collection of 116 complete instrumental film scores, plus several individual themes. Typically, a movie will have what is considered the “theme song,” with some of the most iconic being those for Star Wars, Superman, and Pirates of the Caribbean, among others. When comparing the themes that everyone seems to recognize with others that may be less recognizable, I grew curious: what makes some movie theme songs more recognizable than others? I decided to do some research on this, assuming that there had to be some sort of common thread among the most recognizable film themes and another among the least recognizable ones.
I first went through my iTunes library and compiled a list of what I personally considered to be the most recognizable movie themes. I did not set a limit for myself; I only knew that I wanted diversity. I wanted to include originals and remakes (e.g., old Star Trek vs. new Star Trek, old Batman vs. new Batman), several themes by the same composer, and themes from movies made by the same production studio, which is why I used themes from four different Pixar movies. After selecting everything that fit these parameters while remaining what I considered “recognizable,” my list came out to forty-one individual “theme songs.” From there, I selected ten- to twenty-second clips of the main theme from each score and put them all into a playlist. I arranged the clips so that no two themes by the same composer sat next to each other, and likewise for originals/remakes and the Pixar films. The playlist can be found in the following YouTube video:
Once this had been done, I had to decide how I was going to categorize and compare the themes to one another so I could see what made each one recognizable or not. To that end, I created the following survey:
The reasoning behind each question is as follows:

Name, Classification, Birth Year) I will admit that, because the three classes whose data I used were all classes I was in, I only asked for participants’ names so that I could see how my friends did. As for Classification and Birth Year, I ended up not using this data because it proved irrelevant; most participants were born within a five-year period, too little variation to factor into the results.

Title) Asking for an answer whether or not they knew (or thought they knew) the title for sure allowed me to factor in the subconscious. Sometimes you remember things on instinct rather than on knowledge, and I thought it would be interesting to see that in the results.

1) Asking whether they had seen (or thought they had seen) the film they associated with the theme let me see whether recognition of a theme was based on experience or on cultural permeation. As shown in the results, not all of the most recognizable film themes had been seen by everyone surveyed, pointing to a deeper level of cultural permeation.

2-4) While not every film theme ever composed can fit into these categories, I thought that people would be able to fit each theme into at least three of the six without stretching the imagination too much. Having each person sort each theme as they saw fit gave me an opportunity to see how people think; if there was a trend for the more recognizable film themes and an opposite (or simply different) trend for the less recognizable ones, I could attribute it to the categories they fell into.
I visited four college classes in total, though I had to throw out the results from one class because we could not complete the survey before the class ended. In each class, I passed out the two-page survey, explained the rules as outlined at the top of the page, answered any questions, and got started. Playing each clip only once, I gave about twenty to thirty seconds between clips for the students to answer all five questions. The main reason for this pace was to avoid taking up more of the teacher’s class time than necessary, though I certainly wish I could have made it a bit less frantic. Each class took about twenty-five minutes to get through the whole survey, and, with fifteen students in each class, that brought the total number of participants to forty-five.
The Data
Data entry was more complex than I had initially anticipated, mainly because there was so much of it; doing the math, I discovered that, with three classes, fifteen students per class, forty-one film themes, and five questions per theme, the total number of possible cells of data in Microsoft Excel was over nine thousand (15 x 3 x 41 x 5 = 9,225). It took six hours to enter the data for the first class alone, and I worked several hours a day over a span of three days to get the data from all three classes into an Excel spreadsheet. I entered each class separately, sorting the data by film, so the data looked like this:
As I entered the answers for the first two questions (the title of the film and whether or not they had seen it), I used the value “1” for incorrect answers and for people who had not seen the film, and the value “0” for correct answers and for people who had seen the film. When these values were averaged together, the result was a percentage which, as the above picture shows, represents the share of people who guessed the title incorrectly and the share who had not seen the film; flipping the percentage gives the opposite (the correct percentage and the seen percentage). I made a couple of assumptions when entering the data for the second column: I only counted a movie as “seen” if the person put a definite “yes” as their answer, which means I marked it as not seen if they put “maybe,” “not sure,” or simply left the space blank. I did this because I wanted to include only definite “yeses,” in other words, people who were confident in their answer or who knew they had definitely seen the film even if the title escaped them. Aside from averaging the categorical values, I kept track of one more piece of data: how many times someone guessed the wrong movie but managed to guess another movie with a score composed by the same composer, which I factored into the results later.
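For anyone curious, here is a rough sketch in Python of the 1/0 encoding and the percentage “flip” described above. This is not what I actually used (everything was done in Excel), and the example answers are made up.

```python
# Minimal sketch of the 1/0 encoding: "1" marks an incorrect title guess or an
# unseen film, "0" marks a correct guess or a definite "yes" for having seen it.
# Hypothetical helpers and answers, not the actual Excel workflow.

def encode_seen(answer):
    """Only a definite 'yes' counts as seen; 'maybe', 'not sure', or blank do not."""
    return 0 if answer.strip().lower() == "yes" else 1

def incorrect_rate(values):
    """Average of the 1/0 values = share of incorrect/unseen responses."""
    return sum(values) / len(values)

# Example with made-up survey answers for one theme:
seen_answers = ["yes", "maybe", "", "yes", "not sure"]
encoded = [encode_seen(a) for a in seen_answers]   # [0, 1, 1, 0, 1]
not_seen_pct = incorrect_rate(encoded)             # 0.6 -> 60% had not seen it
seen_pct = 1 - not_seen_pct                        # "flipping" gives 40% seen
print(f"{not_seen_pct:.0%} not seen, {seen_pct:.0%} seen")
```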
The Results
Speaking of results, the most recognizable film themes were mostly what I expected, though I was surprised by a couple of exclusions from the top of the list, namely John Williams’ themes to the original 1978 Superman film and to the 1982 film E.T. The Extra-Terrestrial. It is interesting to note that John Williams’ theme for the Harry Potter film franchise is the only theme that was universally guessed correctly and seen, with his theme for the Star Wars saga a close second. In fact, the top three most recognized film themes are all Williams compositions, and the fact that only about half of the participants had seen Jaws shows the depth to which both the film and Williams’ theme music have permeated our culture. The other film theme with a notable drop from percentage correct to percentage seen is Monty Norman’s original theme to the James Bond films. With twenty-three films featuring the character and this theme song to date, the most recent having been released just last year, it is easy to see why so many people were able to guess it correctly.
In comparison, you can see that, for all but two of the least recognizable film themes, fewer than fifty percent of participants said they had seen the film, with Randy Newman’s theme to the Pixar film A Bug’s Life and Alan Silvestri’s theme to Night at the Museum being the exceptions. Also, the only two film themes that nobody was able to guess correctly were James Horner’s theme to The Amazing Spider-Man and John Williams’ theme to War Horse, which is intriguing because it puts John Williams’ themes at both the extreme top and extreme bottom of the list. This list also includes three films whose scores were nominated for Best Original Score at the Academy Awards. What caused this? Now that I knew the results, what conclusions could be drawn?
The Analysis
Looking at the category results for the most recognized film themes, there is quite a bit of variation. For example, in the sad to happy category, four of the six themes were considered happier (higher bars), while the themes for Jaws and Titanic were both thought of as sad. Taking these variations into consideration, I decided it would be more practical to average the four “happier” themes rather than all six values together; the idea was to find the trend rather than the strict average. This led to the values shown: 6.5 in the sad to happy category (moderately happy), 7.6 in the pretty to exciting category (mostly exciting), and 4.8 in the light to dark category (neither one nor the other). The results for the least recognizable films…
…show just as much variation, though, by looking at the typical trend, a couple of basic observations can be made. The sad to happy rating is about the same as it was for the most recognizable film themes at about 6.5 (moderately happy), with a 4.3 in the pretty to exciting category (moderately pretty) and a 3.1 in the light to dark category (mostly light). Comparing the two graphs, the more recognizable film themes are generally more exciting, while the least recognizable themes are considered slightly lighter. Perhaps this suggests that the combination of exciting music with a darker tone makes for a more recognizable film theme, while prettier, lighter music is more forgettable.
However, due to the large variation in categorical ratings from theme to theme, I am unwilling to put all of my faith in these values. There was too much room for error in these results; many people did not provide ratings for every single film on the survey, which meant I had to leave blanks in the spreadsheet. This also means the data is not entirely representative of the whole group, unlike the first two questions (film title and seen/not seen), for which data exists for all forty-five students. What questions can be asked based on these first two questions alone?
Further Questions
It occurred to me that the release year of the movie might be a factor, especially given the general age group of the participants, the majority of whom were born between 1985 and 1994. Using the same lists of most and least recognizable film themes, I noted the release year of each movie (or, for series like Star Wars, the release year of the first movie in the series) and averaged them together; the average release year of the most recognizable themes is about twenty years older than that of the least recognizable. To confirm that this is indeed a correlation, I also averaged the release years of all of the films located between the most and least recognized, which came out to 1996, suggesting that my suspicions are correct. Though the majority of the test group had not yet been born in 1986, these results could suggest that the more time a film and its music have spent in “cultural circulation,” the more likely its theme is to be recognized by the general population, which is why the newer themes, some from as recently as 2012, were not as widely recognized by the participants.
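If you wanted to reproduce the release-year averaging yourself, it boils down to something like the following Python sketch. The titles and years here are placeholders, not my actual lists; for a series, only the first film’s year is used.

```python
# Sketch of the release-year comparison: for a series, only the first film's
# release year is counted; each group's years are then averaged.
# Titles and years below are illustrative placeholders, not the survey lists.

def average_release_year(year_by_title):
    years = year_by_title.values()
    return sum(years) / len(years)

most_recognizable = {"Series A (first film)": 1977, "Film B": 1975, "Film C": 1989}
least_recognizable = {"Film X": 2011, "Film Y": 2012, "Film Z": 2006}

print(round(average_release_year(most_recognizable)))   # ~1980 for these placeholders
print(round(average_release_year(least_recognizable)))  # ~2010 for these placeholders
```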
The final bit of data I took into consideration was the average domestic box office gross of each film. I assumed that the higher a film’s box office gross, the more likely it would be to have permeated the culture; a higher gross means that more people saw the film. For series of films, I first averaged the domestic box office gross of every film in the series to get an average per film, then averaged those figures across all of the most recognizable themes and all of the least recognizable themes, as well as across all of the films in between for confirmation. The results once again seem to support the theory.
Not shown above is the average for the middle films, which came out to $241,059,348, again fitting the correlation. Five of the most recognizable films, or at least films from their series, appear on the list of the top twenty highest-grossing films of all time, while none of the least recognizable films ranks higher than forty-eighth on that same list. This is a correlation that makes sense in that it tracks the number of people who are, in theory, familiar with the film.
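For anyone who wants to follow the box-office math, it is a two-step average: per series first, then per group. Here is a rough Python sketch with placeholder grosses, not the real figures I used.

```python
# Sketch of the two-step box-office averaging: films in a series are averaged
# into one per-film figure first, then the whole group of themes is averaged.
# Grosses below are placeholders, not the actual numbers from the write-up.

def series_average(grosses):
    """Average domestic gross per film within one series."""
    return sum(grosses) / len(grosses)

def group_average(per_theme_grosses):
    """Average across all themes in a group (most/middle/least recognizable)."""
    return sum(per_theme_grosses) / len(per_theme_grosses)

# One standalone film and one three-film series, all placeholder numbers:
theme_grosses = [
    260_000_000,                                               # standalone film
    series_average([300_000_000, 290_000_000, 310_000_000]),   # series, averaged first
]
print(f"${group_average(theme_grosses):,.0f}")  # $280,000,000 for these placeholders
```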
The demographics of the participants varied significantly from class to class; I classified one class as “non-music majors,” the second as “music majors,” and the third, a class about the cultural relevance of superheroes, I affectionately referred to as “the geeks.” I would not have separated these demographics if I had not noticed a significant difference between the classes. The “non-music major” class had the fewest film themes that everyone guessed correctly, with only two, whereas the “music majors” had the most, with six. Additionally, several of the “music majors” left more cells blank than students in the other two classes, suggesting that they were more careful with their answers, which may explain the higher number of themes guessed correctly by everyone in that class. The “geeks” came in second with five themes that everyone guessed correctly; I considered this group more likely to include people like me, who spend more time with movies and their music than the average person.
The Composers
Leaving behind the question of what makes a theme recognizable or not, I did my best to choose selections from a variety of composers, but, of course, John Williams is hard to avoid when you are creating a list of most recognizable film themes, so he appears on the playlist eleven times. As mentioned before, I kept track of how many times someone guessed the wrong movie but still guessed another movie with a score composed by the same composer. I multiplied the number of times a composer appeared in the playlist by forty-five, the number of participants, to find how many chances the group had to guess that composer’s themes, and then divided the number of wrong-film-but-correct-composer guesses by that total to get a percentage for each composer.
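Put as a formula, that percentage is simply the wrong-film-but-right-composer count divided by (appearances x participants). A quick Python sketch with hypothetical counts:

```python
# Sketch of the composer-match percentage: possible guesses = number of times
# the composer appears in the playlist x number of participants; the rate is
# wrong-film-but-right-composer guesses divided by that total.
# The example tally below is hypothetical, not the actual survey count.

def composer_match_rate(appearances, participants, wrong_film_right_composer):
    possible_guesses = appearances * participants
    return wrong_film_right_composer / possible_guesses

# e.g. a composer appearing 11 times for 45 participants, with 40 such guesses:
rate = composer_match_rate(appearances=11, participants=45, wrong_film_right_composer=40)
print(f"{rate:.1%}")  # 8.1% for these hypothetical counts
```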
I have always said that all of Hans Zimmer’s scores sound the same, and now I have proof (kidding)! These percentages are exactly what I would have expected; I have always argued that John Williams’ music is recognizable because he has a distinct sound, whereas Zimmer’s music is recognizable because all of his themes sound the same. Something to note is that Williams’ highest number of incorrect guesses that still named the correct composer was eight, for his theme to Superman, while Zimmer’s themes for Inception and the Christopher Nolan Batman trilogy each had ten such guesses, with most people mixing up the two films…because they sound the same! I mean this mostly in jest, though the results certainly seem to support my argument. Another interesting tidbit from this data is that a number of the people who missed Randy Newman’s themes (A Bug’s Life and Monsters, Inc.) guessed either Toy Story (another Newman score) or some other Pixar film, most often Ratatouille, which was scored by Michael Giacchino. This general sense that Pixar music sounds the same is one reason why I wanted to include multiple themes from films made by the same production company; it shows that people associate similar films with one another, even through their music.
Conclusions
This study was by no means perfect. If I could do it again, I would arrange to have more time to complete the survey with each group. Twenty-five minutes, though it allows completion of the survey, is too rushed and does not give people time to think. Although gut reactions are worth encouraging, I think relaxing the pace just a bit so that the study is not so frantic would benefit it as a whole. In addition, while forty-five people makes for a decent sample, I would prefer more participants, especially from a wider demographic. Forty-five people born within a ten-year window is not representative of the population as a whole, so a wider age range would be ideal.
My personal collection of film music is quite extensive, but, admittedly, there is a large amount of music that I do not own, so a second study would ideally draw on more music. I would also like to remove my own bias by having a sort of “committee” choose the film themes used for the survey. Additionally, creating more categories would give people more ways to classify the music, potentially revealing stronger correlations between the most and least recognizable film themes. Ensuring that everyone fills in every blank on the survey form would also help, so that the data collected is fully representative of the group surveyed. Lastly, eliminating conversation and keeping the survey environment completely quiet would help ensure that everyone’s answers are their own and that everyone has an equal opportunity to hear every clip in its entirety.
Starting out on this research project, I assumed that there was some sort of correlation that made a film’s theme music recognizable or not, and the data collected seems to support this theory. There are several ways that this correlation manifests itself; the category ratings do not provide an obvious relation, but the release years and the average domestic box office gross certainly do. While studying this data, I could not help but think about how interesting all of this was, and I honestly hope that I get the opportunity to perform this survey again on a larger scale.
—————————————————–
I hope you all liked the first post on my new supplemental blog to my ChadLikesMovies review site, ChadTalksMovies. Your feedback would be much appreciated; the future of this blog and what it becomes will rely quite a bit on what you all think!
-Chad