A Computational Approach to Analyzing Musical Complexity of the Beatles

Long paper
Authorship
  1. Manuel Burghardt

    Universität Leipzig (Leipzig University)

  2. Florian Fuchs

    Universität Regensburg (University of Regensburg)



Introduction: The Beatles and musical complexity

The Beatles are considered one of the most influential bands of the 20th century and still shape pop music today (Everett, 2001). Over the course of their career, the band demonstrated an enormous range and variety in its compositions. One reason for this unusually large musical diversity is that three people, Paul McCartney, John Lennon and George Harrison, were involved in composing the Beatles’ songs (MacDonald, 1995). In addition, the Beatles’ producer George Martin had considerable influence on the composition of many songs (Martin & Hornsby, 1979).
In this paper we use computational methods to investigate how this diversity in composition is reflected in the musical complexity of the Beatles’ work. Related work includes Mason (2012), who statistically analyzes properties of Beatles songs in order to decipher what he calls the “Beatles genome”. The aspect of complexity has already been investigated by Eerola et al. (2000), who use a MIDI corpus to analyze the relationship between musical complexity in Beatles songs and its effect on chart placement. Eerola et al. (2000) also report a highly significant increasing time trend in melodic complexity, i.e. the Beatles’ songs became melodically more complex over time. While Eerola et al. (2000) only looked at single melodies and their complexity, we present a study that takes into account all available melodies as well as the chords to compute the complexity of a song. We also present an exploratory tool for the interactive visualization of musical complexity distributions in the work of the Beatles. The visualizations can be scaled from all of the Beatles’ albums down to single songs and composers, to investigate complexity at a more detailed level.

Corpus and Method

Corpus – The corpus used for this study is based on guitar tablatures from the online platform Ultimate Guitar (https://www.ultimate-guitar.com/; all URLs mentioned in this paper were last checked Nov. 25, 2018), which were downloaded in Guitar Pro format (https://www.guitar-pro.com/) and converted to MusicXML (https://www.musicxml.com/for-developers/) for analysis. The platform has already been used successfully as a data source for other scientific studies (Di Giorgi et al., 2017). Our corpus comprises 205 songs, ranging from the Beatles’ first single in 1962 to their last studio album in 1970.

Normalization – To be able to compare songs in different scales with each other, we normalize the note inventory according to Cuddy et al. (1981), who propose the Roman Numeral Analysis method. With this method, all notes of a diatonic scale are represented by Roman numerals, starting from the basic note of the scale as step I. Tones that are not part of the scale are marked with a sharp (#) (cf. Fig. 1).

Fig. 1: Example of the classification of tones according to Roman Numeral Analysis, using the scales of C Major / A Minor and D Major / B Minor.
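For illustration, the normalization described above could be sketched roughly as follows. This is a simplified sketch and not the authors’ code: it treats the major-key tonic as step I, ignores enharmonic spelling, and assumes that a relative minor shares the numeral mapping of its relative major.

# Illustrative sketch of the Roman Numeral normalization (not the authors' code):
# pitch classes are mapped to scale steps I..VII relative to a major-key tonic;
# tones outside the scale are prefixed with '#'.
MAJOR_STEPS = [0, 2, 4, 5, 7, 9, 11]             # semitone offsets of steps I..VII
ROMAN = ["I", "II", "III", "IV", "V", "VI", "VII"]
PITCH_CLASSES = {"C": 0, "C#": 1, "D": 2, "D#": 3, "E": 4, "F": 5,
                 "F#": 6, "G": 7, "G#": 8, "A": 9, "A#": 10, "B": 11}

def roman_numeral(note, tonic):
    """Roman numeral of `note` within the major scale built on `tonic`."""
    offset = (PITCH_CLASSES[note] - PITCH_CLASSES[tonic]) % 12
    if offset in MAJOR_STEPS:
        return ROMAN[MAJOR_STEPS.index(offset)]
    # chromatic tone: mark it as a raised version of the scale step below it
    below = max(s for s in MAJOR_STEPS if s < offset)
    return "#" + ROMAN[MAJOR_STEPS.index(below)]

print(roman_numeral("E", "C"))    # 'III'  (scale tone of C Major / A Minor)
print(roman_numeral("F#", "C"))   # '#IV'  (not part of C Major / A Minor)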

Complexity model – To operationalize the concept of musical complexity, we rely on experiments by Krumhansl & Shepard (1979), in which participants were asked to rate how well a given tone completes the tone sequence of a scale. The results show that scale tones (the keynote in particular) are rated higher than non-diatonic tones. Building on this previous work, we define four levels of complexity (cf. Fig. 2). We use this complexity model both for the analysis of single notes and for chord progressions.

Fig. 2: Classification of tones into different levels of expectation-based complexity, using the example of C Major / A Minor.
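Since the exact level assignment is given in Fig. 2 and not reproduced in this text, the following minimal sketch only illustrates one plausible grouping consistent with the Krumhansl & Shepard findings; the grouping itself (keynote, other tonic-triad tones, remaining scale tones, non-diatonic tones) is an assumption, not the paper’s definition.

# Hedged sketch: one plausible mapping from Roman numeral categories to the
# four expectation-based complexity levels; the grouping is assumed for
# illustration only.
def complexity_level(numeral):
    if numeral == "I":
        return 1            # keynote: most expected, lowest complexity
    if numeral in ("III", "V"):
        return 2            # other tones of the tonic triad
    if not numeral.startswith("#"):
        return 3            # remaining diatonic scale tones
    return 4                # non-diatonic tones: least expected

print(complexity_level("I"))     # 1
print(complexity_level("#IV"))   # 4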

Computation of results – Our corpus of MusicXML files is analyzed by means of Python scripts that parse individual notes and chords from the data and count their frequencies. For the recognition of chords, the existing music21 library (http://web.mit.edu/music21/) is used, as it provides many useful functions for the analysis of transcribed music. Because both the tone material and the chord material of each song are analyzed on the basis of the Roman Numeral categories described above, it is necessary to identify the scale of a song. In many songs the scale is explicitly annotated by means of global accidentals and can therefore be extracted directly from the MusicXML transcription. For cases where global accidentals are missing, we apply an existing algorithm for scale detection (Madsen et al., 2007). By assigning notes and chords to the four complexity levels introduced above, a complexity distribution can be calculated for each song and album.
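A rough sketch of this counting step with music21 is given below. It is not the authors’ original script: music21’s analyze('key') applies Krumhansl-Schmuckler key finding, whereas the paper reads annotated global accidentals and otherwise falls back to the scale-detection algorithm cited above, and the level grouping repeats the assumption from the previous sketch.

# Hedged sketch (not the authors' original script): parse one MusicXML
# transcription, estimate its key, and tally the complexity levels of all tones.
from collections import Counter
from music21 import converter

def complexity_distribution(path):
    score = converter.parse(path)                     # one song as MusicXML
    song_key = score.analyze('key')                   # e.g. C major, A minor
    scale_pcs = [p.pitchClass for p in song_key.pitches[:7]]   # steps I..VII
    counts = Counter()
    for element in score.recurse().notes:             # Note and Chord objects
        for p in element.pitches:                     # a Note has a single pitch
            if p.pitchClass not in scale_pcs:
                counts[4] += 1                        # non-diatonic tone
            elif scale_pcs.index(p.pitchClass) + 1 == 1:
                counts[1] += 1                        # keynote
            elif scale_pcs.index(p.pitchClass) + 1 in (3, 5):
                counts[2] += 1                        # other tonic-triad tones
            else:
                counts[3] += 1                        # remaining scale tones
    return counts

# Usage (hypothetical file name):
# print(complexity_distribution("and_i_love_her.musicxml"))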

Results and Discussion

The results can be analyzed statistically to detect general trends in the development of musical complexity in the work of the Beatles. We conducted a Bravais-Pearson correlation test (Pearson, 1895) to investigate how musical complexity has developed over time. For each year, we calculate the frequency of tones and chords that belong to complexity levels 1+2 (rather low complexity) and the frequency of those that belong to complexity levels 3+4 (rather high complexity). For the higher complexity levels, we find a weak positive correlation (cf. Cohen, 1988) for both tones (r = 0.208, p = 0.005) and chords (r = 0.167, p = 0.024). These results indicate a gentle trend toward increased musical complexity over time, but cannot confirm the highly significant correlation reported by Eerola et al. (2000). Our results rather seem to correspond with existing research on the musical development of the Beatles, which does not describe a general complexity trend but instead identifies different phases (of different composers) and individual albums with increased complexity (Everett, 2001). This observation is also illustrated by the following graph (cf. Fig. 3), which does not show a clear trend in the development of complexity levels, but rather fluctuates up and down over time.

Fig. 3: Development of tonal complexity levels (level 1 = low complexity, level 4 = high complexity) for the Beatles’ albums over time. (The graphs for the chord complexity levels largely correspond to the tonal complexity levels; the detailed graphs are available via the online visualization tool.)
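The trend test reported above can be sketched as follows; the sketch assumes per-song observations of release year and the relative frequency of high-complexity (level 3+4) tones, and the variable names and commented example values are illustrative placeholders, not the study’s data.

# Minimal sketch of the Bravais-Pearson trend test, under the assumptions
# stated in the lead-in above.
from scipy.stats import pearsonr

def complexity_trend(years, high_complexity_shares):
    """Correlate release year with the per-song share of level 3+4 tones."""
    r, p = pearsonr(years, high_complexity_shares)
    return r, p

# Usage, e.g.: complexity_trend([1963, 1963, 1964, ...], [0.08, 0.11, 0.09, ...])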

We can also show that a general complexity trend at the album level is problematic, as even a few outlier songs can substantially influence the overall complexity score of an album. This is best illustrated for the album “A Hard Day’s Night”. While Fig. 3 suggests that tones at complexity level 4 increased for the whole album compared to the previous (and also the following) albums, a closer look at the individual songs (cf. Fig. 4) shows that in fact only two of the 13 songs show high complexity at level 4 (“And I Love Her” = 31%; “When I Get Home” = 42%).

Fig. 4: Overview of the distribution of tonal complexity levels (level 1 = low complexity, level 4 = high complexity) for all songs of the Beatles’ album “A Hard Day’s Night” (1964).

These observations reflect the initial notion that the work of the Beatles is extremely heterogeneous, which is also a result of the band’s different composers and their individual musical development. When we look at musical complexity from the perspective of individual composers (cf. Fig. 5), Paul McCartney seems to be the most stable composer, i.e. his complexity curves largely correspond to the overall complexity curves of the Beatles albums (cf. Fig. 3). George Harrison and John Lennon each have several albums where they contribute songs with higher complexity.

Fig. 5: Overview of the distribution of tonal complexity levels (level 1 = low complexity, level 4 = high complexity) for the main composers of the Beatles. Note that George Harrison was not involved as a composer on the albums “Please Please Me”, “A Hard Day’s Night” and “Beatles for Sale”, and composed only 24 songs overall.

The absence of a general trend or temporal pattern in musical complexity led us to present the results of our computational approach in a rather exploratory interface that can be used to look at complexity at different scales. The visualization tool is available online (https://fuchsflo90.github.io/beatles-analysis/#) and can be used to explore complexity for both tones and chords, for any album of the Beatles, filtered by the main composer. In addition to the complexity scores, it also provides information on the frequencies of different scales and rhythms.

Conclusion
In this paper we showcased the application of a computational approach to measuring musical complexity in a corpus of user-generated transcriptions of Beatles songs. We were able to show that musical complexity did not consistently increase over time (only a weak correlation was measured) and that high complexity appears to be a situational phenomenon that occurs in single songs rather than across a whole album. The approach presented here can be considered a case study for further computational studies of musical complexity, adding to the branch of computational musicology within the Digital Humanities. As a next step, we plan to extend the approach to comparative complexity analyses of different bands and genres.

Bibliography

Cohen, J.
(1988). Statistical power analysis for the behavioral sciences. Lawrence Erlbaum Associates Publishers.

Cuddy, L. L., Cohen, A. J., & Mewhort, D. J. K.
(1981). Perception of structure in short melodic sequences. Journal of Experimental Psychology: Human Perception and Performance, 7(4), 869–883.

Di Giorgi, B., Dixon, S., Zanoni, M., & Sarti, A.
(2017). A data-driven model of tonal chord sequence complexity. IEEE/ACM Transactions on Audio, Speech and Language Processing.

Eerola, T., North, A. C., & Le, L.
(2000). Expectancy-Based Model of Melodic Complexity. Proceedings of the Sixth International Conference on Music Perception and Cognition, 1–7.

Everett, W.
(2001). The Beatles as Musicians: The Quarry Men through Rubber Soul. New York: Oxford University Press.

Krumhansl, C. L., & Shepard, R. N.
(1979). Quantification of the hierarchy of tonal functions within a diatonic context. Journal of Experimental Psychology: Human Perception and Performance, 5(4), 579–594.

MacDonald, I.
(1995). Revolution In The Head: The Beatles’ Records and the Sixties. Pimlico.

Madsen, S. T., & Widmer, G.
(2007). Key-Finding with Interval Profiles. Proceedings of the International Computer Music Conference (ICMC).

Martin, G., & Hornsby, J.
(1979). All you need is ears. New York: St. Martin’s.

Mason, D.
(2012). The Beatles Genome Project: Cluster Analysis of Popular Music in R. Book of Contributed Abstracts, The R User Conference (useR!).

Pearson, K.
(1895). Contributions to the Mathematical Theory of Evolution. III. Regression, Heredity, and Panmixia. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 186, 254–317.


Conference Info

In review

ADHO - 2019
"Complexities"

Hosted at Utrecht University

Utrecht, Netherlands

July 9, 2019 - July 12, 2019

436 works by 1162 authors indexed

Series: ADHO (14)

Organizers: ADHO