110010 - A Fanfare for a Digital Age

This Fanfare was commissioned by Professor Ian Pyle for the 50th anniversary of Computer Science at the University of York. Appointed on 1st January 1973, Ian was the Founding Professor of Computer Science. He offers the Fanfare as a gift to the Department.

Composer (and computer scientist) David Keeffe says: "The initial request specified that it 'was to be played by a computer'. When creating this work, I had to consider the musical and technical aspects of the request. It had to make sense musically, regardless of what technology was used, and because it was to be much more than a few bars of trumpet call, it needed a narrative that could be related to the occasion."

David Keeffe originally studied music at the University of York. After graduating with a BA in 1979, he was offered a DPhil studentship in the Department of Computer Science, which he completed in 1984. David moved to Australia in 1997, graduating with an MMus in 2006 and a PhD in 2018 in music composition, both from the University of Melbourne.

Find out more about David Keeffe's work: go to https://www.music3149.com/

Watch https://youtu.be/aB14XxUbyaw on YouTube

110010 - A Fanfare for a Digital Age

Program Notes

David Keeffe BA, DPhil York, MMus, PhD Melb, FTCL

Composer (and computer scientist) David Keeffe originally studied music at the University of York, and after graduating with a BA in 1979, was offered a DPhil studentship in the Department of Computer Science, which he completed in 1984. After moving to Australia in 1997, David graduated with an MMus in 2006 and a PhD in 2018, both in music composition, from the University of Melbourne. He writes:

Suffice it to say I am very grateful to the department, and Ian Pyle in particular, for taking what must have been a gamble on a computing outsider. That gamble has paid off for me: I developed a career in IT working for various large organisations before moving to Australia to become an independent consultant.

This fanfare was commissioned by Ian Pyle for the 50th anniversary of Computer Science at the University of York and is offered as a gift to the Department.

The initial request specified that it “was to be played by a computer”, so when creating this work, I had to consider the musical and technical aspects of the request. It had to make sense musically, regardless of what technology was used, and because it was to be much more than a few bars of trumpet call, it needed a narrative that could be related to the occasion.

As one might expect, the fanfare starts and ends with the sound of heralding brass, the opening representing a small number of players and the ending a large ensemble, symbolising the growth of the department over the years. In between, the passage of time is represented by melodic ideas and sonorities that allude to some hit songs of the initial decades since the department’s founding. The choice of decade was also guided by how musical technology developed over that time. The listener is invited to identify the actual allusions!

Making the fanfare “played by a computer” was an interesting challenge if it was to be more than simple playback of a digitised recording of human performers. The other extreme would have been the development of a completely bespoke program to render the musical ideas. The compromise I landed on was to deliver a digitised recording that could be played back through a wide variety of devices (after all, the piece only represents about 3.5 minutes in a much larger celebration), but whose creation would only be practical using computer technology – and which, to a small extent, uses algorithmic methods to generate musical ideas.

I’ve always been a notation-based composer, so the initial composition was notation-based – but managed on a computer. Eventually, as the fanfare developed, I was in a position to export the notational ideas as a MIDI (Musical Instrument Digital Interface) file. MIDI is a control protocol – it defines what notes to play and how to play them (attack, duration, etc.) but not how they will actually sound.
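To make the distinction concrete, here is a minimal sketch (not part of the actual production workflow) of how a MIDI "note on"/"note off" pair is just a handful of control bytes – pitch, velocity and channel, but no audio at all:

```python
# Sketch only: raw MIDI channel-voice messages are three bytes each.
# Status byte 0x90 = note on, 0x80 = note off; low nibble is the channel.

def note_on(channel: int, pitch: int, velocity: int) -> bytes:
    """Build a raw MIDI note-on message (channel 0-15, pitch/velocity 0-127)."""
    return bytes([0x90 | channel, pitch, velocity])

def note_off(channel: int, pitch: int) -> bytes:
    """Build a raw MIDI note-off message (release velocity 0)."""
    return bytes([0x80 | channel, pitch, 0])

# Middle C (MIDI pitch 60) struck firmly on channel 0: six bytes in total,
# and not one byte of audio - the receiving device decides the sound.
msg = note_on(0, 60, 100) + note_off(0, 60)
```

The same six bytes sound completely different depending on which generator receives them, which is exactly why the choice of sounds (described below) matters.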

Each exported musical line was imported into a DAW (a Digital Audio Workstation) and specific sounds were assigned to each track. The sounds used in this fanfare represent actual brass instruments and simulations of rock/pop sounds from the 1970s onwards: all of this has become possible with the advent of extremely powerful personal computing resources unheard of when the department was founded. The composition and mixing all took place on a quad-core Apple iMac with 40 GB of RAM and nearly a terabyte of solid-state disk. Compare that to the computer which hosted the York PULSE project, which astounded us with 10 megabytes on a disk about the size of a large pizza pan.

Let’s consider some more details of “played by computer”.

Sound generators are now almost all based on a well-defined API, and generate digital audio for playback, but how that audio is created is dependent on the internals of the generator.

It’s probably safe to say that most modern generators use so-called sampled instruments: the sounds of an actual instrument played by an actual human are carefully recorded and then filtered and adjusted so that requests for actual pitches are satisfied with a reasonably realistic sound. The degree of realism depends largely on the size and quality of the original sample set and how it is managed in real time.
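One small sketch of what "filtered and adjusted" involves: in equal temperament, replaying a recorded sample at a different speed shifts its pitch, and the required playback rate follows directly from the semitone distance. (This illustrates the principle only; real sample libraries use many recordings per instrument precisely to avoid large shifts.)

```python
# Sketch: the playback-rate adjustment behind simple sample pitch-shifting.
# Pitches are MIDI note numbers; each semitone is a factor of 2**(1/12).

def playback_rate(recorded_pitch: int, target_pitch: int) -> float:
    """Rate at which a recorded sample must be replayed so that it
    sounds at target_pitch instead of recorded_pitch."""
    return 2.0 ** ((target_pitch - recorded_pitch) / 12.0)

# A sample recorded at A4 (MIDI 69) replayed an octave higher (MIDI 81)
# must run at exactly twice the speed - and sounds audibly "chipmunked",
# which is why large sample sets record every few semitones.
rate = playback_rate(69, 81)
```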

However, some generators are algorithmic: code simulates the analogue components (oscillators, filters, noise generators) of the synthesisers and organs of the 1960s, ’70s and ’80s, and the user interface allows the musician to adjust various parameters. Such a concept is not new: Music I to Music IV from Bell Labs (where else?), dating back to the late 1950s, were essentially programming languages for sound synthesis. These systems have successors, including Csound from MIT and others.
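At its simplest, an algorithmic generator is just code computing one audio sample per tick of the sample clock. A minimal sketch of a digital stand-in for an analogue sine oscillator, with the frequency and amplitude as the user-adjustable parameters:

```python
import math

def sine_oscillator(freq_hz: float, sample_rate: int = 44100,
                    duration_s: float = 0.01, amplitude: float = 0.8):
    """Sketch of an algorithmic generator: compute digital audio samples
    of a sine wave directly, rather than replaying a recording."""
    n = int(sample_rate * duration_s)
    return [amplitude * math.sin(2.0 * math.pi * freq_hz * i / sample_rate)
            for i in range(n)]

samples = sine_oscillator(440.0)  # concert A
```

Real simulations of vintage hardware layer many such components (detuned oscillators, filters with resonance, envelopes), but the core idea is the same: the sound exists only as computed numbers.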

One generator, Pianoteq (not used in this composition), even models the physical parameters of various keyboard instruments and produces a startlingly realistic result.

The actual sounds come from a number of digital generators:

  • Brass
    • NotePerformer 3 from Wallander Instruments
    • Symphony Brass from Native Instruments
    • Personal Orchestra from Garritan
  • Synth Lead 1 (MiniMoog D)
    • Model*E from Steinberg
  • Synth Lead 2 (Oberheim X)
    • OB-Xd from discoDSP
  • Theremin
    • Resonator II from Michael Bietenholtz
  • Rock Organ
    • Kontakt Factory “Rock Basic” from Native Instruments
  • Lead Guitar 1
    • Goliath “Overdriven GTR” from EastWest
  • Lead Guitar 2
    • Kontakt Factory “Elektrik Guitar” from Native Instruments
  • Electric Bass
    • VB-1 from Steinberg
  • Percussion
    • Kontakt Factory “Central Stage Kit” from Native Instruments
    • Kontakt Factory “Orchestral Timpani” from Native Instruments
    • Kontakt Factory “Orchestral Percussion” from Native Instruments
    • Personal Orchestra “Cymbals” from Garritan
  • Effects
    • Reverb “Raum” from Native Instruments