BRIDGES
THE IDEA
While I was taking a musical composition course my junior year of college, my class had an extracurricular opportunity to compose a work with a guest vocalist. There were no requirements except to include some kind of electronic component; what that looked like was up to us. I decided to apply a program I had written as a midterm for the course Computer Music: Sound Representation and Synthesis (CPSC 432), which generates backing tracks based on parameters input by a user. During the performance, I projected a frequency spectrogram to visually display the sounds audience members were hearing in real time.
// concert publicity
THE CODE
When a user first opens the program, they see the following:
Users then have the option to adjust the following parameters:
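The parameters might be collected in a settings block like the sketch below; the variable names and default values are placeholders based on the parameters mentioned later in this write-up (key, tempo, optional extra chords), not the program's actual interface:

// A hypothetical settings block; names and defaults are placeholders.
(
~key    = \e;        // tonal center, as a note symbol
~minor  = true;      // pentatonic minor rather than major
~tempo  = 72;        // beats per minute
~chords = [\I6];     // optional chords beyond the default
)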
Once the user inputs a given key, I use a function called ~keyInMIDI to convert the input symbol into its corresponding MIDI note, then generate a five-octave scale around it using ~scaleGenerator.
Every note produced by the program is in the pentatonic major or minor scale of the given key, unless a user decides to manually add new chords that fall outside the pentatonic scale.
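A minimal sketch of what ~keyInMIDI and ~scaleGenerator might look like; the function bodies here are reconstructions from the description above, not the code used in the piece:

(
// ~keyInMIDI: map a note symbol (e.g. \e) to a MIDI note number.
~keyInMIDI = { |key|
	var names = (c: 0, cs: 1, d: 2, ds: 3, e: 4, f: 5,
		fs: 6, g: 7, gs: 8, a: 9, as: 10, b: 11);
	names[key] + 60;  // place the tonic in the octave around middle C
};

// ~scaleGenerator: build a five-octave pentatonic scale around a tonic.
~scaleGenerator = { |tonic, minor = true|
	var steps = if(minor) { [0, 3, 5, 7, 10] } { [0, 2, 4, 7, 9] };
	(-2..2).collect { |oct| steps + (tonic + (12 * oct)) }.flatten;
};

~scale = ~scaleGenerator.(~keyInMIDI.(\e));  // E minor pentatonic, five octaves
)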
There are four parts in total:
Harp
Flute
Chords
Ocean Waves
The harp repeats a two-measure motif for a third of the piece, then switches to a B section, then returns to the original A section. Both the notes and the rhythms of the motif are randomly selected. Below is the SynthDef (or instrument) I programmed (based on a SynthDef for a guitar) to re-create the sound of a harp:
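As a stand-in for the original, here is a minimal sketch of such a plucked-string SynthDef, assuming a Karplus-Strong approach via SuperCollider's Pluck UGen (a common way to adapt a guitar patch toward a harp-like timbre); the parameter values are illustrative:

(
// A Karplus-Strong plucked string; parameter values are
// illustrative, not the ones used in the piece.
SynthDef(\harp, { |out = 0, freq = 440, amp = 0.3, pan = 0|
	var exciter, sig;
	// a short burst of noise excites the delay line
	exciter = WhiteNoise.ar(amp) * EnvGen.kr(Env.perc(0.001, 0.02));
	// Pluck implements the Karplus-Strong algorithm
	sig = Pluck.ar(exciter, Impulse.kr(0), 1/20, 1/freq, 4, 0.2);
	// a gentle low-pass softens the attack toward a harp-like timbre
	sig = LPF.ar(sig, 6000);
	// free the synth once the string has rung out
	DetectSilence.ar(sig, doneAction: 2);
	Out.ar(out, Pan2.ar(sig, pan));
}).add;
)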
The high, flute-like part plays a sustained random note on the pentatonic scale for two measures. Both the note and the time at which it is played are random. To create the flute sound, I used additive synthesis, adjusting the mix of partials until I found something similar to the flute sounds of Green’s STREET, a piece we studied in class.
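As an illustration of the additive approach, a flute-like tone can be built by summing a few sine-wave harmonics under a slow envelope; the partial weights below are guesses, not the values tuned for the piece:

(
// Additive synthesis sketch: sum weighted sine harmonics.
// The 1/(n*n) weights are illustrative placeholders.
SynthDef(\flute, { |out = 0, freq = 440, amp = 0.2, dur = 4|
	var sig, env;
	// sum the first five harmonics, fading higher partials quickly
	sig = (1..5).collect { |n|
		SinOsc.ar(freq * n) * (1 / (n * n))
	}.sum;
	// slow attack and release for a breathy, sustained quality
	env = EnvGen.kr(Env.linen(0.8, (dur - 1.6).max(0.1), 0.8), doneAction: 2);
	Out.ar(out, Pan2.ar(sig * env * amp, 0));
}).add;
)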
I wanted some kind of chords to be played occasionally in the background, so I reused the function that generates the flute’s sound and applied it to a different synth. I also created a ~transpose function to bring the chords down an octave, which will be helpful in future projects. Using a ~chordFinder function to look through a dictionary of chords, I set the default to a I6 chord that plays on the downbeats of random measures. The sound itself is an adjusted combination of the harp and flute sounds.
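A hedged reconstruction of what ~transpose and ~chordFinder might look like, with a small, illustrative chord dictionary; the voicings are placeholders, not the ones in the actual dictionary:

(
// ~transpose: shift every note of a chord by a fixed interval,
// defaulting to one octave down.
~transpose = { |notes, semitones = -12| notes + semitones };

// ~chordFinder: look up a chord voicing (as semitone offsets above
// the tonic) in a dictionary. Voicings here are placeholders.
~chordFinder = { |symbol, tonic = 64|  // 64 = MIDI E4
	var chords = IdentityDictionary[
		\I6 -> [3, 7, 12],   // first-inversion tonic triad, third in the bass
		\iv -> [5, 8, 12],
		\v  -> [7, 10, 14]
	];
	tonic + chords[symbol];
};

~transpose.(~chordFinder.(\I6));  // the default chord, an octave down
)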
The song begins with a sample of ocean waves that fades in and out. I found the sample on freesound.org, then added an envelope to fade it in and out in time with the song’s tempo, as well as some reverb.
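A sketch of how that playback might look, assuming the sample has been downloaded locally; the file path, fade time, and reverb settings are placeholders:

(
// Load the downloaded wave sample; the path is a placeholder.
~waveBuf = Buffer.read(s, "~/samples/ocean-waves.wav".standardizePath);

SynthDef(\waves, { |out = 0, buf, amp = 0.4, fadeTime = 8|
	var sig, env;
	sig = PlayBuf.ar(2, buf, BufRateScale.kr(buf), loop: 1);
	// a slow sine envelope fades the sample in and back out;
	// fadeTime would be derived from the song's tempo
	env = EnvGen.kr(Env.sine(fadeTime * 2), doneAction: 2);
	// light reverb blends the sample into the texture
	sig = FreeVerb.ar(sig, mix: 0.3, room: 0.7);
	Out.ar(out, sig * env * amp);
}).add;
)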
To play the output, the following is run:
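As a hedged sketch of that launch step, each part could be driven by a Pbind on a shared TempoClock; the patterns and values below are placeholders, not the piece’s actual launch code:

(
var clock = TempoClock(~tempo / 60);
Pbind(
	\instrument, \harp,
	\midinote, Prand(~scale, inf),     // notes from the pentatonic scale
	\dur, Prand([0.5, 1, 1.5], inf)    // randomly selected rhythms
).play(clock);
Pbind(
	\instrument, \flute,
	\midinote, Prand(~scale, inf),
	\dur, 8                            // one sustained note per two measures
).play(clock);
Synth(\waves, [\buf, ~waveBuf]);
)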
Here is a sample output with all of the randomness applied:
THE MUSIC
The score for this piece is based on Sol LeWitt’s Wall Drawing #786A, displayed on a wall of the Yale University Art Gallery in New Haven, Connecticut.
// Wall Drawing #786A by Sol LeWitt, Yale University Art Gallery
Using the same figures LeWitt uses to denote the components of the artwork pictured above, I created instructions for a musician to follow. I first divided the artwork into 28 squares, then picked one curved line for each performer. The following is the score Jennifer and I read while performing:
In terms of notes, dynamics, and phrasing, I wrote the following for a performer:
The following note is also written for a performer:
The pre-generated backing track will provide the tonic E, and will continue to provide an E minor sonic landscape. Around every six seconds the electronic track will denote the start of a new phrase; with 28 phrases, one per square of the score, the entire piece should be around 2 minutes 54 seconds.
The singer and I decided that E minor would be the key best suited to her voice. This project was my first composition ever performed live, and I hope to continue combining code with instrumental performance going forward.
Original score PDF available upon request.
/* Bridges is a piece of music initially performed on voice and violin, but is intended for any two instruments with computer. The piece’s graphic score is based on one of my favorite works of art at the Yale University Art Gallery by Sol LeWitt. The generative backing track was programmed in SuperCollider. */
This piece was premiered at the Yale College New Music: Re-Mapped and Re-Wired concert at the Yale Center for Collaborative Arts and Media in April 2023, with mezzo-soprano Jennifer Beattie and myself on violin.