Saturday, April 10th, 2009: Essay #1 Down! Feel free to critique; I turn it in tomorrow.

Started last night at midnight and finished at 6:30am. LOL

The Computer as a Musical Instrument by Kimberly Hall (April 10th, 2009)

 

With the advent of Pierre Schaeffer (1910-1995) as “the first DJ” in 1936 (Barrett 2009, wk1), musical mediums were developed out of necessity for the evolving recording and reproduction industry. This in turn required the mediums to enter a constant state of evolution as well, which led to what Barrett refers to as the birth of the computer musician. In this essay I will explain and justify my process in creating a real-time performance piece using the computer as an instrument.

In Laptop Performance: Techniques, Tools and a New Interface Design, Zadel and Scavone (2006, 2) observe that the emergence of MIDI (Musical Instrument Digital Interface) has made it possible to create interactive computer music that can be controlled in real time: for example, a designated MIDI file playing in direct reaction to a button being pressed on a videogame controller. They go on to note that the computer performance subculture has become prominent over the last 15 years and is now focused on software itself as the primary performance tool, with the computer/controller as the instrument (Zadel and Scavone 2006, 4).
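To make that action-reaction idea concrete, here is a minimal Python sketch of how a note pressed on a controller might trigger a clip in real time. This is not our actual setup (we used Ableton Live); it assumes the third-party ‘mido’ MIDI library, and the port name and note-to-clip mapping are hypothetical:

```python
import mido  # third-party MIDI library; an assumption, not the course software

# Hypothetical mapping from controller notes to clips in our set.
NOTE_TO_CLIP = {60: "Fast Beat.mid", 62: "Beat Crash.aiff"}

with mido.open_input("Axiom 49") as port:   # port name is an assumption
    for msg in port:                        # blocks until a message arrives
        if msg.type == "note_on" and msg.velocity > 0:
            clip = NOTE_TO_CLIP.get(msg.note)
            if clip:
                print(f"trigger {clip}")    # stand-in for actual audio playback
```

The point of the sketch is only the shape of the loop: a physical gesture on the controller arrives as a MIDI message, and the software reacts immediately and predictably.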

In preparation for our performance, Chris Yates and I followed a series of techniques and problem-solving strategies to create several MIDI and audio clips. Our close musical partnership grew out of having too many ideas and no sense of direction, so we opted to divide the work and assign tasks and duties between ourselves instead. As Chris is a guitarist capable of playing chords, he created most of the melodic MIDI clips by way of the Axiom 49 MIDI controller. I, on the other hand, am more logically and analytically inclined, so I recorded audio files and pieced both our clips together into the final piece. In doing this, we were able to quickly create a better-quality final product by way of gestural translation.

After piecing together the bare-bones structure of the piece, we then started key- and MIDI-mapping clips, SENDS and triggers in order to facilitate a professional real-time performance. Chris handled all of the effects, panning and volume with the MIDI controller while also playing the keyboard live, which allowed him to change the envelope and texture of the sounds we created. The computer and I worked in an action-reaction situation: I would manually trigger a sliced MIDI clip, or Live would start it on the next downbeat. Also, because reverb and delay effects were added both to the overall environment of the musical structure and individually to certain cues via the SENDS, the computer’s reaction to our mapped decisions felt all the more intuitive. While performing our piece, we found ourselves moving along with the beat, which supports the idea that when musicians relate to the computer as an instrument, they become physically and emotionally absorbed (Barrett 2009, wk2).
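As an illustration of the “next downbeat” behaviour described above, the following sketch shows one way launch quantization could be computed. The tempo and meter values are illustrative, not taken from our piece, and this is not how Live itself is implemented:

```python
import time

BPM = 120.0                       # illustrative tempo, not our piece's
BEATS_PER_BAR = 4
SECONDS_PER_BAR = 60.0 / BPM * BEATS_PER_BAR

def next_downbeat(trigger_time: float, transport_start: float) -> float:
    """Return the time of the next bar boundary after trigger_time."""
    elapsed = trigger_time - transport_start
    bars_done = int(elapsed // SECONDS_PER_BAR) + 1
    return transport_start + bars_done * SECONDS_PER_BAR

transport_start = time.monotonic()    # when the 'transport' starts rolling
now = time.monotonic()                # the performer hits a mapped key here
launch_at = next_downbeat(now, transport_start)
time.sleep(launch_at - now)           # hold the clip until the bar turns over
print("clip launched on the downbeat")
```

This quantized waiting is what makes the software feel forgiving in performance: a slightly early or late keypress still lands musically on the bar line.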

Ableton (2009) claims to make software for “creating, producing and performing music”, while Barrett (2009) explains that Ableton Live is “loop-based and much more focused on creating dance music but still very flexible and widely used”. Ableton adds that Live “has introduced a new approach to making music with computers on stage and in the studio”, and that the company was founded to realize its founders’ personal vision of a computer-based music-making solution. Furthermore, Emmerson (2000) highlights that the instrument is an extension of the body, concluding that “instrumental gestures are extensions of vocal and physical gestures”. This notion is reinforced by Theberge’s (1997, 172) claim that a DJ experiences the same musical extension through his hands on his equipment that a pianist does through his fingertips on the keys.

 

The findings from the Proceedings of the 14th International Conference on Auditory Display suggest that audio stimulation has a direct connection with the functionality and extension of the hands:

A study conducted by Rauscher, Shaw and Ky stirred broad public interest for the so called ‘Mozart Effect’. During their tests Rauscher et al. investigated the effect of listening to classical music before subjects were taking a spatial reasoning test. They found…college students who listened to 10 minutes of Mozart’s Sonata for two pianos in D major…scored on average 8 – 9 points higher in a subsequent spatial ability IQ test, compared to when they listened to a relaxation tape or to no audio stimulus at all. (Fassbender et al. 2008, 2)

The most interesting finding however was that participants performed best when listening to music on both ears and using their right hand. Their explanation for this outcome is that instrumental music is pre-dominantly processed in the right brain half. This means that listening to and processing the music occupies the right brain half so much that it does not have time (and the necessity as with vocalized music) … to communicate with the left brain half. (Fassbender et al. 2008, 2)

 

Brown and Barrett’s (2009) findings explain that the computer’s presence in music can also contribute to mental processes, and that to use a computer as an instrument, the computer must “contribute to or enhance the musical outcome”, which reflects the idea that the connection between audio stimulation and gestural tools is an important one. In playing through the computer as an instrument, you can rely on it responding predictably to your gestural or symbolic instruction (Barrett 2009, wk2). Brown and Barrett (2009) also state that the requirements for engaging with the computer as an instrument include, among others, familiarity, intuition and involvement.

In conclusion, the information above, combined with our experience of using Ableton Live 7 as a direct gestural tool, shows that music can be created with the computer as a musical instrument. In using Ableton Live 7 to trigger clips and SENDS with hand gestures, feeling the need to dance in place, and becoming physically and emotionally absorbed in our performance, it is safe to conclude that we succeeded in using the computer as a musical instrument.

[Word count 1,018]

  

References

 

Ableton AG. 2009. http://www.ableton.com/live (Accessed 9 April 2009)

Ableton Live 7.0.15. 2009. Berlin: Ableton AG. Software, Mac OS X.

Barrett, L. 2009. KMB105 Music and Sound Technology: Weeks 2 and 4 Lecture Notes. http://www.blackboard.qut.edu.au (Accessed 8 April 2009)

Barrett, L. 2009. KMB105 Music and Sound Technology: Lectures, Weeks 1 and 3-6.

Brown, A. and L. Barrett. 2009. KMB105 Music and Sound Technology: Week 2 Lecture Notes. http://blackboard.qut.edu.au (Accessed 8 April 2009)

Emmerson, S. 2000. Losing Touch?: The Human Performer and Electronics. In Music, Electronic Media and Culture, ed. S. Emmerson, 94-216. Aldershot: Ashgate. https://cmd.qut.edu.au/cmd//KMB105/KMB105_BK_81247.pdf (Accessed 8 April 2009)

Fassbender, E., D. Richards, B. Thompson, A. Bilgin and A. Taylor. 2008. The Effect of Music on Learning in Virtual Environments – Initial Results. In Proceedings of the 14th International Conference on Auditory Display, Paris, France, June 24-27, 2008. Sydney: Macquarie University. http://www.icad.org/Proceedings/2008/FassbenderRichards2008.pdf (Accessed 8 April 2009)

Theberge, P. 1997. Music/Technology/Practice: Musical knowledge in action. In Any sound you can imagine: Making Music/Consuming Technology, 157-185. Hanover: University Press of New England.

Zadel, M. and G. Scavone. 2006. Laptop Performance: Techniques, Tools and a New Interface Design. Music Technology Area Research Publication. Quebec: Schulich School of Music, McGill University. http://www.music.mcgill.ca/~zadel/research/publications/zadel_scavone_icmc2006.pdf (Accessed 9 April 2009)

 

MIDI & Audio Reference List

Fast Beat.mid: C. Yates. 2009. Percussion. Auto Filter (Feedback to SEND)

Weee.mid, Woah.mid and Slow Down Drop.mid: C. Yates and K. Hall. 2009. Around The Head. Auto Filter (Feedback to SEND). Played with feedback from MIDI Controller, created sample, clipped, then looped.

One.mid and Two.mid: C. Yates. 2009. Strat Tones Open. Acoustic Guitar, Dream Acoustic Guitar and Intimate & Colorful. Created sample then used live.

Layer One.mid: C. Yates. 2009. Grand Section/Sustain. Created sample then looped.

G10-Grand Piano.mid: C. Yates. 2009. G10-Grand Piano. Created chords sample then looped.

Kit-Cold Tight Room Stick.mid: C. Yates. 2009. Kit Cold Tight Room Stick. Basic Rock Beat. Created sample then looped.

High Hat Fast.mid: Live. 2009. High Hat Fast clip. Panned to the left.

Beat Crash.aiff: Live. 2009. Beat Crash clip.

1, 12-Double Bass Solo Legato.mid: Live. 2009. 1, 12-Double Bass Solo Legato clip. Flanger Effect. Panned to the left.

2, 12-Double Bass Solo Legato.mid: Live. 2009. 2, 12-Double Bass Solo Legato clip. Flanger Effect. Panned to the left.

Big Bass.mid, Bass1.mid, Bass2.mid, Bass3.mid, Bass4.mid and Ting.mid: K. Hall. 2009. 1, 20-Kit Deeper. Flat Reverb (Depression 1.28 kHz), Flat/Cut Reverb (Depression 4.50 kHz). Created sample then looped and used live.

Live Solo. Instrument: C. Yates. 2009. Dust Devil. Reverb and Simple Delay. Played live via MIDI controller.

Meex2.aiff: Rendered from ‘He’s With Me’ by K. Hall. 2005. Clipped then looped.

Ohhx2.aiff: Rendered from ‘He’s With Me’ by K. Hall. 2005. Clipped, looped then sped up.

Oohhoo ahaa.aiff: K. Hall. 2009. Clipped for time then looped.

Ohh.aiff: K. Hall. 2009. Clipped for time, looped then slowed down.

Ooh.aiff: K. Hall. 2009. Clipped for time, looped then sped up.

InMyNeckx2.aiff: Rendered from ‘Headache In My Neck’ by K. Hall. 2006. Clipped then looped.

Ohhmy.aiff and Ohhmygod.aiff: Rendered from ‘Headache In My Neck’ by K. Hall. 2006. Clipped then looped with feedback from SEND.

Single OMG.aiff: Rendered from ‘Headache In My Neck’ by K. Hall. 2006. Clipped and used live.

Hahh hahh.aiff: Rendered from ‘Headache In My Neck’ by K. Hall. 2006. Clipped, looped, sped up, and then panned to the left.

Ahh.aiff, ThTh.aiff, and ChhChh.aiff: K. Hall. 2009. Clipped, looped and used live.

SEND A: Auto Pan, Flanger

SEND B: Resonators, Reverb, Simple Delay, Crystal Bass, Bass Filter

SEND C: Filter Delay, Grain Delay (Feedback)

Master: Simple Delay (Controlled with MIDI Controller Live)
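For readers unfamiliar with send/return routing, the following toy Python/NumPy sketch illustrates the idea behind the SENDS listed above: each track feeds a shared effect bus at its own send level, and the processed return is summed with the dry tracks on the master. This is not Live’s actual engine, the send levels are made up, and the effect is a generic feedback delay rather than the specific devices we used:

```python
import numpy as np

SR = 44100  # sample rate in Hz

def simple_delay(signal: np.ndarray, delay_s: float, feedback: float) -> np.ndarray:
    """A feedback delay line, the kind of effect sitting on a SEND."""
    d = int(delay_s * SR)
    out = signal.copy()
    for i in range(d, len(out)):
        out[i] += feedback * out[i - d]
    return out

# Two stand-in dry tracks (noise here; in reality, clips from the list above).
drums = np.random.uniform(-0.1, 0.1, SR)
vocal = np.random.uniform(-0.1, 0.1, SR)

send_bus = 0.6 * drums + 0.2 * vocal                       # per-track send levels
master = drums + vocal + simple_delay(send_bus, 0.25, 0.4)  # dry mix + wet return
```

The design point is that one effect chain serves many tracks at once, which is why mapping a single SEND level to a knob on the controller gives such broad real-time control over the texture of the whole mix.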
