Monday, April 10, 2006

Week 6

In Audio Arts this week we spent time in Studio 1, first reviewing how to route signal into Pro Tools. We also looked at monitoring sound through the monitor speakers, through headphones using the headphone amp, and at creating a headphone mix for recording musicians. We also sent signal through all the different rooms as a complete loop, i.e. Studio 2, Dead Room, EMU Space and Studio 1. This was very interesting, and I realized that the studio setup was a lot more flexible than I had originally thought. [1]

In Creative Computing we spent the whole lecture looking at two pieces of software: SPEAR and SoundHack. We had looked at SPEAR before, and I have spent a little time playing around with it; I think this software has the potential to allow for some creative outcomes. SoundHack has some stranger features, which I will explore in due course. The ability to play any file as audio (including non-sound files) is strange and not something I have ever seen before. [2]
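As I understand it, the trick of "playing" a non-sound file is simply to treat its raw bytes as sample data. A minimal Python sketch of that idea (my own assumption about the approach, not SoundHack's actual implementation, and the 8-bit mono format is chosen just for simplicity):

```python
import wave

def bytes_to_wav(in_path, out_path, sample_rate=44100):
    """Interpret the raw bytes of any file as unsigned 8-bit PCM samples
    and wrap them in a WAV header so an ordinary player can open them."""
    with open(in_path, "rb") as f:
        raw = f.read()
    with wave.open(out_path, "wb") as w:
        w.setnchannels(1)             # mono
        w.setsampwidth(1)             # one byte per sample (8-bit)
        w.setframerate(sample_rate)
        w.writeframes(raw)            # the file's bytes become the audio
```

Run on, say, a text file or an image, the result is usually noise, which matches the strange sounds SoundHack produces from non-audio files.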

In Forum we listened to a number of pieces. Once again David Harris had a variety of alternative music. The first was by Edgard Varèse, called Ecuatorial (1934). ('Edgard Varèse', Wikipedia. Accessed 10/04/2006). This piece is scored for bass voice, 4 trumpets, 4 trombones, piano, organ, 2 Ondes Martenot, and 5 percussionists. The brass was very full in this piece and an essential part of the overall sound. The deep male voice gave the piece an operatic feel. Overall I felt the piece had a dramatic, primal quality. I enjoyed this piece and thought it allowed for experimentation while still holding a strong musical sense. [3]

The next piece was by Milton Babbitt: Ensembles for Synthesizer. This was a twelve-tone piece, but to me it sounded more like a bunch of random sounds. Still, I give Babbitt some credit, because the rhythms were serialized as well, i.e. each note had to take a different rhythmic value before any one value was repeated. I could hear how each note was a different pitch, but it was hard to tell whether each rhythmic value was different. Personally, I don't see how you can tell, because there is no regular pulse; or if there is one, it's virtually impossible to follow. I didn't really like this piece, and it didn't sound like music to my ears. [3]
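The no-repeats constraint is easier to see on paper than to hear. A toy sketch (my own illustration of the general serial idea, not Babbitt's actual method) pairs a shuffled row of the 12 pitch classes with a shuffled row of 12 distinct durations, so neither a pitch nor a rhythmic value recurs before every other one has been used:

```python
import random

def serial_rows(pitches, durations, seed=0):
    """Pair a pitch row with a duration row: every value in each list
    is used exactly once before any value could repeat."""
    rng = random.Random(seed)
    pitch_row = rng.sample(pitches, len(pitches))        # each pitch once
    duration_row = rng.sample(durations, len(durations)) # each duration once
    return list(zip(pitch_row, duration_row))

# 12 pitch classes paired with 12 hypothetical duration values (in, say, 64ths)
row = serial_rows(list(range(12)), [1, 2, 3, 4, 6, 8, 12, 16, 24, 32, 48, 64])
```

Listening, you would have to track all twelve durations at once to confirm none repeats early, which is why I found it so hard to verify by ear.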

The next piece was by Barry Truax: Wings of Nike. This was essentially one continuous sound sustained throughout the whole piece. I kept waiting for something to happen, and then it finished. I thought this piece was uninspiring. [3]

In the second half of Forum we spent the whole hour discussing what exactly music technology is. I found this quite interesting, and it was really good to hear points of view not only from lecturers but also from students with different backgrounds. The overall conclusion from the session was that music technology is a hybrid of different disciplines. I agree with this, because music technology is a new and growing field: it evolved from technology and is used to deal with music. Music has always been around, but the technology is new, so the field is a hybrid and will continue to adopt new ideas and innovations as technology grows. [4]

[1] Christian Haines. "Audio Arts Lecture - Studio 1, Studio 2, Dead Room, EMU Space". Lecture presented at the Electronic Music Unit, University of Adelaide, 4th April 2006.

[2] Christian Haines. "Creative Computing Lecture - Audio Lab". Lecture presented at the Audio Lab, University of Adelaide, 6th April 2006.

[3] David Harris. "Music Technology Forum Lecture - EMU Space". Lecture presented in the EMU Space, Electronic Music Unit, University of Adelaide, 6th April 2006.

[4] Stephen Whittington, Mark Carroll, Tristan Louth-Robbins. "What is Music Technology?". Presented at EMU Space, Electronic Music Unit, University of Adelaide, 6th April 2006.