Monday, March 27, 2006

Week 4

This week in Studio 2 we spent more time on routing signals to different places. Understanding how this studio works is starting to get a bit tricky for me, and I still need a lot more practice with the digital mixing desk. [1]

In Creative Computing we spent most of the time looking at Peak. I have used Peak a fair bit in previous courses, so this class was more of a refresher on the basic features. I found it helpful when Christian went through normalising. I have used normalising a lot but was never 100% sure how it was affecting my sound files, so it helped to learn exactly what normalising does to a sound. The Blending effect was also very helpful; this was a feature I hadn’t used before. [2]
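To help cement what normalising actually does, here is a rough Python sketch of peak normalisation (purely my own illustration, nothing to do with Peak’s actual implementation):

```python
import numpy as np

def peak_normalise(samples: np.ndarray, target_db: float = 0.0) -> np.ndarray:
    """Scale a signal so its loudest sample reaches the target level (in dBFS).

    Normalising is only a gain change: the waveform's shape, and therefore
    the relative dynamics within the file, stay exactly the same.
    """
    peak = np.max(np.abs(samples))
    if peak == 0:
        return samples  # silence stays silent
    target_amplitude = 10 ** (target_db / 20.0)  # e.g. -1 dBFS -> ~0.891
    return samples * (target_amplitude / peak)

# A quiet 440 Hz sine wave brought up to -1 dBFS
quiet = 0.1 * np.sin(2 * np.pi * 440 * np.arange(44100) / 44100)
loud = peak_normalise(quiet, target_db=-1.0)
print(np.max(np.abs(quiet)), np.max(np.abs(loud)))  # ~0.1 -> ~0.891
```

The main thing I take from this is that normalising doesn’t change the sound itself, just how loud the whole file is.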

We didn’t have a guest speaker this week. Instead, David Harris played a number of different pieces, some of them his own compositions from the early 90s. I have noticed that whenever we listen to music in Forum (this week and in previous weeks), the pieces are always weird, abstract works that in my opinion lack structure, feeling, and musical convention. It is sometimes good to break the boundaries of musical convention and produce music that is original, but only if it sounds good. Music is subjective, so there is no “correct” opinion on how music is supposed to sound, but I have to say that I haven’t liked many of the pieces that have been presented to us.

Music is supposed to evoke emotion in the listener, and I feel that these pieces have so far been purely for the purpose of intellectual study. As a student of Music Technology I have had some experience experimenting with electronic music (I still have a long way to go), and I can appreciate innovative ideas (such as bowing a bass guitar, feedback, etc.). But the musicians we’ve listened to who have tried these new ideas don’t seem to have thought about playing a really nice-sounding piece of music. The pieces are very repetitive, with random sounds, and they don’t “touch me” or entice me to keep listening. There’s no anticipation, which I believe is very important in music. In other words, these pieces put me to sleep.

The first piece we listened to was “Surf Music 2” by Jack Vees.[3] This piece was very repetitive and not very musical. It also went for an extremely long time (I didn’t time it, but another student told me it was 21 minutes). The instruments used were bass guitar and a synthesizer (I think it was an analogue synth). I could hear a fair bit of modulation and filtering throughout the piece. I also heard some feedback, which produced an interesting sound.

The next piece was “Fog Tropes II” by Ingram Marshall. This was written for string quartet, and I really enjoyed the sound of it. It had some really nice-sounding strings with plenty of dynamics. There was also an airy, ambient background sound and a nice sharp bass, which I thought sounded great. There were other sounds too, like bird calls, beeps, and possibly a voice through a vocoder.

David Harris played us two of his own compositions. The first was for piano, performed on the low end of the instrument: a repeating bass line with some variations, played quite loudly. My favourite part was the very end, when the piece finished and the harmonics of all the notes played previously echoed throughout the room. It was a really nice ambient sound. [4]

The next piece by David predominantly featured strings, along with other sounds like vocal breathing (a hiss sound) and what sounded like doors opening and closing in the background. This piece was broken into three sections and ended with a long glissando. Overall I did like this piece.[5]

[1] Christian Haines. 'Audio Arts Lecture - Studio 2'. Lecture presented at the Electronic Music Unit, University of Adelaide, 21st March 2006.

[2] Christian Haines. 'Creative Computing Lecture - Audio Lab'. Lecture presented at the Audio Lab, University of Adelaide, 23rd March 2006.

[3] Jack Vees. 'Biographical Material', Leisure Planet Music, http://www.leisureplanetmusic.com/composer/vees/bio.htm (Accessed 27/03/2006)

[4] David Harris. 'Elder School of Music', The University of Adelaide, http://www.music.adelaide.edu.au/staff/composition/david_harris.html (Accessed 27/03/2006)

[5] David Harris. 'Music Technology Forum Lecture - EMU Space'. Lecture presented in the EMU Space, Electronic Music Unit, University of Adelaide, 23rd March 2006.

Monday, March 20, 2006

Week 3

In Audio Arts this week we spent time learning about the patchbay in Studio 2. The patchbays we are using are Behringer PX3000s. [1] The ability to switch between normalled and half-normalled connections is something I haven’t seen before, and I didn’t know that patchbays had thru connections prior to this lecture. The digital mixing desk that we are using is the Yamaha 01v.[2] This digital desk has a lot of features, one strange feature being the track on/off switch. I think this function is equivalent to making a track inactive in Pro Tools, but I won’t know until I start using the desk. [3]

In Creative Computing we learnt about sound file formats and the type of information that is stored with different formats. XML and MusicXML were new to me; this kind of markup can form part of a sound file’s metadata. We also went through a program called Sample Manager, which is used to manage and organise samples. It can store loop points, markers and regions, which will be very useful when I start dealing with lots of sound files. [4]
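As a quick illustration of the kind of information a sound file carries, here is a small Python sketch (my own example, not something from the lecture or from Sample Manager) that reads the basic format data out of a WAV file’s header; the file name is just a placeholder:

```python
import wave

# "example.wav" is a placeholder path, purely for illustration
with wave.open("example.wav", "rb") as wav:
    print("channels:    ", wav.getnchannels())
    print("sample width:", wav.getsampwidth(), "bytes")
    print("sample rate: ", wav.getframerate(), "Hz")
    print("frames:      ", wav.getnframes())
    print("duration:    ", round(wav.getnframes() / wav.getframerate(), 2), "s")
```

As far as I understand it, extras like loop points and markers are stored in additional chunks alongside this basic format information, and that is the sort of metadata Sample Manager reads and edits.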

In Forum this week we spent time with David Harris listening to alternative music. One of the pieces was Symphony No. 3 (“Gloria”) by Glenn Branca. Overall I found this piece very repetitive. I could hear lots of different sounds emerging from different parts of the stereo space, and it was interesting to hear sounds blending in and then fading out. I feel this piece would have interested me more if it had sounded more musical and had some structure, but I did get something out of listening to it because it presented different techniques of music/sound production. In another piece, by Robert Ashley,[5] I noticed some analogue synth sounds like LFOs running through filters, resonance, and maybe FM synthesis, ring modulation or a sampler. This piece had a lot of spoken word, robotic sounds, and background beeps. [6]

Our guest speaker this week was Gordon Monro.[7] Monro’s background is in mathematics, although more recently he obtained a Masters degree in music composition. He presented a creation of his that used a system to generate music, which is known as “meta-synthesis”. He had shapes on a screen where different colours represented different pitches: red was a low pitch, blue a high pitch, and green sat in between. As the shapes morphed into one another, the sounds changed accordingly. The mutation rate starts off slow and gradually builds up to an eventual rapid change. The installation lasts a long time and the sounds take a long time to change, so I think someone listening to it would hardly notice the sound changing because the transformation is so gradual.
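To get my head around the colour-to-pitch idea, here is a little Python sketch of how I imagine such a mapping could work (this is purely my own guess, not Monro’s actual system):

```python
def colour_to_frequency(red: float, green: float, blue: float,
                        low_hz: float = 80.0, high_hz: float = 1200.0) -> float:
    """Guess at the mapping described: mostly red -> low pitch,
    mostly blue -> high pitch, green sitting in between."""
    total = red + green + blue
    # Position between 0 (all red) and 1 (all blue); green counts as 0.5.
    position = (green * 0.5 + blue * 1.0) / total if total > 0 else 0.5
    # Interpolate on a logarithmic scale so equal colour steps sound like equal pitch steps.
    return low_hz * (high_hz / low_hz) ** position

print(colour_to_frequency(1.0, 0.0, 0.0))  # pure red   -> 80 Hz
print(colour_to_frequency(0.0, 1.0, 0.0))  # pure green -> ~310 Hz
print(colour_to_frequency(0.0, 0.0, 1.0))  # pure blue  -> 1200 Hz
```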

Monro also presented an abstract video called “Red Grains”. This video had different patterns that represented sounds created using granular synthesis. This creation was different from the morphing shapes and produced different sounds.[8]

[1] Zzounds. 'Behringer PX3000 48-Point Balanced Patchbay', Zzounds.com, http://www.zzounds.com/item--BEHPX3000 (Accessed 13/03/2006)

[2] Yamaha. 'Digital Mixers'. Yamaha Corporation of America, http://www.yamaha.com/yamahavgn/CDA/ContentDetail/ModelSeriesDetail/0,,CNTID%253D1483%2526CTID%253D,00.html (Accessed 13/03/2006)

[3] Christian Haines. 'Audio Arts Lecture - Studio 2'. Lecture presented at the Electronic Music Unit, University of Adelaide, 14th March 2006.

[4] Christian Haines. 'Creative Computing Lecture - Audio Lab'. Lecture presented at the Audio Lab, University of Adelaide, 16th March 2006.

[5] Robert Ashley. 'Robert Ashley', Lovely Music Catalog by Artist, http://www.lovely.com/bios/ashley.html (Accessed 13/03/2006)

[6] David Harris. 'Music Technology Forum Lecture - EMU Space'. Lecture presented in the EMU Space, Electronic Music Unit, University of Adelaide, 16th March 2006.

[7] Gordon Monro. 'Gordon Monro Electronic art & music', Gordon Monro, http://www.gordonmonro.com/ (Accessed 13/03/2006)

[8] Gordon Monro. 'Music Technology Forum Lecture - EMU Space'. Lecture presented in the EMU Space, Electronic Music Unit, University of Adelaide, 16th March 2006.

Monday, March 13, 2006

Week 2

In Audio Arts this week we started spending time in Studio 2. It was good to see a Pro Tools HD system with a pretty good recording space. I really like the way the studios have been set up so that you can communicate between the different studio spaces. I would have liked to see more inputs and outputs, but this is something I can work around when recording by being selective about the number of microphones I need in a session. [1]

In Creative Computing we started looking at OS X and how it relates to audio and MIDI. Even though I have already spent a lot of time using OS X, I found there are some preferences I have overlooked in the past (like the alert sounds). I look forward to learning more about Soundflower; this software sounds interesting and I have already started thinking about ways I could use it. OMS and Opcode are new to me: I have seen the names before but I don’t know what they refer to. I’m sure this will be a topic of study later on. [2]

Our forum speaker this week was Warren Burt.[3] I couldn’t believe some of the things Warren has invented over the course of his career so far. Even though music technology is a relatively new field of study, Warren has broken its boundaries, finding creative and innovative ways for music technology to be expressed both as a technical form and as an art form. I really appreciated how he has created music using raw materials and non-musical devices such as a calculator, a radio, old guitar strings, old computers, a theremin and more.

One creation of Warren’s, which I especially liked, was his 5-Pound Synth. This included a calculator, radio, sampler and an effects unit. I am not entirely sure how it works, but I think the calculator sent a digital signal to the radio, which in turn sent a specific frequency to the sampler. The sampler would then send its sound into the effects unit, which would shape it into a new, polished sound. This concept is not something I have seen before, and Warren has opened my eyes to a new way of thinking.

The Aardvark synthesisers are also great creations. The Aardvark 4 had 16 digital-to-analogue converters and produced different, unpredictable sounds using granular synthesis. I don’t know a lot about granular synthesis, except that sounds are created by chopping a sample into small segments and playing them back at different intervals, expanding or compressing the sound (I think). The only synthesiser I have used that works this way is the Malström in Propellerhead’s Reason. Overall Warren’s ideas were very innovative and I really learned a lot.
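To check my own understanding of granular synthesis, here is a rough Python/NumPy sketch of the basic idea as I understand it (my own illustration, nothing to do with the Aardvark instruments or Malström specifically): short windowed grains are taken from a source sound and overlapped back into an output buffer, which can stretch or compress the original.

```python
import numpy as np

def granulate(source: np.ndarray, grain_len: int = 2048, n_grains: int = 400,
              stretch: float = 2.0, seed: int = 0) -> np.ndarray:
    """Very basic granular resynthesis: scatter short, windowed grains of the
    source across an output buffer that is `stretch` times longer."""
    rng = np.random.default_rng(seed)
    window = np.hanning(grain_len)                 # fade each grain in and out
    out_len = int(len(source) * stretch)
    out = np.zeros(out_len + grain_len)

    for _ in range(n_grains):
        read = rng.integers(0, len(source) - grain_len)        # where the grain comes from
        write = int(read * stretch * rng.uniform(0.9, 1.1))    # roughly where it lands
        write = min(write, out_len - 1)
        out[write:write + grain_len] += source[read:read + grain_len] * window

    peak = np.max(np.abs(out))
    return out / peak if peak > 0 else out         # normalise to avoid clipping

# Example: turn one second of a 220 Hz sine into a two-second grainy texture
t = np.arange(44100) / 44100
texture = granulate(0.5 * np.sin(2 * np.pi * 220 * t))
```

Even in this toy version the result is unpredictable, because the placement and density of the grains matter as much as the source material.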

Towards the end of the session Warren showed how he uses the software Plogue Bidule. Again, this idea of merging random musical tones with graphical dots and other patterns gave me some ideas about how I could do the same in the future. This was something I had not thought about before, and it was great to see how software can be used to create a creative final product. [4]

[1] Christian Haines. 'Audio Arts Lecture - Studio 2'. Lecture presented at the Electronic Music Unit, University of Adelaide, 7th March 2006.

[2] Christian Haines. 'Creative Computing Lecture - Audio Lab'. Lecture presented at the Audio Lab, University of Adelaide, 9th March 2006.

[3] Tropicapricorn Global Arts. 'Warren Burt', Tropicapricorn, http://www.warrenburt.com/ (Accessed 13/03/2006)

[4] Warren Burt. 'Music Technology Forum Lecture - Schultz 1004'. Lecture presented in Schultz level 10, Electronic Music Unit, University of Adelaide, 9th March 2006.

Week 1

In the first week of Audio Arts we went through the course objectives. There wasn’t much time spent in Studios 1 and 2 (the primary studios for first semester). The main point that stuck in my mind was to make sure I don’t take any food or drink into the studio spaces or the computer labs. [1]

In Creative Computing we spent most of the time exploring the Electronic Music Unit (EMU) website, which is new for 2006. I had spent some time before the lecture reading about the different studios for this year. The studio that particularly interested me was Studio 3, mainly because of the different keyboards, MIDI controllers and the electronic drum kit inside the space. I look forward to spending time in all the studios. [2]

Robert Minard came to speak to us about his life experience and career in music technology. This was our first forum session for the year, and Robert taught me a lot about the industry. In fact, I had never even heard of sound installation before this session and had no idea that anyone practised it. I found it interesting to hear about his travels to Germany and how he has spent time promoting his work in various countries. I found the sound installations fascinating, especially the way he mixed the natural environment (like plants and water) with speakers to produce musical sounds in a natural setting. [3]

[1] Christian Haines. 'Audio Arts Lecture - Studio 2'. Lecture presented at the Electronic Music Unit, University of Adelaide, 28th February 2006.

[2] Christian Haines. 'Creative Computing Lecture - Audio Lab'. Lecture presented at the Audio Lab, University of Adelaide, 2nd March 2006.

[3] Robert Minard. 'Music Technology Forum Lecture - Schultz 1004'. Lecture presented in Schultz level 10, Electronic Music Unit, University of Adelaide, 2nd March 2006.