Monday, May 29, 2006
Week 11
In Audio Arts this week we recorded Jodie O’Regan, a classical singer in Certificate IV, performing vocals. Luke, Will and I used Studio 2 and the Dead Room. The microphone we used was a Neumann U87, a very high-quality condenser microphone. We were all surprised by how sensitive it was; even adjusting the curtains in the dead room came through the microphone. I was at the computer setting up the tracks in Pro Tools, and we used a compressor and a de-esser. Will and Luke set up the microphone stand with the microphone in the cradle. While Jodie was singing, Luke suggested that we record the audio on an audio track and then route that track through an auxiliary track with the compressor and de-esser on the auxiliary. This is the process we ended up using, and I adjusted the compressor to control the dynamics of the vocals. The only problem with this method is that the recording itself was left uncompressed. (http://www.blogupload.com/69631/jodie_singing_for_blog.m4a) [1] If I were recording vocals myself, I would compress them to tape. By doing this, the recording makes better use of the available dynamic range and the fidelity of the audio is better too.
[2] This is because when you compress already-recorded audio and apply make-up gain, you raise the noise floor along with the quiet parts of the signal; if you compress on the way to tape you don’t have this problem to the same degree. That said, I still think our session sounded good, and the compressed signal sounds fine to me. [3]
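To illustrate that point about compression and the noise floor, here is a minimal sketch of a downward compressor’s static gain curve. The threshold, ratio and make-up values are arbitrary examples I have picked for the illustration, not the settings we used in the session.

```python
def compress_db(level_db, threshold_db=-18.0, ratio=4.0, makeup_db=9.0):
    """Static curve of a downward compressor: levels above the threshold
    are reduced by the ratio, then make-up gain lifts everything equally."""
    over = max(level_db - threshold_db, 0.0)       # amount above the threshold
    gain_reduction = over - over / ratio           # dB removed above the threshold
    return level_db - gain_reduction + makeup_db   # output level in dBFS

# A loud vocal peak, a quiet phrase, and the room/preamp noise floor (example levels):
for name, level in [("peak", -6.0), ("quiet phrase", -30.0), ("noise floor", -60.0)]:
    print(f"{name:12s} {level:6.1f} dB in -> {compress_db(level):6.1f} dB out")
```

With these example numbers the peak stays at -6 dB, while the quiet phrase and the noise floor both come up by the make-up gain (from -60 dB to -51 dB for the noise), which is exactly the noise-floor trade-off described above.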
In Creative Computing we spent more time on Pro Tools. The lecture this week was primarily on organizational practices in Pro Tools. We spent time looking at markers, bouncing, and saving your session as a self-contained copy that can be moved to other workstations. I didn’t know that when you save markers you can also save zoom settings and locate points. I have to say, this is a really great function of Pro Tools and I only wish I had known about it earlier.
In Forum this week we only had a session in the second hour. Stephen Whittington talked to us about his experience in the Music Technology industry.
[4] It was interesting to hear that he has performed with Warren Burt (a previous speaker in Forum). I found it hard to understand some of the topics Stephen was talking about, but this is probably due to my lack of experience. I found it interesting when Stephen talked about the vocoder and how he used it in his own creative work.
[1] Jodie O’Regan. "Audio Arts Lecture - Studio 2, Dead Room". Recording session at the Electronic Music Unit, University of Adelaide, 23rd May 2006.
[2] Microphones. 'Neumann U87'. http://homepage2.nifty.com/miguel/images/u87.jpg (Accessed 29/05/2006)
[3] Peter Sansom. "Audio Arts Lecture - Studio 2, Dead Room". Lecture presented at the Electronic Music Unit, University of Adelaide, 23rd May 2006.
[4] Stephen Whittington. ‘Elder School of Music', Staff. http://www.music.adelaide.edu.au/images/stephen_whittington.gif (Accessed 29/05/2006)
Monday, May 22, 2006
Week 10
This week in Audio Arts our focus was on vocal recording. Unfortunately we did not get to do any actual recording, as our vocalist could not make it. Maybe I should have gone into the dead room and belted out some tunes myself. We went through all the theory related to recording vocals. I have recorded vocals before, but my vocalist was a professional (a jazz singer in the Elder Conservatorium) and so it was actually quite easy. Also, I only recorded about 20 seconds. In my experience, the ease of a recording session is highly dependent on the vocalist’s skills. If they know how to control their voice well and are experienced with microphones, then it should not be too hard. The difficulty arises when you are recording inexperienced musicians in general. [1]
There are lots of other factors, which we also discussed. These included spill, the proximity effect, the distance from the mic and the performer’s experience. We talked about recording vocals with the band compared to recording overdubs, and also the types of microphones used. Usually a condenser mic would be used to record vocals, but I suppose you could experiment with any microphone to get different sounds. I don’t know if I will have the time to experiment for my assignments, but I wouldn’t mind trying out a range of microphones to compare the different sounds at some point. [2]
We didn’t have a class for Creative Computing this week. The notes for this week were on Pro Tools. The main focus was on effects (inserts & sends), automation, regions and selections, mixing and mixdown. I think Pro Tools has a really user-friendly interface when it comes to applying these concepts. The automation is really easy to use, and the different drawing tools allow for some creative output (like the triangle, square and random line shapes). The only problem is when you are using a plug-in with a large number of controls: I find it hard to locate the exact parameter I want to automate. [3]
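As a rough illustration of those pencil-tool shapes, here is a small sketch that generates triangle, square and random automation breakpoints. The point count, period and value range are made-up examples, and this is only an approximation of the idea, not how Pro Tools actually draws them.

```python
import random

def pencil_shape(shape, n_points=16, period=8, low=0.0, high=1.0):
    """Generate automation breakpoints in the spirit of a pencil tool's
    triangle / square / random modes (values scaled between low and high)."""
    values = []
    for i in range(n_points):
        phase = (i % period) / period
        if shape == "triangle":
            wave = 1.0 - abs(2.0 * phase - 1.0)     # ramp up then down each period
        elif shape == "square":
            wave = 1.0 if phase < 0.5 else 0.0      # alternate high/low each half period
        elif shape == "random":
            wave = random.random()                  # a new random value per breakpoint
        else:
            raise ValueError(f"unknown shape: {shape}")
        values.append(low + wave * (high - low))
    return values

print([round(v, 2) for v in pencil_shape("triangle", n_points=8, period=8)])
```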
In Forum this week we listened to three pieces, by Mr. Bungle, Stockhausen and My Bloody Valentine. The Mr. Bungle piece (1991) had two parts: 1. Love Is a Fist and 2. Dead Goon. Both of these songs had strange, abrupt mood changes. Each would build up to a point and then a sudden change would put the listener into another frame of mind. It was good in a way, but it wouldn’t be good to listen to if you wanted to relax.
The piece by Stockhausen was called Hymnen. It was made up of radio sounds combined with vocal samples. [4] This piece could be considered musique concrète. I didn’t like it much at all.
The piece by My Bloody Valentine was called To Here Knows When. I liked this piece because it had a nice melodic sound to it. The vocals were buried in the mix and way too soft, but it was so obvious that I assume the band did it on purpose. Towards the end there were some nice pad chord sounds. [5]
Our guest speaker this week came to talk to us about copyright. The session was quite interesting and answered some questions I had going in. [6]
[1] Hosa Tech, ‘Microphones’. http://www.hosatech.com/hosa/images/image_cmk_in_microphone.gif (Accessed 22/05/2006)
[2] Christian Haines. "Audio Arts Lecture - EMU Space". Lecture presented at the Electronic Music Unit, University of Adelaide, 16th May 2006.
[3] Digidesign. 2005. “Pro Tools Reference Guide”. Daly City CA
[4] ‘Karlheinz Stockhausen’ Wikipedia. http://en.wikipedia.org/wiki/Karlheinz_Stockhausen (Accessed 22/05/2006)
[5] David Harris. "Music Technology Forum Lecture - EMU Space". Lecture presented in the EMU Space, Electronic Music Unit, University of Adelaide, 18th May 2006.
[6] Robert Chalmers. 'Music Technology Forum Lecture on “Copyright” - EMU Space'. Lecture presented in the EMU Space, Electronic Music Unit, University of Adelaide, 18th May 2006.
Monday, May 15, 2006
Week 9
This week in Audio Arts we recorded electric guitar instead of acoustic. The main difference is that instead of placing the mic in front of the sound box of an acoustic guitar, the mic is placed in front of the speaker of the amplifier. We looked at the “proximity effect”, which is the way a directional microphone’s response changes at varying distances from the sound source: a pressure-gradient (cardioid-type) mic boosts the low frequencies when it is very close to the source, so as the microphone is pulled further away the low end drops off and the recorded signal sounds brighter and thinner.
We also looked at moving the microphone on a different axis relative to the sound source. This gave us several different takes, which we compared in class. [1]
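To put a rough number on the proximity effect described above, here is a small sketch using the standard first-order approximation for the near-field response of a pressure-gradient element. The distances and frequencies are arbitrary examples, and a real cardioid mic will not match these figures exactly.

```python
import math

SPEED_OF_SOUND = 343.0  # speed of sound in air, m/s

def proximity_boost_db(freq_hz, distance_m):
    """Approximate near-field bass boost of a pressure-gradient element:
    the gradient term picks up an extra 1/(k*r) component that grows as
    the source gets closer and the frequency gets lower."""
    kr = 2.0 * math.pi * freq_hz * distance_m / SPEED_OF_SOUND
    return 20.0 * math.log10(math.sqrt(1.0 + 1.0 / kr ** 2))

for distance in (0.05, 0.3, 1.0):            # 5 cm, 30 cm and 1 m from the source
    row = ", ".join(f"{f} Hz: {proximity_boost_db(f, distance):5.1f} dB"
                    for f in (100, 1000, 10000))
    print(f"{distance * 100:4.0f} cm -> {row}")
```

Under these assumptions, at 5 cm the 100 Hz region comes up by roughly 20 dB, while at a metre the boost is only about 1 dB, which is why the sound thins out and seems brighter as the mic is pulled back.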
Creative Computing was on Pro Tools again. We spent time on groups in both the Edit and Mix windows. In my previous experience I haven’t used groups much at all. I tend to put my drums in a group, but purely for organizational purposes; I don’t usually group tracks for automation. In the past I tried this and ended up stuffing up my mixes, because I would increase the volume of, say, a snare drum and end up pulling up other tracks that I didn’t intend to. However, I can definitely see how it would be useful, and perhaps in the future I may try this method of mixing. Obviously, if you need to increase all your drum sounds relative to the rest of the mix, grouping would be a great option. [2]
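Here is a tiny sketch of the idea behind grouped fader moves, with made-up track names and levels: every member of the group gets the same dB trim, so the internal balance of the kit is preserved while the whole group moves against the rest of the mix.

```python
# Per-track fader levels in dB for a drum group (example values only).
drum_faders_db = {"kick": -6.0, "snare": -4.5, "hat": -12.0, "overheads": -8.0}

def apply_group_trim(faders_db, trim_db):
    """Shift every member of the group by the same dB amount,
    keeping the balance between the tracks unchanged."""
    return {name: level + trim_db for name, level in faders_db.items()}

# Push the whole kit up 2 dB against the rest of the mix in one move:
print(apply_group_trim(drum_faders_db, +2.0))
```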
In Forum, David Harris played us music by Christian Marclay and Pink Floyd. Christian Marclay used eight turntables and mixed up various recordings. It blew my mind how well he was able to merge from one piece into the next. Being a DJ myself, I could really relate to how he cued and mixed into the next record, and I could also hear how he used scratching techniques to stop records that were playing. The two sets of mixes were made from music by Johann Strauss and then Jimi Hendrix.
The next song we listened to was “Shine On You Crazy Diamond” by Pink Floyd. This was a great song, with really nice-sounding guitars and awesome chord progressions. The vocals were typical of a great band. [3] I really liked the saxophone and how its timing changed with the rest of the song following it. The later parts had keyboards and horn solos, which sounded great too.
The next two Christian Marclay pieces were good, but not as good as the previous ones. The artists used were John Cage and Maria Callas. The John Cage “mix” sounded typical of Cage, but there was a beat in the background that really added something to the piece. I think it would have sounded really good if Marclay had picked samples from Cage’s work and repeated them in a set layout with the beat, to create a structured intro, main section and ending. I still thought the piece sounded okay even though it didn’t have this.
The mixture of Maria Callas recordings was torture to my ears. The high pitches sustained and resonated through my head; I felt like I was going deaf and couldn’t wait for it to end. [4]
In the second hour we had two Honours student speakers, Seb Tomczak and Darren Curtis. Seb talked about a MIDI interface he has created. I didn’t really understand much of it, which is probably due to my lack of experience, but I still found it interesting. [5]
I found Darren’s presentation very interesting. I had heard of “sound healing” before this talk, but I really enjoyed learning about it in more detail. It is a field of study I would consider researching in the future, as it would allow me to use my studies to help people. [6]
Overall the music we listened to and the talks really interested me this week. This has been the best forum this year so far.
[1] Christian Haines. "Audio Arts Lecture - Studio 2, Dead Room, EMU Space". Lecture presented at the Electronic Music Unit, University of Adelaide, 9th May 2006.
[2] Christian Haines. "Creative Computing Lecture - Audio Lab". Lecture presented at the Audio Lab, University of Adelaide, 11th May 2006.
[3] ‘Pink Floyd’, Wikipedia. http://upload.wikimedia.org/wikipedia/en/e/ea/Pink_Floyd_1968.jpg (Accessed 15/05/2006)
[4] David Harris. "Music Technology Forum Lecture - EMU Space". Lecture presented in the EMU Space, Electronic Music Unit, University of Adelaide, 11th May 2006.
[5] Seb Tomczak. 'Music Technology Forum Presentation - EMU Space'. Lecture presented in the EMU Space, Electronic Music Unit, University of Adelaide, 11th May 2006.
[6] Darren Curtis. 'Music Technology Forum Presentation - EMU Space'. Lecture presented in the EMU Space, Electronic Music Unit, University of Adelaide, 11th May 2006.
Monday, May 08, 2006
Week 8
In Audio Arts this week we spent most of our time learning about microphones and microphone techniques. [1] A lot of what we went through was material I had already covered in my SAE course, but it was good to hear it from a different perspective. We recorded an acoustic guitar in the dead room, through to Studio 2 and into Pro Tools. The placement of the mic made a big difference to the recorded sound. I did feel the strings of the guitar were quite old, as the sound was really dull and lacked warmth. I look forward to when we look at psychoacoustics in relation to microphone techniques. [2]
In Creative Computing we spent more time on Pro Tools. Again, this wasn’t anything new from last week, as I am familiar with Pro Tools. It would be really good if you could save Pro Tools session files as different versions to make compatibility easier. I’m not sure if Pro Tools does this, I will have to check, but it would be good if you could save a Version 7 session file as a Version 6.4 session so that we could swap between the Audio Lab and Studios 1 & 2, for example. We looked at shortcuts, solo, mute and polyphony. We also looked at automation, both drawing it in non-real time and writing data in real time. The automation modes, and automation in general, are something I have used a lot. I think that drawing automation is really the only efficient way to write automation data unless you are using a control surface. [3] One new feature I thought was really useful was [apple] + [option] + click, which sets the inputs for tracks 1-16. This would have saved me a lot of time in the past and I will definitely use that shortcut now. We finished briefly on plug-ins as inserts. Pro Tools has some useful plug-ins, but in my experience the Waves plug-ins are the best for Pro Tools, and Logic Pro’s plug-ins are years ahead of Pro Tools’ as well. [4]
In Forum we listened to some good music for a change: Led Zeppelin’s ‘Whole Lotta Love’, Pink Floyd’s ‘Bike’, Pink Floyd’s ‘Breathe’, and Frank Zappa’s ‘Mr. Green Genes’. The last piece was ‘Voile d’Orphée’. This one was weird, but I actually didn’t mind it. One question that came to mind was: why did I like these songs more? Some still sounded weird and different, and I wonder if my opinion changed purely because I had heard of the bands and appreciated them prior to the lecture. [5]
In the second hour of Forum we had two Honours students, Tim Swalling and Jasmine Ward, speak to us about the projects they are undertaking for their theses. To be perfectly honest, I didn’t really understand either of the projects.
I don’t know how their objectives related to music technology at all. Tim’s presentation was on artificial life in relation to the creation of music. The talk was very technical and beyond my understanding of what music is meant to be. He talked about biological systems, genetic algorithms, cellular automata, and how to create an environment in which sound creation generates organisms that occupy the space. I had no idea. [6]
Jasmine’s talk was about industrial waste in Adelaide. This was also confusing to me, because I didn’t see how it related to music either. I thought her objectives and views on the problem were interesting and valid, but I was lost as to how it could be a topic for music technology. [7]
[1] ‘Microphone’, Wikipedia. http://upload.wikimedia.org/wikipedia/commons/thumb/4/4f/Oktava319.jpg/128px-Oktava319.jpg (Accessed 08/05/2006)
[2] Christian Haines. "Audio Arts Lecture - Studio 2, Dead Room, EMU Space". Lecture presented at the Electronic Music Unit, University of Adelaide, 2nd May 2006.
[3] ‘D-Verb’, Digidesign. http://akmedia.digidesign.com/products/images/prd_enl_1039_8365.jpg (Accessed 08/05/2006).
[4] Christian Haines. "Creative Computing Lecture - Audio Lab". Lecture presented at the Audio Lab, University of Adelaide, 4th May 2006.
[5] David Harris. "Music Technology Forum Lecture - EMU Space". Lecture presented in the EMU Space, Electronic Music Unit, University of Adelaide, 4th May 2006.
[6] Tim Swalling. 'Music Technology Forum Lecture - EMU Space'. Lecture presented in the EMU Space, Electronic Music Unit, University of Adelaide, 4th May 2006.
[7] Jasmine Ward. 'Music Technology Forum Lecture - EMU Space'. Lecture presented in the EMU Space, Electronic Music Unit, University of Adelaide, 4th May 2006.
Monday, May 01, 2006
Week 7
This week Tuesday was a public holiday, so we didn’t have a class for Audio Arts. In Creative Computing we started learning about Pro Tools. [1] I have used Pro Tools a lot, so learning the software wasn’t anything new. But I did find it helpful how Christian explained how a sequencer works from the simplest level up. I wasn’t sure about the difference between a channel and a track, but this lecture cleared that up for me. The organizational formats of Pro Tools sessions were also helpful to go through. [2]
In Forum we spent time listening to Iannis Xenakis, Gabriele Manca and Philip Glass. Iannis Xenakis was a Greek composer with a mathematics background. His piece was created using mathematical graphs and to me sounded like lots of buzzing. [3] It was called “Voyage to Andromeda” (1989). I didn’t consider this piece music, and because it went for so long (15 minutes) it got quite annoying. There were dynamics in the piece, with some arpeggiations in parts and also some ambient sections. Overall, this piece was just buzzing that went on for too long.
The next piece was called “In Flagranti” by Gabriele Manca (1990). This piece was composed and written for guitar. The performer used all parts of the guitar, such as:
Slide guitar
Sliding up/down
Hard strumming
Slapping on the fret board
Deliberate string squeaks
Hammering up on the fret board
Some open strings
General hammer on/pull offs
This piece went for 9 minutes, which wasn’t too bad. The song sounded strange but it was interesting to hear different parts of the guitar played and to try and work out what the performer was actually doing. I have some guitar background so I could pick most of the techniques. It was interesting when David Harris stressed the fact that everything was planned out on a score and although the piece sounded a bit random and improvised it was definitely written exactly that way. I thought this piece was ok to listen to but it still didn’t have any melodic qualities.
The final piece we listened to was by Philip Glass, called “Rubric” (1984). When I heard this piece, the first thought that came to my mind was “Finally, we’re listening to some music.” [4] This piece had many instruments: flutes, bass, strings, marimba and brass. The notes were quite fast and the piece had structure. It had a running feeling to it and was definitely musical. Before I found out what year this piece was made, my guess was late ’80s to early ’90s, so I was close.
In the second hour of Forum we had a presentation by an Honours student, Seb Tomczak. He showed us a project he has been working on called “Milkcrate”. Milkcrate is essentially the creation of music under a specific set of rules. It was something different and original, but it isn’t really something I would be interested in doing. Overall there have been six Milkcrates, and each has a different sound. [5]
In each Milkcrate, the participants are inside a room and have to make sure at least one person is composing music at any one time. The tools used to make the sounds must fit inside a matchbox. This led to some interesting creations, and it was surprising to hear how paperclips, rubber bands and the like could produce decent sounds once processed.
One point Stephen Whittington brought up, which I thought was valid, was: “Who owns the copyright or intellectual property?” Seb organized the works and made up the rules, but he didn’t actually create any of the music. This is an interesting question, and I really don’t know the answer to it.
[1] Pro Tools LE. 'Pro Tools LE Software', Digidesign. http://www.digidesign.com/products/sw/images/PTLE7.jpg (Accessed 1/05/2006)
[2] Christian Haines. "Creative Computing Lecture - Audio Lab". Lecture presented at the Audio Lab, University of Adelaide, 27th April 2006.
[3] 'Iannis Xenakis' Wikipedia. http://upload.wikimedia.org/wikipedia/en/a/ab/Xenakis2.jpg (Accessed 1/05/2006)
[4] 'Philip Glass'. Dunvagen Music Publishing Inc. http://www.philipglass.com/images/biographies/pg5.jpg (Accessed 1/05/2006)
[5] 'Milkcrate'. EMU Electronic Music Unit. http://www.milkcrate.com.au/images/sessions/mc_6_big.jpg (Accessed 1/05/2006)