Wednesday, September 24, 2008

Week 8 Forum - Eraserhead

I was fairly excited to watch Eraserhead because I keep hearing about Lynch and I don't think I've seen any of his other films. I really enjoyed it - I'd definitely prefer to see something totally different from another boring movie stamped out of the usual mould.

I have some favourite moments.... I love the way Lynch manages to ironically reserve dialogue for completely obvious, pointless remarks. "Oh... you are sick" was insanely funny in my opinion, as was "like regular chickens" and a few others. That explains why there is an Amon Tobin song of that name....

I like the way we do very different things in forum every week, and I think that watching a fairly experimental movie is a worthwhile endeavour when many of us are interested in film sound design and music. The constant rumbles and drones created a strong atmosphere and I thought they had a pleasantly solid tonal character. The contrast of the music and noise was very effective. 

I also think that all art forms influence each other in interesting ways and so it is a good thing to find interesting works in other disciplines.

Reference: David Harris. "Week 8 Music Technology Forum - Eraserhead." Lecture presented at the Electronic Music Unit, University of Adelaide, 18 September 2008.

Friday, September 12, 2008

Week 7 Audio Arts - Analog Synthesizers!

Above is a Roland SH-5, which I attempted to create some organic sounds with. It feels great to be playing with something that isn't a computer. The many knobs and sliders make it easy to manually add the modulation you need in realtime without bothering to assign things to an LFO or modwheel. I used the pitch lever often to control pitch or the filter in order to move it exactly as I needed.


I found it quite surprising that it isn't too hard at all to obtain a few reasonably organic sounds, though I ran out of inspiration quickly.

The wind was created by filtering white noise with the low-pass and band-pass filters, and I added in a slight whistle created by ring modulating the oscillators together.
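Out of curiosity, the same recipe translates into a few lines of DSP code. This is only a rough sketch of the idea, not the SH-5 patch itself: a one-pole low-pass with a slowly wobbling coefficient stands in for the synth's filter, and the ring mod really is just multiplying two sines together.

```python
import numpy as np

SR = 44100
DUR = 2.0
n = int(SR * DUR)

# White noise source, like the SH-5's noise oscillator
noise = np.random.default_rng(0).uniform(-1.0, 1.0, n)

# Slowly varying cutoff (an LFO standing in for manual filter sweeps)
lfo = 0.5 + 0.4 * np.sin(2 * np.pi * 0.3 * np.arange(n) / SR)

# One-pole low-pass with a time-varying coefficient:
#   y[i] = a[i] * x[i] + (1 - a[i]) * y[i-1]
a = 0.02 + 0.08 * lfo          # smaller coefficient = darker, windier tone
wind = np.empty(n)
y = 0.0
for i in range(n):
    y = a[i] * noise[i] + (1.0 - a[i]) * y
    wind[i] = y

# The "whistle": ring modulation is just multiplying two oscillators
t = np.arange(n) / SR
whistle = 0.05 * np.sin(2 * np.pi * 800 * t) * np.sin(2 * np.pi * 1150 * t)
mix = wind + whistle
```

Writing `mix` out to a WAV file and looping it gives a passable gust; the LFO rate and coefficient range are the knobs to play with.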

I think I did the bird noises by increasing the resonance of the filter till it self oscillated. I then set up a suitable envelope and swept the filter with the pitch stick while playing notes.
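Since a self-oscillating filter is essentially a sine oscillator sitting at the cutoff frequency, the bird noises can be sketched as swept sines with a short envelope. The frequencies and envelope time below are guesses for illustration, not measurements from my patch.

```python
import numpy as np

SR = 44100

def chirp(f_start, f_end, dur, sr=SR):
    """A self-oscillating filter behaves like a sine oscillator at the
    cutoff frequency, so sweeping the cutoff is just a frequency sweep."""
    n = int(sr * dur)
    t = np.arange(n) / sr
    freq = np.linspace(f_start, f_end, n)      # the "pitch stick" sweep
    phase = 2 * np.pi * np.cumsum(freq) / sr   # integrate frequency -> phase
    env = np.exp(-8.0 * t)                     # short percussive envelope
    return env * np.sin(phase)

# Two quick upward chirps in a row sound vaguely bird-like
bird = np.concatenate([chirp(1800, 3200, 0.12), chirp(2000, 2600, 0.08)])
```

Randomising the start/end frequencies and durations a little per chirp makes the "bird" noticeably less mechanical.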

Other sounds are fairly obvious! The white noise oscillator makes it all possible because otherwise we could only do pitched sounds.

Reference: Christian Haines. "Week 7 Audio Arts - Analog Synthesizers." Lecture presented at the Electronic Music Unit, University of Adelaide, 6 September 2008.


Week 8 Creative Computing - Ableton Live 2 - Let's Do the Time Warp Again Please


The Audio File

Here's my project for this week - it uses a bunch of samples from a recording of a school concert, a musical, and a rock band.

The warp markers were extremely useful to me. This interface for matching loops is so intuitive and accurate and makes it easy to work with looser beats that may fluctuate in tempo. It is so quick it would be feasible to do minor editing with this feature while playing live. I think the performance aspect of this software can be good when working with samples, because often really interesting combinations are found by trial and error which is easy in a performance environment.

I find myself losing touch with the macro form of the piece when sequencing live; however, this is easily tidied up in the arrangement view. In creating this 45-second piece, I actually created a 2:20 groove, then raised the tempo and judiciously used "delete time" to get rid of the less interesting parts.

I had to just go for it and work quickly knowing that I couldn't save and improve the song incrementally over many sessions. This approach forced me to be more productive, creative and heavy-handed.

Reference: Christian Haines. "Week 7 Creative Computing - Ableton Live." Lecture presented at the Electronic Music Unit, University of Adelaide, 6 September 2008.

Week 7 Creative Computing


I wasn't sure whether it needed to be 45 seconds or not?

Groove 2 MP3 (cut to 45 seconds)


I practiced a little at getting all the samples loaded really quickly and then switching scenes at the right moment. I think that the realtime nature of this software makes it an instrument that requires practice to really be able to improvise quickly. I wasn't able to do very much at all in 45 seconds; however, with a little longer I was able to explore more, changing the loop length in order to switch time signatures and create polyrhythms.

There are some quite good live DJ-type effects, such as the "grain delay", which I used on Trent's vocal. The XY controls are obviously geared towards performance.

I think the key/midi control options would really improve the agility of the performance if the "player" is skilled, however some setting up of the project may be required to make use of them.

The immediate nature of this performance sequencer and its restriction on saving meant that in some grooves I had happy accidents and discovered a new technique, effect, or sample combination while improvising. This is an example of the way in which restrictions may be conducive to creativity, which is then sped up by the focus on live performance.

Reference: Christian Haines. "Week 7 Creative Computing - Ableton Live." Lecture presented at the Electronic Music Unit, University of Adelaide, 6 September 2008.

Thursday, September 11, 2008

Week 7 Forum - Second Year Presentations

SuperCollider

The second years and some third years showed us what they have been up to. This consisted mainly of Max and SuperCollider patches.

I enjoyed it all! Sanad's dance piece was pretty extensive for something he made last night.... He suggested that it is quite different to the dance music that people like, but I'm not so sure.... Other than the 6/8 time signature, I don't think it's that different from what I've heard. I think this is probably because I don't understand the intricacies of dance music at all.

Edward's visual patch was way cool - very impressive and I liked that he could tweak parameters and get really different patterns.

The chord progression generator seems like a very useful device, even if it is just there for inspiration. The grid probability approach seems quite intuitive.

Freddie's "game" idea for his Max patch was really hilarious - I love it. I never would have thought about turning algorithmic composition into a game! Get your kids into Max/MSP this way......

I thought the SuperCollider stuff was interesting, but it looks like heaps of work to get decent organic textures. That said, John Delaney did an amazing job of getting beautiful sounds from the code, and Matt's Fatboy Slim sample was absurdly rad.

Reference: Stephen Whittington. "Week 7 Music Technology Forum - 2nd and 3rd year presentations." Lecture presented at the Electronic Music Unit, University of Adelaide, 11 September 2008.

Week 6 Creative Computing - More Logic Skills



I found it rather difficult to apply many of the new skills taught in the tutorial to a song that had already been completed. In order to try to demonstrate the techniques and make a noticeable change to the song, I've made some adjustments.....

Added the enveloper to the drums. I've overdone it to the point where there is too much attack, but at least you can tell it is there. I found this plugin more useful on my own recordings.

Humanized the drums. The timing and velocity are just slightly randomized, and I think it makes the looping a little less obvious.
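For anyone curious, humanizing boils down to something like this little sketch. This is my guess at the idea, not Logic's actual algorithm: nudge each note's start time and velocity by a small random amount.

```python
import random

def humanize(notes, time_jitter=0.01, vel_jitter=8, seed=42):
    """Nudge each note's start time (seconds) and velocity (1-127) by a
    small random amount, roughly what a 'Humanize' transform does.
    notes: list of dicts with 'time' and 'velocity' keys."""
    rng = random.Random(seed)
    out = []
    for note in notes:
        t = max(0.0, note["time"] + rng.uniform(-time_jitter, time_jitter))
        v = min(127, max(1, note["velocity"] + rng.randint(-vel_jitter, vel_jitter)))
        out.append({**note, "time": t, "velocity": v})
    return out

# A perfectly rigid 16th-note loop, then its humanized version
loop = [{"time": i * 0.25, "velocity": 96} for i in range(8)]
wobbly = humanize(loop)
```

The jitter amounts matter a lot: a few milliseconds loosens the feel, tens of milliseconds just sounds sloppy.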

I added reverb to the instruments so they sound a little more dub. The drums in particular have a very fake cavernous effect (I chose this deliberately in Space Designer).

Redid the arrangement a little.

Last minor touch.... REPLAYED THE MELODY THROUGH AN ARPEGGIATOR! which I lovingly configured in the environment as per the screenshot above. One note is converted to a chord via transformers and then tastefully arpeggiated [to hell and back] to develop the melody and add a minimalist flavour.
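The transformer-plus-arpeggiator chain reduces to two tiny functions. This is just an illustration of the idea, not the actual environment objects; the major-triad intervals and "up" pattern are my own example choices.

```python
def note_to_chord(root, intervals=(0, 4, 7)):
    """Mimic the transformer step: expand one note into a chord.
    Intervals are semitones above the root (a major triad here)."""
    return [root + i for i in intervals]

def arpeggiate(chord, steps=8, pattern="up"):
    """Mimic the arpeggiator step: cycle through the chord tones
    for a fixed number of steps."""
    order = chord if pattern == "up" else list(reversed(chord))
    return [order[i % len(order)] for i in range(steps)]

# One C (MIDI note 60) becomes a cycling C major arpeggio
line = arpeggiate(note_to_chord(60))
```

Feeding every melody note through this pair is what turns a single line into the minimalist texture described above.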

Reference: Christian Haines. "Week 6 Creative Computing - Logic Skills 3." Lecture presented at the Electronic Music Unit, University of Adelaide, 2 September 2008.

Week 6 Audio Arts - Interaction Design - Age of Mythology


The menu includes options for which ancient god you choose to base your civilization around, what kind of enemy you will face, and the layout of the map the game is set in. Actual gameplay involves controlling the numerous buildings and personnel of your empire. The game was created by Ensemble Studios and is marketed primarily to children.


Sampling technology is used to trigger sounds as the user interacts with the controls.

Some of the sound effects are directly related to the user's actions while others are designed to alert the player to issues such as an enemy attack. Glyphs are used to identify different units as they are selected.

In the menu screen, sounds are used to indicate mouse-over as well as various selections. A paper rustle indicates scrolling through the available gods, which applies the ancient, mystical aesthetic of the game to a practical interface sound. Other sounds, such as the divine thunder for final selections, follow a similar theme. This is an example of form meeting function.
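Underneath, this kind of interaction design is really just a lookup from interface events to themed sound cues, something like this toy sketch (the event and cue names here are made up for illustration, not taken from the actual game):

```python
# A toy event-to-cue map in the spirit of the menu sounds described above.
MENU_CUES = {
    "hover": "ui_hover_tick",
    "scroll_gods": "paper_rustle",   # scrolling through the god list
    "confirm": "divine_thunder",     # final selection
    "cancel": "stone_thud",
}

def cue_for(event):
    """Return the sound cue for a UI event, with a generic fallback."""
    return MENU_CUES.get(event, "ui_click")
```

Keeping the mapping in one table is what lets the whole interface share a consistent aesthetic: swap the table and the same events take on a different theme.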

The interaction design is similar to other types of sound design in that it uses sounds in a symbolic way to convey information while fitting with the aesthetic of the product.


Reference: Christian Haines. "Week 6 Audio Arts - Interaction design." Lecture presented at the Electronic Music Unit, University of Adelaide, 3 September 2008.

Thursday, September 4, 2008

Week 6 Forum - AUDIO RASA CHARADE FUN GAME PERSIAN BABY

In today's exercise we attempted to recognize sonically represented emotions. Some radically different approaches were used in portraying the "rasa" such as:
  • Sampling different styles of commercial music
  • Downloading the pure, instinctive cries of yet-to-be-conditioned babies from YouTube
  • Traditional western art music conventions
  • Moody synthesized textures
  • Drawing on the cliches of mainstream cinema
  • Speaking about emotional subjects in a foreign language
Some techniques definitely seemed more effective than others. While I thought it was really cool, we didn't seem to be very good at understanding Sanad's Persian, even with strong inflections. People who drew strongly on obvious cliches and used a range of different forms (e.g. acoustic instrument, synth, nature sound) seemed to be most successful.

I think it is very interesting that many of the baby cries were quite understandable, and Stephen suggested that there may be some aspects of language and aural association that are fundamentally built into our physiology rather than learnt. Thanks to Freddy for this interesting point.

I was particularly impressed by the guy that used a lot of synthesized/processed sounds. I think he did a great job at portraying emotion through sonic texture (which is what we should be focusing on in our course) and without using obvious cliches.

Reference: Stephen Whittington. "Week 6 Music Technology Forum - Emo Music." Lecture presented at the Electronic Music Unit, University of Adelaide, 5 September 2008.