Wednesday, October 29, 2008

Week 9 Audio Arts - FM Synthesis

Audio Demo

Basic FM Synth
A simple FM synth based on the readings, using one carrier and one modulator. The modulator's amplitude is scaled so that at a depth of 1.0 the carrier's frequency sweeps between 0 and 2f, where f is the note being played.

One envelope controls the overall output, while another controls the modulation depth.
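The carrier/modulator pairing described above can be sketched in a few lines of Python. This is just a minimal illustration under my description, not the actual Bidule patch; the function name and buffer length are my own, and the envelopes are left out for brevity.

```python
import math

def render_fm(f, ratio, depth, sr=44100, n=441):
    """Render n samples of a one-carrier/one-modulator FM voice.

    depth scales the modulator so that at depth=1.0 the carrier's
    instantaneous frequency sweeps between 0 and 2*f, as described
    above. Phase is accumulated sample by sample so the frequency
    sweep stays continuous.
    """
    out = []
    phase = 0.0
    for i in range(n):
        t = i / sr
        mod = math.sin(2 * math.pi * f * ratio * t)
        inst_freq = f * (1.0 + depth * mod)   # 0..2f at depth 1.0
        phase += 2 * math.pi * inst_freq / sr
        out.append(math.sin(phase))
    return out
```

In the real patch, one envelope would scale the final output and another would drive `depth` over time.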

I found that this simple FM setup was actually very effective for recreating real-world sounds, particularly metallic percussion. I was so proud of my gamelan emulation (reverb helped) that I had to design some upbeat gamelan elevator music (see audio demo). I think we should install this in the new Schulz elevator when no one's looking.

Badass FM Synth
I tried to expand the concept by using five oscillators linked in a chain so that each oscillator modulates the next, which I suppose makes four modulators and one carrier. Each modulation stage has its own modulation-depth envelope. Audio can be tapped from the last three oscillators in the chain and mixed in stereo.
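A serial chain like this can be sketched as nested phase modulation, where each oscillator's output offsets the phase of the next. Again, this is only an illustrative sketch of the idea, not the patch itself; the function name and the choice of phase modulation (the usual digital FM formulation) are my own assumptions.

```python
import math

def fm_chain(t, freqs, indices):
    """Evaluate a serial FM (phase-modulation) chain at time t.

    freqs   -- oscillator frequencies, first entry is the top modulator
    indices -- modulation index applied at each of the len(freqs)-1 stages
    Returns every oscillator's output so the last few can be tapped
    and mixed in stereo, as in the patch described above.
    """
    outs = [math.sin(2 * math.pi * freqs[0] * t)]
    for f, idx in zip(freqs[1:], indices):
        outs.append(math.sin(2 * math.pi * f * t + idx * outs[-1]))
    return outs
```

With five frequencies and four indices this mirrors the four-modulator, one-carrier arrangement; in the patch each index would be driven by its own envelope.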

It did create some complex, noisy textures; however, I found it hard to use the extra oscillators to create anything that sounded particularly real-world.

Reference: Christian Haines. "FM Synthesis." Lecture presented at the Electronic Music Unit, University of Adelaide, 14 October 2008.

Tuesday, October 28, 2008

Week 10 Audio Arts - Additive Synthesis

Can I have the award for artistic patching? Impressionism vs. Bidule layouts...

Audio Demo


It's a MIDI-controlled additive synth. You get six oscillators, and you can choose the ratio of each oscillator's frequency to the MIDI note, its amplitude, and its wave type. A simple amplitude envelope controls the shape of the sound. It's polyphonic.
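The per-oscillator controls can be summarised in a small sketch: a MIDI note sets the fundamental, and each partial contributes at its own ratio, amplitude, and wave type. This is a minimal stand-in for the patch, with names of my own invention, and only two wave types where Bidule offers more.

```python
import math

def midi_to_hz(note):
    """Standard equal-tempered conversion, A4 (note 69) = 440 Hz."""
    return 440.0 * 2 ** ((note - 69) / 12)

def additive_sample(t, note, partials):
    """Sum six (or any number of) partials at time t.

    partials -- list of (ratio, amp, wave) tuples; 'wave' here is
    'sine' or 'square' as a stand-in for the wave-type choice.
    """
    f0 = midi_to_hz(note)
    s = 0.0
    for ratio, amp, wave in partials:
        x = math.sin(2 * math.pi * f0 * ratio * t)
        if wave == 'square':
            x = math.copysign(1.0, x)   # crude square from sine sign
        s += amp * x
    return s
```

The amplitude envelope would then scale `additive_sample`'s output over the course of each note.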

More Extreme Additive Synth (pictured)
There are 16 oscillators, each contained in a group. The oscillators are automatically mapped as harmonics in relation to the defined fundamental, however if you want a rougher tone, they can be scattered slightly using the "Freq Freakout Factor".

The amplitude of each oscillator is a fixed ratio of the previous oscillator's, creating an exponentially decreasing curve. For example, if the "amp taper factor" is 0.5, every harmonic is half the amplitude of the previous one.

The amplitude of odd and even harmonics can be boosted and cut.
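The taper and the odd/even boost together determine each harmonic's level, which can be sketched as below. A hypothetical sketch of the scheme, not the patch; the function and parameter names are mine.

```python
def harmonic_amps(n, taper, odd_gain=1.0, even_gain=1.0):
    """Amplitudes for n harmonics under the taper/boost scheme.

    Each harmonic is `taper` times the previous one (taper=0.5 halves
    every step), then odd and even harmonics are scaled separately,
    echoing the patch's boost/cut controls.
    """
    amps = []
    a = 1.0
    for k in range(1, n + 1):           # k is the harmonic number
        gain = odd_gain if k % 2 else even_gain
        amps.append(a * gain)
        a *= taper
    return amps
```

With `taper=0.5` the first four harmonics come out as 1.0, 0.5, 0.25, 0.125, matching the example above; setting `even_gain=0.0` would leave only odd harmonics for a hollower, square-like tone.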

Frequency, pan, and amplitude can be varied individually using random value generators for each oscillator, creating evolving textures.

It's all pretty processor intensive, since there are 64 oscillators in total running simultaneously (including the random modulators).

Reference: Christian Haines. "Additive Synthesis." Lecture presented at the Electronic Music Unit, University of Adelaide, 21 October 2008.

Saturday, October 25, 2008

Week 8 Audio Arts - Amplitude/Ring Modulation


Image (it won't insert properly)


I created an amplitude modulation patch and a ring modulation patch.

The amplitude modulation patch chains modulating oscillators in series, so each one processes the carrier in turn.
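A series AM chain like that can be sketched as repeated multiplication by unipolar modulators. This is only an illustration of the idea under my own assumptions (unipolar sine modulators, a single shared depth); the real patch's routing may differ.

```python
import math

def am_series(t, carrier_freq, mod_freqs, depth=1.0):
    """Amplitude-modulate a carrier by several modulators in series.

    Each stage multiplies the running signal by a unipolar (0..1)
    modulator, blended with unity gain by `depth`, the way a chain
    of gain stages would process the carrier one after the other.
    """
    sig = math.sin(2 * math.pi * carrier_freq * t)
    for fm in mod_freqs:
        mod = 0.5 * (1.0 + math.sin(2 * math.pi * fm * t))  # unipolar
        sig *= (1.0 - depth) + depth * mod
    return sig
```

Because the modulators here stay non-negative, this is classic AM (tremolo at low rates, sidebands at audio rates) rather than ring modulation, where the bipolar product suppresses the carrier.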

My second patch (pictured) uses three oscillators. In each "iteration", the three possible pairings of two signals are ring modulated, producing three new signals. The process is repeated eight times, though there is rarely much signal left by that point. An iteration selector fades between the eight stages, so you can move dynamically from slight modulation to grainy noise. The three outputs can be panned individually to create a rich, true stereo signal in which the channels are different but related.
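The iteration scheme can be sketched per sample: each stage replaces the three signals with their three pairwise products. A minimal sketch of the structure only, with my own naming, not the patch itself.

```python
def ring_mod_iterate(sigs, iterations):
    """Repeatedly ring-modulate three signal values pairwise.

    Each stage replaces (a, b, c) with (a*b, b*c, a*c) -- the three
    possible pairings -- and records the result, so an "iteration
    selector" could fade between the stages.
    """
    stages = []
    a, b, c = sigs
    for _ in range(iterations):
        a, b, c = a * b, b * c, a * c   # all three computed from old values
        stages.append((a, b, c))
    return stages
```

Since each product of sub-unity values is smaller than its inputs, the level collapses double-exponentially with each stage, which matches the observation that there is rarely much signal left after eight iterations.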

I wasn't very successful in creating organic sounds, however many types of rich machine sounds were easily accessible. An engine noise from this patch should be usable for the major project. Putting noise through the system created some interesting buffeting wind. The stereo output should be useful in a soundscape.

Reference: Christian Haines. "Amplitude and Ring Modulation." Lecture presented at the Electronic Music Unit, University of Adelaide, 7 October 2008.

Week 9 Creative Computing - Integrated Setup



Jamie and I created a piece by improvising using an integrated setup involving Bidule, Live, the Novation controller, an acoustic guitar, an SM57, and the Mackie mixer.

Within the computer, the Bidule patch was something I'd been experimenting with at home in preparation for the major project; this was the first time I'd used it with live audio. The string sound came from a pleasing sample we quickly loaded into Simpler in Live (with reverb and delay). Bidule was the ReWire host, and Live was simply rewired in to provide the string sound.

I played synth strings via the Novation and tweaked minor parameters of the Bidule patch during the performance while Jamie played guitar. I would have liked more control over Bidule, but that will require some further tweaking. I think using the software with live audio input, where we could experiment and interact with it, helped us get better results.

Obviously we used a lot of pre-existing patching in the setup, but I think we still created an interesting live performance by adding sound and interacting with the patch.

Reference: Christian Haines. "Integrated Setup I." Lecture presented at the Electronic Music Unit, University of Adelaide, 7 October 2008.

Thursday, October 16, 2008

Week 10 Music Tech Forum - Honours Presentations


Presentations from the honours students!

I really loved the animation projects of the first speaker. The animations were really pretty and I think the music fitted them perfectly. I loved the piano style and the vocals and I think I could listen to it on CD for enjoyment. A very comfortable contrast to what we normally hear in forum... but there is a place for that too.

The concept of the program that sonifies network data was totally awesome. I really like the way that it artistically represents the modern world where we're surrounded by machines that are automatically sending each other huge amounts of information before we even ask them to fetch us anything. It's a noisy society where we're bombarded with information from all angles, so why not convert it into music?

I guess what the program currently lacks on the musical front is a way to give the results some kind of macro form...

I was also wondering if the program can pull much meaning from the data values? For example, is it very different to modulating the values of the synth with a random oscillator? But I think these problems can be solved (don't ask me how). 

Reference: Stephen Whittington. "Week 10 Music Technology Forum - Honours Student Presentations." Lecture presented at the Electronic Music Unit, University of Adelaide, 16 October 2008.

Wednesday, October 15, 2008

Week 9 Forum - 3rd Year Presentations!

The third year presentations were all really impressive. I'll just describe the ones that come to mind first...

Probably my favourite listening experience was the surround SuperCollider piece. That's partly because it's not something I'm used to, but I reckon it's a worthwhile endeavour. I think the current consumer surround craze is a little stupid, because I'd much prefer decent stereo sound to poor surround with a lower-mids gap between the sub and satellite speakers...

But luckily the Blue Skys have no such problems... I loved being able to close my eyes and imagine I was in a jungle of crazy machines. I enjoy SuperCollider's glassy sound sometimes (during 3rd year presentations). I'd like to hear it used with other contrasting elements.

The piano chord generator really interested me. I understood the chord-choosing algorithm, but I didn't quite get how the rhythm generator works. I think it was a really impressive example of this kind of patch. The output sounded really quite musical, I think because of the clever chord-voicing system.

The melody warper was a rad idea that worked well, I think, though I was a bit distracted by all the tempo changes (which were a good idea in themselves). I think it would be awesome to record the MIDI and render it elsewhere...

Reference: Stephen Whittington. "Week 9 Music Technology Forum - 3rd Year Presentations." Lecture presented at the Electronic Music Unit, University of Adelaide, 9 October 2008.