Tag Archives: AudioKit

Audio Visualisation for iOS with AudioKit & ParticleLab


via Audio Visualisation for iOS with AudioKit & ParticleLab.

Following my recent post about ParticleLab, my high-performance, GPU-based Swift particle system component, I’ve spent a happy few days working with Aurelius Prochazka from AudioKit creating an iOS app for visualising audio with particles.

The video above demonstrates how the particles react to birdsong, and the effects are, in my humble opinion, quite striking. The visuals are reminiscent of a cloud chamber, with loops and spirals exploding and rapidly decelerating. The system in this video contains 4,000,000 particles and runs at over 40fps on my iPad Air 2.

The link between AudioKit and ParticleLab is AKAudioAnalyzer: this class exposes two properties, trackedAmplitude and trackedFrequency, and I use their values to control the position, mass and spin of ParticleLab’s four gravity wells. The code for doing this may look a little arcane, but let me step through it to demystify it.
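
In outline, the mapping looks something like this. This is a sketch rather than the exact code from the post: the setGravityWellProperties(...) signature, the .value accessors and the scaling constants are assumptions, and the real app drives all four wells rather than just one.

    func updateGravityWells()
    {
        // Sample the analyser each frame; trackedAmplitude is a rough 0...1 loudness
        // and trackedFrequency is the dominant pitch in Hz (accessors assumed).
        let amplitude = Float(audioAnalyzer.trackedAmplitude.value)
        let frequency = Float(audioAnalyzer.trackedFrequency.value)

        // Louder passages give the well more mass (a stronger pull on the particles);
        // higher pitches make it spin faster. The multipliers are purely illustrative.
        particleLab.setGravityWellProperties(gravityWell: .One,
            normalisedPositionX: 0.5,
            normalisedPositionY: 0.5,
            mass: amplitude * 20,
            spin: frequency * 0.01)
    }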

Radiating Visual Pulses: Visualising AudioKit Sounds in SpriteKit with SKAction


via Radiating Visual Pulses: Visualising AudioKit Sounds in SpriteKit with SKAction.

As I continue to experiment with my AudioKit/SpriteKit physics-based audio generation app, I wanted a way to provide visual feedback on collisions.
The effect I wanted to recreate was an animated pulse radiating from the boxes: a cloned box that grows and fades out with each collision.
I’ve implemented this effect in my TouchEnabledShapeNode. It exposes a displayCollision() method that accepts a single color argument.
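
In essence, the pulse is just a cloned outline that runs a grouped scale-and-fade action before removing itself. The sketch below shows the idea; the node setup, durations and scale factor are illustrative rather than the exact values from TouchEnabledShapeNode.

    func displayCollision(color: UIColor)
    {
        // Clone the box's outline so the pulse starts exactly where the collision happened.
        let pulse = SKShapeNode(rectOf: frame.size)
        pulse.strokeColor = color
        pulse.fillColor = .clear
        addChild(pulse)

        // Grow and fade simultaneously, then remove the clone so it doesn't accumulate.
        let grow = SKAction.scale(to: 3, duration: 0.5)
        let fade = SKAction.fadeOut(withDuration: 0.5)
        pulse.run(SKAction.sequence([SKAction.group([grow, fade]),
                                     SKAction.removeFromParent()]))
    }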

Spritely: Physics Based Computer Generated Music With AudioKit and SpriteKit


via FlexMonkey/Spritely · GitHub.

Using SpriteKit events to trigger AudioKit sounds

The blog post describing this project is here: http://flexmonkey.blogspot.co.uk/2015/03/physics-based-computer-generated-music.html
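
The core of the idea is acting as the physics world’s contact delegate and turning each contact into a note. Here is a minimal sketch of that wiring, not Spritely’s actual code; the NotePlayer protocol stands in for whichever AudioKit instrument the project uses, and the pitch and loudness mappings are invented for illustration.

    import SpriteKit

    // Stand-in for the AudioKit instrument that actually makes the sound; the real
    // class and play method aren't shown here, so this protocol is purely illustrative.
    protocol NotePlayer
    {
        func playNote(frequency: Double, amplitude: Double)
    }

    class SpritelyScene: SKScene, SKPhysicsContactDelegate
    {
        var instrument: NotePlayer?

        override func didMove(to view: SKView)
        {
            physicsWorld.contactDelegate = self
        }

        func didBegin(_ contact: SKPhysicsContact)
        {
            // Map collision strength to loudness and the contact's height to pitch
            // (both mappings are illustrative, not the values Spritely uses).
            let amplitude = min(Double(contact.collisionImpulse) / 50, 1)
            let frequency = 220 + Double(contact.contactPoint.y)

            instrument?.playNote(frequency: frequency, amplitude: amplitude)
        }
    }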