Started working on new visuals for GRAIN noir a/v live set.
I wanted to set up simple communication between Ableton and openFrameworks via OSC to trigger visual effects according to variations in the sound.
I had already done something similar in Processing using MIDI data only, but this time I wanted a more detailed system, with specific frequency ranges affecting their corresponding visuals in real time.
Each track in Ableton has a Max for Live Analysis Grabber loaded that sends both trigger and follower data for the low, mid and high frequencies to openFrameworks, on a specific port defined in the master GrabberSender.
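On the openFrameworks side this just needs an ofxOscReceiver listening on that same port. A minimal sketch of the setup, assuming a hypothetical port number (12345) and member names (`low`, `mid`, `high`) for the follower values:

```cpp
// ofApp.h (excerpt) — port and member names are placeholders
#include "ofMain.h"
#include "ofxOsc.h"

class ofApp : public ofBaseApp {
public:
    void setup();
    void update();

    ofxOscReceiver receiver;       // listens for the Analysis Grabber data
    float low, mid, high;          // latest follower value per frequency band
};

// ofApp.cpp (excerpt)
void ofApp::setup(){
    // must match the port set in the master GrabberSender
    receiver.setup(12345);
    low = mid = high = 0.f;
}
```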
In order to map the sound data to openFrameworks parameters, each incoming follower message is checked and its value stored using ofxOscMessage:
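Something along these lines, where the OSC addresses are assumptions — use whatever addresses your Analysis Grabber / GrabberSender patch actually sends:

```cpp
// ofApp.cpp (excerpt) — drain the OSC queue every frame
void ofApp::update(){
    while(receiver.hasWaitingMessages()){
        ofxOscMessage m;
        receiver.getNextMessage(m);

        // hypothetical address scheme: one follower address per band
        if(m.getAddress() == "/low/follower"){
            low = m.getArgAsFloat(0);
        } else if(m.getAddress() == "/mid/follower"){
            mid = m.getArgAsFloat(0);
        } else if(m.getAddress() == "/high/follower"){
            high = m.getArgAsFloat(0);
        }
    }
}
```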
After checking the data type of the received trigger message, the sound follower data can be passed into oF parameters:
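For example, guarding on the argument type with `getArgType()` before mapping the follower value into a visual parameter's range. The parameter name and value ranges below are hypothetical:

```cpp
// ofApp.h (excerpt)
ofParameter<float> radius;   // hypothetical visual parameter driven by the low band

// ofApp.cpp (excerpt) — inside the OSC loop, after matching the trigger address
if(m.getArgType(0) == OFXOSC_TYPE_FLOAT){
    // remap the 0–1 follower value into the parameter's useful range,
    // clamped so stray values can't blow up the visuals
    radius = ofMap(low, 0.f, 1.f, 10.f, 300.f, true);
}
```

Doing the type check first avoids reading a float out of a message that actually carried an int or string, which ofxOsc will happily let you do otherwise.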