Portal

Portal is an audiovisual installation using motion controls to interact with an imaginary entity.
Hand movements trigger sounds and visuals accordingly, becoming more intense as the hands get closer to it.

I wanted to create the illusion of opening a portal to an arcane world using nothing but both hands.

Leap Motion finger tracking data is mapped to oF parameters (camera position, opacity and a noise function) using ofxLeapMotion:

testApp.cpp
if(leap.isFrameNew() && simpleHands.size()){
    // map the Leap coordinate ranges to the oF scene
    leap.setMappingX(-230, 230, -ofGetWidth()/2, ofGetWidth()/2);
    leap.setMappingY(90, 490, -ofGetHeight()/2, ofGetHeight()/2);
    leap.setMappingZ(-150, 150, -200, 200);

    for(int i = 0; i < simpleHands.size(); i++){
        for(int j = 0; j < simpleHands[i].fingers.size(); j++){
            int id = simpleHands[i].fingers[j].id;
            ofPoint pt = simpleHands[i].fingers[j].pos;

            icoSphere.setRadius(width * 0.5);

            // camera distance follows the finger height, capped at 5000
            cam.setDistance(abs(pt.y) * 10);
            if(cam.getDistance() > 5000){
                cam.setDistance(5000);
            }

            // only the forward half of the z range (negative values) feeds the noise
            noise = pt.z;
            if(noise > 0){
                noise = 0;
            }
            noiseParticle = ofMap(abs(noise), 0, 255, 0., 2.);

            // finger height and depth mapped to colour parameters, then clamped
            colorY = ofMap(abs(pt.y), 0, 490, 0, 255);
            if(colorY > 255){
                colorY = 255;
            } else if(colorY < 0){
                colorY = 0;
            }
            colorZ = ofMap(pt.z, 0, 150, 0, 255);
            if(colorZ < -255){
                colorZ = -255;
            } else if(colorZ > 0){
                colorZ = 0;
            }
            radiusY = ofMap(abs(pt.y), 0, 490, 0, 1.5);

            // camera rotation
            float mx = pt.z / (float)ofGetWidth();
            float my = pt.x / (float)ofGetHeight();
            ofVec3f des(mx * 360.0, my * 360.0, 0);
            cameraRotation += (des - cameraRotation) * 0.5;

            fingersFound.push_back(id);
        }
    }
}
leap.updateGestures();
leap.markFrameAsOld();
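
Only the update step is shown above; a minimal sketch of how the mapped values could then be applied in draw() might look like the following (the wireframe rendering and the way each parameter is used are assumptions based on the snippet, not the actual Portal draw code):

void testApp::draw(){
    ofBackground(0);
    cam.begin();
    ofPushMatrix();
    // apply the rotation smoothed in the update step
    ofRotateX(cameraRotation.x);
    ofRotateY(cameraRotation.y);
    // colour and scale follow the hand position mappings
    ofSetColor(255, colorY, 255 + colorZ);
    ofScale(1 + radiusY, 1 + radiusY, 1 + radiusY);
    // noiseParticle could drive a vertex displacement or shader uniform here
    icoSphere.drawWireframe();
    ofPopMatrix();
    cam.end();
}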




The main pad sound in the background is looped and modulated with feedback grain delay + overdrive effects, whose parameters are controlled by the leap MoDULAtion Max app by Masayuki Akamatsu.



A/V live OSC test

I started working on new visuals for the GRAIN noir a/v live set.
I wanted to set up a simple communication channel between Ableton and openFrameworks via OSC, to trigger visual effects according to sound variations.

I had already done something similar with Processing using only MIDI data, but this time I wanted a more detailed system, with specific frequency ranges affecting the corresponding visuals in real time.

Each track in Ableton has a Max for Live Analysis Grabber device loaded that sends both trigger and follower data for the low, mid and high frequencies to openFrameworks, on a specific port defined in the master GrabberSender.
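
The snippets below assume a small wrapper object exposing an ofxOscReceiver; a minimal sketch of such a class could look like this (the class layout and the port number are assumptions, the port simply has to match the one defined in the GrabberSender):

// OSC.h -- hypothetical wrapper around the receiver used by the tracks
#pragma once
#include "ofxOsc.h"

#define GRABBER_PORT 12345 // assumed value, must match the GrabberSender

class OSC {
public:
    ofxOscReceiver receiver;

    void setup(){
        // listen for the trigger and follower messages sent from Ableton
        receiver.setup(GRABBER_PORT);
    }
};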

In order to map the sound data to openFrameworks parameters, the follower message is checked and updated using ofxOscMessage:

introTrack.cpp
void introTrack::findOSC(OSC &osc){
    while(osc.receiver.hasWaitingMessages()){
        // get the next message
        ofxOscMessage m;
        osc.receiver.getNextMessage(&m);

        // check for follower message
        if(m.getAddress() == "/introKickFollower"){
            introKickFollower = m.getArgAsFloat(0);
        }
    }
}


After checking the data type of the received trigger message, it's now possible to pass the sound follower data into oF parameters:

introTrack.cpp
void introTrack::oscTrigger(){
    for(int i = 0; i < NUM_MSG_STRINGS; i++){
        ofDrawBitmapString(msg_strings[i], 10, 40 + 15 * i);

        if(msg_strings[i] == "/introKickTrigger: int32:1"){
            // map sound data when trigger message is received
            ofBackground(255, 255, 255);
            ofSetColor(0, 0, 0);
            ofCircle(ofGetWidth()/2, ofGetHeight()/2, introKickFollower);
        }
    }
}
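
The msg_strings buffer checked above isn't defined in these snippets. One hypothetical way to fill it, modelled on the stock oscReceiveExample that ships with openFrameworks, is to extend findOSC() so that trigger messages are stored as strings (msg_strings and NUM_MSG_STRINGS are assumed members of introTrack):

void introTrack::findOSC(OSC &osc){
    while(osc.receiver.hasWaitingMessages()){
        ofxOscMessage m;
        osc.receiver.getNextMessage(&m);

        if(m.getAddress() == "/introKickFollower"){
            introKickFollower = m.getArgAsFloat(0);
        }
        // hypothetical: store trigger messages as strings so that
        // oscTrigger() can compare against them
        else if(m.getAddress() == "/introKickTrigger"){
            // builds a string such as "/introKickTrigger: int32:1"
            string msg_string = m.getAddress() + ":";
            for(int i = 0; i < m.getNumArgs(); i++){
                msg_string += " " + m.getArgTypeName(i) + ":";
                if(m.getArgType(i) == OFXOSC_TYPE_INT32){
                    msg_string += ofToString(m.getArgAsInt32(i));
                } else if(m.getArgType(i) == OFXOSC_TYPE_FLOAT){
                    msg_string += ofToString(m.getArgAsFloat(i));
                }
            }
            // shift older messages down and keep the newest one first
            for(int i = NUM_MSG_STRINGS - 1; i > 0; i--){
                msg_strings[i] = msg_strings[i - 1];
            }
            msg_strings[0] = msg_string;
        }
    }
}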



Interactive stained-glass window

I recently worked for “La Cité du Vitrail” in the French city of Troyes on an interactive installation. The original installation wasn’t working anymore, and since the original developer was living abroad at the time, I was asked to debug it or, if necessary, develop entirely new software.

The setup is quite simple: a webcam tracks the position of a red pointer in real time, and a stained-glass window is projected in the chapel whenever the pointer is placed on the corresponding one on the surface of the table.


The original program was made with Processing: one application was dedicated to detecting the pointer and another one handled the display of the correct stained-glass window. The two applications communicated via server sockets, which led to signal loss at random intervals. On top of that, the camera detection was quite sensitive to the ambient luminosity of the space, which is why the program didn’t always work as expected.


I first attempted to merge everything into a single Processing application, but it wasn’t stable enough even after debugging. Even though I had only 4 days before the inauguration of the new collection, I decided to build entirely new software in C++ with openFrameworks. I used the OpenCV library to detect blobs in the camera image and assigned coordinates to each of the 61 triangles represented on the table. When the pointer is above the right coordinates, it triggers an animation with the ofxTween library, displaying the stained-glass window.
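
As a rough illustration of that detection step, a sketch using ofxOpenCv could look like the one below. The image members, threshold value, triangles (a vector of ofPolyline outlining each zone of the table) and currentWindow are assumptions for the example, not the production code:

// ofApp.cpp -- rough sketch only; grabber (ofVideoGrabber), colorImg
// (ofxCvColorImage), grayImg (ofxCvGrayscaleImage), contourFinder
// (ofxCvContourFinder), triangles (vector<ofPolyline>) and currentWindow
// are assumed members declared in ofApp.h, and the threshold value is arbitrary.
void ofApp::update(){
    grabber.update();
    if(grabber.isFrameNew()){
        colorImg.setFromPixels(grabber.getPixels());
        grayImg = colorImg;
        // isolate the bright pointer from the darker table surface
        grayImg.threshold(80);
        // look for a single blob roughly the size of the pointer
        contourFinder.findContours(grayImg, 20, 5000, 1, false);
        if(contourFinder.nBlobs > 0){
            ofPoint pointer = contourFinder.blobs[0].centroid;
            // compare the blob centroid with the 61 stored triangle outlines
            for(int i = 0; i < (int)triangles.size(); i++){
                if(triangles[i].inside(pointer.x, pointer.y)){
                    // index of the stained-glass window to animate with ofxTween
                    currentWindow = i;
                }
            }
        }
    }
}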


The installation is now stable, and a single application handles the whole process.