Performing Sensor32 at Cafebar Mona, Munich, DE, with Karina Erhard, "Roboterjazz", March 25, 2023

 

Sensor 32 entry video for the Guthman Musical Instrument Competition 2023 (reached finalist status so far). Click: Entry Video. Live video stream.

Sensor 32 live experimental stream (dark) from Ircam Forum at NYU 2022, at Garage Brooklyn

Same soundtrack, but with the picture data from the stream transformed into a robot-style creature conducting.

FAQ Sensor 32

 

Presentation "More Parameters for Your Music in Realtime", held at MOXSonic 2023

Experimental Work for sensor32.com

This application uses the entire data space of the independent sensor outputs. If you recognize gestures instead, you reduce the information to symbols, which is an information loss. For me, this represents a limitation of the expressive possibilities.

Shots from the development of the 32-sensor array for the performance at Ircam Forum NYU, Oct 2022

 

The new sensor array allows numerous parameters to be controlled simultaneously in real time (polyphonic). This seems to me essential for improvisation. Hands, legs, and the upper body can be used. IR distance sensors based on the triangulation principle generate analogue voltages that are converted into MIDI continuous controllers. The array features 32 to 48 controllers; most of them feed algorithms. For the player's orientation, LED bars with 10 to 20 display levels are placed close to the sensors.
In the first phase, resynthesis from NI Reaktor was used for the sonification.
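To make the signal path concrete, here is a minimal sketch of how analogue IR-sensor voltages could be turned into MIDI continuous controller messages. This is not the actual Sensor 32 code: the voltage range, controller numbering, linear scaling, port name, and use of the mido library are all assumptions for illustration; real triangulation sensors respond non-linearly to distance and would need calibration or a lookup table.

```python
# Sketch only: map digitized IR distance-sensor voltages to MIDI CC messages.
# Assumptions: 32 sensors, readings in volts, usable range roughly 0.4-3.1 V,
# linear scaling (a simplification), and the 'mido' library for MIDI output.
import mido

NUM_SENSORS = 32          # the array features 32 to 48 controllers
V_MIN, V_MAX = 0.4, 3.1   # assumed usable output range of the sensor

def voltage_to_cc(volts: float) -> int:
    """Clamp a sensor voltage into the usable range and scale it to 0..127."""
    v = min(max(volts, V_MIN), V_MAX)
    return round((v - V_MIN) / (V_MAX - V_MIN) * 127)

def send_frame(port, voltages):
    """Send one frame of sensor readings as CC messages on controllers 1..NUM_SENSORS."""
    for i, volts in enumerate(voltages[:NUM_SENSORS]):
        port.send(mido.Message('control_change', control=i + 1,
                               value=voltage_to_cc(volts)))

if __name__ == '__main__':
    # Hypothetical virtual port name; requires a MIDI backend that supports virtual ports.
    out = mido.open_output('Sensor32 Sketch', virtual=True)
    send_frame(out, [1.7] * NUM_SENSORS)  # dummy readings for illustration
```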

 

The mechanical construction, in the manner of a construction kit, allows the arrangement of the sensor boards in three-dimensional space to be adapted.
Potential: besides composition algorithms or DJ-ing, automata (I have built some), spatial effects, and light and video synthesizers could also be controlled. So we are looking at a kind of half-a-conductor (with the hardware made of semiconductors). Gesture recognition or AI are not planned; I prefer direct cause-and-effect access. I want to learn myself. Of course, a lot of configuration work has to be done for each composition, and rehearsal effort is required on the part of the performer.
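As an illustration of what that per-composition configuration work might look like, the following sketch routes incoming controller numbers to algorithm parameters with piece-specific ranges. The parameter names, ranges, and the mapping format are invented for this example and are not taken from Sensor 32.

```python
# Hypothetical per-piece configuration: controller number -> (parameter, low, high).
PIECE_A = {
    1:  ('granular.density',  0.0,   1.0),
    2:  ('granular.pitch',  -12.0,  12.0),
    17: ('spatial.azimuth',   0.0, 360.0),
    32: ('light.intensity',   0.0,   1.0),
}

def apply_cc(mapping, control, value):
    """Scale a 0..127 CC value into the target parameter's range for this piece."""
    if control not in mapping:
        return None
    name, lo, hi = mapping[control]
    return name, lo + (value / 127.0) * (hi - lo)

print(apply_cc(PIECE_A, 17, 64))   # -> ('spatial.azimuth', ~181.4)
```

A separate mapping of this kind per composition would let the same 32 to 48 controllers drive entirely different algorithms, effects, or light setups from piece to piece.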

 

I do not start from the paradigm of universal gestures. Rather, the playing (ad hoc composing) of a complex instrument is imitated: operating an organ with hands and feet produces gestures as a side effect, but they always depend on the purpose of the sound production and the construction of the console. In this respect, it is a traditional approach that relates to highly developed performance technique. The benefit of my system is that it makes numerous parameters simultaneously available to computer algorithms of all kinds. Gestures also arise, of course, as a composed visualisation of the sound.

(Program Notes for Ircam Forum at NYU 2022)