accelerometer expressive pedal

it is possible to build an expressive pedal using the 3-axis MMA7361 accelerometer (you can buy this on ebay for $3 CAD). since the pedal only sweeps through a limited angle, the x-axis output spans just 1110 mV to 1300 mV, so it needs to be offset-subtracted and amplified: differential amplifier & op-amp voltage and gain calculator. the next step is to convert the analog signal to digital (using the ADC of your microcontroller) and finally interpret that in pure data (in my case). there is a hardware low-pass filter, but i am also using an infinite impulse response (iir) low-pass filter in pure data. feel free to write me directly if you need help reproducing this setup.
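as a sketch of the two software steps, here is a minimal python version of the adc normalization (using the 1110–1300 mV range above) and a one-pole iir low-pass like the one in the pure data patch; the filter coefficient `alpha` is an illustrative assumption, not the value in the patch.

```python
def adc_to_position(mv, lo=1110.0, hi=1300.0):
    """normalize the x-axis voltage (mV) to a 0..1 pedal position, clamped."""
    t = (mv - lo) / (hi - lo)
    return min(1.0, max(0.0, t))

def lowpass(samples, alpha=0.2):
    """one-pole iir low-pass: y[n] = y[n-1] + alpha * (x[n] - y[n-1])."""
    y, out = 0.0, []
    for x in samples:
        y += alpha * (x - y)
        out.append(y)
    return out

# a sudden heel-to-toe step settles smoothly instead of jumping
smoothed = lowpass([adc_to_position(mv) for mv in [1110.0] * 5 + [1300.0] * 50])
```

the same smoothing runs at audio-control rate in pure data; doing it in software keeps the hardware filter simple.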

[image: differential gain op-amp schematic]

Videos

Alternatives

4 years ago i built an expressive pedal using an led and a photo-resistor:

there is also this solution using a capacitance sensor and v-usb:
http://www.ise.pw.edu.pl/~wzab/MIDI/pedal/index.html

and of course, there is the traditional potentiometer solution:
http://philaudio.wordpress.com/projects/phi-t/phi-t-control/

02 10 2013

exercise bike

i needed to get in shape (still do), so i borrowed an exercise bike from a friend. i wanted to see the biggest cities in the world while doing so: google earth. all that was missing was a way to control google earth from the bike: the kinect and the speech recognition sdk.

Video

Source

my solution is a modification of this project: Kinect Excercise. i added speech recognition ("let me ride… a big city"), some keyboard shortcuts and mouse automation for google earth (using http://www.autoitscript.com), plus OSC for an upcoming project (a small game based on this concept). you can download the hacky c# project.

20 08 2013

guitar neck tracking & gesture recognition

i finally found something useful to do with my kinect: tracking the neck of a guitar and using gesture recognition to control the FX rack of a pure data patch.

Video

Guitar neck tracking

i used the natural interaction middleware hand tracking example (PointViewer) and added open sound control output (liblo). latency is 33 ms. you can download the source and the executable for linux (64-bit).
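liblo handles the wire format, but the osc encoding itself is simple enough to sketch by hand; here is a self-contained python version (the address and values are made up for illustration, not what the tracker actually sends):

```python
import struct

def osc_message(address, *floats):
    """encode one osc message with float32 arguments (big-endian)."""
    def pad(b):
        # osc strings are null-terminated, then padded to a 4-byte boundary
        return b + b'\x00' * (4 - len(b) % 4)
    msg = pad(address.encode('ascii')) + pad(b',' + b'f' * len(floats))
    for f in floats:
        msg += struct.pack('>f', f)
    return msg

# one hand-position update, as it could be sent to pure data over udp
packet = osc_message("/hand/xyz", 0.5, 0.25, 1.0)
```

sending the resulting bytes over a udp socket is all an osc client needs to do; pure data's [netreceive]/[unpackOSC] side decodes the same layout.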

Gesture recognition

i am using the neat gesture recognition toolkit by Nick Gillian. starting from the DTW (dynamic time warping) example (coded in openframeworks), i simply added open sound control to send the predicted gesture to pure data. you can download the source and the executable for linux (64-bit).
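to make the DTW idea concrete, here is a minimal python sketch of dynamic time warping and nearest-template classification; the toolkit's actual implementation is far more elaborate, and the gesture names and templates below are invented for illustration.

```python
def dtw(a, b):
    """dynamic time warping distance between two 1-d sequences."""
    inf = float('inf')
    n, m = len(a), len(b)
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of the three allowed alignment moves
            cost[i][j] = d + min(cost[i - 1][j], cost[i][j - 1], cost[i - 1][j - 1])
    return cost[n][m]

def classify(sample, templates):
    """label of the recorded template closest to the live gesture."""
    return min(templates, key=lambda label: dtw(sample, templates[label]))

templates = {"strum-up": [0.0, 1.0, 2.0, 3.0], "strum-down": [3.0, 2.0, 1.0, 0.0]}
```

the point of DTW is that a gesture performed slower or faster than the recorded template still matches, because the warping absorbs the timing differences.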

Pure Data

nothing fancy here: the patch sends the tracking data via osc to the gesture recognition application, gets the predicted gesture back, and applies some FX to an incoming signal based on the X, Y, Z coordinates. you can download the patch.

09 07 2013

theremin à crayon

The “Not-just-for-sci-fi electronic instrument” that is played without being touched, plus a graphic tablet on top and some very simple electronics in the case (power, and conversion of the theremin signals to USB). Both antennas (control voltages for volume and pitch) are routed to PureData.

The patch is really just a bridge (open sound control) to MyPaint (an open-source graphics application for digital painters). Right now the volume is linked to the diameter of the brush and the pitch is linked to the color brightness (this can be changed in the code, see below).
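Both mappings boil down to a clamped linear scale; a python sketch (the input and output ranges below are illustrative guesses, not the values in the patch):

```python
def scale(x, in_lo, in_hi, out_lo, out_hi):
    """linearly map x from [in_lo, in_hi] to [out_lo, out_hi], clamped."""
    t = (x - in_lo) / (in_hi - in_lo)
    t = min(1.0, max(0.0, t))
    return out_lo + t * (out_hi - out_lo)

# volume antenna (normalized 0..1) -> mypaint's logarithmic brush radius
radius = scale(0.7, 0.0, 1.0, -2.0, 3.0)
# pitch antenna (Hz) -> color brightness (hsv value)
value = scale(440.0, 80.0, 1000.0, 0.0, 1.0)
```

Clamping matters here: a theremin's control voltages drift outside the calibrated range easily, and an unclamped map would push the brush parameters to nonsense values.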

BTW, this is the beauty of the open source movement: I had the idea in the morning, talked to some people on #mypaint in the afternoon, hacked the source for my needs during the night, and went to bed with a working prototype. Ready-made solutions require ready-made problems; for everything else there is open source software!

Video

Source

MyPaint: share/gui/document.py -> pyliblo server (receive from pd)

import liblo, sys
import gobject

created = False  # module-level guard so only one osc server is created

class Document (CanvasController):
    def __init__(self, app, leader=None):
        global created
        if not created:
            self.server = liblo.Server(9997)
            self.server.add_method("/mp/radius", 'f', self.oscradius)
            self.server.add_method("/mp/v", 'f', self.oscv)  # brightness (path assumed)
            self.server.add_method("/mp/zoom", 'f', self.osczoom)
            self.server.add_method("/mp/rotate", 'f', self.oscrotate)
            gobject.timeout_add(20, self.pollcheck)  # poll for osc messages every 20 ms
            created = True

    def oscradius(self, path, args):
        adj = self.app.brush_adjustment['radius_logarithmic']
        adj.set_value(args[0])

    def oscv(self, path, args):
        # set the brush color brightness (hsv value), clamped
        h, s, v = self.app.brush.get_color_hsv()
        v = args[0]
        if v < 0.005: v = 0.005
        if v > 1.0: v = 1.0
        self.app.brush.set_color_hsv((h, s, v))

    def osczoom(self, path, args):
        self.tdw.set_zoom(args[0])

    def oscrotate(self, path, args):
        self.tdw.set_rotation(args[0])

    def pollcheck(self):
        self.server.recv(10)  # dispatch pending osc messages (10 ms timeout)
        return True  # keep the gobject timeout running

MyPaint: share/mypaint/lib/stroke.py -> pyliblo client (send pressure, x, y to pd)

import liblo, sys

class Stroke:  # excerpt: only the osc additions to mypaint's Stroke class are shown
    def __init__(self):
        self.target = liblo.Address(1234)  # pd listens on udp port 1234

    def record_event(self, dtime, x, y, pressure, xtilt, ytilt):
        self.tmp_event_list.append((dtime, x, y, pressure, xtilt, ytilt))
        liblo.send(self.target, "/mypaint/pressure", pressure)

PureData patch: [image: bridge patch screenshot]

25 03 2013

heavy box

NFAQ

how heavy?
32 lb (14.5 kg), roughly a medium-sized dog

what are you going to do with it?
a centralized place for my digital art projects and my analog outputs. think of it as an effects unit or dsp in a box, but it's also a silent computer with good processing power (at the time of this writing). the main project is to remix videos on the fly while playing an instrument.

what os/software are you using?
ubuntu studio with a low-latency kernel; pure data (effects rack, video player); sooperlooper (looping station); control (an osc android application), all open source projects

what kind of wood did you use?
okoumé (marine plywood). it's a light wood and i was able to cut it with an x-acto!

i want more information
sure! the sources for the project (mainly related to the electronics) are here

13 01 2013