It’s a media center built in an amplified box with speech recognition capabilities. The box can understand as many languages as Kiku offers (currently English, Japanese, German & Portuguese), and since Kiku works offline, there’s no need for an internet connection.
A custom-built actuated guitar controlled by Pure Data. Three micro servos strum the strings and Pure Data takes care of the global pitch shifting. The humbucker sits really far from the bridge to avoid picking up noise from the servos’ magnetic field (there’s still some). Also included is a relay to switch on high-voltage stuff (a light, a disco ball and whatnot).
I think I made this project to showcase Pure Data’s FX capabilities, so I had to make it controllable from the internet (streaming webcam/audio on pdpatchrepo). Finally, there’s a chatbot (ALICE) with TTS that takes over when only one person is connected to the stream.
It is possible to build an expressive pedal using the 3-axis MMA7361 accelerometer (you can buy one on eBay for $3 CAD). We are dealing with a limited angle (1110 mV to 1300 mV), so the x-axis output needs to be offset-subtracted and amplified: see a differential amplifier & op-amp voltage and gain calculator. The next step is to convert the analog signal to digital (using your microcontroller’s ADC) and finally interpret that in Pure Data (in my case). There’s a hardware low-pass filter, but I am also using an infinite impulse response (IIR) low-pass filter in Pure Data. Feel free to write me directly if you need help reproducing this setup.
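The signal chain above (scale the usable millivolt range, then smooth it) can be sketched in Python as a one-pole IIR low-pass; this is a hypothetical stand-in for the Pure Data patch, and the sample values, range limits and the `alpha` coefficient are assumptions, not measured figures:

```python
def make_lowpass(alpha):
    """One-pole IIR low-pass: y[n] = alpha*x[n] + (1 - alpha)*y[n-1]."""
    state = {"y": None}
    def step(x):
        if state["y"] is None:
            state["y"] = x  # seed the filter with the first sample
        else:
            state["y"] = alpha * x + (1 - alpha) * state["y"]
        return state["y"]
    return step

def mv_to_position(mv, lo=1110.0, hi=1300.0):
    """Map the usable x-axis range (1110-1300 mV per the text) to 0..1, clamped."""
    pos = (mv - lo) / (hi - lo)
    return min(1.0, max(0.0, pos))

# fake ADC readings in mV, smoothed into a pedal position 0..1
lp = make_lowpass(0.2)
for mv in [1110, 1200, 1300, 1295, 1290]:
    position = lp(mv_to_position(mv))
```

A smaller `alpha` smooths more aggressively at the cost of added latency, which is the same trade-off you tune in the Pure Data filter.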
Four years ago I built an expressive pedal using an LED and a photoresistor:
The “Not-just-for-sci-fi electronic instrument” that is played without being touched, plus a graphic tablet on top & some very simple electronics in the case (powering the theremin / converting its signals to USB). Both antennas (control voltages for volume and pitch) are routed to Pure Data.
The patch is really just a bridge (via Open Sound Control) to MyPaint (an open-source graphics application for digital painters). Right now the volume is linked to the diameter of the brush and the pitch is linked to the color brightness (this can be changed in the code, see below).
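Both mappings boil down to a clamped linear scaling from an antenna range to a brush parameter range. A minimal sketch, where the input and output ranges are assumptions for illustration (the real ones live in the patch and in the MyPaint hooks below):

```python
def scale(x, in_lo, in_hi, out_lo, out_hi):
    """Linearly map x from [in_lo, in_hi] to [out_lo, out_hi], clamped."""
    t = (x - in_lo) / (in_hi - in_lo)
    t = min(1.0, max(0.0, t))
    return out_lo + t * (out_hi - out_lo)

# volume antenna (assumed normalized 0..1) -> brush radius in pixels
radius = scale(0.5, 0.0, 1.0, 1.0, 64.0)

# pitch antenna (assumed 80..1000 Hz) -> HSV brightness, floored at 0.005
# to match the clamp in the oscv() hook below
v = scale(540.0, 80.0, 1000.0, 0.005, 1.0)
```

Swapping which antenna feeds which parameter is just a matter of rerouting the OSC messages in the patch.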
BTW this is the beauty of the open-source movement: I had the idea in the morning, talked to some people on #mypaint in the afternoon, hacked the source for my needs during the night, and went to bed with a working prototype. Ready-made Solutions Require Ready-made Problems; For Everything Else There Is Open Source Software!
MyPaint: share/gui/document.py -> pyliblo server (receive from pd)
```python
import liblo, sys

class Document (CanvasController):

    def __init__(self, app, leader=None):
        created = True

    def oscradius(self, path, args):
        adj.set_value(args)

    def oscv(self, path, args):
        h, s, v = self.app.brush.get_color_hsv()
        v = args
        if v < 0.005: v = 0.005
        if v > 1.0: v = 1.0
        self.app.brush.set_color_hsv((h, s, v))

    def osczoom(self, path, args):
        self.tdw.set_zoom(args)

    def oscrotate(self, path, args):
        Stroke.serial_number += 1
        self.serial_number = Stroke.serial_number
        return True
```
MyPaint: share/mypaint/lib/stroke.py -> pyliblo client (send pressure, x, y to pd)
```python
self.target = liblo.Address(1234)

def record_event(self, dtime, x, y, pressure, xtilt, ytilt):
    self.tmp_event_list.append((dtime, x, y, pressure, xtilt, ytilt))
```