Archive for the ‘Pure Data’ category

Phimatics

Just a quick video of the phibow connected to the phimatics project (not yet documented).

Thanks to the Ouellet Fredericks family for displaying (nicely framed) my PCB in their home!

09 03 2018

Bassynth & drum

Had an old bass lying around; lots of love & hot glue later, the bassynth was born. I'll document it a bit later (including the electronic drum), I just wanted to share what my friend confettis created:

09 03 2018

Boîte à oiseaux

La Boîte à oiseaux is a combination of 3 projects:

It’s a media center built in an amplified box with speech recognition capabilities. La boîte can understand as many languages as Kiku offers (currently English, Japanese, German & Portuguese) and since Kiku works offline, there’s no need to be connected to the internet.

Video

24 01 2014

strumio

a custom-built actuated guitar controlled by pure data. 3 micro servos strum the strings and pure data takes care of the global pitch shifting. the humbucker sits really far from the bridge to avoid picking up noise from the magnetic field (there's still some). also included is a relay to switch on high-voltage stuff (a light, a disco ball and whatnot).

i think i made this project to showcase pure data fx capabilities, so i had to make it controllable from the internet (streaming webcam / audio on pdpatchrepo). finally, there's a chatbot (alice) with tts that takes over when only one person is connected to the stream.

Photo

Video

Source

i don’t think it is useful without the actuated guitar, but here’s the patch:
http://pdpatchrepo.info/patches/patch/40

02 11 2013

pure data patch repository

pdpatchrepo
http://www.pdpatchrepo.info/

Pure Data has a repository for abstractions and externals, but not for patches. Not anymore: I took the time to code one. There are multiple ways to search for a patch: by platform, by tag (adc~, notein), is audio, is video, is generative…

The site also features a live stream (video & audio) with a networked GUI so that multiple visitors can interact with the streaming patch. Of course there's latency in the feed when playing with the knobs (between 3 and 5 seconds), but it is still a fun way to jam with others. The bandwidth is provided by the Institute of Electronic Music and Acoustics.

RSS feeds

Patch (when a new patch is added)
Stream (when a new patch is streamed)

Screenshot

31 10 2013

accelerometed expressive pedal

it is possible to build an expressive pedal using the 3-axis MMA7361 accelerometer (you can buy one on ebay for about $3 CAD). we are dealing with a limited angle (1110 mV to 1300 mV on the x-axis), so that output needs to have an offset subtracted and then be amplified: differential amplifier & op-amp voltage and gain calculator. the next step is to convert the analog signal to digital (using the ADC of your microcontroller) and finally interpret that in pure data (in my case). there's a hardware low-pass filter, but i am also using an infinite impulse response (IIR) low-pass filter in puredata. feel free to write me directly if you need help reproducing this setup.

[schematic: differential amplifier / op-amp gain stage]
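here's a rough python sketch (not the actual pd patch) of the two software steps described above: scale the x-axis reading from its useful range to 0..1, then smooth it with a one-pole IIR low-pass filter. the 1110-1300 mV range comes from the post; the 0.1 smoothing coefficient and the fake readings are only illustrative assumptions.

# scale the x-axis millivolt reading, then smooth it with a one-pole IIR low-pass
V_MIN, V_MAX = 1110.0, 1300.0   # useful x-axis range in mV (from the post)

def scale(mv):
    """Map a millivolt reading onto 0..1, clamped to the useful range."""
    t = (mv - V_MIN) / (V_MAX - V_MIN)
    return min(max(t, 0.0), 1.0)

class OnePoleLowpass:
    """y[n] = a*x[n] + (1-a)*y[n-1], the same idea as the IIR low-pass used in pd."""
    def __init__(self, a=0.1):
        self.a = a
        self.y = 0.0
    def process(self, x):
        self.y = self.a * x + (1.0 - self.a) * self.y
        return self.y

if __name__ == "__main__":
    lp = OnePoleLowpass(a=0.1)
    for mv in (1110, 1150, 1200, 1295, 1280, 1180):   # fake ADC readings
        print(round(lp.process(scale(mv)), 3))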

Videos

Alternatives

4 years ago i built an expressive pedal using an LED and a photo-resistor:

there is also this solution using a capacitance sensor and V-USB:
http://www.ise.pw.edu.pl/~wzab/MIDI/pedal/index.html

of course, the traditional potentiometer solution:
http://philaudio.wordpress.com/projects/phi-t/phi-t-control/

02 10 2013

guitar neck tracking & gesture recognition

i finally found something useful to do with my kinect: tracking the neck of a guitar and using gesture recognition to control the FX rack of a pure data patch.

Video

Guitar neck tracking

i used the natural interaction middleware hand tracking example (PointViewer) and added open sound control (liblo). latency is 33ms. you can download the source and the executable for linux (64bit).

Gesture recognition

i am using the neat gesture recognition toolkit by Nick Gillian. using the DTW (Dynamic Time Warping) example (coded in openframeworks), i simply added open sound control to send the predicted gesture to pure data. you can download the source and the executable for linux (64bit).

Pure Data

nothing fancy here, just a patch that sends the tracking data via osc to the gesture recognition toolkit, gets the predicted result back, and applies some FX to an incoming signal using X, Y, Z. you can download the patch.
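for illustration, here is a rough pyliblo sketch (not the actual patch or the openframeworks code) of the osc round trip described above: send the hand position to the recognizer and get the predicted gesture back. the ports (9000 / 9001) and the /hand and /gesture paths are assumptions.

# send hand tracking data to the gesture recognizer, receive predictions back
import liblo

grt = liblo.Address(9000)            # the DTW recognizer listens here (assumed)
server = liblo.Server(9001)          # we listen here for predictions (assumed)

def on_gesture(path, args):
    # a predicted gesture arrived; in the real setup this switches FX in pure data
    print("predicted gesture:", args[0])

server.add_method("/gesture", 'i', on_gesture)

def send_hand(x, y, z):
    # forward one tracking frame to the recognizer
    liblo.send(grt, "/hand", float(x), float(y), float(z))

if __name__ == "__main__":
    send_hand(0.1, 0.5, 0.9)   # one fake tracking frame
    server.recv(100)           # poll for a prediction (timeout in ms)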

09 07 2013

theremin à crayon

The “Not-just-for-sci-fi electronic instrument” that is played without being touched, plus a graphic tablet on top & some very simple electronics in the case (power supply / conversion of the theremin's signals to USB). Both antennas (control voltages for volume and pitch) are routed to PureData.

The patch is really just a bridge (open sound control) to MyPaint (an open-source graphics application for digital painters). Right now the volume is linked to the diameter of the brush and the pitch is linked to the brightness of the color (this can be changed in the code, see below).

BTW this is the beauty of the open source movement: I had the idea in the morning, talked to some people on #mypaint in the afternoon, hacked the source for my needs during the night and went to bed with a working prototype. Ready-made solutions require ready-made problems; for everything else there is open source software!

Video

Source

MyPaint: share/gui/document.py -> pyliblo server (receive from pd)

import liblo, sys

created = False   # module-level flag so the OSC server is only set up once

class Document (CanvasController):
    def __init__(self, app, leader=None):
        global created
        if not created:
            self.server = liblo.Server(9997)
            self.server.add_method("/mp/radius", 'f', self.oscradius)
            # the brightness handler below also needs to be registered;
            # the "/mp/v" path name is assumed here
            self.server.add_method("/mp/v", 'f', self.oscv)
            self.server.add_method("/mp/zoom", 'f', self.osczoom)
            self.server.add_method("/mp/rotate", 'f', self.oscrotate)
            gobject.timeout_add(20, self.pollcheck)
            created = True

    def oscradius(self, path, args):
        # theremin volume -> brush radius
        adj = self.app.brush_adjustment['radius_logarithmic']
        adj.set_value(args[0])

    def oscv(self, path, args):
        # theremin pitch -> brightness (the V of HSV), clamped to a usable range
        h, s, v = self.app.brush.get_color_hsv()
        v = args[0]
        if v < 0.005: v = 0.005
        if v > 1.0: v = 1.0
        self.app.brush.set_color_hsv((h, s, v))

    def osczoom(self, path, args):
        self.tdw.set_zoom(args[0])

    def oscrotate(self, path, args):
        self.tdw.set_rotation(args[0])

    def pollcheck(self):
        # poll the OSC server; returning True keeps the gobject timeout running
        self.server.recv(10)
        return True

MyPaint: share/mypaint/lib/stroke.py -> pyliblo client (send pressure, x, y to pd)

import liblo, sys

# inside class Stroke:
    def __init__(self):
        # pd listens for OSC on port 1234
        self.target = liblo.Address(1234)

    def record_event(self, dtime, x, y, pressure, xtilt, ytilt):
        self.tmp_event_list.append((dtime, x, y, pressure, xtilt, ytilt))
        liblo.send(self.target, "/mypaint/pressure", pressure)

PureData patch:
[screenshot: bridgepd]
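Since the patch itself is only shown as a screenshot, here is a rough pyliblo illustration (not the actual Pd patch) of what the bridge does: map the theremin's volume and pitch values onto MyPaint's OSC inputs. Port 9997 and /mp/radius come from the document.py hack above; the /mp/v path, the normalised 0..1 inputs and the radius scaling are assumptions.

# forward theremin volume/pitch values to MyPaint's OSC server
import liblo

mypaint = liblo.Address(9997)   # the server created in document.py above

def bridge(volume, pitch):
    """volume, pitch: normalised 0..1 values derived from the two antennas."""
    radius = -2.0 + 4.0 * volume                 # radius_logarithmic range (assumed)
    brightness = max(0.005, min(1.0, pitch))     # same clamp as oscv above
    liblo.send(mypaint, "/mp/radius", radius)
    liblo.send(mypaint, "/mp/v", brightness)

if __name__ == "__main__":
    bridge(0.5, 0.8)   # one fake control frame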

25 03 2013

heavy box

NFAQ

how heavy?
32 lb (14.5 kg), roughly a medium-sized dog

what are you going to do with it?
a centralized place for my digital art projects and my analog outputs. think of it as an effects unit or dsp in a box, but it's also a silent computer with good processing power (at the time of this writing). the main project is to remix videos on the fly while playing an instrument.

what os/software are you using?
ubuntu studio with a low-latency kernel; pure data (effect rack, video player); sooperlooper (looping station); control (osc android application), all open source projects

what kind of wood did you use?
okoumé (marine plywood). it's a light wood and i was able to cut it with an x-acto!

i want more information
sure! the sources for the project (mainly related to the electronics) are here

13 01 2013

face 2 blender

21 07 2009

gesture recognition in pure data with easystroke

14 07 2009

chess music

A simple bridge between glchess and pure data:

02 07 2009

theremin 2 pure data

My first electronic project (2005):

02 07 2009

stock market music

30 06 2009