NetGlitch

This is a big one. It’s a packet-analyzing network sniffer that outputs crazy rhythms, mapped to crazy sounds. We (@0xtosh and I) set it up at HAR2009, where people could log in to a Linux VM and run some scripts, ours or their own, to scan the network; NetGlitch then broadcast a Shoutcast stream you could tune into to listen to your own packets.
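The actual NetGlitch code isn’t part of this post, but the core idea is easy to sketch: sniff packets and map their properties to note and rhythm events. Below is a rough Python sketch of that idea using scapy; the pitch/velocity mapping is arbitrary, and it only prints the events instead of feeding a synth or a stream.

# Rough sketch of the NetGlitch idea (not the actual code):
# sniff packets and turn their properties into note events.
# Needs scapy (pip install scapy) and usually root privileges.
from scapy.all import sniff, IP, TCP, UDP

def packet_to_note(pkt):
    """Map a packet to a (pitch, velocity) pair; the mapping is arbitrary."""
    pitch = 36 + (len(pkt) % 48)          # packet length picks the pitch
    if pkt.haslayer(TCP):
        velocity = 100
    elif pkt.haslayer(UDP):
        velocity = 80
    else:
        velocity = 60
    return pitch, velocity

def on_packet(pkt):
    if not pkt.haslayer(IP):
        return
    pitch, velocity = packet_to_note(pkt)
    # A real setup would send this to a synth (MIDI, OSC, Pd); here we just print.
    print(f"{pkt[IP].src} -> {pkt[IP].dst}: note {pitch} vel {velocity}")

if __name__ == '__main__':
    sniff(prn=on_packet, store=False)     # sniff until interrupted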

APC update

I hardly have time lately to code anything at home. I just dusted off the APC and made slight progress, going from almost no sound to some sound…

That is in “dongle” mode. In standalone mode it works fine. I even added a set of capacitors so you can switch between several sorts of sounds, and a light-sensitive resistor for that look-without-touching-I-can-play-this-APC touch.

I was successful in attaching it to the computer with a regular standard Arduino + Firmata and playing it from Pure Data. But this is an Arduino Pro Mini (5V, 16MHz, from Sparkfun), and apparently Firmata isn’t entirely the same on these minis. Since Arduino 0017 I can connect and send data, but it’s not without errors. First of all, this is what I see when I try to run the Pd patch:


[arduino]_WARNING_currently_unsupported: 176
[arduino]_WARNING_currently_unsupported: 160
[arduino]_WARNING_currently_unsupported: 176
UNKNOWN_INPUT_COMMAND: 0 10263
UNKNOWN_INPUT_COMMAND: 8 12339
UNKNOWN_INPUT_COMMAND: 11 9710

Lots of those, thousands of them…

Another weird thing: even though pin 13 is unconnected on the Arduino, it’s the only pin that seems to work. I really don’t get it. Is there some sort of different mapping? Normally only pins 9, 10 and 11 are supposed to work.
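One thing I can still try to narrow this down is to take Pd and pduino out of the equation and talk to Firmata directly, for instance from Python with pyFirmata. The sketch below just sweeps the PWM pins and blinks pin 13; the serial port is an assumption and depends on your FTDI cable and OS.

# Quick check that bypasses Pd/pduino and talks to Firmata directly via pyFirmata.
# Assumes StandardFirmata is loaded on the board; adjust the serial port.
import time
from pyfirmata import Arduino

board = Arduino('/dev/ttyUSB0')          # assumption: FTDI cable shows up here

# Sweep the pins that should do PWM on a 5V/16MHz Pro Mini.
for pin_number in (9, 10, 11):
    pin = board.get_pin(f'd:{pin_number}:p')   # 'p' = PWM mode
    print(f"sweeping pin {pin_number}")
    for level in (0.0, 0.25, 0.5, 0.75, 1.0):
        pin.write(level)                 # duty cycle between 0.0 and 1.0
        time.sleep(0.5)
    pin.write(0.0)

# Pin 13 is not PWM-capable, so just toggle it digitally.
led = board.get_pin('d:13:o')
led.write(1)
time.sleep(0.5)
led.write(0)

board.exit()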

Any ideas, anyone?

Arduino controlled APC

I pimped an ordinary Atari Punk Console with some extra LEDs, a light-sensitive resistor and… an Arduino. It’s still a build in progress and I will have some pics up soon, but it WORKS.

I had a basic Arduino sketch running and was fiddling with the pins on the 556 when suddenly a sound occurred. It was reproducible, and so the synth was born.
I’m now working on the software, which will use either Firmata/Pduino or MIDI, if I get the MIDI-in working. A third option is the net shield and Pachube.

I’ll keep you posted…

blender augmented reality

Various techniques and pieces of software exist to bring more interaction and integration with reality. As this is a fairly new domain, most of the projects are still in the research phase. Some are already easy to implement, like motion tracking based on colorful objects, or camera motion based on movement (accelerometers and other sensors). Blender is a good experimentation environment because it’s open and you can interface it with virtually everything using Python.
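As a tiny illustration of that Python interfacing, here is a sketch that listens for tracker coordinates on a UDP socket and moves an object in the scene. The port, the plain-text "x y z" message format and the object name 'Cube' are assumptions, and the exact calls depend on the Blender version (this targets the bpy timer API of recent releases).

# Minimal sketch of feeding external tracking data into Blender.
# Assumes a tracker (ARToolKit, EyesWeb, ...) sends "x y z" as plain text
# over UDP and that the scene contains an object called 'Cube'.
# Run from Blender's text editor.
import socket
import bpy

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(('127.0.0.1', 9000))          # assumption: the tracker sends here
sock.setblocking(False)

obj = bpy.data.objects['Cube']

def poll_tracker():
    """Read the latest tracker packet, if any, and move the object."""
    try:
        data, _ = sock.recvfrom(1024)
        x, y, z = (float(v) for v in data.decode().split())
        obj.location = (x, y, z)
    except BlockingIOError:
        pass                            # no new data this tick
    return 0.05                         # run again in 50 ms

bpy.app.timers.register(poll_tracker)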

Some things already look stable and usable, like Levelhead. This application is more than a proof of concept: it shows that interesting new concepts are possible using these techniques. The tools are getting easier to use, and CPUs and GPUs are finally up to real-time processing and rendering.

Current usable tools include:

  • osgART: a library that simplifies the development of Augmented Reality or Mixed Reality applications by combining the well-known ARToolKit tracking library with OpenSceneGraph.
  • ARToolKit: a software library for building Augmented Reality (AR) applications. Open source, with professional support from ARToolworks Inc. if needed.
  • Studierstube Tracker: a computer vision library for detection and pose estimation of 2D fiducial markers. It looks great, but it’s not open source.
  • libraries for Processing: by Bryan Chung
  • MotionTrack: by akta (Marco Rapino)
  • GridFlow: a multidimensional dataflow processing library for Pure Data, designed for interactive multimedia
  • Voodoo: a tool for the integration of virtual and real scenes, not intended for real-time processing
  • EyesWeb: for gestural analysis; could be used to get feedback from reality, and it speaks OSC

BlenderTrack: image-based tracking BGE demo from Mike Pan

If you know of good tools in the AR area, please leave a note. Once the workflow is on track, developing applications will become easier. Combining gestural data with 3D modeling tools, so that 3D models can be animated in real time using the body of an actor as input, is one idea that comes to mind. Another could be overlaying your newspaper with interactive data, like graphs or models of the new product being reviewed. Mobile phone apps already exist that let you view additional information about landmark X when you point them at it. Advertising also comes to mind, or using it as a try-before-you-buy tool: imagine you could put that new sofa on the spot where you always wanted it, without having to move it, assemble it and then notice the colors don’t match… just don’t try to sit on it.

axiome

Axiome is an mlr-style (monome-style) Pd patch. I’m still looking for a good beat slicer to build my own custom controller around, and this one is a candidate for sure. Since I still don’t own a monome, I will have to hack it up so it becomes compatible with my yet-to-be-designed budget-friendly MIDI-and-OSC controller. Luckily it’s GPL’ed.
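Whatever that controller ends up being, it will most likely talk OSC to the patch. As a placeholder, here is a minimal Python sketch (using python-osc) that fakes monome-style button presses; the /box/press address and the port are assumptions and have to match whatever the patch or monomeserial actually listens for.

# Stand-in for a monome: send fake button press/release events to a Pd patch
# over OSC. Needs python-osc (pip install python-osc). The address and port
# are placeholders and must match what the patch actually listens for.
import time
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient('127.0.0.1', 8000)     # assumption: Pd listens here

def press(x, y, state):
    """One grid event: x, y are button coordinates, state is 1 (down) or 0 (up)."""
    client.send_message('/box/press', [x, y, state])

# Tap a few buttons in the first row.
for x in range(4):
    press(x, 0, 1)
    time.sleep(0.1)
    press(x, 0, 0)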