touchpad controller

I excavated a touchpad from an old laptop. I’m almost ready for my first reverse engineering project. Although a quick google search already turned up a possibly similar attempt, I hope the electric gods are a bit cooperative on this one. This will make a super nice and compact MIDI controller.
It says “ALPS 94v-0” and “JCI-S1S” on the PCB and “1CAO17A” on the chipset. Zero results on google…
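
If I do get X/Y coordinates out of it (PS/2 would be my guess for an ALPS pad, but that’s exactly what the reverse engineering has to confirm), the MIDI half should be the easy part. A rough Arduino sketch of that half, with the touchpad reads stubbed out as plain analog reads for now:

    // Rough sketch of the MIDI half of the idea: map the pad's X/Y position to
    // two MIDI control change messages. The touchpad reads are stubbed out with
    // plain analogRead() calls until the real protocol is figured out.

    const byte MIDI_CHANNEL = 0;   // MIDI channel 1
    const byte CC_X = 16;          // general purpose controller 1
    const byte CC_Y = 17;          // general purpose controller 2

    // Placeholder until the pad is decoded: pretend it's two potentiometers.
    int readPadX() { return analogRead(A0); }
    int readPadY() { return analogRead(A1); }

    void sendCC(byte cc, byte value) {
      Serial.write(0xB0 | MIDI_CHANNEL);   // control change status byte
      Serial.write(cc);
      Serial.write(value & 0x7F);          // data bytes are 7-bit
    }

    void setup() {
      Serial.begin(31250);                 // standard MIDI baud rate
    }

    void loop() {
      sendCC(CC_X, map(readPadX(), 0, 1023, 0, 127));
      sendCC(CC_Y, map(readPadY(), 0, 1023, 0, 127));
      delay(10);
    }

Sending only on change, plus a bit of smoothing, would be the obvious next step, but that can wait until the pad actually talks.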

We’ll see, I hope I don’t blow it…

midi in

Narbotic adds midi in to my wishlist of urgently-to-be-done stuff. I have been playing with the idea of midi out on arduino, but this seems a pretty good solution for finally adding midi capabilities to my WSG and my WP-20. This should also eliminate my need for a hardware sequencer, although I’d love to build one.

arduino-midi-in by narbotic

The main thing I hope to achieve is being able to turn the knobs on my synths faster. This way I can really program presets, or sequence them. Probably I will be able to break some stuff too…
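
On the Arduino side the parsing itself is not much work either. A minimal sketch, assuming the usual opto-isolated MIDI-in circuit on the RX pin: it fishes control change messages out of the stream and mirrors one controller on a PWM pin, as a stand-in for whatever ends up driving the synth.

    // Minimal MIDI-in parser: listens for control change messages on the
    // hardware serial port and mirrors one controller value on a PWM pin.

    const byte PWM_PIN = 9;

    byte runningStatus = 0;   // last status byte seen
    byte data[2];
    byte dataCount = 0;

    void setup() {
      Serial.begin(31250);            // MIDI baud rate
      pinMode(PWM_PIN, OUTPUT);
    }

    void handleControlChange(byte channel, byte cc, byte value) {
      if (cc == 1) {                       // react to the mod wheel, for example
        analogWrite(PWM_PIN, value * 2);   // scale 0-127 up to 0-254
      }
    }

    void loop() {
      while (Serial.available()) {
        byte b = Serial.read();
        if (b >= 0xF8) continue;      // ignore system real-time bytes (clock etc.)
        if (b & 0x80) {               // new status byte
          runningStatus = b;
          dataCount = 0;
        } else if ((runningStatus & 0xF0) == 0xB0) {   // control change data
          data[dataCount++] = b;
          if (dataCount == 2) {
            handleControlChange(runningStatus & 0x0F, data[0], data[1]);
            dataCount = 0;            // running status: keep reading pairs
          }
        }
      }
    }

From there it’s a small step to one digital pot or filtered PWM output per controller, which is exactly the turning-knobs-by-wire I’m after.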


blender augmented reality

Various techniques and software exist to bring virtual content into closer interaction and integration with reality. As this is a fairly new domain, most of the projects are still in the research phase. Some are already easy to implement, like motion tracking based on colorful objects or camera motion estimation based on sensor data (accelerometers and the like). Blender is a good experimentation environment because it’s open and you can interface it with virtually everything using Python.

Some things already look stable and usable, like Levelhead. This application is more than a proof of concept: it proves that interesting new concepts are possible with these techniques. The tools are getting easier to use and the CPUs and GPUs are finally up to realtime processing and rendering.

Currently usable tools include:

  • osgart: a library that simplifies the development of Augmented Reality or Mixed Reality applications by combining the well-known ARToolKit tracking library with OpenSceneGraph.
  • ARToolKit: a software library for building Augmented Reality (AR) applications. Open source, with professional support from artoolworks inc. if needed.
  • Studierstube Tracker: a computer vision library for detection and pose estimation of 2D fiducial markers. It looks great, but it’s not open source.
  • libraries for Processing: by Bryan Chung
  • MotionTrack: by akta (Marco Rapino)
  • GridFlow: a multidimensional data flow processing library for PureData, designed for interactive multimedia
  • Voodoo: a tool for the integration of virtual and real scenes, not intended for real-time processing
  • Eyesweb: for gestural analysis; could be used to get feedback from reality, and it speaks OSC (a minimal OSC example follows this list)
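
Since several of these speak OSC, it’s worth noting how little it takes to push data from one environment into another. A minimal example (plain C++ and POSIX sockets, with the address, port and values made up): it hand-packs an OSC message with two floats, say a marker position, and fires it over UDP; on the Blender side a small Python script with a UDP socket can catch it.

    // Pack one OSC message ("/tracker/pos" with two float arguments) by hand
    // and send it over UDP to localhost:9001. Address pattern, port and values
    // are placeholders.

    #include <arpa/inet.h>
    #include <cstdint>
    #include <cstring>
    #include <netinet/in.h>
    #include <string>
    #include <sys/socket.h>
    #include <unistd.h>
    #include <vector>

    // OSC strings are null-terminated and padded with zeros to a multiple of 4.
    static void padString(std::vector<char> &buf, const std::string &s) {
      buf.insert(buf.end(), s.begin(), s.end());
      buf.push_back('\0');
      while (buf.size() % 4) buf.push_back('\0');
    }

    // OSC floats are 32-bit IEEE 754 in big-endian byte order.
    static void addFloat(std::vector<char> &buf, float f) {
      uint32_t bits;
      std::memcpy(&bits, &f, sizeof bits);
      bits = htonl(bits);
      const char *p = reinterpret_cast<const char *>(&bits);
      buf.insert(buf.end(), p, p + 4);
    }

    int main() {
      std::vector<char> msg;
      padString(msg, "/tracker/pos");   // address pattern (made up)
      padString(msg, ",ff");            // type tags: two floats
      addFloat(msg, 0.25f);             // x
      addFloat(msg, 0.75f);             // y

      int sock = socket(AF_INET, SOCK_DGRAM, 0);
      sockaddr_in dest{};
      dest.sin_family = AF_INET;
      dest.sin_port = htons(9001);      // whatever port the receiver listens on
      inet_pton(AF_INET, "127.0.0.1", &dest.sin_addr);
      sendto(sock, msg.data(), msg.size(), 0,
             reinterpret_cast<sockaddr *>(&dest), sizeof dest);
      close(sock);
      return 0;
    }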

BlenderTrack: image-based tracking BGE demo from Mike Pan

If you know of good tools in the AR area, please leave a note. Once the workflow is on track, developing applications will be easier. One idea that comes to mind is combining gestural data with 3D modeling tools, so a 3D model can be animated in real time using an actor’s body as input. Another could be overlaying your newspaper with interactive data, like graphs or models of the new product being reviewed. Mobile phone apps already exist that show additional information about a landmark when you point them at it. Advertising also comes to mind, or using it as a try-before-you-buy tool: imagine you could put that new sofa on the spot where you always wanted it, without having to move it, assemble it and then notice the colors don’t match… just don’t try to sit on it.

axiome

Axiome is an mlr (monome-style) Pd patch. I’m still looking for a good beatslicer to build my own custom controller around, and this one is a candidate for sure. As long as I still don’t own a monome, I will have to hack it up so it becomes compatible with my yet-to-be-designed-budget-friendly-midi-and-osc-controller. Luckily it’s GPL’ed.
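
For the controller side, the bare minimum would be an Arduino scanning a small button grid and spitting out note on/off messages that a hacked-up axiome could map to slice triggers. A sketch of that idea, with pin numbers, grid size and note layout all placeholders:

    // Hypothetical starting point for the budget controller: scan a 4x4 button
    // grid and send each press/release as a MIDI note on/off.

    const byte ROWS = 4, COLS = 4;
    const byte rowPins[ROWS] = {2, 3, 4, 5};   // driven low one at a time
    const byte colPins[COLS] = {6, 7, 8, 9};   // read with internal pull-ups
    bool pressed[ROWS][COLS] = {};

    void setup() {
      Serial.begin(31250);                     // MIDI baud rate
      for (byte r = 0; r < ROWS; r++) pinMode(rowPins[r], INPUT);
      for (byte c = 0; c < COLS; c++) pinMode(colPins[c], INPUT_PULLUP);
    }

    void sendNote(byte note, bool on) {
      Serial.write(on ? 0x90 : 0x80);          // note on / note off, channel 1
      Serial.write(note);
      Serial.write(on ? 100 : 0);              // velocity
    }

    void loop() {
      for (byte r = 0; r < ROWS; r++) {
        pinMode(rowPins[r], OUTPUT);
        digitalWrite(rowPins[r], LOW);         // select this row
        for (byte c = 0; c < COLS; c++) {
          bool down = (digitalRead(colPins[c]) == LOW);
          if (down != pressed[r][c]) {
            pressed[r][c] = down;
            sendNote(36 + r * COLS + c, down); // notes 36..51
          }
        }
        pinMode(rowPins[r], INPUT);            // release the row again
      }
      delay(5);                                // crude debounce
    }

Proper monome or OSC compatibility would be the fancier route, but plain MIDI notes should be enough to start slicing.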