In today’s society we are practically attached to technology; I can’t remember the last time I went a day without using my iPod, cell phone, or laptop. These items have almost become essential for survival, so I figured, why not try to incorporate these technologies into what we are learning in Processing? Last night I found out that there is a Processing app for the iPhone, iPad, and iPod touch (I bought it for $4.99). For the final project I’m trying to think of a way to combine motion tracking (Arduino or Kinect camera), interactivity (iPod, iPhone, iPad), and graphics (Processing code that each of us, or a group, produces) into an interactive installation. I’m picturing a simple structure (a box installation) that students/visitors move through and interact with using their own devices to produce a graphic, which is then projected on the floor or walls of the structure.
The other part of this idea is to create graphics in Processing that are triggered by an exterior sensor (in a mousePressed-like fashion) as students enter and exit the installation space; these graphics would also be projected onto the floors, walls, etc. The sensor would trigger the Processing program as students entered, and the graphics would then change/adapt as they moved through.
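To make the "mousePressed-like" idea concrete, here is a minimal sketch of the trigger logic written as a plain Java state machine. In the real installation the entry/exit events would come from the Arduino over Processing's Serial library (via serialEvent) and the graphic would be drawn each frame in draw(); all of the names and graphic states below are placeholders I made up for illustration.

```java
// Hypothetical trigger logic for the installation, modeled on
// Processing's event style. Sensor callbacks play the role of
// mousePressed(); currentGraphic() plays the role of draw().
public class InstallationTrigger {
    // Possible states of the projected graphic
    enum Phase { IDLE, ENTERING, MOVING, EXITING }

    private Phase phase = Phase.IDLE;

    // Called when the entry sensor fires -- the mousePressed analogue
    public void onEnterSensor() {
        if (phase == Phase.IDLE) phase = Phase.ENTERING;
    }

    // Called as motion tracking (Arduino or Kinect) reports movement,
    // so the graphic can change/adapt while visitors move through
    public void onMotion() {
        if (phase == Phase.ENTERING || phase == Phase.MOVING) phase = Phase.MOVING;
    }

    // Called when the exit sensor fires
    public void onExitSensor() {
        phase = Phase.EXITING;
    }

    // Called once per frame, like draw(): returns which graphic to
    // project, and resets to idle after the exit animation plays
    public String currentGraphic() {
        switch (phase) {
            case ENTERING: return "entry-burst";
            case MOVING:   return "trail";
            case EXITING:  phase = Phase.IDLE; return "fade-out";
            default:       return "ambient";
        }
    }
}
```

The point of keeping the trigger logic in its own little state machine is that the same code works whether the event comes from a mouse click during testing or from the real sensor during the show.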
I’m not completely sure of the logistics of these ideas, but I know they are possible. Here are some preliminary sketches.