“Soak, Dye in Light” Kinect + Processing

Lately my searches consist of “kinect”, “processing”, “kinect + processing”, and “pong”. While trying to find examples of how people have used the Kinect with Processing, I came across an interesting project entitled “Soak, Dye in Light”. Basically the project is a blank canvas that, once touched, becomes suffused with vivid colors, resembling fabric absorbing dye.
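I couldn’t find their source, but here’s a rough stab at a similar effect in plain Processing, with the mouse standing in for touch. This is just a diffusion toy, not the artists’ code:

```processing
// A rough approximation of "dye soaking into fabric" -- NOT the artists' code,
// just the basic idea. Click/drag = touch; dye is deposited and bleeds
// outward like liquid wicking through cloth.
int cols = 128, rows = 96;
float[][] dye;

void setup() {
  size(640, 480);
  colorMode(HSB, 360, 100, 100);
  dye = new float[cols][rows];
  noStroke();
}

void draw() {
  if (mousePressed) {
    int cx = constrain(mouseX * cols / width, 1, cols - 2);
    int cy = constrain(mouseY * rows / height, 1, rows - 2);
    dye[cx][cy] = 100;  // saturate the touched cell
  }
  // simple diffusion: each cell wicks dye from its four neighbors
  float[][] next = new float[cols][rows];
  for (int x = 1; x < cols - 1; x++) {
    for (int y = 1; y < rows - 1; y++) {
      float neighbors = (dye[x-1][y] + dye[x+1][y] + dye[x][y-1] + dye[x][y+1]) / 4.0;
      next[x][y] = max(dye[x][y] * 0.995, neighbors * 0.96);  // bleed, slowly dry
    }
  }
  dye = next;
  // render: a blank (white) canvas that takes on vivid color where soaked
  float cw = width / (float) cols, ch = height / (float) rows;
  for (int x = 0; x < cols; x++) {
    for (int y = 0; y < rows; y++) {
      float d = dye[x][y];
      fill(map(d, 0, 100, 200, 330), d, 100);  // zero dye = white canvas
      rect(x * cw, y * ch, cw, ch);
    }
  }
}
```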

[Source]

See video


Sandboxed (no-Kinect) version of Final Project for Prototyping

Download the non-Kinect version of the final project point of departure. The four classes, GroupManager, InputInf, SampleGroupObject, and SampleObject, are all identical, literally the same files. The only change is to the root sketch, which removes the Kinect functionality altogether and replaces its input with the mouse. This should enable you to do all of your work and test it with the mouse without hassle.
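To show the pattern at work, here is a boiled-down sketch of how the mouse can stand in for the Kinect. The getInputPosition() function is a made-up name for illustration, not one of the actual course classes:

```processing
// Everything downstream reads from one input function, so only this function
// has to change between the mouse version and the Kinect version.
// getInputPosition() is a hypothetical stand-in, not a course class.
PVector getInputPosition() {
  // mouse stand-in; the Kinect version would return a tracked point instead
  return new PVector(mouseX, mouseY);
}

void setup() {
  size(640, 480);
}

void draw() {
  background(0);
  PVector p = getInputPosition();
  ellipse(p.x, p.y, 40, 40);  // whatever the framework does with the input
}
```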

Final Project, version 1

Today in class we will do a walkthrough of the core framework you will be provided with for the generation of final projects. The files have been documented to provide you with a sample and an initial body of comments to guide you at the start. I will continue to work on the framework to provide additional functionality for the groups as it is needed.

It is not required, but highly recommended, that you partner with a fellow classmate. Pair programming can be an optimal way to catch bugs and work through logic problems more rapidly than working alone. Of course, the scope of the project should reflect two minds and bodies.

The project will use the Microsoft Kinect hardware for input and will be projected at a large scale. We will talk through working on the project in the absence of a Kinect unit. Of course, if you are interested and able, or already have a Kinect, you can follow the installation instructions to run the device on your own computer. Even if you are not running the Kinect hardware on your computer, you should follow the same instructions to install the libraries, etc., so that the Processing code will compile properly.
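For reference, a minimal depth-image sketch might look like the following. This assumes Daniel Shiffman’s openkinect library for Processing is installed; the exact method names vary a bit between library versions, so check the examples bundled with whichever version you install:

```processing
// Minimal Kinect depth view, assuming Shiffman's openkinect library.
// Method names differ between library versions (older releases used
// start()/enableDepth() instead of initDepth()).
import org.openkinect.processing.*;

Kinect kinect;

void setup() {
  size(640, 480);
  kinect = new Kinect(this);
  kinect.initDepth();  // enable the depth stream
}

void draw() {
  background(0);
  image(kinect.getDepthImage(), 0, 0);  // grayscale depth view
}
```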

If you are working out of the lab, we will discuss other methods for developing your final projects.

Final Project Files

Project Idea

The idea I have been thinking about for the final project is to visually represent the intensity of sound made by a person through an array of dots that either increase or decrease in size. Sound could be detected by a series of SPL sensors located at the corners, and perhaps midpoints, of the array. A gradient would run from the sensor detecting the highest sound level to the sensors with lower levels. The same idea could be accomplished by using the Kinect and representing the person’s position in front of the array, similar to this project: http://www.oobject.com/12-moving-building-facades-videos/aperture-facade-installation/4278/
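Here is a quick Processing mock-up of the dot-array idea, with the mouse standing in for whatever the SPL sensors (or the Kinect’s tracked position) would report:

```processing
// Grid of dots whose diameters fall off with distance from a "loud" point.
// The mouse stands in for the sensed sound source / tracked person.
int spacing = 24;

void setup() {
  size(640, 480);
  noStroke();
  fill(255);
}

void draw() {
  background(0);
  for (int x = spacing / 2; x < width; x += spacing) {
    for (int y = spacing / 2; y < height; y += spacing) {
      // nearer the source = more "intensity" = bigger dot
      float d = dist(x, y, mouseX, mouseY);
      float diameter = map(constrain(d, 0, 300), 0, 300, spacing, 3);
      ellipse(x, y, diameter, diameter);
    }
  }
}
```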

Proj3ct Id3A

In today’s society we are literally attached to technology; I can’t really remember the last time I didn’t use my iPod, cell phone, or laptop. These items have almost become essential for survival, so I figured, why not try to incorporate these technologies with what we are learning in Processing? Last night I found out that there is a Processing app for the iPhone, iPad, and iPod touch (I bought it, $4.99). For the final project I’m trying to think of a way to incorporate motion tracking (Arduino or Kinect camera), interactivity (iPod, iPhone, iPad), and graphics (Processing code each of us or a group produces) into an interactive installation. I’m thinking of a simple structure (a box installation) that students/visitors move through and interact with using their own devices to produce a graphic, which is then projected on the floor or walls of the structure.

The other part of this idea is to create graphics in Processing that are triggered by an exterior sensor (in a mousePressed-like fashion) as students enter and exit the installation space; these graphics would also be projected onto the floors, walls, etc. The sensor would trigger the Processing program as students entered and would then change/adapt as they moved through.
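A bare-bones sketch of that trigger pattern, with mousePressed standing in for the exterior sensor (a real sensor on an Arduino would send the same kind of event over the serial port, but the sketch-side logic stays the same):

```processing
// Graphics run and evolve only while the space is "occupied";
// mousePressed stands in for the enter/exit sensor event.
boolean occupied = false;
float phase = 0;

void setup() {
  size(640, 480);
  noStroke();
}

void draw() {
  background(0);
  if (occupied) {
    phase += 0.02;  // the graphic changes/adapts over time
    float size = 200 + 100 * sin(phase * 2);
    fill(128 + 127 * sin(phase), 100, 200);
    ellipse(width / 2, height / 2, size, size);
  }
}

void mousePressed() {
  occupied = !occupied;  // the sensor event: someone entered or exited
}
```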

I’m not completely sure of the logistics of these ideas, but I know it is possible. Here are some preliminary sketches.

Processing, Kinect

In my last post I briefly touched on the Kinect; here is an interesting music video I found that was produced using Processing, the Kinect, Cinema 4D, and After Effects.

via notcot.org

From the author…

“We started with the Kinect interface library developed for Processing and made available by Daniel Shiffman. Some modifications were introduced, to get our 3D data into Cinema 4D. Each file represents one frame with a coordinate map (point index, x, y, z lines) in plaintext. A threshold filter was added to enable us to filter out any points that were too far. This way we could remove the living room entirely from the sequence. Importing files manually into Cinema 4D would be painful/impossible, so we wrote a little python script that read these files in real-time, allowing us to tweak playback speed, resolution, etc. After importing all our scenes into Cinema 4D, the rendered sequences were taken into After Effects and Premiere. Business as usual from here on.”
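Purely as a guess at what the Processing side of that pipeline could have looked like (their actual code is in the walkthrough linked below), here is a sketch that writes one plaintext file per frame in the point index, x, y, z format they describe, with a simple depth threshold. It assumes Shiffman’s openkinect library:

```processing
// Guessed reconstruction of the per-frame export the authors describe:
// one plaintext file per frame, one "index, x, y, z" line per point, with a
// threshold filter to drop far points (the living room). Assumes Shiffman's
// openkinect library; getRawDepth() returns a flat 640x480 array of raw
// depth values (larger = farther).
import org.openkinect.processing.*;

Kinect kinect;
int frameIndex = 0;
int threshold = 700;  // raw depth cutoff -- anything farther is discarded

void setup() {
  size(640, 480);
  kinect = new Kinect(this);
  kinect.initDepth();
}

void draw() {
  int[] depth = kinect.getRawDepth();
  PrintWriter out = createWriter("frames/frame_" + nf(frameIndex, 5) + ".txt");
  int point = 0;
  for (int y = 0; y < 480; y++) {
    for (int x = 0; x < 640; x++) {
      int z = depth[x + y * 640];
      if (z > 0 && z < threshold) {  // keep only near points
        out.println(point + ", " + x + ", " + y + ", " + z);
        point++;
      }
    }
  }
  out.flush();
  out.close();
  frameIndex++;
}
```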

A detailed walkthrough is available here, including the Cinema 4D files download and the Processing code.

Moullinex – Catalina from Moullinex on Vimeo.

Fantasy vs. Reality

Oftentimes I come across a really cool or unique concept, technology, or graphic portrayed in a movie that makes me wonder: could this really exist? What is the inspiration behind it? Film is one of the most popular forms of entertainment in our world today, and as filmmakers push the limits with modern technology, it really makes you begin to wonder what is possible. Two movies in particular have caught my eye: Minority Report, starring Tom Cruise, and Iron Man, starring Robert Downey Jr. In Minority Report, Tom Cruise’s character basically uses an interactive UI (user interface) to catch potential criminals before they have committed a crime. This may present questions and concerns on many levels, but for the sake of this post I am only interested in the character’s interaction with the digital interface. I believe one day we may begin to design and create buildings “hands free,” which leads into the next movie, Iron Man. In Iron Man, Tony Stark interacts with his computer digitally; with interactive technology, Stark is able to pull apart his Iron Man suit and solve any design and maintenance issues that may be presented. As you can see in the videos below, these two examples are fantasy.

Fantasy pt. 1

Fantasy pt. 2

Now that we have seen the movie side of this technology, let’s take a look at two examples that are taking technology like this and making it a reality. The first example comes from FAST; it is pretty much a replication of pre-crime from the movie. Part two of this “Minority Report” is less of an infringement on privacy but captures the same type of UI interaction we see in the movie; the technology being developed is called g-speak, a spatial operating environment. Check out the video below, and see the previous link to the website for more info.

Reality


g-speak overview 1828121108 from john underkoffler on Vimeo.

The second video pushes the envelope even further, as the video-gaming environment begins to invade our own. The technology I speak of is the Kinect for the Xbox gaming system. Many hackers have begun to break this technology down and really explore its capabilities beyond playing games; most interesting to me is how one hacker began to interact with AutoCAD “hands free.”