fabricate yourself.

Presented at the Tangible, Embedded and Embodied Interaction Conference, the setup turns a Kinect into a 3-D scanner. The Kinect is hooked up to a Mac, and users can pose in front of it while watching a real-time wire-frame representation of themselves onscreen. When they see a pose they like, they hit a button and are captured in an STL (stereolithography) file. The file is sent to the 3-D printer, which finally spits out a small, low-resolution model.
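The core of that pipeline, going from a grid of depth values to a printable STL mesh, can be sketched in a few lines. This is a minimal illustration, not the project's actual code: the function name is hypothetical, and a small synthetic grid stands in for a real Kinect depth frame.

```python
def depth_to_stl(depth, name="scan"):
    """Convert a 2-D grid of depth values (list of rows) into an
    ASCII STL string by triangulating the height field."""
    def facet(a, b, c):
        # STL facets carry a normal; slicers recompute it, so emit zeros.
        return ("facet normal 0 0 0\n outer loop\n"
                + "".join(f"  vertex {x} {y} {z}\n" for x, y, z in (a, b, c))
                + " endloop\nendfacet\n")

    rows, cols = len(depth), len(depth[0])
    tris = []
    for i in range(rows - 1):
        for j in range(cols - 1):
            # Two triangles per grid cell; z comes from the depth map.
            p00 = (j, i, depth[i][j])
            p10 = (j + 1, i, depth[i][j + 1])
            p01 = (j, i + 1, depth[i + 1][j])
            p11 = (j + 1, i + 1, depth[i + 1][j + 1])
            tris.append(facet(p00, p10, p11))
            tris.append(facet(p00, p11, p01))
    return f"solid {name}\n" + "".join(tris) + f"endsolid {name}\n"

# A 3x3 synthetic "depth frame": 4 cells, 2 triangles each = 8 facets.
stl = depth_to_stl([[0, 1, 0], [1, 2, 1], [0, 1, 0]])
print(stl.count("facet normal"))  # 8
```

A real capture would also need to close the mesh into a watertight solid before printing; this sketch only shows the height-field triangulation step.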

Pukas sensor surfing.

Surfing is still a sport governed by feelings. The driving forces behind this joint project, PUKAS and TECNALIA, aim to “turn feelings into facts and figures” and provide as yet unquantified data that can be applied directly to improve the features of surfboards, the technical performance of surfers, and the measurement of parameters during competition.

kinected conference.

What can we do if the screen in a videoconference room can turn into an interactive display? Using the Kinect's camera and sound sensors, we explore how expanding a system's understanding of spatially calibrated depth and audio alongside a live video stream can generate semantically rich three-dimensional pixels containing information about their material properties and location. Four features are implemented: “Talking to Focus”, “Freezing Former Frames”, “Privacy Zone” and “Spatial Augmenting Reality”.
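One idea behind a feature like “Privacy Zone” can be sketched simply: once every video pixel carries a calibrated depth value, pixels whose depth falls inside a marked-off region can be masked before the stream is transmitted. The function name, frame format, and thresholds below are illustrative assumptions, not the project's implementation.

```python
def mask_privacy_zone(frame, depth, near, far, fill=0):
    """Return a copy of `frame` (a 2-D grid of pixel values) with every
    pixel whose depth lies in [near, far] replaced by `fill`."""
    return [
        [fill if near <= depth[i][j] <= far else frame[i][j]
         for j in range(len(row))]
        for i, row in enumerate(frame)
    ]

# A 2x2 toy frame with per-pixel depths (in metres): pixels between
# 2.0 m and 3.5 m from the camera are blanked out.
video = [[10, 20], [30, 40]]
depths = [[1.0, 2.5], [3.0, 0.5]]
print(mask_privacy_zone(video, depths, 2.0, 3.5))  # [[10, 0], [0, 40]]
```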