Thursday, November 15, 2012
Integers between 1 and 10 seemed to be the simplest data I could turn into a three-dimensional model, so I decided to use the single-hand number gestures I had noticed in China. I used KinectToStl to render the hand models as STL mesh files from Kinect depth data. After editing the STL files in MeshLab, I sent the combined files to the MakerBot to print.
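The depth-to-STL step can be sketched roughly like this: treat each depth pixel as a height sample and emit two triangles per grid cell as ASCII STL. This is not the KinectToStl code itself, just a minimal illustration of the idea; the function name, the trivial normals, and the tiny depth grid in the test are my own assumptions.

```cpp
#include <cstdio>
#include <string>
#include <vector>

// Minimal sketch of the depth-to-STL idea: each depth pixel is a height
// sample; every grid cell becomes two triangles in an ASCII STL solid.
std::string depthGridToStl(const std::vector<std::vector<float>>& depth) {
    std::string stl = "solid hand\n";
    char buf[256];
    int rows = (int)depth.size(), cols = (int)depth[0].size();
    for (int y = 0; y + 1 < rows; ++y) {
        for (int x = 0; x + 1 < cols; ++x) {
            // Corner heights of this grid cell.
            float z00 = depth[y][x],     z10 = depth[y][x + 1];
            float z01 = depth[y + 1][x], z11 = depth[y + 1][x + 1];
            float tris[2][3][3] = {
                {{(float)x, (float)y, z00}, {(float)x + 1, (float)y, z10}, {(float)x, (float)y + 1, z01}},
                {{(float)x + 1, (float)y, z10}, {(float)x + 1, (float)y + 1, z11}, {(float)x, (float)y + 1, z01}},
            };
            for (auto& t : tris) {
                stl += "facet normal 0 0 1\n outer loop\n"; // real normals omitted in this sketch
                for (auto& v : t) {
                    snprintf(buf, sizeof buf, "  vertex %g %g %g\n", v[0], v[1], v[2]);
                    stl += buf;
                }
                stl += " endloop\nendfacet\n";
            }
        }
    }
    stl += "endsolid hand\n";
    return stl;
}
```

A real scan would also need proper facet normals and side/bottom walls to make the mesh watertight, which is part of what the MeshLab editing pass dealt with.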
Wednesday, November 14, 2012
Saturday, November 10, 2012
Tuesday, November 6, 2012
Using OpenFrameworks, the challenge for this assignment was to find the 'fore point' in the Kinect's depth image and treat it as the tip of a pencil.
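The 'fore point' is just the valid depth pixel closest to the camera. A minimal sketch of that search, assuming raw millimeter depth values where 0 means "no reading" (the struct and function names are mine, not from the original sketch):

```cpp
#include <cstddef>
#include <vector>

// Sketch of the "fore point" search: scan the depth image for the valid
// pixel nearest the camera and treat it as the pencil tip.
struct ForePoint { int x = -1, y = -1; unsigned short depth = 0; bool found = false; };

ForePoint findForePoint(const std::vector<unsigned short>& depthPixels, int width, int height) {
    ForePoint tip;
    unsigned short best = 65535;
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            unsigned short d = depthPixels[(size_t)y * width + x];
            if (d != 0 && d < best) {   // skip invalid (zero) readings
                best = d;
                tip = {x, y, d, true};
            }
        }
    }
    return tip;
}
```

In practice you would also smooth the result over a few frames, since single-pixel noise makes the raw tip jitter.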
I merged the ribbon example by James George with the Kinect point cloud example by Kyle McDonald.
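The core of the merge is feeding the tracked tip position into the ribbon each frame. A minimal sketch of that trail buffer, assuming a fixed-length history of 3D points (the class and names here are illustrative, not from either example):

```cpp
#include <cstddef>
#include <deque>

// Each frame, push the tracked tip onto a fixed-length trail; the ribbon
// mesh is then built along these points, oldest to newest.
struct Point3 { float x, y, z; };

class Trail {
public:
    explicit Trail(size_t maxPoints) : maxPoints_(maxPoints) {}
    void add(Point3 p) {
        points_.push_back(p);
        if (points_.size() > maxPoints_) points_.pop_front(); // drop oldest
    }
    size_t size() const { return points_.size(); }
    const std::deque<Point3>& points() const { return points_; }
private:
    size_t maxPoints_;
    std::deque<Point3> points_;
};
```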
Saturday, November 3, 2012
For the first week's assignment ("make a 3d scanner"), Manuela and I decided to scan and render an apple, taking two approaches.
1) Milk Scan (non invasive)
We cut an apple in half, put it in a container, and covered it with milk in layers, each as thick as the acrylic sheets we planned to use to reconstruct it later. Of course this was a serious scientific experiment, and as you can see in the photos we marked registration points, measured the milk layers, and made sure the table was level.
The one problem we had was that the lighter lower half of the apple started to float; we fixed it by pinning it down with metal nails. So the scan became quite invasive after all...
Here is a sample of the photos we took:
Once we had the images, we processed them to get the outlines to send to the laser cutter, but since the machine was down, we continued with our second plan.
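The outline step can be sketched simply, assuming each milk-layer photo is reduced to a grayscale grid: threshold it into apple versus milk, then keep the foreground pixels that touch the background; those border pixels trace the cut path. The threshold value and function name are illustrative assumptions, not the actual process used.

```cpp
#include <vector>

// Sketch of outline extraction: the apple reads darker than the milk,
// so pixels below the threshold are foreground; a foreground pixel with
// any background 4-neighbor lies on the outline.
std::vector<std::vector<int>> extractOutline(const std::vector<std::vector<int>>& gray, int threshold) {
    int h = (int)gray.size(), w = (int)gray[0].size();
    std::vector<std::vector<int>> outline(h, std::vector<int>(w, 0));
    auto fg = [&](int y, int x) {
        return y >= 0 && y < h && x >= 0 && x < w && gray[y][x] < threshold;
    };
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x)
            if (fg(y, x) && (!fg(y - 1, x) || !fg(y + 1, x) || !fg(y, x - 1) || !fg(y, x + 1)))
                outline[y][x] = 1; // border pixel: part of the cut path
    return outline;
}
```

A laser cutter would then want these border pixels traced into a closed vector path rather than a raster mask.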
2) Knife Scan (invasive)
For our second scan we cut an apple into thin slices and lit them from below so that the details of the texture were visible:

We then made a Processing sketch to reconstruct it: