
Scan Your Stuff Wrapup

Wooden sculpture above, by Charles Myers

On April 7-8, the IRC held its inaugural Scan Your Stuff event, where we invited members of the UMBC community to bring in objects to be scanned with our photogrammetry rig. People could bring in anything they wanted, as long as it was bigger than a basketball and smaller than a suitcase. In return, the objects would be photographed by 94 cameras simultaneously, and the images then stitched together into a 3D model. By scanning a wide variety of items with different sizes, shapes, colors, and textures, and seeing which convert best to 3D geometry, the IRC will be able to refine the algorithms that govern the photogrammetry process and build more complete, more accurate models.
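For readers curious what that "stitching" involves, the sketch below walks through the standard photogrammetry stages (feature extraction, matching, and sparse reconstruction) using the open-source COLMAP command-line tools. It is only an illustration of the general process, not our production pipeline, and the folder names are made up.

```python
import subprocess
from pathlib import Path

# Hypothetical layout: one folder holding the photos from all 94 cameras
# for a single object.
images = Path("scans/ukulele/images")
work = Path("scans/ukulele/work")
database = work / "database.db"
sparse = work / "sparse"
sparse.mkdir(parents=True, exist_ok=True)

def colmap(*args):
    """Run one stage of the COLMAP pipeline and stop if it fails."""
    subprocess.run(["colmap", *args], check=True)

# 1. Detect distinctive features (keypoints) in every photo.
colmap("feature_extractor", "--database_path", str(database),
       "--image_path", str(images))

# 2. Match features across photos that see the same part of the object.
colmap("exhaustive_matcher", "--database_path", str(database))

# 3. Solve for camera poses and a sparse point cloud; dense reconstruction,
#    meshing, and texturing would follow from here.
colmap("mapper", "--database_path", str(database),
       "--image_path", str(images), "--output_path", str(sparse))
```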

Over the two days, 19 people brought in a total of 39 objects to be scanned. These included eight dolls or stuffed animals, and three handmade sculptures brought in by their makers. We took a trip down memory lane with an 8mm film projector, several Apple II components, and Star Wars models from the 1970s. The oldest item we scanned was a two-handled cup from the 6th century BCE, known as a bucchero, while the newest was probably a tin of Blistex. We scanned ice skates and a cowboy hat. The Athletic Department brought over the 2008 Men's Basketball trophy. Two people brought in large paper wasp nests: one about 50 years old, and one collected recently. And two people brought ukuleles.

UMBC Men's Basketball Trophy, 2008

We learned a lot from processing all of these images. First, we found that our 94 cameras might not be enough. They do a great job of capturing the sides of objects, but they don't capture the tops nearly as well. We may need to restructure the rig (and buy a few more cameras) to get denser coverage from the top down.

We were surprised to find that specularity (reflectivity) was less of an issue than we had anticipated. We did spray some shiny objects (like the blades of ice skates) with powder to make them more matte, but in general the software handled reflective surfaces well. On the other hand, the algorithms didn't reject the white background of the rig as completely as we had expected. We also ran into more shelling than we expected, where the model had thin echoes or copies that didn't fully match up; these look like splashes of texture coming off the objects.
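One common way to deal with a backdrop that the software won't ignore is to mask it out of each photo before reconstruction, so near-white pixels contribute no features at all. The snippet below is just an illustration with OpenCV, not part of our workflow; the brightness cutoff is arbitrary and would need tuning for real cameras and lighting.

```python
import cv2
from pathlib import Path

def mask_white_background(image_path, out_path, cutoff=235):
    """Black out near-white pixels so the backdrop yields no features.

    `cutoff` is an arbitrary brightness threshold (0-255); a real rig would
    tune it per camera, or use per-camera background plates instead.
    """
    img = cv2.imread(str(image_path))
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    background = gray > cutoff      # True wherever the pixel is near-white
    masked = img.copy()
    masked[background] = 0          # remove texture the matcher could latch onto
    Path(out_path).parent.mkdir(parents=True, exist_ok=True)
    cv2.imwrite(str(out_path), masked)

# Hypothetical paths, matching the layout from the pipeline sketch above.
mask_white_background("scans/ukulele/images/cam_042.jpg",
                      "scans/ukulele/masked/cam_042.jpg")
```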

Finally, we discovered that disabling CUDA (the framework that lets us run the reconstruction on the GPU rather than the CPU) produced different results than leaving it enabled. We generally use the GPU to reconstruct the 3D models because it is significantly faster, but for several of these scans we relied on the CPU exclusively. The resulting models were more accurate, but took much longer to construct.
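However the switch is exposed in a given package, the GPU-versus-CPU comparison is easy to reproduce in open-source tools as well. Continuing the COLMAP illustration from earlier (again, not our actual setup), the feature stages accept a use_gpu flag, so timing both modes on the same photos is a short script; the paths below are hypothetical.

```python
import subprocess
import time

def colmap(*args):
    subprocess.run(["colmap", *args], check=True)

def extract_and_match(database, images, use_gpu):
    """Run the feature stages with or without the CUDA (GPU) code path."""
    flag = "1" if use_gpu else "0"
    colmap("feature_extractor", "--database_path", database,
           "--image_path", images, "--SiftExtraction.use_gpu", flag)
    colmap("exhaustive_matcher", "--database_path", database,
           "--SiftMatching.use_gpu", flag)

# Time each mode against its own database so the runs don't interfere.
for use_gpu in (True, False):
    label = "gpu" if use_gpu else "cpu"
    start = time.perf_counter()
    extract_and_match(f"scans/ukulele/work/database_{label}.db",
                      "scans/ukulele/images", use_gpu)
    print(f"{label}: {time.perf_counter() - start:.1f} s")
```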

All in all, Scan Your Stuff gave us a lot to think about for the future of photogrammetry at UMBC. We plan on holding a similar event in the fall, so start thinking about what you want to see scanned.