The Smithsonian opened a virtual museum last year. The Smithsonian X 3D Explorer allows users to take a virtual tour of (and even 3D print) high-definition digital models of artifacts like Lincoln’s life mask or the Wright brothers’ plane.
It’s a brilliant concept.
Once digital, artifacts can be easily accessed and explored in detail by researchers, students, and ordinary museum goers. Whereas most museums keep only a fraction of their inventory on display, a digital museum can show it all off, all the time. And of course, digitizing objects preserves a record of them for posterity should the originals be lost or damaged.
But there’s a problem. The Smithsonian project? Laborious and expensive.
The team took over eight months to scan and 3D model just twenty objects. Meanwhile, the Smithsonian collection comprises 137 million artifacts. As the Smithsonian’s Günter Waibel writes, “Capturing the entire collection at a rate of 1 item per minute would take over 260 years of 24/7 effort.”
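Waibel’s figure checks out with simple arithmetic. Here’s a quick sketch; the collection size and scan rate are from the article, and the flat 365-day year is my simplification:

```python
# Back-of-envelope check of Günter Waibel's estimate: 137 million
# artifacts scanned at one per minute, 24/7, ignoring leap days.
ARTIFACTS = 137_000_000
MINUTES_PER_YEAR = 60 * 24 * 365  # 525,600

years = ARTIFACTS / MINUTES_PER_YEAR
print(f"{years:.1f} years")  # just over 260 years, as Waibel says
```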
Clearly, at the current rate, it would take decades to digitize even a tiny fraction of the world’s cultural heritage. But the problem, that the process is slow and labor intensive, is a familiar challenge with a familiar solution: robotics and automation.
Recently, the German visual computing research institute Fraunhofer IGD unveiled CultLab3D, an automated 3D scanning system it hopes will dramatically speed up the digitization of artifacts at a tenth to a twentieth of today’s cost.
How does it work? Objects on the machine’s conveyor belt are positioned in the middle of two concentric imaging arcs. A ring of lights and high-resolution cameras scans the object from all angles, and after the software spot-checks the resulting 3D model, a scanner on a robotic arm moves in to re-image any gaps in the model.
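The workflow above boils down to a scan-then-patch loop. Here is a toy simulation of that idea; every name in it is a hypothetical stand-in for illustration, not Fraunhofer’s actual software:

```python
# Toy sketch of the two-pass workflow: a full arc scan, a software
# spot-check for gaps, then targeted re-imaging by the robotic arm.

def arc_scan(views):
    """First pass: the camera ring captures most, but not all, angles.
    (The hidden underside stands in for whatever the arcs miss.)"""
    return {v: True for v in views if v != "underside"}

def spot_check(model, required_views):
    """Software flags any required view the first pass missed."""
    return [v for v in required_views if v not in model]

def rescan(model, gaps):
    """Second pass: the robotic-arm scanner re-images only the gaps."""
    model.update({v: True for v in gaps})
    return model

views = ["front", "back", "left", "right", "top", "underside"]
model = arc_scan(views)
model = rescan(model, spot_check(model, views))
# model now covers every view without re-imaging the whole object
```

The design point is efficiency: rather than rescanning everything, the second pass targets only the flagged gaps.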
Beyond recording geometry, CultLab3D can reconstruct and record texture and optical material properties with sub-millimeter detail. The process is fully automated and takes roughly five minutes per object. (Though much faster than manual methods, it would still take a concerted effort over many years to digitize the world’s most valuable pieces.)
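For scale, the same arithmetic as before applied to CultLab3D’s five-minute cycle. This is my hypothetical single-scanner, round-the-clock scenario, not a figure from Fraunhofer:

```python
# Rough throughput estimate for CultLab3D's reported cycle time,
# applied (hypothetically) to the full Smithsonian collection.
ARTIFACTS = 137_000_000           # Smithsonian collection size
MINUTES_PER_SCAN = 5              # reported CultLab3D cycle time
MINUTES_PER_YEAR = 60 * 24 * 365

# One scanner running 24/7; a fleet of n scanners divides this down.
years_single = ARTIFACTS * MINUTES_PER_SCAN / MINUTES_PER_YEAR
years_fleet = lambda scanners: years_single / scanners
print(f"{years_single:.0f} years for a single scanner")
```

One machine is nowhere near enough for a whole national collection, which is why curators would still have to prioritize the most valuable pieces, or run many scanners in parallel.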
Currently, the system can scan artifacts weighing up to 110 pounds that are no more than 2 feet tall and 2 feet in diameter. But the team has plans for automated solutions to image larger objects too—a robotic arm on wheels, for example, and for the biggest objects like buildings or monuments, robotic drones. (These latter devices may prove more difficult to develop.)
Last month, a prototype CultLab3D scanner conducted a test run in Frankfurt’s Liebieghaus Skulpturensammlung museum. Fraunhofer hopes to run further tests this year and says it will be ready to begin production and marketing in 2015.
Whether CultLab3D is the best machine for the job remains to be seen. What’s more certain is that it’s a job worth doing.
The loss or destruction of priceless cultural artifacts by natural disaster or war is all too common. Of course, most people know about the tragic burning of the ancient Library of Alexandria and its hoard of knowledge. But similarly destructive events occur to this day, whether by fire, earthquake, or at the hands of soldiers.
Beyond simply preserving artifacts, however, creating high resolution digital copies allows for any number of otherwise impossible applications.
These include instant access to artifacts from anywhere in the world. Today, you might tour them on a screen, tomorrow on an Oculus Rift or other virtual reality device. Observers can get as up close and personal as they like without putting the originals at risk.
And just as we can copy, store, share, and sample masterpieces of music digitally, the same would be true of the world’s great sculptures and monuments. They might be used to populate future virtual worlds or re-materialize, picture-perfect, under the nozzle of a 3D printer or a digitally guided milling machine.
We’ve already begun digitally backing up and decentralizing copies of humanity’s accumulated knowledge in print—it’s about time we began to back up all the rest.
Via Singularity Hub