MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) is leading a new chapter in 3D printing—one where printed objects don’t just look good, but also feel realistic, move like living organisms, and even carry built-in electronics. These advances aren’t happening in isolation; they’re part of a larger shift toward smarter, more interactive, and more sustainable design and manufacturing.

In recent years, CSAIL has unveiled a range of projects that blend artificial intelligence, materials science, and automation to push the boundaries of what’s possible with additive manufacturing. These innovations are transforming how we interact with 3D-printed objects—making them more tactile, mobile, intelligent, and accessible.

One major advancement is TactStyle, a system that brings the sense of touch into digital modeling. Traditional 3D models capture only visual appearance, but TactStyle lets users design both how an object looks and how it feels, simply by uploading an image. Developed by PhD student Faraz Faruqi and Associate Professor Stefanie Mueller, the system decouples an object's visual appearance from its physical texture. For example, a user can 3D print an object that not only looks like a woven basket but also mimics its texture. Introduced in March 2025 and presented at CHI 2025 in Japan, TactStyle is especially valuable for applications in home decor, product prototyping, and education for the visually impaired.
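To make the idea concrete, one common way to turn an image into something you can feel is to treat it as a grayscale heightfield that displaces the printed surface. The sketch below is a simplified stand-in for illustration only; TactStyle learns tactile geometry from the image rather than applying this fixed mapping, and the `MAX_BUMP_MM` constant is an assumption.

```python
# Simplified illustration: map an image's grayscale values to surface
# displacement heights, so brighter pixels become taller bumps.
# This fixed mapping is NOT the TactStyle method, which learns the
# tactile geometry; it only shows the image-to-texture idea.

MAX_BUMP_MM = 0.8  # assumed peak texture height in millimeters

def displacement_map(gray_image, max_bump_mm=MAX_BUMP_MM):
    """Map 0-255 grayscale pixels to per-point surface offsets in mm."""
    return [[(pixel / 255.0) * max_bump_mm for pixel in row]
            for row in gray_image]

weave = [[0, 255], [255, 0]]  # toy 2x2 "woven" pattern
print(displacement_map(weave))  # [[0.0, 0.8], [0.8, 0.0]]
```

A slicer could then add these offsets along each surface normal, producing a print whose relief follows the uploaded image.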

While TactStyle focuses on texture, another project, Xstrings, brings movement into 3D printing. Traditionally, objects that move—like robotic limbs or wearable tech—require multiple components and tedious assembly. Xstrings changes that by embedding motion systems directly into a single print. Developed by postdoctoral researcher Jiaji Li and senior author Stefanie Mueller, this method prints objects with built-in tensioned cables that allow them to curl, twist, or grip—all straight off the print bed. Using common FDM 3D printers, the system can create flexible devices like robotic lizards or kinetic sculptures, with motion types customizable during the design process. The printed cables can endure over 60,000 pulls, and the approach can cut production time by up to 40%. Xstrings opens the door for use in soft robotics, wearable devices, and even space applications where tools are limited and functional prints are needed on demand.
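The curl-and-grip behavior of a cable embedded off-center in a flexible print can be approximated with the standard constant-curvature model used for tendon-driven robots: pulling the cable shortens one side of the segment, bending it by an angle proportional to the pull. The sketch below illustrates that relation; the dimensions are assumptions for illustration, not Xstrings parameters.

```python
# Illustrative model of a cable-driven ("tendon") bending segment, like
# the tensioned cables Xstrings embeds in a single print. Uses the
# standard constant-curvature approximation: theta = pull / offset.
# The constants are assumed values, not from the Xstrings paper.
import math

CABLE_OFFSET_MM = 3.0  # assumed distance of cable from the neutral axis

def curl_angle_rad(cable_pull_mm: float) -> float:
    """Bend angle of a constant-curvature segment for a given cable pull.

    Pulling the cable by d shortens the cable-side arc relative to the
    neutral axis, bending the segment by theta = d / offset.
    """
    return cable_pull_mm / CABLE_OFFSET_MM

# Pulling the cable 1 mm bends this segment by roughly 19 degrees.
print(round(math.degrees(curl_angle_rad(1.0)), 1))
```

A larger pull or a cable routed closer to the surface yields a tighter curl, which is the kind of parameter the Xstrings design process lets users customize.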

CSAIL is also tackling one of the biggest challenges in 3D printing: electronics. In late 2024, researchers led by Luis Fernando Velásquez-García and graduate student Jorge Cañada demonstrated the first fully 3D-printed, semiconductor-free logic gates. These devices, made from a copper-infused biodegradable polymer, can perform basic control tasks like switching a motor on or off. While not as powerful as silicon chips, they represent an important step toward locally fabricated, low-cost electronics that can be printed without cleanrooms. Published in Virtual and Physical Prototyping, this work could help decentralize electronics manufacturing, especially during global supply disruptions.
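The control tasks such gates handle are just Boolean logic. The snippet below models in software the kind of behavior a printed gate network could implement, such as running a motor only when power is on and no fault is present; the gate combination and scenario are illustrative, not a description of the CSAIL devices' circuitry.

```python
# Software model of the control logic a semiconductor-free printed gate
# network could realize (the CSAIL devices implement gates physically in
# copper-infused polymer; this example only shows the logic).

def not_gate(a: bool) -> bool:
    """Inverter: output is high when the input is low."""
    return not a

def and_gate(a: bool, b: bool) -> bool:
    """Output is high only when both inputs are high."""
    return a and b

def motor_enabled(power_switch: bool, fault_detected: bool) -> bool:
    # Run the motor only when power is on AND no fault is present.
    return and_gate(power_switch, not_gate(fault_detected))

print(motor_enabled(True, False))   # True: motor runs
print(motor_enabled(True, True))    # False: fault stops the motor
print(motor_enabled(False, False))  # False: power is off
```

Because NOT and AND together are functionally complete, a printer that can produce these two gates can in principle build any such control circuit.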

Another direction CSAIL is exploring is sustainability and surface customization through a method called speed-modulated ironing, developed in partnership with TU Delft. Introduced in October 2024, the technique uses a dual-nozzle 3D printer—one to lay down the filament, and a second heated nozzle to smooth the surface at varying speeds. Faster passes create glossy finishes, while slower ones produce matte or textured results. This allows users to create visually and tactilely distinct areas without switching materials or adding post-processing steps. The technique works especially well with wood-filled or heat-sensitive plastics and can even embed QR codes into the print using texture alone. It’s a promising move toward more efficient, sustainable 3D printing with less material waste.
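The core idea, selecting surface finish per region by varying the ironing pass speed, can be sketched as a simple mapping from desired gloss to nozzle speed. The speed values and the linear mapping below are illustrative assumptions, not parameters from the CSAIL/TU Delft system.

```python
# Hypothetical sketch of speed-modulated ironing: per-region ironing
# speed selects surface finish on one material. Per the article, faster
# passes give glossy finishes and slower passes give matte ones; the
# specific speeds here are assumed for illustration.

MATTE_SPEED_MM_S = 5.0    # slow pass -> matte / textured finish
GLOSSY_SPEED_MM_S = 40.0  # fast pass -> glossy finish

def ironing_speed(gloss: float) -> float:
    """Map a desired gloss level in [0, 1] to an ironing-pass speed."""
    gloss = max(0.0, min(1.0, gloss))  # clamp out-of-range requests
    return MATTE_SPEED_MM_S + gloss * (GLOSSY_SPEED_MM_S - MATTE_SPEED_MM_S)

# A QR-code-like pattern becomes a per-cell speed plan: dark cells are
# ironed slowly (matte), light cells quickly (glossy), encoding the
# pattern in texture alone.
pattern = [[0, 1], [1, 0]]
speed_plan = [[ironing_speed(cell) for cell in row] for row in pattern]
print(speed_plan)  # [[5.0, 40.0], [40.0, 5.0]]
```

In a real toolchain this plan would be emitted as G-code feed rates for the second, heated nozzle's ironing passes.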

Making customization easier for non-experts is another CSAIL priority. In 2023, the lab introduced Style2Fab, an AI-powered system that lets users personalize 3D models using simple text prompts. Instead of relying on complex CAD software, users can type in requests—like making a vase look like a seashell—and the system automatically modifies the design while preserving its function. Developed by Faraz Faruqi, Stefanie Mueller, and collaborators including Megan Hofmann from Northeastern University, Style2Fab uses deep learning to separate a model into functional and aesthetic parts. This ensures that visual changes don’t interfere with performance, making it ideal for customizing practical items like assistive devices. The goal is to make digital fabrication accessible to anyone, regardless of technical skill level.
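The workflow the paragraph describes, segmenting a model, classifying each segment as functional or aesthetic, and restyling only the aesthetic parts, can be outlined as follows. The class names and the hand-labeled `functional` flag are hypothetical stand-ins; in Style2Fab a learned classifier makes that call, and a generative model performs the restyling.

```python
# Minimal sketch of the Style2Fab workflow as described in the article:
# apply a text-prompted style only to aesthetic segments, leaving
# functional geometry untouched. The Segment class and manual labels are
# illustrative stand-ins, not the actual Style2Fab API.
from dataclasses import dataclass

@dataclass
class Segment:
    name: str
    functional: bool  # in Style2Fab, a deep-learning classifier decides this

def stylize(segments, prompt):
    """Return a per-segment plan that preserves functional geometry."""
    plan = {}
    for seg in segments:
        plan[seg.name] = "keep as-is" if seg.functional else f"restyle: {prompt}"
    return plan

vase = [Segment("base", functional=True),       # must stay flat to stand
        Segment("inner_wall", functional=True), # must stay watertight
        Segment("outer_shell", functional=False)]
print(stylize(vase, "seashell texture"))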

Together, these projects illustrate CSAIL’s broader mission: to make 3D printing more intelligent, intuitive, and impactful. By combining AI with creative design and material science, the lab is helping shape a future where printed objects are not just static models but smart, dynamic tools that fit seamlessly into everyday life.

By Impact Lab