I'm a researcher at Carnegie Mellon, where I build shape-changing robotic architecture. These structures look like everyday surfaces (such as walls or floors), but they can physically reconfigure themselves in order to support human activities. The goal is to imbue the built environment with a new type of ambient intelligence and enable fluent collaboration between people and their surroundings.
Recently, I've been developing new frameworks for orchestrating these systems — leveraging multimodal machine learning to recognize user intentions and dynamically compose robotic behaviors.
Before my PhD, I worked on a variety of creative mechatronic projects for small startups and independent artists, including electric motorcycles, musical synthesizers, and interactive installations.
My undergraduate degree is in Mechanical Engineering, from the University of Maryland.
Publications
Constraint-Driven Robotic Surfaces, At Human-Scale