
MedusAI
I recently built an interactive exhibit that uses strings, drums, and robotic arms to create a robotic musician, which takes vision and audio input to modify lights, robot movements, and string playing.
Robotic arm dancing based on human motion capture data
During my master's thesis, I worked on degree-of-freedom reduction so that robot arms could respond gesturally to human dancers. We drew inspiration from Disney's principles of animation to make our robots move in a lifelike way, using motion capture data to create "follow through."
We worked with dancers to apply different movements to different cases, creating unique artistic mappings.
We studied dancers' movements and the way humans move fluidly, then found a way to manipulate the damping value of impedance controllers to make robots move more like living creatures. Comparing the robots' natural follow-through with humans' taught us that robots should move fluidly based on their own architecture, not human architecture.
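The damping idea can be illustrated with a minimal sketch (my assumption of a 1-DoF mass-spring-damper impedance model with hypothetical gains, not the thesis implementation): lowering the damping term lets the joint overshoot its target, which reads as "follow through."

```python
import numpy as np

def impedance_step(x, v, x_target, dt, m=1.0, k=40.0, d=2.0):
    """One Euler step of a 1-DoF impedance model:
    m*a = -k*(x - x_target) - d*v.
    A lower damping d lets the joint overshoot, producing follow-through."""
    a = (-k * (x - x_target) - d * v) / m
    v = v + a * dt
    x = x + v * dt
    return x, v

def simulate(d, steps=400, dt=0.01):
    """Drive the joint from 0 toward a target of 1.0 and record the path."""
    x, v = 0.0, 0.0
    traj = []
    for _ in range(steps):
        x, v = impedance_step(x, v, 1.0, dt, d=d)
        traj.append(x)
    return traj

# Underdamped motion (low d) overshoots the target like a dancer's
# follow-through; heavily damped motion stops rigidly at the target.
fluid = simulate(d=2.0)
stiff = simulate(d=20.0)
print(max(fluid), max(stiff))
```

Tuning `d` per joint, rather than copying human joint trajectories directly, is one way to let each arm express fluidity within its own architecture.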
RoboGroove: Creating Fluid Motion for Dancing Robotic Arms
Amit Rogel, Richard Savery, Ning Yang, Gil Weinberg
MOCO 2022 (International Conference on Movement and Computing)