Irritable Lamp in a Garden

Samantha C Ho
4 min read · Nov 21, 2020

This project was created and executed in collaboration with Benjamin Stern, Avi Rudich, Elton Spektor, Tom Scherlis, and Abel Tesfaye for 16-375 Robotics for Creative Practices at Carnegie Mellon University. The project was largely self-directed, guided by a single prompt: how can we use robotics to create tension and convey emotion?

This project began as an exploration of personifying robots. We imagined a lamp that would both give attention to and command attention from its viewers.

The show itself consisted of three main components: the lamp, the players, and the visitors. Visitors were invited to enter the ecosystem that the lamp and players already inhabited. The lamp stood in the middle and gave each player its turn to perform by shining light directly on it, continuing until a visitor interrupted the show.

The Players

In terms of fabrication, we knew that our system of robots would be a conglomeration of laser-cut and 3D-printed parts. We began thinking about the forms and characteristics each robot would take on. Originally we thought of expressions of emotion like laughter and fighting, the concept being that the lamp would be the surveyor and patron of these performances in a garden. From there we found our way to animals, both conceptual and literal. These supporting robots eventually played roles in our garden, with our lamp standing in the middle. We repurposed robots from our past show in addition to creating new conceptual robots to take on personas that the lamp would supervise.

Kinematic models that inspired the motion of the garden players.
Our garden players.

Fabrication

The lamp itself was made of a combination of 3D-printed and laser-cut parts. We decided to remain consistent with the two-link arms that inspired all of the motion we used, so the lamp's form followed that style as well.
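For the curious, the planar two-link motion behind the piece reduces to textbook forward kinematics. Below is a minimal Python sketch; the link lengths and the sinusoidal joint sweep are illustrative placeholders, not our actual dimensions or trajectories:

```python
import math

def two_link_fk(theta1, theta2, l1=0.3, l2=0.2):
    """Forward kinematics for a planar two-link arm.

    theta1: shoulder angle in radians, measured from the base x-axis
    theta2: elbow angle in radians, relative to the first link
    l1, l2: link lengths in meters (placeholder values)
    Returns the (x, y) position of the end of the second link.
    """
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

# Sweeping both joints with phase-offset sine waves produces the kind of
# organic, nodding gesture the garden players performed.
for t in range(100):
    theta1 = 0.5 * math.sin(0.05 * t)
    theta2 = 0.8 * math.sin(0.05 * t + 1.0)
    print(two_link_fk(theta1, theta2))
```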

Here is the CAD for the animals: https://drive.google.com/file/d/1nqYG1tPH75_7_2r2FtzmMv3LGiha8_5C/view?usp=sharing

Here is the CAD for the lamp: https://drive.google.com/file/d/1uvq7704unAnncukpAVNLdFRt23_n9zOe/view?usp=sharing

The Mo-Cap Integration

This project coordinates an “intelligent” lamp that can rotate continuously, tilt, turn on and off, and control a set of three background robot automata. The robot senses through a motion capture system set up in the performance gallery. The robot’s control loop is closed through the motion capture system, so mocap is required for operation. A simple simulator is provided for testing without a motion capture system, including running the robot hardware “simulator-in-the-loop” against a simulated mocap system and robot.
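To give a sense of how the loop closes, here is a minimal rospy sketch in the spirit of that architecture. The topic names, message types, and joint interface are assumptions for illustration only; the real implementation lives in the repository linked below:

```python
#!/usr/bin/env python
import math
import rospy
from geometry_msgs.msg import PoseStamped
from std_msgs.msg import Float64

class LampController(object):
    """Closes the lamp's control loop through the mocap system.

    Topic names and message types here are illustrative assumptions;
    the actual interfaces live in the Disruption-ROS repository.
    """

    def __init__(self):
        self.target = None  # latest tracked position, set by the callback
        rospy.Subscriber('/mocap/visitor_pose', PoseStamped, self.on_mocap)
        self.pan_pub = rospy.Publisher('/lamp/pan_cmd', Float64, queue_size=1)
        self.tilt_pub = rospy.Publisher('/lamp/tilt_cmd', Float64, queue_size=1)

    def on_mocap(self, msg):
        self.target = msg.pose.position

    def spin(self):
        rate = rospy.Rate(30)  # 30 Hz control loop
        while not rospy.is_shutdown():
            if self.target is not None:
                # Aim at the tracked point, assuming the lamp sits at
                # the origin of the mocap frame.
                pan = math.atan2(self.target.y, self.target.x)
                dist = math.hypot(self.target.x, self.target.y)
                tilt = math.atan2(self.target.z, dist)
                self.pan_pub.publish(Float64(pan))
                self.tilt_pub.publish(Float64(tilt))
            rate.sleep()

if __name__ == '__main__':
    rospy.init_node('lamp_controller')
    LampController().spin()
```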

In production, we used motion capture to triangulate where visitors were, so that when someone “interrupted,” the lamp knew exactly where they stood. This became the technical challenge that drove the rest of our project. We understood it was a risk, but by the time the final show came around, the risk had clearly paid off. The point of this project was to create an interactive experience that intrigued and surprised visitors, and the accuracy and reliability of the motion capture and ROS integration drew great reactions from the audience. While the project involved challenging tech and somewhat complex construction, the tech was never the focus of the performance; if anything, it added another layer of wonderment to it.
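As a rough illustration of the interruption logic, something like the following decides when a tracked visitor counts as interrupting and where the lamp should aim. The radius threshold and frame conventions here are assumptions, not the tuned values from the show:

```python
import math

# Visitors closer than this (meters) to the stage count as interrupting.
# The real threshold was tuned in the gallery; this value is a guess.
INTERRUPT_RADIUS = 1.5

def check_interruption(visitor_xy, lamp_xy=(0.0, 0.0)):
    """Return the bearing (radians) from the lamp to an interrupting
    visitor, or None if the visitor is outside the interrupt radius."""
    dx = visitor_xy[0] - lamp_xy[0]
    dy = visitor_xy[1] - lamp_xy[1]
    if math.hypot(dx, dy) > INTERRUPT_RADIUS:
        return None
    return math.atan2(dy, dx)

# Example: a visitor stepping in one meter to the lamp's left.
print(check_interruption((0.0, 1.0)))  # ~1.57 rad, i.e. 90 degrees
```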

I encourage you to take a look at the GitHub repository: https://github.com/Toms42/Disruption-ROS

Final Performance

All in all, the team is satisfied with the result and glad we took the opportunity to challenge ourselves. Beyond that, the performance led to a deeper, more nuanced understanding of how technology can be integrated into art.
