OpenCV

controlling a robot through a hacked webcam

 
We attached a Sharpie to our robot's arms with a 3D-printed bracket and some screws. It took several tries and heavy use of a soldering iron to make the bracket fit the pen. Calibration helped make the drawn lines smoother.


 
 

In one weekend, my three teammates and I learned Python's OpenCV library and used it to control a 3D-printed robotic arm. By moving a coloured block in front of a hacked webcam, a user could make the robot draw any shape within the set boundaries. We integrated this code into our Linear Actuator Haptic Interface project, which earned one of the highest marks in the 2017 project course.
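
The block tracking boiled down to thresholding each webcam frame in HSV space and taking the centroid of the largest matching contour. The sketch below shows that general approach; the HSV range, camera index, and window handling are illustrative assumptions rather than our exact values.

```python
# Minimal sketch of colour-block tracking with OpenCV.
# The HSV range targets a generic blue block; the real thresholds
# and camera index would be tuned for the actual setup.
import cv2
import numpy as np

LOWER_HSV = np.array([100, 120, 70])   # assumed lower bound for the block colour
UPPER_HSV = np.array([130, 255, 255])  # assumed upper bound

cap = cv2.VideoCapture(0)  # the hacked webcam

while True:
    ok, frame = cap.read()
    if not ok:
        break

    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_HSV, UPPER_HSV)

    # [-2] keeps this compatible with both OpenCV 3.x and 4.x return signatures.
    contours = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
    if contours:
        block = max(contours, key=cv2.contourArea)
        m = cv2.moments(block)
        if m["m00"] > 0:
            cx, cy = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
            # (cx, cy) is the block centroid in pixels; this is the point
            # that gets mapped into the robot's drawing boundaries.
            cv2.circle(frame, (cx, cy), 5, (0, 0, 255), -1)

    cv2.imshow("tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```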

 
 
We designed our robot in SolidWorks and 3D-printed the arms; the brackets were waterjet-cut from aluminum. Slot detectors sensed the direction of motion and speed of the motor arms, and the linear actuators adjusted their motion in response to this input.

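For reference, direction and speed can be recovered from a pair of offset slot detectors with simple quadrature-style decoding. The sketch below is a minimal illustration; the slot count, the callback wiring, and the assumption of two offset detectors per motor are placeholders for how the microcontroller actually read our sensors.

```python
# Sketch of direction and speed estimation from two offset slot detectors.
# All constants are illustrative, not the values used on the robot.
import time

SLOTS_PER_REV = 20  # assumed number of slots on the encoder disc
state = {"count": 0, "last_time": time.monotonic(), "speed": 0.0}

def on_edge_a(level_b):
    """Call on each rising edge of detector A with detector B's current level."""
    now = time.monotonic()
    dt = now - state["last_time"]
    state["last_time"] = now

    # Direction: if B is already high when A rises, the arm moves one way;
    # otherwise it moves the other way.
    direction = 1 if level_b else -1
    state["count"] += direction

    # Speed: one slot passed in dt seconds -> signed revolutions per second.
    if dt > 0:
        state["speed"] = direction / (SLOTS_PER_REV * dt)
```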

 
 

A linear actuator is a motor that moves back and forth in a straight line rather than rotating. A haptic interface is one that responds to human input. Putting the two together, three classmates and I built a two-dimensional robot that simulates various environments by resisting or restricting the motion a human user applies to it. All parts were designed in SolidWorks and either waterjet-cut or 3D-printed. The PID software, GUI, and OpenCV interface were written in Python. Slot detectors sensed the direction of movement, and the linear actuators were custom-designed and built by my teammate Tommy Lau and me.
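
As a rough illustration of the control side, the sketch below shows a basic PID loop of the kind used to drive each actuator toward a commanded position. The gains, sample period, and target values are placeholders; the real numbers came from tuning on the hardware.

```python
# Minimal PID controller sketch for one linear actuator.
class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target, measured, dt):
        """Return a drive command from the position error at this time step."""
        error = target - measured
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt if dt > 0 else 0.0
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


# Example step with illustrative gains and an assumed 10 ms sample period.
pid = PID(kp=2.0, ki=0.1, kd=0.05)
command = pid.update(target=10.0, measured=8.5, dt=0.01)  # signal sent to the actuator
```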