Mecha-Patti: The Cake Claw Is a Three-Appendage Mechanical Claw That Uses Computer Vision

By: Ken Orkis

Mecha-Patti: The Cake Claw is a three-appendage mechanical claw that uses computer vision to mimic hand gestures, much like the game of patty-cake. It will consist of two fingers, each with three degrees of freedom, and a fully opposable thumb.
The fingers will be built from single-direction joints with springs on the backside, so the fingers are fully extended at rest. The interior side of each digit will be connected to its own cable, which runs to a linear actuator at the base of the hand.
The two smallest digits of the thumb will match the joint and actuator style of the fingers. The base of the thumb will rest on a ball joint and be controlled by four cables with linear actuators, making it fully opposable while replicating the mechanical structure of a natural opposable thumb joint.
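For a cable-driven joint like those above, the actuator travel needed for a given bend can be approximated as pulley geometry: pull length ≈ joint radius × bend angle in radians. A minimal sketch of that relationship (the 6 mm joint radius is a hypothetical placeholder, not a measured dimension):

```python
import math

def cable_travel_mm(joint_radius_mm: float, bend_deg: float) -> float:
    """Cable length (mm) an actuator must pull to bend one joint.

    Treats the cable as wrapping a pulley of the joint's radius,
    so travel = r * theta, with theta in radians.
    """
    return joint_radius_mm * math.radians(bend_deg)

def finger_travel_mm(bends_deg, joint_radius_mm=6.0) -> float:
    """Total pull for one cable routed across all three joints of a finger."""
    return sum(cable_travel_mm(joint_radius_mm, b) for b in bends_deg)
```

Under these assumed dimensions, fully curling a finger (90° at each of three joints) needs roughly 28 mm of cable travel, which bounds the stroke the linear actuators must provide.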
The fingertips and palm will be outfitted with pressure sensors to give the claw sensory “feeling,” allowing the fingers to bend just enough to grab an object without crushing it.
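The grab-without-crushing behavior amounts to a threshold loop per digit: keep flexing until the fingertip pressure crosses a setpoint, then hold. A minimal sketch, where the sensor scale and setpoint are hypothetical (a raw ADC count, not a calibrated force):

```python
GRIP_SETPOINT = 300  # raw piezo sensor reading at "firm contact" (hypothetical)

def finger_command(pressure_reading: int) -> str:
    """Decide one digit's next action from its fingertip pressure.

    Returns "flex" to keep closing, "hold" once contact force is reached.
    """
    return "hold" if pressure_reading >= GRIP_SETPOINT else "flex"

def claw_commands(readings) -> list:
    """Per-digit commands for the whole claw (thumb plus two fingers)."""
    return [finger_command(r) for r in readings]
```

In the real claw this decision would run every control cycle, so each digit stops independently as it contacts the object.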
The claw will be controlled with a combination of a multi-color polka-dotted glove, a CMU camera, a range finder, and a computer algorithm. The algorithm will measure how far the gloved hand is from the camera, then consult a lookup table of the expected spacings between finger joints at that distance. Using those known values together with the camera image, it will calculate the orientation of the hand and fingers.
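The lookup-table step above can be sketched as follows: use the range finder's distance to interpolate the expected (fully extended) spacing between adjacent glove dots, then compare it with the spacing the camera actually observes; foreshortened spacing implies a bent joint, with bend angle ≈ acos(observed / expected). All table values here are invented for illustration:

```python
import math

# Expected pixel spacing between adjacent joint dots on the glove when the
# hand is fully extended, keyed by hand-to-camera distance in cm (hypothetical).
EXPECTED_SPACING_PX = {30: 42.0, 50: 25.0, 70: 18.0, 90: 14.0}

def expected_spacing(distance_cm: float) -> float:
    """Linearly interpolate the lookup table at the range finder's reading."""
    keys = sorted(EXPECTED_SPACING_PX)
    if distance_cm <= keys[0]:
        return EXPECTED_SPACING_PX[keys[0]]
    if distance_cm >= keys[-1]:
        return EXPECTED_SPACING_PX[keys[-1]]
    for lo, hi in zip(keys, keys[1:]):
        if lo <= distance_cm <= hi:
            t = (distance_cm - lo) / (hi - lo)
            return EXPECTED_SPACING_PX[lo] + t * (
                EXPECTED_SPACING_PX[hi] - EXPECTED_SPACING_PX[lo])

def bend_angle_deg(observed_px: float, distance_cm: float) -> float:
    """Estimate joint bend from foreshortened dot spacing."""
    ratio = min(1.0, observed_px / expected_spacing(distance_cm))
    return math.degrees(math.acos(ratio))
```

For example, at 50 cm an observed spacing of half the expected 25 px corresponds to a 60° bend. A full implementation would repeat this per joint and combine the results into a hand pose.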
The camera and range finder will be combined in a single wireless package that can be set on a flat surface near the user to control the claw remotely.
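One way to carry control updates over the wireless link is a small fixed frame: a start byte, a length byte, the distance and joint-angle payload, and a checksum so the claw side can reject corrupted frames. This framing is a sketch invented for illustration, not the CMU camera's or XBee module's own protocol:

```python
START = 0x7E  # frame delimiter byte (value chosen here for illustration)

def encode_frame(distance_cm: int, joint_angles_deg: list) -> bytes:
    """Pack one control update: hand distance plus each joint's bend (0-180)."""
    payload = bytes([distance_cm & 0xFF] + [a & 0xFF for a in joint_angles_deg])
    checksum = sum(payload) & 0xFF
    return bytes([START, len(payload)]) + payload + bytes([checksum])

def decode_frame(frame: bytes):
    """Return (distance, angles) or None if the frame fails its checks."""
    if len(frame) < 4 or frame[0] != START:
        return None
    length = frame[1]
    payload, checksum = frame[2:2 + length], frame[2 + length]
    if len(payload) != length or sum(payload) & 0xFF != checksum:
        return None
    return payload[0], list(payload[1:])
```

The checksum lets the Arduino on the claw side silently drop a garbled frame and wait for the next update rather than act on corrupted joint angles.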


Components:

  • Arduino Duemilanove Microcontroller
  • Linear Actuators
  • Piezo Pressure Sensors
  • CMU Camera
  • Acoustic Range Finder
  • Xbee Series 2 wireless module

Future Development and Applications:

Once completed, Mecha-Patti could be upgraded from a simple robotic hand to a mechanical prosthetic with brain-computer interface (BCI) controls. While a gloved exoskeleton with flex sensors at the finger joints would allow greater precision, camera-based control would allow easier feedback in BCI training applications, where a user might not have a second hand free to pull on a bulky glove. Given a powerful enough shape-recognition algorithm, a glove might not even be necessary. The camera would allow easy, unobtrusive movement of the left hand while the robotic right hand mimics it. Since BCI control relies on learning algorithms, general movements combined with pressure-sensor feedback would offer more than enough control.