A joint project of the Graduate School, Peabody College, and the Jean & Alexander Heard Library
Title page for ETD etd-03272008-144850
|Author
||Begley, Sean Michael
|Title
||Gesture Recognition and Mimicking in a Humanoid Robot
|Degree
||Master of Science
|Advisors
||Richard Alan Peters II
||D. Mitchell Wilkes
|Keywords
- visual servoing
- gesture recognition
- Androids -- Design and construction
- Machine learning
As robots become more complex, it becomes necessary to develop more effective methods of imparting knowledge to them. One area where these complexities are apparent is learning to manipulate. Vanderbilt University’s Intelligent Soft Arm Control robot, ISAC, which has two 6-degree-of-freedom arms actuated by McKibben artificial muscles, was designed for interaction with people. Programming specific movements for that purpose quickly becomes impractical; it would be more practical for ISAC to learn motions by observing them. The field of Imitation Learning is devoted to that problem.
As a precursor to Imitation Learning, and as a base for human-robot interaction, I have designed a system by which ISAC recognizes human arm gestures and repeats them back. The system requires minimal equipment to be worn by the user: a simple pair of brightly colored gloves is all that is needed. This is in contrast to other approaches that require the user to don a set of encoders on his or her arms to record precise joint angles, which is impractical in many real-world settings.
Here I present the motivations for the creation of the system and describe several tools that simplified its construction. The methods by which the system tracks, filters, compresses, and translates the motion of the human operator’s hands are also discussed.
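The tracking step described above relies on locating the brightly colored gloves in each camera frame. As a minimal sketch of that idea (not the thesis's actual implementation), the hypothetical functions below threshold a frame by color to find the glove's centroid, then smooth the resulting trajectory with an exponential moving average; the color ranges and the synthetic test frame are illustrative assumptions.

```python
import numpy as np

def track_hand(frame_rgb, lo, hi):
    """Return the centroid (row, col) of pixels inside the RGB range, or None.

    lo and hi are illustrative per-channel bounds for the glove color.
    """
    mask = np.all((frame_rgb >= lo) & (frame_rgb <= hi), axis=-1)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None  # glove not visible in this frame
    return (ys.mean(), xs.mean())

def smooth(path, alpha=0.5):
    """Exponential moving average over a sequence of 2-D points,
    a simple stand-in for the filtering stage."""
    out, prev = [], None
    for p in path:
        prev = p if prev is None else (alpha * p[0] + (1 - alpha) * prev[0],
                                       alpha * p[1] + (1 - alpha) * prev[1])
        out.append(prev)
    return out

# Synthetic 8x8 frame with a bright-green "glove" blob at rows 2-3, cols 5-6.
frame = np.zeros((8, 8, 3), dtype=np.uint8)
frame[2:4, 5:7] = (0, 255, 0)
print(track_hand(frame, lo=(0, 200, 0), hi=(50, 255, 50)))  # -> (2.5, 5.5)
```

A real pipeline would run this per frame, in a color space less sensitive to lighting (e.g. HSV), and feed the smoothed trajectory to the compression and translation stages discussed below.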