A joint project of the Graduate School, Peabody College, and the Jean & Alexander Heard Library

Title page for ETD etd-04062004-164409

Type of Document Dissertation
Author Peng, Jian
Author's Email Address jian.peng@vanderbilt.edu
URN etd-04062004-164409
Title Extraction of Salient Features from Sensory-Motor Sequences for Mobile Robot Navigation
Degree PhD
Department Electrical Engineering
Advisory Committee
Advisor Name              Title
Richard Alan Peters, II   Committee Chair
David C. Noelle           Committee Member
Joseph S. Lappin          Committee Member
Kazuhiko Kawamura         Committee Member
Mitch Wilkes              Committee Member
Keywords
  • imitation-based learning
  • sensory-motor coordination
  • computer vision
Date of Defense 2004-02-24
Availability unrestricted
This dissertation presents a method for extracting features salient to a mobile robot navigation task in a specific environment. The extraction process is bootstrapped by a human operator's tele-operation and is based on the principle of sensory-motor coordination. Salient feature extraction consists of three steps: tele-operation, offline association, and evaluation. First, the mobile robot is tele-operated along a path in an environment several times, and all sensory data and motor drive commands are recorded. These recorded sensory-motor sequences are then partitioned into episodes according to changes in the motor commands. Salient features are extracted using two statistical criteria: consistency across runs, and correlation with the motor commands within an interval around the episode boundaries. Finally, these features are used to drive the robot in the learned environment. Two sets of experiments, in indoor and outdoor environments, were performed; the results support the methodology.
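The offline association step described above can be illustrated with a minimal sketch: partition a recorded sequence into episodes wherever the motor command changes, then score a sensory feature by its correlation with the motor commands. All names and the toy data below are illustrative assumptions, not taken from the dissertation itself.

```python
def segment_episodes(motor_commands):
    """Return (start, end) index pairs; a new episode begins
    whenever the motor command changes."""
    boundaries = [0]
    for i in range(1, len(motor_commands)):
        if motor_commands[i] != motor_commands[i - 1]:
            boundaries.append(i)
    boundaries.append(len(motor_commands))
    return list(zip(boundaries[:-1], boundaries[1:]))

def correlation(xs, ys):
    """Pearson correlation of two equal-length sequences
    (returns 0.0 when either sequence is constant)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5 if sxx and syy else 0.0

# Toy run: the motor command switches from 0 to 1 mid-sequence,
# and one sensory feature happens to track that change closely.
motor = [0, 0, 0, 1, 1, 1]
feature = [0.1, 0.1, 0.2, 0.9, 1.0, 0.9]
episodes = segment_episodes(motor)   # [(0, 3), (3, 6)]
score = correlation(feature, motor)  # high: feature tracks the command
```

A feature scoring consistently high across repeated tele-operated runs would be a candidate salient feature in the sense the abstract describes.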
Filename               Size      Approximate Download Time (Hours:Minutes:Seconds)
                                 28.8 Modem   56K Modem   ISDN (64 Kb)   ISDN (128 Kb)   Higher-speed Access
final_electrical.pdf   3.15 Mb   00:14:35     00:07:30    00:06:33       00:03:16       00:00:16
