A joint project of the Graduate School, Peabody College, and the Jean & Alexander Heard Library

Title page for ETD etd-04062017-164218

Type of Document Master's Thesis
Author Lan, Ke
Author's Email Address ke.lan@vanderbilt.edu
URN etd-04062017-164218
Title Skill Transfer between Industrial Robots by Sparse Learning
Degree Master of Science
Department Electrical Engineering
Advisory Committee
  • Richard Alan Peters, Committee Chair
  • D. Mitchell Wilkes, Committee Member
Keywords
  • robots
  • demonstration
  • knowledge transfer
Date of Defense 2017-04-25
Availability unrestricted
Recently, industrial robots have played a key role in many industries (e.g., automobile and food production) by increasing manufacturing productivity. However, two problems are rarely addressed in this field. First, compared with automation in other fields, industrial robots are still programmed manually by a human operator. Second, because of the physical differences between robots and the differences between operating platforms, there is no general method for defining robot skills (motion records) so that they can be reused across robots. In this work, we propose a skill definition and transfer system that combines the strengths of the traditional DMP algorithm with deep learning methods. Specifically, in our method, a set of motion primitive bases is generated from the motion records of different robots. Skills are redefined by their linear coefficients over these primitive bases and transferred by translating the motion primitive bases between platforms. Experiments show that our method can successfully transfer skills between different robot models with a smaller space requirement.
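The core idea of the abstract — encoding a skill as a sparse coefficient vector over motion primitive bases and transferring it by translating between basis sets — can be sketched in a few lines of NumPy. This is a minimal illustration only, assuming random stand-in bases and an ordinary least-squares encoder (the thesis's sparse-learning and DMP components are not reproduced here); all variable names are hypothetical.

```python
import numpy as np

# Hypothetical sketch: a "skill" is a coefficient vector over a motion
# primitive basis; transfer maps coefficients from a source robot's basis
# to a target robot's basis. Basis contents are random stand-ins.

rng = np.random.default_rng(0)
T, K = 100, 8          # trajectory length, number of primitive bases

# Source robot's primitive bases (each column is one primitive),
# in practice learned from that robot's motion records.
B_src = rng.standard_normal((T, K))

# A demonstrated skill: a trajectory generated by a sparse coefficient vector.
c_true = np.zeros(K)
c_true[[1, 4]] = [0.8, -0.5]       # only two primitives active
trajectory = B_src @ c_true

# Re-encode the skill as basis coefficients via least squares (a sparse
# solver such as Lasso would enforce sparsity; omitted for brevity).
c_hat, *_ = np.linalg.lstsq(B_src, trajectory, rcond=None)

# Transfer: the target robot has its own bases; a linear map M fitted
# between the two basis sets carries the coefficients across platforms.
B_tgt = rng.standard_normal((T, K))
M = np.linalg.lstsq(B_tgt, B_src, rcond=None)[0]   # B_tgt @ M ≈ B_src
c_tgt = M @ c_hat
trajectory_tgt = B_tgt @ c_tgt     # skill reproduced in the target basis

# Only the K coefficients need storing/transmitting, not the full trajectory,
# which is the "less space requirement" claim in the abstract.
print(np.allclose(c_hat, c_true))  # → True
```

The space saving comes from K (here 8) being much smaller than the trajectory length T (here 100); the quality of the transfer depends entirely on how well the target bases can express the source bases.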

Filename: thesis_KeLan.pdf
Size: 7.17 Mb
Approximate Download Time (Hours:Minutes:Seconds)
  • 28.8 Modem: 00:33:11
  • 56K Modem: 00:17:04
  • ISDN (64 Kb): 00:14:56
  • ISDN (128 Kb): 00:07:28
  • Higher-speed Access: 00:00:38
