Category:
Robotics - robotic
Year of publication:
2017
English title:
Transferring skills to humanoid robots by extracting semantic representations from observations of human activities
Persian translation of the title:
انتقال مهارت ها به ربات های انسان نما با استخراج نمایه های معنایی از مشاهدات فعالیت های انسانی
Source:
ScienceDirect - Elsevier - Artificial Intelligence 247 (2017) 95–118. DOI: 10.1016/j.artint.2015.08.009
Authors:
Karinne Ramirez-Amaro, Michael Beetz, Gordon Cheng
English abstract:
Article history: Received in revised form 22 July 2015; Accepted 19 August 2015; Available online 6 September 2015.
In this study, we present a framework that infers human activities from observations using semantic representations. The proposed framework can be utilized to address the difficult and challenging problem of transferring tasks and skills to humanoid robots. We propose a method that allows robots to obtain and determine a higher-level understanding of a demonstrator’s behavior via semantic representations. This abstraction from observations captures the “essence” of the activity, thereby indicating which aspect of the demonstrator’s actions should be executed in order to accomplish the required activity. Thus, a meaningful semantic description is obtained in terms of human motions and object properties. In addition, we validated the semantic rules obtained in different conditions, i.e., three different and complex kitchen activities: 1) making a pancake; 2) making a sandwich; and 3) setting the table. We present quantitative and qualitative results, which demonstrate that without any further training, our system can deal with time restrictions, different execution styles of the same task by several participants, and different labeling strategies. This means the rules obtained from one scenario are still valid even for new situations, which demonstrates that the inferred representations do not depend on the task performed. The results show that our system correctly recognized human behaviors in real-time in around 87.44% of cases, which was even better than a random participant recognizing the behaviors of another human (about 76.68%). In particular, the semantic rules acquired can be used to effectively improve the dynamic growth of the ontology-based knowledge representation.
Hence, this method can be used flexibly across different demonstrations and constraints to infer and achieve a similar goal to that observed. Furthermore, the inference capability introduced in this study was integrated into a joint space control loop for a humanoid robot, an iCub, for achieving similar goals to the human demonstrator online. © 2015 Elsevier B.V. All rights reserved.
Keywords: Activity recognition | Human understanding | Knowledge-based | Semantic representation | Skill transfer
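The abstract describes inferring an activity label from human motion plus object properties via semantic rules. As an illustrative sketch only (not the authors' code; the motion labels, property names, and rules below are hypothetical placeholders), such rule-based inference can look like:

```python
# Hypothetical sketch of semantic-rule activity inference: map a low-level
# hand-motion label and two object properties (object held, object acted on)
# to a higher-level activity label. The specific rules and labels here are
# illustrative assumptions, not the paper's actual rule set.

def infer_activity(motion, object_in_hand, object_acted_on):
    """Return an activity label from motion + object-property observations."""
    if motion == "move":
        if object_in_hand is None and object_acted_on is not None:
            return "reach"      # empty hand moving toward an object
        if object_in_hand is not None:
            return "transport"  # carrying a held object
        return "idle_motion"    # moving with no object involved
    # hand is not moving
    if object_in_hand is not None and object_acted_on is not None:
        return "use"            # acting on one object with another
    if object_in_hand is not None:
        return "hold"
    return "idle"

print(infer_activity("move", None, "pancake_mix"))      # reach
print(infer_activity("not_move", "knife", "bread"))     # use
```

Because each rule reads only the current observation, inference of this kind can run in real time, which is consistent with the online recognition setting the abstract reports.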
Price: Free
Additional notes:
Number of comments: 0