The paper focuses on interaction in human-robot collaboration: on the one hand, the robot assistance system adapts individually to the employee; on the other hand, the employee is given an interface for influencing certain robot positions. The aim is to support the employee in assembly tasks. The employee's personal anthropometric data, as well as age-related and temporary movement restrictions, are recorded individually via motion capturing before the workplace is built in a virtual and a real environment. Based on these data, task-specific movements of the employee are simulated using digital human models as a virtual representation of the employee, combined with an ergonomic analysis within the work environment. The employee's influence on the robot assistance system is enabled through the design of intuitive user interfaces: the robot positions the assembly components user-specifically, and the employee can additionally adjust the position or rotate the components via a graphical user interface. This paper presents preliminary results of this ongoing research project, together with two reference processes from the field of assembly technologies as application examples.
This article is published in the journal "Procedia CIRP", Volume 44, pp. 275-280.
C. Thomas, L. Stankiewicz, A. Grötsch, S. Wischniewski, J. Deuse, B. Kuhlenkötter:
Intuitive Work Assistance by Reciprocal Human-robot Interaction in the Subject Area of Direct Human-robot Collaboration.
In: Procedia CIRP, Volume 44, 2016, pp. 275-280. Project number: F 2351. DOI: 10.1016/j.procir.2016.02.098
© Federal Institute for Occupational Safety and Health