Classifying Hand and Arm Movements

For humans to communicate with machines, conventional input devices such as the mouse and keyboard are increasingly being replaced by so-called natural user interfaces (NUIs). NUIs encompass a range of input methods modeled on natural human communication, including voice control and gestures. In recent years, gesture recognition has made great advances, particularly in consumer entertainment electronics. When it comes to classifying hand and arm movements, however, research is not restricted to gesture control. By automatically recognizing typical everyday hand and arm movements, the technology can do far more than enable direct communication with a machine. For example, it can be used to track the workflow during a manual assembly process: with the aid of machine learning, hand and arm movements can be classified to determine whether a specific assembly step has been performed. This non-invasive form of task monitoring makes it possible to develop technical systems that only take up the worker's time when it is unavoidable, for example when an error has been made. Classifying hand and arm movements can also support diagnostics.
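To make the idea of classifying movements into assembly steps more concrete, the following minimal sketch reduces windows of tracked hand/arm motion to simple statistical features and feeds them to an off-the-shelf classifier. The sensor layout, window size, feature set, labels, and choice of a random-forest model are illustrative assumptions, not the method used in the project; the data here is simulated purely as a placeholder.

```python
# Minimal sketch (assumed pipeline, not the project's actual method):
# classify windows of hand/arm motion data into assembly-step labels.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

def extract_features(window: np.ndarray) -> np.ndarray:
    """Summarize one window of motion samples (n_samples x n_channels)
    with simple per-channel statistics."""
    return np.concatenate([
        window.mean(axis=0),                            # average value per channel
        window.std(axis=0),                             # movement variability per channel
        np.abs(np.diff(window, axis=0)).mean(axis=0),   # mean absolute change (speed proxy)
    ])

# Placeholder data: 600 windows of 50 samples x 12 channels
# (e.g., 3-D coordinates of four tracked hand/arm joints), with labels
# 0 = "step performed", 1 = "step not performed", 2 = "other activity".
rng = np.random.default_rng(0)
windows = rng.normal(size=(600, 50, 12))
labels = rng.integers(0, 3, size=600)

X = np.array([extract_features(w) for w in windows])
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.25, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```

In a real setting, the input windows would come from 3-D scene analysis (e.g., tracked joint positions) or wearable motion sensors rather than random numbers, and the simple window statistics could be replaced by a sequence model; the overall structure of windowing, feature extraction, and classification would stay the same.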


MonSiKo: an adaptive assistance system simplifies assembly in Industrie 4.0 processes

The project "Adaptive assembly assistance and interaction system using 3-D scene analysis and intuitive man-machine communication", abbreviated to "MonSiKo" from its German title, focused on supporting assembly processes through the use of modern sensor and communication technologies.