Which production step is an individual employee currently working on? How do they act at assembly stations and workbenches, and with what strategy do they solve a task? What routes do they walk, how fast do they move, and what ergonomic strain are they exposed to?

The more flexible and unpredictable production becomes, and the more unstructured the individual workplace, the more valuable this knowledge is. It is the raw material for a production environment that depends on new assistance systems.

PROFACTOR extracts this process knowledge using deep learning methods. Its in-house development, Assembly Eye, makes it possible to track people and their activities without invasive sensors: their movement through the room, their actions at workbenches, and their interactions with tools, components and machines. The technology is used for process quality assurance, process documentation and the optimization of workflows. Ultimately, it enables production systems to understand sequences of actions and to offer adequate, unobtrusive assistance functions in real time. Another application is operator guidance projected directly onto components: shown only when the employee needs it, and limited to the essentials.

The Assembly Eye digitizes and understands human-centered processes.
Assembly Eye uses a standard camera. With PROFACTOR's associated software, it extracts the movements of the individual actors from the image data in their spatial and temporal context and digitizes the information relevant to the process flow. Depending on requirements, this serves as the basis for (real-time) analysis tools or as input for trainable system intelligence (deep learning methods).
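As a minimal sketch of what such digitized movement data enables, the following Python snippet derives two of the quantities mentioned above (walking route and speed) from a per-frame trajectory. The data model is an assumption: Assembly Eye's actual output format is not described here, so we posit a simple timestamped floor-plane position per actor.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class Observation:
    """Assumed per-frame output: one tracked actor's floor-plane position."""
    t: float  # timestamp in seconds
    x: float  # metres
    y: float  # metres

def walk_metrics(track: list[Observation]) -> tuple[float, float]:
    """Return (path length in metres, average speed in m/s) for one actor."""
    length = 0.0
    for a, b in zip(track, track[1:]):
        length += hypot(b.x - a.x, b.y - a.y)
    duration = track[-1].t - track[0].t
    return length, length / duration if duration > 0 else 0.0

# Example: an employee walks from a shelf to a workbench in 4 seconds.
track = [Observation(0.0, 0.0, 0.0),
         Observation(2.0, 3.0, 0.0),
         Observation(4.0, 3.0, 4.0)]
dist, speed = walk_metrics(track)  # 7.0 m path, 1.75 m/s average
```

The same trajectory data could feed ergonomic-strain estimates or serve as training input for a learned model; this sketch only shows the simplest real-time analysis case.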

AEYE - Visual Large-scale Industrial Interaction Processing (UbiComp 2019)


Your Contact

DI(FH) Harald Bauer

Head of Visual Computing

+43 7252 885 302
harald.bauer@profactor.at

We answer your questions.