FlExible assembLy manufacturIng with human-robot Collaboration and digital twin modEls (FELICE)

 

FELICE addresses one of the greatest challenges in robotics: the coordinated interaction and combination of human and robot skills. The project targets the application priority area of agile production and aspires to design the next generation of assembly processes required to effectively address current and pressing needs in manufacturing. To this end, it envisages adaptive workspaces and a cognitive robot collaborating with workers in assembly lines. FELICE unites multidisciplinary research in collaborative robotics, AI, computer vision, IoT, machine learning, data analytics, cyber-physical systems, process optimization and ergonomics to deliver a modular platform that integrates and harmonizes an array of autonomous and cognitive technologies in order to increase the agility and productivity of a manual assembly production system, ensure the safety of factory workers, and improve their physical and mental well-being. The key to achieving these goals is to develop technologies that combine the accuracy and endurance of robots with the cognitive ability and flexibility of humans. Being inherently more adaptive and configurable, such technologies will help future manufacturing assembly floors become agile, allowing them to respond in a timely manner to customer needs and market changes.

The FELICE framework comprises two layers:

  1. A local layer, introducing a single collaborative assembly robot that roams the shop floor assisting workers, together with adaptive workstations able to automatically adjust to the workers’ somatometries and to provide multimodal informative guidance and notifications on assembly tasks
  2. A global layer, which senses and operates upon the real world via an actionable digital replica of the entire physical assembly line

Related developments will proceed along the following directions:

  1. Implementing perception and cognition capabilities based on heterogeneous sensors on the shop floor, allowing the system to build context awareness
  2. Advancing human-robot collaboration, enabling robots to operate safely and ergonomically alongside humans, sharing and reallocating tasks between them, allowing the reconfiguration of an assembly production process in an efficient and flexible manner
  3. Realizing a manufacturing digital twin, i.e. a virtual representation tightly coupled with production assets and the actual assembly process, enabling the management of operating conditions, the simulation of the assembly process and the optimization of various aspects of its performance.
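As a toy illustration of the third direction, a digital twin can be sketched as a replica that ingests station states from the physical line and flags where the collaborative robot should assist. All names, fields and thresholds below are hypothetical and greatly simplified, not the FELICE implementation:

```python
from dataclasses import dataclass

@dataclass
class StationState:
    station_id: str
    cycle_time_s: float   # last measured assembly cycle time
    worker_present: bool

class AssemblyLineTwin:
    """Minimal digital-twin sketch: mirrors the state of each workstation
    and flags stations whose cycle time drifts above a threshold."""

    def __init__(self, nominal_cycle_s):
        self.nominal = nominal_cycle_s
        self.stations = {}

    def ingest(self, state):
        # A sensor update from the physical line keeps the replica in sync.
        self.stations[state.station_id] = state

    def bottlenecks(self, tolerance=1.2):
        # Analysis on the replica: where should the mobile robot assist?
        return [s.station_id for s in self.stations.values()
                if s.cycle_time_s > tolerance * self.nominal]
```

In a full system the replica would additionally drive simulation and optimization of the assembly process; here it performs only a single analysis query.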

FELICE foresees two environments for experimentation, validation, and demonstration. The first is a small-scale prototyping environment aimed at validating technologies before they are applied in a larger setting; the second is the industrial environment of one of the largest automotive manufacturers in Europe. The consortium views this quest as a timely response to international competition, trends, and progress, pursuing results that are visionary and far beyond the current state of the art.


Project name:
FELICE – Flexible Assembly Manufacturing with Human-Robot Collaboration and Digital Twin Models

Funding:
H2020-EU.2.1.1. (EU Grant ID number: 101017151)

Total Budget:
€ 6 342 975

Duration:
01.01.2021– 30.06.2024

Coordinator:

INSTITUTE OF COMMUNICATION AND COMPUTER SYSTEMS, Greece

Partners:

PROFACTOR GMBH, Austria

CENTRO RICERCHE FIAT SCPA, Italy

FH OO FORSCHUNGS & ENTWICKLUNGS GMBH, Austria

AEGIS IT RESEARCH GMBH, Germany

FORSCHUNGSGESELLSCHAFT FUR ARBEITSPHYSIOLOGIE UND ARBEITSSCHUTZ E.V., Germany

IDRYMA TECHNOLOGIAS KAI EREVNAS, Greece

CAL-TEK SRL, Italy

TECHNISCHE UNIVERSITAT DARMSTADT, Germany

UNIVERSITA DEGLI STUDI DI SALERNO, Italy

FRAUNHOFER GESELLSCHAFT ZUR FOERDERUNG DER ANGEWANDTEN FORSCHUNG E.V., Germany

STANCZYK BARTLOMIEJ, Poland

EUNOMIA LIMITED, Ireland

Your Contact

DI Sharath Chandra Akkaladevi
Scientist

+43 72 52 885 325
sharath.akkaladevi@profactor.at

We answer…

… your questions

While traditional automation is suitable for repetitive executions, it reaches its limits when robots need to act in a Human-Robot Collaboration (HRC) environment. Here, challenges such as high product and variant diversity, small lot sizes and, in particular, short setup times make it necessary to develop new ways to interact with robots.

The focus of this project is to learn activities by observing the worker. Building on the results of the previous project, methods will be researched for aligning visual information of objects with robotic motion.

 

For this purpose, powerful 2D image processing algorithms based on “deep learning” are synergistically combined with current developments in the field of “reinforcement learning”.

The main research goals of the project are:

* Visual understanding of a demonstrated process through deep neural networks supported by instrumented tools.

* Generalization of process knowledge through deep reinforcement learning.

* Synthesis of movements for new parts using movement primitives.
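As an illustration of the third goal, a movement primitive can be retargeted to a new part by rescaling a demonstrated trajectory to a new start and goal while preserving its shape. The sketch below is a deliberately simplified, hypothetical stand-in for movement primitives such as DMPs, not the project's method:

```python
def retarget_primitive(demo, new_start, new_goal):
    """Rescale a demonstrated 1-D trajectory (list of positions) to a new
    start and goal, preserving its shape -- a toy stand-in for movement
    primitives; real DMPs also encode the velocity profile."""
    d0, d1 = demo[0], demo[-1]
    span = (d1 - d0) or 1e-9          # avoid division by zero for flat demos
    scale = (new_goal - new_start) / span
    return [new_start + (p - d0) * scale for p in demo]
```

A trajectory demonstrated between 0 and 1 can thus be reused for a part located between 2 and 4 without a new demonstration.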

 

Project name:
LERN4MRKII: Extended modelling, learning and abstraction of processes for human-robot cooperation.

Funding:
AIT Strategy Research Programme

Duration
01.01.2019 – 31.12.2019

Your Contact

DI Dr. Gernot Stübl
Scientist
Robotics and Assistive Systems

+43 72 52 885 313
gernot.stuebl@profactor.at

We answer …

… your questions

The aim of the project is to enable non-experts to easily teach robots a complex assembly process using natural communication methods. For robot systems to learn from a non-expert user, the robot must first understand the user’s intention. In this project, human intentions are understood both through verbal communication and by observing human actions. To recognize human intentions through voice communication, PROFACTOR will work together with LIFEtool on the development of a new communication interface for robots. LIFEtool, with its extensive knowledge of online and offline speech recognition technologies, will support PROFACTOR in developing an interface for communicating human intentions to the robot. PROFACTOR will develop a “portable” activity detection system capable of detecting human actions using low-cost sensors. A feasibility study will also be conducted to partially validate the applicability of such an activity/gesture recognition system to other domains (health care). The advantage of such a module would be its applicability to different scenarios requiring the understanding of human intentions.

The strategy for enabling non-experts to easily teach robot systems is twofold. First, a ‘learning by interaction’ framework is developed in which the robot provides the user with a set of intelligent suggestions during the learning process. The robot uses its knowledge modeling and reasoning capabilities and takes into account the “current situation” of the assembly environment (detected with the activity detection system) to make these suggestions. Then a ‘learning by instruction’ framework is developed that allows natural language to be used as a communication mode between the user and the robot. Both frameworks are finally combined into a bidirectional communication channel between the user and the robot in order to provide feedback or to re-learn (in whole or in part) the assembly process.
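The twofold strategy can be sketched as a single decision step. The task model and all names below are hypothetical illustrations; the real frameworks are far richer:

```python
# Hypothetical task model: each detected activity maps to candidate next steps.
TASK_GRAPH = {
    "picked_screw": ["fasten_screw", "hand_over_screwdriver"],
    "aligned_parts": ["fasten_screw", "inspect_alignment"],
}

def suggest_next_step(detected_activity, spoken_intent=None):
    """Combine 'learning by interaction' (activity detection) with
    'learning by instruction' (a spoken keyword) to pick a suggestion."""
    candidates = TASK_GRAPH.get(detected_activity, [])
    if spoken_intent:
        # A verbal instruction narrows the interaction-based candidates.
        narrowed = [c for c in candidates if spoken_intent in c]
        if narrowed:
            return narrowed[0]
    return candidates[0] if candidates else None
```

Without an instruction the robot falls back on its own best suggestion; with one, the spoken intent overrides it, mirroring the bidirectional channel described above.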

Project name:
BRIDGE – HUMAN ROBOT INTERACTION (TEACHBOTS)

Industrial Project

Duration:
04.2019 – 12.2020

Partners:

Profactor GmbH

LIFEtool GmbH

Keywords: Human Robot Collaboration; Programming by Interaction; Programming by Instruction; Artificial Intelligence

Your Contact

Sharath Chandra Akkaladevi
Scientist
Robotics and Assistive Systems

+43 72 52 885 325
Sharath.Akkaladevi@profactor.at

We answer …

… your questions

SYNERGY aims at strengthening currently underdeveloped linkages, cooperation and synergies between companies, industry, research, intermediaries and policy makers in central Europe. The project will analyse funded and finalised innovation projects and cluster the institutions involved into three key areas covering the most promising modern industrial technologies.

These areas include

  • additive manufacturing and 3D printing,
  • micro- and nanotechnology-related processes and materials, as well as
  • the industry 4.0 sector.

Institutions and clusters included in each area will form ‘synergic networks’ based on a novel projects assessment methodology and a ‘synergic consortia matchmaking’ IT online tool. Moreover, the project will define new crowd innovation services and test them in different types of pilot actions. As a result, project activities will boost the creation of innovative services and facilitate transnational cooperation in the industrial sector.

 

Project name:
Synergy – SYnergic Networking for innovativeness Enhancement of central european actoRs focused on hiGh-tech industrY

Funding:

Interreg – Central Europe Programme

 

Duration:  
01.08.2017 – 31.07.2020

Website:
http://www.interreg-central.eu/Content.Node/SYNERGY.html

Your Contact

DI Christian Wögerer, MSc

International Networks

+43 72 52 885 200
christian.woegerer@profactor.at

We answer …

… your questions

Collaborative lightweight robots are a trend in industry. They are comparatively cheap. Developments in the field of machine learning make them increasingly flexible and easier to use. The challenges for research are obvious. The machines must be equipped with cognitive intelligence in order to adapt to a changed environment and understand new tasks. The aims of the project go far beyond the state of the art of research.

The focus is on “transfer learning”: from manual human activities to the robot and from a product or process variant to a similar one.

The main research objectives of the project are:

1. mapping of human motion to the robot
2. the “understanding” of temporal task correlations and process parameters by the robot
3. adaptability to similar processes with as few new examples as possible
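The first objective, mapping human motion to the robot, can be illustrated by a naive joint-space retargeting step that clamps observed human joint angles into the robot's joint limits. This is an illustrative simplification, not the project's actual transfer-learning method:

```python
def map_human_to_robot(human_angles, joint_limits):
    """Naive retargeting: clamp each observed human joint angle (radians)
    into the corresponding robot joint's [lo, hi] limit interval.
    Real transfer learning must also handle differing kinematics."""
    return [min(max(a, lo), hi)
            for a, (lo, hi) in zip(human_angles, joint_limits)]
```

Angles the robot can reproduce pass through unchanged; angles outside its workspace are saturated at the nearest limit.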

No external cooperation is envisaged for the implementation of the project.

Project name:
LERN4MRK: Modelling, learning and abstraction of processes for human-robot cooperation

Funding:
BMVIT

Duration:  
01.07.2017 – 30.06.2021

Publications

S.C. Akkaladevi, M. Plasch, and A. Pichler, “Skill-based learning of an assembly process” Elektrotech. Inftech. (2017) 134: 312, Springer Vienna. https://doi.org/10.1007/s00502-017-0514-2 

C. Heindl, T. Poenitz, G. Stuebl, A. Pichler, and J. Scharinger, “Spatio-thermal depth correction of RGB-D sensors based on Gaussian Processes in real-time” in The 10th International Conference on Machine Vision, to be published, 2017.

Your Contact

DI Dr. Gernot Stübl
Scientist
Robotics and Assistive Systems

+43 72 52 885 313
gernot.stuebl@profactor.at

We answer …

… your questions

The manufacturing industry is a generator of research and development, innovation, growth and employment. Facing increasing pressure (growing production capacity in low-cost economies and increasingly sophisticated supply chains in high-cost economies), manufacturers need to embrace novel technologies, principles and approaches.

In other words, manufacturers need to digitize their production while also improving their processes and human resource management.

The main objective of the Smart Factory HUB project is to improve framework conditions for innovation in the area of “smart factory”.

Therefore, the project’s goal is to develop R&D and business policy conditions for transnational cooperation in the manufacturing industry.

The result is improved cooperation between R&D and business: based on a RIS3 (Research and Innovation Smart Specialisation Strategy) centred model, quadruple-helix partners will seek novel solutions in three domains: applying novel technologies, effective production processes, and effective human resource management systems.

 

 

SMART FACTORY COOPERATION PLATFORM is online.

Project name:
SMART FACTORY HUB – IMPROVING R&D AND BUSINESS POLICY CONDITIONS FOR TRANSNATIONAL COOPERATION IN THE MANUFACTURING INDUSTRY

Funding:

199,825 EUR
Interreg – Danube Transnational Programme

Project Co-funded by European Union funds (ERDF, IPA, ENI)

Duration:  
01.01.2017 – 30.06.2019

Website:
http://www.interreg-danube.eu/approved-projects/smart-factory-hub

Your Contact

Verena Musikar BA MSc
Project Management Assistance
Corporate Communication

+43 72 52 885 142
verena.musikar@profactor.at

We answer…

… your questions

Partners:

Pomurje Technology Park
Technical University of Cluj-Napoca
Croatian Agency for SMEs, innovations and investments
University of Stuttgart – Institute for Human Factors and Technology Management
PROFACTOR GMBH
University of West Bohemia
Slovak Chamber Of Commerce and Industry
Pannon Business Network Association
Foundation “Cluster Information and Communications Technologies”
Chamber of Commerce and Industry of Serbia
Public Agency for Entrepreneurship, Internationalisation, Foreign Investments and Technology
Stuttgart Region Economic Development Corporation
Ministry of Economy of the Republic of Serbia

Markets in which demand cannot always be predicted precisely, and companies working with small production lots, do not benefit from the increased productivity offered by robots. The trend toward customized series production further aggravates the problem. Industrial robot installations require a lot of space and safety infrastructure such as automated fences and doors. This increases system costs as well as the costs of automating and programming the installations. Production in small lot sizes, down to lot-size-1 products, requires robot systems that are mobile, can work without safety fences (ideally in human-robot cooperation) and, thanks to their sensory capabilities, can adapt their programming to an a priori undefined environment (with regard to workpiece presentation, …). This greatly increases system complexity, since in the extreme case machine-vision specialists and specialists in programming sensor-data-dependent robot software are needed in addition to plant builders and robot programmers.

In the FlexRoP project, systems are being developed that are more flexible, easier to program, and capable of learning.

Project goals in detail:

  • Integration of an assistive robot platform with extended sensory capabilities
  • Definition of a universal representation of the skill needed to accomplish an assembly task
  • Implementation of automatic and semi-automatic capabilities for parameterizing skills through visual and kinesthetic observation of humans
  • Generalization techniques so that the skills acquired in the teaching phase can be applied in different situations
  • Implementation of action-synthesis algorithms to derive machining programs from sensor data
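A universal skill representation (the second goal) might, in a minimal sketch, be a named operation with open parameter slots that observation fills in. The structure and all names below are hypothetical, not the FlexRoP data model:

```python
from dataclasses import dataclass, field

@dataclass
class Skill:
    """A named assembly skill whose parameters are bound later, e.g. from
    visual or kinesthetic observation of a human demonstration."""
    name: str
    params: dict = field(default_factory=dict)

    def bind(self, **observed):
        """Fill parameter slots from observation; returns self for chaining."""
        self.params.update(observed)
        return self

    def is_ready(self, required):
        """A skill is executable once all required parameters are bound."""
        return all(k in self.params for k in required)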

Project name:
FlexRoP: Flexible, assistive robot for customized production

Funding:
FFG – ICT of the Future

Duration:
1.09.2016 – 31.08.2018

Your Contact

Markus Ikeda
Scientist
Robotics and Assistive Systems

+43 72 52 885 308
markus.ikeda@profactor.at

We answer …

… your questions

The complexity of assembly processes, and of manufacturing processes in general, is increasing with respect to flexibility, frequent changes of products and markets, and shorter product life cycles. Currently, the adaptation of manufacturing processes requires significant resources and time. Instead of creating ever more complex technical systems, the approach in industry is to strengthen the cooperation between humans and machines. This project focuses on the interaction between humans and robots in production environments.

Research deals with the development and evaluation of concepts for cognitive assistance systems that are based on a mutual understanding of the task and of how it is to be split between humans and machines. This requires a suitable representation of the environment, the task and the objects and, in a second stage, methods and concepts to extend, parameterize and adapt the system for future tasks.

The machine also has to be able to combine basic actions into complex tasks. This topic has not yet been addressed in production research.
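Combining basic actions into a complex task can be sketched as simple sequencing with precondition checks. The action names and world model below are hypothetical illustrations, not the project's representation:

```python
def run_task(actions, world):
    """Execute a sequence of basic actions; each action declares a
    precondition on the world state and an effect that updates it."""
    for name, precondition, effect in actions:
        if not precondition(world):
            return f"blocked at {name}"
        world.update(effect)
    return "done"

# Hypothetical composed task: pick a part, then place it.
TASK = [
    ("pick",  lambda w: w.get("part_visible"), {"holding": True}),
    ("place", lambda w: w.get("holding"),      {"holding": False, "assembled": True}),
]
```

A blocked precondition is exactly the point where a cognitive assistance system would hand the step over to the human partner.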

Project Name:    
Cooperation models for assistive human-machine interaction in the production process

Funding:    
BMVIT

Duration:   
01.07.2014 – 30.06.2017

Publications

2016
  • Sharath Chandra Akkaladevi, Matthias Plasch, Andreas Pichler, Bernhard Rinner, Human Robot Collaboration to Reach a Common Goal in an Assembly Process, accepted for publication at ECAI 2016
  • Sharath Akkaladevi, Martin Ankerl, Christoph Heindl, Andreas Pichler, Tracking multiple rigid symmetric and non-symmetric objects in real-time using depth data, ICRA 2016
  • Sriniwas Chowdhary Maddukuri, Gerald Fritz, Sharath Chandra Akkaladevi, Matthias Plasch, Andreas Pichler, Trajectory planning based on activity recognition and identification of low-level process deviations, Austrian Robotics Workshop 2016
  • Sharath Chandra Akkaladevi, Martin Ankerl, Gerald Fritz, Andreas Pichler, Real-time tracking of rigid objects using depth data, Austrian Robotics Workshop 2016
2015
  • Sharath Akkaladevi, Christoph Heindl, Action Recognition for Human-Robot Interaction in Industrial Applications, IEEE International Conference on Computer Graphics, Vision and Information Security (CGVIS),  3. Nov. 2015
  • Sharath Akkaladevi, Christoph Heindl, Alfred Angerer, Juergen Minichberger, Action Recognition in Industrial Applications using Depth Sensors, Austrian Robotics Workshop 2015, May 07 – 08, 2015
  • Martijn Rooker, Sriniwas Chowdhary Maddukuri, Jürgen Minichberger, Christoph Feyrer, Helmut Nöhmayer and Andreas Pichler, Interactive Workspace Modelling for Assistive Robot Systems with the Aid of Ultrasonic Sensors, Proc. of the International Conference on Flexible Automation and Intelligent Manufacturing (FAIM), 23 – 26 June 2015

Your Contact

Dr. Christian Eitzinger
Head of Machine Vision

+43 7252 885 250
christian.eitzinger@profactor.at

We answer …

… your questions

The aerospace industry typically has to handle very large parts that remain stationary in a workcell for several days until a process is completed. Conventional robots cannot be successfully implemented in this environment.

The European research project “VALERI” addresses this problem by developing a mobile robot platform that can perform similar tasks in different workcells together with humans. This also requires safe human-robot interaction. Improving the ergonomic aspects of a workplace is another goal of the project.

The project aims to solve problems that have so far prevented the use of mobile robots, e.g. machine vision to support navigation and motion planning of the robot. Process models will be used for path planning, and the size of aerospace parts requires that the mobile platform is also considered part of the kinematics, resulting in truly mobile manipulation.
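Treating the base as part of the kinematic chain simply means the base pose enters the forward kinematics alongside the arm joints. A planar toy example with hypothetical link lengths:

```python
import math

def mobile_planar_fk(base_x, q1, q2, l1=1.0, l2=0.8):
    """Forward kinematics of a 2-link planar arm on a mobile base that
    translates along x: the base coordinate is part of the chain, so a
    planner searching over (base_x, q1, q2) performs mobile manipulation."""
    x = base_x + l1 * math.cos(q1) + l2 * math.cos(q1 + q2)
    y = l1 * math.sin(q1) + l2 * math.sin(q1 + q2)
    return x, y
```

Driving the base forward extends the reachable workspace without changing the arm configuration, which is what makes very large aerospace parts accessible.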

In this project PROFACTOR is developing a machine vision system for the inspection of the parts, and contributes to human-machine interaction.

The applicability will be tested on two typical use cases of the aerospace industry: quality control of large parts and application of sealant along a groove. These tasks are applied throughout the whole production line.

Project Name:
Validation of Advanced, Collaborative Robotics for Industrial Applications

Funding:
EU – FP7-2012-NMP-ICT-FoF

Duration:
01.11.2012 – 31.10.2015

Your Contact

Dr. Christian Eitzinger
Head of Machine Vision

+43 7252 885 250
christian.eitzinger@profactor.at

We answer …

… your questions