Currently, the use of bio-composites is limited to less critical applications that do not have significant requirements in terms of mechanical performance. However, synthetic composites made from carbon or glass fibre pose several difficulties in terms of recycling and of dependence on third countries: about 98% of these synthetic composites still end up in landfills, and about 80% of the raw materials are currently manufactured outside of Europe.

To improve this situation, the project addresses the challenges of using bio-composites for structural parts and aims to widen the range of applications in which bio-composites can be used. This will be achieved by developing an accurate draping process to control fibre orientation, by creating material models that capture the natural variability of the material, and by integrating nano-structured, bio-based sensors for load monitoring. Through the increased accuracy and additional control loops in the manufacturing process, the consortium expects to achieve predictable properties and consistent quality.
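
To make "capturing natural variability" more tangible, the following minimal Python sketch propagates scatter in fibre stiffness and draping-induced misalignment through a simple rule-of-mixtures stiffness estimate via Monte Carlo sampling. All values and distributions are illustrative assumptions, not project data.

```python
# Monte Carlo propagation of natural-fibre variability (illustrative values).
import math
import random
import statistics

def laminate_modulus(e_fibre, e_matrix, v_fibre, misalignment_deg):
    """Rule-of-mixtures longitudinal modulus, penalized for fibre misalignment."""
    e_parallel = v_fibre * e_fibre + (1.0 - v_fibre) * e_matrix
    return e_parallel * math.cos(math.radians(misalignment_deg)) ** 2

samples = []
for _ in range(10_000):
    e_f = random.gauss(45.0, 6.0)       # assumed flax fibre modulus [GPa], wide scatter
    mis = abs(random.gauss(0.0, 3.0))   # assumed draping misalignment [deg]
    samples.append(laminate_modulus(e_f, e_matrix=3.0, v_fibre=0.5,
                                    misalignment_deg=mis))

print(f"mean = {statistics.mean(samples):.1f} GPa, "
      f"stdev = {statistics.stdev(samples):.1f} GPa")
```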

Within the project, use cases from wind energy and boat building will be investigated, aiming at the manufacturing of a full-size rotor blade and a ship hull to demonstrate technical feasibility and achieve TRL 7 for the manufacturing technologies. In addition to the end users, the consortium includes partners from automation, machine building, measurement technology, material manufacturing and simulation software to cover all aspects of the developments. Based on the predicted growth of the bio-composites market, which is expected to increase by a factor of 2.5 by 2030, the consortium anticipates a market potential of about 100M€ in that timeframe.

PROFACTOR will not only take care of the project coordination, including data, risk and innovation management, but will also design and build the hardware setup of a sensor for the inspection of natural fibre materials. Further tasks are the adaptation of an existing draping robot cell for use with natural fibre materials and the design, development and manufacturing of a load sensor.

 

Project name: BioStruct

Project website: www.biostruct-project.eu/

Funding: HORIZON-IA

Project volume: €5,495,530

Project duration: 01 Jan 2024 – 31 Dec 2024

Project partners:

 

Contact

DI Daniela Kirchberger
Machine Vision

+43 7252 885 319
daniela.kirchberger@profactor.at


A data-driven remanufacturing process for sheet metal and thermoplastic composites (COMPASS)

The COMPASS project is driven by the need, on the one hand, to increase the efficiency of recycling and remanufacturing processes for sheet metal parts and, on the other hand, to find a solution for the large quantities of thermoplastic fiber-reinforced components at their end of life. In both cases the proposed approach is to take a shortcut by remanufacturing the components at the level of sheet metal and composite panels instead of converting them to secondary raw material. This will be achieved through (thermo-)forming processes that allow the re-shaping of parts and components to give them a second and third life.

These remanufacturing processes will be supported by a set of digital tools that build upon a digital component passport. The tools will enable efficient dismantling processes to extract sheets or panels, e.g. from an aircraft, and will help to collect relevant information about the components during their lifetime. The main use cases are from the aerospace and automotive sectors and include key actors along the value chain. Digital information about defects, re-work and repairs carried out during the lifetime of the components will be used to optimize the quality of the remanufactured output parts. A remanufacturing process planning software will also optimize the match between input components and targeted output parts. The COMPASS project will initiate more than 6M€ of private investments in technology development and in the implementation of remanufacturing processes in the use cases investigated in the project.
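
To make the digital component passport more concrete, here is a hypothetical sketch of such a record plus a simple screening rule; all field names and the eligibility criterion are assumptions for illustration, not the COMPASS data model.

```python
# Hypothetical digital component passport record (field names are assumptions).
from dataclasses import dataclass, field

@dataclass
class ComponentPassport:
    component_id: str
    material: str                                  # e.g. "CF/PEEK panel"
    geometry_cad: str                              # reference to the CAD file
    service_hours: float = 0.0
    defects: list = field(default_factory=list)    # inspection findings
    repairs: list = field(default_factory=list)    # re-work during lifetime

    def is_reforming_candidate(self, max_defects: int = 2) -> bool:
        """Toy screening rule: few recorded defects -> eligible for re-shaping."""
        return len(self.defects) <= max_defects

panel = ComponentPassport("FUS-0042", "CF/PEEK panel", "panel_0042.step",
                          service_hours=48_000.0, defects=["scratch"])
print(panel.is_reforming_candidate())   # True
```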

The overall goal is to enable the remanufacturing of about 30% of sheet metal parts and thermoplastic composite panels.

Expected outcomes:

  • The reforming processes will enable the re-manufacturing of alloy and composite panels, without converting them to raw materials.
  • The digital component passport will provide a solution to support the remanufacturing of components. The data provided by the digital passport and the tools built upon them will enable more targeted and flexible disassembly and dismantling processes.
  • For carbon fiber (est. 10,000 t per year), the recycling rate will increase from currently 2% to about 30% through the direct re-manufacturing of panels in other applications.
  • For companies involved in the disassembly and dismantling of aircraft, a new business model will become available based on the re-manufacturing of material. Currently, the business model is focused on sourcing and selling of spare parts, while structural materials are just a cost factor and something to “get rid of”. Recovering parts at component level will increase their value by a factor of 10 and make re-manufacturing economically interesting.
  • COMPASS will provide technical means to close the knowledge gap that exists between different actors along the whole value chain, including the component manufacturer, the OEM, the MRO operation company, the dismantling company and the users of the re-manufactured components.

 

Project duration:

01.01.2024 – 31.01.2026

Project budget:

€5,701,244.13

Funding:

HORIZON-CL4-2023-TWIN-TRANSITION-01-04

This project has received funding from the European Union’s Horizon Europe research and innovation programme under grant agreement No 101136940

Project website:

www.compass-horizon.eu

Your Contact

Denis Krajnc
Machine Vision

+43 7252 885 853
Denis.Krajnc@profactor.at

The H2020 research project DrapeBot aims at the development of a human-robot collaborative draping process for carbon fiber composite parts. The robot will drape the large, less curved areas, while the human will drape the areas of high curvature that are difficult to reach. The transfer of large patches of fabric, which can be several meters long, is done jointly by the robot and the human.

The project will put specific emphasis on the efficiency of the collaboration, so that a real increase over human-only and robot-only draping processes is achieved. For this purpose vision sensors as well as force and torque sensors will provide input to a real-time feedback control loop that adjusts the robot’s motions.
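
As a rough illustration of such a loop, the following Python sketch shows a single step of an admittance-style controller that turns measured hand forces into robot velocity commands; the gain, deadband and sensor reading are placeholder assumptions, not the DrapeBot implementation.

```python
# One cycle of a force-to-velocity (admittance) loop; values are illustrative.
import numpy as np

ADMITTANCE_GAIN = 0.002   # [m/s per N]
DEADBAND_N = 2.0          # ignore readings below the sensor noise floor

def admittance_step(force_xyz: np.ndarray) -> np.ndarray:
    """Map the force the human exerts on the patch to a velocity command."""
    if np.linalg.norm(force_xyz) < DEADBAND_N:
        return np.zeros(3)
    return ADMITTANCE_GAIN * force_xyz   # move along the pulled direction

# Placeholder force/torque reading [N]; in a real cell this is read at e.g. 250 Hz.
velocity_cmd = admittance_step(np.array([12.0, -3.0, 0.5]))
print(velocity_cmd)   # -> [ 0.024 -0.006  0.001] m/s toward the human's pull
```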

To ensure that the draping results are of good quality, a specific modular and flexible gripper will be designed. The gripper includes sensors for measuring the position of the patches and for determining fiber orientation during the draping process. The development of the gripper will also take into account requirements arising from the human-robot collaboration aspects of the project.

To ensure that the draping robot is usable in real-world applications, experimental studies will assess usability and trust. These results will be fed back into the design to make sure that the human-robot collaboration also works at the non-technical level.

 

Project title:

Collaborative Draping of Carbon Fiber Parts

 

Call:

H2020-ICT-46-2020

 

Duration:

01.01.2021 – 31.12.2024

Your Contact

Dr. Christian Eitzinger
Head of Machine Vision

+43 7252 885 250
christian.eitzinger@profactor.at


FlExible assembLy manufacturIng with human-robot Collaboration and digital twin modEls (FELICE)

 

FELICE addresses one of the greatest challenges in robotics: the coordinated interaction and combination of human and robot skills. The project targets the application priority area of agile production and aspires to design the next generation of assembly processes required to effectively address current and pressing needs in manufacturing. To this end, it envisages adaptive workspaces and a cognitive robot collaborating with workers in assembly lines. FELICE unites multidisciplinary research in collaborative robotics, AI, computer vision, IoT, machine learning, data analytics, cyber-physical systems, process optimization and ergonomics to deliver a modular platform that integrates and harmonizes an array of autonomous and cognitive technologies. The platform aims to increase the agility and productivity of a manual assembly production system, ensure safety, and improve the physical and mental well-being of factory workers. The key to achieving these goals is to develop technologies that combine the accuracy and endurance of robots with the cognitive ability and flexibility of humans. Being inherently more adaptive and configurable, such technologies will support future manufacturing assembly floors in becoming agile, allowing them to respond in a timely manner to customer needs and market changes.

The FELICE framework comprises two layers:

  1. A local layer, consisting of a single collaborative assembly robot that roams the shop floor assisting workers, and adaptive workstations that automatically adjust to the workers’ somatometries and provide multimodal informative guidance and notifications on assembly tasks.
  2. A global layer, which senses and operates upon the real world via an actionable digital replica of the entire physical assembly line.

Related developments will proceed along the following directions:

  1. Implementing perception and cognition capabilities based on many heterogeneous sensors on the shop floor, allowing the system to build context awareness;
  2. Advancing human-robot collaboration, enabling robots to operate safely and ergonomically alongside humans, sharing and reallocating tasks between them, and allowing the reconfiguration of an assembly production process in an efficient and flexible manner;
  3. Realizing a manufacturing digital twin, i.e. a virtual representation tightly coupled with production assets and the actual assembly process, enabling the management of operating conditions, the simulation of the assembly process and the optimization of various aspects of its performance (a minimal sketch of this idea follows below).
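
The following minimal sketch illustrates the digital-twin idea from item 3: a replica that ingests station states and can answer simple optimization queries. Class and field names, and the bottleneck rule, are illustrative assumptions, not the FELICE platform.

```python
# Toy digital twin of an assembly line (names and rules are assumptions).
from dataclasses import dataclass

@dataclass
class StationState:
    station_id: str
    cycle_time_s: float
    queue_length: int

class AssemblyLineTwin:
    def __init__(self):
        self.stations = {}

    def update(self, state: StationState) -> None:
        """Ingest a sensor/MES event to keep the replica in sync."""
        self.stations[state.station_id] = state

    def bottleneck(self) -> str:
        """A simple optimization query: which station is slowest right now?"""
        return max(self.stations.values(), key=lambda s: s.cycle_time_s).station_id

twin = AssemblyLineTwin()
twin.update(StationState("ST-01", cycle_time_s=42.0, queue_length=3))
twin.update(StationState("ST-02", cycle_time_s=58.5, queue_length=1))
print(twin.bottleneck())   # -> ST-02
```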

FELICE foresees two environments for experimentation, validation and demonstration. The first is a small-scale prototyping environment aimed at validating technologies before they are applied in a larger setting, which is provided by the second: the industrial environment of one of the largest automotive manufacturers in Europe. The consortium views this quest as a timely reaction to international competition, trends and progress, pursuing results that are visionary and far beyond the current state of the art.


Project name:
FELICE – Flexible Assembly Manufacturing with Human-Robot Collaboration and Digital Twin Models

Funding:
H2020-EU.2.1.1. (EU Grant ID number: 101017151)

Total Budget:
€6,342,975

Duration:
01.01.2021 – 30.06.2024

Coordinator:

INSTITUTE OF COMMUNICATION AND COMPUTER SYSTEMS, Greece

Partners:

PROFACTOR GMBH, Austria

CENTRO RICERCHE FIAT SCPA, Italy

FH OO FORSCHUNGS & ENTWICKLUNGS GMBH, Austria

AEGIS IT RESEARCH GMBH, Germany

FORSCHUNGSGESELLSCHAFT FUR ARBEITSPHYSIOLOGIE UND ARBEITSSCHUTZ E.V., Germany

IDRYMA TECHNOLOGIAS KAI EREVNAS, Greece

CAL-TEK SRL, Italy

TECHNISCHE UNIVERSITAT DARMSTADT, Germany

UNIVERSITA DEGLI STUDI DI SALERNO, Italy

FRAUNHOFER GESELLSCHAFT ZUR FOERDERUNG DER ANGEWANDTEN FORSCHUNG E.V., Germany

STANCZYK BARTLOMIEJ, Poland

EUNOMIA LIMITED, Ireland

Your Contact

DI Sharath Chandra Akkaladevi
Scientist

+43 72 52 885 325
sharath.akkaladevi@profactor.at


While traditional automation is suitable for repetitive tasks, it reaches its limits when robots need to act in a Human-Robot Collaboration (HRC) environment. Here, challenges like high product and variant diversity, small lot sizes and especially the demand for short setup times make it necessary to develop new ways of interacting with robots.

The focus of this project is to learn activities by observing the worker. Based on the results of the previous project, methods will be researched for aligning visual information about objects with robotic motion.

 

For this purpose, powerful 2D image processing algorithms based on “deep learning” are synergistically combined with current developments in the field of “reinforcement learning”.

The main research goals of the project are:

* Visual understanding of a demonstrated process through deep neural networks supported by instrumented tools.

* Generalization of process knowledge through deep reinforcement learning.

* Synthesis of movements for new parts using movement primitives (a minimal sketch follows below).
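
As an illustration of the movement-primitive idea in the last bullet, the sketch below rolls out a discrete dynamic movement primitive (DMP): a critically damped spring-damper system pulls toward the goal while a phase-dependent forcing term reshapes the path, so changing the goal generalizes the motion to new parts. This is a generic textbook formulation with a hand-written forcing term standing in for one learned from demonstration; all parameters are illustrative, not project code.

```python
# Minimal discrete DMP rollout (illustrative parameters, hand-written forcing).
import numpy as np

def dmp_rollout(start, goal, forcing, tau=1.0, dt=0.01, alpha=25.0):
    beta = alpha / 4.0                   # critical damping
    y, dy, x = start, 0.0, 1.0           # y: position, x: phase (1 -> 0)
    path = []
    while x > 1e-3:
        f = forcing(x) * (goal - start)  # forcing term scaled to the new goal
        ddy = alpha * (beta * (goal - y) - dy) + f
        dy += ddy * dt / tau
        y += dy * dt / tau
        x += -2.0 * x * dt / tau         # canonical system: phase decays to 0
        path.append(y)
    return np.array(path)

# A bump-shaped forcing term stands in for one learned from a demonstration.
bump = lambda x: 40.0 * x * (1.0 - x)
print(dmp_rollout(start=0.0, goal=0.3, forcing=bump)[-1])  # converges near 0.3
```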

 

Project name:
LERN4MRKII: Extended modelling, learning and abstraction of processes for human-robot cooperation.

Funding:
AIT Strategy Research Programme

Duration
01.01.2019 – 31.12.2019

Your Contact

DI Dr. Gernot Stübl
Scientist
Robotics and Assistive Systems

+43 72 52 885 313
gernot.stuebl@profactor.at


The aim of the project is to give non-experts the opportunity to easily teach robots a complex assembly process using natural communication methods. In order for robot systems to learn from a non-expert user, the robot system must first understand the user’s intention. In this project, human intentions are understood both through verbal communication and by observing human actions. To recognize human intentions through voice communication, PROFACTOR will work together with LIFEtool on the development of a new communication interface for robots. LIFEtool, with its extensive knowledge of online and offline speech recognition technologies, will support PROFACTOR in developing an interface to communicate human intentions to the robot. PROFACTOR will develop a “portable” activity detection system capable of detecting human actions using low-cost sensors. A feasibility study will also be conducted to partially validate the applicability of such an activity/gesture recognition system to other domains (health care). The advantage of such a module would be its applicability to different scenarios requiring the understanding of human intentions.

The strategy for achieving the goal that non-experts can easily teach robot systems is twofold. First, a ‘learning by interaction’ framework is developed in which the robot provides the user with a set of intelligent suggestions during the learning process. The robot uses its knowledge modeling and reasoning skills and takes into account the “current situation” of the assembly environment (detected with the activity detection system) to make these suggestions. Second, a ‘learning by instruction’ framework is developed that allows the use of natural language as a communication mode between the user and the robot. Both frameworks are then combined into a bidirectional communication channel between the user and the robot in order to provide feedback or to re-learn (in whole or in part) the assembly process.
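
A minimal sketch of the ‘learning by interaction’ loop could look as follows: the activity detected by the recognition module is matched against known assembly steps, and plausible next actions are suggested for the user to confirm or correct. The step library is a toy assumption, not the project’s knowledge model.

```python
# Toy precedence knowledge for suggesting the next assembly step (assumed steps).
NEXT_STEP = {
    "pick_housing": ["place_housing"],
    "place_housing": ["insert_pcb"],
    "insert_pcb": ["fasten_screws"],
}

def suggest(detected_activity: str) -> list:
    """Return candidate next steps for the user to confirm or correct."""
    return NEXT_STEP.get(detected_activity, ["<unknown - please demonstrate>"])

# Placeholder output of the activity-recognition module:
print(suggest("place_housing"))   # -> ['insert_pcb']
```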

Project name:
BRIDGE – HUMAN ROBOT INTERACTION (TEACHBOTS)

Industrial Project

Duration:
04.2019 – 12.2020

Partners:

Profactor GmbH

LIFEtool GmbH

Keywords: Human Robot Collaboration; Programming by Interaction; Programming by Instruction; Artificial Intelligence

Your contact

Sharath Chandra Akkaladevi
Scientist
Robotics and Assistive Systems

+43 72 52 885 325
Sharath.Akkaladevi@profactor.at


SYNERGY aims at strengthening currently underdeveloped linkages, cooperation and synergies between companies, industry, research, intermediaries and policy makers in central Europe. The project will analyse funded and finalised innovation projects and cluster the institutions involved in these projects into three key areas covering the most promising modern industrial technologies.

These areas include

  • additive manufacturing and 3D printing,
  • micro- and nanotechnology-related processes and materials, as well as
  • the Industry 4.0 sector.

Institutions and clusters included in each area will form ‘synergic networks’ based on a novel project assessment methodology and a ‘synergic consortia matchmaking’ online IT tool. Moreover, the project will define new crowd innovation services and test them in different types of pilot actions. As a result, project activities will boost the creation of innovative services and facilitate transnational cooperation in the industrial sector.

 

Project name:
Synergy – SYnergic Networking for innovativeness Enhancement of central european actoRs focused on hiGh-tech industrY

Funding:

Interreg – Central Europe Programme

 

Duration:  
01.08.2017 – 31.07.2020

Website:
http://www.interreg-central.eu/Content.Node/SYNERGY.html

Your Contact

DI Christian Wögerer, MSc

International Networks

+43 72 52 885 200
christian.woegerer@profactor.at


Collaborative lightweight robots are a trend in industry. They are comparatively cheap, and developments in the field of machine learning make them increasingly flexible and easier to use. The challenges for research are obvious: the machines must be equipped with cognitive intelligence in order to adapt to a changing environment and understand new tasks. The aims of the project go far beyond the current state of the art.

The focus is on “transfer learning”: from manual human activities to the robot and from a product or process variant to a similar one.

The main research objectives of the project are:

1. mapping of human motion to the robot (a minimal sketch follows below)
2. the robot’s “understanding” of temporal task correlations and process parameters
3. adaptability to similar processes with as few new examples as possible
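
As a minimal illustration of objective 1, the sketch below maps a tracked human wrist trajectory from the tracking-camera frame into the robot base frame via a fixed calibration transform; the transform values and trajectory are illustrative assumptions.

```python
# Retarget tracked human wrist points into the robot base frame (toy values).
import numpy as np

# Homogeneous calibration transform: camera frame -> robot base frame.
T_CAM_TO_ROBOT = np.array([
    [0.0, -1.0, 0.0, 0.80],
    [1.0,  0.0, 0.0, 0.10],
    [0.0,  0.0, 1.0, 0.05],
    [0.0,  0.0, 0.0, 1.00],
])

def retarget(points_cam: np.ndarray) -> np.ndarray:
    """Map Nx3 wrist points (camera frame) to Cartesian robot targets."""
    homog = np.hstack([points_cam, np.ones((len(points_cam), 1))])
    return (T_CAM_TO_ROBOT @ homog.T).T[:, :3]

wrist_path = np.array([[0.2, 0.0, 0.5], [0.3, 0.1, 0.5]])   # two demo frames
print(retarget(wrist_path))
```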

No external cooperation is envisaged for the implementation of the project.

Project name:
LERN4MRK: Modelling, learning and abstraction of processes for human-robot cooperation

Funding:
bmvit

Duration:  
01.07.2017 – 30.06.2021

Publications

S.C. Akkaladevi, M. Plasch, and A. Pichler, “Skill-based learning of an assembly process” Elektrotech. Inftech. (2017) 134: 312, Springer Vienna. https://doi.org/10.1007/s00502-017-0514-2 

C. Heindl, T. Poenitz, G. Stuebl, A. Pichler, and J. Scharinger, “Spatio-thermal depth correction of RGB-D sensors based on Gaussian Processes in real-time” in The 10th International Conference on Machine Vision, to be published, 2017.

Your contact

DI Dr. Gernot Stübl
Scientist
Robotics and Assistive Systems

+43 72 52 885 313
gernot.stuebl@profactor.at


Technology laboratory for the human-centred, assistive production of the future

The aim of the project is to set up a networked technology laboratory for developing and testing novel assistive technologies, methods and concepts for a future digital, human-centred production.

The main objective is to build scientific expertise in the following fields:

  • Intuitive interaction and innovative operator assistance
  • Cognitive systems in the industrial production process
  • Situational decision support for system design
  • Human-robot cooperation in manufacturing

A demonstration workstation has been set up in the Smart Factory Lab, showcasing new possibilities for operator guidance and for interaction between human and machine. Usability for the production worker is of particular importance here; relieving the worker is the primary goal.

Exemplary work sequences in the production context are guided by dynamic projection directly onto the work area or the part. The worker is guided, and the ‘assembly instructions for the current build plan’ are displayed directly in the work area. Additional system intelligence comes from a 3D sensor system that uses object pose estimation and object tracking (2D or 3D) to keep permanent track of the part and the worker’s hands, so that the projected content can be adapted dynamically. Alternative and complementary technologies are continuously evaluated and compared; for example, the basic concept has been implemented with newer devices based on the combination of a tracking system and smart glasses.

For more intuitive user input (human-to-machine communication), buttons and input options are moved into the immediate work area of the production worker (e.g. by projection), who can then trigger the corresponding machines and (assistance) tools with simple gestures or entirely natural movements.
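
A minimal sketch of such projected ‘virtual buttons’ could look as follows: the hand position delivered by the 3D sensor triggers an action once it dwells inside a projected button region. The button layout, dwell time and coordinates are assumptions for illustration.

```python
# Dwell-based triggering of projected buttons (layout and timing are assumptions).
import time

BUTTONS = {"confirm": (0.40, 0.20, 0.10)}   # x, y, radius in table coordinates [m]

def poll(hand_xy, state, dwell_s=0.5):
    """Fire a button once the tracked hand has dwelled on it long enough."""
    for name, (cx, cy, r) in BUTTONS.items():
        inside = (hand_xy[0] - cx) ** 2 + (hand_xy[1] - cy) ** 2 <= r ** 2
        if inside:
            state.setdefault(name, time.monotonic())   # remember first contact
            if time.monotonic() - state[name] >= dwell_s:
                state.pop(name)
                return name     # e.g. trigger the machine or assistance tool
        else:
            state.pop(name, None)
    return None

state = {}
print(poll((0.41, 0.21), state))   # None on first contact; "confirm" after dwelling
```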

In general, great importance is attached in this field to making complex production as simple as possible for the worker, using low-cost approaches and appropriate system intelligence.

For the visual inspection of parts with complicated shapes, an inspection robot is used that positions the part under the respective inspection system and captures the entire surface in a continuous scanning motion. Various sensors are integrated in the inspection cell:

The fibre angle sensor “FScan” inspects the course of fibres on composite parts made of carbon or glass fibre and enables the detection of defects such as inclusions in the part.

The “LScan” sensor is used for monitoring lay-up processes (“Automated Fiber Placement”) of CFRP parts and detects typical process deviations such as gaps, overlaps, twisted tows or fuzzballs.

The “TP-Scan” sensor is used for the inspection of metallic surfaces and reliably distinguishes between voids, scratches or damage, which should be detected as defects, and contamination or discolouration, which represent acceptable deviations. The robot motion for the respective inspection task is planned automatically and collision-free, based on the CAD model of the part and a physical model of the inspection process.
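
The following sketch illustrates the basic planning idea: candidate scan poses are derived from surface points and normals of the CAD model, keeping the sensor at its working distance and looking along the negative normal. Mesh handling and collision checking are omitted; the standoff value is an assumption.

```python
# Derive a sensor pose from a CAD surface point and normal (illustrative standoff).
import numpy as np

STANDOFF_M = 0.15   # assumed sensor working distance

def scan_pose(point: np.ndarray, normal: np.ndarray):
    """Place the sensor on the surface normal, optical axis toward the point."""
    n = normal / np.linalg.norm(normal)
    position = point + STANDOFF_M * n
    view_dir = -n
    return position, view_dir

p, v = scan_pose(np.array([0.1, 0.0, 0.3]), np.array([0.0, 0.0, 1.0]))
print(p, v)   # -> [0.1 0. 0.45] [-0. -0. -1.]
```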

In this field, technologies of distributed artificial intelligence are examined for their applicability in production processes. In particular, the subfields of multi-agent systems, actor systems, holonics and semantic interoperability are investigated, and a demonstrator is being developed.

The focus is on the interaction of humans and machines/robot systems in a shared workspace.

The demonstrators in the Smart Factory Lab enable the quick and easy creation of various work processes with the robot. The goal is for the systems to be programmable by non-experts within a few minutes. Interactive operation through mixed-reality interaction as well as simple drag & drop functions facilitate the creation and insertion of new skills on the robot system.

Your Contact

DI (FH) Harald Bauer
Head of Visual Computing

+43 7252 885 302
harald.bauer@profactor.at

This project is co-financed by the European Regional Development Fund.

More information is available at: www.efre.gv.at
