
HUMAN-ROBOT COLLABORATION

(Production Line)

Human-Cobot Collaboration for Robust Quality Inspections

The pilot will exploit STAR’s modules and technologies to alleviate the complexities and inefficiencies of the existing quality inspection processes at the Philips factory in Drachten (the Netherlands).


Deployment will take place in a setting comprising:

  • A variety of products: Products will vary in shape, colour, decoration pattern, level of gloss, etc. Moreover, product orientation may differ between products when they are offered for quality inspection.

  • Quality inspectors (typically lower-educated workers): These workers are in charge of performing quality inspections, while also reconfiguring the cobot for part handling. Furthermore, they will provide instructions to the visual quality inspection solution for human-supervised learning.

  • Cobots (part handling): These cobots will have cycle times that match the human operator and will be able to handle different products alternately. They should be easy for lower-educated workers to adapt (reconfigure) to new products.

  • Cobots (quality inspection): These cobots will use AI-based solutions (i.e. deep neural networks) for visual quality inspection, capable of dealing with high-gloss, multi-curved products and of meeting extremely strict visual requirements: detecting dust, minor scratches, colour differences due to variation in (plastic) material thickness (e.g. the supporting structure around a screw cavity), or any irregularity visible to the human eye from 30 cm. Moreover, this cobot solution will be equipped with an easy-to-use HMI that enables human-supervised learning and indicates to the human when input is required. Emphasis will be placed on the training speed of the visual quality inspection solution (n < 50).
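The pilot text does not prescribe an implementation, but the n < 50 target suggests a few-shot, transfer-learning style setup in which a pretrained backbone is adapted from a handful of operator-labelled images. The following is a minimal sketch of that idea, assuming a PyTorch/torchvision environment and that n refers to the number of labelled training samples; the dataset path, class layout and hyperparameters are illustrative and not the pilot's actual pipeline.

```python
# Hypothetical sketch: fine-tuning a pretrained backbone on < 50 operator-labelled
# images (e.g. "ok" vs "defect"). Paths, class names and hyperparameters are
# illustrative assumptions, not the Philips pilot implementation.
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

# A small folder of operator-labelled crops, e.g. data/inspection_labels/ok/*.png
train_set = datasets.ImageFolder("data/inspection_labels", transform=transform)
loader = DataLoader(train_set, batch_size=8, shuffle=True)

# Freeze the pretrained feature extractor and train only a small classification
# head, keeping the number of trainable parameters compatible with very few labels.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for p in model.parameters():
    p.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, len(train_set.classes))

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(20):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```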


  • UC1 Easy reconfiguration for automated part handling: This UC will focus on scenarios where the cobot is reconfigured by the workers to automatically handle new parts. The process should be easy and safe, and it should enable fast knowledge acquisition based on techniques such as active learning and SR, as well as XAI for ensuring and auditing the proper operation of the cobot.

  • UC2 Human-supervised learning (visual quality inspection): As part of this UC, the workers will provide instructions to the cobot regarding the visual quality inspection through a suitable HMI (including NLP). The UC shall demonstrate that human supervision can be performed in an easy and straightforward way by workers with lower education (a sketch of an uncertainty-based operator query step is given after this list).

  • UC3 Safe collaboration between human and cobot: This use case will emphasize the safety of interactions between humans and cobots during quality inspection tasks. To this end, digital twins that take into account the condition of the workers will be used to identify potential safety issues and to plan mitigation measures. Security and safety should be provided to all use cases as prerequisites for their graceful operation.
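As a rough illustration of how the active-learning and human-supervision ideas of UC1 and UC2 could fit together, the sketch below scores each incoming part with the inspection model and prompts the operator through the HMI only when the model's confidence is low; the operator's answer resolves the part and can be stored for later retraining. The function names, the ask_operator callback and the 0.8 threshold are assumptions made for illustration, not part of the pilot's specification.

```python
# Hypothetical sketch of an uncertainty-based query step (UC1/UC2): the model
# classifies parts on its own when confident and defers to the human inspector
# otherwise. Names and the confidence threshold are illustrative assumptions.
import torch
import torch.nn.functional as F

CONFIDENCE_THRESHOLD = 0.8  # below this, the HMI prompts the operator

def classify_or_query(model, image_batch, ask_operator):
    """Return predicted labels, querying the human for uncertain parts."""
    model.eval()
    with torch.no_grad():
        probs = F.softmax(model(image_batch), dim=1)
    confidence, predicted = probs.max(dim=1)

    labels = predicted.clone()
    for i, conf in enumerate(confidence):
        if conf < CONFIDENCE_THRESHOLD:
            # Active-learning query: the operator's answer both resolves this
            # part and can be added to the training set for the next update.
            labels[i] = ask_operator(image_batch[i])
    return labels
```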