Towards Industry 5.0: Intelligent Cognitive Assistance in the Workplace
Article by the COALA squad (*) at TU Delft: Samuel Kernan Freire, Sarath Surendranadha Panicker, Jasper Henny, Jeroen Schuncselaar, Pim Verhoeven, and Dr. Evangelos Niforatos.
Artificial Intelligence (AI) is a key driver of value creation and technological leadership for the European Union (EU) in the 2020s. Besides robotics and autonomous vehicles, AI is also revolutionizing the field of data analytics. Machine learning, natural language processing, and computer vision automate data analysis normally conducted by data scientists. This can greatly accelerate knowledge-intensive activities that are critical for companies and their management. Knowledge-intensive activities are performed by experienced personnel and are notoriously difficult to automate. Retirement, job changes, and the increasing skill requirements for people employed in Industry 4.0 have a dire impact on value creation in the manufacturing industry. Our squad at IDE, TU Delft has teamed up with BIBA at the University of Bremen (Germany), ICCS at the National Technical University of Athens (Greece), and ANITI at the Federal University of Toulouse (France) to tackle this challenge in the COALA H2020 EU project, with support from technical and industrial partners.
Cognitive assisted manufacturing with trustworthy AI
COALA at IDE, TU Delft
The COALA acronym stands for “COgnitive Assisted agile manufacturing for a LAbor force supported by trustworthy AI.” Our squad at IDE, TU Delft is developing a solution for cognitive assistance in industrial manufacturing settings that combines trustworthy-AI components with a voice-enabled intelligent digital assistant as the interface. The solution supports operators in troubleshooting, facilitates on-the-job training for novices, and provides production analytics to production managers. In particular, the AI-assisted on-the-job training for novice workers encompasses a set of Digital Intelligence Assistance (DIA) functions. The DIA functions connect with the cognitive advisor service, which accesses shop-floor data (e.g., machinery location, production rates, production issues) and adapts the learning progress to the user. For example, a novice worker can interact with the COALA cognitive assistant in natural language and request solutions to problems based on knowledge collected from experienced workers. This will enable machine operators and production-line managers to become effective faster, which in turn speeds up changes in manufacturing (e.g., agile manufacturing). Complementary to the technology, we are also developing an education and training concept that focuses on building blue-collar workers’ competencies in human-AI collaboration. Ultimately, the COALA solution will transform how workers perform their jobs while enabling companies to maintain, or even increase, the quality of their production processes and products.
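As a simplified illustration of the knowledge-sharing idea described above (not COALA's actual implementation), a worker's natural-language troubleshooting request could be matched against tips collected from experienced operators. The knowledge base, keywords, and function names here are hypothetical stand-ins:

```python
# Illustrative sketch only: a minimal keyword-based matcher for a
# troubleshooting assistant. The knowledge base and matching logic are
# hypothetical stand-ins for COALA's actual NLP components.

# Tacit knowledge collected from experienced operators, keyed by symptom keywords.
KNOWLEDGE_BASE = {
    ("nozzle", "clog"): "Heat the nozzle to 220 C and push filament through manually.",
    ("bed", "adhesion"): "Re-level the bed and clean the plate with isopropyl alcohol.",
    ("filament", "empty"): "Open the side panel and load a new spool onto the holder.",
}

def advise(utterance: str) -> str:
    """Return the best-matching expert tip for a worker's spoken request."""
    words = set(utterance.lower().split())
    best_tip, best_score = "Sorry, I have no advice for that yet.", 0
    for keywords, tip in KNOWLEDGE_BASE.items():
        score = sum(1 for k in keywords if k in words)
        if score > best_score:
            best_tip, best_score = tip, score
    return best_tip

print(advise("The nozzle seems to have a clog"))
```

In practice, a production assistant would use intent classification and dialogue management rather than keyword overlap, but the flow, from spoken symptom to stored expert knowledge, is the same.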
Simulation of manufacturing industrial settings
The COALA Applied Labs
The COALA squad has a new home in the revamped Applied Labs space (Room 32-B-0-81) at IDE. The Applied Labs at IDE host a large array of (manufacturing) equipment (e.g., 3D printers) and support a wide range of maker activities that can simulate aspects of how manufacturing is conducted in industrial settings (e.g., assembling the components of a 3D-printed product). Thus, the Applied Labs space is the ideal environment to prototype and evaluate our cognitive assistance interventions in a semi-controlled fashion with a plethora of participants and roles. Some of our use cases are related, but not limited, to 3D-printing and textile-making activities:
- Support on-site knowledge collection and training for 3D printing and textile-making.
- Adopt machine vision for operator activity recognition.
- Collect Augmented Manufacturing Analytics (AMA) to unveil tacit knowledge and transform it into best practices based on objective measures (e.g., 3D-printing times, re-printing attempts, etc.).
- Trial different modalities for delivering cognitive assistance (e.g., tablet/mobile, smart-speaker, headphones, heads-up displays).
- Monitor the cognitive “fingerprint” (e.g., perceived workload) of our interventions using physiological measures (e.g., electrodermal activity, electroencephalography, eye-tracking, etc.).
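To make an AMA measure from the list above concrete, here is a minimal sketch of deriving re-print attempts per job from a shop-floor event log. The event schema and data are invented for illustration, not COALA's actual data model:

```python
# Illustrative sketch: deriving an objective AMA measure (re-print attempts)
# from a hypothetical shop-floor event log. Schema and data are made up.
from collections import Counter

# Hypothetical event log: (job_id, event) pairs.
events = [
    ("job-1", "print_started"), ("job-1", "print_failed"),
    ("job-1", "print_started"), ("job-1", "print_completed"),
    ("job-2", "print_started"), ("job-2", "print_completed"),
]

def reprint_attempts(log):
    """Count how many extra print attempts each job needed (0 = right first time)."""
    starts = Counter(job for job, ev in log if ev == "print_started")
    return {job: n - 1 for job, n in starts.items()}

print(reprint_attempts(events))  # job-1 needed one re-print, job-2 none
```

Aggregating such measures over time and across operators is what would let tacit know-how (e.g., which settings avoid re-prints) surface as candidate best practices.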
Activity-recognition machine vision models
We will install the latest version of the ZED 2 stereoscopic camera (ZED 2i) by Stereolabs on the ceiling, above and in front of machinery at the Applied Labs that requires knowledge to operate. A tablet placed next to the machine will embody the cognitive assistant and will be available to everyone who seeks advice on how to operate the machine. The ZED 2i supports object detection and tracking, including 18-point skeleton tracking with depth perception from 0.2 up to 20 meters. This feature is particularly relevant for human activity recognition while preserving the privacy of those recorded: the captured imagery resembles a human-skeleton frame from which individuals cannot be identified by the human eye. The stereoscopic video data will be collected and stored locally for training activity-recognition machine vision models that enable context-aware interventions. For example, a machine vision model can infer what a user in front of the 3D printer is trying to achieve and invoke our cognitive advisor to offer assistance (e.g., instructions on how to replenish the plastic filament). Our aim is to test our models and interventions both in dedicated user studies with recruited participants and over time with users who operate complex machinery (e.g., 3D printers, textile-makers, etc.).
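As a toy illustration of how skeleton keypoints can drive context-aware assistance, the sketch below maps a pose to a coarse activity label with a simple heuristic. The keypoint names, coordinates, and threshold are hypothetical assumptions (this is not the ZED SDK's API, and our actual models are trained, not hand-coded rules):

```python
# Illustrative sketch: infer a coarse operator activity from skeleton
# keypoints, e.g. to trigger the cognitive advisor when someone reaches
# for the filament spool. Keypoint names, coordinates (metres), and the
# threshold are hypothetical assumptions for this example.

SPOOL_HEIGHT_M = 1.5  # assumed mounting height of the filament spool

def infer_activity(keypoints: dict) -> str:
    """Map keypoints (name -> (x, y, z) in metres, y = height) to an activity label."""
    wrist_y = max(keypoints["left_wrist"][1], keypoints["right_wrist"][1])
    if wrist_y > SPOOL_HEIGHT_M:
        return "replenishing_filament"  # a hand raised to spool height
    return "observing_printer"

pose = {"left_wrist": (0.3, 1.6, 1.2), "right_wrist": (0.5, 1.0, 1.2)}
print(infer_activity(pose))  # -> replenishing_filament
```

A trained activity-recognition model would replace this rule with a classifier over sequences of skeleton frames, but the output, an activity label that invokes the cognitive advisor at the right moment, is the same.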
(*) The COALA squad
Dr. Evangelos Niforatos is a Principal Investigator in the COALA project and an Assistant Professor of AI-Powered Human Augmentation at the Faculty of IDE, TU Delft.
Samuel Kernan Freire is a PhD candidate in the COALA project doing research on Human-Centred AI and Natural Language Interfaces at the Faculty of IDE, TU Delft.
Sarath Surendranadha Panicker is a Research Assistant in the COALA project working on Industry 4.0, Machine Learning and Data Visualisation at the Faculty of IDE, TU Delft.
Jasper Henny is a Research Assistant in the COALA project working on dialog design for Conversation AI systems at the Faculty of IDE, TU Delft.
Jeroen Schuncselaar is an MSc student in the COALA project at the Faculty of IDE, TU Delft.
Pim Verhoeven is an MSc student in the COALA project at the Faculty of IDE, TU Delft.
The publication of this article has been supported by TU Delft.