Allianz Trade & Inclusive Brains join forces to foster the inclusion of people with disabilities through AI and neurotechnologies.
Allianz Trade, the world’s leading trade credit insurer, and Inclusive Brains, a French start-up developing a new generation of neural interfaces powered by generative Artificial Intelligence (AI), have partnered to develop Prometheus, a new kind of brain-machine interface that transforms diverse neurophysiological data (brainwaves, heart activity, facial expressions, eye movements) into mental commands. The goal of this innovative assistive technology is to help individuals who can no longer use their hands or speak to operate workstations and connected objects, and to navigate digital environments, without typing on a keyboard, touching a screen, or using voice commands. Over time, the partnership between Allianz Trade and Inclusive Brains will accelerate the development of AI-powered assistive solutions that give people with disabilities more autonomy and facilitate their access to education and the workforce.
AI for good: When generative AI-powered solutions improve inclusion
Inclusive Brains was founded in 2022 by Professor Olivier Oullier, a neuroscientist turned AI entrepreneur, and Paul Barbaste, a cybersecurity and AI expert, with an ambitious and clear mission: to leverage the combination of generative AI and Brain-Computer Interfaces (BCIs) to improve the inclusion of people who have lost the ability to move because of accidents or neurodegenerative diseases. This is why Inclusive Brains began developing the Prometheus BCI. “Inclusive Brains’ generative AI models and multimodal neural interfaces will benefit everybody, regardless of their physicality, abilities, and needs, however special they might be. True inclusion means developing solutions that assist each and every one of us, with no discrimination whatsoever. Being able to control a computer with your mind, your eyes, or by blinking or clenching will be life-changing for many people with paralysis, as it will enable them to communicate with the world. So far, generative AI’s Large Language Models (LLMs) have proven great at understanding human language, but words do not capture all the nuances of human-machine interactions, even less so when a disability prevents you from saying them out loud or typing them. We therefore train our AI models on various kinds of neurophysiological data, such as brainwaves, facial muscle activity, eye movements, and heartbeats. It is the only way to truly empower machines and digital environments to adapt to how unique each user is, and to how they feel in real time,” states Olivier Oullier, Co-founder & CEO of Inclusive Brains.