Sensory technologies could improve quality of life for deafblind people
Researchers at Leeds are exploring how wearable smart technologies that use the sense of touch could help deafblind people better adjust to their environment.
As many as 2.5m people in Europe have deafblindness, a condition that seriously impairs a person’s ability to interact in daily life, often leading to social isolation.
Up to now, little has been done to address the challenge of limited communication and accessibility. Many existing devices assist with visual or auditory impairments, but the potential of sensory technologies that use 'haptic' properties such as vibration, temperature and pressure has not yet been fully investigated.
Dr Raymond Holt and colleagues are among the University of Leeds researchers investigating how wearable haptic technologies, in the form of a smart textile, can send signals to deafblind people to communicate with them.
SUITCEYES (Smart, User-friendly, Interactive Cognition-Enhancer Yielding Extended Sensosphere) is a €2.4m, EU-funded collaboration between seven partners.
Part of the research will involve developing a smart textile that communicates information about the environment through 'haptic' signals, using pressure, vibration and temperature.
Dr Holt said: "Technologies for delivering haptic communication are developing at a rapid pace, particularly in areas such as virtual reality.
“We hope to develop new insight into a solution that will make it possible for people who are deafblind to be more independent. This is important not only to the 2.5m people with deafblindness in the EU, but also for those who have a limited ability to communicate.
“By conducting our research, we are taking the first steps to address this major problem. Ultimately, we want to equip people who are deafblind with the tools to be able to take an active part in the community.”
The research involves interviews with deafblind people and an analysis of user needs, to make sure the technology is developed in a way that will be most useful for people with deafblindness. It also involves an analysis of policy, as this can impact the development of and access to technology.
First, the research will draw on machine learning, sensory technology, image and signal processing, psychophysics, and affective computing to develop a technological platform.
In this part of the research, several studies will be conducted to develop customisable design elements.
Second, it will aim to improve the user's ability to perceive and engage with their environment. This will involve using wearable sensors to detect the user's distance from and position relative to nearby objects, as well as proximity and movement.
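As a purely illustrative sketch of the idea, a proximity reading from a wearable sensor could be mapped to the strength of a haptic cue. The function name, thresholds and mapping below are assumptions for illustration, not part of the SUITCEYES platform:

```python
def vibration_intensity(distance_m, max_range_m=2.0):
    """Map a proximity reading (in metres) to a vibration intensity
    between 0.0 (off) and 1.0 (strongest).

    Hypothetical mapping: the closer an obstacle, the stronger the cue.
    """
    if distance_m >= max_range_m:
        return 0.0  # obstacle out of sensing range: no cue
    if distance_m <= 0:
        return 1.0  # in contact: strongest cue
    # Linear falloff from 1.0 (touching) to 0.0 (edge of range)
    return 1.0 - (distance_m / max_range_m)
```

In a real system such a mapping would be tuned through the project's psychophysics studies, since perceived vibration strength does not scale linearly with physical amplitude.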
The project has an advisory group that includes expert advisors who are actively involved in shaping the research as it is put into practice.
Limited communication
While it is rare at birth, deafblindness can be attributed to several causes and can develop at any stage of life, in both youth and old age. The condition is not limited to one group of people, so part of the challenge involves finding out about the needs of deafblind people in a variety of situations.
People who are deafblind can have differing degrees of visual and auditory impairment. A person with partial impairment of both senses may be affected more severely than either impairment alone would suggest, as neither sense can compensate for the other. This means their ability to relate to and navigate their surroundings is limited.
Likewise, the solution is not limited to deafblindness. The same communication modes could also help people with other conditions that restrict typical communication through sight or hearing.
SUITCEYES will deliver a Haptic Intelligent Personalised Interface (HIPI) that will form a significant first step toward addressing the problem. It will give deafblind people the tools to be more independent and take a more active role in society.
Further information
SUITCEYES involves seven partners from different European countries who together will tackle the eight different work packages of the project.
The University of Leeds has world-leading expertise in medical technologies and the largest group of disability studies in the UK.
Dr Bryan Henson, Dr Sarah Woodin, and Dr Bryan Matthews are leading the University of Leeds' role in SUITCEYES alongside Dr Holt.
Research Fellows Zhengyang Ling and Adriana Atkinson are also among the University of Leeds staff steering the project.
SUITCEYES is a three-year Horizon 2020 project that addresses the research gap in developing assistive technologies for limited auditory and visual communication.
The project partners are:
University of Borås, Sweden (project coordinator)
Centre for Research & Technology Hellas, Greece
Offenburg University of Applied Sciences, Germany
Vrije Universiteit Amsterdam, Netherlands
Les Doigts Qui Rêvent, Talant, France
Harpo Sp. z o.o., Poznań, Poland
University of Leeds, United Kingdom