Cybathlon will showcase what bionics could do for millions with disabilities

Groundbreaking new technologies are finally leaving the lab. Alessandro Della Bella/ETH Zurich

Following the Olympic Games and Paralympic Games, this year will see the arrival of the Cybathlon, the world’s first competition for parathletes and people with severe disabilities who compete with the aid of bionic implants, prosthetics and other assistive technology.

The Cybathlon will include six disciplines, each one specialised to the competitors’ type of physical need. Agility courses test those with bionic arms and legs, while races for powered wheelchairs and powered wearable exoskeletons include tackling obstacles such as flights of stairs. There is also a bike race for paralysed competitors using electronic muscle stimulation to move their legs, and a competition for those who have lost the ability to move their bodies but who are put back in control by means of a brain-computer interface.

It’s true that the Cybathlon is unlikely to feature the sort of athletic prowess found at the Olympics or Paralympics. But it will demonstrate what the technology is capable of, instead of leaving it hidden in research labs, and focus effort and enthusiasm on improving it in order to revolutionise the lives of those with severe disabilities and life-changing injuries. The organisers, ETH Zurich, will bring together 54 teams of users, researchers and technology manufacturers to think about what is really needed to make technology that solves the everyday problems those living with disabilities face.

It’s this focus on practical problems that has informed the design of the challenges. For example, the prosthetic arm race includes a station where the parathletes must slice a loaf of bread or pour a cup of coffee, and another where they must walk through a door while carrying a tray of objects. These are everyday activities taken for granted by most of us, but for the 15m people the World Health Organisation estimates are living with disabilities, they may be difficult or impossible.

While examples of technology such as bionic arms may be familiar, the brain-computer interface competition will be a surprise to most. A brain-computer interface is a system that translates a person’s brain activity into one of several possible commands for equipment fitted to the competitor. This allows severely paralysed people whose cognitive and sensory abilities are nevertheless intact to control equipment that can help them move or communicate.

It’s rare such interface systems leave a research lab, and many exist only in theory on the pages of research journals. They may seem like science fiction, yet they have existed in one form or another for decades.

Paralysed racers use electrical stimulation to move their legs to power reclining cycles. Alessandro Della Bella/ETH Zurich

Brain as machine controller

There are several components to a brain-computer interface. The first is, of course, the person’s brain. Electrical impulses in the brain are detected through electroencephalogram (EEG) sensors attached non-invasively to the scalp, much as they are in a hospital setting. These signals often include interference from muscular movement, such as of the eyes, so the first step is to isolate the useful signal from the noise.
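As a rough illustration of that first step, and not any competing team’s actual pipeline, the Python sketch below band-pass filters a single EEG channel with NumPy and SciPy to suppress slow drifts and out-of-band artefacts. The sampling rate, frequency band and function name are assumptions chosen for this example.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 256.0              # assumed EEG sampling rate in Hz
LOW, HIGH = 8.0, 30.0   # mu/beta band often used for motor imagery (assumed here)

def bandpass(eeg_channel: np.ndarray) -> np.ndarray:
    """Zero-phase band-pass filter of one EEG channel to suppress drift and artefacts."""
    b, a = butter(4, [LOW / (FS / 2), HIGH / (FS / 2)], btype="band")
    return filtfilt(b, a, eeg_channel)

# Example: one second of simulated raw EEG
raw = np.random.randn(int(FS))
clean = bandpass(raw)
```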

The signals are then processed in a step known as feature extraction. Approaches vary, but a common technique is for the user to imagine he or she is performing a movement, such as clasping and opening a hand. This mental imagery generates a particular pattern in the brain’s motor cortex which appears as an EEG signal that is easily recognisable and distinct from the background EEG activity.
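Continuing the illustrative sketch, one simple and commonly used motor-imagery feature (assumed here, since teams’ actual features vary) is the log band power, i.e. the log-variance, of each filtered channel over a trial window. The channel count and array shapes below are hypothetical.

```python
import numpy as np

def extract_features(filtered_trial: np.ndarray) -> np.ndarray:
    """filtered_trial: (n_channels, n_samples) band-pass filtered EEG for one trial.
    Returns one log band-power (log-variance) feature per channel."""
    return np.log(np.var(filtered_trial, axis=1) + 1e-12)

# Imagined hand movement shifts this power over the corresponding
# motor-cortex electrodes, which is what the classifier later exploits.
trial = np.random.randn(8, 256)      # 8 hypothetical channels, one second of samples
features = extract_features(trial)   # shape (8,)
```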

The EEG signals are processed during feature extraction to make them more easily understood by the next component, the classifier, which identifies the intention of the user. A classifier identifies how the signal patterns differ when the user thinks of moving their left or their right hand, for example, or how these differ from signals generated as the user makes mental calculations. A good classifier learns these differences through pattern matching and machine learning algorithms, and identifies the user’s most likely intention.
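A hedged sketch of this step: the snippet below trains a linear discriminant classifier (scikit-learn’s LinearDiscriminantAnalysis, one of several algorithms used in practice rather than the one any given team uses) on labelled feature vectors, then predicts the intention behind a new trial. The data here are random placeholders rather than real EEG features.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Placeholder training data: in reality these would be features from labelled
# practice trials in which the user performed known mental tasks.
X = np.random.randn(200, 8)          # (n_trials, n_features)
y = np.random.randint(0, 2, 200)     # 0 = imagined left hand, 1 = imagined right hand

clf = LinearDiscriminantAnalysis()
clf.fit(X, y)                        # learn how the patterns differ per intention

new_trial = np.random.randn(1, 8)
intention = clf.predict(new_trial)[0]         # most likely intention
confidence = clf.predict_proba(new_trial)[0]  # probability per possible command
```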

The Cybathlon’s brain-computer interface race will test competitors by means of a video game, in which up to four different mental actions must be recognised by the system’s classifier and mapped to commands. The competitors must issue the correct command at the right time in order to race their avatars against each other in the game. The best system will be the one that most accurately recognises and most quickly responds to its user’s brain activity, selects the right command and so allows him or her to win the race.
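Purely as a hypothetical sketch of this final step, the snippet below maps the classifier’s probability output onto up to four game commands, sending one only when the classifier is sufficiently confident. The command names and threshold are invented for illustration and are not the Cybathlon game’s actual interface.

```python
from typing import Optional

# Hypothetical mapping from classifier decisions to game commands.
COMMANDS = {0: "speed", 1: "jump", 2: "slide", 3: "rest"}
CONFIDENCE_THRESHOLD = 0.7  # illustrative value

def decide(probabilities: list) -> Optional[str]:
    """Send a command only when the classifier is confident; otherwise stay
    silent rather than risk issuing the wrong command at the wrong time."""
    best = max(range(len(probabilities)), key=lambda i: probabilities[i])
    if probabilities[best] >= CONFIDENCE_THRESHOLD:
        return COMMANDS[best]
    return None

print(decide([0.1, 0.8, 0.05, 0.05]))  # -> "jump"
print(decide([0.3, 0.3, 0.2, 0.2]))    # -> None
```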

The appearance of brain-computer interfaces at Cybathlon is a rare outing beyond the lab, and it requires their developers to improve the systems considerably over those that need only work in laboratory experiments, for example by making them more reliable and better able to cope with the user becoming distracted.

Current systems aren’t yet ready for those whose lives they could so radically change. But the new developments of the last few years, which Cybathlon is encouraging further, will not only improve this technology but make it more suited to use by people living outside the lab – finally closing the loop on a technology that has been in the making for over 20 years.

Ana Matran-Fernandez is leader of BrainStormers, a team that will compete in the Brain-Computer Interface race at Cybathlon on behalf of Essex University.

Ana Matran-Fernandez, PhD researcher, University of Essex

This article was originally published on The Conversation. Read the original article.
