Interpreting Speech and Sounds from Neural Activity, a Brief Overview

Date

2020

Authors

Ryan, Andrew

Abstract

For people who are mute or completely paralyzed, one of the primary challenges is communication. One potential way to compensate for lost communication function is a brain-computer interface (BCI). The idea is to quantify neural activation in the brain that correlates with the patient's imagined speech, and decode it into legible text that a receiver can interpret. Due to the intricacy of speech interpretation, direct access to regions of the brain and individual neurons is required. As a result, many studies of BCI speech interpretation use electrocorticography (ECoG) sensors on epilepsy patients when they are available. Approaches used to analyze these signals for feature extraction include word-based classification and phoneme-based classification. An approach mentioned less often in the literature is whether a sound signal can be pulled directly from the activated regions of the brain. Advancement of the technology has potential use as a speech replacement for people suffering from paralysis, as well as in prosthetics.
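To make the classification idea concrete, the sketch below shows one simple way a phoneme-based decoder could work: average feature vectors per phoneme to form centroids, then assign each new trial to the nearest centroid. This is an illustrative toy, not the method from the paper; the data is synthetic, and real pipelines would extract features (such as per-channel high-gamma power) from multi-channel ECoG recordings.

```python
# Hypothetical sketch: phoneme classification from ECoG-like feature
# vectors with a nearest-centroid classifier. All data is synthetic.
import random

random.seed(0)

PHONEMES = ["a", "i", "u"]

def synth_trial(phoneme, n_channels=8):
    # Toy model: each phoneme gets a distinct mean activation per channel.
    base = PHONEMES.index(phoneme)
    return [base + random.gauss(0, 0.3) for _ in range(n_channels)]

def centroids(trials):
    # Average the feature vectors for each phoneme label.
    out = {}
    for label, vecs in trials.items():
        n = len(vecs)
        out[label] = [sum(v[i] for v in vecs) / n for i in range(len(vecs[0]))]
    return out

def classify(vec, cents):
    # Assign the label whose centroid is closest (squared Euclidean distance).
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(cents, key=lambda lab: dist(vec, cents[lab]))

# Train on 20 synthetic trials per phoneme, then test on 10 more each.
train = {p: [synth_trial(p) for _ in range(20)] for p in PHONEMES}
cents = centroids(train)
correct = sum(classify(synth_trial(p), cents) == p
              for p in PHONEMES for _ in range(10))
print(correct, "of 30 synthetic test trials classified correctly")
```

With well-separated synthetic classes this decoder is near-perfect; on real ECoG data, feature quality and class overlap make the problem far harder, which is why the literature explores richer models.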

Keywords

Neural engineering, Brain-computer interface
