Restoring Speech with Brain-to-Computer Interfaces - Dan Rubin, MD, PhD & Kristina Simonyan, MD, PhD
In this episode, we learn how brain-to-computer interfaces (BCIs) are helping restore speech in individuals living with ALS and laryngeal dystonia. Dr. Dan Rubin discusses the research behind decoding speech intention from brain activity using motor cortex mapping and real-time phoneme prediction. Dr. Kristina Simonyan discusses her research on EEG-based neurofeedback therapy and virtual reality to restore normal speech in patients with laryngeal dystonia.
Dr. Dan Rubin is an Assistant Professor of Neurology at Harvard Medical School. His research investigates how microelectrode arrays implanted directly in the brain can help restore communication in patients with ALS and other disorders that cause speech paralysis. His team uses BCIs to record electrical activity from individual neurons in the speech motor cortex and decode the "intent" to move speech muscles. The patient's internal intent to speak is then translated and reproduced through computer systems to restore communication.
Dr. Kristina Simonyan, Professor of Otolaryngology at Harvard Medical School, focuses primarily on restoring normal speech in patients with laryngeal dystonia. Laryngeal dystonia is a neurological disorder that causes involuntary spasms of the vocal cord muscles, making it difficult for patients to speak. The BCIs Dr. Simonyan uses involve high-density EEG caps and neurofeedback to "retrain" a patient's ability to speak.
Dr. Rubin received his MD and PhD from Columbia University and completed his residency and a fellowship in neurocritical care at Massachusetts General Hospital.
Dr. Simonyan completed her medical degree and residency in Otolaryngology at Yerevan State Medical University in Armenia and Georg-August University in Germany. She holds a PhD in Neurobiology from the University of Hannover.
