Generalizable spelling using a speech neuroprosthesis in an individual with severe limb and vocal paralysis

SL Metzger, JR Liu, DA Moses, ME Dougherty… - Nature …, 2022 - nature.com
Neuroprostheses have the potential to restore communication to people who cannot speak
or type due to paralysis. However, it is unclear if silent attempts to speak can be used to …

Music can be reconstructed from human auditory cortex activity using nonlinear decoding models

L Bellier, A Llorens, D Marciano, A Gunduz… - PLoS …, 2023 - journals.plos.org
Music is core to human experience, yet the precise neural dynamics underlying music
perception remain unknown. We analyzed a unique intracranial electroencephalography …

Deep brain–machine interfaces: sensing and modulating the human deep brain

Y Sui, H Yu, C Zhang, Y Chen, C Jiang… - National Science …, 2022 - academic.oup.com
Unlike conventional brain–machine interfaces, which focus mainly on decoding the
cerebral cortex, deep brain–machine interfaces enable interactions between external …

Ultrathin crystalline-silicon-based strain gauges with deep learning algorithms for silent speech interfaces

T Kim, Y Shin, K Kang, K Kim, G Kim, Y Byeon… - Nature …, 2022 - nature.com
A wearable silent speech interface (SSI) is a promising platform that enables verbal
communication without vocalization. The most widely studied methodology for SSI focuses …

Multisensory subtypes of aphantasia: Mental imagery as supramodal perception in reverse

AJ Dawes, R Keogh, J Pearson - Neuroscience Research, 2024 - Elsevier
Cognitive neuroscience research on mental imagery has largely focused on the visual
imagery modality in unimodal task contexts. Recent studies have uncovered striking …

Decoding semantic relatedness and prediction from EEG: A classification method comparison

T Trammel, N Khodayari, SJ Luck, MJ Traxler… - NeuroImage, 2023 - Elsevier
Machine-learning (ML) decoding methods have become a valuable tool for
analyzing information represented in electroencephalogram (EEG) data. However, a …

Theta‐gamma phase‐amplitude coupling in auditory cortex is modulated by language proficiency

M Lizarazu, M Carreiras, N Molinaro - Human brain mapping, 2023 - Wiley Online Library
The coordination between the phase of theta (3–7 Hz) and the power of gamma (25–35 Hz)
oscillations, namely theta‐gamma phase‐amplitude coupling (PAC), in the auditory cortex …
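
The snippet above defines theta‐gamma PAC as the coordination between theta (3–7 Hz) phase and gamma (25–35 Hz) power. As a rough illustration only (not the authors' pipeline), the sketch below estimates PAC for a single channel with the mean-vector-length approach; the sampling rate, filter settings, and toy signal are assumptions introduced here.

```python
# Minimal PAC sketch (illustrative, not the paper's method): band-pass the
# signal, take theta phase and gamma amplitude via the Hilbert transform,
# then compute a normalized mean vector length.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def bandpass(x, lo, hi, fs, order=4):
    """Zero-phase Butterworth band-pass filter."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def theta_gamma_pac(x, fs, theta=(3, 7), gamma=(25, 35)):
    """Normalized mean vector length of gamma amplitude over theta phase."""
    theta_phase = np.angle(hilbert(bandpass(x, *theta, fs)))
    gamma_amp = np.abs(hilbert(bandpass(x, *gamma, fs)))
    return np.abs(np.mean(gamma_amp * np.exp(1j * theta_phase))) / np.mean(gamma_amp)

# Toy usage: gamma bursts locked to theta peaks yield a clearly nonzero PAC value.
fs = 500
t = np.arange(0, 10, 1 / fs)
theta_wave = np.sin(2 * np.pi * 5 * t)
coupled = theta_wave + 0.5 * (1 + theta_wave) * np.sin(2 * np.pi * 30 * t)
print(theta_gamma_pac(coupled + 0.1 * np.random.randn(t.size), fs))
```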

Representation of internal speech by single neurons in human supramarginal gyrus

SK Wandelt, DA Bjånes, K Pejsa, B Lee, C Liu… - Nature human …, 2024 - nature.com
Speech brain–machine interfaces (BMIs) translate brain signals into words or audio outputs,
enabling communication for people who have lost their speech abilities due to diseases or …

The nested hierarchy of overt, mouthed, and imagined speech activity evident in intracranial recordings

PZ Soroush, C Herff, SK Ries, JJ Shih, T Schultz… - NeuroImage, 2023 - Elsevier
Recent studies have demonstrated that it is possible to decode and synthesize various
aspects of acoustic speech directly from intracranial measurements of electrophysiological …

Rethinking CNN architecture for enhancing decoding performance of motor imagery-based EEG signals

SJ Kim, DH Lee, SW Lee - IEEE Access, 2022 - ieeexplore.ieee.org
A brain–computer interface (BCI) is a technology that allows users to control computers by
conveying their intentions. Electroencephalogram (EEG)-based BCI has been developed …