Generalizable spelling using a speech neuroprosthesis in an individual with severe limb and vocal paralysis
Neuroprostheses have the potential to restore communication to people who cannot speak
or type due to paralysis. However, it is unclear if silent attempts to speak can be used to …
Music can be reconstructed from human auditory cortex activity using nonlinear decoding models
Music is core to human experience, yet the precise neural dynamics underlying music
perception remain unknown. We analyzed a unique intracranial electroencephalography …
Deep brain–machine interfaces: sensing and modulating the human deep brain
Y Sui, H Yu, C Zhang, Y Chen, C Jiang… - National Science …, 2022 - academic.oup.com
Different from conventional brain–machine interfaces that focus more on decoding the
cerebral cortex, deep brain–machine interfaces enable interactions between external …
Ultrathin crystalline-silicon-based strain gauges with deep learning algorithms for silent speech interfaces
A wearable silent speech interface (SSI) is a promising platform that enables verbal
communication without vocalization. The most widely studied methodology for SSI focuses …
Multisensory subtypes of aphantasia: Mental imagery as supramodal perception in reverse
Cognitive neuroscience research on mental imagery has largely focused on the visual
imagery modality in unimodal task contexts. Recent studies have uncovered striking …
Decoding semantic relatedness and prediction from EEG: A classification method comparison
Machine-learning (ML) decoding methods have become a valuable tool for
analyzing information represented in electroencephalogram (EEG) data. However, a …
Theta‐gamma phase‐amplitude coupling in auditory cortex is modulated by language proficiency
The coordination between the theta phase (3–7 Hz) and gamma power (25–35 Hz)
oscillations (namely theta‐gamma phase‐amplitude coupling, PAC) in the auditory cortex …
Representation of internal speech by single neurons in human supramarginal gyrus
Speech brain–machine interfaces (BMIs) translate brain signals into words or audio outputs,
enabling communication for people who have lost their speech abilities due to diseases or …
The nested hierarchy of overt, mouthed, and imagined speech activity evident in intracranial recordings
Recent studies have demonstrated that it is possible to decode and synthesize various
aspects of acoustic speech directly from intracranial measurements of electrophysiological …
Rethinking CNN architecture for enhancing decoding performance of motor imagery-based EEG signals
Brain–computer interface (BCI) is a technology that allows users to control computers by
reflecting their intentions. Electroencephalogram (EEG)–based BCI has been developed …