
Give It Some Thought

Imagine operating a smartphone. Or a drone. Or a computer that speaks. Just imagine.

March 2024


Illustration: a person uses their brain to communicate with and control a smartphone, a drone, and a computer; the computer telegraphs “Hello” from the person.

Illustrations by Brian Stauffer

Dennis DeGray’s path to the extraordinary world of brain-computer interfaces began with a mishap during the most mundane of chores: taking out the trash. He was running to the curb on a rainy night when he went flying head over heels. In the murk of semiconsciousness, he thought he’d been bound by someone breaking into his house. It wasn’t until the next day that a neighbor heard his calls to be freed. “Dennis,” the man told him, “you’re not tied up.”

DeGray had simply slipped, breaking his neck between the second and third vertebrae. The machinist and former volunteer firefighter was paralyzed from the neck down. The week before his injury, he’d been on a guys’ trip in Northern California, shooting and fishing; the next, he says, his world had shrunk to bed and constant TV. “It’s amazing how one minute’s bad decision can really change everything,” he says. “You just lay there like a slug, waiting to die, until you have a reason to move forward. Then that reason becomes everything.”

DeGray’s everything is his leading role in an ongoing Stanford study of experimental devices that allow brains that can no longer fully communicate with their bodies to instead communicate with computers. In August 2016, nearly a decade after the accident, Stanford professor of neurosurgery Jaimie Henderson implanted a pair of electrode arrays the size of baby aspirins in the region of DeGray’s brain that is dominant for controlling his right hand—or was, before the accident throttled that communication. The so-called Utah arrays, each resembling a tiny bed of a hundred 1-millimeter nails, pierced just far enough into DeGray’s motor cortex to eavesdrop on surrounding neurons and relay the information to outside computers poised to decode it. 

A month after surgery, DeGray made his first attempt to use the device. By visualizing moving his hand—essentially willing it to do what it no longer could—DeGray transmitted the neural signals that allowed him to gain control of a computer cursor in 37 seconds. Shortly thereafter, he was tasked with hitting 50 targets on the monitor before him. “I got them,” he says. “I got all 50 of them.” He remembers silence from the scientists in the room. “They’re a dry bunch,” he says. But it was an auspicious meeting of man and machine. Over the past seven years, DeGray has devoted himself to pushing the research further. In 2017, he set a mental typing record of eight words a minute by imagining himself hunt-and-pecking on a virtual keyboard. In 2020, he tried a new method—imagining writing letters out by hand, so that researchers could attempt and assess the decoding of mental handwriting. To train the algorithms, DeGray spent days visualizing holding a pen to a yellow notepad and picturing the act of writing out thousands of letters, stroke by stroke. “It was like punishment, but I did it religiously,” he says. He describes the task as a combination of writing sentences in after-school detention and being walloped by a personal trainer at the gym. “It’s a workout,” he says. “It requires me to attempt the movements. I’m as tight as I can be and as flexed as I can be. My blood pressure goes up. I have to be reminded to breathe.” The results—converted to type by computer—validated the effort. DeGray more than doubled his own record, to 18 words a minute.
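
For the technically curious, the core idea behind such decoding can be sketched in a few lines of Python. The toy example below fits a linear map from simulated electrode firing rates to intended cursor velocities, then smooths each decoded command. It illustrates only the general principle, with made-up numbers; the lab's actual decoders are far more sophisticated.

```python
import numpy as np

# Toy linear decoder: map binned firing rates from an electrode array to a
# 2-D cursor velocity. All data here are simulated and illustrative only.
rng = np.random.default_rng(0)
n_channels, n_samples = 96, 2000  # roughly the channel count of one Utah array

# Simulated calibration session: neurons linearly "tuned" to intended velocity.
true_tuning = rng.normal(size=(n_channels, 2))
velocity = rng.normal(size=(n_samples, 2))  # intended (vx, vy) from cued targets
rates = velocity @ true_tuning.T + rng.normal(scale=0.5, size=(n_samples, n_channels))

# Fit decoder weights W by least squares so that rates @ W approximates velocity.
W, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

def decode_step(spike_bin, prev_v, alpha=0.8):
    """Turn one bin of firing rates into a smoothed cursor-velocity command."""
    v = spike_bin @ W
    return alpha * prev_v + (1 - alpha) * v
```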

‘I like to think of it like we’re developing the alphabet that other people will use to write books.’

The findings were proof of concept, not medical product. DeGray could tap his new powers only in the presence of researchers who calibrate and run a complex system that requires a trolley of computers that plug into pedestals attached to his skull. But they were eye-catching evidence of the potential for BCIs—brain-computer interfaces—to transcend the barrier between the interior of the brain and the external world, a leap that may one day enable people with a wide variety of neurological conditions to regain function in movement, communication, and vision, and that ultimately may provide a novel platform for treating and monitoring brain health and recovery. DeGray doesn’t expect that future to come quickly enough to change his life, but he has dedicated himself to its promise. “I like to think of it like we’re developing the alphabet that other people will use to write books.”  

Sense and Sensibility

When Henderson, the doctor who operated on DeGray, joined the Stanford faculty in 2004, he brought expertise in deep brain stimulation, which delivers tiny jolts of electricity to the brain as a treatment for several conditions, including Parkinson’s disease. He’d been trying in vain to find a partner to explore the emerging world of BCI when, shortly after his arrival at Stanford, he was introduced to Krishna Shenoy, then an assistant professor of electrical engineering. Shenoy was dedicated to decoding the language of neurons, the voltaic pulses that send information throughout the nervous system. He had developed algorithms increasingly adept at deciphering the neural commands that control hand and arm movements in monkeys. His goal was to translate that work to humans—exactly what Henderson was looking for. It was the beginning of a relationship that would result in the formation of the shared Neural Prosthetic Translational Lab in 2009 and one that would last until Shenoy’s death from pancreatic cancer last year at 54. “It was chemistry,” Shenoy told Stanford Medicine in 2017. “Two people who just clicked.”

The pair met at a crucial time for BCIs. The first in-human studies were just beginning, after decades of animal testing. In 2004, researchers from Brown University and several other institutions performed the first human implantation of a Utah array, the spiky sensor that remains the gold standard for academic research in the field. That study implanted a sensor in the brain of a 24-year-old man who had been paralyzed by a knife to the neck, giving him basic cursor control as well as the ability to open and close a prosthetic hand and move a robotic arm. It was a vivid illustration not only that the brain retains its ability to issue orders years after the body stops receiving them, but that a BCI can provide it an attentive new audience. While the performance was groundbreaking, it was also rudimentary. A New York Times piece, published the same day the study appeared in Nature, noted the cursor control was wobbly and slow—taking 2.5 seconds, on average—and that the participant could only “somewhat” control the robotic arm. The reporter, however, cited another BCI study from the same issue of the journal, this time tested in monkeys, that reportedly operated about four times as fast. The work was from the Shenoy lab.

‘PEOPLE WHO JUST CLICKED’: Clockwise from top: Henderson and Shenoy; Obama and Copeland; DeGray at work hitting targets. (Photos, clockwise from top: Paul Sakuma; Pete Souza/Official White House Photo; PBS NewsHour/Courtesy NewsHour Productions LLC)

In the years to follow, much of the excitement around BCIs centered on the potential for brain-controlled robotic limbs. In 2016, President Barack Obama fist-bumped with a robotic arm controlled by Nathan Copeland, a 30-year-old with paralysis. Copeland not only controlled the fist but also sensed the bump, thanks to electrodes implanted in a region of the brain that processes sensory information from the body. The Stanford research focused on capabilities that were less visually demonstrative but graceful, intuitive, and effective. In a 2018 study led by Paul Nuyujukian, MS ’11, PhD ’12, MD ’14, now a Stanford assistant professor of bioengineering and of neurosurgery, participants used their thoughts to peruse music, search YouTube, and compose emails. It was all done with BCIs that connected via Bluetooth with generic computer tablets fresh from Amazon. The humdrum hardware belied the fiendish complexity of the process: Nuyujukian compared the job of decoding neural commands to listening to a hundred people speaking a hundred different languages. But in a world where there’s an app for everything, researchers saw the power in creating ways to seamlessly control the consumer electronics that dominate everyday life. “We had to persevere in the early days, when people said, ‘Ah, it’s cooler to do a robotic arm—it makes a better movie,’” Shenoy told MIT Technology Review in 2021. But “if you can click, then you can use Gmail, surf the web, and play music.”

Stanford’s lead researchers understood how vital a role BCIs could play in communication. Shenoy said his work was influenced by his maternal grandfather—a World War II–era U.S. Marine—whose multiple sclerosis had affected his ability to walk, talk clearly, and move his hands effectively. Henderson was 5 when his father sustained severe and lasting injuries in a car accident, including serious brain trauma. “He would try to express himself really, really hard,” Henderson says. “It was hard to understand what he said. Eventually, we would usually figure out what silly pun he was trying to make, or that he was proud of us for something.” Henderson says his childhood imbued him with an awareness of the power of communication, a value mirrored in the lab’s goals. “For me, that’s the most important thing.”

When Henderson and Shenoy started collaborating, the idea of using BCIs to decode speech seemed distant indeed. Primates provide a model for motor studies, but no lab animal is relevant to speech, a uniquely human process controlled by a blizzard of electrical pulses to 100-some muscles in the cheeks, lips, jaw, tongue, and larynx. But in more recent years, a series of scientific strides—including a better understanding of the geography of the brain, improved surgical procedures, and, most prominently, the rise of machine learning—transformed the possibilities. In 2021, a team from the lab of Edward Chang, a neurosurgeon at UCSF, published a groundbreaking paper detailing the use of a BCI that decoded the speech of a former field worker who had had a stroke 16 years earlier. The average American knows about 42,000 English words and speaks perhaps 150 of them per minute. Though the decoder managed just 18 words a minute with a 50-word vocabulary, the breakthrough was front-page news in the New York Times. “Not to be able to communicate with anyone, to have a normal conversation and express yourself in any way, it’s devastating, very hard to live with,” the research participant said via email in the piece, later adding, “It’s very much like getting a second chance to talk again.”

‘So many years of not being able to communicate and then suddenly the people in the room got what I said.’

The Stanford lab began to publish its own speech work last year, pushing the frontier even further. One of the key participants was Pat Bennett. A dozen years earlier, her words had begun to slur after she drank a glass of wine, prompting friends to suspect that the daily jogger and regular equestrian was hiding a drinking problem. In fact, Bennett had amyotrophic lateral sclerosis, or ALS, a progressive neurodegenerative disease that often results in death within five years. Bennett’s disease moved more slowly, but it was quick to attack her power of speech. 

After hearing about Stanford’s BCI research from her medical team, Bennett volunteered to participate. In March 2022, Henderson implanted four sensors in two areas of her brain associated with speech. About a month later, she began working with Stanford scientists who cued her to recite thousands of sentences over the following four months. As Bennett read the prompts, machine learning algorithms began to correlate her brain signals with the sounds she intended. The results were fed into a sophisticated autocorrect system not unlike those on a smartphone. By the end of training—some 10,850 sentences later—the software was deciphering Bennett’s speech into text at more than 60 words a minute using a 125,000-word vocabulary. The error rate of 23.8 percent was significant, but Bennett was delighted. “When the study advanced enough that I actually saw my garbled incomprehensible vocal noises translate to what I was saying, it was joyous,” Bennett wrote in a recent email. “So many years of not being able to communicate and then suddenly the people in the room got what I said. I don’t remember what I exactly said after the prescribed script finished, but it had to be along the lines of ‘Holy shit, it worked, I’m so happy, and you guys did it.’” 
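
The 23.8 percent figure is a word error rate, the standard yardstick in speech decoding: the minimum number of word substitutions, insertions, and deletions needed to turn the decoded transcript into the intended sentence, divided by the intended sentence's length. A minimal Python implementation (the example sentences are invented):

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """WER = minimum word edits (substitute/insert/delete) / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Classic edit-distance dynamic program, computed over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / len(ref)

print(word_error_rate("i am thirsty", "i am first"))  # 1 error / 3 words ≈ 0.33
```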

“I overloaded the memory on my phone because I would take videos of it every single time,” says doctoral student Erin Kunz, MS ’20, one of three lead authors of the paper, who had often decoded her father’s speech for others before he died of ALS. “I don’t want to delete them, because I want to remember it.”

Signal Boost

The Bennett paper was published in the same issue of Nature as a paper from Chang’s UCSF lab, which had used a different type of BCI in another participant unable to speak due to stroke. (The system also animated a digital avatar that modeled the woman’s emotions.) Their decoder was able to decipher the woman’s speech to text at 78 words per minute with a 1,000-word vocabulary and a 25.5 percent word error rate. On their own, the two studies were clear markers of how quickly speech decoding research was moving, but just six weeks later a team led by scientists at UC Davis won the 2023 BCI Award with a demonstration of a BCI reported to decode speech with better than 90 percent accuracy on a 125,000-word vocabulary by the second day of use. (Henderson and Kunz are among six Stanford co-authors on the study, which at press time had not yet been published in a journal.)

In fact, speech BCIs may be the first type available to the public, says UC Davis assistant professor of neurological surgery Sergey Stavisky, PhD ’16, a senior author on the winning study and a former student of Shenoy’s. The neural decoding required for control of robotic limbs—his initial focus at Stanford—is actually simpler, Stavisky says. But effectively executing those commands incurs other challenges, including the engineering of responsive, reliable, and mobile robotics. Similar challenges face researchers working on BCIs that could enable patients with severed spinal cords to move their arms and legs. Control of an appendage isn’t just a motor command; it also requires proprioception, or the sense of one’s own body in space. Think of how strange it can be to move an arm that’s fallen asleep or to chew after having Novocain at the dentist. Once decoded, however, speech can be expressed relatively easily using consumer electronics. Stavisky imagines a fast-approaching future when people carry speech BCIs on their laps or belts. “I think within the next five years there will be approved medical devices for restoring communication.” (Less is known about the potential of BCIs to enable speech for those who have never spoken. “We haven’t taken that leap yet because we wanted to first show that our approach works well for the easier challenge of restoring lost speech,” Stavisky says. “It’s definitely something that’s on our radar and is one of the directions we aspire to investigate in the future.”)

Stavisky and Henderson are among the nine principal investigators of the BrainGate Consortium, a group of universities and academic medical centers studying BCIs. The collaboration has also enabled researchers to investigate the devices’ safety. A recent study of 14 BCIs implanted by BrainGate institutions, including two at Stanford, did not find any adverse effects that resulted in deaths, increased disabilities, or infections of the nervous system, or that required removal of the device.

One of the most remarkable things about the rise of BCIs is that they do so much with so little. In a three-pound organ containing billions of neurons, the sensors in studies like those involving DeGray and Bennett may be reading signals from just dozens of neurons. “It is really fascinating this works at all,” says Cindy Chestek, PhD ’10, a former student in Shenoy’s lab and an assistant professor of biomedical engineering at the University of Michigan. Indeed, the Bennett experiment succeeded even though two of the four arrays did not provide relevant signals. Even so, realizing the full potential of BCIs—like enabling more naturalistic speech and movement—will depend on reading out much more data from the brain than is currently possible. “It’s going to get a lot better when you have hundreds or even thousands of neurons,” Chestek says.

That requires new hardware, a likely prospect as companies jockey to create improved BCI products that could be approved for public use. The company with the inside track uses a minimally invasive approach. Synchron—which last year became the first company to begin human trials of an implanted BCI in the United States—feeds a stent-like sensor up the jugular vein to the motor cortex, where it lines the wall of a blood vessel.

The device’s distance from neurons means it isn’t nearly as powerful as a penetrating implant—the current model allows participants with ALS to scroll and click a mouse, says Tom Oxley, the company’s CEO. But he thinks people will prefer the less invasive approach, and even these capabilities offer a transformative opportunity. “If you can navigate your way through an iPhone, you can do a bunch of meaningful tasks that we take for granted: shopping, ordering food, ordering your medication, jumping on a call, sending a message,” Oxley says. “That stuff gives you your independence back.”

LAB COLLAB: Researchers such as Shenoy, Nuyujukian, and Henderson (top, back row), and Kunz (bottom, far right) have long worked together across disciplines to restore speech in research participants like Bennett (bottom, front). (Photos, from top: PBS NewsHour/Courtesy NewsHour Productions LLC; Steve Fisch/Stanford Medicine)

Other companies are refining the Utah array model—creating implantable chips with more electrodes that will read out information wirelessly and use more bio-friendly designs. Existing BCIs in participants like Bennett have tended to decline in performance over time, due to either the brain’s resistance to a foreign body or the device’s degradation. Paradromics, an Austin, Texas–based company, is developing wireless implants that have more than four times as many electrodes as a Utah array, says Vikash Gilja, MS ’10, PhD ’10, the company’s chief science officer. At the same time, he says, the devices are made of more durable material with thinner, less obtrusive electrodes. “The smaller we get them, the closer they are to being invisible to the body,” he says. The company expects to get FDA approval for clinical trials this year. Neuralink, a company co-founded by Elon Musk, is pursuing a similar track.

If these companies—or others like them—succeed, they could provide a platform for new approaches across a wide range of medical needs, Chestek says. “You’re interfacing with the brain at a neuronal level,” she says. “You can imagine a future of medicine where a lot of what you do is interacting with neurons and getting the body’s own control system to do things.” Conversely, BCIs could play a brain-monitoring role. Nuyujukian’s lab, for example, is looking at the potential for BCIs to shed light on stroke recovery. Each year, hundreds of thousands of Americans survive a stroke, often requiring intense physical rehab that occurs without any reliable window on how well it’s working. “We don’t have any scientific understanding into what changes at the neural-circuit level postinjury,” he says. A BCI could provide “a real-time readout of the state of the brain” that guides how patients are treated. BCIs could ultimately offer similar insights for conditions such as epilepsy, depression, and Alzheimer’s.

Taking Flight

From his bed in a Menlo Park nursing home, DeGray continues to help researchers demonstrate what is possible. Last summer, he cast aside his imaginary pen and took to the air. Two miles away, a drone was taking off, flying, and landing, all under the command of DeGray’s thoughts. The research was gathering data on 4-D control—up/down, forward/backward, left/right, and rotation—but it was also simply and undeniably about fun, a symbol of the freedoms that BCIs promise. “You have to get him to quit,” says Henderson. “It’s like ‘OK, Dennis. We’ve been at this for hours. You’re going to get tired. We have to stop for today.’”
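
In practical terms, 4-D control means the decoder emits four continuous values at every instant, one per axis, which flight software converts into velocity setpoints. The Python sketch below shows one plausible shape for that final step; it is purely illustrative, and every name in it is hypothetical rather than drawn from the study.

```python
from dataclasses import dataclass

@dataclass
class DroneCommand:
    vx: float        # forward/backward, m/s
    vy: float        # left/right, m/s
    vz: float        # up/down, m/s
    yaw_rate: float  # rotation, rad/s

def to_command(decoded, limits=(1.0, 1.0, 0.5, 0.5)):
    """Clamp four decoded intention values to safe velocity setpoints."""
    clamp = lambda x, m: max(-m, min(m, x))
    return DroneCommand(*(clamp(x, m) for x, m in zip(decoded, limits)))

# e.g., to_command([0.4, -2.0, 0.1, 0.0]) clamps the sideways command to -1.0 m/s.
```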

The work continues without Shenoy, which weighs on Henderson’s mind. “It’s very tough because it grew organically and it was truly a joint venture,” he says. Shenoy was both a visionary whose work transformed the field and a beloved mentor to a generation of scientists who continue to push its boundaries. Before his first cancer surgery, in 2011, he began to bank recommendation letters for his students, which he would update whenever he felt his health decline. At his memorial service, there were nearly 20 tenured or tenure-track faculty who’d been his advisees, a remarkable output for a small lab, Stavisky says. “He was probably the best adviser I have ever even heard of,” Chestek says. “We’re not going to see another Krishna, but maybe all of us together can keep all of this going.” For Henderson, that means sticking to the vision he and Shenoy developed together, with the collaboration of other engineering faculty.

DeGray will keep helping show the way. He’s contributed to thousands of hours of research and been central to a score of academic papers. Eight years after his surgery, the signals from his implants have remained serviceable, and his commitment unflagging. He works with Stanford researchers two days a week, and says he’d add a third if he had more energy. He’ll always be processing what he lost that day when he was hurrying to take out the trash, he says. “It’s so big you can’t really address it.” But he’s gained something too. “I’ve been given a great gift of being able to help other people,” he says. “Somewhere out there, there’s a guy who hasn’t even fallen down yet and when he falls down, he’s not going to have to go through what I’ve gone through. When he wakes up in the morning, his life will be substantially different than mine. And that’s a good thing.”


Sam Scott is a senior writer at Stanford. Email him at sscott3@stanford.edu.
