Last spring, Tzyy-Ping Jung was all over the news. MIT Tech Review, the Huffington Post and a dozen other outlets and blogs were buzzing about his new headband, capable of reading your thoughts and transferring them to a cell phone.

Imagine, a cell phone you could dial with your mind. One outlet called it “the end of dialing”; another said, “The bar for hands-free technology has officially been raised.” Jung, however, just sighs and says they missed the point.

“It’s a demonstration of a [brain interface] system that could be applied to daily life. It’s not really the end goal,” says Jung. “Who needs a phone that dials using brain waves if they can actually dial with their hands?”

Jung is associate director of the Swartz Center for Computational Neuroscience at UC San Diego, where researchers are helping lead a new field called brain-computer interface, or BCI. The emerging area is littered with impressive toys and dazzling gadgets, like robots that move with a thought and artificial arms that respond at will, almost like real ones.

Photo: Tzyy-Ping Jung (left) and a group at National Chiao Tung University in Taiwan have developed headgear and software that monitor brainwaves, collect data and transfer a thought process to a mobile device.

But while high-tech wizardry makes for fun headlines, UC scientists are poised to make a subtler yet fundamental change to the face of medicine. Using a technology somewhat overlooked for more than a decade, scientists are building a two-way conversation between your brain and the many computers that surround it every day.

Scott Makeig works with Jung as the director of the Swartz Center. For more than 20 years he has studied electroencephalogram (EEG) technology. EEGs, recognizable by their funny skullcaps dotted with electrode sensors, measure at the scalp the electrical signals generated by the brain beneath. Though EEG is fast and relatively mobile, over the past decade it has been eclipsed in research by giant fMRI machines, which use huge magnets to track blood movement within the brain. fMRI is a slower, less direct measure of brain activity, but unlike EEG, which mainly captures the brain's outer layers, it can pierce all the way through.

“EEG has dwindled to a low point in its use in medicine after MRI came out,” Makeig says. “And it was more or less ignored in neurophysiology.”

But hold your pity for poor EEG. In the meantime, scientists have been refining the bulky caps to the point where some take up less room than a pair of headphones. Jung has partnered with his alma mater, National Chiao Tung University in Taiwan, to develop headpieces that collect phenomenal amounts of data in a fraction of a second and broadcast it to a laptop or cell phone. Whereas previous EEG caps required gels to be smeared on a user’s scalp, today’s sleeker “dry” electrodes are so advanced that several companies have even created brain-operated children’s toys.

But the skullcap is just half of the brain-sensing equation; you also need to know what all that data means.

“If someone records data from the scalp they immediately realize how messy it is,” Jung says. “It’s very noisy.”

This is the so-called “cocktail party problem”: EEG brain recordings are like noisy gatherings, where dozens of conversations blend with background noise into a confusing slurry. Picking out which signals relate to a given thought process is daunting.

In the mid-'90s, Makeig and Jung, together with Terry Sejnowski and Anthony Bell at the Salk Institute, pushed through this problem by teasing apart the EEG signals with a clever analysis borrowed from French theoreticians. Before long, they were able to pick out the specific brain-area sources hidden within the crowded, overlapping EEG signals coming from working brains.
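That style of analysis is widely known today as independent component analysis, or ICA, which unmixes recorded blends back into their underlying sources. As a rough illustration only, here is a minimal sketch using scikit-learn's FastICA, a modern relative of that approach, on simulated signals rather than real EEG; the signals, mixing weights and channel count are all invented for the example.

```python
# Illustrative sketch only: blind source separation in the spirit of the
# ICA-based unmixing described above, using scikit-learn's FastICA on
# simulated signals (not the researchers' own method, and not real EEG).
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)

# Three independent "source" signals standing in for distinct brain processes.
sources = np.column_stack([
    np.sin(2 * t),                       # a slow rhythm
    np.sign(np.sin(3 * t)),              # a square-wave burst
    0.5 * rng.standard_normal(t.size),   # noise-like background activity
])

# Each scalp electrode records a different weighted blend of every source:
# the "cocktail party" in miniature.
mixing = rng.uniform(0.5, 1.5, size=(3, 3))
electrodes = sources @ mixing.T

# ICA recovers statistically independent components from the mixtures
# (up to ordering and scale) without knowing the mixing weights.
ica = FastICA(n_components=3, random_state=0)
recovered = ica.fit_transform(electrodes)
print(recovered.shape)  # (2000, 3): one unmixed component per column
```

On real recordings the same idea is applied to dozens or hundreds of electrode channels, with eye blinks, muscle activity and distinct cortical sources falling into separate components.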

This, along with a great deal of other work around the world, has opened the way for scientists to link computers directly to commands from the brain. Although EEG cannot pierce deep into the brain, the outermost layers, the brain's cortex, are generally where what we call higher reasoning occurs, which makes EEG well suited to operating machines.

Naturally, scientists are aiming to build devices to help people with disabilities who are unable to operate wheelchairs, computers and phones. But Makeig says brain interfaces have a much broader potential if used the other way: eavesdropping rather than taking commands. For instance, Makeig and Jung have done research into alertness monitoring for the military; Makeig says we may soon be able to give simple headbands to air traffic controllers to alert them when they are nodding off.

Valuable for patient care

William Mobley, a UC San Diego neurologist who has worked on degenerative neurological disorders and Down syndrome, goes even further. He and Jung head up the Center for Advanced Neurological Engineering, which aspires to create a suit that could relay all kinds of information about a patient.

“We envision a time very soon in which a patient's vital signs, EEG, EKG and movements can be recorded 24/7 and sent wirelessly to a remote location for review by a physician,” said Mobley. “The suit might well be deployed to allow neurologists a much more complete assessment of patients with a variety of disorders, in the process collecting many thousands of times as much data as is currently the case.” 

This is not science fiction. The most sophisticated EEG devices (which cover the head with a bulky cap) can parse out underlying brain signals from the admixture of data recorded from up to 256 places on the scalp. However, with today’s gadgets you don’t need that kind of precision. With just a dozen channels or so Jung and Makeig can easily detect something as simple as a drowsy air traffic controller.
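What might a low-channel drowsiness monitor look like in practice? One common approach, sketched below purely as an illustration and not as Jung and Makeig's actual model, is to watch slow theta-band power rise relative to alpha-band power across a handful of channels; the sampling rate, frequency bands, threshold and data here are all placeholders.

```python
# Purely illustrative sketch of a low-channel drowsiness check: compare slow
# (theta) to faster (alpha) band power on each channel and flag when the
# ratio climbs. Bands, threshold and data are placeholders, not a real system.
import numpy as np
from scipy.signal import welch

FS = 256  # assumed sampling rate in Hz

def band_power(signal, fs, lo, hi):
    """Average spectral power of one channel within a frequency band."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

def looks_drowsy(channels, fs=FS, threshold=1.2):
    """channels: array of shape (n_channels, n_samples), e.g. a dozen sensors."""
    ratios = []
    for ch in channels:
        theta = band_power(ch, fs, 4, 8)   # theta power tends to rise with drowsiness
        alpha = band_power(ch, fs, 8, 13)
        ratios.append(theta / alpha)
    return float(np.mean(ratios)) > threshold

# Synthetic stand-in for ten seconds of a 12-channel headband recording.
fake_recording = np.random.default_rng(1).standard_normal((12, FS * 10))
print(looks_drowsy(fake_recording))
```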

Tuning in on emotions

With more channels, Makeig can also get a pretty good sense of emotion. He says a simple EEG device could someday become another tool for psychiatrists, giving them a clue into the inner world of their patients. To demonstrate the technology, Makeig and graduate student Tim Mullen last year put on an unusual quartet. Makeig was on the violin and two other researchers took the cello and clarinet, while Mullen played, well, his brain. (See photo at the top.) Mullen began preparing before the concert, playing musical notes and carefully cultivating the emotions they inspired in his own mind.

“On the night of the performance, I can sit down and reimagine that state — the state that was evoked by a particular note,” Mullen says. “And when I imagine that particular emotion my brain dynamics will be recreated again and the machine detects it and it plays that note that originally evoked that emotion in me.” 
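In outline, the loop Mullen describes is a train-then-detect pipeline: record brain features while each emotional state is cultivated, fit a classifier, then map live activity back to the note that evoked that state. The sketch below is a generic illustration of that idea, with invented features, note labels and an off-the-shelf classifier, not anything from the actual performance.

```python
# Generic illustration of a train-then-detect BCI loop: fit a classifier on
# features recorded during rehearsal, then label a fresh window of activity
# with the note whose imagined state it most resembles. All data is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
notes = ["C4", "E4", "G4"]  # placeholder note labels

# Rehearsal: 20 feature vectors (e.g., band powers across channels) per state,
# nudged apart so the states are separable in this toy example.
train_features = rng.standard_normal((60, 24)) + np.repeat(np.arange(3), 20)[:, None]
train_labels = np.repeat(notes, 20)

clf = LogisticRegression(max_iter=1000).fit(train_features, train_labels)

# Performance: classify a new window of brain activity and "play" the result.
live_window = rng.standard_normal((1, 24)) + 2.0  # resembles the third state
print("play:", clf.predict(live_window)[0])
```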

The resulting call-and-response performance, like the brain-dialed phone, is a stunning demonstration of the underlying potential of EEG-based brain interfaces. Can we expect a first-chair EEG-ist next year at the Metropolitan Opera? Probably not, but Makeig and Jung say the important lesson is that scientists can now reliably track specific emotions as well as thoughts.

This, the researchers agree, is how BCI will actually integrate into our lives; for dialing numbers and surfing the Internet, it still lags behind fingers. By using the interface to listen in on the mind, scientists can build tools that reshape medicine, along with clever toys and fodder for the occasional headline.