
Newsroom – California State University, Northridge

No Longer Just Science Fiction, CSUN Researchers Tap into Brain’s Power to Control Wheelchair

Media Contact:

(818) 677-2130

(NORTHRIDGE, Calif., April 4, 2011) ―

Imagine a paraplegic who, with just a simple thought, can control his or her wheelchair and send it in any direction, without assistance from anyone else.

The concept of brain-computer interface (BCI)—harnessing brainwaves to control external devices remotely, with a computer serving as intermediary—is usually reserved for science fiction. But a team of Cal State Northridge researchers is making it a reality, and in the process hopes to provide a new level of independence to those with physical and cognitive disabilities.

Mechanical engineering professor C.T. Lin and seven graduate and undergraduate students from Northridge’s College of Engineering and Computer Science have spent the past eight months working with a neurologist at Olive View-UCLA Medical Center in Sylmar on an ambitious BCI project designed to make movement easier for people confined to wheelchairs.

They already have a functioning prototype and plan to spend the next 10 months working out the kinks. At the end of that period, Lin said he expects to have a model ready for manufacture.

“That’s what we do, what engineering is all about—finding solutions to problems to the benefit of the greater community,” he said. “In this instance, we have created a solution for an important segment of our community, people with disabilities.”

Lin said the goal is to help people with physical impairments move around more easily in dynamically changing environments—whether they are traveling down a crowded corridor on the way to class or on a stroll with friends through a park.

“This wheelchair gives them the ability to control where they go by themselves. It’s a level of independence many people in wheelchairs do not have yet,” Lin said.

In addition to Lin, the CSUN research team includes mechanical engineering graduate students Craig Euler, Alfie Gil and Yunsong Shen, electrical and computer engineering senior David Prince, mechanical engineering seniors Ara Mekhtarian and Joseph Horvath and computer science graduate student Lee Hern.

Their prototype is a modified motorized wheelchair outfitted with a laser sensor mounted a couple of feet from the ground, a stereo camera mounted on a frame over the chair’s occupant, a laptop computer and an EEG headset designed to read the occupant’s brainwaves.

The sensor and the camera continually scan the area in front of and about 270 degrees around the wheelchair and send the data to the onboard computer, which processes the information, noting any obstacles—from the legs of a table or chair to somebody standing in the way—and any openings that can accommodate the wheelchair. The EEG headset sends the wheelchair user’s brainwaves—focused on such commands as forward, back, right or left—to the computer.
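The article does not describe the team’s actual obstacle-detection algorithm, but the idea of spotting “openings that can accommodate the wheelchair” in a laser scan can be illustrated with a minimal sketch. Everything here—the function name, the parameters, the chord-based width estimate—is a hypothetical stand-in, not the CSUN software:

```python
import math

def find_openings(ranges, angle_step_deg, min_width_m, clear_dist_m):
    """Scan a list of laser range readings (meters, one per angle step)
    and return (start_angle, end_angle) pairs, in degrees, for gaps
    that are both deep enough and wide enough for the wheelchair."""
    openings = []
    run_start = None
    for i, r in enumerate(ranges + [0.0]):          # sentinel closes a trailing run
        if r >= clear_dist_m and run_start is None:
            run_start = i                           # a clear run begins
        elif r < clear_dist_m and run_start is not None:
            start_deg = run_start * angle_step_deg
            end_deg = (i - 1) * angle_step_deg
            # approximate the gap's width as the chord at the clearance distance
            width = 2 * clear_dist_m * math.sin(math.radians(end_deg - start_deg) / 2)
            if width >= min_width_m:
                openings.append((start_deg, end_deg))
            run_start = None
    return openings
```

A 270-degree scan at 3 degrees per reading is 90 readings; any contiguous run of far readings whose chord at the clearance distance exceeds the chair’s width counts as a passable opening.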

The computer is programmed to process both sets of commands—one created by the user and the other generated by the onboard computer’s sensor-based analysis.

Lin said the wheelchair can then run in autonomous mode, where the computer makes all the decisions, or in hybrid mode, where commands from the user are augmented by sensor analysis. He said the hybrid mode was created, in part, to help those who may have some visual or cognitive impairment that would interfere with their ability to see everything in their path.
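The two modes Lin describes amount to a simple arbitration rule between the user’s EEG-derived command and the sensor-based planner. The sketch below is a hypothetical illustration of that rule (the names and the override logic are assumptions, not the team’s published design): in autonomous mode the planner decides everything, while in hybrid mode the user’s command is followed unless the sensors flag it as unsafe.

```python
def choose_command(mode, user_cmd, planner_cmd, blocked):
    """Pick the command actually sent to the wheelchair's motors.

    mode        -- "autonomous" or "hybrid"
    user_cmd    -- EEG-decoded command, e.g. "forward", "back", "left", "right"
    planner_cmd -- command suggested by the sensor-based planner
    blocked     -- set of user commands the sensors currently flag as unsafe
    """
    if mode == "autonomous":
        return planner_cmd          # the computer makes all the decisions
    if user_cmd in blocked:
        return planner_cmd          # hybrid mode: sensors override an unsafe choice
    return user_cmd                 # otherwise the user's intent wins
```

For example, a hybrid-mode user steering left into a table leg would be redirected to the planner’s suggestion, while a safe command passes through unchanged.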

“It will be up to the user to decide which mode they want,” Lin said. “Concentrating is hard work, so the person in the wheelchair may want to do a combination of both, starting out with the hybrid mode as he or she makes their way through a crowd down a corridor to class. But once they are on a clear path, such as strolling through the park with friends, they may decide to switch over to the autonomous mode so they can relax and enjoy the company of the people they are with and the walk.”

Lin said training people to use the wheelchair takes only a few days, during which the onboard computer’s software learns to adapt to the EEG headset’s wearer.

“Each person is unique, but the software is designed to gather data and continue to adapt to the person who is using the wheelchair,” he said. “The machine is always learning.”
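One simple way a system could “continue to adapt to the person who is using the wheelchair” is an online nearest-centroid decoder, where each confirmed brainwave sample nudges that command’s stored template toward the new reading. This is a generic stand-in for whatever learning the real software uses; the class, its learning rate, and the feature vectors are all assumptions for illustration:

```python
class AdaptiveDecoder:
    """Nearest-centroid EEG command decoder that keeps learning.

    Each confirmed (command, features) pair moves that command's
    centroid a fraction `rate` of the way toward the new sample,
    so the decoder slowly tracks its individual wearer.
    """
    def __init__(self, rate=0.2):
        self.rate = rate
        self.centroids = {}                 # command -> feature vector

    def update(self, command, features):
        c = self.centroids.get(command)
        if c is None:
            self.centroids[command] = list(features)
        else:
            self.centroids[command] = [
                (1 - self.rate) * ci + self.rate * fi
                for ci, fi in zip(c, features)
            ]

    def classify(self, features):
        def sq_dist(c):
            return sum((ci - fi) ** 2 for ci, fi in zip(c, features))
        return min(self.centroids, key=lambda cmd: sq_dist(self.centroids[cmd]))
```

After a few training samples per command, new readings are classified by whichever centroid they fall closest to, and every confirmed classification refines that centroid further, so "the machine is always learning."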

Lin said he and his team hope to complete the project within 10 months. At that time, they expect to deliver a “fully intelligent” wheelchair that will be assessed by the district office of the State Department of Rehabilitation. Eventually, they would like to see the wheelchairs mass-produced and available to those who need them.

For a demonstration of the prototype, see the video below:
