Story by Lee Siegel

If drivers are yakking on cell phones and don’t hear spoken instructions to turn left or right from a passenger or navigation system, they can still get directions from devices mounted on the steering wheel that gently pull the skin of their index fingertips left or right, a University of Utah study found.

The researchers say they don’t want their results to encourage dangerous and distracted driving by cell phone users. Instead, they hope the study will point to new touch-based directional devices to help motorists and hearing-impaired people drive more safely. The same technology also could help blind pedestrians with a cane that provides directional cues to the person’s thumb.

“It has the potential of being a safer way of doing what’s already being done – delivering information that people are already getting with in-car GPS navigation systems,” says the study’s lead author, William Provancher, an assistant professor of mechanical engineering at the University of Utah.

In addition, Provancher says he is “starting to meet with the Utah Division of Services for the Blind and Visually Impaired to better understand how our technology could help those with vision impairments. It could be used in a walking cane for the blind,” with a moving button on the handle providing tactile navigation cues to help the person walk to the corner market, for example.

The system also could help hearing-impaired people get navigation information through their fingertips if they cannot hear a system’s computerized voice, says University of Utah psychology Ph.D. student Nate Medeiros-Ward, the study’s first author. “We are not saying people should drive and talk on a cell phone and that tactile [touch] navigation cues will keep you out of trouble.”

Medeiros-Ward is scheduled to present the findings Tuesday, Sept. 28 in San Francisco during the Human Factors and Ergonomics Society’s 54th annual meeting.

The study “doesn’t mean it’s safe to drive and talk on the cell phone,” says co-author David Strayer, a professor of psychology at the University of Utah. “It was a test to show that even in situations where you are distracted by a cell phone, we can still communicate directional information to the driver via the fingertips even though they are ‘blind’ to everything else.”

Provancher, Medeiros-Ward and Strayer conducted the study with Joel Cooper, who earned his psychology Ph.D. at the University of Utah and now works in Texas, and Andrew Doxon, a Utah doctoral student in mechanical engineering. The research was funded by the National Science Foundation and the University of Utah.

‘Channels’ Carry Information to the Brain

Provancher says the study was based on a “multiple resource model” of how people process information, in which resources are senses such as vision, hearing and touch that provide information to the brain.

“You can only process so much,” he says. “The theory is that if you provide information through different channels, you can provide more total information. Our sense of touch is currently an unexplored means of communication in the car.”

But does humanity really need yet another way to provide information to drivers who already are blabbering on cell phones, texting, changing CDs or radio stations, looking at or listening to navigation devices and screaming kids – not to mention trying to watch and listen to road conditions?

“The point is, it will help everybody,” Provancher says. “We all have visual and audio distractions when driving. Having the steering wheel communicate with you through your fingertips provides more reliable navigation information to the driver.”

Provancher says motorists already get some feedback through touch: vibration from missing a gear while shifting or a shimmying steering wheel due to tire problems.

“You can’t look at two things at the same time,” says Strayer. “You can’t look at a graphic display of where you should go and look out the windshield. It [touch-based information] is a nicer way to communicate with the driver without interfering with the basic information they typically need to drive safely. They need to look out the window to drive safely. They need to listen to the noise of traffic – sirens, horns and other vehicles. This tactile device provides information to the driver without taking their attention away from seeing and hearing information they need to be a safe driver.”

The new study says automakers already use some tactile systems to warn of lane departures by drowsy drivers and to monitor blind spots. But those devices generally twist the steering wheel themselves (assisted steering) rather than simply prompting the driver to steer.

Drivers on Cell Phones Often Don’t Hear Directions, but Can Feel Them

The study was conducted on a driving simulator that Strayer has used to demonstrate the hazards of driving while talking or texting on a cell phone. Two of Provancher’s devices to convey information by touch were attached to the simulator’s steering wheel so one came in contact with the index finger on each of the driver’s hands.

During driving, each index fingertip rested on a red TrackPoint cap from an IBM ThinkPad computer – those little things that look like the eraser on the end of a pencil. When the drivers were supposed to turn left, the two touch devices gently stretched the skin of the fingertips to the left (counterclockwise); when a right turn was directed, they tugged the skin of the fingertips to the right (clockwise).
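For readers curious how such a cue might be generated in software, here is a minimal, hypothetical sketch of the mapping described above – a lane-change instruction drives both fingertip actuators in the same rotational direction. The function names and the actuator interface are illustrative assumptions, not the research team’s actual code.

```python
# Hypothetical sketch of the tactile cue mapping described above.
# Names and the actuator API are illustrative, not from the Utah study.

from enum import Enum


class LaneChange(Enum):
    LEFT = "left"    # skin stretched to the left (counterclockwise)
    RIGHT = "right"  # skin stretched to the right (clockwise)


def cue_direction(instruction: LaneChange) -> int:
    """Return -1 for a leftward (counterclockwise) stretch, +1 for a rightward (clockwise) one."""
    return 1 if instruction is LaneChange.RIGHT else -1


def send_tactile_cue(instruction: LaneChange, actuators) -> None:
    """Drive both steering-wheel actuators the same way so each index
    fingertip feels its skin stretched toward the intended lane."""
    direction = cue_direction(instruction)
    for actuator in actuators:       # one actuator under each index finger
        actuator.stretch(direction)  # hypothetical device call
```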

Nineteen University of Utah undergraduate students – six women and 13 men – participated in the study by driving the simulator. The screens that surround the driver’s seat on three sides displayed a scene in which the driver was in the center lane of three straight freeway lanes, with no other traffic.

Four driving scenarios were used, each lasting six minutes and including, in random order, 12 cues to the driver to move to the right lane and 12 more to move left.

In two scenarios, the simulator drivers did not talk on cell phones and received directions either from the simulator’s computer voice or via the fingertip devices on the steering wheel. In the other two scenarios, the drivers talked on cell phones with a person in the laboratory and received the same kinds of directions, either from the computer voice or from the touch devices on the steering wheel.
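As a rough illustration of that two-by-two design, the sketch below simply enumerates the four scenarios; the condition labels are paraphrased from this news release, not taken from the study’s materials.

```python
# Hypothetical enumeration of the four within-subjects scenarios described above.
# Condition labels are paraphrased from the news release, not the study's materials.
from itertools import product

conversation_conditions = ["no cell phone", "talking on a cell phone"]
cue_modalities = ["computer voice", "fingertip (tactile) cue"]

# Each six-minute scenario delivered 12 left and 12 right lane-change cues in random order.
scenarios = [
    {"conversation": c, "directions_via": m, "lane_change_cues": 24}
    for c, m in product(conversation_conditions, cue_modalities)
]

for i, s in enumerate(scenarios, start=1):
    print(f"Scenario {i}: {s['conversation']}, directions via {s['directions_via']}, "
          f"{s['lane_change_cues']} cues")
```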

Each participant completed all four scenarios. The results:

In the two scenarios without cell phones, the drivers’ accuracy in correctly moving left or right was nearly identical for those who received tactile directions through their fingertips (97.2 percent) or by computerized voice (97.6 percent).

That changed when the drivers talked on cell phones while operating the simulator. When drivers received fingertip navigation directions while talking, they were accurate 98 percent of the time, but when they received audio cues to turn right or left while talking on a cell phone, they changed lanes correctly only 74 percent of the time.

Strayer says the findings shouldn’t be used to encourage cell phone use while driving because even if giving drivers directional information by touch works, “it’s not going to help you with the other things you need to do while driving – watching out for pedestrians, noticing traffic lights, all the things you need to pay attention to.”

A Touch of Product Development?

Provancher has patents and wants to commercialize his tactile feedback devices for steering wheels and other potential uses.

“If we were approached by an interested automaker, it could be in their production cars in three to five years,” he says, noting he already has had preliminary talks with three automakers and a European original equipment manufacturer.

In addition to possible devices for the vision- and hearing-impaired, Provancher says the technology could be used in a handheld device to let people feel fingertip-stretch pulses – rather than hear clicks – as they scroll through an iPod music playlist. He also says it might be used as a new way to interact with an MP3 music player in a vehicle, or to control games.

Provancher set the stage for the tactile navigation devices in two research papers this year in the journal Transactions on Haptics, published by the Institute of Electrical and Electronics Engineers. Haptics is to the sense of touch what optics is to vision.

In one of those studies, Provancher tested a haptic device that stretched the fingertip skin in four horizontal directions (right, left, front, back) and found that relatively faster and larger (one twenty-fifth of an inch) movements conveyed direction information most accurately.

In that study, Provancher also mentioned other possible uses for such devices, including allowing command centers to direct emergency responders and urban soldiers to incident locations, or directing air traffic controllers’ attention to important information on a computer screen.


For more information on Provancher’s work on conveying information by touch, see:
http://heml.eng.utah.edu/index.php/Haptics/ShearFeedback

This news release and illustrations are available at:
http://www.unews.utah.edu/p/?r=082310-2