Is Google Glass Dangerous?

News about Google Glass is everywhere these days, and so are its critics.

Some charge it only with fashion crimes. Others worry about invasion of privacy: when out on a date with a Glass wearer, you won’t know if they are recording you — or Googling “seduction tips,” for that matter.

Nonetheless, most agree that a smartphone-linked display and camera placed in the corner of your vision is intriguing and potentially revolutionary — and like us, they want to try it. But Glass may inadvertently disrupt a crucial cognitive capacity, with potentially dangerous consequences.

In an impromptu TED talk and interview in March, Sergey Brin, one of Google’s founders, described a motivation for the new product. “We questioned whether you should be walking around looking down” at a smartphone, he said. Instead, the company’s designers asked, “Can we make something that frees your hands” and “frees your eyes”?

Google isn’t the only company selling a technology that makes it easier to use your phone while you do other things. Last month Chevrolet released a commercial touting “eyes-free and hands-free integration” with the iPhone’s Siri interface, showing a woman checking her text messages using voice commands while she drives in circles.

To their credit, Google’s designers have recognized the distraction caused by grabbing someone’s attention with a sudden visual change. Mr. Brin explained that Glass doesn’t flash an alert in its users’ visual field when a new text message arrives. Instead, it plays a sound and requires them to look up to activate the display.

The “eyes-free” goal addresses an obvious limitation of the human brain: we can’t look away from where we’re heading for more than a few seconds without losing our bearings. And time spent looking at a cellphone is time spent oblivious to the world, as shown in the viral videos of distracted phone users who stumble into shopping-mall fountains.

Most people intuitively grasp the “two-second rule.” When driving, for example, we glance only briefly at the radio or speedometer. But some distractions overwhelm this intuition.

Researchers at the Virginia Tech Transportation Institute outfitted cars and trucks with cameras and sensors to monitor real-world driving behavior. When drivers were texting, they took their eyes off the road for as much as 4.6 seconds out of a 6-second interval. In effect, people lose track of time when texting, leading them to look at their phones far longer than they know they should. Two-way communication is especially engaging, and time flies when we are reading and typing.

Heads-up displays like Google Glass, and voice interfaces like Siri, seem like ideal solutions, letting you interact with your smartphone while staying alert to your surroundings. If your gaze remains directed at the world, then presumably anything important that happens in your field of vision will capture your attention and enter your awareness, letting you respond to it quickly.

The problem is that looking is not the same as seeing, and people make wrong assumptions about what will grab their attention.

According to the results of two representative national surveys we conducted, about 70 percent of Americans believe that “people will notice when something unexpected enters their field of view, even when they’re paying attention to something else.”

Yet experiments that we and others have conducted show that people often fail to notice something as obvious as a person in a gorilla suit when their attention is devoted to something else. Researchers using eye-tracking devices found that people can miss the gorilla even when they look right at it. This phenomenon of “inattentional blindness” shows that what we see depends not just on where we look but also on how we focus our attention.

If you think the situation would improve if the computer display appeared superimposed on the world itself, think again. Perception requires both your eyes and your mind, and if your mind is engaged, you can fail to see something that would otherwise be utterly obvious.

Research with commercial airline pilots suggests that displaying instrument readings directly on the windshield can make pilots less aware of their surroundings, even leading to crashes in simulated landings.

Google Glass may allow users to do amazing things, but it does not abolish the limits on the human ability to pay attention. Intuitions about attention lead to wrong assumptions about what we’re likely to see; we are especially unaware of how completely our attention can be absorbed by the continual availability of compelling and useful information. Only by understanding the science of attention and the limits of the human mind and brain can we design new interfaces that are both revolutionary and safe.

Daniel J. Simons is a professor of psychology and advertising at the University of Illinois. Christopher F. Chabris is a professor of psychology at Union College. They are the authors of The Invisible Gorilla: How Our Intuitions Deceive Us.
