Unit 8, Listening 1, Automation and Us
Host: Now as we heard earlier from Andrew McAfee,
our increasingly automated world is cause for concern and planning, especially when
it comes to the future of employment. Or . . . automation could give us what we
always wanted: a life free from the drudgery[1]
of work. Right?
Nicholas Carr: There
have been studies that show that we think we don’t like to work, so when people
are at work and you ask them what they’d rather be doing, they say, “Well, I’d rather
be at leisure[2], or idle.”
Host: Mm-hmm.
Carr: And yet, when you actually study their
feelings, you see that when people are at work, they tend to be happier, more satisfied,
more fulfilled than when they’re not
at work.
Host: That’s Nicholas Carr, author of the new
book The Glass Cage: Automation and Us.
He’s thought a lot about humans and work.
Carr: So we have this situation that psychologists
refer to as “mis-wanting”: that we think we want to be freed from labor[3], but
actually it’s when we’re working hard and facing challenges and exercising our talents
that we feel most fulfilled and most satisfied with life.
Host: Part of your argument is that our overreliance[4] on expert
systems and artificial intelligence[5] is actually
having a negative impact on performance, for instance with doctors. So what are
some of the problems with doctors relying on automation?
Carr: There’s been an assumption that you can
just bring computers into the medical profession—for instance, replacing paper record-keeping
with electronic record-keeping—and that
would make things more efficient, allow doctors to share patient records more quickly,
but it wouldn’t really change the way doctors practice medicine. And what we’re
finding, in fact, is that there are all sorts of unanticipated[6], very
subtle changes that happen when, for
instance, primary care physicians start bringing computers or tablets into the examination
room. When you have a doctor with a computer, the doctor will tend to order more
diagnostic tests—many of which turn out to be unnecessary. Everybody
thought, oh, if you have a tablet, that you can call up previous blood tests or
X-rays, then doctors wouldn’t be so quick to order new ones, and in fact, it seems
to work exactly the opposite way: that doctors think, well, if I order a test, it
will be really easy to get it on my tablet, so I’ll just go ahead and order tests.
And as anybody who has gone in for a physical or an exam and had their doctor use
a computer—which is now more and more common—certainly knows, the doctor
spends a lot of time, in general, looking at the computer screen, and this too comes
out in the research that’s been done, that when you’re in an exam room with a doctor
with a computer, the doctor spends about 25 to 50 percent of the time looking at
the screen. And that leaves both patients and doctors a little uncomfortable. They’re
both aware that there isn’t as intimate
a connection being made between doctor and patient as there used to be when the
doctor gave you his or her full attention.
Host: Right. You know, one of my personal obsessions
is the role of the body and how we understand the world around us and how we behave
intelligently. So what’s your sense of the role of the body and what’s at risk with
increased automation?
Carr: If there’s, you know, one overarching[7] kind
of philosophical concern I have, it’s that we’re trading deep engagement with the world—an engagement
that includes our mind, but also our body—for a more generic form of simply doing things through
computer screens. So if you look at pilots, for instance, there’s a job that required,
at least in the past, you know, a great deal of psychomotor[8]
skill: thinking but also acting with your body, coordinating all that. Now, more
and more the flying of a plane—the actual physical flying of a plane—is done by
autopilot systems and other automated systems. The pilot’s job becomes one of looking
at screens and entering data into computers. Architects, the same thing; lawyers,
more and more that’s happening . . . and in our personal lives too, a lot of our
time now is devoted to looking at screens and being kind of data-entry clerks for
our own lives. And so what you see is this kind of uniformity in activity, uniformity in skill and talent, where you used
to have this great deal of diversity,
and you used to have people building these very subtle talents that involved engaging
with the world in all sorts of different ways, and it could be, you know, the early
stages of a great loss in the diversity of human activity, human thought, human
talent as all of us come to spend more and more time just interacting with a computer
and a computer screen.
Host: But on the other hand, isn’t flying, for
instance, safer than ever before, thanks to automation?
Carr: It’s certainly much, much safer. The whole
history of flight over the last hundred years has been one of, you know, steady
increases in safety, and I think it’s pretty clear that autopilots and other automation
systems have played an important role in that—not the only role. What the aviation
experience tells us is that some degree of automation may be good, and then if you
push it even further, it may start turning bad. So we’ve had this history of automation
helping to increase flight safety, but in recent years the computer has kind
of taken over more and more of the job of the pilot—you know, on an average
flight these days the pilot is actually in manual
control of the plane for only about three minutes. And what happens then is that
they start to lose situational[9] awareness.
They fall victim to what researchers call “automation complacency[10].” And
then if something happens—if the autopilot system breaks down or they run into some
unusual weather and are suddenly forced to retake manual control of the plane—because
their skills have gotten rusty[11], because
they’ve lost the awareness of what’s going on around them, they start making mistakes.
And unfortunately, you know, there have been some catastrophic plane crashes in
recent years that are associated with this kind of dependency on automation. What the airline example tells us is not that
automation is bad but that more automation is not necessarily good and also that the way you design these systems is incredibly
important. . . .
[1] drudgery: noun hard, boring work
[2] at leisure: idiom with no particular activities; free
[3] labor: noun hard physical work
[4] overreliance: noun relying too much on something
[5] artificial intelligence: noun an area of study concerned with making
computers copy intelligent human behavior
[6] unanticipated: adjective that you have not expected or predicted
[7] overarching: adjective very important, because it includes or influences many things
[8] psychomotor: adjective relating to both the
brain and the movement of the body that is produced by muscles; connected with the
nerves that control movement
[9] situational: adjective depending on the circumstances and things that are happening
at a particular time and in a particular place
[10] complacency: noun a feeling of satisfaction with yourself or with a situation, so
that you do not think any change is necessary
[11] rusty: adjective not as good as it used to be, because you have not been practicing