The most effective way to find and destroy a land mine is to step on it.
This has bad results, of course, if you're a human. But not so much if you're a robot and have as many legs as a centipede sticking out from your body. That's why Mark Tilden, a robotics physicist at the Los Alamos National Laboratory, built something like that. At the Yuma Test Grounds in Arizona, the autonomous robot, 5 feet long and modeled on a stick-insect, strutted out for a live-fire test and worked beautifully, he says. Every time it found a mine, blew it up and lost a limb, it picked itself up and readjusted to move forward on its remaining legs, continuing to clear a path through the minefield.
Finally it was down to one leg. Still, it pulled itself forward. Tilden was ecstatic. The machine was working splendidly.
The human in command of the exercise, however -- an Army colonel -- blew a fuse.
The colonel ordered the test stopped.
"Why?" asked Tilden. "What's wrong?"
The colonel just could not stand the pathos of watching the burned, scarred and crippled machine drag itself forward on its last leg.
This test, he charged, was inhumane.
I think I found this story so funny because I can see the conflict between two personality types: the engineer, who happily watches his machine perform its intended function, and the leader, who identifies emotionally with the people and tools under his command. I was laughing at the confusion of the engineer who built a flawless machine but failed to consider the human factors in its design.
The article is interesting throughout, as is the one that pointed me to it:
They're designed to do a job, and they're designed to be able to interact with people to the extent that it facilitates their ability to do that job, but service robots are really not programmed to be your pet, your best friend, or a member of your family.
Whether it's in their programming or not is, to some extent, beside the point, since it happens anyway. And when it happens, it dramatically changes the way people interact with what is, at bottom, intended to be little more than a tool. Realizing this, a team from Delft University of Technology and Philips Research in the Netherlands decided to take a look at how people actually want their robot vacuums to behave, and what kinds of personalities they'd like them to display.