TED 2011 Q&A: How Technology Bites Us

LONG BEACH, California -- Technological advances and other inventions have the ability to change lives and alter the course of history. But they also sometimes take revenge on us or have other unintended consequences that can undermine their reason for existing, according to technology historian Edward Tenner.

Take the Barcalounger: touted as a treat for your feet, it was supposed to provide health-inducing relaxation for hardworking stiffs. Instead, it became a symbol of an unhealthy lifestyle and obesity. Asbestos, a life-saving fire retardant, became a toxic time bomb.

Tenner, the author of Why Things Bite Back: Technology and the Revenge of Unintended Consequences and Our Own Devices, is speaking at the Technology, Entertainment and Design conference Thursday about the unintended consequences -- negative and positive -- of innovation. He talked with Wired.com in advance of his presentation.

Wired.com: You’ve written a lot about the negative consequences of technologies and innovations. But doesn't every solution and innovation have some Mr. Hyde DNA in it? You can't foresee every potential negative outcome, and even if you could, don't you sometimes just have to proceed anyway for the sake of innovation?

Edward Tenner: Yes, and trying to avoid all unintended consequences also has unintended consequences. Where I think we haven't been as skillful as we should be is in recognizing them early, cutting our losses when they occur, and using our imagination better.

The purpose of Why Things Bite Back and Our Own Devices is really to use history to help people develop their imaginations in thinking about consequences -- not to avoid anything new, but to not make premature and irreversible commitments.

Wired.com: It seems that some of the adverse effects or consequences you mention in your books stem not from the technology or innovation itself, but from a lack of moderation in adopting it. We fail to anticipate how explosively popular it will become, and its overuse leads to negative consequences.

ET: That's a significant point. I'd really have to look at that case-by-case. But to give an example [where we did anticipate bad consequences], the possibility of fossil fuels accelerating climate change was recognized [as early as] the 19th century. Nuclear power was originally promoted partly because people were aware that there really was an issue of greenhouse gasses and climate change.

But there were people arguing the other side, as well. The Utopian socialist Charles Fourier was very happy that human activity would make the Earth warmer. He thought everything would be more mellow; predators would stop eating other animals. He thought a warming Earth would be a paradise on Earth. So even when people foresee what's going to happen, they don't really understand how complicated it is.

I do think that in many of the cases that I've discussed, people, with the knowledge that they had, could have done a much better job of anticipating their problem. I'm talking about things like antibiotic resistance, which was recognized very early [as a possible consequence of the overuse of antibiotics]. Yet the medical profession did not fully educate its members and the public about the importance of limiting the use of antibiotics to where they were really needed.

Wired.com: How do we become better at assessing risk and consequences?

ET: One technique that's often used in assessing risk and consequences is phasing in [something] on an experimental basis and observing. Some of the examples [where] plant introductions and animal introductions became pests could probably have been avoided if there was more caution and observation early in the process.

There's also another case that Henry Petroski has discussed in several books. There's a cycle [in which] engineers take a new principle and develop it, and it becomes bolder and bolder until it reaches a point where it fails. The cycle for that is 30 years.... Petroski says if engineers study this cycle, then they are better able to recognize in developing their new designs when they might be reaching a point like that. He's talking primarily about bridges, but it applies to other things. … And his idea is that by looking at the history of these cycles, it's possible to develop a better intuition of when a design is reaching that point, and to then test it more rigorously.

On the other hand, our ability to innovate goes faster than our ability to model how things will behave over time. A New York Times story talks about a dam in California that is now considered to be in a dangerous condition. It was really state-of-the-art over 50 years ago when it was built, and these were the best engineers at the time. And yet there are things about the interaction of the soil [that have now made the dam dangerous] that are discovered only in the course of decades. That is an inevitable feature of innovation. All I think you can do is try to recognize these things as quickly as you can and address them earlier, because they may be cheaper to fix earlier than later.

Wired.com: In the new book you're working on, you look not at negative consequences, but at unintended positive consequences from circumstances and things that would ordinarily be considered negative. What are some examples?

ET: One of the really interesting effects that I discovered was something called the Teddy Roosevelt effect, which appeared in many newspapers in the 1970s. It was about how astronaut candidates who had suffered a serious illness or injury, where they were in bed for a prolonged period as young people, were actually better performers than average. It wasn't only that they had overcome their illness or injury, but somehow that process had benefited them.

And there's a growing literature on the positive consequences of the right amount of adversity. This kind of resilience is something that has intrigued me. I gave a talk called Schmentoring [about how] a negative boss or supervisor or teacher can stimulate people to do their best work just to spite them.... Some people, when they're confronted with that, have a really resilient response. I have no explanation for it.

You can have two kids grow up in the same family with the same parents, the same circumstances, and one will persevere and the other will be in trouble. It is an interesting question whether that is some kind of innate disposition or whether it's something that can be learned. It's an effect that really needs to be studied more.

Wired.com: What about examples in technology or innovations where something bad led to something good?

ET: Antibiotics are a good case of how bad events can help good causes, because it was really the pressure of wartime that got the U.S. government to encourage the big pharmaceutical companies to scale up the production of penicillin, which had been discovered in 1928 but had been very difficult to produce in quantities that were medically useful. The scaling up of penicillin has saved more lives than the first nuclear weapons took.

Wired.com: You talk about how crises, such as the Great Depression, can give birth to great advances, because misfortune has the power to push people out of staid positions. Are we sometimes too quick today to want to move out of such discomfort zones, thereby short-circuiting potentially fertile circumstances?

ET: A lot of companies aren't really as willing to take risks on really bold products as some of them were during the Depression. One favorite example is the development of synthetic detergent, which became Tide, by Procter & Gamble. In spite of the Depression, Procter & Gamble spent six or seven years turning that into an economically viable product.

So, one good thing that happened during the Depression is that some companies were able to use the times to good advantage. They may have cut back on their staff, but they really focused on some key projects.

I think one difference between the spirit of a hundred years ago and now is that people were much more ready to plunge into things, and in a way they were almost better off because they didn't spend too much time pondering all the things that could and would go wrong.

I wrote an essay for the NEH’s Humanities magazine comparing the Panama Canal with the space program, because both of them required solving open-ended problems that had really seemed extremely difficult.…

There's a principle in development economics, formulated by Albert O. Hirschman, called the "hiding hand." People often start things without really realizing how difficult they're going to be, but once they're committed to them, human ingenuity discovers ways to make them succeed. Not only the Panama Canal and the space program were like that, but also the development of jet engines.

If you look at the technical problems that had to be solved to make a usable jet engine, if it hadn't been for the urgency of the Cold War, it isn't clear whether that kind of research would have been adequately supported.

You need a certain amount of inspired irrationality. It can be irrational to be too rational. Sometimes you need to take a leap; you have to trust your instinct.