Tuesday, March 8, 2011

Didn't James Cameron Warn Us About This?

There was quite a bit of buzz when the IBM supercomputer “Watson” faced off against former “Jeopardy!” champions Ken Jennings and Brad Rutter and won handily. Watson, which has been designed to solve complex logical problems at a dizzying rate of speed, is in many ways the epitome of what technology can do in support of society. Some have hypothesized that Watson, with its ability to analyze and solve for an almost infinite number of variables and dependencies using copious amounts of data, could even develop solutions for problems such as health care policy and the Palestinian conflict.

But there are some who have misgivings about this next big step in technological advancement. What are some of the unintended consequences of developing technologies which will not simply answer questions at a speed much faster than humanly possible, but have the ability to determine the “right” question to be asked, and solve problems better than humans can? Will there be a point where we will outsource the adjudication of justice and the negotiations of diplomacy to a computer which will never be “tainted” with bias or emotion, armed only with cold and quantifiable logic and rationality?

And of course the fear lingers of a technology which ultimately becomes “intelligent” to the point of self-awareness, which comes to the realization that the best course of action is to not be controlled by the finite judgment of humans. This is akin to something eloquently articulated in that great philosophical flick, James Cameron’s “Terminator 2: Judgment Day” (movies such as “I, Robot” and “Eagle Eye” have also touched on this topic):

Terminator: The man directly responsible is Miles Bennett Dyson. In a few months, he will create a revolutionary type of microprocessor.

Sarah Connor: Go on. Then what?

Terminator: In three years, Cyberdyne will become the largest supplier of military computer systems. All stealth bombers are upgraded with Cyberdyne computers, becoming fully unmanned. Afterwards, they fly with a perfect operational record. The Skynet Funding Bill is passed. The system goes online on August 4th, 1997. Human decisions are removed from strategic defense. Skynet begins to learn at a geometric rate. It becomes self-aware 2:14 AM, Eastern time, August 29th. In a panic, they try to pull the plug.

Sarah: Skynet fights back.

Terminator: Yes. It launches its missiles against their targets in Russia.

John: Why attack Russia? Aren't they our friends now?

Terminator: Because Skynet knows that the Russian counterattack will eliminate its enemies over here.

Sarah: Jesus.

But of course, we're not pulling the plug. We're a society addicted to our technology, as I type this on my laptop connected via broadband to a blog which includes an algorithm in a widget which uses predictive analytics to offer banner advertising based on keywords in my entries and the past surfing history of my viewer. We're not stopping the progress of our handheld devices, which make our lives convenient by taking all the work out of things for us. Voice-to-text technology will soon become thought-to-text technology, and before long the only thing we'll need to do is think — because technology will do all the rest of the work for us... until we come to the conclusion that our technology actually thinks more effectively than we do.

And if Earth becomes uninhabitable due to strategic military technology-induced nuclear holocaust, soon we'll find ourselves needing to colonize other planets and using avatars to make contact with unfriendly native inhabitants.

What? James Cameron made a movie about that too?
