
Sunday, June 08, 2014

Will Our Children Be Able to Pass the Turing Test?

Back in 1950, computing pioneer Alan Turing suggested that a computer could be understood to be thinking if it could dupe 30 per cent of human interrogators in five-minute text conversations. In his paper 'Computing Machinery and Intelligence', he posed the idea that successfully imitating a real human being was the real test of sentience - at least sentience at the level of human beings.

For the first time, a Russian computer program has convinced 33% of a panel of judges at the Royal Society in London that it was, in fact, a 13-year-old Ukrainian boy named Eugene Goostman. The event is being hailed as a groundbreaking milestone in the development of artificial intelligence.
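Just to put the arithmetic behind the headlines in concrete terms, here is a toy sketch (in Python) of the pass/fail check implied by Turing's 30 per cent benchmark. The panel size and vote count below are made-up placeholders, not figures from the Royal Society event:

# Illustrative only: Turing's benchmark treats a program as "passing"
# if it fools at least 30% of its human interrogators in five-minute
# text conversations.
def passes_turing_benchmark(judges_fooled, total_judges, threshold=0.30):
    """Return True if the fraction of fooled judges meets the 30% bar."""
    return judges_fooled / total_judges >= threshold

# Placeholder numbers roughly matching the reported one-third result:
print(passes_turing_benchmark(judges_fooled=10, total_judges=30))  # True

In other words, fooling a third of a smallish panel clears the bar, which is part of why a result like this sounds more impressive than it is.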

I think not!

All it really proves is that the programmers were able to program a computer specifically to pass the Turing test. The program did, in fact, bamboozle at least a third of a panel of self-important old fuddy-duddies and convince them that there was a human kid on the other end who wasn't smart enough to be a real computer. The programmers admit they deliberately limited the program's knowledge base in order to simulate the gaps in knowledge a typical 13-year-old might have. So the success of the test was more about how well the programmers anticipated the panelists than it was about the sentience of their computer program. Actually, it's not surprising that they chose a 13-year-old boy for their persona. Everybody knows 13-year-old boys run primarily on their hormones rather than their intelligence. The abject servitude of pubescent males to 13-year-old females looks a great deal like the abject servitude of a machine to its human masters.

The successful attempt to design a computer to imitate a person is, to me, frighteningly like a mirror image of our own public school system's diligent efforts to teach (to program, really) our kids to pass minimum-skills tests like Texas' controversial STAAR* exam, which every student must pass in order to graduate from high school. Like the Russian code-makers, we may be reverse-engineering our kids, not toward sentience, but away from it. Instead of teaching our kids to think, we're increasingly teaching them to remember the "right" answers to test questions selected for them by a self-appointed group of people who consider themselves qualified to know what kids ought to think.

If our school system keeps this up long enough, we may find that our children soon won't be able to pass a Turing Test themselves. At least they won't be able to pass it for a generation or so - not until the kids who were programmed to be intellectual machines in their youth start administering the Turing Test themselves. At that point, sounding like a machine will sound "human" to the judges, who were themselves programmed to think that way as kids.


I recently met a Chinese exchange student who fled China to finish high school in America because she feared what this type of teaching was doing to her. She left, she told me, as a high school freshman, when she realized that all they were doing was teaching students to parrot back rote answers. The system, she said, discourages independent thought at any level.

One frightening thing occurs to me as our education system continues to re-invent itself in service to computers and databases. If we train up our children to think and act like computers now, then, when they are old, will our world be run by people who think like machines? If that happens, are we headed toward the Orwellian world portrayed in Apple's famous 1984 Macintosh Super Bowl commercial - the world of service to the machines that the Mac was supposed to save us from?

Don't get me wrong. I love computers. They are lovely tools - like a library in your pocket. I just don't want to be one.


Just one man's opinion.

© 2014 by Tom King

*The original TAAS test, the predecessor of the TAKS test and the current STAAR test, was, by the way, the brainchild of Texas computer-data tycoon H. Ross Perot, a man who made his fortune stuffing things into computer-managed databases. Which could explain a lot of things.
