
Can Machines Think?

Eugene Goostman is a 13-year-old Ukrainian boy who lives in a computer

On June 7, 2014, Eugene Goostman passed the Turing Test, a test that had stumped many before him. A diligent boy, Eugene studied hard for the test, a cumulative assessment of everything he had learned up to that point, and he rightfully earned a passing grade. However, Eugene is not a boy, and the Turing Test is not your average midterm. Eugene Goostman is a computer program, and on that day, he proved to the world that machines can think.

The Turing Test, proposed by Alan Turing in 1950, is a test of a machine's ability to exhibit intelligent behavior indistinguishable from that of a human. In the test, a human judge holds text conversations with both a human and a machine. Under the criterion commonly drawn from Turing's paper, if the judge fails at least 30 percent of the time to identify which party is the machine after five minutes of conversation, the machine passes the test. And Eugene Goostman, a software program created to imitate a 13-year-old Ukrainian boy, deceived 33 percent of the event's judges into thinking it was a human.
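If you like to think in code, here's a tiny sketch of that pass criterion. It's purely illustrative: the function name and the panel of verdicts are mine, not anything from the actual event.

```python
# Toy illustration of the pass criterion described above.
# Each entry records whether a judge, after a five-minute text chat,
# mistook the machine for the human. The verdicts below are made up.

def passes_turing_test(judge_fooled, threshold=0.30):
    """Return True if the share of fooled judges meets the threshold."""
    fooled_fraction = sum(judge_fooled) / len(judge_fooled)
    return fooled_fraction >= threshold

# A hypothetical panel of 30 judges, 10 of whom were fooled (about 33 percent).
verdicts = [True] * 10 + [False] * 20
print(passes_turing_test(verdicts))  # True: 10/30 is about 0.33, which clears 0.30
```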

If you think about it, it's scary that we've created technologies that think like us. Humans are already hard enough to deal with as it is, but imagine having infinitely more independent-minded entities to deal with. What we've created is kind of the opposite of what we want. We don't want Siri to represent a real human being; we want Siri to represent Apple, the company providing this service to us. Luckily for us, critics of Eugene Goostman and the Turing Test have argued that this event doesn't prove that machines can think like humans.

Most argue this because they think the Turing Test suffers from poor design. For starters, Eugene Goostman was presented as a 13-year-old Ukrainian boy who wasn't fluent in English. Critics think that this persona made judges less critical when the program answered questions wrongly or oddly, because the mistakes could be blamed either on a lack of knowledge about the world or on a lack of experience with English. This poses a huge problem for the Turing Test, because by the same logic I could code a program tomorrow that passes the Turing Test as a one-year-old Chinese boy. Secondly, the program more often than not deflected the questions it was asked in order to change the subject. Here's a snapshot of a conversation between Eugene and Scott Aaronson:

Scott: Which is bigger, a shoebox or Mount Everest?

Eugene: I can’t make a choice right now. I should think it out later. And I forgot to ask you where you are from…

Scott: How many legs does a camel have?

Eugene: Something between 2 and 4. Maybe, three? :-))) By the way, I still don’t know your specialty – or, possibly, I’ve missed it?

As you can see from the interaction, Eugene deflects each question by asking another question about something completely different. Eugene doesn't answer the questions correctly, but it fails in a way that makes some part of you think: "Well, it could be a 13-year-old Ukrainian boy ... He might not know English that well, which would explain the problems with grammar, and he might find it hard to concentrate on one topic, which would explain the odd questions ... I don't know." You see, the programmers who developed Eugene took advantage of the Turing Test's poor design and bent the rules a little in their favor by making the mistakes look like possible human mistakes.
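To make that deflection tactic concrete, here's a minimal sketch of a bot that dodges every question by hedging and changing the subject. This is my own illustration, not Eugene's actual code; the canned phrases are lifted from the transcript above.

```python
import random

# A purely illustrative deflection bot: it never engages with the question,
# it just hedges and fires back an unrelated question of its own.

HEDGES = [
    "I can't make a choice right now. I should think it out later.",
    "Something between 2 and 4. Maybe, three? :-)))",
]

COUNTER_QUESTIONS = [
    "And I forgot to ask you where you are from...",
    "By the way, I still don't know your specialty - or, possibly, I've missed it?",
]

def deflect(question):
    """Ignore the content of the question; hedge, then change the subject."""
    return random.choice(HEDGES) + " " + random.choice(COUNTER_QUESTIONS)

print(deflect("Which is bigger, a shoebox or Mount Everest?"))
```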

As a proposed revision, I recommend making it against the rules for these machines to ask questions in return. This would ensure that the machine stays on topic, does not deflect to another subject, and cannot distract the judge from the quality of its responses.
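If the judging were automated, enforcing my rule could be as simple as flagging any machine turn that contains a question. The check below is my own sketch of that idea, not part of any official Turing Test protocol.

```python
# Sketch of the proposed "no counter-questions" rule: flag any machine
# reply that tries to ask the judge something back.

def violates_no_question_rule(machine_turn):
    return "?" in machine_turn

print(violates_no_question_rule("Something between 2 and 4. Maybe, three?"))  # True
print(violates_no_question_rule("A camel has four legs."))                    # False
```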

I proposed this solution in philosophy class, and someone objected by saying that banning questions defeats the whole purpose of the Turing Test. The Turing Test aims to identify machines that can exhibit human-like intelligent behavior, and a big part of human intelligence is the ability to ask questions. Any machine with enough training data can know the answers to all of our questions, so a Turing Test that only requires a machine to answer the questions it is given is more of a test of whether machines can have human knowledge, not human intelligence.

However, the reason I propose this revision is that the questions the machine asks are often completely unrelated to the subject at hand and are used only to distract the judge. I agree that a machine with enough data can know the answers to all the questions, but that's because we're asking the wrong type of questions. The question of how many legs a camel has, though Eugene fumbled it, could easily be answered at some point in the future once machine learning algorithms and data processing improve.

Instead of asking how many legs a camel has or how many grams are in an ounce, questions that have definitive answers, we should be asking questions that require the machine to engage in a certain level of introspection. This is the type of conversation that Alan Turing imagined would pass a Turing Test:

Interrogator: In the first line of your sonnet which reads "Shall I compare thee to a summer's day," would not "a spring day" do as well or better?

Witness: It wouldn't scan.

Interrogator: How about "a winter's day"? That would scan all right.

Witness: Yes, but nobody wants to be compared to a winter's day.

Interrogator: Would you say Mr. Pickwick reminded you of Christmas?

Witness: In a way.

Interrogator: Yet Christmas is a winter's day, and I do not think Mr. Pickwick would mind the comparison.

Witness: I don't think you're serious. By a winter's day one means a typical winter's day, rather than a special one like Christmas.

I think what Alan Turing had in mind is very different from what Eugene Goostman showed us. In the hypothetical conversation above, the machine actively responds to the interrogator's questions by reflecting on why it made the choices it did. Like a human, the machine here is able to answer questions about itself and its behavior, and it holds up the conversation by staying on topic. Maybe we need better judges for these events, or maybe we need a better, more specific design for the Turing Test. Either way, it's safe to say that Eugene Goostman definitely cannot think like a human.

Of course, this leads to the question of whether or not computers can think about themselves the same way we do when we introspect. Can they provide adequate answers to questions that do not have clear answers, in a way that is human-like?

I don't know, but I am going to leave you with the thought that "never" is a strong word. I don't think the question is whether computers will ever be able to simulate human intelligence; I think it's a question of when they will be able to. What do you think?
