
University students fooled by robot TA

Imagine discovering someone you thought was human is, in fact, a robot. As CBC Radio technology columnist Dan Misener explains, that's what happened to a class full of Georgia Tech students recently, when they learned that their teaching assistant was actually a piece of software.

Artificial intelligence behind supercomputer Watson used to create TA for online AI course

The question was when to reveal the robot teaching assistant wasn't a real person. (lightpoet/Shutterstock)

Imagine discovering someone you thought was human is, in fact, a robot.

It sounds like the stuff of science fiction. But that's what happened to a class full of Georgia Tech students recently, when they learned that "Jill," their teaching assistant, was actually a piece of software.

CBC Radio technology columnist Dan Misener explains what happened.

Where did this 'robot teacher' come from?

The story starts with a computer science professor named Ashok Goel, who teaches at the Georgia Institute of Technology.

For the past few years, he's been teaching a graduate-level online course on artificial intelligence (AI). It's a popular class. About 300 students enrol in the course every semester, and it's run by Goel and eight teaching assistants.

Georgia Tech professor Ashok Goel (left) worked with a team to develop an artificial intelligence that acted as a teaching assistant for one of his online courses. (Georgia Institute of Technology)

About a year ago, he realized that his students were asking a lot of questions in the class's online forum: questions about assignments, due dates, what was going to be covered on any given week, things like that.

"These 300 students generatedsomething like 10,000 postings on the discussion forum," he said."That's like receiving100 emails every day."

Not only did the students ask a huge number of questions, but different students would ask the same question over and over. It was repetitive.

So Goel decided to create a robot teaching assistant that would answer these questions. He named her "Jill Watson," trained her using human teaching assistants, and then listed her as a teaching assistant on the syllabus.

But Goel's students only found out that Jill was an AI chatbot after the end of the term.

How did they keep students from finding out Jill was a robot?

Goel said the goal wasn't to keep the AI a secret. They always intended to reveal that Jill wasn't human. The question was when to make that reveal.

That said, Jill had a few things going for her that helped her avoid detection as a robot. First, this was an online course, so most of the interactions between students and teaching assistants happened through text, on an online discussion board. It's much easier to pass yourself off as human if you never have to meet face-to-face with another human.

Also, Jill wasn't overly formal. She used conversational language.

And finally, Goel and his assistants made sure Jill was good enough at answering questions before they unleashed her on the class's online forum. Jill was programmed to only respond to a small subset of questions. And she was programmed to only respond when she was at least 97 per cent certain she could answer correctly.
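To make that concrete, here's a minimal Python sketch of that kind of confidence-gated answering. Only the 97 per cent threshold and the idea of a small set of routine questions come from the article; the model, topic names and function are hypothetical illustrations, not Goel's actual system.

    CONFIDENCE_THRESHOLD = 0.97  # answer only when at least 97 per cent certain
    ROUTINE_TOPICS = {"due_dates", "assignment_format", "weekly_schedule"}  # hypothetical subset

    def maybe_answer(question, qa_model):
        """Return an answer for routine questions, or None to stay silent."""
        # qa_model is assumed to return a predicted topic, an answer and a confidence score.
        topic, answer, confidence = qa_model.predict(question)

        # Jill only handled a small subset of routine, repetitive questions.
        if topic not in ROUTINE_TOPICS:
            return None

        # And she only replied when she was very sure she'd get it right.
        if confidence < CONFIDENCE_THRESHOLD:
            return None

        return answer

Staying silent when unsure is what kept Jill from giving herself away with obviously wrong answers.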

How did Jill learn to answer questions?

Jill Watson is a machine, so they used an approach called "machine learning."

First, they had to train Jill. Goel and his assistants took all the questions that students had asked in previous semesters. These had already been answered by human TAs, so it was a pretty good set of training data.

Initially, Jill's performance wasn't very good. According to one of Goel's assistants, she'd get stuck on keywords in questions and give irrelevant answers. So they continued to train her on new student questions, and her performance gradually improved.
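The real system was built on IBM Watson, but the general idea, matching a new question against questions human TAs have already answered, can be illustrated with a much simpler stand-in. Here's a rough Python sketch using TF-IDF similarity (via scikit-learn); the example questions and answers are made up, and this is not how Watson itself works.

    # A rough stand-in for the idea of training on past Q&A pairs:
    # match each new question to the most similar previously answered one.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    # Hypothetical training data: questions from past semesters,
    # paired with the answers human TAs gave at the time.
    past_questions = [
        "When is assignment 1 due?",
        "Which chapters are covered in week 3?",
        "Can I submit the project as a PDF?",
    ]
    past_answers = [
        "Assignment 1 is due Sunday at 11:59 p.m.",
        "Week 3 covers chapters 4 and 5.",
        "Yes, PDF submissions are accepted.",
    ]

    vectorizer = TfidfVectorizer()
    question_vectors = vectorizer.fit_transform(past_questions)

    def answer(new_question, min_similarity=0.5):
        """Return the stored answer for the closest past question, if any."""
        new_vector = vectorizer.transform([new_question])
        similarities = cosine_similarity(new_vector, question_vectors)[0]
        best = similarities.argmax()
        if similarities[best] < min_similarity:
            return None  # nothing similar enough; leave it for a human TA
        return past_answers[best]

    print(answer("when is the first assignment due"))

Adding more answered questions each term is what lets a matcher like this improve over time, which is roughly the trajectory Goel describes.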

"By the end of the term, her performance had reached a level where we could let her loose into the discussion forum on her own," Goel said.

Jeopardy! contestants Ken Jennings, left, and Brad Rutter, far right, with IBM's artificial brain, Watson. The same technology which allowed Watson to become a Jeopardy! champion in 2011 has now been used to create a robot teaching assistant called Jill Watson. (Seth Wenig/Associated Press)

Jill's full name is Jill Watson, and that's not by accident. The machine learning system Ashok used was Watson, from IBM, the technology best known for beating humans at Jeopardy! It's well-suited to these types of question-and-answer tasks.

How else could this technology be used?

Beyond building robot TAs (and competing on Jeopardy! several years ago), Watson has been used in lots of different ways. We've seen it used in financial planning, for clinical research trials in medicine, and in computer security.

But I think what's most compelling about Jill Watson, the robot TA, is that it combines machine learning that gets better over time with natural language processing: the ability to pull meaning from the way humans speak or write naturally.

We've seen a huge amount of buzz recently about conversational chatbots, and the underlying technologies of Jill Watson are very similar to the kinds of customer service bots and sales bots that companies hope we'll start using on a regular basis.

The underlying technologies of Jill Watson are very similar to the kinds of chatbots that are becoming more and more popular. (iStockphoto)
The question, I think, when we interact with these types of robots is: will we know we're interacting with robots?

Could robots put teachers out of work?

Goel said a lot of the media coverage of Jill Watson has played up that angle: robot teaching assistants taking jobs from human TAs.

But he also pointed out a long list of things that human teachers do: acting as mentors, coaching, tutoring, motivating. These are things Jill Watson can't do, and may never be able to do.

Where humans cannot go, Jill can go. And what humans do not want to do, Jill can automate. - Ashok Goel, Georgia Tech professor

"We do not think of Jill, ever, in our lab as replacing any human," he said.

"Instead, we think of Jill as complementing humans, in such a way that where humans cannot go, Jill can go. And what humans do not want to do, Jill can automate."

For example, answering the same question about when an assignment's due, over and over and over.

Goel sees software like Jill as a way to free up humans to focus on what humans are best at. He's so confident in this approach that he's planning to run the program again next semester, although he plans to change Jill's name.

So if you're taking an online course anytime soon, and the instructor seems a bit "stiff," there may be a reason for that.