She has quite a brain, but this teaching assistant is just not human

Teaching assistants, part of a group studying the effectiveness of a basic artificial intelligence teaching assistant, participate in a roundtable discussion of the project at Georgia Tech. (HENRY TAYLOR / HENRY.TAYLOR@AJC.COM)
It began as an experiment, and it could change how college students learn.

Last year, students in Georgia Tech professor Ashok Goel’s online Master’s in Computer Science class were told the questions they asked after class would be answered online by a teaching assistant named Jill Watson.

What the students didn’t know is that Watson is not human, but a computer artificial intelligence program.

Georgia Tech is believed to be the only school in the nation using artificial intelligence in such a fashion. Goel and other Tech researchers are working on the next phase: virtual teaching assistants that can serve as tutors, both answering and asking questions.

Goel believes their work has immense potential.

“We get to build a future of our own imagination,” he said exuberantly.

Students in courses where the teaching assistants were artificial intelligence programs say they were surprised that it wasn’t a human answering their questions.

“These are answers I would expect to get from a regular teaching assistant,” said Duri Long, 22, a first-year doctoral student studying human-centered computing.

Jill and the other artificial intelligence teaching assistants are digital programs that answer questions from students online, somewhat like automated telephone operators, but they don’t speak.

Artificial intelligence — AI for short — is complex, compelling and, to some, disconcerting. An October 2016 White House report on artificial intelligence concluded that it could cost low-income Americans their jobs “and that there is a risk that AI-driven automation will increase the wage gap between less-educated and more educated workers, potentially increasing economic inequality.”

Susan Schneider, an associate professor of philosophy at the University of Connecticut, fears AI will encourage some universities to employ fewer faculty and graduate students, who teach to help pay for their own doctoral studies.

Schneider outlined the challenge of AI.

“How can we use technology in a way that benefits humanity, and minimize grave risks, such as the possibility of building AI systems that out-think us, and work against us,” she said in an interview.

Goel said he’s been inundated by educators and businesses interested in his work. Many learned about artificial intelligence from the 2001 film “AI,” about a boy made with the technology. Today, it’s used for medical diagnosis, advertising targeting, customer service and self-driving vehicles.

Goel and his team weren’t initially trying to start a philosophical discussion when they created Jill. They were trying to make it easier for him and his colleagues to teach. Goel said he often spent 20 hours a week answering similar questions in message posts from students after class.

In 2014, IBM gave Goel access to Watson, a computer system that could answer questions and became popular after its success on the television show “Jeopardy.” Goel wondered if a computer system could answer questions from students sent his way after class. Thus came the idea for Jill, a program using IBM’s technology.

In January 2016, the Jill Watson experiment began. Jill’s challenge was three-fold. She had to be able to answer many questions, get them right and make her responses sound human, not like a computer.

Jill was programmed to answer questions specific to the coursework and not about other subjects.

“Her performance was not very good,” Goel said.

Jill had trouble interpreting similar questions, he said. It took his team about three months to work out the kinks.

The students, to Goel’s delight, didn’t know Jill wasn’t human.

There are three AI teaching assistants this semester, each with a “human” name. The students don’t meet the teaching assistants until the end of the semester, and Goel isn’t telling his students which ones are artificial intelligence and which are human.

The students, he said, enjoy the mystery. They’re computer science students after all.

Master’s student Chris Cassion, 22, took a course last semester in knowledge-based artificial intelligence and said the answers to his questions about assignments and deadlines were always specific and helpful.

“I’ve never encountered an instance where the answer was completely out of left field,” he said.

Students believe AI is a good tool, but it shouldn’t replace professors.

Goel, too, is concerned about the impact of AI on the job market. He hopes that as it automates some professions, AI will also free humans to be more creative, and that society will support those who lose their jobs.

Is Goel worried artificial intelligence may take his job?

“Am I committing professional suicide?” he asked. “Not yet.”

Goel and his team are studying how AI teaching assistants affect student engagement and retention. The preliminary results show they have helped, he said. The team hasn’t yet researched the interaction’s impact on academic performance.

Goel hopes that if his team can develop AI tutors, the tutors will improve students’ critical thinking skills.

“That’s where true teaching begins,” he said.