Mr. Renick decided to test Pounce in 2016 as a chatbot walking students through tasks like turning in immunization forms and untangling financial aid issues, responding to particular complications along the way. Mass outreach, he said, “was not responsive to individual problems,” like getting information from an absent parent. The pilot was so successful — it reduced summer melt by 19 percent that first year — that they continued the practice and have more recently expanded the chatbot’s role.
This spring semester, it was offered to all undergraduates; 21,000 (of 25,000) students now text with Pounce. The key, said Mr. Renick, is that every message sent to a student is personalized, “pertinent to them and time sensitive.” Recently, he said, 54 percent of those receiving a payment reminder text from Pounce responded and did “what we needed them to do” within 12 hours. By contrast, he said, less than 20 percent typically open campus emails.
Chatbots are only as good as their databases. For payment reminders, Mr. Renick said, account information must “be updated within minutes of sending a text.” Certainly, chatbots are especially good at procedural, “box-checking” communications, said Lindsay Page, an associate professor of education at the University of Pittsburgh, who has done research on college access and persistence, including on Pounce and other chatbots.
In fact, among this generation of digital natives, some students “may feel more comfortable or safer” with a chatbot than engaging a real person, she said, “especially around issues that may be giving their families stress, like financial aid or paying for college.” What she sees coming next: How can A.I. anticipate students’ needs, including “building a system of proactive outreach” if, say, an early test does not go well?
Ashok Goel, professor of computer science in the school of interactive computing at the Georgia Institute of Technology in Atlanta, created a virtual teaching assistant, Jill Watson, who responds to questions about course information for classes, much like chatbots do for campuses. “Jill can answer questions about anything you put in the syllabus,” he said.
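In its simplest form, that kind of syllabus question-answering can be sketched as matching a student’s question against syllabus entries. The sketch below is a hypothetical simplification for illustration only; the actual Jill Watson system uses far more sophisticated methods than word overlap.

```python
# Minimal sketch of a syllabus Q&A bot: return the syllabus entry that
# shares the most words with the student's question. (Illustrative
# simplification; not how Jill Watson actually works.)

def tokenize(text):
    """Lowercase a sentence and split it into a set of words."""
    return set(text.lower().replace("?", "").split())

def answer(question, syllabus):
    """Return the syllabus entry with the largest word overlap."""
    q_words = tokenize(question)
    return max(syllabus, key=lambda entry: len(q_words & tokenize(entry)))

syllabus = [
    "The midterm exam is on October 14 in the main lecture hall.",
    "Office hours are Tuesdays from 2 to 4 p.m.",
    "Late homework loses 10 percent per day.",
]

print(answer("When is the midterm exam?", syllabus))
```

A real system would need to handle paraphrased questions, synonyms, and questions the syllabus does not cover, which is where the harder A.I. work lies.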
But Mr. Goel is also looking at what else A.I. could do, like working with sentiment analysis (which uses “the order of the words, the phrases of the words” to classify moods and personality types) to explore how virtual tools could respond to a human’s emotions.
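A toy example can show why word order matters to sentiment analysis, as Mr. Goel suggests. The lexicon and negation rule below are illustrative assumptions, not a description of his system; real sentiment models are learned from data rather than hand-written.

```python
# Toy sentiment classifier: word choice sets a score, and word order
# (here, a negator immediately before a word) can flip it.
# Lexicon and rule are illustrative assumptions only.

POSITIVE = {"good", "great", "helpful", "happy"}
NEGATIVE = {"bad", "confusing", "stressed", "sad"}
NEGATORS = {"not", "never", "no"}

def sentiment(text):
    """Score +1 per positive word and -1 per negative word,
    flipping the sign when a negator immediately precedes it."""
    words = text.lower().strip(".!").split()
    score = 0
    for i, word in enumerate(words):
        value = (word in POSITIVE) - (word in NEGATIVE)
        if value and i > 0 and words[i - 1] in NEGATORS:
            value = -value
        score += value
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(sentiment("The tutoring was helpful"))      # positive
print(sentiment("The tutoring was not helpful"))  # negative: order flips it
```

The same words in a different order produce a different label, which is the intuition behind using “the order of the words, the phrases of the words” to read mood.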