November 15, 2024

The expansion of chatbots in higher ed

Author: Lindsay McKenzie

More and more colleges are deploying virtual assistants or chatbots to communicate with students on all aspects of college life, creating a virtual “one-stop shop” for student queries.

Colleges initially deployed this technology only in specific areas, such as financial aid, IT services or the library. Now institutions are looking to deploy chatbots with much broader capabilities. For the companies that make the software behind these text- and voice-based conversations, the change in how campuses use it marks a significant shift.

This expansion happened naturally, said Mark McNasby, CEO and co-founder of chatbot company Ivy.ai. He noted that around 35 percent to 40 percent of the questions students ask a departmental chatbot are actually the domain of another department.

A student talking to an admissions bot might, for example, want to know about the career outcomes of a particular program. Rather than directing that student to a separate career services bot, institutions want integrated services where students “can ask any question, no matter the entry point,” McNasby said.

“Initially we were selling bots to one department, then three, then seven or eight departments. In the last three months, we’ve designed more bots for the whole institution,” McNasby said.

Institution-level bots may be on the rise, but many individual departments still want the option of designing a custom user interface, according to McNasby. Not only does the appearance of a specialized bot signal to students that it can answer department-specific questions, but it also allows administrators to filter and review which questions are directed to their department.

Ivy.ai started out as a tool for the careers office, then moved into adjacent areas such as academic advising, said McNasby. Colleges have requested bots for administrative functions such as HR and purchasing. And Ivy.ai is exploring new communication channels, including texts, Facebook Messenger and email.

McNasby said there are few places in the university ecosystem where chatbots can’t be deployed, even counseling services for students who may be dealing with serious mental health issues.

“We have 10 to 15 counseling bots,” he said, adding that providing such services requires a delicate touch — students are quickly referred to trained staff when they need help.

“We need to be mindful that when someone is in crisis, they need to speak to a live human or reach emergency services. Often we err on the conservative side.” 

Andrew Magliozzi, CEO of chatbot company AdmitHub, has seen a similar expansion. AdmitHub, as the name suggests, started with a focus on admissions but now touches all stages of the student’s college-going experience, from the first expression of interest in an institution to enrollment, retention and even alumni relations. AdmitHub’s chatbots cover 6,500 discrete topics, but students can still ask unexpected questions.

“The goal is not perfection,” he said. If the chatbot can’t help, it will direct the student to someone who can.

AdmitHub primarily communicates with students through text messages rather than a web interface. “Texting is the easiest way to reach students,” said Magliozzi. It also enables the university to proactively reach a student and remind them of deadlines to register for class, for instance.

“Students know that they aren’t texting with a person, and that gives them the freedom to ask whatever is on their mind,” said Magliozzi. “They might feel embarrassed to ask an admissions advisor whether there’s a Chipotle on campus or whether they can bring a dog to their dorm.”

Both McNasby and Magliozzi agree that personalization will drive the next generation of chatbots. Not only will chatbots be able to retrieve personal information, such as grades or account balances, for students who are logged in to the college system, but they will also talk to students in an individualized way, perhaps employing humor or references based on the interests of the user.

Personalization is an area where it would be easy to go too far, which is why most chatbots have so far limited personalization to functions such as addressing the student by name.

“You never want to seem creepy,” McNasby said.

Magliozzi said chatbots can do a lot more than most institutions are using them for, but he believes the technology can still be improved. He also said that chatbots won’t yield positive results, such as reducing summer melt, without good institutional organization.

“Chatbots are like connective tissue between services,” he said. “They are not a solution to all problems.”

Many companies offering higher ed chatbots say their services are not replacing administrative staff but instead helping them make better use of their time. “One of the key drivers for institutions to use chatbots is a desire to increase efficiency and cut costs,” said Patricia Velazquez Alamo, director of education and research industry marketing at Oracle Higher Education.

“The average cost of a call center call is around $5 across the board,” said Velazquez Alamo. “If you introduce chatbots, you can slash that price quite significantly.”

Oracle Higher Education offers a chatbot as part of its cloud services. Keith Rajecki, vice president of industry solutions, said university employees can use it to check their vacation balances and perform a variety of other administrative tasks. Students can use it to check grades or schedule appointments with professors or advisors at any time.

Oracle’s chatbot, which it describes as a digital assistant, was initially referred to internally as “Lucy,” much as Amazon’s digital assistant is named Alexa and Apple’s is Siri. But it was never the company’s intention for every institution to have an assistant called Lucy. “We wanted people to make it their own,” Rajecki said.

With so many customizable options available from vendors, few institutions have attempted to build their own chatbots in-house, particularly sophisticated ones powered by AI or machine learning. But there are some experiments underway.

Researchers at Stanford University have developed a “QuizBot” that they say is more effective than flashcards at helping students learn and retain information, and WGU Labs, a nonprofit founded by Western Governors University, is also exploring the technology.

Jason Levin, executive director of WGU Labs, has been working with AI researchers at Carnegie Mellon University to develop a tool that will help guide students into the right degree programs and courses for the careers they want to pursue.

“It’s a big decision to commit to a four-year degree,” said Levin. If someone wanted to pursue a career in digital marketing, for example, there are many considerations to take into account when advising the student: What are the skillsets related to that career? Which competencies does the student already have? What factors are important in helping them weigh that decision?

“We’ve done a lot of qualitative research on how students make career decisions, and we’re starting to analyze those results,” Levin explained. “Our process is going to involve a lot of user testing to help students make better decisions and ultimately save them money in the long run.”

Bryan Alexander, a futurist, writer, educator, and consultant, noted that chatbots have been around for a long time. The first chatbot, known as ELIZA, was created by Joseph Weizenbaum at the Massachusetts Institute of Technology between 1964 and 1966.

“AI and machine learning have come a long way since the 1960s, but many chatbots in higher ed are fairly limited in scope,” said Alexander. “Being able to ask what time the library closes instead of searching for it yourself is useful, but it isn’t terribly sophisticated,” he said.

Alexander is nonetheless curious about how far the technology can go, particularly how it might be used in the classroom.

“You can easily imagine practicing a foreign language with a chatbot, particularly if you’re at an institution where there isn’t enough student demand for languages such as Mandarin.”

It’s “kind of hard to get away” from the prospect that chatbots might be used to cut jobs, Alexander said. “Think about how many campuses are under enormous financial pressure. Vendors say that this technology doesn’t replace people, and they may well mean it. But they’re not the ones making that decision.”

Young people may well be more comfortable chatting to robots than the generation before them, “but this technology does create a sense of distance between the institution and the student,” he said. 

Alexander suggested the possibility that students’ familiarity with chatbots may play out along socioeconomic lines over time, with poorer students receiving less human interaction than their wealthier peers. Not using chatbots may become a “marker of taste and class” at some educational institutions.

The line between helpful, personalized information and creepiness is a difficult one to walk, Alexander said. Can chatbots mimic the language that students use without the user noticing? Should they give students information, such as where and when to find their professors when they are struggling in class?

“That might be nice, or it might be terrifying,” he said. A lot of testing will need to be done to establish where these lines should be drawn.

Alexander would like to see more institutions doing their own research into how students respond to chatbots.

“It would make a great senior testing project for undergrads,” he said.
