November 19, 2024
Machines can craft essays. How should writing be taught now?

Author: Susan D’Agostino

Image: A person writes on a clipboard with a pen; a robotic hand types on a laptop keyboard. (baona/iStock/Getty Images)

“It doesn’t feel like something I’d write, but it also doesn’t not feel like something I’d write,” a North Carolina State University student said about their work integrating prose from an artificial intelligence text-generating program into a final course essay. Paul Fyfe, associate professor of English and the student’s instructor in the Data and the Human course, had asked students to “cheat” in this way and then reflect on how the experiment tested or changed their ideas about writing, AI or humanness.

Humans have long relied on writing assistance powered by artificial intelligence to check spelling and grammar, predict text, translate or transcribe. Now, anyone with an internet connection can access an AI tool such as OpenAI’s GPT-3 or Moonbeam, give it a prompt and receive—in seconds—an essay written in humanlike prose.

Instructors who are concerned that students will use these tools to cheat may hold fast to in-class writing assessments or install surveillance tools to try to detect misconduct. But others argue those are fools’ errands. AI-generated prose is original, not copied from any existing source, which prevents plagiarism software from detecting it.

Besides, the breakneck pace of AI developments suggests that humans could never outrun it. To prepare students for workplaces in which AI writing tools will be ubiquitous, some faculty members are embracing the tech and reimagining teaching to help students learn to write prose that differs from what machines could produce.

“All language scholars and teachers will need to reckon with these applications, and very soon,” said Stephen Monroe, chair and assistant professor of writing and rhetoric at the University of Mississippi. “Silicon Valley is developing them feverishly.” For that reason, Monroe and his colleagues are “gingerly” experimenting with introducing the tools in their classrooms.

Concerns About Cheating

Michael Mindzak, assistant professor of education at Brock University, in Ontario, is concerned that students will use AI writing tools to commit academic misconduct. He suggests that instructors raise their awareness of the now-widespread tools, recognize that detection tools are limited and consider returning to “analog” solutions such as in-class essays.

“Of course, that is increasingly difficult, as a lot moved online during the pandemic,” Mindzak said.

Some instructors, informed by anecdotal evidence that pointed to mediocre results with obvious fabrications, suggest that AI writing assistance is “nothing to worry about.” But a study has shown that today’s AI writing tools can produce plausible, college-level essays that earn passing grades and receive feedback on par with those written by human students. And some students report earning top marks with the technology.

Since 1965, the number of transistors per silicon chip has doubled roughly every two years, an observation computer scientists dub “Moore’s Law.” Progress in artificial intelligence is not measured in transistors per chip, but researchers have nonetheless determined that the computational power used to train AI systems has grown far faster than even Moore’s Law, doubling roughly every six months.
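To make that gap concrete, here is a rough back-of-the-envelope sketch in Python. It is an illustration only, using the approximate doubling periods above rather than figures from the research itself:

    # Compare growth under two doubling periods (illustrative only).
    # Moore's Law: capability doubles roughly every 24 months.
    # AI training compute, per the research cited above: roughly every 6 months.
    def growth_factor(months, doubling_period_months):
        """Multiplicative growth after `months`, given a doubling period in months."""
        return 2 ** (months / doubling_period_months)

    five_years = 60  # months
    print(f"Moore's Law over five years: ~{growth_factor(five_years, 24):.0f}x")  # ~6x
    print(f"AI compute over five years: ~{growth_factor(five_years, 6):.0f}x")    # ~1024x

At those rates, five years of AI progress outstrips five years of Moore’s Law growth by more than a hundredfold.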

Pioneering computer scientists, including winners of computing’s equivalent of the Nobel Prize, acknowledge having trouble keeping up with the pace of AI advances, and they exhibit humility in the face of what AI is capable of achieving without human input. Said differently, AI-generated writing may be something to worry about.

“The essays that students turn in are about to get a lot better,” Ethan Mollick, associate professor of management at the University of Pennsylvania’s Wharton School, tweeted recently. “I just tried Moonbeam and it produced an outline & credible undergraduate essay with just the prompt ‘Legitimation and startups.’ And that is without human intervention, which would help.” In fact, Mollick had misspelled “legitimation” as “legimation” in his prompt, and the system corrected the error in the essay it produced.

To be sure, even AI writing tools “think” that students should not use them to cheat. One student learned as much when they asked an AI to explain the moral and social issues with using AI to do homework.

“If students are using AI to do their homework for them, they may not be learning the material as well as they could be,” the AI wrote. “This could lead to problems down the road when they are expected to know the material for exams or real-world applications … Using AI to do homework could lead to a reliance on technology that could be detrimental if the technology were to fail or become unavailable.”

Some plagiarism detection and learning management systems have adapted their surveillance techniques, but that leaves systems designed to ensure original work “locked in an arms race” with systems designed to cheat, Fyfe said.

A Gray Area

All the experts with whom Inside Higher Ed spoke said that students who submit essays that are completely composed by AI have crossed an ethical line. But they also said the gray area between acceptable and unacceptable uses of this evolving technology is vast.

While faculty members will likely spend some time trying to identify a boundary line between AI assistance and AI cheating with respect to student writing, that may not be the best use of their time.

“That path leads to trying to micromanage students’ use of these models,” said Ted Underwood, English professor and associate dean of academic affairs at the University of Illinois at Urbana-Champaign. “Much more important … is to think about whether our curriculum is preparing students effectively for the world they’re going to be living in in the 2030s … when software may be built into word processing.”

Students—in fact, most humans—write to learn, and they also write to report on what they have learned.

“I don’t know if [the latter] is going to be the case anymore,” John Shahawy, the founder of Moonbeam, said with no detectable trace of swagger. He’s a software developer with an M.B.A. who initially created Moonbeam for his own use—he wanted an efficient way to share his knowledge with others. His system allows users to identify target readers and to generate writing from different points of view. The product has attracted attention.

“It’s pretty good at anticipating what an argumentative paper that feels like a blog post or maybe a 10-page college essay sounds like,” Underwood said of Moonbeam. If the product were trained on a corpus of student essays, Underwood suggested, it would “start to feel a little bit more like a paper mill.”

But software developers do not always have the higher ed market in mind when designing products.

“I didn’t suck up anyone’s articles or information to train it. I just used the generated articles that I tweaked,” Shahawy said. When asked how he knew what college professors wanted in student essays, he set the record straight.

“I absolutely don’t know what a college professor is looking for, but I took a good guess to build the base model,” Shahawy said. Faculty members might take note.

“Its ability to do so well in that niche might be a reminder to us that we’ve allowed academic writing to become a little bit too tightly bound up in a predictable pattern,” Underwood said. “Maybe forcing us to stretch the kind of assignments we’re giving students is not a bad thing.”

“That’s the scariest part to me,” Shahawy said. “I still want kids and people to train their brains to think critically. I don’t want AI to be a substitute for that.”

Working With, Not Against, the Technology

Most (87 percent) of the North Carolina State students who “cheated” by integrating AI prose into their final course essay in Fyfe’s course reported that doing so was far more complicated than writing the paper themselves. That suggests that writing with computational assistance may be a collaboration—albeit with a nonhuman entity—that demands active intellectual labor on the part of the human.

“We don’t yet have a vocabulary for what’s going on,” Fyfe said of the writing, editing, idea generation and other activities students reported during the experiment. In the process, many discovered that the AI adapted to resemble or anticipate their own thoughts.

“I was genuinely surprised with how well some of the content flowed with my personal writing and how it continued to sound like me,” one student wrote.

“I felt like I was reading potential sentences that I would have written in another timeline, like I was glancing into multiple futures,” another student wrote.

For students who do not self-identify as writers, for those who struggle with writer’s block or for underrepresented students seeking to find their voices, AI writing assistance can provide a meaningful assist during the initial stages of the writing process. At the same time, students seeking to hone their writing abilities may learn from the software. For example, a machine might highlight problematic punctuation, suggest how to tighten prose or offer instruction on building complex sentences.

“AI can impact every stage of the writing process—from invention to research, drafting, proofreading and documentation,” said Robert Cummings, an associate professor of writing and rhetoric at the University of Mississippi. “It is only through direct engagement with these emerging AI tools that students will gain familiarity with a purposeful integration into their writing processes and an awareness of the ethical challenges of engaging AI in their writing.”

Some instructors ask students to examine artificial prose to understand the value that they, as humans, could add.

“That’s a reasonable strategy, because frankly the writing assistant software is not so awful that that’s easy,” Underwood said. For example, students could look for where the writing took a predictable turn or identify places where the prose is inconsistent. Students could then work to make the prose more intellectually stimulating for humans.

Students who refine their awareness of artificial prose may also be better equipped to recognize what Fyfe calls “synthetic disinformation” in the wild. Students in his experiment, for example, discovered plausible-sounding false statements and quotes from nonexistent experts in the essays they produced with the help of AI.

“Think about it as a partner, that we humans and AI computers aren’t doing things the same way and aren’t good at the same things, either,” Fyfe said. “Each has unique specializations. What are the kinds of partnerships we can imagine?”

Challenges Moving Forward

AI writing tools bring urgency to a pedagogical question: If a machine can produce prose that accomplishes the learning outcomes of a college writing assignment, what does that say about the assignment?

As math professors once had to adjust their teaching in the presence of calculators, writing instructors may need to adjust theirs in the presence of AI writing tools.

“It would be like micromanaging the use of calculators in a math class,” Underwood said. “If you’re doing that, it’s a sign that you’re not taking the opportunity to teach them more advanced math that would actually help them.”

Institutions, departments, administrators and individual faculty members seeking to help students coexist with the tools may need to offer AI literacy training that provides guidance on responsible use and on the tools’ susceptibility to bias, Fyfe said. AI writing tools are trained on large data sets; if the writing in those data sets reflects societal prejudices, then the essays the tools produce will likely reproduce those views. Similarly, if the training data underrepresents the perspectives of marginalized populations, then the essays may omit those perspectives as well.

Also, academic culture and the culture at large need to develop a means for crediting nonhuman entities in creative or scholarly endeavors.

“Even if a student wanted to quote or credit [the AI writing assistant’s] output for a statement or idea—in other words, even if they wanted to play fair by standard rules of plagiarism—how would they go about it?” Fyfe asked.

Cummings agreed that existing citation systems “were not created with the use of AI in mind,” though he and his colleagues at Ole Miss are instructing students to report on the role AI played in their writing. “We look forward to their evolution to accommodate AI.”

Faculty members, journalists and businesspeople will all need to grapple with the challenges posed by increasingly ubiquitous and proficient AI tools, as will college students who, upon graduation, may enter fields that do not now exist.
