Building Anti-Surveillance Ed-Tech
These are the slides and transcript from my conversation this morning with Paul Prinsloo — a webinar sponsored by Contact North.
Pardon me if I just rant a little. Pardon my language. Pardon my anger and my grief. Or don’t. Let us sit with our anger and our grief a little.
We are living in terrible, terrible times — a global pandemic, economic inequality exacerbated by economic depression, dramatic and worsening climate change, rampant police violence, and creeping fascism and ethno-nationalism. And in the midst of all this danger and uncertainty, we have to navigate both old institutions and practices — many of which are faltering under a regime of austerity and anti-expertise — and new(ish) technology corporations — many of which are more than happy to work with authoritarians and libertarians.
Education technology — as a field, an ideology — sits right at that overlap but appears to be mostly unwilling to recognize its role in the devastation. It prefers to be heralded as a savior. Too many of its advocates refuse to truly examine the ways in which ed-tech makes things worse or admit that the utopia they’ve long peddled has become a hellscape of exploitation and control for many of the people laboring in, with, and under its systems.
Ed-tech may not be the solution; in fact, ed-tech may be the problem — or at the very least, a symptom of such.
Back in February — phew, remember February? — Jeffrey Moro, a PhD candidate in English at the University of Maryland, wrote a very astute blog post, “Against Cop Shit,” about policing in the classroom.
“For the purposes of this post,” Moro wrote, “I define ‘cop shit’ as ‘any pedagogical technique or technology that presumes an adversarial relationship between students and teachers.’ Here are some examples:
- ed-tech that tracks our students’ every move
- plagiarism detection software
- militant tardy or absence policies, particularly ones that involve embarrassing our students, e.g. locking them out of the classroom after class has begun
- assignments that require copying out honor code statements
- ‘rigor,’ ‘grit,’ and ‘discipline’
- any interface with actual cops, such as reporting students’ immigration status to ICE and calling cops on students sitting in classrooms.”
The title of this webinar is “Building Anti-Surveillance Ed-Tech,” but that’s a bit of a misnomer, as I’m less interested in either “building” or “ed-tech.” Before we build, we need to dismantle the surveillance ed-tech that already permeates our schools. And we need to dismantle the surveillance culture from which it emerged. I think this is one of our most important challenges in the months and years ahead. We must abolish “cop shit,” recognizing that almost all of ed-tech is precisely that.
I know that makes people bristle, particularly if your job is administering the “cop shit,” or if you are compelled by those with more authority at work to use “cop shit,” or if you believe that “cop shit” is necessary because how else do we keep everyone safe.
Why do we have so much “cop shit” in our classrooms, Moro asks. “One provisional answer is that the people who sell cop shit are very good at selling cop shit,” he writes, “whether that cop shit takes the form of a learning management system or a new pedagogical technique. Like any product, cop shit claims to solve a problem. We might express that problem like this: the work of managing a classroom, at all its levels, is increasingly complex and fraught, full of poorly defined standards, distractions to our students’ attentions, and new opportunities for grift. Cop shit, so cop shit argues, solves these problems by bringing order to the classroom. Cop shit defines parameters. Cop shit ensures compliance. Cop shit gives students and teachers alike instant feedback in the form of legible metrics.”
I don’t think that ed-tech created “cop shit” in the classroom or created a culture of surveillance in schools by any means. But it has facilitated it. It has streamlined it. It has polished it and handed out badges for those who comply with it and handed out ClassDojo demerits for those who don’t.
People who work in ed-tech and with ed-tech have to take responsibility for this, and not just shrug and say it’s inevitable or it’s progress or school sucked already and it’s not our fault. We have to take responsibility because we are facing a number of crises — some old and some new — that are going to require us to rethink how and why we monitor and control teachers and students. And now, the “cop shit” that schools are being sold isn’t just mobile apps that track whether you’ve completed your homework on time. It’s body temperature scanners. Contact tracing. Movement tracking. Immigration status. Political affiliation.
Surveillance practices pre-date digital technologies — of course they do. I pulled my copy of Michel Foucault’s Discipline and Punish off the shelf to re-read as I prepared for this talk (and for my next book project, which will be on the history of surveilling children — someday, I’ll regale you all with the story of how the baby monitor was invented and reinvented to respond to the moral panics of the day). Roll your eyes all you want at the invocation of poststructuralism and the Panopticon. But this is where we reside.
Surveillance in schools reflects the values that schools have (unfortunately) prioritized: control, compulsion, distrust, efficiency. Surveillance is necessary, or so we’ve been told, because students cheat, because students lie, because students fight, because students disobey, because students struggle. Much of the physical classroom layout, for example, is meant to heighten surveillance and diminish cheating opportunities: the teacher in a supervisory stance at the front of the class, wandering up and down the rows of desks and peering over the shoulders of students. (It’s easier, I should note, to shift the chairs in your classroom around than it is to shift the code in your webinar software.) And all of this surveillance, we know, plays out very differently for different students in different schools — which schools require students to walk through metal detectors, which schools call the police for disciplinary infractions, which schools track what students do online, even when they’re at home. And nowadays, especially when they’re at home.
Of course, educators — teachers and staff — are at home now too. (Or my god, I hope they are.) And the surveillance technology that’s been wielded against students will surely be used against them as well.
We can already see some of this at play outside of educational institutions in the new workplace surveillance tools that many companies are adopting. For a very long time, the argument that many employers made against working from home was that they didn’t trust their employees to be productive. The supervisor needed to be able to walk by your desk at any moment and make sure you were “gonna have those TPS reports to us by this afternoon,” to borrow a phrase from the movie Office Space. Companies are now installing software on employees’ computers to track where they are, for how long, doing what. Much as education technology is designed on the basis of distrust of students, enterprise technology — that is, technology sold to large businesses — is designed around a distrust of workers.

Again, there’s a long history here — one that isn’t just about computing. The punch clock, for example, was invented in 1888 by a jeweler, William LeGrand Bundy, in order to keep track of what time his employees came to and left work. He and his brother founded the Bundy Manufacturing Company to manufacture the devices, and after a series of mergers, it became part of a little company called International Business Machines — one we know better as IBM. Those “business machines” were sold with the promise of more efficient workplaces, of course, and that meant monitoring workers. And that included the work teachers and students do at school.
Zoom, this lovely piece of videoconferencing software we are using right now, is an example of enterprise technology. Zoom never intended to serve the education market, although schools have adopted it widely since “work-from-home” began earlier this year. And there is quite a bit about the functionality of the Zoom software that reveals whose interests it serves — the ability to track who’s paying attention, for example, and who’s actually working on something else in a different application (a feature, I will say, that the company disabled earlier this year after complaints about its fairly abysmal security and privacy practices).
Who’s cheating the time-clock, right? Who’s cheating the boss? What are workers doing? What are workers saying? Enterprise software and ed-tech software — both “cop shit” — claim they can inform the management — the principal, the provost. This software claims it knows what we’re up to, and if it can’t stop us from misbehaving, it can narc us out.
What it’s been coded to identify as “misbehavior” is fairly significant. Early in June, if you’ll recall, at the behest of Beijing, Zoom disabled the accounts of Chinese dissidents who were planning to commemorate the Tiananmen Square protests — something that should give us great pause when it comes to academic freedom on a platform that so many schools have adopted.
Digital technology companies like to say that they’re increasingly handing over decision-making to algorithms — it’s not that Beijing made us do it; the algorithm did. Recall Facebook CEO Mark Zuckerberg testifying before Congress, insisting that AI would prevent abuse and disinformation. But Facebook does not rely on AI alone; much of its content moderation is still performed by people — terrible, traumatizing, low-paid work.
Ah, the sleight of hand when it comes to the promises of automation. Recall the Mechanical Turk, for example, an eighteenth-century machine that purported to be an automated chess-player but was actually operated by a human hidden inside.
Automation is, nonetheless, the promise of surveillance ed-tech — that is, the automation of the work of disciplining, monitoring, grading. We’ve seen, particularly with the switch to online learning, a push for more proctoring “solutions” that gather immense amounts of data to ascertain whether or not a student is cheating. Proctoring software is some of the most outrageous “cop shit” in schools right now.
These tools gather and analyze far more data than just a student’s responses on an exam. They require that a student show photo identification to their laptop camera before the test begins. Depending on what kind of ID they use, the software gathers data like name, signature, address, phone number, driver’s license number, passport number, along with any other personal data on the ID. That might include citizenship status, national origin, or military status. The software also gathers physical characteristics or descriptive data including age, race, hair color, height, weight, gender, or gender expression. It then matches that data to the student’s “biometric faceprint” captured by the laptop camera. Some of these products also capture a student’s keystrokes and keystroke patterns. Some ask for the student to hand over the password to their machine. Some track location data, pinpointing where the student is working. They capture audio and video from the session — the background sounds and scenery from a student’s home. Some ask for a tour of the student’s room to make sure there aren’t “suspicious items” on the walls or nearby.
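To make the breadth of that collection concrete, here is a minimal, hypothetical sketch of what a single proctoring session record might hold. The class and field names are my own invention — a composite of the categories just described, not any vendor’s actual schema:

```python
from dataclasses import dataclass

# Hypothetical composite of the data categories described above.
# Illustrative only: not any proctoring vendor's actual schema,
# and the field names are my own invention.

@dataclass
class ProctoringSessionRecord:
    # Scraped from the photo ID shown to the camera
    name: str
    address: str
    id_number: str              # driver's license or passport number
    citizenship_status: str     # appears on some forms of ID

    # Physical / descriptive data inferred from the video feed
    age: int
    race: str
    gender: str
    biometric_faceprint: bytes  # matched against the ID photo

    # Behavioral and environmental data captured during the exam
    keystroke_timings: list[float]  # keystroke patterns
    location: tuple[float, float]   # latitude / longitude
    audio_recording: bytes          # background sounds from the home
    video_recording: bytes          # the room itself, walls included
```

Even this abbreviated sketch should make clear that an exam answer is the least of what gets collected.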
The proctoring software then uses this data to monitor a student’s behavior during the exam and to identify patterns that it infers to be cheating — if their eyes stray from the screen for too long, for example. The algorithm — sometimes in concert with a human proctor — determines who is a cheat. But more chilling, I think, the algorithm decides who is suspicious, what is suspicious.
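What counts as a suspicious “pattern” can be startlingly crude. Here is a purely hypothetical sketch — my invention, not any vendor’s actual logic — of how a gaze-based flag might reduce to a simple threshold:

```python
# A purely hypothetical illustration of a threshold-based "suspicion"
# rule. The cutoff and the function below are my own invention, meant
# only to show how crude such heuristics can be.

GAZE_AWAY_LIMIT_SECONDS = 10.0  # arbitrary cutoff, assumed for illustration

def flag_suspicious(gaze_away_intervals: list[float]) -> bool:
    """Flag the student if any single glance away from the screen
    exceeds the cutoff, regardless of why they looked away."""
    return any(t > GAZE_AWAY_LIMIT_SECONDS for t in gaze_away_intervals)

# A student who checks their scratch paper for twelve seconds is
# flagged exactly like one who opens a textbook:
print(flag_suspicious([2.5, 12.0, 1.0]))  # True
```

A rule like this has no way to distinguish a wandering eye from a wandering conscience — which is precisely the problem.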
We know that algorithms are biased, because we know that humans are biased. We know that facial recognition software struggles to identify people of color, and there have been reports from students of color that the proctoring software has demanded they move into more well-lit rooms or shine more light on their faces during the exam. Because the algorithms that drive the decision-making in these products are proprietary and “black-boxed,” we don’t know if or how they might use certain physical traits or cultural characteristics to determine suspicious behavior.
We do know there is a long and racist history of physiognomy and phrenology that has attempted to predict people’s moral character from their physical appearance. And we know that schools have a long and racist history that runs adjacent to this, as do technology companies — and this is really important. We can see how the mistrust and loathing of students is part of a proctoring company’s culture and gets baked into its software when, for example, the CEO posts copies of a student’s chat logs with customer service onto Reddit, as the head of Proctorio did last month.
That, my friends, is some serious “cop shit.” Cops have no business in schools. And frankly, neither does Proctorio.
So, if we are to build anti-surveillance ed-tech, we have much to unwind within the culture and the practices of schools — so much unwinding and dismantling before we even start building.
Indeed, I will close by saying that — as with so much in ed-tech — the actual tech itself may be a distraction from the conversation we should be having about what we actually want teaching and learning to look like. We have to change the culture of schools, not just adopt kinder ed-tech. Chances are, if you want to focus on the tech because it’s tech, you’re selling “cop shit.”