Software that monitored Mpls. students' online activity sparks concerns
School buildings allow staff to keep a watchful eye over students who may be in emotional distress or doing something potentially dangerous. But what happens when classes transition online? Last year, Minneapolis schools answered that question by contracting with Gaggle.
Gaggle, artificial intelligence software that scans documents and messages linked to students’ school-issued Google and Microsoft accounts, monitors the digital files of more than 5 million students each year. The software alerts school officials to material related to illicit behavior and mentions of self-harm found in students' online accounts.
In an article from The 74, a national news organization covering K-12 education, school officials said that Gaggle has helped the school district intervene when students needed help. But privacy experts have concerns about the online surveillance of Minneapolis students.
The 74 reporter Mark Keierleber spoke with MPR News host Tom Crann about what he learned.
This interview has been lightly edited for length and clarity. Listen to the full interview using the audio player above.
How have school districts and students been using Microsoft and Google accounts over the past year?
Basically, the entire classroom became a digital platform. That includes Google Docs and Google Sheets, and it also includes Google Hangouts — the chat feature that allows users to message one another, as well as the video platform.
Minneapolis Public Schools paid about $300,000 to have Gaggle monitor these online platforms. How does Gaggle work?
Artificial intelligence and a team of content moderators look through students’ communications in search of keywords that could indicate problems. When they find any, they reach out to the security team at Minneapolis Public Schools and say “Hey, Johnny was discussing being depressed and we think that you should be alerted and aware of this.” Ultimately, it’s up to the school districts to figure out how best to respond.
You requested six months' worth of incident reports connected to Gaggle alerts in Minneapolis schools. What did you find, and did it work the way school officials expected when they set it up?
It generated about 1,300 documents. Pornography and sexual material were the number one (flagged) form of content that students were looking at on their computers. Number two is potentially the most concerning to find during the pandemic, and that’s issues relating to suicide and self-harm.
Is there any evidence that Gaggle is a good tool for mental health intervention?
If you talk to the security head at Minneapolis Public Schools, he cited a specific example where, in the middle of the night, he was alerted to a suicide note written on a kid’s computer. (The school was) able to jump into action. As a result, they feel like they saved that student’s life. That’s certainly a positive anecdote. The problem is that there isn’t any independent research to verify that this is an effective intervention.
There’s another side to this that has privacy advocates and civil rights advocacy groups worried about the use of Gaggle in Minneapolis schools. Can you tell us about that?
We’re talking about a tool that scans billions of students' communications across the country, 365 days a year, 24 hours a day. And to some privacy advocates, that is a blatant, Big Brother kind of approach.
How has the Minneapolis Public School district responded to concerns over the use of Gaggle?
Gaggle is tied to school-issued accounts, and they highlight that as one of the reasons why they don’t see this as a privacy issue. There is a technology use policy within the Minneapolis school district that says students should not expect full privacy on school devices. Bigger picture, they’re talking about how during the pandemic there was a concern surrounding students and self-harm. And the security head said, “Hey, if we can save one student’s life, then it’s all worth it.”