Lawrence school district soon will have software that uses AI to look for ‘concerning behavior’ in students’ activity on school computers, devices

photo by: Journal-World

Lawrence Public Schools district offices pictured in April 2021.

The Lawrence school district has purchased a new system that uses artificial intelligence to look for warning signs of “concerning behavior” in the things students type, send and search for on their district-issued computers and other such devices.

The purchase of the software system, called Gaggle, comes at a time when questions are growing about how artificial intelligence will affect people’s privacy. But school district leaders are emphasizing that the software’s main purpose will be to help protect K-12 students against self-harm, bullying and threats of violence.

“First and foremost, we have an obligation to protect the safety of our students,” Lawrence school board member Ronald “G.R.” Gordon-Ross told the Journal-World. “It’s another layer of security in our quest to stay ahead of some of these issues.”

Gordon-Ross, who is a longtime software developer, said that he respects the “privacy piece” of the question surrounding the use of monitoring systems. But he also said it’s important to keep in mind that the iPads and other devices that the software will monitor are the district’s property, even though they’re issued to students — “we’re still talking about the fact that they’re using devices and resources that don’t belong to them.”

District officials haven’t yet announced a specific date for the program to begin, but technology director David Vignery told the Journal-World the district is in the process of implementing it, which includes training for staff members.

According to a memo from Vignery to the Lawrence school board, Gaggle is specifically designed to scrutinize “resources” within a district’s network, such as email and online accounts, by homing in on keywords that could indicate “concerning behavior” when a student uses them, such as “suicide,” “self-harm,” “bomb” or “gun.” Vignery said it also scans pictures, videos, attachments and links via a process that combines artificial intelligence and human moderators.

“An uptick in suicides during the COVID-19 pandemic and harm that came to kids is continuing to be on the rise, and that’s why this is a focus of ours,” Vignery told the Journal-World. “You can’t put a dollar figure on a life.”

Gordon-Ross said the Gaggle system isn’t like a parental control or “content filtering” program that tracks where kids go online and blocks them from visiting objectionable sites. Rather, it tracks what they write, send and search for and examines that material for warning signs.

“When they’re surfing the web, content-filtering is going to stop them from places they shouldn’t go,” Gordon-Ross said. “(Gaggle) serves a completely different purpose, where it monitors the language and verbiage they use, and the things that they say when they type an email or create a document. This isn’t filtering content, it’s monitoring for key words and phrases related to students’ physical safety and social-emotional well-being.”

He also said that it’s targeted to just those specific keywords, and doesn’t involve more general surveillance of users’ personal files and activity.

“We’re not going through and reading everyone’s email and all their documents,” Gordon-Ross said. “This is not a personal review. We’re scanning for specific words and phrases.”

The software will monitor only those devices — primarily iPads — that are owned by the district and issued to students. But that monitoring will happen regardless of whether the devices are used at school, at the student’s home or elsewhere. Vignery said that Gaggle is capable of flagging content around the clock.

In an interview with CBS News in 2021, Jeff Patterson, the founder and CEO of Gaggle’s parent company, Gaggleware, said the software was monitoring devices used by more than five million K-12 students and generated more than 140,000 incident alerts over the 2020-21 school year. He was also asked whether warning signs might go undetected if students were aware that they were being monitored and became more cautious and guarded about what they wrote. Patterson told the interviewer that his software was “an early warning system to identify children in crisis before tragedy happens.”

But context matters, Vignery said. District administrators are going to be tasked with determining levels of severity for flagged content, he said, ranging from “something less than an overt threat” to “life-threatening scenarios.”

“It could be something that we’re just going to check into, but if it ramps up a bit, we’ll be getting a phone call from the company,” he said. “We can be contacted 24/7, so we’ll have safeguards in place to make sure someone gets the call.”

Vignery said that district administrators and staff will assist in investigating “red flags,” and that Gaggle staff also plays a role in that. But he also said that the district’s protocols are still in the development phase, and that personnel would undergo professional development training as the district formulated its blueprint. The district’s technology department is slated to meet with Gaggle associates this week.

“At that point, they’ll start working with our network people on getting things set up,” Vignery said, adding that he anticipated the new software would be fully operational in 40 to 60 days.

“The alert system will be designed to involve all administrators, staff and counselors,” he said.

As part of the roughly $162,000 that the district is paying for the system, Gaggle will assist the Lawrence school district with implementation of the software.

Gordon-Ross said that the software is compliant with the federal Children’s Internet Protection Act, which is designed to shield children from obscene or harmful content on the internet. He also said that the district has long had content-filtering software on its network that also complies with CIPA, but that he’s been displeased with its “lack of consistency” in the types of content it targets.

“Gaggle seems to be more consistent in terms of fewer false hits,” Gordon-Ross said. “The hits with Gaggle are more narrowly focused and much more actionable.”
