New security system that monitors students’ computer use has ‘inundated’ district with alerts; leader apologizes to staff

photo by: Lawrence Journal-World

Lawrence school district offices, pictured in April 2021.

Insufficient training caused Lawrence school district staff to be unprepared and overwhelmed by a new program that monitors students’ computer usage for unsafe behaviors, according to Superintendent Anthony Lewis and school board member Ronald “G.R.” Gordon-Ross.

During the Dec. 11 school board meeting, Lewis said that issues have arisen following the recent rollout of Gaggle, a software system that uses artificial intelligence to look for signs of concerning behavior in the things students type, send and search for on their district-issued computers and other such devices.

“It was not as seamless as we had hoped,” Lewis said of the software’s rollout, which took place the week of Thanksgiving.

District officials have confirmed the system has produced nearly 190 alerts since late November that were sent to building principals or other district administrators, often outside of school hours.

As the Journal-World reported in September, the system produces real-time, around-the-clock analysis of potentially concerning behavior detected on students’ district-owned devices and account platforms. Examples include indicators of self-harm, suicidal thoughts, substance abuse, cyberbullying and credible threats against others. The technology also scans pictures, videos, attachments and links via a process that combines artificial intelligence and human moderators.

A notification on Dec. 1 triggered a brief lockdown at Free State High School after the software detected threatening language in a student’s email account. The district said in a news release that officials determined a threat had not been made, but that the “school safety management system worked as intended.”

Despite some implementation issues, Lewis said the program has shown how it can be a valuable tool to improve student safety.

“This product has given us some alerts that we really believe have saved some students’ lives,” Lewis said.

‘A lot of confusion’

When Gaggle flags concerning content, it can generate an alert: a notification that goes out to building staff, who are tasked with responding to those concerns. But Lewis said it is now clear that some key personnel in the district were not part of a daylong training session on Nov. 13 where personnel learned how the system worked.

“So when it went live, there were people that did not know it was coming or what to expect,” he said.

The district developed contact lists of staff members who would be notified of alerts from Gaggle, but Gordon-Ross said it is now clear that not all of the people listed were aware of the Gaggle rollout. Gordon-Ross, a longtime software developer who strongly supported the implementation of Gaggle, told the Journal-World that administrative-level employees were present for training, but that the district’s mental health personnel were not involved in that process.

“There was no specific training for mental health teams that got included on those building (lists),” he said. “So some people just started getting those notifications and had no idea what they were or what to do with them. There was just a lot of confusion.”

Gordon-Ross added that district administration, including Director of Technology David Vignery, worked with Gaggle to set the parameters of the flagged content prior to the rollout. But many staff members were “operating on a belief and assumption” that the software would only trigger alerts on “rare occasions.”

“Then it went live the week of Thanksgiving, and some of our buildings just got inundated with alerts,” he said, “and they were coming at all times. We weren’t expecting this level of onslaught and didn’t think we would have to work the kinks out by day one. But that’s essentially what happened.”

Detections and alerts

According to information obtained from the district on Friday, there have been 408 “detections” of concerning behavior since Gaggle’s districtwide launch on Nov. 20. Of those, 188 have resulted in actual “alerts.”

District spokesperson Julie Boyle said that there are three different priority levels that Gaggle uses to classify the concerning information it detects. The lowest level, “violations,” includes minor offenses like the use of profanity. Those do not trigger alerts, but the system collects data on them “in case future review is necessary.” Next is a level called “Questionable Content,” which triggers a “non-urgent alert to the building administrators for review and follow-up as necessary.”

Finally, Boyle said, there is the most urgent level: “Potential Student Situations.” This level includes warning signs of suicide, violence, drug abuse, harassment and other serious behavioral or safety problems, and it triggers “urgent alerts involving an immediate phone call, text, and email to the building administrators.” An alert of this kind is assigned to a staff member for investigation and follow-up.

Approximately 95% of the 188 alerts the district has received so far were categorized as “Questionable Content.” Nine of them were categorized as “Potential Student Situations.”

Lewis said one of the problems with the alert system is that the sheer number of Gaggle alerts has infringed on the personal time of staff. He said notifications that didn’t require urgent action were often being sent to district personnel during their time off work.

“As the leader of this district, I sincerely apologize for any confusion or inconvenience that this may have caused,” Lewis said during the Dec. 11 board meeting. He specifically apologized to “our administrators and mental health team members that were impacted by these notifications that happened outside of normal hours.”

While Lewis and others say the system needs improvement, district officials also believe it will give them more ability to efficiently detect potentially dangerous situations and identify students who may need help.

Boyle told the Journal-World that the district has been running its previous content-filtering system “alongside Gaggle” since Gaggle’s launch, and that the old system generates many more false positives. She said the old system has generated “546,572 simple keyword detections for review” since Gaggle launched.

“A large majority of these detections are false because it cannot determine context,” Boyle said of the district’s previous content-filtering system. “Gaggle increases the accuracy of detections and greatly reduces the time from detection to alert-notification, so someone can assist a child who may need help.”

‘Trying to work through it’

Although false positives are rarer with Gaggle than with the previous system, they’re still an issue. Gordon-Ross said that some of the detections that triggered “non-urgent” alerts “were just students writing a college essay and talking about some of the struggles that they’ve had.”

But he said these issues and many others are just a natural part of implementing something new.

“This is a new system and we’re trying to work through it,” he said.

Lewis said that district leaders are “actively working to rectify the situation to ensure that Gaggle is properly configured to meet the needs and expectations of our schools and community.” The adjustments could include changes to the notification time frame, he said, as well as more clearly defined “roles and responsibilities” for district staff.

“We want to make sure we’re respectful and honor time outside of the school day,” Lewis said. “Some notifications can wait until the next workday.”

Gordon-Ross said that another concern comes from a state law that requires school districts to receive parental consent prior to administering a mental health evaluation.

“So that adds another layer of complexity,” he said. “If they get a flag for potential ‘suicidal thoughts,’ the mental health teams really cannot do a whole lot until they talk to the parents of the student. So there are a lot of questions and avenues that weren’t really addressed with staff in a meaningful way — for staff to really feel like they could make the best use of the information that they were given.”

Gordon-Ross said that the district needs to work with Gaggle to gain a clearer understanding of its algorithm, which will assist with “filtering off some of the (flagged) messages that we know are not necessary for us to see.”

Gordon-Ross also expressed frustration that the item approving the use of Gaggle appeared on the school board’s consent agenda in late August. Consent agenda items are routinely passed without discussion and on a single board vote.

“In hindsight, Gaggle should have never been on the consent agenda,” he said. “There should have been a presentation by the IT department and the board should have been allowed to ask questions. With something like Gaggle, if you were to look at the impact it was going to have after being implemented, it really should have come to the board as ‘new business.’”

At the Dec. 11 meeting, Gordon-Ross called for a review of the procedures for how items are placed on the consent agenda.
