Warren, Markey, Blumenthal Raise Concerns About Discriminatory Bias in EdTech Student Surveillance Platforms and Harmful Effects on Students’ Mental Health
Letter to Gaggle (PDF) | Letter to Bark Technologies (PDF)
Letter to GoGuardian (PDF) | Letter to Securly Inc. (PDF)
Washington, D.C. -- United States Senators Elizabeth Warren (D-Mass.), Edward J. Markey (D-Mass.), and Richard Blumenthal (D-Conn.) sent a letter to four educational technology companies -- Gaggle.net, Bark Technologies, GoGuardian, and Securly Inc. -- regarding their use of artificial intelligence (AI) and algorithmic systems to monitor students’ online activity.
“Your company and other education technology companies have developed software that are advertised to protect student safety, but may instead be surveilling students inappropriately, compounding racial disparities in school discipline, and draining resources from more effective student supports,” wrote the senators to each of the companies. “We are concerned these products may extend far beyond the direction in federal laws to monitor online activity to protect children from exploitation and abuse.”
A new report from the Center for Democracy and Technology revealed that the recent expansion of remote learning has increased the use of online monitoring software to track student activity, with 81% of teachers reporting that their schools now use at least some form of monitoring software. This increased reliance on digital education platforms will likely outlast the pandemic.
Even before the Center for Democracy and Technology’s findings, studies had highlighted numerous unintended but harmful consequences of student surveillance programs for vulnerable populations. Artificial intelligence and algorithmic systems frequently mischaracterize students’ online behavior, flagging harmless activity as a “threat.” School disciplinary measures have a long history of disproportionately targeting students of color, and research has shown that students from minority or marginalized communities, including students of color and LGBTQ+ students, are far more likely to be flagged. Language processing algorithms are also less accurate at analyzing the language of people of color, particularly African American English dialects. This raises concerns that digital student surveillance platforms will perpetuate racial and other discriminatory biases.
Additionally, the use of these tools may break down trust within schools, prevent students from accessing critical health information, and discourage students from reaching out to adults for help, potentially increasing the risk of harm to students. According to mental health advocates and experts, LGBTQ+ students are more likely to seek help online, and the website filtering built into these surveillance tools frequently blocks the health information they seek.
Escalations and mischaracterizations of crises may have long-lasting, harmful effects on students’ mental health, as even a false report can lead to stigmatization and differential treatment. These letters seek information on the steps each company is taking to ensure the efficacy of its products and to mitigate potential harms to students, including the perpetuation of discriminatory biases.
“We strongly support measures that will protect students and ensure student safety, and we share the urgency that school districts are facing to identify ways to keep students safe. As school districts look ahead, they must decide which safety tools and systems to use in order to protect student safety,” the senators wrote. “It is crucial that the tools school districts select will keep students safe while also protecting their privacy, and that they do not exacerbate racial inequities and other unintended harms.”
These letters build on Senator Warren’s concerns about algorithmic bias disproportionately affecting communities of color in the financial and health care sectors. Senator Warren previously sent a letter to Zoom asking about its student safety and privacy protections during the pandemic and signed on to a letter about student privacy and racial bias in exam-proctoring software.
###