Encode Justice, an international group of grassroots activists advocating for ethical AI applications, has had a busy year. Legislators have been lobbied, online seminars have been held, and meetings have been attended, all in the hopes of educating others about the dangers of facial recognition technology.
Fitting everything into a workday would be difficult for any activist group; most of the Encode Justice team has had to cram it all around high school.
That's because the organization was founded and is run almost entirely by high school students. Sneha Revanur, the group's founder and president, is a 16-year-old San Jose high-school senior, and at least one member of the leadership team isn't old enough to get a driver's license. It may be the only youth activist group devoted solely to highlighting the dangers, both real and potential, posed by AI-based applications such as facial recognition software and deepfakes.
“We’re fighting for a future where technology is used to empower rather than oppress,” Revanur said.
Over the last few years, AI has become more widely used, but the general public has only recently become aware of it. Concerns about the accuracy and underlying racial bias of facial recognition systems have become increasingly common. For example, facial recognition technology has been shown to be less accurate when identifying people of color, and at least a few Black men have been wrongfully arrested as a result of its use. While there is no federal legislation governing the technology’s use, an increasing number of states and cities are enacting their own regulations to limit or prohibit its use.
Schools are increasingly using facial recognition systems for on-campus surveillance and as part of remote testing services, ostensibly to protect student safety and academic integrity. Encode Justice is taking action in the hopes of raising awareness of the spread of such surveillance among students and adults alike. That action can take many forms and be carried out from almost anywhere, thanks to the rise of online mobilization during the pandemic.
During a student week of action in mid-September, for example, Encode Justice members joined the American Civil Liberties Union of Massachusetts and another youth group, the Massachusetts-based Student Immigrant Movement, in fighting facial recognition technology in schools and other public places. People were encouraged to contact local officials to push for a state ban on facial recognition software in schools, and to post support for such a ban on social media.
Despite their limited time (and experience), the Encode Justice students have recruited students from all over the world. And they are being heard by legislators, well-known civil rights organizations such as the ACLU, and an increasing number of their peers.
How it all began
It all began with a news report. About two years ago, Revanur came across a 2016 ProPublica investigation into risk-assessment software, which uses algorithms to predict whether a person will commit a crime in the future. One statistic stood out to her: according to ProPublica's analysis of public data, the software's algorithm was labeling Black defendants as nearly twice as likely to commit a future crime as White defendants. (Similar risk-assessment tools remain in use in courtrooms across the United States.)
“That was a rude awakening for me,” she said, “because I realized technology isn’t this completely objective, neutral thing as it’s reported to be.”
In 2020, Revanur decided to take action. Proposition 25, one of her state's ballot measures, sought to replace California's pretrial cash bail system with risk-assessment software that judges would use to decide whether to hold or release a person before their court date. In the summer of 2020, she founded Encode Justice in the hopes of rallying opposition to the measure.
Revanur collaborated with a group of 15 other teenage volunteers, most of whom she knew from school. They held town hall meetings, wrote opinion pieces, and made phone calls and sent texts in opposition to Proposition 25, which opponents argued would perpetuate racial inequality in policing because risk-assessment tools rely on data such as a defendant's age and arrest history.
The measure was defeated in November, and Revanur was energized by the outcome. But she also realized she had been looking at only a small part of a larger issue involving the many applications of the technology.
“At that point, I realized Encode Justice was dealing with a problem that was bigger than a single ballot measure,” she explained. “We’re going to have to deal with a twenty-first-century civil rights challenge.”
Encode Justice, Revanur decided, should investigate the ethical issues raised by all kinds of algorithm-driven technologies, particularly those involving AI, such as facial recognition. The group is lobbying federal, state, and local legislators to stop the technology from being deployed in schools and other public places.
Though statistics are scarce, some schools are said to be using such software in the hopes of improving student safety through surveillance; others have attempted to implement it but failed due to opposition. A number of cities have their own regulations prohibiting or restricting the use of the technology.
Encode Justice grew quickly through social media, according to Revanur, and recently merged with Data4Humanity, a smaller student-led group. It now has about 250 volunteers from more than 35 states and 25 countries, she said, including a 15-member executive team (12 of its leaders are in high school; three in college).
Revanur speaks quickly but clearly; she's enthusiastic, and she has evidently spent a lot of time considering the real and potential dangers technology can pose. She's almost awestruck by the group she's created, describing its growth as "really surreal."
Damilola Awofisayo, a high school senior from Woodbridge, Virginia, said she was misidentified by facial recognition software while attending a now-defunct summer camp before her sophomore year. According to Awofisayo, facial recognition software mismatched a picture of someone else’s face to a video of her, resulting in her receiving an ID card with another camper’s picture on it.
"Hey, I've actually experienced algorithmic harm, and these people are doing something about it," Awofisayo recalled thinking after learning about Encode Justice's work. She is now the group's director of external affairs.
"If you look back in history, civil rights have always been fought for by teenagers, and Encode Justice recognizes that and uses the medium we're used to to combat it," Awofisayo said.
How’s it going?
Social media is frequently used as that medium. Encode Justice uses social media platforms such as Twitter, Instagram, and TikTok to reach out to other young people.
The group has spoken out against facial recognition technology at public hearings, including one in Minneapolis that resulted in the city banning the use of such software by its police department and other city departments.
Kashyap Rajesh, a high school sophomore in Hawthorn Woods, Illinois, who works as Encode Justice’s chapter coordinator, said that for the past three months, chapters have been paired up with one another to host town-hall-style events and conduct email and phone-based campaigns. He also mentioned that chapters are hosting guest speakers from data and privacy-related organizations.
Revanur said the group has reached over 3,000 high school students through presentations developed and delivered by its members, both in person and virtually. One presentation looks at AI and policing, she said, another at AI and climate change, and a third at AI and healthcare. Encode Justice has also spoken with attendees at conferences and hackathons.
The group has also reached out to lawmakers across the political spectrum. According to Adrian Klaits, the group's co-director of advocacy and a high-school junior in Vienna, Virginia, its executive board traveled to Washington, DC, this summer and met with staff at several senators' offices.
Though it is run entirely by volunteers, all of this work costs money. Grants, including $5,000 from the We Are Family Foundation, have provided funding, according to Revanur.
“We’ve been able to accomplish a lot despite not having a lot of financial capital,” Revanur said.
The advantage of youth
One of those accomplishments is partnering with veteran activists, such as the ACLU of Massachusetts, to amplify students' voices in the fight for algorithmic fairness and to learn from people who have worked on social justice and technology issues for years.
"I think those of us who are a little more established and have done more political work can provide some important guidance about what has worked in the past, what we can try to explore together," said Kade Crockford, director of the ACLU of Massachusetts' Technology for Liberty Program.
Encode Justice also works with the Algorithmic Justice League, founded by activist and computer scientist Joy Buolamwini to raise awareness about the dangers of artificial intelligence.
The Algorithmic Justice League’s director of research and design, Sasha Costanza-Chock, said the group learned about Encode Justice last fall while looking for youth groups to partner with in its fight against algorithmic injustice in education. The two groups are working on a way for people to tell their stories about how AI systems have harmed them.
“We believe in and support what they’re doing,” said Costanza-Chock. “People who are most directly impacted should lead movements. As a result, the fight for more accountable and ethical AI in schools will need to include a strong leadership component from students.”
While the members of Encode Justice lack years of organizing or fundraising experience, Klaits pointed out that they do have energy and dedication, which they can use to help spark change. And, like young climate activists, they have a unique understanding of the longer-term consequences of pressing for change now.
“There aren’t a lot of young activists in this movement, and I think it’s important to remember that, especially when we go into lobbying meetings or meetings with different representatives, people who are important to our cause, we’re the ones who will be dealing with this technology the longest,” Klaits said. “We’re the ones who are most affected over the course of our lives.”