This school camera maps kids' faces. How do you feel about that?

A camera with facial recognition capabilities hangs from a wall while being installed at Lockport High School in Lockport, N.Y.
Credit: AP Photo/Carolyn Thompson

The screens we look at are returning our gaze.

Facial recognition technology is replacing the password on smartphones.

And it’s inching into public places, like schools, where it could relieve some of the mounting concerns about safety.

But there are concerns about how this technology could be used.

It certainly looks simple when you encounter it in the lobby of RealNetworks. There’s a screen near the reception area that is constantly hunting for faces – those it knows and those it doesn’t.

It’s easy to register so the machine can connect a name to a face. But then it waits for instructions: If this person appears, are they allowed to open a door?

At this headquarters, the answer is no: visitors can’t open doors. But at a school, with so many people dropping off and picking up children, it could be easy for someone who doesn’t belong to squeeze in.
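The decision logic described above can be sketched in a few lines. This is a hypothetical illustration, not SAFR’s actual API or implementation: a registered face maps to a policy, and an unknown face never unlocks anything.

```python
# Minimal sketch (hypothetical, not SAFR's real interface) of the
# access decision described above: recognize a face, then consult the
# administrator's instructions to decide whether to open the door.

# Registered faces, each mapped to a policy set by an administrator.
registry = {
    "face-001": {"name": "Staff Member", "may_open_doors": True},
    "face-002": {"name": "Visitor",      "may_open_doors": False},
}

def access_decision(face_id):
    """Return (recognized, door_unlocked) for a detected face."""
    person = registry.get(face_id)
    if person is None:
        # Unknown face: the system knows nobody by this face,
        # so the door stays locked.
        return (False, False)
    return (True, person["may_open_doors"])

print(access_decision("face-001"))  # registered staff: door may open
print(access_decision("face-002"))  # registered visitor: door stays locked
print(access_decision("face-999"))  # stranger: not recognized at all
```

The key property, as the article notes, is that recognition and permission are separate steps: the camera can know who you are and still refuse to open the door.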

Mike Vance of RealNetworks is in charge of SAFR, the company’s facial recognition product. This, he says, is where the technology can help.

“This is really useful even in small schools where there’s somebody at the front desk who knows everybody,” he says.

But front-desk staff aren’t always at their post, and they certainly aren’t there at all hours, when schools still need protection.

SAFR can take action if it spots someone who has been denied access. It can alert staff, or call the police.

SAFR is installed at a private elementary school in the University District. Right now, no one is being forced to use it.

Not so on the other side of the country. In New York State, Lockport City School District is plunging into facial recognition with a system called Aegis.

Aegis is constantly looking at all the faces inside schools, watching for criminals and kids who have been expelled or suspended, and also watching for weapons.

One parent worries the technology could do much more. 

“The system was pitched as a way to keep bad people out of the building,” says Jim Schultz, a parent of a sophomore at Lockport High School. “What nobody admitted until the very end is that this is going to be a set of 300 cameras that are going to be constantly filming the students and the teachers in the halls in every school in the district.”

Schultz says he worries about how the data collected from all those cameras could be used. “They can literally create a map of who hung out with whom in the halls, of who speaks to whom, and that is a whole other kind of surveillance.”

This is the problem with facial recognition technology. How could these images be used later? RealNetworks says only people who want to use SAFR are doing so now. They’ve given consent. And it says data from SAFR is not available outside the school, so there are limits on how it could be applied.

But Amazon employees have been concerned about how facial recognition data could be used. They have been asking the company not to sell Amazon’s cloud-based Rekognition technology to the Trump administration.

They say they worry about how the technology could be used by immigration enforcement. They also asked the company to stop hosting the data mining company Palantir on Amazon’s cloud. That suggests the employees are concerned about data gathered for one purpose being put to another use elsewhere.

Digital life is full of examples of data collected for one reason being used later for something completely different. One example is the Facebook personality quiz data that was eventually used to manipulate voters.

In New York, parent Jim Schultz says he’s worried: “It could be used for all kinds of things. It’s just a violation of privacy that no one talked to parents about.”

Shankar Narayan of the American Civil Liberties Union in Seattle says there is reason to be concerned.

“The extreme example of this is in China, where schools are heavily monitored and all this data goes into the record of that student. That not only is for disciplinary purposes in the school but actually follows them around for the rest of their lives.”

“We may say, that’s China, it would never happen here. But we are building a surveillance infrastructure that would certainly enable that.”

Data gathered for one purpose shouldn’t be used for another. But Narayan says that, after a while, those lines tend to blur.

“So many times there is mission creep. A system that’s adopted for one purpose down the line is used for another purpose and another purpose,” he says.

People have started to see that, so he's hopeful. Public pressure, he says, tends to force business and government to find solutions.

Eileen Buckley of WBFO in New York State contributed reporting.
