RealNetworks is offering schools a new, free security tool: facial-recognition software. But as the technology moves further into public spaces, it’s raising privacy concerns and calls for regulation — even from the technology companies that are inventing the biometric software.
As Mike Vance approaches the glass door that leads to RealNetworks’ engineering office, he smiles slightly at a small camera mounted in front of him. Click. The door unlocks, responding to a command from software powering the camera that recognized Vance’s face and confirmed his identity.
Vance, a senior director of product management at the Seattle tech company, leads the team that created Secure, Accurate Facial Recognition — or SAFR, pronounced “safer” — a technology that the company began offering free to K-12 schools this summer.
It took three years, 8 million faces and more than 8 billion data points to develop the technology, which can identify a face with near-perfect accuracy. The short-term goal, RealNetworks executives say, is increased school safety.
“There’s a lot of benefit for schools understanding who’s coming and going,” Vance said.
The software is already in use at one Seattle school, and RealNetworks is in talks to expand it to several others across the country. Looking ahead, RealNetworks — known for video- and music-streaming software introduced in the early 2000s — plans to sell SAFR to various industries, though the company is staying completely mum on the details for now.
The introduction of the technology has thrust RealNetworks into the center of a field that is growing quickly as software gets better at identifying faces. But growing along with it are privacy concerns and rising calls for regulation — even from the technology companies that are inventing the biometric software.
Facial-recognition technology is already common, used in everything from photo apps that sort pictures of people, to unlocking an iPhone, to law-enforcement agencies searching databases of driver’s license photos.
Facial recognition is used, broadly, in two ways, said Oren Etzioni, CEO of Seattle’s Allen Institute for Artificial Intelligence, the sister organization to Paul Allen’s brain science institute. One is consumer convenience, such as grouping photos, and the other is for surveillance and tracking.
The big tech players have been involved for years: Microsoft markets Face API for companies to identify and group similar faces for apps and other products, while Amazon has Rekognition, which came under fire earlier this year when the ACLU asked the company to stop selling it to law-enforcement agencies. Google, Apple and Facebook are also in the game, as tagging and grouping photos on smartphones illustrate.
But now, as RealNetworks’ SAFR shows, the technology has been moving further into public spaces. And with that, privacy advocates wonder if people fully realize how often their faces are being scanned, and advocates and the industry alike question where the line is between the benefits to the public and the cost to privacy.
Learning a face
Facial-recognition technology functions much like fingerprinting — each face has its own unique signature, and companies teach machines to recognize and match people’s unique features.
RealNetworks’ technology maps 1,600 data points on each face it sees. The team has been “training” its machine for about two years, since the launch of RealTimes, its free app that lets people build photo slideshows. Baked into the 3,300-word user agreement for that app is language that allows RealNetworks to use customer photos to train its facial-recognition system.
SAFR doesn’t know the identity of people in the RealTimes photos, Vance said — there are no names, addresses or other identifying information in the massive database of 8 million faces. But what it can do is tell if two faces are the same person. It’s gotten so accurate that it can tell identical twins apart and match family photos of the same person even if they were taken decades apart.
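RealNetworks hasn’t published how SAFR computes or compares those face signatures, so the sketch below is purely illustrative and every name in it is hypothetical. It only shows the general idea described above: reduce each face to a numeric vector, then treat two faces as the same person when the vectors are close enough.

```python
# Illustrative sketch only; SAFR's internals are not public and all names here
# are hypothetical. Each face is reduced to a numeric vector ("signature"),
# and two faces are judged to be the same person when the vectors are close.

import numpy as np

def face_signature(image_pixels: np.ndarray) -> np.ndarray:
    """Stand-in for a trained model that maps a face image to a fixed-length
    vector (the article says SAFR maps 1,600 data points per face). A real
    system would run a neural network here; this stub just flattens, centers
    and normalizes the pixels so the example runs end to end."""
    vec = image_pixels.astype(float).ravel()[:1600]
    vec = np.pad(vec, (0, 1600 - vec.size))
    vec = vec - vec.mean()
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

def same_person(face_a: np.ndarray, face_b: np.ndarray, threshold: float = 0.75) -> bool:
    """Cosine similarity between two signatures; at or above the threshold,
    the two faces are treated as the same person."""
    similarity = float(np.dot(face_signature(face_a), face_signature(face_b)))
    return similarity >= threshold

# Toy usage with random pixel grids standing in for camera frames.
rng = np.random.default_rng(0)
frame_1 = rng.integers(0, 256, size=(40, 40))
frame_2 = frame_1 + rng.integers(-5, 6, size=(40, 40))   # same face, slight noise
stranger = rng.integers(0, 256, size=(40, 40))

print(same_person(frame_1, frame_2))   # True: signatures nearly identical
print(same_person(frame_1, stranger))  # False: signatures far apart
```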
SAFR relies on being able to identify people “in the wild,” or acting candidly, not posing.
“The great thing about those kinds of faces is that they’re people doing things that they naturally do in life,” Vance said. “They’re not mug shots or canned shots. You can overtrain a system for people looking squarely into the camera. But when you’re walking around here, when you’re walking around a school, you’re not always looking squarely at the camera.”
Many face-recognition technologies can also identify basic demographics of a person. Microsoft’s Face API, for instance, can guess your age with just one photo — a feature that has gotten more accurate since it was first released in 2015 to middling user reviews.
That has led to concerns of bias, though, especially since a study at MIT’s Media Lab found some big tech companies’ facial-recognition apps had error rates of up to 35 percent when identifying women with darker skin, compared with less than 1 percent for men with lighter skin. Some feared that could lead to misidentifying women and people of color, a troubling issue especially if the systems are used by law enforcement.
Microsoft has acknowledged the bias issues and is taking steps to better identify diverse faces, broadening the database it uses to train its system by adding photos of more diverse people.
RealNetworks, however, hasn’t trained its software to identify people by race. You couldn’t, for example, ask SAFR to alert you when a white man walks through the door, because it won’t know which faces are white.
The company didn’t see any benefit in teaching the machine to recognize race. “It’s loaded,” Vance said of racial identification.
Boundaries of security
SAFR has been watching over the main entrance gate of University Child Development School (UCDS), a private elementary school in Seattle, since this spring, buzzing in parents who come to pick up or drop off their kids. The school, where RealNetworks CEO Rob Glaser’s kids were enrolled last year, wanted to try out the system to add to overall security at its University District campus.
It acts as an automatic doorman for parents and staff members — if a parent’s face is recognized by the camera mounted above the front gate, the door opens, reducing the need for someone inside the school to diligently answer a buzzer.
“It’s very convenient,” said Ana Hedrick, whose daughter is in second grade at the school. “It feels safe.”
The school sent out information about the system to parents and gave them the option of adding their face to the machine’s database, something that Hedrick and about 300 parents and caregivers have done. The SAFR system at the school identifies only adults and rejects attempts to add children to the group of identifiable faces (a few kids have tried, unsuccessfully, through the system’s self-add feature).
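How the school’s gate actually ties into SAFR isn’t public, so the following is a hypothetical sketch of the “automatic doorman” flow described above: a camera frame is compared against the signatures of enrolled adults, the gate unlocks only on a match, and anything else falls back to the buzzer. Function and parameter names are assumptions for illustration.

```python
# Hypothetical sketch of the "automatic doorman" flow; the school's actual
# integration with SAFR is not public. Enrolled parents' signatures are
# compared against each camera frame, and the gate unlocks only on a match.

from typing import Callable, Dict, Optional
import numpy as np

def best_match(frame_signature: np.ndarray,
               enrolled: Dict[str, np.ndarray],
               threshold: float = 0.75) -> Optional[str]:
    """Return the enrolled adult whose signature best matches the frame,
    or None if nobody clears the similarity threshold."""
    best_name, best_score = None, threshold
    for name, signature in enrolled.items():
        score = float(np.dot(frame_signature, signature))  # unit-length vectors
        if score >= best_score:
            best_name, best_score = name, score
    return best_name

def on_camera_frame(frame_signature: np.ndarray,
                    enrolled: Dict[str, np.ndarray],
                    unlock_gate: Callable[[str], None],
                    ring_buzzer: Callable[[], None]) -> None:
    """Unlock for a recognized adult; otherwise fall back to the buzzer so a
    staff member can decide, mirroring the fallback described in the article."""
    person = best_match(frame_signature, enrolled)
    if person is not None:
        unlock_gate(person)
    else:
        ring_buzzer()

# Toy usage with pre-computed, unit-length signatures.
enrolled_adults = {"parent_a": np.array([1.0, 0.0]), "parent_b": np.array([0.0, 1.0])}
arriving = np.array([0.9, 0.1]) / np.linalg.norm([0.9, 0.1])
on_camera_frame(arriving, enrolled_adults,
                unlock_gate=lambda who: print("unlock gate for", who),
                ring_buzzer=lambda: print("ring buzzer for front desk"))
# Prints: unlock gate for parent_a
```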
The school offered to answer questions about privacy concerns, and explained that it uses the adults’ photos only for its own security system, said another mother, whose kids are in first and third grade at UCDS. “I trust that the school knows what it’s doing,” she said. “Feeling like my kids are safe here is huge.”
RealNetworks plans to keep giving SAFR to schools for free, even as the company expands it to new industries where it will charge for the system, Vance said.
The rash of school shootings across the country has thrust school security into the spotlight. RealNetworks executives said they know SAFR isn’t a fail-safe system, but they figure each little security component helps, especially one that can recognize if a known, unwanted person has attempted to enter school grounds.
“There’s no panacea here,” said the Allen Institute’s Etzioni, who is not involved in RealNetworks’ project. “But I do think that trading some degree of privacy is a reasonable trade-off for saving children’s lives.”
Some critics, however, aren’t so sure such systems in schools will be effective enough to outweigh the privacy costs.
“There’s a general habituation of people to be tolerant of this kind of tracking of their face,” said Adam Schwartz, a lawyer with digital privacy group Electronic Frontier Foundation. “This is especially troubling when it comes to schoolchildren. It’s getting them used to it.”
School security is a serious issue, he agreed, but he said the benefits of facial recognition in this case are largely unknown, and the damage to privacy could be “exceedingly high.”
Call for regulation
Privacy issues have become an increasingly common topic as facial recognition gets more accurate. Big tech companies, including Amazon and Google, faced opposition for selling their systems to law-enforcement agencies, which some fear could lead to a police state and unfair profiling.
But law-enforcement agencies have also used such systems to find missing people and arrest criminals, a clear benefit to public safety.
Some critics believe tech companies should restrict the use of the technology. But its development and its use by law-enforcement agencies have reached the point where the federal government should step in and regulate it, Microsoft President Brad Smith argued in a blog post this summer.
“The only effective way to manage the use of technology by a government is for the government proactively to manage this use itself,” he wrote. “And if there are concerns about how a technology will be deployed more broadly across society, the only way to regulate this broad use is for the government to do so.”
RealNetworks is supporting its Redmond neighbor’s call for regulation, Vance said.
Regulations are already in place in Europe, where the sweeping General Data Protection Regulation, which recently took effect, requires organizations to inform people before collecting their biometric data and to tell them what it will be used for, even in stores and businesses.
A few U.S. states also have consent laws regarding the technology. A Washington law passed last year requires companies to inform people before collecting biometric data and to tell them what their information will be used for. But most states have no such mandates, and the federal government has yet to address the topic.
Between here and sci-fi
In China, the technology is so common that it can identify people who are jaywalking and display their photos on public digital billboards.
The U.S. isn’t yet at the point of routinely identifying people on public streets or in parks, said Clare Garvie, an associate at the Center on Privacy and Technology at Georgetown Law Center, but she finds the lack of transparency about how the technology is used, and the absence of federal laws, troubling.
Garvie was on a team that conducted a wide-ranging study finding that 54 percent of U.S. residents are in a facial-recognition database accessible to law enforcement, usually in the form of a driver’s license photo. “It is unprecedented to have a biometric database that is composed primarily of law-abiding citizens,” Garvie said.
“The current trajectory might fundamentally change the relationship between police and the public,” she said. “It could change the degree to which we feel comfortable going about our daily lives in public spaces.”
But proper regulation could prevent that, and there’s reason to be optimistic, Garvie said, pointing to Microsoft’s call for such laws.
Use of the technology has been slowed somewhat by its remaining limits: It isn’t as accurate at identifying faces in varied lighting or in motion, and it takes a tremendous amount of computing power to quickly match faces among billions of photos. But these hurdles are surmountable, said Alessandro Acquisti, a professor of information technology and public policy at Carnegie Mellon University.
He pointed out that facial recognition can be used for good, such as combating child trafficking, and for ill, such as tracking law-abiding citizens wherever they go.
That doesn’t mean it’s neutral, he said. Anonymity is becoming more scarce with the proliferation of photos on social media and of technology that can recognize faces.
“Those with unfettered access to your data, and especially those whose usage of your own data you cannot inquire about or limit, have power over you,” he said.