The security guard couldn’t hear Beth Williams over the screeching alarm. LED strobe lights danced up and down the powder-blue and pastel-pink walls of the Cub House day-care center. The guard moved to block Maria Sanchez from walking toward a classroom to pick up her daughter. He extended his arm toward her chest. Maria looked frightened, and Beth, Cub House’s founder and director, was horrified. She rushed to the guard and hissed, “She’s fine!” directly into his ear. She then took out her phone and typed a few numbers, and the alarm and lights stopped. A piano resumed playing “Somewhere Over the Rainbow” in a classroom down the hall.

“I’m really sorry about that,” Beth said, putting a hand on Maria’s shoulder.1 “We’re testing our facial recognition security system. The stupid thing keeps going off for no reason.”

A Faulty System

Late that afternoon, after all the children and their caregivers had left the building, Beth asked Charles Rivers, her chief technology officer, and Anthony Michaels, her head of security, to join her in the teachers’ lounge. Both men had been with the not-for-profit Cub House since 2010, when Beth had founded it in Dayton, Ohio, her hometown. The largest facility of its kind in the state, it had numerous recreational and educational spaces for children up to age five, and caregivers could use it flexibly. Some kids attended daily; others came once a month. But all the kids and the adults who dropped them off and picked them up were expected to be registered, and the adults had to supply photo IDs and license plate numbers with their names.

Three months earlier, while the receptionist and the security guard were away from the welcome desk, an elderly, mentally ill woman had wandered into the facility, frightening several toddlers and causing an uproar among parents. Charles and Anthony had suggested that Beth implement a facial recognition–based security system. It would use cameras to photograph every visitor’s face and software to analyze the face’s geometry, after which each photo would receive a unique code tagged with a name and access permissions. If someone unrecognized by the system or with an obscured face entered the building, or a registered visitor went somewhere unauthorized, such as a child’s bathroom or an administrative office, an alarm would sound.
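In rough terms, the decision rule the two men were proposing can be sketched in a few lines of Python. The names, data structures, and stubbed-out face-matching step below are purely illustrative, not the vendor’s actual product:

```python
# A minimal sketch of the alarm rule described above. All names, zones, and
# the face-matching step itself are hypothetical stand-ins for the vendor's
# product, which would do the actual geometry analysis.
from dataclasses import dataclass, field

@dataclass
class Visitor:
    face_code: str                     # unique code assigned to the face
    name: str
    allowed_zones: set = field(default_factory=set)

REGISTRY = {}  # face_code -> Visitor, built when adults register

def should_alarm(matched_code, zone):
    """Return True if this sighting should trigger the alarm."""
    if matched_code is None:                     # face obscured or no confident match
        return True
    visitor = REGISTRY.get(matched_code)
    if visitor is None:                          # face detected but never registered
        return True
    return zone not in visitor.allowed_zones     # registered, but a restricted area

# Example: a registered parent allowed in the lobby and one classroom
REGISTRY["f123"] = Visitor("f123", "Maria Sanchez", {"lobby", "classroom-2"})
print(should_alarm("f123", "lobby"))          # False: no alarm
print(should_alarm("f123", "admin-office"))   # True: unauthorized area
print(should_alarm(None, "lobby"))            # True: obscured or unmatched face
```

The key point is that the alarm fires both when the system cannot match a face and when a matched visitor steps outside her permissions.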

Anthony had said that thanks to the Cub House’s nonprofit status, the vendor was willing to waive the up-front costs to get the system running properly, and the project would take only a few weeks. Charles had agreed: “The whole thing will be financially manageable and technologically sound, and it will help everyone sleep better at night.”

Beth had been skeptical. Were they really going to delegate the important job of ensuring child and visitor safety to an automated system? Facial recognition seemed creepy to her, not the sort of thing to have in a children’s space. Still, her colleagues were insistent, so she’d agreed to test the system, and in informal trials with her staff over a three-day development break, when the center was closed to visitors, the technology had worked perfectly. When she sent a teacher’s assistant who didn’t yet have full access into a restricted area, the alarm sounded. When she rushed past reception with her face in her hands, the alarm sounded. The security app on her phone captured an image of the trespasser, and Beth could enter a passcode to stop the alarm. She ran dozens of informal tests like those, and the system responded flawlessly. Beth’s concerns fell away.

But the morning the system went live, it was an immediate disaster. The alarm sounded on five occasions, none of which was a security breach. Each time, it frightened the staff, the children, and the caregivers in the facility. The fifth incident had been the one involving Maria Sanchez.

“I’m concerned,” Beth said to Charles and Anthony that afternoon. “We’ve already gotten complaints. It just seems like a huge community-relations and visitor-satisfaction risk.”

“These are typical first-day kinks,” Charles countered. “We know the system works. The vendor assures me that several hospitals and schools have already tested it.2 We can just explain to the parents that this is all being done to protect their kids. And if people see that we’re prioritizing safety, they’ll be more likely to come here. If we can prevent even one incident during which someone might be harmed, this project will have been worth it.”

“But I think we’ve got a problem,” Anthony interjected. “Did you happen to notice any similarities among the people who triggered the alarm?”

Beth and Charles looked at each other and shrugged.

“All five were dark-skinned women,”3 Anthony said. “Facial recognition systems, like any other automated technology, depend on the data used to build them: If the faces used were primarily white and male—which describes most of the guys who program these things—the system will make more errors when evaluating Black, brown, or feminine faces. We can address this problem partly in technical ways—right, Charles?” Charles nodded. “But can you guarantee that we’ll be able to fully solve it?”
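Anthony’s point can be checked empirically. A hypothetical audit, assuming the center logged every sighting along with whether the alarm fired and whether a real breach occurred, might compute false-alarm rates by group along these lines:

```python
# A hypothetical audit: given a log of sightings (field names invented for
# illustration), compute the false-alarm rate for each demographic group.
from collections import defaultdict

def false_alarm_rates(sightings):
    """sightings: iterable of dicts such as
    {"group": "dark-skinned women", "alarmed": True, "real_breach": False}."""
    totals = defaultdict(int)
    false_alarms = defaultdict(int)
    for event in sightings:
        totals[event["group"]] += 1
        if event["alarmed"] and not event["real_breach"]:
            false_alarms[event["group"]] += 1
    return {group: false_alarms[group] / totals[group] for group in totals}
```

On the system’s first day, such a report would have shown all five false alarms concentrated among dark-skinned women.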

Beth was again horrified. If false alarms disproportionately affected women and people of color—which described the vast majority of those who came to the center each day—not only would it be a huge and frequent disruption and a potential reputational and legal-liability risk, but it would go against everything she tried to teach the children about fairness and empathy. That said, given the number of school shootings4 in the United States, high crime rates in Dayton, and the scare with the mentally ill woman, keeping the Cub House safe was without a doubt her and her clients’ number one priority.

“Let’s shut the system down for now,” Beth said. “I’ll give you and the vendor a few weeks to work on fixing the false alarms.”

The Legal Implications

Beth met Tanya Marshall for lunch the next day at a bistro in downtown Dayton. The two women had been friends since they were undergraduates at Ohio State. Tanya was now the first Black partner at the city’s largest law firm. She’d encouraged Beth to build the Cub House back when it was just a set of PowerPoint slides. Her two children had attended regularly as toddlers, and she was one of the Cub House’s most active board members and consistent donors.

“I wanted to check in with you,” Tanya said after they’d ordered. “Maria Sanchez posted on the Cub House parents’ Facebook page that she and two other Latinx women were stopped by alarms when they entered the center.”

“It was five,” Beth said.

“Five?”

“Yes, the alarm rang for five different women of color.” Beth told Tanya about the mentally ill woman, the facial recognition software, and the software’s flaws. “The system is being improved as we speak,” she said.

“You’d better not bring it back,” Tanya said. “At least not yet. You might be opening the center up to a lawsuit if Ohio passes new privacy laws.” She cited recent lawsuits and settlements in Illinois involving Facebook, Google, and Clearview AI.

“Most people are uncomfortable5 when companies use this kind of technology,” Tanya said. “I’m not sure it would be any different for the Cub House.”

“Even if it makes us safer?” Beth asked. “What if we take the system out and there’s another breach that we could have prevented? God forbid someone enters the facility with a gun! How would I explain that to parents—that we had the technology to better protect the kids but didn’t use it? Wouldn’t that open us up to a lawsuit too?”6

Tanya was silent for a moment. “Look, I’m not saying you can’t use facial recognition software,” she said. “But you’ve got to think it through, legally and ethically, and you’ve got to make sure you’re covered from all angles. If women of color start getting harassed by security guards, you can say goodbye to the Cub House.”

Good to Go?

For the next month Charles worked with the vendor to improve the security system’s accuracy. They installed three high-resolution cameras at the reception desk, which they used to take photos from varying angles of everyone who entered and exited the building. A sign at the reception desk stated, “Your picture is being taken for security purposes.” The facial recognition software tagged each photo with the visitor’s name and access permissions. The more photos that were attached to each file, the more accurate the system would be in recognizing the face. To directly combat the issues with feminine and dark-skinned faces, Beth had authorized the installation of high-powered lights at the reception desk so that the cameras could better capture facial detail.
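Why do more photos per file help? One simplified way to picture the matching step, with an entirely hypothetical distance threshold and embedding format standing in for the vendor’s proprietary face-encoding, is this:

```python
# A simplified picture of the matching step. The embeddings, the Euclidean
# distance metric, and the 0.6 threshold are illustrative assumptions, not
# the vendor's proprietary method.
import math

ENROLLED = {}   # visitor name -> list of embeddings captured from different angles

def enroll(name, embedding):
    """Attach another photo's embedding to a visitor's file."""
    ENROLLED.setdefault(name, []).append(embedding)

def identify(embedding, threshold=0.6):
    """Return the enrolled visitor whose closest stored embedding falls within
    the threshold, or None (which would trigger the alarm)."""
    best_name, best_dist = None, float("inf")
    for name, stored in ENROLLED.items():
        d = min(math.dist(embedding, e) for e in stored)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist <= threshold else None
```

Each additional enrollment photo gives the system another chance to find a close match, which is why the new cameras captured faces from several angles at reception.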

To measure how well the system would work in real time, Charles used it during the summer recital, when most of the kids and their caregivers were at the Cub House listening to five-year-olds singing songs from The Wizard of Oz. He disabled the alarm but kept notifications active so that he’d receive an alert whenever a breach was registered. He received two that evening. The first was triggered when an unregistered parent entered the center without stopping at reception. The second was triggered by a registered father who took his toddler into a children’s bathroom for a diaper change.

The next morning Charles brought in doughnuts, and he, Beth, and Anthony discussed the results at a picnic table in the Cub House’s backyard. Charles said he was confident that the system had improved and that false alarms, although inevitable, would be less frequent. He recommended lowering the volume slightly so that the alerts wouldn’t be as harrowing. He had also drafted a letter to the community explaining the system and potential causes of false alarms and asking parents to forgive him in advance for any inconvenience.

Anthony was satisfied with the plan. At worst, the system would be an annoyance. At best, it would protect the kids and provide some help to his security staffers.

Beth was still worried that a month of tinkering and a night of testing weren’t enough to ensure the system’s accuracy. A quieter alarm was better, sure, but what good would it be if not everyone could hear it? Would this technology really make the Cub House safer? Or would it open Beth’s business to further problems?

The Experts Respond: Should Beth implement facial recognition software at the Cub House?

Joseph Steinberg, a former CEO of SecureMySocial and Green Armor Solutions, is the author of Cybersecurity for Dummies.

I would ask Beth three questions: Does the technology solve the problem that you are trying to solve? If so, is it the best way to solve your problem? And does it solve the problem without creating new problems? The answer to all three questions is no.

Beth does not need, and should not use, facial recognition technology as a primary mechanism for securing the Cub House against entry by unauthorized parties. There are better ways to achieve her aims—and those ways will deliver better security than will facial recognition technology while creating fewer undesirable side effects.

Beth could, for example, issue scannable identification badges to authorized entrants. A guard could scan the ID of anyone trying to enter the Cub House and compare the appearance of would-be entrants with photos that appear on a computer screen. (The guard would no doubt become familiar with people who arrive every day.) Beth could lock from the outside and alarm all Cub House doors except the front entrance and give the guard camera views of all doors. She might even install two sets of doors in a man-trap formation at the entrance.

A classic approach may be less fancy than facial recognition, but it should also be more effective. Depending on how it is configured, facial recognition technology is likely to either create more false alarms or allow more unauthorized people to enter the Cub House than would a security guard. Also, unlike a guard, such a system cannot stop a would-be shooter or other unauthorized person from entering by force. The bottom line is that the Cub House is likely to be less secure with Beth’s facial recognition system in place than it would be if Beth utilized an alternative, time-tested approach.

Cedric L. Alexander is the author of In Defense of Public Service and The New Guardians.

Facial recognition is a useful technology. We use it every day—to open our phones, when we travel, for security purposes. But you can’t test and promote a product around facial recognition using only white people as your subject base, and you can’t test it in one night during a concert, which is what Beth’s team did. Before she brought the system into the Cub House, where she’s responsible for the safety of children, Beth should have talked to other vendors and other people who have experience with facial recognition systems. There needs to be reliable data to support the ability of the system to respond accurately. It certainly helps that the Cub House added lighting and a better camera system, but the company Beth is buying the technology from really needs to show her more data to prove that the system is accurate. She should ask the vendor directly, “Is this product being tested on a broad population of people?” Then she must work with the vendor to provide her staff with expert training, and she should require the staffers using the system to meet certification requirements.

I would have spoken to the parents up front, not just in a letter from Charles. Beth could have talked with them one-on-one or in groups. She could have explained everything to them: “Here is the technology we’re going to be using to keep your children safe. This is the vendor we’re considering using. Here’s how the technology will reduce the likelihood of someone’s coming in here and doing harm to your children.” When you tell people that you’re considering using a new technology, they have an opportunity to buy into it or to express their concerns. People are often willing to take part in something new and innovative, especially if it comes across as fair and productive. But if Beth doesn’t talk to the parents and something goes awry, she’ll face more backlash, even if Charles has already sent a letter.

In the next 10 or 15 years facial recognition systems will become a normal part of how we secure ourselves. But they must be refined, and we must do a better job of teaching the public the value of the technology.

HBR’s fictionalized case studies present problems faced by leaders in real companies and offer solutions from experts. This one is based on the University of Virginia Case Study “Ubiquitous Surveillance” (product no. UVA-OB-1403), by David Danks and Maralee Harrell, which is available from UVA Darden Business Publishing.

A version of this article appeared in the November–December 2022 issue of Harvard Business Review.