
Trump Calls For More Biometric Scans, Data Sharing To Stop Terrorism

International travelers can expect to see more facial recognition and other biometric technologies per the latest national security strategy document.

Travelers coming to the U.S. might have seen new biometric facial readers at airports, ports and land crossings this year and, per a new national security strategy, are likely to see a lot more in the near future.

President Donald Trump on Wednesday signed the National Strategy to Combat Terrorist Travel, a new policy document that calls for increased use of biometric technologies and sharing of the data being collected in order to prevent the travel of known or suspected terrorists.

“Key to detecting and interdicting terrorists attempting to travel will be enhancing systems that validate identities and advancing the use of biometric technologies,” Trump wrote in the introduction.

The use of biometrics, along with biographic identifiers, is central to the strategy, which treats them as the primary means of vetting whether a traveler is a known or suspected terrorist.

“Vetting includes automated biographic and/or biometric matching against watchlists and threat information,” the strategy states. The process referenced in the strategy “does not include the physical screening or inspection of people or goods that may occur at the border, United States Secret Service venues, or transportation checkpoints.”

Customs and Border Protection’s Entry/Exit office has been working on biometric programs for years, most recently rolling out a set of facial recognition pilots at international airports and border crossings. The pilots stopped an alleged imposter from entering the country this week: an 18-year-old Salvadoran man presenting a false U.S. passport at the Port of San Luis.

The pilots collectively stopped 26 alleged imposters from entering the U.S. in the first three months of operation, the vast majority at land crossings on the southern border.

Those programs don’t store the data they ingest but do match facial scans against databases like the ones referenced in the strategy document. Among enhancements to the nation’s intelligence and data collection capabilities, the strategy calls for the government to “improve functionality of identity-management systems and expand the collection and use of biometric, biographic and derogatory data for vetting and screening."
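CBP has not published the details of that matching step, but in broad strokes it amounts to comparing a feature vector derived from the live photo against stored templates and flagging anything above a similarity threshold. The sketch below is a hypothetical illustration of that idea only, not CBP’s actual pipeline; the threshold, the toy embeddings and the helper names (cosine_similarity, screen_traveler) are assumptions made for the example.

```python
import math
from typing import Dict, List

# Hypothetical illustration only: CBP has not published its matching pipeline.
# "Embeddings" here are just fixed-length feature vectors; in a real system
# they would come from a face-recognition model applied to the live photo.

MATCH_THRESHOLD = 0.85  # assumed cutoff; real systems tune this against false-positive rates


def cosine_similarity(a: List[float], b: List[float]) -> float:
    """Similarity between two feature vectors, with 1.0 meaning identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


def screen_traveler(live_embedding: List[float],
                    watchlist: Dict[str, List[float]]) -> List[str]:
    """Return watchlist records whose stored templates resemble the live scan.

    The live embedding is only compared, never written anywhere, mirroring the
    article's description that the pilots do not retain the scans they ingest.
    """
    hits = []
    for record_id, template in watchlist.items():
        if cosine_similarity(live_embedding, template) >= MATCH_THRESHOLD:
            hits.append(record_id)
    return hits


# Toy usage with made-up three-dimensional "embeddings"
watchlist = {"record-001": [0.9, 0.1, 0.2], "record-002": [0.1, 0.8, 0.5]}
print(screen_traveler([0.88, 0.12, 0.21], watchlist))  # likely matches record-001
```

In practice, any hit above the threshold would be treated as a lead for an officer to resolve rather than a definitive identification, which is where the manual review the strategy describes comes in.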

Later, the strategy calls on administration officials to urge foreign partners to adopt these same technologies and share data between countries.

During a September press conference unveiling the facial biometric boarding program at Dulles International Airport in Washington, D.C., federal officials told reporters the program is not connected to any international databases. However, vetted information from international partners could find its way into U.S. databases, and vice versa.

“CBP is working diligently to meet the Congressional mandate for biometric entry/exit in a way that is most efficient and secure for the traveler, and is the least disruptive for the travel industry while effectively enhancing immigration and border security,” a CBP spokesperson told Nextgov after the new strategy was signed. “As part of these efforts, CBP is working with DHS to propose and implement the necessary regulatory updates. A rigorous process is in place to propose regulatory changes, and CBP and DHS are actively working toward that end.”

Civil liberties groups continue to speak out against the broad use of facial recognition technology, voicing concerns over privacy and the ramifications of false positives, whose rates tend to be higher for minorities.

“Congress should be reviewing and limiting existing border surveillance programs, not providing additional funding for dangerous technologies that infringe on our basic rights,” said Evan Greer, a lead organizer with Fight for the Future, which assembled a coalition of 25 civil liberties organizations to oppose the increased use of surveillance technologies at the border.

Other groups, like the American Civil Liberties Union, have moved to block private companies from selling biometric technologies to the government, specifically citing problems with false positives.

For instance, Amazon’s Rekognition system reportedly matched 28 of the 535 members of Congress incorrectly to mug shots, an overall error rate of about 5 percent among legislators. That error rate is much higher among minority populations, prompting civil rights groups to warn that such technologies will lead to more racial profiling and civil rights abuses.
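The arithmetic behind that figure is straightforward, assuming, as the ACLU’s description of the test indicated, that all 535 sitting members of Congress were scanned; that assumption is not spelled out in this article.

```python
# Back-of-the-envelope check of the error rate cited above.
# Assumption (not stated in the article): all 535 sitting members of
# Congress -- 435 representatives plus 100 senators -- were scanned.
false_matches = 28
members_scanned = 535

error_rate = false_matches / members_scanned
print(f"Overall false-match rate: {error_rate:.1%}")  # prints "Overall false-match rate: 5.2%"
```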

The strategy accounts for some of this concern, including a note in the appendix entry defining vetting: “Vetting includes automated biographic and/or biometric matching against watchlists and threat information as well as manual and automated processes used to resolve potential matches and false positives.”

“We believe it is the wrong approach to impose a ban on promising new technologies because they might be used by bad actors for nefarious purposes in the future,” Matt Wood, general manager of artificial intelligence for Amazon Web Services, told Nextgov in a statement in January. “Through responsible use, the benefits have far outweighed the risks.”