On Sunday, Immigration and Customs Enforcement (ICE) will begin a series of deportation raids at the behest of President Donald Trump’s administration.
"At the request of Democrats, I have delayed the Illegal Immigration Removal Process (Deportation) for two weeks to see if the Democrats and Republicans can get together and work out a solution to the Asylum and Loophole problems at the Southern Border. If not, Deportations start!" Donald J. Trump (@realDonaldTrump), June 22, 2019
As agents break down doors and try to capture 2,000 "high priority" targets, including families who recently crossed the border, they will be guided by both boots-on-the-ground intelligence and artificial intelligence.
ICE is among the many government enforcement agencies that, in recent years, have ambitiously expanded their use of artificial intelligence, such as facial and license plate recognition. It has also ramped up its monitoring of social media and other online activity. While Customs and Border Protection (CBP) uses drones, ICE does not (as of last year, at least), though the agency has expressed interest in them.
Constitutional rights advocates have long raised concerns about the legal, ethical, and racial implications of mass surveillance–often without notice or cause–in law enforcement. So far there are few indications that enforcement agencies will pay heed to such concerns.
Many were stunned by recent reports that at least three states quietly granted ICE permission to access millions of driver's records and analyze the photos with facial recognition software. The states in question, Vermont, Utah, and Washington, allow undocumented immigrants to get driver's licenses. Vermont and Utah have reportedly complied, though to what extent is not entirely clear. Washington authorized administrative subpoenas, but it is unknown whether any searches have been conducted.
Like the Dreamers before them, undocumented immigrants in these states may have provided the government with biographical information, so that they wouldn't risk deportation over a simple charge of driving without a valid license, only to have it later handed over to ICE.
State and local law enforcement agencies have also been scrutinized for allowing ICE to use their databases to gather intelligence. Many departments routinely provide ICE agents with information, even if it violates ICE’s privacy policies and other data-sharing laws.
Chicago Mayor Lori Lightfoot recently announced that the city's police force would not coordinate with ICE and that the agency's access to the Chicago Police Department's databases had been cut off.
“They will not team up with ICE to detain any resident. We have also cut off ICE access from any CPD databases and that will remain permanent,” Lightfoot said.
Chicago's cold shoulder is not likely to break immigration agents' stride, however, because ICE can simply turn to Vigilant Solutions' massive database.
Vigilant’s database comprises information gathered via automated license plate readers (ALPRs), facial recognition software, and data-sharing agreements. Through its contract with Vigilant, ICE “has access not only to 5 billion records gathered by private businesses but also to 1.5 billion data points contributed by over 80 local law enforcement agencies from more than a dozen states,” according to Wired.
Some states regulate or limit the dissemination and retention of ALPR data, but the laws are so inconsistent, and even more inconsistently enforced, that they may not even apply to a private company like Vigilant.
For instance, ICE may be able to use information from a driver's license to determine vehicle ownership and a license plate number. The agency can then track that car every time its plate is scanned by an ALPR, anywhere in the nation, indefinitely.
The company has bragged that it adds millions of ALPR hits every month, enabling any of the 9,200 ICE agents who reportedly have access to the data to track a car.
The most controversial artificial intelligence used by law enforcement is facial recognition software. In addition to myriad privacy concerns, facial recognition technology has long been criticized for being inaccurate, particularly when identifying minorities, and for its potential to serve as a tool for racial profiling. Earlier this year, San Francisco became the first city to ban it.
In April, reports surfaced that Amazon’s facial recognition software had issues when identifying women and minorities. Researchers from Google, Facebook, and Microsoft joined more than two dozen experts calling on the company to stop selling it to the government. In May, the American Civil Liberties Union (ACLU) and other groups penned an open letter to Amazon CEO Jeff Bezos making the same request.
At a congressional hearing in June, representatives were especially concerned that these records are nearly always collected from individuals without their knowledge or consent.
Members of the general public likely expect their faces to be scanned in a place like an airport, but may be less aware that they could also be scanned at a protest or other public event.
Vigilant claims to have amassed 15 million images in its “gallery” of faces. And that’s just one database. Any number of the faces stored by it, other agencies, or companies might correspond to an undocumented immigrant—or even someone who just kind of looks like them.
In 2017, ICE came under fire for its plans to monitor the internet and automatically flag people for deportation or visa denial as part of Trump’s extreme vetting initiative. ICE was developing a system that incorporated machine learning and automated decision-making to constantly scour the internet. The system would look for anyone who fell within the same parameters as people included in Trump’s travel ban–commonly referred to as a Muslim ban because five out of seven of the targeted countries are Muslim-majority countries.
“It will function, in effect, as a digital Muslim ban,” the Brennan Center for Justice wrote at the time.
The system was required to flag at least 10,000 people annually.
Following intense backlash in 2018, ICE dropped the machine-learning aspect of the program but did not discontinue it. Humans now conduct the extreme digital vetting instead.
It is unknown how many, if any, of the estimated 2,000 people who will be caught up in ICE's upcoming dragnet were monitored online. Nor is it clear to what extent ICE relies on AI and other tech. It is clear, however, that law enforcement's use of such technology is extensive and growing.
“It is surprising when a government agency obtains unfettered access to information that reveals where we live, where we work, and our private habits,” Matt Cagle of the ACLU told Wired.