In the weeks after Russia invaded Ukraine and images of the devastation wrought there flooded the news, Hoan Ton-That, CEO of the facial recognition company Clearview AI, began thinking about how he could get involved.
He believed his company’s technology could bring clarity to murky situations in the war.
“I remember seeing videos of captured Russian soldiers and Russia claiming they were actors,” Ton-That said. “I thought if Ukrainians could use Clearview, they could get more information to verify their identities.”
In early March, he reached out to people who might help him contact the Ukrainian government. One of Clearview’s advisory board members, Lee Wolosky, a lawyer who has worked for the Biden administration, was meeting with Ukrainian officials and offered to deliver a message.
Ton-That drafted a letter explaining that his app “can instantly identify someone just from a photo” and that police and federal agencies in the United States used it to solve crimes. That feature has brought Clearview scrutiny over concerns about privacy and questions about racism and other biases within artificial intelligence systems.
The tool, which can identify a suspect caught on surveillance video, could be valuable to a country under attack, Ton-That wrote. He said the tool could identify people who might be spies, as well as deceased people, by comparing their faces against Clearview’s database of 20 billion faces from the public web, including from “Russian social sites such as VKontakte.”
Ton-That decided to offer Clearview’s services to Ukraine for free, as reported earlier by Reuters. Now, less than a month later, the New York-based Clearview has created more than 200 accounts for users at five Ukrainian government agencies, which have conducted more than 5,000 searches. Clearview has also translated its app into Ukrainian.
“It’s been an honor to help Ukraine,” said Ton-That, who provided emails from officials at three Ukrainian agencies confirming that they had used the tool. It has identified dead soldiers and prisoners of war, as well as travelers in the country, confirming the names on their official IDs. Fear of spies and saboteurs has heightened paranoia in the country.
According to one email, Ukraine’s national police obtained two photos of dead Russian soldiers on March 21; the photos have been viewed by The New York Times. One dead man had identifying patches on his uniform, but the other did not, so the police ran his face through Clearview’s app.
The app surfaced photos of a similar-looking man, a 33-year-old from Ulyanovsk who wore a paratrooper uniform and held a gun in his profile photos on Odnoklassniki, a Russian social media site. According to an official from the national police, attempts were made to contact the man’s relatives in Russia to inform them of his death, but there was no response.
Identifying dead soldiers and notifying their families is part of a campaign, according to a Telegram post by Ukrainian Vice Prime Minister Mykhailo Fedorov, to break through to the Russian public the cost of the conflict and to “dispel the myth of a ‘special operation’ in which there are ‘no conscripts’ and ‘no one dies,’” he wrote.
Images from conflict zones of slaughtered civilians and soldiers left behind on city streets turned battlefields have become more widely and instantaneously available in the social media era. President Volodymyr Zelenskyy of Ukraine has shown graphic images of attacks on his country to world leaders in making his case for more international aid. But beyond conveying a visceral sense of war, those kinds of images can now offer something else: a chance for facial recognition technology to play a significant role.
Critics warn, however, that the tech companies could be taking advantage of a crisis to expand with little privacy oversight and that any mistakes made by the software or those using it could have dire consequences in a war zone.
Evan Greer, a deputy director for the digital rights group Fight for the Future, is opposed to any use of facial recognition technology and said she believed it should be banned worldwide because governments had used it to persecute minority groups and suppress dissent. Russia and China, among others, have deployed advanced facial recognition through camera networks in their cities.
“War zones are often used as testing grounds — not just for weapons, but surveillance tools that are later deployed on civilian populations or used for law enforcement or crowd control purposes,” Greer said. “Companies like Clearview are eager to exploit the humanitarian crisis in Ukraine to normalize the use of their harmful and invasive software.”
Clearview is facing several lawsuits in the United States, and its use of people’s photos without their consent has been declared illegal in Canada, Britain, France, Australia and Italy. It faces fines in Britain and Italy.
Greer added, “We already know that authoritarian states like Russia use facial recognition surveillance to crack down on protests and dissent. Expanding the use of facial recognition doesn’t hurt authoritarians like Putin. It helps them.”
Facial recognition has advanced in power and accuracy in recent years and is becoming more accessible to the public.
While Clearview AI says it makes its database available only to law enforcement, other facial recognition services that search the web for matches, including PimEyes and FindClone, are available to anyone willing to pay for them. PimEyes surfaces public photos from across the internet, while FindClone searches photos scraped from the Russian social media site VKontakte.
Facial recognition vendors are choosing sides in the conflict. Giorgi Gobronidze, a professor in Tbilisi, Georgia, who bought PimEyes in December, said he had barred Russia from using the site after the invasion started, citing concerns it would be used to identify Ukrainians.
“No Russian customers are allowed to use the service now,” Gobronidze said. “We don’t want our service to be used for war crimes.”
Groups like Bellingcat, the Dutch investigative site, have used facial recognition sites for reports on the conflict and on Russia’s military operations.
Aric Toler, research director at Bellingcat, said his preferred face search engine was FindClone. He described a three-hour surveillance video that surfaced this week, said to be from a courier service in Belarus, showing men in military uniforms packing up materials, including TVs, car batteries and an electric scooter, for shipping.
Toler said FindClone allowed him to identify several of the men as Russian soldiers sending “loot” to their homes from Ukraine.
As Ukraine and Russia fight an information war over what motivated the invasion and how it is going, journalists like Toler sometimes play the role of arbiter for their audiences.
Fedorov, Ukraine’s vice prime minister, tweeted a still from the same surveillance tape showing one of the soldiers at the courier service counter. Fedorov claimed the man had been identified as an “officer of Russian special forces” who had committed atrocities in Bucha and was “sending all the stolen items to his family.”
Fedorov added, “We will find every killer.”
The technology has potential beyond identifying casualties or tracking certain units. Peter Singer, a security scholar at New America, a think tank in Washington, said the increasing availability of data about people and their movements would make it easier to track down individuals responsible for war crimes. But it could also make it hard for civilians to lie low in tense environments.
“Ukraine is the first major conflict that we’ve seen the use of facial recognition technology in such scale, but it is far from the last,” Singer said. “It will be increasingly hard for future warriors to keep their identity secret, just as for regular civilians walking down your own city streets.
“In a world of more and more data being gathered, everyone leaves a trail of dots that can be connected,” he added.
That trail is not just online. Drone footage, satellite images, and photos and videos captured by people in Ukraine are all playing a role in discerning what is happening there.
Toler of Bellingcat said the technology was not perfect. “It’s easy to misfire; that goes without saying,” he said. “But people are more right than wrong with this. They have figured out how to corroborate identifications.”
Faces can look similar, so secondary information, in the form of an identifying mark, a tattoo or clothing, is important to confirm a match. Whether that will happen in a tense wartime situation is an open question.
Toler is not sure how much longer he will have access to his preferred facial recognition tool. Because FindClone is based in Russia, it has been subject to sanctions, he said.