This man is said to have looted an apartment in a Kyiv suburb – a war crime, albeit one of the less serious ones currently being committed by Russian soldiers in Ukraine. He took a selfie with a Polaroid camera he found in the apartment and then carelessly left the photo lying around.
Ukraine wants to identify the man using artificial intelligence, because in the Ukraine war both warring parties are trying to gain an advantage with AI. Ukraine in particular uses this very effectively for publicity – often it works well, but sometimes it does not.
Ukraine’s Deputy Prime Minister and Minister of Digital Transformation, Mykhailo Fedorov, believes he knows the identity of the soldier – and denounces him publicly. With AI technology, he said, the “looter and war criminal” could be identified within a minute: it is 26-year-old Nikita Tretyakov.
One of the few remaining independent Russian media outlets, “Vazhnye Istorii” (Russian for “important stories”), sees things differently. According to its reporting, the photo shows 25-year-old Pavel Pilantov. Pilantov does in fact serve in the regiment that was stationed in the Kyiv suburb where the photo was taken – the journalists were able to confirm that. Nikita Tretyakov, on the other hand, was not in Ukraine at the time of the fighting, his sister says; he has not had anything to do with the army for a long time. Photos of him at an event, taken on the day of the fighting, can also be found online. Because of the media attention that followed the Deputy Prime Minister’s tweet, Nikita Tretyakov has deleted his social media accounts. We do not know what he thinks about the war. But the Polaroid photo certainly looks a lot more like Pavel Pilantov than like him.
The AI technology the Ukrainian minister is talking about evidently makes mistakes. And for such sensitive purposes, every mistake is a problem. Ukraine uses Clearview AI software not only to identify war criminals and prisoners of war and to collect evidence, but also to identify killed Russian soldiers whose deaths the Russian army conceals – sometimes even from their own families. Somewhere between intelligence gathering and information warfare.
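Under the hood, face-identification systems of this kind generally reduce each face to a numeric vector (an “embedding”) and match it against a database by similarity. Clearview AI’s actual pipeline is proprietary; the following is a purely illustrative sketch of the matching step, with invented names, vectors, and threshold:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def identify(query, database, threshold=0.8):
    """Return (name, score) of the best match at or above threshold, else (None, 0.0)."""
    best_name, best_score = None, 0.0
    for name, embedding in database.items():
        score = cosine_similarity(query, embedding)
        if score >= threshold and score > best_score:
            best_name, best_score = name, score
    return best_name, best_score

# Toy 4-dimensional embeddings; real systems use hundreds of dimensions.
database = {
    "person_a": [0.9, 0.1, 0.2, 0.3],
    "person_b": [0.1, 0.8, 0.7, 0.1],
}
print(identify([0.85, 0.15, 0.25, 0.3], database))
```

The threshold is the crux: set it too permissively and the system confidently returns a wrong name – exactly the kind of misidentification the Tretyakov/Pilantov case illustrates.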
However, AI algorithms are not only used for facial recognition. The Ukrainian secret service has managed to intercept unencrypted phone calls between Russian soldiers and their families on a large scale. These were transcribed, translated and analyzed using a tool from the American company Primer. Listening to the calls one by one would take far too many people, so the task is left to an AI: the speech recognition system is trained on certain words and phrases, taught to recognize dialects as well, and then told to filter out relevant conversations. Like this shocking example of a soldier calling his wife at home.
Woman: You can rape Ukrainian women there, but don’t tell me about it, okay?
Man: Mmm. I’m allowed to rape, but I shouldn’t tell you anything?
Woman: Yes, so I don’t know anything about it.
Man: May I?
Woman: Yes, I allow you. But use birth control.
For the Ukrainian side, such calls are worth their weight in gold in the information war. They show how brutally Russian soldiers act and leave no doubt: the “good side” is Ukraine.
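The filtering step described earlier – flagging only those transcripts that contain words or phrases of interest – can be sketched in a few lines. The watchlist, transcripts, and function below are invented for illustration; Primer’s actual tooling is proprietary and far more sophisticated:

```python
# Illustrative watchlist of terms an analyst might care about (invented).
KEYWORDS = {"rocket", "position", "commander", "retreat"}

def is_relevant(transcript, keywords=KEYWORDS, min_hits=1):
    """Flag a transcript that contains at least `min_hits` watchlist words."""
    words = set(transcript.lower().split())
    return len(words & keywords) >= min_hits

# Toy transcripts standing in for machine-transcribed, translated calls.
calls = [
    "hello mom everything is fine here",
    "the commander ordered us to change position tonight",
]
flagged = [c for c in calls if is_relevant(c)]
print(flagged)
```

Real systems would match inflected word forms and phrases rather than exact tokens, but the principle is the same: machines skim millions of minutes of audio so that humans only review what matters.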
So we see: AI has become a powerful tool in war, but it is still error-prone. And the holy grail of AI research is still nowhere in sight. AI can broadly be divided into two categories: weak and strong AI.
All of the above examples are so-called weak AI – that is, AI trained, usually via machine learning, to perform one specific task, such as recognizing faces, faster and better than humans.
Strong or general AI would be able to solve complex problems and make autonomous decisions much like a rational human. Strong AI does not really exist yet, but Russia is not the only country hoping that its successful invention will bring massive geopolitical advantages.
Vladimir Putin: “As I have said before: AI is an enormously powerful tool. Those who master it will emerge as leaders. They will gain a massive competitive advantage.”
An AI that is intellectually on par with humans but does not fear for its own survival would be capable of terrible things in war. However, we are still a long way from such a “Terminator robot”.
Neither Russia nor Ukraine currently has autonomous AI weapons, i.e. weapons that are controlled solely by artificial intelligence, although research is of course being carried out worldwide and breakthroughs are also taking place. Russia, for example, states that its KUB-BLA drone, manufactured by Kalashnikov, already uses AI to identify targets. Whether this is true is difficult to verify.
But even with weak AI, military applications are more complicated than civilian ones, because AI needs huge amounts of data to learn from. And since, thankfully, there are far fewer military confrontations in the world than there are faces or voices, it is much easier to train a face or speech recognition algorithm than one that identifies an opponent’s weapons in battle.
The bad news: in the Ukraine war, Russia can collect a lot of such material – including on the weapons supplied by NATO countries. But Ukraine and its allies have the same opportunity. Ukraine, for example, already uses software that lets its drones scan a large area with AI and detect enemy tanks, even camouflaged ones. The coordinates of any tanks found are then relayed to the soldiers operating the drone. The US military also says it has made significant advances in AI analysis of satellite and drone imagery. Ultimately, the United States wants to use images from Ukraine to train an AI that learns to simulate Russian tactics on the battlefield.
So Vladimir Putin is right about at least one thing: Whoever masters artificial intelligence will have a clear geopolitical and military advantage in the future. Major military powers, including Russia and the USA, have so far resisted legal regulation of autonomous weapon systems. That doesn’t bode well for world peace.