A poorly edited video purporting to show Ukrainian President Volodymyr Zelenskyy publicly capitulating to Russian demands drew widespread ridicule on Wednesday, but experts said it could be a harbinger of more sophisticated deceptions to come.
The video appeared to show an ashen-faced Zelenskyy speaking from the presidential lectern and urging his countrymen to lay down their weapons in the face of Russian invaders.
Internet users immediately flagged the discrepancies between the skin tone on Zelenskyy's neck and face, the odd accent in the video, and the pixelation around his head.
Nina Schick, the author of "Deepfakes," said the video looked like "an absolutely terrible faceswap," referring to programs that can digitally graft one person's face onto another's body -- part of a wider family of computer techniques that can create hyperrealistic forgeries known as "deepfakes."
Television station Ukraine24 said in a Facebook post that the video was broadcast by "enemy hackers" and was "FAKE! FAKE!" The station could not immediately be reached for further detail and Ukraine's cyber watchdog agency did not immediately return messages seeking comment. But Ukraine's Ministry of Defense later released a video from the real Zelenskyy apparently dismissing the footage as a "childish provocation."
"We are not going to lay down any weapons until our victory," he said.
Ukrainian officials have been warning of the danger of deepfakes, especially after Moscow's forces were denied a quick victory on the battlefield following their Feb. 24 invasion.
Two weeks ago, Ukraine's military intelligence agency put out a short video alerting the country to the danger of deepfakes, alleging that the Kremlin was preparing a stunt involving one.
The Russian Embassy in Washington did not immediately return a message seeking comment.
Schick called the fake Zelenskyy video "very crude," but warned that it was only a matter of time before the technology became more accessible.
"Expect fakes like this to become easier to produce while appearing highly authentic," she said.
© 2022 Thomson/Reuters. All rights reserved.