On Tuesday, a user going by the name “Ukrainian soldier” streamed a live video on TikTok from a window, showing buildings bathed in sunshine as sirens wailed in the background. Speaking in English, he thanked his tens of thousands of followers for sending virtual gifts that can be converted into cash.

But in reality the sky over Kyiv that day was overcast and snowy, and the audio apparently did not match the sirens heard in the city.

After the Financial Times flagged the account to TikTok, the Chinese-owned video app removed it – but not before the stream had amassed more than 20,000 views and received more than 3,600 virtual gifts, costing supporters around £50 in total.

The ByteDance-owned app, which has more than a billion users, is best known for its viral dance videos but has recently become a news source for many young people following the Russian invasion of Ukraine through their phones. Ukrainian accounts have shared their experiences, galvanizing support and aid for their country.

But the conflict has also exposed gaps in TikTok’s moderation controls that have allowed fraudulent accounts posting fake content to proliferate, attracting followers and money, experts say.

“There was an initial information void that was particularly susceptible to being filled by inaccurate images,” said TikTok misinformation researcher Abbie Richards.

Most cases involved users inadvertently posting misleading or inaccurate footage, Richards said. But in other instances, she said, people have deliberately posted misleading content to gain likes, follows and views, and some are exploiting the situation outright, especially on live streams, where soliciting donations is easy.

Researchers point out that TikTok’s financial incentives, designed to reward creators, have had unintended consequences. Creators can receive virtual gifts – such as digital roses and pandas – during live streams and convert them into diamonds, a TikTok currency that can then be withdrawn as real money. TikTok takes a 50% commission on money spent on virtual gifts, according to creators on the platform. The company said it does not disclose the financial breakdown of virtual gifting and that the monetary value of diamonds is based on “various factors”.

The FT found several live streams about the war in Ukraine containing misinformation that had received thousands of virtual gifts and views. One video showed a destroyed building set to mournful music; the FT traced the footage to archive material from a photographer in Latvia. TikTok has removed several videos and accounts flagged by the FT for violations involving dangerous misinformation, illegal activity, restricted goods and authenticity.

One of the top-ranking posts when searching the hashtag #Ukraine on the platform is a video of soldiers in military gear bidding an emotional farewell to women. It has been watched 7.3 million times but is in fact a scene from the 2017 Ukrainian film The War of the Chimeras.

TikTok became the world’s fastest-growing social media company in 2020 as users flocked to the platform during the pandemic. But its rapid growth has raised questions about the short-video app’s moderation capabilities as it scrambles to catch up with, and learn from, its more established social media rivals.

To address this, it has expanded its team of moderators in Europe, who manually review the most violent and disturbing content on the platform, and it also uses artificial intelligence to flag offensive material.

Bret Schafer, head of the information manipulation team at the Alliance for Securing Democracy, said TikTok was in the “nascent stages of its large-scale content moderation policies around a conflict like this one”.

“It has a more opaque system than Facebook or Twitter, so it is difficult to know to what extent it [enforces its policies]. But publicly its messaging has been aligned with Silicon Valley,” he added.

Social media companies are coming under increased scrutiny in the West for their perceived failures to remove inaccurate content and Russian state propaganda during the conflict. Earlier this week, Meta-owned Facebook, Google’s YouTube and TikTok agreed to block the state-backed Russian media outlets Russia Today and Sputnik in the EU following demands from the bloc.

Experts say TikTok’s algorithm enables viral misinformation to spread more readily than those of its peers. They also note that the platform’s editing tools let users easily reuse or remix audio and visual content from a variety of sources.

On Monday, Russia’s communications regulator complained about anti-Russian content on TikTok related to its “special military operation” in Ukraine and demanded that the company stop recommending military content to minors.

“The algorithm is currently aggressively promoting content related to Ukraine, whether it’s true or not,” said Imran Ahmed, chief executive of the Center for Countering Digital Hate, a campaign group that monitors online hate speech. He added that TikTok does not provide enough data for users to verify content themselves.

On Friday, TikTok announced it would apply labels to content from some state-controlled media and add warning prompts to some videos and live streams.

On Sunday, TikTok announced it would suspend live streaming and posting of new content in Russia in response to the country’s recently implemented “fake news” law.

In a blog post, the company said it would continue to “examine the security implications of this law”. TikTok’s in-app messaging service will not be affected, it added.

TikTok’s ad hoc response stands in stark contrast to the speed with which it removes political content that risks upsetting Beijing. The company removed videos of Hong Kong’s pro-democracy protests in the summer of 2019, according to The Information. TikTok has also blocked users from including the terms “labor camp” and “re-education centers” in captions, preventing discussion of China’s treatment of the Uyghur minority.

Michael Norris, senior research analyst at Shanghai-based consultancy AgencyChina, said the deluge of fake videos on TikTok showed there was “little sense of urgency” among ByteDance executives to tackle the situation.

TikTok said the safety of its community is a top priority and it is closely monitoring the situation with increased resources. “We take action against content or behavior that threatens the security of our platform, including removing content that contains harmful misinformation,” a spokesperson added.