More than a quarter of the most-viewed coronavirus videos on YouTube contain “misleading or inaccurate information”, a study suggests.
In total, the misleading videos had been viewed more than 62 million times.
Among the false claims was the idea that pharmaceutical companies already have a coronavirus vaccine but are refusing to sell it.
YouTube said it was committed to reducing the spread of harmful misinformation.
The researchers suggested “good quality, accurate information” had been uploaded to YouTube by government bodies and health experts.
But they said the videos were often difficult to understand and lacked the popular appeal of content from YouTube stars and vloggers.
The study, published online by BMJ Global Health, looked at the most widely viewed coronavirus-related videos in English, as of 21 March.
After excluding duplicate videos, videos longer than an hour and videos that did not include relevant audio or visual material, the researchers were left with 69 videos to analyse.
The videos were scored on whether they presented exclusively factual information about viral spread, coronavirus symptoms, prevention and potential treatments.
Videos from government agencies scored significantly better than other sources, but were less widely viewed.
Of the 19 videos found to include misinformation:
- about a third came from entertainment news sources
- national news outlets accounted for about a quarter
- internet news sources accounted for about a quarter
- 13% had been uploaded by independent video-makers
The report recommends that governments and health authorities collaborate with entertainment news sources and social media influencers to make appealing, factual content that is more widely viewed.
YouTube said in a statement: “We’re committed to providing timely and helpful information at this critical time, including raising authoritative content, reducing the spread of harmful misinformation and showing information panels, using NHS and World Health Organization (WHO) data, to help combat misinformation.
“We have clear policies that prohibit videos promoting medically unsubstantiated methods to prevent the coronavirus in place of seeking medical treatment, and we quickly remove videos violating these policies when flagged to us. Now any content that disputes the existence or transmission of Covid-19, as described by the WHO and the NHS, is in violation of YouTube policies. For borderline content that could misinform users in harmful ways, we reduce recommendations.
“We’ll continue to evaluate the impact of these videos on communities around the world.”
Analysis
by Marianna Spring, specialist disinformation and social media reporter
In recent weeks, highly polished videos promoting conspiracy theories have been shared on YouTube in growing numbers, and they have proved very popular.
So these findings – although concerning – are not surprising.
The accurate information shared by trusted public health bodies on YouTube tends to be more complex.
It can lack the popular appeal of the conspiracy videos, which give misleading explanations to worried people who are looking for quick answers, or someone to blame.
That includes videos such as Plandemic, which was widely shared online last week.
High-quality production values and interviews with supposed experts can make these videos very convincing. Often facts will be presented out of context and used to draw false conclusions.
And tackling this kind of content is a game of cat-and-mouse for social media sites.
Once videos gain traction, even if they are removed, they continue to be uploaded repeatedly by other users.
It is not just alternative outlets uploading misinformation either. The study suggests that some mainstream media outlets, whether chasing views or clicks, are also guilty of spreading misleading information.