
Health misinformation creates 'whack-a-mole' situation for tech platforms

World Health Organization analysis found misinformation in up to 51 per cent of social media posts
This March 18, 2010, file photo shows the YouTube website in Los Angeles. THE CANADIAN PRESS/AP/Richard Vogel

When Dr. Garth Graham thinks about health misinformation on social media platforms, he envisions a garden.

No matter how bountiful or verdant that garden is, even the head of YouTube's global health division admits it's often in need of tending.

"How do you weed and pull out the bad information?" he asked.

"But also … how do you plant the seeds and make sure people have access to good information as well as high quality information?"

For social media companies, these have become perennial questions that have only grown in importance as the number of platforms multiplied and people began spending increasing amounts of time online.

Now, it's not uncommon to spot misinformation with almost every scroll.

A 2022 paper published in the Bulletin of the World Health Organization reviewed 31 studies examining how prevalent misinformation is. The analysis found misinformation in up to 51 per cent of social media posts associated with vaccines, up to 28.8 per cent of content associated with COVID-19, and up to 60 per cent of posts related to pandemics.

An estimated 20 to 30 per cent of YouTube videos about emerging infectious diseases were also found to contain inaccurate or misleading information.

The consequences can be harmful, if not deadly.

Research the Council of Canadian Academies released in 2023 said COVID-19 misinformation alone contributed to more than 2,800 Canadian deaths and at least $300 million in hospital and ICU visits.

Platforms take the risks seriously, Graham said in an interview. "We are always concerned about anything that may produce harm."

That concern often leads platforms to remove anything violating their content policies.

YouTube, for example, has banned content denying the existence of some medical conditions or contradicting health authority guidance on prevention and treatment.

Examples embedded in its medical misinformation policy show the company removes posts promoting turpentine, gasoline and kerosene as a treatment for certain conditions because these substances cause death. Ivermectin, used to treat parasitic worms in animals and humans, and hydroxychloroquine, a malaria drug, are also barred from being promoted as COVID-19 cures.

When it comes to vaccines, YouTube bans videos alleging immunizations cause cancer or paralysis.

Facebook and Instagram parent company Meta Platforms Inc. refused to comment for this story and TikTok did not respond to a request for comment, but in broad strokes, these companies have similar policies to YouTube.

Yet Timothy Caulfield, a University of Alberta professor focused on health law and policy, still spots medical misinformation on platforms. He recently asked his students to search for stem cell content and several posts spreading unproven therapies came up easily.

Still, he sympathizes with some of the challenges tech companies face because he sees conquering health misinformation as a game of "whack-a-mole."

He says there's a nimbleness to spreaders of misinformation, who are often motivated to keep finding ways to circumvent removal policies because their posts can boost profits and brands or spread an ideology.

"They can work around the moderation strategies, but that just shows how we're not going to fix this with one tool," Caulfield said.

"This is going to be an ongoing battle."

In its misinformation policy posted on its website, Meta acknowledges the difficulties, saying "what is true one minute may not be true the next minute."

"People also have different levels of information about the world around them and may believe something is true when it is not," the policy says.

In an attempt to keep up with everything, Meta relies on independent experts to assess how true content is and whether it is likely to directly contribute to imminent harm before it is removed. Third-party fact-checking organizations are also contracted to review and rate the accuracy of its most viral content.

At YouTube, human reviewers, including an "intelligence desk" that monitors posts and news to detect trends that might need to be mitigated, work alongside machine learning programs, which the company says are well suited to detecting patterns in misinformation.

Some responsibility also falls to credible health-care practitioners and institutions, whose content the platforms highlight and recommend to make it easier for users to find trustworthy information.

YouTube, for example, has partnered with organizations including the University Health Network and the Centre for Addiction and Mental Health in Toronto.

CAMH runs a YouTube channel where medical professionals explain everything from schizophrenia to eating disorders. Production funding came from YouTube, but the institution's resources were used for script writing and clinical review, CAMH spokeswoman Hayley Clark said in an email.

Graham sees it as a good example of the health-care profession "meeting people where they are," which he said is "how we battle misinformation."

"(Credible information) has to be in the palm of people's hands so that they can have dinner conversations, so when they're sitting down in their couch that they're empowered," he said.

But when it comes to other organizations and doctors, "we can't assume that all of them have the capacity to do this," said Heidi Tworek, an associate professor at the University of British Columbia, whose research focuses on the effects of new media technologies.

These organizations want to get credible information out, but in a health-care industry strapped for both cash and time, there's always another patient to help.

"Some health-care institutions would say, 'OK, we've got X amount of money, we've got to choose what we spend it on. Maybe we want to spend it on something other than communications,'" Tworek said.

In some instances, doctors are also "doing it off the side of their desk … because they think it is valuable," but that subjects them to new risks like online attacks and sometimes even death threats.

"Some people don't want to enter those spaces at all because they see what happens to others," she said.

To better combat medical misinformation, she would like platforms to act more responsibly because she often notices their algorithms push problematic content to the top of social media timelines.

However, she and Caulfield agree health misinformation needs an all-hands-on-deck approach.

"The platforms bear a lot of responsibility. They're becoming like utilities and we know the impact that they have on public discourse, on polarization," Caulfield said.

"But we also need to teach critical thinking skills."

That could begin at school, where students could learn how to identify credible sources and detect when something could be incorrect, lessons he's heard Finland begins teaching in kindergarten.

No matter when or how that education takes place, he said the bottom line is "we need to give citizens the tools to discern what's misinformation."
