YouTube will remove content that promotes “harmful or ineffective” cancer treatments or discourages viewers from seeking professional medical care, the company said Tuesday. This change is part of YouTube’s updated and streamlined Medical Misinformation Policy.
The new policy removes content that promotes unproven treatments in place of approved or definitive care, or treatments that health authorities have identified as particularly harmful. For example, videos claiming that “garlic cures cancer” or “take vitamin C instead of radiation therapy” will be removed.
“When faced with a diagnosis, cancer patients and their loved ones often turn to online spaces to research symptoms, learn about treatments, and find community,” YouTube said in a blog post. “Our mission is to make sure that when they turn to YouTube, they can easily find high-quality content from credible health sources. In applying our updated approach, cancer treatment misinformation fits the framework – the public health risk is high as cancer is one of the leading causes of death worldwide, there is stable consensus about safe cancer treatments from local and global health authorities, and it’s a topic that’s prone to misinformation.”
Moving forward, YouTube will apply its medical misinformation policies when content carries a high public health risk, contradicts publicly available guidance from health authorities around the world, and covers a topic generally prone to misinformation. YouTube says it needs to preserve the important balance of removing egregiously harmful content while ensuring space for debate and discussion.
YouTube says its policies on cancer treatment misinformation will go into effect today and enforcement will ramp up in the coming weeks. The company plans to promote cancer-related content from the Mayo Clinic and other authoritative sources.

The platform’s latest policy comes years after YouTube stepped up its efforts to tackle misinformation about health and vaccines during the COVID-19 pandemic. In 2020, YouTube removed videos containing false information about COVID-19 from its platform. A year later, the company expanded its medical misinformation policy to ban vaccine misinformation, including content that spreads false claims about vaccine safety, efficacy, and ingredients. By that time, the company had already removed more than a million videos for misinformation about COVID-19.

YouTube announced last year that it would crack down on videos containing misinformation about abortion and remove videos it deemed unsafe. The company also launched information panels below abortion-related videos and above relevant search results that provide viewers with information from local and global health authorities.