Covid-19 pandemic.
The study, published in the Journal of Medical Internet Research, examined whether YouTube's recommendation system acted as a "rabbit hole," leading users searching for vaccine-related videos to anti-vaccine content.
For the study, researchers asked trained participants to intentionally find an anti-vaccine video in as few clicks as possible, starting from an initial informational Covid-19 video posted by the World Health Organization (WHO).
They compared the recommendations seen by these users with related videos obtained from the YouTube application programming interface (API) and with YouTube's Up-Next recommendations shown to clean browsers without any user-identifying cookies.
The team analysed more than 27,000 video recommendations made by YouTube, using machine learning methods to classify anti-vaccine content.
"We found no evidence that YouTube promotes anti-vaccine content to its users," said Margaret Yee Man Ng, an Illinois journalism professor with an appointment in the Institute of Communications Research and lead author of the study.
"The average share of anti-vaccine or vaccine-hesitancy videos remained below 6% at all steps in users' recommendation trajectories," Ng said.
Initially, the researchers simply wanted to understand YouTube's famously opaque techniques for content recommendations and whether these techniques funnel