YouTube’s recommendation engine is one of its most important features; however, in 2018, the platform received massive backlash for recommending conspiracy videos to users. As a result, in January 2019, YouTube pledged to curb the number of conspiracy videos pushed to users.
While users remained skeptical of the platform’s recommendation engine, a group of researchers from the University of California, Berkeley, has now reported a 40% reduction in the number of conspiracy videos being suggested to users. The researchers tracked the engine’s behavior for one year.
According to their report, the team trained a computer classifier on hundreds of conspiracy and non-conspiracy videos, teaching it to distinguish between the two. The trained classifier was then used to analyze eight million recommended videos over the course of one year.
The classifier did not watch the videos themselves; instead, each video was analyzed based on its transcript, metadata, and comments. Based on the content of these sources, the video was classified as either conspiratorial or not.
Videos covering secret plots by those in power, ideas contradicting scientific consensus, or theories not backed by evidence were considered conspiracy videos. Notably, the analysis was carried out in a ‘logged-out’ state rather than through a real user account, which may have skewed the results. Nevertheless, YouTube appears to be working on curbing the problem at a broader level.
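The study's actual model is not described in this article, but the idea of classifying a video as conspiratorial from its transcript, metadata, and comments can be sketched with a toy text classifier. The example below is a minimal, hypothetical Naive Bayes sketch with made-up training snippets; it only illustrates the general approach of combining the three text sources and scoring them against learned word frequencies.

```python
from collections import Counter
import math

def tokenize(text):
    return text.lower().split()

class NaiveBayesClassifier:
    """Toy Naive Bayes text classifier (illustrative sketch only; the
    researchers' real model and training data are not public here)."""

    def __init__(self):
        self.word_counts = {"conspiracy": Counter(), "other": Counter()}
        self.doc_counts = {"conspiracy": 0, "other": 0}

    def train(self, text, label):
        self.doc_counts[label] += 1
        self.word_counts[label].update(tokenize(text))

    def score(self, text, label):
        # Log prior from how often each label was seen in training
        total_docs = sum(self.doc_counts.values())
        log_p = math.log(self.doc_counts[label] / total_docs)
        vocab = set(self.word_counts["conspiracy"]) | set(self.word_counts["other"])
        total_words = sum(self.word_counts[label].values())
        for word in tokenize(text):
            # Laplace smoothing so unseen words don't zero out the score
            count = self.word_counts[label][word] + 1
            log_p += math.log(count / (total_words + len(vocab)))
        return log_p

    def classify(self, transcript, metadata, comments):
        # Combine the three text sources, as the study reportedly did
        text = " ".join([transcript, metadata, comments])
        return max(["conspiracy", "other"], key=lambda lbl: self.score(text, lbl))

clf = NaiveBayesClassifier()
clf.train("secret plot hidden truth they lie cover up", "conspiracy")
clf.train("official review unboxing tutorial recipe", "other")
print(clf.classify("secret government plot", "exposed truth", "they lie"))
```

A production system would of course use far richer features and far more training data, but the structure is the same: pool the text sources per video, score against each class, and pick the higher-scoring label.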
Haqeeqat TV is still there
Friend, I heard the Tecno Camon 15 is coming to Pakistan. I am excited.
Yes, I am waiting for it too.
It has a pop-up camera and 6GB of RAM as well.
So, those computers were running some kind of ML model to classify videos based on metadata and comments as part of the decision-making process.
Cool thought process.