NYT reports that YouTube’s algorithms may end up recommending people’s home videos of their kids in swimsuits, etc., to adults seeking sexual content. Sounds like YouTube is working to fix the problem, but it’s still alarming.
“YouTube’s automated recommendation system — which drives most of the platform’s billions of views by suggesting what users should watch next — had begun showing the video to users who watched other videos of prepubescent, partially clothed children, a team of researchers has found. YouTube had curated the videos from across its archives, at times plucking out the otherwise innocuous home movies of unwitting families, the researchers say. In many cases, its algorithm referred users to the videos after they watched sexually themed content.” https://www.nytimes.com/2019/06/03/world/americas/youtube-pedophiles.html
That’s incredibly f’ed up.
Disgusting. That’s why I do not post pictures or videos of my children online. Ever. |