Ah, the classic “think of the children!” argument. It is no one’s responsibility other than the parent to ensure their child isn’t watching inappropriate content (which will be different for every family and individual).
This article suggests that machine learning and collaborative filtering are incapable of producing healthy recommendations. I beg to differ: the New York Times may not like the results, but these systems work for the vast majority of users on any service with too much content to curate manually.
I don't think that's the point. It is false advertising for YouTube to create YouTube Kids for kids and then fail to ensure the content on it is actually appropriate for kids.
> This article suggests that machine learning and collaborative filtering are incapable of producing healthy recommendations. I beg to differ,
The article cites actual instances and recurring problems showing that "machine learning and collaborative filtering are incapable of producing healthy recommendations": even when YouTube tried to produce child-friendly content, they failed. You can't just say "it's fine" after the article shows it not being fine.
Setting aside the personal responsibility angle for the moment (which I agree with you on!), don't you think that negative externalities should generally be managed?
YouTube is a paperclip maximizer (where paperclips correspond to eyeball-hours spent watching YouTube) and at some point optimizing paperclips becomes orthogonal to human existence, and then anticorrelated with it.
I think it's perfectly fair to say that maybe the negatives outweigh the positives at present.
(This argument doesn't apply solely to YouTube, of course)
I generally agree with you, but I think YouTube being safe for kids became their problem when they launched a version specifically for kids and marketed it as safe.
What law is there to prevent a kid from going on the internet and going to “inappropriate” sites? Watching video on cable? Finding their Dad’s Playboy magazine back in the day?
On cable there are ways to lock out channels, setting ratings on the TV and all that. If dad doesn't hide his Playboy well enough, it's obviously on him to fix it.
On the internet it is much more difficult, of course, and we can't realistically expect some shady offshore site to implement age checks, let alone recommendation algorithms. But Google is a public, respected company from a first-world country that claims to be promoting social good (which, of course, is marketing BS, and even if it weren't I would not want their idea of social good, but still). You'd think they would at least invest some effort into not showing inappropriate content to kids. But no, they throw up their hands and go on ideological witch hunts instead.
I’ve got an idea - don’t let your kids get on YouTube and only allow them to get on curated sites. You can easily lock down a mobile device to only allow certain apps/curated websites.
I don't let mine anywhere near a TV or computer. Of course that might be a bit more difficult once they get old enough to actually reach the keyboard...
But then I try to not let my mom on YouTube either. Or myself, for that matter.
lol, do you even children. They will always find a way. You can restrict apps and services all you want. How about their friends at school? Are you going to restrict their phones as well? The only thing that works is actually talking to the kids about things they've seen and experienced. Not saying that is easy, of course.
No we don't - not in the US. Short of literal pornography, which could fall afoul of corruption-of-a-minor laws, the state isn't involved. That is just ratings cartels and pressure groups.
If nobody gives enough of a fuck to affect business, you can show the complete SAW series to 3-year-olds and all the offended can do is yelp indignantly.
Nope. This only applies to pornography, if I recall correctly. There are no laws against showing R-rated movies to kids; it's just the theaters that refuse to admit them. In 2011 the courts struck down a California law prohibiting the sale of M-rated games to minors, too.
Spotify's recommendation system deals mostly with artists that have recording contracts and professional production. Their problem shouldn't be compared to YouTube's, which has to handle a mix of professional, semi-pro, and amateur-created content. There's also more of a "freshness" aspect to a lot of YT videos that isn't quite the same as what Spotify deals with (pop songs are usually good for a few months, but many vlogs go stale after a week). Not only that, but many channels have a mix of content, some that goes stale quickly and some that is still relevant after many months. How does a recommendation engine figure that out?
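To make the freshness problem concrete, here's a minimal hypothetical sketch (in Python, with entirely made-up categories and half-lives; real systems are far more complex and have to infer these values rather than look them up) of scoring with per-category exponential decay:

    import time

    # Hypothetical sketch: model freshness as exponential decay on an
    # item's relevance score, with a per-category half-life. The hard
    # part a real engine faces is inferring the right half-life per
    # video, especially on channels that mix content types.
    HALF_LIFE_DAYS = {
        "news_vlog": 3,     # stale within days
        "pop_song": 90,     # good for a few months
        "tutorial": 365,    # relevant for a year or more
    }

    def freshness_weight(category: str, age_days: float) -> float:
        """Weight halves every half-life; 30 days is an assumed default."""
        half_life = HALF_LIFE_DAYS.get(category, 30)
        return 0.5 ** (age_days / half_life)

    def score(base_relevance: float, category: str, upload_ts: float) -> float:
        age_days = (time.time() - upload_ts) / 86400
        return base_relevance * freshness_weight(category, age_days)

A channel that uploads both vlogs and tutorials breaks the per-category assumption immediately, which is exactly the problem.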
It's better to compare Spotify's recommendations to Netflix's recommendations, which also deals with mostly professional content. Those two systems have comparable performance in my opinion.
Why the content exists is also important. People create video specifically for Youtube; very few people create music just to host it on Spotify. This means the recommendation algorithm and all its quirks have a much bigger impact on the content of Youtube than on that of Spotify. Also, having that many people actively trying to game the recommendation algorithm can pervert it. That simply isn't a problem for sites like Spotify or Netflix.
>YouTube is a _disastrously_ unhealthy recommender system,
Can you explain in more detail?
I use Youtube as a crowdsourced "MOOC"[0] and the algorithms usually recommend excellent follow-up videos for most topics.
(On the other hand, their attempt at matching "relevant" advertising to the video is often terrible, e.g. Sephora makeup videos for women shown to the male-dominated audience of audiophile gear. Leaving aside the weird ads, the algorithm works very well for educational vids that interest me.)
Yes. Elsagate is an example: the creepy, computer-generated, violent and disturbing videos that eventually follow children's content. Or the fact that just about every gaming-related video comes with a recommendation for a far-right rant against feminism or a Ben Shapiro screaming segment. There's also the Amazon problem, where everything related to the thing you watched once out of curiosity follows you everywhere around the site.
Yes, I was aware of Elsagate.[0] I don't play games so didn't realize every gaming video ends up with unwanted far-right and Ben Shapiro videos.
I guess I should have clarified my question. I thought gp's "unhealthy" meant Youtube's algorithm was bad for somebody like me who views mainstream, non-controversial videos. (An analogy might be gp (rspeer) warning me that asbestos and lead paint are actually carcinogenic but the public doesn't know it.)
It's not 100%, but I'd consider "video games" => "Ben Shapiro" to be a pretty awful recommendation system, regardless of the reasoning behind it. As far as I know, the group "video gamers" doesn't have a political lean in either direction.
I've definitely seen this with comics. I watched a few videos criticizing Avengers: Infinity War, and now I see mostly Ben Shapiro recs. It makes no sense. I have never sought out (and never plan to seek out) political content on YouTube.
I watch a number of gaming videos and have never had a far-right video recommended. Don't know who Ben Shapiro is.
It could be the type of games involved, since I usually watch strategy, 4x, city-building, and military sims. I usually get history-channel documentaries or "here's how urban planning works in the real world" videos recommended, which suits me fine. Somebody whose gaming preferences involve killing Nazis in a WW2-era FPS might be more likely to get videos that have neo-Nazis suggesting we kill people.
But that child comment didn't link Nazis to normal "video games". I assumed he just meant some folks (e.g. "1.8%" of web surfers) with a predilection for far-right videos would get more Nazi recommendations. Well yes, I would have expected the algorithm to feed them more of what they seemed to like.
I have never seen Nazi or far-right videos make up 1.8% of my recommendations.
Isn't that an inevitable side effect of collaborative filtering? If companies could do content-based recommendation, wouldn't they? Until purely content-based recommendations are possible, wisdom-of-the-crowds collaborative filtering will lump together videos that are about different things but watched by similar viewers, as in the sketch below.
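A minimal, hypothetical illustration of that lumping (made-up users and video names; real systems use far richer signals): item-item collaborative filtering only counts shared viewers, never topics.

    from collections import defaultdict
    from itertools import combinations

    # Toy item-item collaborative filtering over invented watch data.
    # The algorithm never sees what a video is about; it only counts
    # how often two videos share viewers.
    watch_history = {
        "alice": ["gaming_review", "political_rant"],
        "bob":   ["gaming_review", "political_rant"],
        "carol": ["gaming_review", "speedrun"],
    }

    co_watch = defaultdict(int)
    for videos in watch_history.values():
        for a, b in combinations(sorted(set(videos)), 2):
            co_watch[(a, b)] += 1

    # ("gaming_review", "political_rant") is the strongest pair purely
    # because overlapping audiences watched both, so the rant gets
    # recommended next to gaming content regardless of topic.
    print(sorted(co_watch.items(), key=lambda kv: -kv[1]))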
I don't know what the comment you are replying to meant; I interpreted it to mean the algo takes you down a rabbit hole to darker content. For me, though, I miss the days when it actually recommended relevant videos, similar to the one I was watching.
My entire sidebar is now just a random assortment of irrelevant interests. For instance, I once wanted to learn a denser piano chord. I learned it ages ago, but I still get something like 20 videos explaining how to add extensions to a 7th chord, even when I'm watching a video about the F-35.
I completely disagree; my children have a wonderful time following the recommended videos that youtube provides. I'm interested to hear your reasoning on why it is "disastrous".
I'm pretty sure all content on Spotify gets manually curated first, so abusive tagging doesn't happen, and some of the worst content simply doesn't get uploaded at all. Spotify also doesn't try to be a news site, so they can afford a couple weeks' lag between a song being uploaded and it showing up in people's recommendation feeds.
I disagree in some sense. I've personally found the recommendation system pretty good for YouTube's main page. The thing that bugs me is the recommended bar to the right (or bottom right) of the video, which can be really annoying and infested with clickbait etc.
>It is no one’s responsibility other than the parent
Yes, but you _must_ understand that most (no, ALL) of the millennial generation grew up with public content over the airwaves that was curated and had to pass certain guidelines. So many parents think the YouTube Kids app is the same thing. It's not!
If YouTube wants to be the next television, they're going to have to assume the responsibilities and expectations surrounding the appliances they intend to replace. Pulling a Pontius Pilate and tossing the issue to another algorithm to fail at figuring out is not going to fix the problem.
Thankfully, there's much more out there than YouTube when it comes to children's entertainment, actually curated by human beings with eyeballs and brains, and not algorithms. The problem is that parents don't know these apps even exist, because YouTube has that much of a foothold as "place to see things that shut my kid up, so I can see straight."