Surely the responsibility here is broader than treating it after the fact? Perhaps it's an over-the-top comparison, but most places outlaw dangerous drugs: you can treat the after-effects, but by that point a lot of the damage has already been done. Making tech companies answerable for developing algorithms that serve up hours of obvious brainrot content at a time would go a long way.
(And like with many of these things, holding senior executives personally liable helps ensure that the fines or whatever are not just waved away as a cost of doing business.)
Yes it is an over the top comparison. I am a recovered / former addict (alcohol). I would never compare the two. I was spending too much time on Twitter a few years ago. I deleted my account. The problem was solved. It took me an entire year to accept that I had a serious problem and then another 9 months to finally stop drinking.
Neither the brewery nor the bar ever made me drink. I chose to drink, and I was also the one who chose to stop. BTW, drink is as dangerous as, or more dangerous than, many illegal drugs IMO.
> Making tech companies answerable for having developed algorithms that serve up hours of obvious brainrot content at a time would go a long way.
You get recommended what you already watch. Most of my YouTube feed is things like old guys repairing old cars, guys writing a JSON parser in Haskell, stuff about how exploits work, and some music. That is because that is what I already watched on the platform.
Right, and recommendations for old car repair videos, of which you watch a few per week, are reasonable.
The argument I'm making is that it's not beyond the pale for YouTube to detect "hey, it's been over an hour of AI bullshit / political rage bait / thirst traps / whatever" and have the algorithm intentionally steer you in a different direction for the next little bit.
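To make the idea concrete, here's a minimal sketch of the kind of session-level guardrail I mean. This is entirely hypothetical: the category names, the one-hour threshold, and the demotion factor are all made-up illustrative values, and this is obviously not how YouTube's actual ranker works.

```python
from collections import defaultdict

# Hypothetical values, purely for illustration.
BINGE_THRESHOLD_SECS = 60 * 60  # "over an hour" of one kind of content
DEMOTION_FACTOR = 0.1           # steer hard away for a while
FLAGGED_CATEGORIES = {"rage_bait", "ai_slop", "thirst_trap"}

class SessionWatchdog:
    """Tracks time spent per content category within a session and
    down-weights flagged categories once a binge threshold is crossed."""

    def __init__(self) -> None:
        self.seconds_watched: defaultdict[str, float] = defaultdict(float)

    def record_watch(self, category: str, seconds: float) -> None:
        self.seconds_watched[category] += seconds

    def score_multiplier(self, category: str) -> float:
        # Each candidate video's ranking score would be multiplied by
        # this; flagged categories get demoted after the threshold.
        if (category in FLAGGED_CATEGORIES
                and self.seconds_watched[category] > BINGE_THRESHOLD_SECS):
            return DEMOTION_FACTOR
        return 1.0

watchdog = SessionWatchdog()
watchdog.record_watch("rage_bait", 4000)        # ~67 minutes of rage bait
print(watchdog.score_multiplier("rage_bait"))   # 0.1 -> demoted
print(watchdog.score_multiplier("car_repair"))  # 1.0 -> unaffected
```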
They actually do show notices that say things like "Fancy something different? Click here." They already have a mechanism in place that does something similar to what you describe.
What YouTube recommends to you is more of what you already watch. Removing the stuff you describe is as easy as clicking "Not interested" or "Do not recommend channel".
Also, the YouTube algorithm rewards watch time these days, so clickbait isn't rewarded on the platform as much. I actually watch a comedy show where they ridicule many of the clickbaiters, and they are all complaining about their ad revenue and reach decreasing.
Also, a lot of the political rage-bait is kinda going away; people are growing out of it. YouTube has "metas" where a particular type of content is super popular for a while and then fades.
I don't agree with this take. Some people are going to be more susceptible than others, just as with alcohol or other drugs. An individual choosing to stop doesn't mean much for society in aggregate.
I don't go down the political rage-bait video pipeline, yet next to any unrelated YouTube video I see all sorts of click/rage-bait littered in the sidebar, just waiting to send me down a rabbit hole.
As an example, I opened a math channel/video in a private-mode tab. Under it (on mobile), alongside the expected math-adjacent recommendations, I see things about socialist housing plans, 2025 gold-rush debasement trades, the 7-stage empire collapse pattern ("the US is at stage 5"), and so on. So about 10% are unrelated political rage-bait.