
Just seen someone be killed on YouTube shorts.

75 Posts
34 Users
102 Reactions
801 Views
Posts: 5153
Free Member
Topic starter
 

I’ve just been shown a video on YouTube shorts that looks like someone being killed by a propeller that has come loose.

I’m aware that it may be fake, but it didn’t look like it to me, and anyway that’s not particularly the point.

Unsurprisingly it’s not something that I’ve shown an interest in on YouTube previously. Most of my searches are around mountain biking and bikepacking videos.

Frankly, I’ve seen enough people dying in real life when I was a junior doctor on the trauma team in a large city hospital.

I’m just staggered that it can serve up something so graphic (and potentially traumatising) without any warning whatsoever.

So much for “don’t be evil”.


 
Posted : 06/01/2024 8:30 am
jamj1974, gowerboy, bol and 5 people reacted
Posts: 2880
Full Member
 

That’s pretty rough. Did you manage to flag / report the video?

I can’t fathom folk who post / want to consume that sort of content at all.


 
Posted : 06/01/2024 8:40 am
Posts: 34376
Full Member
 

Bloody hell, that's a bit nightmarish. Might be worth flagging if you think it's real.


 
Posted : 06/01/2024 8:44 am
Posts: 18073
Free Member
 

Russian driving vids excepted


 
Posted : 06/01/2024 8:45 am
ayjaydoubleyou, footflaps and 1 person reacted
Posts: 5153
Free Member
Topic starter
 

I’ve reported it.

The only thing I can think of was that yesterday I watched a video of chimpanzees hunting baboons that was pretty gruesome in an interesting way. Not something that I would want a child or someone of a nervous disposition to see, but comes under “nature, red in tooth and claw” sort of category.

It’s just so irresponsible of the platforms not to do more to stop this.


 
Posted : 06/01/2024 8:50 am
Posts: 1130
Free Member
 

It’s just so irresponsible of the platforms not to do more to stop this.

You’re not wrong, but [i]how[/i] is the problem. There is something like 500 hours of content uploaded to YouTube [b]every minute[/b]. It’s certainly not possible to watch it all. Machine learning will get to a point where it can cover most things, but after that they can only rely on people reporting material. And even then, the volumes reported are nigh-on impossible to process manually.


 
Posted : 06/01/2024 9:00 am
ayjaydoubleyou, fasthaggis, dyna-ti and 5 people reacted
Posts: 4027
Free Member
 

Multiply that by most of Gen Z and you can start to see where we are getting contributory factors to the current youth mental health crisis.

The big platforms should be banned. End of.


 
Posted : 06/01/2024 9:00 am
Posts: 77347
Free Member
 

The big platforms should be banned. End of.

Because history has proven time and again just how effective prohibition is.

YouTube may not be ideally regulated, but regulated it is. It is by any measure 'safer' than anything you might trip over on the dark web.


 
Posted : 06/01/2024 9:10 am
dc1988, ayjaydoubleyou, acidchunks and 23 people reacted
 poly
Posts: 8699
Free Member
 

The big platforms should be banned. End of.

because multiple small platforms would be better? I think then you drive niche weird crap into apps ordinary people have never heard of, so they never see the content at all.

Showing the graphic death of someone, even in an accident or a fake video, will be against YouTube rules and typically see a video pulled, potentially the account frozen, maybe even the user banned.

Managing this is a problem governments seem to struggle with, and knee-jerk reactions like banning it are the simplistic view of politicians who don’t understand technology and feel the need to appeal to the public with “solutions”.  In reality, if the person who posted it is in the U.K. they will likely have committed an offence by posting a grossly offensive message via a communications network.


 
Posted : 06/01/2024 9:10 am
Posts: 477
Free Member
 

Maybe they could just not allow any content to be shown unless it has been robustly checked.

But they choose not to do that as it’s too hard / they want to make money.


 
Posted : 06/01/2024 9:25 am
kelvin reacted
Posts: 5354
Full Member
 

It's not that they "choose not to", it's that it's simply not possible!  See @bensales' staggering statistic above:

"500 hours of content uploaded to YouTube every minute". 

I CBA with the maths but how many tens of thousands of employees would you need to pay to review all that?


 
Posted : 06/01/2024 9:33 am
fasthaggis, footflaps and 1 person reacted
Posts: 22922
Full Member
 

I suppose the thing to remember is - YouTube doesn't create any of the content - someone decided to upload that content, it wasn't YouTube's idea. It's a platform for other people's content. So someone created and posted that video, and there's no process by which YouTube scrutinises that content before it's uploaded or before anyone views it. As with all social media - that process of scrutiny is basically outsourced to the rest of us and only starts once the content is already public - there's no mechanism for the platform to act until the content has been seen by the public and reported back to them. So their content moderation is basically one of shutting the gate after the horse has bolted.

There seemingly has been a shift in YouTube's recommendation system recently though - up until a few months ago the way content was offered up seemed to tend towards fuelling YouTube 'stars' - the stuff offered up was done so on a mix of factors that considered things you've seen and searched for before and the most viewed videos on the platform. And you can see why there is a presumption that you would want to see something that everyone else is watching. You could argue that it steered viewers towards a handful of very successful channels and made it difficult for any new venture to get started though. And the sort of point of YouTube is it's somewhere to see and show anything and everything, not just the voices and faces of a few.

That seems to have flipped - I curiously get videos offered to me now that have had dozens of views in the decade since they were uploaded, and I've seen items by YouTubers saying the metrics of their channel / content are now very odd - with their content clearly being promoted to wider demographics but getting very low engagement as a result. It's not really clear what the point of this shift is, but the result is that weird random crap finds itself in front of more people.

I’ve reported it.

What's quite grim is what happens next when you do that - there was an excellent Storyville documentary (not currently available on iPlayer unfortunately) about the outsourced teams that do the content moderation for the big social media companies - people whose job it is to view successive images and films of porn, abuse, violence and death when we click 'report' - a guy who's seen so many ISIS beheading videos that he can view a picture of a corpse and know how sharp the knife was.


 
Posted : 06/01/2024 9:34 am
footflaps reacted
Posts: 477
Free Member
 

It’s not that they “choose not to”, it’s that it’s simply not possible!

It’s not possible if they have the policy that anybody can upload / uploads are instant / whatever their current policy is. But if they changed their policy so that nothing could find its way online until it had been robustly checked, then such damaging content wouldn’t find its way onto YouTube.

But if they took this approach, they wouldn’t be able to continue making the same amount of money they currently do. So the consequences of damaging content getting online are basically seen as collateral damage, whilst they continue to make money.


 
Posted : 06/01/2024 9:40 am
onewheelgood, kelvin, mogrim and 3 people reacted
Posts: 22922
Full Member
 

It’s not possible if they have the policy that anybody can upload / uploads are instant / whatever their current policy is. But if they changed their policy so that nothing could find its way online until it had been robustly checked, then such damaging content wouldn’t find its way onto YouTube.

You typed that paragraph, clicked submit, and it appeared instantly on this moderated forum. Should the nature of this forum be that every sentence is viewed and vetted at every step of the conversation before being published?


 
Posted : 06/01/2024 9:43 am
blokeuptheroad, theotherjonv, footflaps and 3 people reacted
Posts: 477
Free Member
 

If the owners of STW recognised that there was a problem with damaging content being able to be instantly uploaded to their platform, then they would need to make a decision on whether they should continue with their platform in its current state. I don’t believe that is an issue with STW, but clearly it is with YouTube, Facebook etc.


 
Posted : 06/01/2024 9:49 am
kelvin reacted
Posts: 7561
Free Member
 

*deleted by moderator*


 
Posted : 06/01/2024 9:51 am
Posts: 6688
Full Member
 

At the dawn of video there were rumours of snuff movies.


 
Posted : 06/01/2024 9:56 am
Posts: 22922
Full Member
 

I don’t believe that is an issue with STW, but clearly it is with YouTube, Facebook etc

I had to raise an issue with Mark many years ago when one of STW's ad servers provided me with a lovely image of a guy who'd had the lower half of his face torn off in a motorcycle accident.


 
Posted : 06/01/2024 9:58 am
Posts: 5153
Free Member
Topic starter
 

You typed that paragraph, clicked submit, and it appeared instantly on this moderated forum. Should the nature of this forum be that every sentence is viewed and vetted at every step of the conversation before being published?

This is a false comparison, as nobody is being shown stuff on this forum by an algorithm, we're all choosing it.

There's an argument to be made that if your (YouTube, Facebook etc) business model relies on choosing what content to show people in order to make a profit, then you should be responsible for making sure that content isn't harmful.

IMV when social media started curating what to show people through algorithms, they stopped being just a platform and stepped over the line into being publishers, with all that that entails.

The US will never regulate it, after all they're mostly US companies, but that doesn't mean the rest of the world shouldn't.


 
Posted : 06/01/2024 10:04 am
kelvin reacted
Posts: 4027
Free Member
 

I know my response was simplistic.

I know it's not going to happen - just like with nuclear weapons, the genie is out of the bottle.

I guess this is just a bit of a raw subject for me as I have a neurodiverse child addicted to doom scrolling, and I'm watching it reduce them in every way, and it's tearing me apart. To think that people are making huge sums of money out of this makes me angry.


 
Posted : 06/01/2024 10:05 am
Posts: 290
Free Member
 

Don't use Instagram Reels if you don't like seeing death - every 7-10 videos on there for me is someone getting killed, and the content's not always flagged with the "sensitive content click see reel to watch anyway" marker.


 
Posted : 06/01/2024 10:06 am
Posts: 13594
Free Member
 

IMV when social media started curating what to show people through algorithms, they stopped being just a platform and stepped over the line into being publishers, with all that that entails.

They've always used algorithms to curate what they show you, as they can't possibly show you everything...


 
Posted : 06/01/2024 10:11 am
 5lab
Posts: 7921
Free Member
 

500 hours per minute means 30,000 people needed round the clock (500 hours is 30,000 minutes, so 30,000 people watching in real time), or around 140,000 full time employees, if you only watched each video once. Assuming a need for training, a need for some videos to be viewed multiple times etc, it's probably around 200,000 employees needed to moderate all the content. On top of that you'd need a management structure, maybe another 20,000 people, plus IT support, cleaners, office space, etc. It's just not feasible.
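
A rough version of that sum, for anyone who wants to check the working (the shift pattern and hours are my assumptions):

[code]
# Back-of-the-envelope moderation headcount, assuming every video is
# watched once at normal speed. All inputs are rough assumptions.
UPLOAD_HOURS_PER_MINUTE = 500   # widely quoted YouTube upload rate
WORK_HOURS_PER_WEEK = 40        # one full-time moderator
WEEK_HOURS = 7 * 24             # coverage needed round the clock

# 500 hours uploaded per minute = 500 * 60 person-minutes of viewing
# needed per minute, i.e. 30,000 people watching at any one time.
concurrent = UPLOAD_HOURS_PER_MINUTE * 60

# Covering 168 hours a week with 40-hour staff takes ~4.2 shifts.
full_time = concurrent * WEEK_HOURS / WORK_HOURS_PER_WEEK

print(f"{concurrent:,} concurrent, ~{full_time:,.0f} full-time staff")
# -> 30,000 concurrent, ~126,000 full-time (before training, breaks
#    and re-reviews, which is how you get to 140,000+)
[/code]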


 
Posted : 06/01/2024 10:35 am
Posts: 9093
Full Member
 

Must say Instagram is bad if you click search and random stuff comes up.


 
Posted : 06/01/2024 10:45 am
Posts: 1031
Free Member
 

Don’t use Instagram Reels if you don’t like seeing death, every 7-10 videos on there for me is someone getting killed

If this is true, and you can’t stop it despite the reports… Why the F. are you still on Instagram? Your support (by viewing non-death stuff) is ultimately allowing this. The mind boggles. Sack it off, you’ll be thankful for the time back if nothing else.


 
Posted : 06/01/2024 10:51 am
Posts: 22922
Full Member
 

Against 2.7 billion active monthly users for YouTube globally, 200,000 is a pretty small number. It's not that there aren't moderators - there are tens of thousands of them, albeit at arm's length so they don't really show up on Google's / Meta's or whoever's rota. What would change the nature of that work for the people who have to do it is that, for the most part, knowing that content is going to be checked would give posters pause for thought. Putting something horrible up so that it's there for as long as you can get away with is different to uploading something that you know won't get past moderation - moderation would require far less intervention.


 
Posted : 06/01/2024 10:54 am
Posts: 1226
Full Member
 

500 hours per minute means 30,000 people needed round the clock, or around 140,000 full time employees, if you only watched each video once. Assuming a need for training, a need for some videos to be viewed multiple times etc, it’s probably around 200,000 employees needed to moderate all the content. On top of that you’d need a management structure, maybe another 20,000 people, plus IT support, cleaners, office space, etc. It’s just not feasible.

It *is* feasible. For certain it's a lot of people, but there's nothing about it that would make it unfeasible. 250k employees isn't even all that big in the grand scheme of things.

The question really is whether it's worth it, not whether it's practically possible.


 
Posted : 06/01/2024 11:05 am
 poly
Posts: 8699
Free Member
 

This is a false comparison, as nobody is being shown stuff on this forum by an algorithm, we’re all choosing it.

it’s not a completely false comparison. I could start a thread right now with an innocuous and intriguing title and include in it any sort of evil I wanted until it was reported / removed.  The more people engage with it to write “that’s ridiculous, reported”, the more it stays at the top of the page until a mod gets to it.  It's not sophisticated but it’s the same idea.  On some other forums if you are reading a thread it recommends other threads that look similar - again an algorithm.

your point about “validation” of the algorithm is an interesting point though.  If I was asked to design a “safe” algorithm I’d have weightings for the users who post, the number of times the video has been watched, the reports received etc, and that would factor into how the videos were propagated to new users.  BUT if you got some totally random irrelevant content, it probably has already been seen by lots of people who haven’t reported it - the judgement is not about YouTube but about other users.
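
(A minimal sketch of the sort of weighting I mean - every signal, weight and number here is invented, purely illustrative:)

[code]
# Toy "safe propagation" score using the signals described above:
# uploader reputation, watch count and report count. All the weights
# and thresholds are made up for illustration.
def propagation_score(uploader_reputation: float,
                      views: int,
                      reports: int) -> float:
    """Higher score -> video gets shown to more new users."""
    report_rate = reports / max(views, 1)   # reports per view
    base = uploader_reputation * (1 + views) ** 0.5
    # A handful of reports among many views barely matters; a high
    # report rate collapses the score so the video stops spreading.
    return base * max(0.0, 1 - 50 * report_rate)

# Well watched and unreported: keeps spreading.
print(propagation_score(0.9, views=10_000, reports=2))    # ~89
# Same video, but 5% of viewers report it: score drops to zero.
print(propagation_score(0.9, views=10_000, reports=500))  # 0.0
[/code]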

you aren’t required to use YouTube (or to watch stuff it suggests - turn autoplay off)

you definitely aren’t required to watch shorts

I watch a lot of YouTube but very little Shorts.  So I just did a test - of the first twenty videos it shows: 13 were from channels I either subscribe to or watch fairly often; 4 were from closely related channels; 2 seemed to be adverts; 1 was a bit “random” but was just a bit of bizarre weirdness - it was certainly in no way offensive and I’d probably have watched it all the way through if I wasn’t scrolling to the next one to write this summary.  Obviously it’s not your fault if you are getting served content you don’t want - but like the people who tell me Facebook is full of people having political arguments - it’s not, Facebook has decided they want political arguments and Google have decided you want to see nasty shit (whereas my Facebook is full of family, club news etc and my YouTube is Taskmaster outtakes, Would I Lie To You clips, educational science content etc).


 
Posted : 06/01/2024 11:17 am
Posts: 1130
Free Member
 

Don’t use Instagram Reels if you don’t like seeing death - every 7-10 videos on there for me is someone getting killed, and the content’s not always flagged with the “sensitive content click see reel to watch anyway” marker.

Whereas I get a few dashcam videos, a whole bunch of weight lifting stuff, some motorbike stunt riders, and a shitload of cat videos thanks to my daughter.

I don’t think I’ve ever seen a sensitive content warning on there. So if it’s recommending you such stuff, it’s doing it because you’ve watched them in the past.


 
Posted : 06/01/2024 11:24 am
Posts: 20675
 

The question really is whether it’s worth it,

No.

250k employees on minimum wage is just under £5bn in salary alone, never mind all the other costs. As big as YouTube is (25bn in revenue p/a, not sure what the profit on that is), I’m not sure even they could afford that.
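
(Roughly how that figure falls out - the wage and hours are my assumptions, and it lands in the same ballpark as the £5bn above:)

[code]
# Rough salary bill for 250k moderators on the UK minimum wage.
HOURLY_WAGE_GBP = 10.42      # UK National Living Wage, early 2024
HOURS_PER_YEAR = 37.5 * 52   # full-time, holidays ignored

annual_salary = HOURLY_WAGE_GBP * HOURS_PER_YEAR   # ~ £20,300
total_bill = 250_000 * annual_salary

print(f"~£{total_bill / 1e9:.1f}bn per year in salary alone")  # ~£5.1bn
[/code]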


 
Posted : 06/01/2024 11:26 am
Posts: 7618
Free Member
 

Don’t use Instagram Reels if you don’t like seeing death, every 7-10 videos on there for me is someone getting killed

Aren't those things only 30s long? So you are seeing the death of a person every 3.5 to 5 minutes?

I got rid of it because it was showing vids of girls showing their underwear which is surely less damaging than people being killed.


 
Posted : 06/01/2024 11:29 am
Posts: 1130
Free Member
 

250k employees on minimum wage is just under £5bn in salary alone, never mind all the other costs. As big as YouTube is (25bn in revenue p/a, not sure what the profit on that is), I’m not sure even they could afford that.

Of course they could. My employer has somewhere between 25 and 30 billion euro revenue a year, about 2 billion profit, and has nearly 300k employees who get paid a hell of a lot more than minimum wage.


 
Posted : 06/01/2024 11:31 am
Posts: 5153
Free Member
Topic starter
 

I got rid of it because it was showing vids of girls showing their underwear

You got rid of it because of that? 😉


 
Posted : 06/01/2024 11:31 am
Posts: 20675
 

Of course they could. My employer has somewhere between 25 and 30 billion euro revenue a year, about 2 billion profit, and has nearly 300k employees who get paid a hell of a lot more than minimum wage.

So could you afford an additional 250k employees, which is what’s being asked here?


 
Posted : 06/01/2024 11:35 am
Posts: 7618
Free Member
 

You got rid of it because of that?

On my optician's advice.


 
Posted : 06/01/2024 11:37 am
oceanskipper, stingmered and 1 person reacted
Posts: 11605
Free Member
 

Nobody ever saw Rotten back in the day?

IMV when social media started curating what to show people through algorithms, they stopped being just a platform and stepped over the line into being publishers, with all that that entails.

Nope, unless it's P2P the same rules apply. None of those things have ever been allowed on the platform, but it happens. People post stuff on here all the time that breaks the rules and it gets removed by the same process.

Don’t use Instagram Reels if you don’t like seeing death - every 7-10 videos on there for me is someone getting killed, and the content’s not always flagged with the “sensitive content click see reel to watch anyway” marker.

Literally never seen this.


 
Posted : 06/01/2024 11:39 am
Posts: 22922
Full Member
 

The question really is whether it’s worth it,

No.

250k employees on minimum wage is just under £5bn in salary alone, never mind all the other costs. As big as YouTube is (25bn in revenue p/a, not sure what the profit on that is), I’m not sure even they could afford that.

Like a lot of the Internet, YouTube runs on a free-to-use basis funded by advertising (which frankly is often alarmingly badly moderated too) and an option to pay for an ad-free experience. The consequences of poor moderation can be felt by all users (free or paid), but they're not caused by all users, only by the ones that post content. It's the content creators / uploaders / re-uploaders that create the burden. 500 hours per minute of uploaded content, and most of that content will only be a few minutes long.

What if an upload cost a quid? Would people post content they know will be taken down soon if it cost them a bit of money? Would people post illegal content if it involved a traceable transaction rather than a burner email account? The burden of moderation could be both reduced and self-funding.


 
Posted : 06/01/2024 11:39 am
Posts: 5153
Free Member
Topic starter
 

@bensales

So if it’s recommending you such stuff, it’s doing it because you’ve watched them in the past.

Not necessarily. If the algorithm detects that you're a passive consumer of content, it'll take you quite quickly into extreme stuff. It's a known feature of these algorithms.

As mentioned in my original post, I've seen more than enough death IRL that I'm not in any way curious about it. I don't think it should be shown as entertainment because it's disrespectful. Most of my YouTube Shorts (which cross over with TikTok and Instagram Reels, I believe) have been mountain biking, patisserie making and barbecuing.

As mentioned, the only thing I can think of is the video that I saw yesterday of the chimpanzees and the baboons, which I let repeat a few times because I was checking what I'd just seen.


 
Posted : 06/01/2024 11:40 am
Posts: 24
Full Member
 

There are platforms which don't use algorithms; instead you choose what you see. It doesn't solve the need for moderation, but it does mean you only get stuff you search for or from providers you trust.

I don't know if governments will ever get to grips with it, but is there mileage in regulating what the algorithms are designed for?


 
Posted : 06/01/2024 11:41 am
Posts: 1130
Free Member
 

so could you afford an additional 250k employees, which is what’s being asked here?

Google (the evil empire behind YouTube) employ about 160k people.

Their revenue is currently around 280 billion dollars.

Their profit is something like 60 billion dollars.

They can afford a few more staff if they want to. Larry, Sergey and Sundar might have to forego new yachts.


 
Posted : 06/01/2024 11:47 am
Posts: 22922
Full Member
 

As mentioned, the only thing I can think of is the video that I saw yesterday of the Chimpanzees and the Baboon, which I let repeat a few times because I was checking what I’d just seen.

It could just as easily be a manipulation of tags and other identifying criteria by the uploader. If someone thought it was funny to shock the unsuspecting, they could upload videos of death and mutilation and tag them as patisserie and home baking.


 
Posted : 06/01/2024 11:48 am
Posts: 4027
Free Member
 

"I don’t think I’ve ever seen a sensitive content warning on there. So if it’s recommending you such stuff, it’s doing it because you’ve watching them in the past. "

That is absolutely not true.

The hook systems used are wide, varied and amazingly good at radicalizing the viewer, for want of a better phrase - i.e. drawing them away from the shallows and into deep water bit by bit, vid by vid. The more you scroll the more they learn. Just hovering for a split second longer on something will be enough of a trigger.

I'm not sure everybody here, especially those without children, realises just how much time children and young adults can spend on these insidious sites. Obviously the platforms just want you to watch content - they don't care what it is. So if the algorithms decide a particular account gets more screen time with cat videos, it's unlikely that viewer will end up with violent content. But the moment the screen time drops they will be trying something else, and almost always the end result is more extreme. The human brain is designed to constantly recalibrate a baseline - it's the only way we can cope - but this means we are very good at desensitizing ourselves in the short term... at the huge expense of trauma in the long term.


 
Posted : 06/01/2024 12:02 pm
 poly
Posts: 8699
Free Member
 

They can afford a few more staff if they want to. Larry, Sergey and Sundar might have to forego new yachts.

lots of people who use this forum would be adversely affected too - Alphabet will be a significant part of many people’s pension portfolios.  Easy to point the finger at “big corporate”, but like it or not it’s not quite as simple as telling them to behave better.


 
Posted : 06/01/2024 1:21 pm
Posts: 4397
Full Member
 

This is just one of the many unpleasant consequences of our expectation that stuff on the internet should be free. Every previous content delivery system was either paid for - books, magazines, movies, LPs - or funded by ads that were only very crudely targeted - commercial TV and radio, free newspapers. I've watched the evolution of the internet since the beginning and watched it become more and more corrupted by the greed of the big players. What started out as the democratisation of access to information and communication has become what we see today. It's tragic, but the genie isn't going to get back in the bottle and we will just have to figure out ways to live with it.


 
Posted : 06/01/2024 1:27 pm
 poly
Posts: 8699
Free Member
 

Not necessarily. If the algorithm detects that you’re a passive consumer of content, it’ll take you quite quickly into extreme stuff. It’s a known feature of these algorithms.

I’m not sure what a passive consumer of content is.  By the sounds of it, yesterday you were slightly less passive by repeatedly rewatching a video that many would find a bit gruesome.  My understanding is the algorithm now thinks* that you will probably like other videos that other people who also watched/liked/shared/rewatched that video also liked.  If you’ve fallen into a pool with the young lad types who share gruesome content, that would explain it.  Certainly with main YouTube you can tell it you don’t want to see a particular video or channel, and that has an impact on the algorithm in the other direction. (E.g. if you say “don’t show me this” on a patisserie channel, you’ll start seeing less fine cooking content.)

* the algorithm of course doesn’t think at all - we anthropomorphise it because it’s easier than accepting you’ve been manipulated to watch something by a set of mathematical calculations with absolutely no actual insight into who you are or indeed what the videos are about.
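
(For what it’s worth, the simplest version of that “people who watched X also watched Y” idea looks something like this - the histories and names are entirely invented:)

[code]
# Toy item-based collaborative filtering: rank videos by how often
# they co-occur with ones you've watched. Data is made up.
from collections import Counter

histories = [
    {"patisserie", "mtb", "bikepacking"},
    {"patisserie", "gruesome_nature", "mtb"},
    {"gruesome_nature", "dashcam_crash", "fight_clip"},
]

def recommend(watched):
    """Score unseen videos by co-occurrence with watched ones."""
    scores = Counter()
    for history in histories:
        overlap = len(watched & history)
        for video in history - watched:
            scores[video] += overlap
    return scores.most_common()

# One gruesome watch is enough to pull crash and fight clips into the
# ranking alongside the baking content: mtb scores 3; bikepacking,
# dashcam_crash and fight_clip each score 1.
print(recommend({"patisserie", "gruesome_nature"}))
[/code]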


 
Posted : 06/01/2024 1:34 pm
 mrmo
Posts: 10687
Free Member
 

Can I just point out: you don't want to see those videos, so now consider the impact on an employee expected to spend 7-8 hours a day watching this stuff.

There really isn't an easy solution. Just have to wonder at the mentality of the person uploading.

Though one feature I would be grateful for, across all platforms, is the ability to pick the subject. I don't want to see many of the video topics served, but there seems to be little in the way of control to simply block a topic.


 
Posted : 06/01/2024 5:13 pm
Posts: 1862
Full Member
 

Accidentally saw someone being decapitated in an RTA on Reddit a few months ago - not something I go looking for. I assume these things are done deliberately by the poster(s) - while I can't remember what sub-forum I was looking at, I don't use it for anything controversial so it deffo would not have been somewhere you'd expect to see that.

It seemed real, but it's hard to say as I closed it down as soon as my brain processed what I was looking at, and I'm not really minded to go back and scrutinise it for its authenticity.

I'm 43 and it was unpleasant to watch, so it's a worry if these things are appearing regularly for literally anyone of any age to see.


 
Posted : 06/01/2024 6:20 pm
Posts: 1048
Full Member
 

We've just ditched Spotify Family as it turns out the podcasts are basically just TikTok. Cue daughter not sleeping because she'd gone from watching funny videos to creepy ones. Hoping Tidal is better.


 
Posted : 06/01/2024 7:11 pm
Posts: 30093
Full Member
 

Facebook can’t check all uploads. They should check all recommendations before they serve them up.


 
Posted : 06/01/2024 7:16 pm
Posts: 11605
Free Member
 

it turns out the podcasts are basically just Tiktok

Sorry, what?

Cue daughter not sleeping because she’d gone from watching funny videos to creepy ones. Hoping Tidal is better.

What makes you think she won't continue to watch stuff on Spotify with ads? You're not really fixing anything here.


 
Posted : 06/01/2024 7:36 pm
Posts: 1786
Full Member
 

Any of the big platforms could, if forced, validate content BEFORE making it available to users. There is nothing which compels content to be available immediately.

It should be relatively easy* to stick every piece of content uploaded to YT etc in an auto-moderation queue and classify it before making it available to the public/users, and those users should be forced to "opt in" to receiving such content.

Who cares if there's some small delay between content being uploaded and actually being available to view 🤷

Anyway, I have regular purges of content on IG and report anything I don't want to see eg I've reported golf videos as "offensive" content as I simply have no interest in that subject matter - it seems to work. And I'm also careful with who I follow (usually just bike specific or trail builders)


 
Posted : 06/01/2024 7:42 pm
Posts: 1048
Full Member
 

What makes you think she won’t continue to watch stuff on Spotify with ads? You’re not really fixing anything here.

It's a known problem, apparently. We have parental controls in place but there's no way to turn off videos.

https://community.spotify.com/t5/Premium-Family/Android-Podcasts-Disable-video-for-planmembers/td-p/5419302


 
Posted : 06/01/2024 9:13 pm
Posts: 11605
Free Member
 

Right, but it still doesn't answer my question, what's stopping her accessing it without a family plan in place?

We went through something similar with our daughter and had to get the message across that she shouldn't be watching inappropriate content. She also has no screens at bed time.

Still none the wiser on the TikTok thing...


 
Posted : 06/01/2024 9:25 pm
Posts: 1048
Full Member
 

You're right, it's mostly about education and lessons to be learned. No screens in bedrooms; talk to us about anything that bothers you; all that.

That's fine. The internet is full of rubbish but this feels like the thin end of the wedge. I'm not entirely clear on the link between Spotify canvas and TikTok/YouTube but it's not one I was expecting. Binning Spotify is probably an overreaction but Tidal feels better anyway. They pay performers more at least.


 
Posted : 06/01/2024 9:54 pm
Posts: 33325
Full Member
 

Binning Spotify is probably an overreaction but Tidal feels better anyway. They pay performers more at least.

Spotify is rapidly becoming a general streaming service and moving away from specifically music, plus as you say they pay artists bugger-all. Tidal and Apple Music are just music, although I understand there is a podcast function available in Apple Music, but I’ve no idea if it has links to TikTok - I somehow doubt it. I subscribe to Apple One, mainly for the Music and Cloud storage, and I have zero interest in podcasts, but I'd imagine the Apple Family subscription has tighter controls over access to things like podcasts and video content, mainly because of Apple’s tight control over content from 3rd-parties; although 3rd-parties are doing their utmost to start shoving their content into places where people don’t necessarily want it.
If I wanted endless amounts of crap from Google, then I’d access it via Google’s various portals - until Google gets bored and shuts them down, like it’s about to do with driving aids in its navigation apps… 🤷🏼


 
Posted : 07/01/2024 3:00 am
kelvin reacted
Posts: 1130
Free Member
 

With Spotify you need to make sure you use the Spotify Kids app until they’re old enough to cope with anything. Just having a Family plan doesn’t give any sort of parental control, it just makes it a bit cheaper for multiple users.

Where Spotify does lack is in catering for the 12-16 age group. Not really old enough for unfettered access to everything on Spotify, but too old for the curated content on the Kids app.


 
Posted : 07/01/2024 6:53 am
gecko76, kelvin and 1 person reacted
 poly
Posts: 8699
Free Member
 

Any of the big platforms could, if forced, validate content BEFORE making it available to users. There is nothing which compels content to be available immediately.

Well part of their offering is live video, which by its nature requires real time!  Even excluding this, why should there be an arbitrary delay for me uploading some rather dull technical video? Too much content to practically review - and presumably 99.9% of content is fine anyway.  Algorithms (AI or simple rules) should be able to spot 95% of the dodgy content quickly, but users who want to post bad shit are clever - say your first N videos get checked, they will soon learn this and then post innocuous stuff for them.

It should be relatively easy* to stick ever piece of content uploaded to YT etc in an auto-moderation queue and classify it before making it available to the public/users and those user should be forced to “opt in” to receiving such content.

are you prepared to pay for YT?  Do you think YT competitors would emerge specifically to target the uncensored market? 


 
Posted : 07/01/2024 1:34 pm
 zomg
Posts: 850
Free Member
 

I switched away from Spotify when they started focussing on podcasts while still ripping off artists. The Joe Rogan deal made it very clear where things were going.


 
Posted : 07/01/2024 2:05 pm
kelvin reacted
 rone
Posts: 9325
Full Member
 

are you prepared to pay for YT?  Do you think YT competitors would emerge specifically to target the uncensored market? 

Well you could say that already exists, but good luck with commercial legs for it.

Any of the big platforms could, if forced, validate content BEFORE making it available to users. There is nothing which compels content to be available immediately.

I'm with you on this.

There shouldn't be an absolute compulsion to deliver every bit of content imaginable without a technical barrier of some sort.

I don't for one minute believe they can't scan for restricted material before it's compressed for delivery. Where is good AI when you need it?

You soon get sussed for commercial music!


 
Posted : 07/01/2024 2:19 pm
kelvin reacted
Posts: 5153
Free Member
Topic starter
 

Most content on YouTube is barely watched. Most YouTube creators are barely making any money at all. Of the ones who are making money, very few of them are making enough to cover their costs, fewer still are making a living.

It strikes me that a bit of quality control and barriers to entry wouldn’t be a bad thing.


 
Posted : 07/01/2024 2:30 pm
kelvin reacted
Posts: 11605
Free Member
 

I’m not entirely clear on the link between Spotify canvas and Tiktok/YouTube but it’s not one I was expecting.

I'm still waiting for you to explain what you think the link is. It's your statement I'm having a hard time understanding.


 
Posted : 07/01/2024 2:51 pm
Posts: 1048
Full Member
 

it turns out the podcasts are basically just Tiktok.

An exaggeration, but as bensales says it's not suitable for 12-16 year olds (mine are 11 and 13) as it's too easy to find content which is inappropriate. I could see her getting sucked in by the funny stuff, and while the creepy stuff probably isn't that bad, I don't know what else is on there. I know there are plenty of teenagers who do use TikTok, and I worry about them too tbh.

Oh, and I saw a video of a beheading on reddit maybe 15 years ago that I'm never going to forget 🙁


 
Posted : 07/01/2024 3:01 pm
kelvin reacted
Posts: 1786
Full Member
 

are you prepared to pay for YT?

As it happens, I already do

Do you think YT competitors would emerge specifically to target the uncensored market?

Where did I say the content is censored?  I said was that users must opt in to be able to view certain types of content. Frankly, who gives a f++k about the financial impact on Alphabet or Meta's bottom line. They have enough cash and resources to take the hit and fix the problems they've caused.


 
Posted : 07/01/2024 7:07 pm
kelvin reacted
 poly
Posts: 8699
Free Member
 

are you prepared to pay for YT?

As it happens, I already do

interesting - do you still find them serving you content you don’t want and find grossly offensive?

Do you think YT competitors would emerge specifically to target the uncensored market?

well your sentence was missing some letters/words so I had to guess what you meant!  But you seemed to be saying YT should moderate all content, but people could choose to view unmoderated content.  That seems to give YT a significant degree of editorial control, ie censorship.

Where did I say the content is censored?  I said was that users must opt in to be able to view certain types of content.

you already largely can, and if you don’t like the service they offer you aren’t compelled to keep going there, never mind paying them cash for the privilege.

Frankly, who gives a f++k about the financial impact on Alphabet or Meta's bottom line. They have enough cash and resources to take the hit and fix the problems they've caused.

as I said a few pages ago, probably all of us - unless you don’t have any pensions, etc.  


 
Posted : 07/01/2024 11:12 pm
Posts: 1786
Full Member
 

But you seemed to be saying YT should moderate all content, but people could choose to view unmoderated content. That seems to give YT a significant degree of editorial control, ie censorship.

I'm saying all content could be/should be "auto-classified" - I'd be surprised if the capability doesn't already exist (or something similar isn't already used to serve up existing content based on their fabled "algorithms") - and that ANY content has to pass through this auto-classification system before being available to end users. (This would, possibly, incentivize providers to provide near real-time auto-classification.)

And specific "opt-in" controls provided at login.

I'd be OK with certain exceptions for "live broadcast" scenarios for some "licensed/authorized" users, and/or reducing "live" to near real-time (like TV typically has a few seconds' delay; again, incentivize the providers to fix the problem).
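
(A minimal sketch of the kind of hold-then-publish queue I mean - the "classifier" is a stand-in for a real model, everything here is illustrative:)

[code]
# Toy pre-publication queue: nothing becomes publicly visible until
# it has been auto-classified. The classifier is a placeholder.
from dataclasses import dataclass
from queue import Queue

@dataclass
class Upload:
    video_id: str
    label: str = "unclassified"   # e.g. "ok", "graphic"

def classify(upload):
    """Stand-in for a trained model: every upload gets a label."""
    return "graphic" if "crash" in upload.video_id else "ok"

pending = Queue()
public, opt_in_only = [], []

pending.put(Upload("patisserie_101"))
pending.put(Upload("dashcam_crash"))

while not pending.empty():
    upload = pending.get()
    upload.label = classify(upload)
    # Only classified content leaves the queue; "graphic" content is
    # held back for users who have explicitly opted in at login.
    (opt_in_only if upload.label == "graphic" else public).append(upload)

print([u.video_id for u in public])       # ['patisserie_101']
print([u.video_id for u in opt_in_only])  # ['dashcam_crash']
[/code]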

as I said a few pages ago, probably all of us – unless you don’t have any pensions, etc.

Meh. Of course I have a pension (and I'll be drawing it pretty soon!) but it's a tracker so I doubt a significant percentage of the value is in Alphabet or Meta, and even if it is, I doubt such changes would make any material difference. Of course, if you choose your own stocks and choose to invest in this type of provider, you're dancing with the devil anyway and you could always dump or short them 🤷


 
Posted : 08/01/2024 1:33 am
Posts: 8613
Full Member
 

Of course they could. My employer has somewhere between 25 and 30 billion euro revenue a year, about 2 billion profit, and has nearly 300k employees who get paid a hell of a lot more than minimum wage.

And I bet, assuming your company is publicly traded, that number of employees will be about the bare minimum required in order to maintain/grow profit etc. Even if the CEO of YouTube cared enough that they wanted to employ another 250k content moderators, they are beholden to shareholders (whether it's Alphabet's or whoever's) and their board. Any CEO who suddenly causes a $5bn/year drop in profit (without it being due to a regulatory requirement) won't be around long.


 
Posted : 08/01/2024 7:31 am
Posts: 1786
Full Member
 

Any CEO who suddenly causes a $5b /year drop in profit (without it being due to a regulatory requirement) won’t be around long.

I'd vote for regulatory requirement but I can't see any US or UK government having the balls to try it, though the French or EU may be more likely 😄


 
Posted : 08/01/2024 7:49 am
Posts: 4027
Free Member
 

So my Facebook short video things have hitherto been cats on robot vacuum cleaners, kitesurfing and wingsuit flying, with the odd bike-related vid.

Today, when quickly scrolling to find out if anyone on the local news site knew why my daughter's bus time had changed, I saw a woman hit and killed by a speeding car.


 
Posted : 08/01/2024 8:08 am
Posts: 1786
Full Member
 

Jeez. WTF. Please report it. 


 
Posted : 08/01/2024 8:12 am
 5lab
Posts: 7921
Free Member
 

I’d vote for regulatory requirement but I can’t see any US or UK government having the balls to try it, though the French or EU may be more likely 😄

if any one country introduced that requirement, a service would simply pull out of it, as there's not enough profit in advertising in one market to pay for the $5bn in costs. I doubt there's even enough profit in all markets to cover that - revenue may be $28bn but I bet their margins are relatively small.


 
Posted : 08/01/2024 8:25 am
 poly
Posts: 8699
Free Member
 

I’m saying all content could be/should be “auto-classified” – I’d be surprised if the capability doesn’t already exist (or something similar isn’t already used to serve up existing content based on their fabled “algorithms”).

there is probably some degree of analysis already!  But contrary to the media impression AI isn't actually that smart - there will be false positives and false negatives.  You can tune the algorithm to be very safe - then you piss off legit content providers who are blocked for no good reason, and run up your operating costs dealing with their appeals - or you can tune it to let some stuff through because users can report offensive stuff, and you'll need processes and staff for those reports anyway, because some users will report stuff you believe is acceptable.  And replacing AI with humans is not the solution, as humans watching hours of footage will not be 100% robust at applying a somewhat subjective threshold either.
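
(the tradeoff in one toy example - the videos, labels and scores are all made up:)

[code]
# Toy version of the safe-vs-permissive tuning described above.
# Each upload has an (invented) model score: higher = looks worse.
uploads = [  # (video, true_nature, model_score)
    ("bake_off",   "fine", 0.10),
    ("horror_sfx", "fine", 0.65),  # fake gore that looks bad to a model
    ("crash_clip", "bad",  0.70),
    ("beheading",  "bad",  0.95),
]

def outcomes(threshold):
    """What a given blocking threshold gets wrong, in both directions."""
    blocked_fine = [v for v, t, s in uploads if s >= threshold and t == "fine"]
    missed_bad   = [v for v, t, s in uploads if s <  threshold and t == "bad"]
    return blocked_fine, missed_bad

# Tuned "very safe": legit creators get blocked and appeal.
print(outcomes(0.5))   # (['horror_sfx'], [])
# Tuned permissive: bad content gets through until users report it.
print(outcomes(0.8))   # ([], ['crash_clip'])
[/code]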

And specific “opt-in” controls provided at login.

You opt in by going to YouTube - it’s not compulsory.  Using autoplay definitely isn’t.

I’d be ok for certain exceptions for “live broadcast” scenarios for some “licensed/authorized” users and/or reducing “live” to near real-time (like TV typically has a few seconds delay; again, incentivize the providers to fix the problem)

AI comes at huge expense and requires masses of energy - running real-time AI on all live YouTube would be crazy!  “Licensed” users would be much easier, but someone is then making an arbitrary decision on who is good and who is bad and therefore likely to post good/bad content in the future.  If you don’t like their current approach, stop paying them money.

as I said a few pages ago, probably all of us – unless you don’t have any pensions, etc.
Meh. Of course I have a pension (and I’ll be drawing it pretty soon!) but it’s a tracker so I doubt a significant percentage of the value is in Alphabet or Meta, and even if it is, I doubt such changes would make any material difference. Of course, if you choose your own stocks and choose to invest in this type of provider, you’re dancing with the devil anyway and you could always dump or short them

I think you may be underestimating the impact across the whole tech sector, and then the ripple effect across other markets, if governments were to suddenly start introducing regulations that meant their profits were slashed.  It’s an uncomfortable truth that most of us ignore as we berate “corporate greed” that the biggest shareholders in those firms are often pension funds.


 
Posted : 08/01/2024 8:48 am
Posts: 5153
Free Member
Topic starter
 

Yeah, AI is hugely data and processing intensive with inaccurate results for long tail categorisation.


 
Posted : 08/01/2024 8:53 am
 5lab
Posts: 7921
Free Member
 

AI is already scanning your uploads - if you put copyright music in there it'll block the upload, and I bet if you try to just upload porn it'll get caught too.

Scanning for someone being shot vs someone's 6th form drama project of someone being shot is a lot harder.


 
Posted : 08/01/2024 9:04 am
Posts: 4027
Free Member
 

"Jeez. WTF. Please report it. "

I've no idea how you do that, and quite frankly I don't care. Why should it be on me? To report it would mean trying to find a vid I immediately and instinctively clicked away from, which would presumably mean I have to see it again, as well as nudging the algorithm once more.

As I only use facebook for local news which I can get elsewhere and selling the odd thing I've just deleted the app.


 
Posted : 08/01/2024 9:25 am
 poly
Posts: 8699
Free Member
 

I’ve no idea how you do that, and quite frankly I don’t care. Why should it be on me? To report it would mean trying to find a vid I immediately and instinctively clicked away from, which would presumably mean I have to see it again, as well as nudging the algorithm once more.

Well don't complain then - it's only there because other people have all been "not my problem" too.

In the Facebook App - on a "reel" there are three dots in the bottom right - it brings up a menu allowing you to:

- Find support or report reel

- Ask why you are seeing this

- Hide the reel and see less like it (not reporting it, just "retraining" the algorithm that you want less of this).

Alternatively you can click the three dots above the carousel of shorts to change settings there.

But if you saw a video you didn't even mean to play then you probably want to go:
Menu (bottom right) > Settings and Privacy > Settings > Media >  Autoplay > "Never autoplay videos".


 
Posted : 08/01/2024 10:14 am
 Olly
Posts: 5169
Free Member
 

I would really prefer it if you could turn off the "shorts" functionality.

I use FB for keeping in touch with family and friends, but end up getting dragged down a rabbit hole of 15 second dopamine hits too easily.

Genuine question ive wondered:

There are lots of "content creators" who post content that is basically held up by them flashing their bums and more to camera. One I can think of specifically rabbits on about her chavved-up VW Polo. "Follow me while I change the brake pads on my car", but then proceeds to give the viewer an eyeful in the process. Every. Single. Video. The car is not the "star" of the show, shall we say.

Question: where is the line between "content creator" and "sex worker"?  Not a criticism of a chosen career, per se. Does a stripper describe themselves as a "dancer"?


 
Posted : 08/01/2024 1:58 pm
