ThisisBigBrother.com - UK TV Forums

ThisisBigBrother.com - UK TV Forums (https://www.thisisbigbrother.com/forums/index.php)
-   Serious Debates & News (https://www.thisisbigbrother.com/forums/forumdisplay.php?f=61)
-   -   Molly Russell - Dad wins case against Social Media giants (https://www.thisisbigbrother.com/forums/showthread.php?t=382637)

Cherie 30-09-2022 01:34 PM

Molly Russell - Dad wins case against Social Media giants
 
Coroner Andrew Walker delivers his conclusion at the inquest into the death of Molly Russell. The schoolgirl ended her life in November 2017 after viewing online content linked to anxiety, depression and self-harm.

The father of Molly Russell says a "monster" has been created whereby social media products are not safe for users.

“They didn't really consider anything to do with safety,” Ian Russell said at a news conference.

“Sadly their products are misused by people and their products aren’t safe.

"That's the monster that has been created but it's a monster we must do something about to make it safe for our children in the future.

"It’s the corporate culture that needs to change, so that they put safety first instead of profits."

Mr Russell said his message to Meta boss Mark Zuckerberg would be a "simple" one - "to listen".

"Listen to the people that use his platform, listen to the conclusions the coroner gave at this inquest, and then do something about it," he added.

Asked whether the family were considering legal action against social media companies following the coroner's conclusion, Mr Russell said: “In terms of what’s been occupying our minds for the last five years it’s been the inquest.

"It's been at the forefront of our minds, and any other action, legal action or other action, has been very much at the back of our minds."

Molly Russell family faced 'considerable hurdles' in seeking her social media history
A lawyer for Molly Russell's family said they had faced "considerable hurdles" and only last month did Instagram owner Meta provide details of hundreds of posts viewed by the teenager before her death.

Speaking at a news conference, Merry Varney from law firm Leigh Day said "the battles bereaved families face when seeking answers from social media companies are immense".

"It was only in August this year that Meta provided over 1,200 Instagram posts Molly engaged with, less than a month before the inquest started - this included some of the most distressing videos and posts that Molly engaged with," Ms Varney said.

"Seeking to find out how your loved one died should never be a battle, and Molly’s family have welcomed the transparency shown by Pinterest, as well as their acceptance and apology for the deeply harmful material Molly was able to access.

"We all owe this family an incredible debt of gratitude and together with them I hope the conclusion that children’s lives remain at risk is acted upon urgently, so that tech giants can no longer invite them onto wholly unsafe and harmful platforms."

Pinterest 'committed to improvements' after Molly Russell death
Social media platform Pinterest says it has "listened very carefully" to the family of Molly Russell and the coroner's remarks at the inquest into her death.

The inquest was told Pinterest sent emails to the 14-year-old such as "10 depression pins you might like" and "new ideas for you in depression".

After the conclusion of the inquest, a Pinterest spokesperson said: "Our thoughts are with the Russell family.

"We’ve listened very carefully to everything that the Coroner and the family have said during the inquest.

"Pinterest is committed to making ongoing improvements to help ensure that the platform is safe for everyone and the coroner’s report will be considered with care.

"Over the past few years, we've continued to strengthen our policies around self-harm content, we’ve provided routes to compassionate support for those in need and we've invested heavily in building new technologies that automatically identify and take action on self-harm content.

"Molly’s story has reinforced our commitment to creating a safe and positive space for our Pinners."




https://news.sky.com/story/molly-rus...death-12708085


Imagine algorithms glamorising self-harm and promoting the narrative that suicide is the only way out... it's sick

arista 30-09-2022 01:40 PM

Yes, Nov 2017,
a really sad case of young Molly, just 14,
going to the wrong adult sites online
and taking her own life.

Cherie 30-09-2022 02:04 PM

She contacted celebs as well to help ..poor girl

Niamh. 30-09-2022 02:09 PM

Poor kid, it's a totally new world teens have to navigate these days with the internet and social media

Gusto Brunt 30-09-2022 03:10 PM

Sorry if I missed it but had this girl been to the doctor with regard to depression?

Did anyone suspect she was depressed?

Yes, a very sad case. Just saw a picture of her. Tragic. :(

arista 30-09-2022 03:19 PM

Quote:

Originally Posted by Gusto Brunt (Post 11215164)
Sorry if I missed it but had this girl been to the doctor with regard to depression?

Did anyone suspect she was depressed?

Yes, a very sad case. Just saw a picture of her. Tragic. :(


No one knew
she was going online,
looking at so many death-related sites and Instagram posts.

They all found out
after she killed herself.

Gusto Brunt 30-09-2022 03:22 PM

Quote:

Originally Posted by arista (Post 11215166)
No one knew
she was going online,
looking at so many death-related sites and Instagram posts.

They all found out
after she killed herself.

I'm shocked there were no signs. :( Very scary if you're a parent and you just don't know what secret turmoil your child is in. :(

arista 30-09-2022 04:53 PM

Even the owners of the sites said in court
they would not allow their own kids to view what Molly viewed.



Only those in the court viewed the content;
it cannot
go on TV news screens.

rusticgal 30-09-2022 06:50 PM

Social media has a lot to answer for…it’s frightening. Such a tragedy for this young girl and her family and many many others.

LeatherTrumpet 30-09-2022 07:03 PM

I kind of think that if it wasnt social media it would have been something else with this case

arista 30-09-2022 10:12 PM

https://liveblog.digitalimages.sky/l...80ae9bdce6.png

arista 30-09-2022 10:39 PM

SkyText
[The Prince of Wales increased the
pressure on tech companies and the
government last night to keep children
safer online after a landmark ruling
in the Molly Russell inquest, The Times reports.]


https://www.bbc.co.uk/news/uk-63097739
[Prince William makes online
safety plea after Molly Russell verdict

By Jasmine Andersson
BBC News]


https://liveblog.digitalimages.sky/l...96c2eee3e1.png

arista 30-11-2023 12:51 AM

BBC News Text:
[Metro leads on comments by the father of
Molly Russell, a 14-year-old who took her
own life in 2017 after viewing suicide-related
content on social media.
Six years on from her death,
Ian Russell says tech firms have not made
enough progress at rolling out measures
to protect children online.]


https://ichef.bbci.co.uk/news/976/cp...ro-nc.png.webp

Oliver_W 30-11-2023 08:16 AM

Social media is toxic. I'm not necessarily comfortable with speech being controlled or limited, but there need to be better filters on what can reach minors.
I'd even say don't let minors join :laugh: that'd be a bit of an undertaking, and would probably require ID verification for all account creation, and I'm certainly not happy with giving that sort of info to the tech giants. Not for the sake of social media, anyway.

Soldier Boy 30-11-2023 11:25 AM

Social media algorithmic content is not good for anyone. Full stop. It is inherently geared towards radicalisation. Young boy lingers on a video with misogynistic content? It'll serve more, and more, and more. Religious indoctrination? Same thing. Right wing ideology. Gender extremism. Autism/ADHD/"neurospicy" romanticising. Suicidal ideation. All paths that social media targeted content algorithms will take people down.

If you know what you're doing you can curate your tailored content to show you what you actually want to see. My TikTok for example tends to show gaming/entertainment industry stuff and funny videos. But to keep it there you have to make sure you engage with the stuff you want to see (like and comment) and swipe off of things you don't want to see, or even report/block the accounts, or it'll take you down a rabbit hole. Linger on a conspiracy theory video for a few minutes because it's so dumb it's funny? Guess what - you're getting served QAnon trash for the next week.

And if you're not aware... if you don't know that's how these sites work? Very quickly you'll start to believe that "this stuff is everywhere, this is how everyone thinks!" -- and suddenly the world can become very dark.
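The feedback loop described above can be sketched as a toy model. To be clear, this is purely illustrative and assumes a simple engagement-weighted topic score; no real platform's ranking system is this simple, and the class, topic names, and boost weights here are all made up:

```python
import random
from collections import defaultdict

class ToyFeed:
    """Toy engagement-driven feed: any engagement boosts a topic's score,
    regardless of WHY the user engaged (even hate-watching counts)."""

    def __init__(self, topics):
        self.topics = topics
        self.scores = defaultdict(lambda: 1.0)  # start roughly uniform

    def next_video(self):
        # Sample a topic in proportion to its current score.
        weights = [self.scores[t] for t in self.topics]
        return random.choices(self.topics, weights=weights)[0]

    def record(self, topic, seconds_watched, liked=False, shared=False):
        # All engagement is treated as positive signal.
        boost = seconds_watched / 10 + 2.0 * liked + 3.0 * shared
        self.scores[topic] += boost

feed = ToyFeed(["cats", "gaming", "conspiracy"])
# Linger on one conspiracy video "because it's so dumb it's funny"...
feed.record("conspiracy", seconds_watched=120, shared=True)
# ...and that topic now dominates what gets sampled next.
print(max(feed.topics, key=lambda t: feed.scores[t]))  # conspiracy
```

The point of the sketch is that `record` has no notion of intent: a share made to mock a video boosts its topic exactly as much as a sincere one.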

arista 06-05-2024 11:41 PM

Sky News Text:
[The Daily Mirror says the father
of a girl who took her own life due
to harmful web content has warned
delays to an online crackdown will cost lives]

https://liveblog.digitalimages.sky/l...01aadccfe.jpeg

Mystic Mock 07-05-2024 01:54 AM

This is devastating for her family, and obviously I don't want to see anyone killing themselves, especially a child.

However, she must've already been in a dark place to have even been searching for this type of content in the first place.

It's an awful scenario, and I do feel for the family.

Soldier Boy 07-05-2024 09:52 AM

Quote:

Originally Posted by Mystic Mock

However, she must've already been in a dark place to have even been searching for this type of content in the first place.

That's not how modern social media algorithms work - you don't have to search for anything at all. It starts by serving everyone an entirely random selection of videos and then there are very sophisticated ways of guiding and tailoring content. Linger a few seconds longer than usual on a video? The algorithm will remember and "test" a few similar videos. Linger on them a few seconds as well? It now knows that this is content that captures your attention, even if it's just for a few seconds. So you get more and more of it. The more you see, the more you get, and the more extreme it becomes.

I had to completely scrap and restart my original TikTok because I got trapped in MRA/Redpill/Incel TikTok. I did not WANT to see them, but it started with watching a video of some awful arsehole because he was so awful - also made the errors of sharing a video with my wife (to say "look at this idiot") and also commenting to (of course) troll some basement-dwellers... but the damn app doesn't know (or care) why you've watched, shared and commented on a video - it just knows that you did, and that it wants you to do it more. Literally within a week my whole TikTok was video after video of bloody Andrew Tate, Jordan Peterson and GB News. The only way to get back to funny viral videos and cat memes was to delete the whole account and start over.

The same will apply to depression/anxiety/nihilistic content. Watch one video, you'll get ten more. It's a very quick and very slippery slope, especially for a young teenager who doesn't understand that they are being served up a tailored genre of videos and it's not that the whole world is changing.

I've made sure that my daughter is acutely aware of algorithms and how online content is driven. Kids in general are thankfully becoming more savvy about it in general - you'll always hear about them "tailoring their For You page" etc - which is, basically, actively manipulating the algorithm so that it only shows the stuff you actually like seeing.

bitontheslide 07-05-2024 10:45 AM

Social media companies can change the way they serve content on sensitive subjects; it's one of the easiest things in the world to do. Rather than serving up more of the same, they could start serving up where to get help and assistance.
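The intercept suggested above really is simple in principle. A minimal sketch, with made-up topic labels and a made-up `serve` function standing in for whatever a real platform's recommendation pipeline looks like:

```python
# Topics where "more of the same" should be replaced with support resources.
SENSITIVE_TOPICS = {"self-harm", "suicide", "eating-disorders"}

def serve(topic, recommended_post):
    """Return a support card for sensitive topics, else the normal post."""
    if topic in SENSITIVE_TOPICS:
        # Break the reinforcement loop: show help instead of more content.
        return {
            "type": "support",
            "message": "It looks like you're viewing difficult content. "
                       "Help is available.",
            "resources": ["Samaritans: 116 123 (UK)"],
        }
    return {"type": "post", "content": recommended_post}

print(serve("self-harm", "...")["type"])    # support
print(serve("gaming", "clip.mp4")["type"])  # post
```

In practice the hard part is classifying the content reliably, not the redirect itself, which is why this sketch takes the topic label as given.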

arista 07-05-2024 11:12 PM

https://liveblog.digitalimages.sky/l...fd92f08533.png

arista 07-05-2024 11:13 PM

Ofcom want children to prove
their age online.

https://liveblog.digitalimages.sky/l...2e770e5426.png

Mystic Mock 07-05-2024 11:59 PM

Quote:

Originally Posted by Soldier Boy (Post 11447892)
That's not how modern social media algorithms work - you don't have to search for anything at all. It starts by serving everyone an entirely random selection of videos and then there are very sophisticated ways of guiding and tailoring content. Linger a few seconds longer than usual on a video? The algorithm will remember and "test" a few similar videos. Linger on them a few seconds as well? It now knows that this is content that captures your attention, even if it's just for a few seconds. So you get more and more of it. The more you see, the more you get, and the more extreme it becomes.

I had to completely scrap and restart my original TikTok because I got trapped in MRA/Redpill/Incel TikTok. I did not WANT to see them, but it started with watching a video of some awful arsehole because he was so awful - also made the errors of sharing a video with my wife (to say "look at this idiot") and also commenting to (of course) troll some basement-dwellers... but the damn app doesn't know (or care) why you've watched, shared and commented on a video - it just knows that you did, and that it wants you to do it more. Literally within a week my whole TikTok was video after video of bloody Andrew Tate, Jordan Peterson and GB News. The only way to get back to funny viral videos and cat memes was to delete the whole account and start over.

The same will apply to depression/anxiety/nihilistic content. Watch one video, you'll get ten more. It's a very quick and very slippery slope, especially for a young teenager who doesn't understand that they are being served up a tailored genre of videos and it's not that the whole world is changing.

I've made sure that my daughter is acutely aware of algorithms and how online content is driven. Kids in general are thankfully becoming more savvy about it in general - you'll always hear about them "tailoring their For You page" etc - which is, basically, actively manipulating the algorithm so that it only shows the stuff you actually like seeing.

I mean, it admittedly can be tricky to get social media platforms to cater to your taste sometimes, and like you've said, if it's incel-type content then your feed does get drowned in it. I'm speaking from my YouTube experience.

But tbf I do admittedly love to seek out a horror show sometimes, and nothing is more scary (but also funny) to watch on YouTube than incel content imo.


All times are GMT. The time now is 03:48 PM.

Powered by vBulletin® Version 3.8.11
Copyright ©2000 - 2024, vBulletin Solutions Inc.
User Alert System provided by Advanced User Tagging (Pro) - vBulletin Mods & Addons Copyright © 2024 DragonByte Technologies Ltd.