From hanged babies to gun rampages: the horror vids social media giants failed to stop

SOCIAL media companies have been BLASTED by campaigners for "failing yet again" to stop kids being scarred by gruesome clips after the latest horror video spread online.

Harrowing footage of US man Ronnie McNutt's suicide was shared widely on TikTok and other social media after it was livestreamed on Facebook last week.


The horrific clip was uploaded to video-sharing platforms, often under deceptive names to trick children into watching it.

Some users were reportedly able to continue sharing the clip more than a week after it was first streamed.

The incident is just the latest in recent years in which footage depicting violence has been widely shared on social media.

And campaigners have slammed Facebook and others for STILL failing to stamp out horror content, despite having "every chance".

They also called for social media companies to “face fines and criminal sanctions” to force them to “wake up and take action”.

In February, Jakrapanth Thomma, a 31-year-old soldier from the Royal Thai Army, used Facebook to post updates about an attack on the city of Nakhon Ratchasima in which 29 were killed and 58 were wounded.

In March of last year, 51 people were killed in a terror attack carried out by a single gunman on a mosque in Christchurch, New Zealand.

A full 17 minutes of the attack, including the minutes before it began, were streamed using Facebook Live.

The attack prompted the Christchurch Call summit in Paris, at which world leaders called on technology companies to intensify their efforts to combat violent extremism.

How to report the video

TikTok provides clear instructions on how to report upsetting videos that may breach their rules and regulations.

If you find a distressing clip, open the video, press the share button, tap Report and follow the on-screen instructions.

Both accounts and comments can be reported using a similar method.

There are also reports that the video is still cropping up on Facebook and Instagram.

On Facebook, you can report a photo or video that you believe violates its Community Standards by clicking or tapping on the post to expand it.

Hover over the photo or video and hit Options in the bottom right corner.

Click Find Support or Report Photo for photos, or Find Support or Report Video for videos.

Select the option that best describes the issue and follow the on-screen instructions.

On Instagram, tap the three dots above the photo or video post, tap Report, and then select why you've chosen to report the post.

In 2017, 20-year-old Thai father Wuttisan Wongtalay used Facebook to broadcast live footage showing him hanging his 11-month-old baby and then himself.

The footage stayed on his page for almost 24 hours before being removed.

The incident came just days after 74-year-old Robert Godwin Sr was killed in a random shooting carried out by behavioural health worker Steve Stephens, 37, who also posted footage of the killing to Facebook.

Following the shooting, the site said the footage had only been reported more than an hour and 45 minutes after it was posted, but admitted it "needed to do better".

The company, which also owns Instagram, faced similar criticism after the death of 14-year-old Molly Russell in 2017.

Following her death, Molly's family found distressing material about depression and suicide that had been viewed on her account.

Her father, Ian Russell, later said he believed that Instagram was partly responsible for his daughter's death.

In 2018, YouTube prankster Monalisa Perez was jailed after accidentally shooting dead her boyfriend Pedro Ruiz during a stunt they had hoped would go viral on the platform.

Speaking at the time, a spokesperson for YouTube said the company was "horrified to learn of the tragedy" and that its thoughts were with the family.

It also said that it removes any content reported by users that violates its guidelines.

Social media companies do have measures in place to help users flag harmful content, but campaign groups say that more could still be done.


Speaking to Sun Online, Ian Russell, who now chairs the Molly Rose Foundation, set up in his daughter's memory, said the tech giants had "once again lost control of the content they carry".

"[The Ronnie Mcnu*tt video] moved easilyand speedily across social media platforms," he said.

"Despite being counted among the most powerful corporations in the world, [social media companies] seem powerless to control the online harms infecting their platforms."

He said it was time for tech companies to "get a grip by using more of their profits to produce effective safeguards and procedures so their users are protected from such harmful content".

"It is also time for the government to introduce the Online Harms regulation it has long promised will be world leading", he said.

"A strong Regulator is essential to make suretech companies are no longer allowedto profit fromallowing such contentto harm theirusers."

Asked about the potential impact of graphic viral clips on young people, Andy Burrows, NSPCC Head of Child Safety Online Policy, told Sun Online that viewing such content could be "hugely damaging".

He also acknowledged that tech giants have been “stuck playing an awful game of cat and mouse for years as they try to take down these horrific clips while people maliciously continue to spread them".

But he added: “Since the Christchurch terror attacks, and even before then, they have had every chance to develop technology to stamp out live and recorded content before it’s shared further.

“But yet again we’re seeing another example that proves they have failed to do this which is why we know they won’t wake up and take action unless they face fines and criminal sanctions.

“That’s why we’re calling on the Prime Minister to stand up to Silicon Valley and create a regulator with teeth to ensure tech firms protect children online and are punished if they fail."

You're not alone

EVERY 90 minutes in the UK a life is lost to suicide.

It doesn't discriminate, touching the lives of people in every corner of society - from the homeless and unemployed to builders and doctors, reality stars and footballers.

It's the biggest killer of people under the age of 35, more deadly than cancer and car crashes.

And men are three times more likely to take their own life than women.

Yet it's rarely spoken of, a taboo that threatens to continue its deadly rampage unless we all stop and take notice, now.

That is why The Sun launched the You're Not Alone campaign.

The aim is that by sharing practical advice, raising awareness and breaking down the barriers people face when talking about their mental health, we can all do our bit to help save lives.

Let's all vow to ask for help when we need it, and listen out for others...You're Not Alone.

If you, or anyone you know, needs help dealing with mental health problems, organisations such as the Samaritans provide support (contact details below).

Asked what more social media companies could do to stop distressing content being spread online, a spokesperson for the Samaritans told Sun Online: “Whilst we have seen positive progress in how technology companies detect and respond to harmful content posted on their platforms, there is still much more that needs to be done to protect vulnerable people.

“It will take all of us working together including government, industry, charities, and also users who generate and post content online, to make it a safe place.

"We need improved technology to enable [harmful] content to be identified and removed faster and comprehensively, but we also need users to stop sharing and engaging with it.”

In response to a request for comment, a spokesperson for Facebook said: “Our hearts go out to the victims, families and communities affected by these tragedies.

"We care deeply about the safety of our community and we invest billions of dollars each year to prevent harmful content from appearing on our platforms.

"We know there’s more to do and will continue to work with mental health experts like Samaritans, local community groups and law enforcement to get people help when it’s needed.”


A spokesperson for TikTok said: "On Sunday night, clips of a suicide that had originally been livestreamed on Facebook circulated on other platforms, including TikTok.

"Our systems, together with our moderation teams, have been detecting and blocking these clips for violating our policies against content that displays, praises, glorifies, or promotes suicide.

"We are banning accounts that repeatedly try to upload clips, and we appreciate our community members who've reported content and warned others against watching, engaging, or sharing such videos on any platform out of respect for the person and their family.

"If anyone in our community is struggling with thoughts of suicide or concerned about someone who is, we encourage them to seek support, and we provide access to hotlines directly from our app andin our Safety Center."

The Samaritans can be contacted 24/7 via freephone 116 123. For more information, visit samaritans.org.


