Mail Online

Children’s online safety still isn’t a top priority for Instagram, insider claims

After coroner’s damning verdict in Molly inquest...

By Jim Norton, Technology Editor

An Instagram moderator has accused the social media giant of ‘cutting costs’ when it comes to prioritising safety in the wake of the Molly Russell inquest.

Speaking on condition of anonymity, she said the multi-billion-pound company could do ‘much better’ to protect children on the platform.

The moderator, who joined a year after 14-year-old Molly died, said most of those posting self-harm content appeared to be teenage girls. She admitted she was ‘not qualified’ to deal with mental health issues and told how colleagues had been ‘concerned’ by the amount of harmful content allowed to remain on the site.

The moderator added: ‘I think they still could do much, much better. I feel like they are cutting costs sometimes.’

On Friday, Molly became the first child in the world whose death social media was formally found to have contributed to. Following the conclusion, her family released a series of photos of young Molly.

The inquest heard how the 14-year-old engaged with thousands of self-harm posts in the lead-up to her death.

Giving evidence, Instagram owner Meta’s head of health and wellbeing Liz Lagone claimed the platform was ‘safe’ for children and defended its policy of leaving such content up if it was ‘admissive’, meaning it amounted to an admission of self-harm rather than encouragement of it.

Shown the content Molly had interacted with, the senior executive claimed it did not ‘promote’ self-harm and instead benefited users by letting them ‘express themselves’. The social media giant changed its policy in 2019 to ban all such material after expert advice said it could ‘unintentionally’ encourage users to carry out such acts.

The Instagram moderator, who very recently moved to another job within Meta, said she would have to get through around 200 accounts a day, giving her around one to two minutes on each.

She added: ‘The scale of what we were doing could sometimes feel overwhelming.’

She said she had no experience or training in dealing with mental health issues, adding: ‘If there is any indication the user was posting something like a farewell, or a picture of something that may be a method for suicide, and it was within 24 hours, we would be escalating it to police. But it was up to us to make that call.’

The Daily Mail was put in touch with the moderator by Foxglove, which is working with content moderators to fight for better conditions and pay.

Meta said it invested ‘billions each year’ in its moderation teams and technology to keep its platform safe, adding that its content moderators go through an ‘in-depth, multi-week training program’.

For help or support, visit samaritans.org or call the Samaritans for free on 116 123.
