Instagram is taking further steps to tackle bullying on its platform.
The social network is employing machine learning to help proactively detect bullying in photos, which will then be reviewed by a human moderator.
An Instagram spokesperson said its bullying classifier detects “attacks on a person’s appearance or character, as well as threats to a person’s well-being or health” in a photo.
If a human moderator deems the photo is in breach of the platform’s community guidelines, the photo will be removed, and the poster will be notified of its deletion and told why.
“This change will help us identify and remove significantly more bullying — and it’s a crucial next step since many people who experience or observe bullying don’t report it,” Instagram’s new head, Adam Mosseri, explained in a blog post.
“It will also help us protect our youngest community members since teens experience higher rates of bullying online than others. This new technology has begun to roll out and will continue to in the coming weeks.”
Instagram will also roll out a bullying comment filter on Live videos, a feature it has already introduced for the Feed, Explore and Profile sections.
Instagram is also introducing a new filter, called the “kindness camera,” to help spread positivity on the platform.
When you select the effect in selfie mode, hearts will fill the screen, and you’ll be encouraged to tag a friend. It’s kinda cheesy, but hey, it’s Instagram, right? Your tagged friend will be notified as usual, and they can then share the post to their own story, or “spread kindness” with a selfie of their own. If you switch over to the rear camera, you can add a filter featuring the hashtag #kindcomments, along with examples of kind comments in multiple languages.
A 2017 survey claimed Instagram had overtaken Facebook as the worst social media platform for bullying.
In a highly publicised incident earlier this year, Star Wars actress Kelly Marie Tran reportedly left the platform due to harassment, and she surely isn’t alone in her decision to leave Instagram.