We’ve seen just about everything from Instagram in recent years, from Stories to the non-chronological feed to research suggesting the app makes users feel worse about themselves than any other social media platform.
The Facebook-owned company is trying to fix that, though, according to Eva Chen, the head of Instagram’s fashion partnerships.
“[The team’s] entire focus is focusing on the wellbeing of the community,” Chen said at Bloomberg last week. “Making the community a safer place, a place where people feel good, is a huge priority for Instagram, I would say one of the top priorities.”
Why does Instagram need a wellbeing team?
Focusing on the community’s wellbeing would be a major step for Instagram, considering the app is notorious for lowering people’s self-esteem. Instagram can create unnecessary stress for users who feel pressured to be more like their friends or people they follow on the app. This can be especially harmful when users with major followings like Khloe Kardashian freely admit to Photoshopping their images, which can give young women unrealistic goals for their appearance and lead to unhealthy behaviors like disordered eating.
The app also struggles with policing toxic interactions between its users. In perhaps one of the largest demonstrations of this, Taylor Swift’s Instagram account was flooded with comments featuring the snake emoji after her feud with Kim Kardashian and Kanye West escalated back in 2016. Swift herself had to block the emoji from appearing in future comments, but that didn’t stop her fans from flooding Kardashian’s account with the rat emoji, or Karlie Kloss’s page later on.
In each instance, the celebrities themselves had to take action, disabling harmful comments or adjusting certain settings so they wouldn’t be overwhelmed with negativity on the app. The same thing can, and does, happen on a smaller scale with everyday users who might not think to adjust (or even be aware of) these settings when they’re being bullied. The app offers tools to block comments containing racist and sexist insults or keywords the user chooses, but many users grew up with the app and know ways around those filters.
So what exactly is a “wellbeing team”?
While it’s a step in the right direction for the app to have a “wellbeing team,” Instagram has yet to give a solid definition of what that even means. The future leader of the wellbeing team, Ameet Ranadive, said in a tweet that the team intends to “promote positive behavior on Instagram” and “protect the community from bad behavior and content.”
Ranadive doesn’t explain what might be deemed “bad” content, but other platforms have seen this method backfire. Take, for instance, YouTube: its “Restricted Mode” hid videos that featured LGBTQ+ content, including coming out stories and other videos that weren’t considered explicit. The move did more harm than good, with Twitter users starting a #YouTubeIsOverParty hashtag and content creators calling out the platform for discrimination. On the other hand, social media users have also criticized Twitter for its inaction in removing harmful content from its site.
Users have yet to see if Instagram will be more active in its approach by removing hurtful messages or content, or by providing a support system, as it does for people who use words related to self-harm on the app (a helpful tool that users have already found their way around). Will it censor certain words or hashtags? Will it focus on bullying, misleading or harmful images, toxic communities, or all of the above? Will an algorithm make these changes, or will actual employees be responsible for assisting the app’s users? Only time (and future updates in the App Store) will tell.