In mid-September, Frances Haugen, a former product manager at Facebook, released a trove of internal documents to lawmakers and the press, revealing that Facebook knowingly harmed the mental health of teenage girls. Research conducted by Instagram, a Facebook platform, showed that 13% of British and 6% of American teens who had suicidal thoughts traced them to Instagram. Furthermore, 1 in 5 teens say that Instagram makes them feel worse about themselves.
Now for many, this isn’t anything new. We’ve all learned from personal experience or from documentaries such as The Social Dilemma about the dangers of social media and its negative impact on mental health. The issue here, however, lies in two specific details. First, Instagram’s own research shows that many of the mental health issues teens face from social media can be attributed specifically to Instagram, rather than to other platforms or to social media in general. This is due to Instagram’s focus on personality, body, and lifestyle, which TikTok or Snapchat, for example, do not share. Instagram created the sense of a “perfect image,” whether through the content of posts themselves or through damaging filters that alter the appearance of your face to make it more “ideal.” Second, Facebook was aware of the issues Instagram was causing but failed to reveal that information or do anything about it. Adam Mosseri, the head of Instagram, even told reporters that Instagram’s effect on the well-being of teens was “quite small.” Furthermore, Mark Zuckerberg, Facebook’s CEO, claimed that research showed social media apps have a positive effect on mental health. Both of these statements contradict the company’s own research on Instagram’s effects on mental health.
There are a number of potential reasons why Facebook didn’t reveal the results of its research to the public, including protecting the company’s reputation and profits. But this raises a number of ethical questions. When should the safety of users come before a company’s profit? Should companies be required to disclose research showing how they may be harming their users?
Furthermore, in her recent testimony to Congress, Haugen compared Facebook to tobacco companies and automakers, arguing that social media should be regulated in the same way tobacco and seatbelts are in order to protect the safety of its users. Should the government regulate social media? At what point should the government step in?