Facebook follows the lead of the tobacco industry

In September of this year, the Wall Street Journal published a series of notable and critical articles about Facebook's practices. It later turned out that Frances Haugen had provided the business newspaper with the underlying documents and data.

Teenage girls

One of the articles revealed that, based on its own internal research, Facebook knew or could have known that platforms such as Instagram can adversely affect the mental well-being of teenagers, and of teenage girls in particular. The issues include feelings of anxiety and depression, thoughts of suicide, and dissatisfaction with one's own body.

Of course, when the business newspaper published these articles, Facebook immediately jumped into action. The company claimed that the data had been taken out of context, that the story was therefore subjective, and that the company was not at fault. Facebook then published its own annotated version of the slides that had been leaked to the Wall Street Journal, to provide more context in its defense. From the notes it can be concluded that the data is based on the subjective observations of the participants, and that the study was not specifically intended to map the positive or negative effects of using Instagram.

Qualitative research

According to Facebook spokespersons, this was also a qualitative study: purely subjective information based on questionnaires and conversations with Instagram users. The study was not intended to determine, based on data, how often users experienced feelings of depression or developed problems with their body image. Sure, some users brought up these issues, but Facebook said this happened too sporadically to establish a clear link between teenage girls' mental health and their use of Instagram. The study was said to be primarily intended to gain insight into how users experience Instagram's products.

Outside experts, such as Melissa Hunt of the University of Pennsylvania and Kaveri Subrahmanyam, shrug at Facebook's defense. They acknowledge that much sociological research on the relationship between mental well-being and social media is quite ad hoc in nature. Asking people every now and then how they feel yields little substantial and lasting information.

In that sense, Facebook's defense is not untrue: you cannot draw serious conclusions from the company's data alone. However, the company's research does not stand alone. A multitude of serious studies by outside researchers is already available, to Facebook as well. A good researcher links their own data to the findings of third parties who have already seriously studied the influence of social media on the mental health of users.

If those studies kept pointing to the same conclusions, Facebook could at least have taken a more serious look at its own findings. Researchers within the company could then have concluded that the results of their studies at least resemble the results of other serious research. That should have set off some alarm bells. It did not happen, and that alone is alarming. All the more so since Frances Haugen's documents show that much was already known within the company about the possible impact of its platforms on its users. Facebook already knew that a change in its algorithms made the tone of conversations more aggressive and more focused on sensational content. From there, it is no longer a particularly big step to extreme expressions.

Knowledge about the possible negative consequences of using its platforms is most likely already present at the company. Back in 2012, Facebook, in partnership with Cornell University, found that it was quite possible to manipulate users' moods by changing the content of their news feeds. When the results of the study were published in 2014, it turned out that the researchers had not asked the users for consent. That was not comme il faut, but it was not illegal either. The publication also made clear how easy it is for a company like Facebook to collect mountains of data from and about its users.

Big Tobacco

Facebook's defense and actions are reminiscent of Big Tobacco's actions and defenses several decades ago. Both Facebook and the tobacco industry benefit from minimizing possible negative outcomes. Both benefit from users using their product as often as possible, even if that use is not healthy.

A few decades ago, tobacco companies knew full well that their product was highly addictive. In fact, they had engineered that addictiveness themselves. When the industry faced the threat of prosecution, it tried to defame the whistleblowers, just as Facebook now tries to portray Ms. Haugen in an unfavorable light.

Facebook naturally rejects this analogy. According to a spokesperson, smoking cigarettes cannot be compared to downloading a social media app on your smartphone: millions upon millions of people enjoy the app. That was a somewhat unfortunate statement. Back in the 1960s, a top executive of the tobacco industry made the exact same argument at a Congressional hearing: millions of people around the world enjoy lighting up a cigarette.


The Verge, Facebook isn’t telling the whole story about its mental health research. October 28, 2021

Washington Post, The case against Mark Zuckerberg: Insiders say Facebook’s CEO chose growth over safety. October 25, 2021