If a service is free, then the users are the product

In 2019, the BBC launched an investigation into the safety of children active on TikTok. The results were shocking. Over a period of three months, the BBC collected hundreds of sexually explicit comments and messages posted on videos uploaded by children and teenagers. TikTok stated that it was doing its best to prevent such comments, or at least to remove them from the platform as quickly as possible.

The BBC did find that TikTok removed the majority of these comments within 24 hours. However, the accounts from which the messages were sent often remained online. The research also showed that it was very easy for children under 13 to open an account, even though the platform's stated minimum age was 16.

The platform also said it had no plans to take further steps to verify users' ages. The Commons Digital, Culture, Media and Sport Committee was shocked and demanded further investigation.

At the time of writing, 2021 is drawing to a close, and the question is whether anything fundamental has changed in the past two and a half years. In September 2021, the Wall Street Journal published a series of articles. Based on numerous internal Facebook documents, the business newspaper reported that the company put its own interests above the public interest. The documents also suggested that the company knew Facebook was damaging the mental health of children and adolescents in particular. In early October, Frances Haugen, the whistleblower who had passed the documents to the Wall Street Journal, testified before Congress. In her testimony, she pointed, among other things, to the harmful influence of Instagram, a Facebook subsidiary, on the mental and physical health of adolescents. According to her, this was deliberate policy, and the leadership of Facebook and Instagram was aware of it.

Drugs

Of course, Zuckerberg and company denied everything emphatically and claimed the evidence had been taken out of context. Facebook and Instagram, they insisted, did their utmost to protect young users. Several studies from around the world give ample reason to question that claim. The Tech Transparency Project (TTP), for example, found that it was child's play for children and adolescents to contact drug dealers via Instagram. According to its director, Katie Paul, it took TTP 20 seconds to reach a dealer. Worse still, once an account connects with a drug dealer, Instagram recommends other accounts that may also have drugs on offer. Paul noted wryly that it takes two steps to reach a dealer, but five steps before someone can log out.

Instagram's bad behavior is nothing new. As early as 2018, research by Tim Mackey showed that Instagram was capitalizing on opioid addiction in the US. Instagram's problem is that it never lived up to the expectations of its parent company, Facebook. That is why the platform brought out the heavy artillery from 2018 onwards: from that year on, it spent roughly its entire marketing budget of approximately $390 million on targeting mostly teens in the 13-15 age bracket.

Alcohol

TTP's research does not stand alone, but fits within a broader body of research. In April of this year, for example, a report was published in Australia under the title "Profiling Children for Advertising: Facebook's Monetization of Young People's Personal Data". The report details how Facebook collects personal data from teens aged 13 to 18. Based on this data, it builds profiles of minors who may be susceptible to alcohol or to other high-risk interests such as gambling and smoking, and sells those profiles to advertisers. That is no small beer: the numbers involved are very large. Another report, prepared by VicHealth, the Foundation for Alcohol Research and Education and the Obesity Policy Coalition, found that in 2018 an estimated 940,000 children were profiled by Facebook as being interested in alcohol.

Sex

However, it can always get worse. That, at least, is the conclusion of an investigation into online child sexual abuse in Kenya. The so-called Disrupting Harm report, drawn up by Interpol and UNICEF among others, shows that Facebook accounts for 90% of all reports of online sexual abuse, followed at some distance by WhatsApp, Instagram and YouTube. Facebook and WhatsApp show the most abuse because they are the most popular platforms in Kenya. There were reportedly more than 14,000 incidents in 2021 alone. The corona crisis only seems to have worsened the situation, as adults and children alike are confined to their homes more often. The research also shows that it is often acquaintances who approach the children and teenagers. The predators' main target group is teenagers between 13 and 17, boys and girls alike.

It is not just about the predators' private gratification, by the way; there is often a commercial side to it as well. Teenagers and children are also encouraged to engage in online sexual activities that are then live-streamed abroad. This fits in with Kenya's reputation as a destination where foreigners seek sex with minors.

Legislation

Every investigation ends with the same complaint: oversight and legislation fall woefully short of protecting children and teens, where they exist at all. In the US, for example, the Children's Online Privacy Protection Act (COPPA) dates from 1998 and was amended for the first time only in 2013. That inertia now seems to be shifting, thanks to the testimony of Frances Haugen and the Wall Street Journal's reporting. Some skepticism is in order, however. Congress has used big words before when questioning the top executives of companies such as Facebook and Google, but it never came to action, and legislation kept lagging hopelessly behind reality.

Things are better in Europe, but far from ideal. As noted at the beginning of this article, the BBC investigation made the call for better legislation in the UK grow louder. As things stand, the Online Safety Bill will come into effect in 2022. The law is meant to better protect children and adults against the excesses of online platforms. That sounds better than it is, because the protection does not go as far as originally intended. At least, that is what Nadine Dorries, Secretary of State for Digital, Culture, Media and Sport, claims. The power of Big Tech is far from broken. The same applies to continental Europe: regulators are starting to move there as well, but effective legislation is still a long way off.