Interview with Frances Haugen, the Facebook Files whistleblower

June 8, 2023

The whistleblower publishes her memoir, The Power of One, on June 13, almost two years after her revelations about Meta. Simple and effective solutions exist to protect the platform's youngest users, but they are still not being applied, she explains.

“If everyone had done their job to protect Lindsay, she would be alive.” These are the words of the mother of Lindsay, a 13-year-old schoolgirl who committed suicide at the end of May after months of harassment. According to her parents, those responsible include the Lille education authority, her middle school, the police, and the Facebook social network, which the family's lawyer deems “completely at fault”. Four complaints have been filed, one against each of these actors. Lindsay is unfortunately no exception; the same tragedy plays out elsewhere. In the United States, families of teenagers and school districts are pursuing a series of lawsuits against Meta (owner of Facebook and Instagram) and ByteDance (owner of TikTok). The reason is often the same: the platforms' negative impact on the mental health of the youngest users, which in some cases can lead to suicide.

In October 2021, whistleblower Frances Haugen revealed that Facebook (now Meta) is acutely aware of what its platforms are doing to young people. The firm's former employee disclosed a trove of internal research known as the “Facebook Files”. One of those studies deals precisely with the harmful effects of the Instagram social network on adolescents: according to it, 32% of teenage girls say that Instagram makes them feel even worse about themselves. The company knows this, and does nothing. Frances Haugen also accuses her former employer of betraying democracy, and of consistently choosing profit over the safety of its users.

Almost two years after sounding the alarm, Frances Haugen, whose memoir is published on June 13, takes stock for L'ADN. Regulation and public awareness are changing, but she still feels the platform is not doing anything meaningful to address the issue.

Tell us about your book, The Power of One. What do you have to add, nearly two years after your appearances in the media?

Frances Haugen: More and more of the economy is run by what I call opaque systems. Unlike a car or any other physical product, Facebook cannot be inspected, checked for manufacturing defects, or audited to confirm what the platform claims. All decisions are made behind closed doors, hidden from users. In my book, I insist on this fundamental difference. If we leave it to the platforms, they will continue to make choices guided solely by their economic performance.

You yourself were part of a team dedicated to fighting misinformation and understanding the biases created by algorithms in order to correct them. Are there people at Meta dedicated to the protection of minors?

F. H.: All we know is that Facebook says it cares deeply about protecting children online. But we don’t know much more. And that is precisely the problem.

Shortly after your revelations, the platform announced a new feature allowing children to set an alert when they spend too much time on the application. What do you think of this solution?

F. H.: This is a stark example of how Facebook appears to take action while, in reality, achieving very little. The feature depends entirely on the goodwill of the user: if you don’t like it, you can dismiss it. I have an alert telling me to turn off YouTube and go to bed, and I tend to ignore it.

Many health problems come from lack of sleep. Teenagers who lack sleep are more likely to turn to drugs and nicotine, and they develop more anxiety disorders and depression. Sleep deprivation is also a major risk factor that increases the probability of dying in an accident. You would think this would be of great interest to Facebook and every other social network. Yet they publish no data on how the features they roll out affect children’s sleep. They could survey users, asking questions like, “Is Instagram causing your lack of sleep?”

There are also other features that are fairly simple to set up. We have known for twenty years that if a product is made slightly slower, by just a few milliseconds, users tend to use it less. Meta could ask a teenager in the morning what time he wants to go to bed that night, and slow down his Instagram feed around bedtime to nudge him toward sleep. The mechanism already exists on Instagram: accounts suspected of being bots are slowed down in exactly this way. Within two weeks, Instagram could implement a feature that would probably save teenagers’ lives and prevent depressive syndromes. And it is not done.

In the 1960s, car manufacturers faced a similar problem. No manufacturer wanted to be the first to address the subject of safety, on the pretext that customers would not understand.
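As a rough illustration of the kind of mechanism she describes, here is a minimal sketch, assuming a self-declared bedtime and a simple linear ramp; the function name, parameters, and thresholds are hypothetical, not Meta's actual implementation:

```python
from datetime import datetime, time

def bedtime_delay_ms(now: datetime, bedtime: time,
                     ramp_minutes: int = 60, max_delay_ms: int = 400) -> int:
    """Extra latency (in ms) to add to feed requests as bedtime approaches.

    Hypothetical sketch: no slowdown until `ramp_minutes` before the bedtime
    the teenager declared in the morning, then a linear ramp up to
    `max_delay_ms` at (and after) bedtime.
    """
    bedtime_today = now.replace(hour=bedtime.hour, minute=bedtime.minute,
                                second=0, microsecond=0)
    minutes_left = (bedtime_today - now).total_seconds() / 60
    if minutes_left >= ramp_minutes:
        return 0                      # long before bedtime: feed runs normally
    if minutes_left <= 0:
        return max_delay_ms           # bedtime has passed: full slowdown
    # inside the ramp window: delay grows linearly as bedtime gets closer
    return int(max_delay_ms * (1 - minutes_left / ramp_minutes))

# Example: the user said they want to sleep at 22:30; it is now 22:10
print(bedtime_delay_ms(datetime(2023, 6, 8, 22, 10), time(22, 30)))  # -> 266
```

In such a scheme, the server would simply wait that many extra milliseconds before returning the next batch of posts, making the feed feel progressively less rewarding as bedtime nears.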

Have things changed since the Facebook Files?

F. H.: For a while, yes. New features were implemented, parental controls in particular. Instagram did not verify the age of its users until August 2021, that is to say, shortly after Facebook had to comment on the internal study showing Instagram’s impact on its youngest users. What worries me is that user safety is a significant short-term cost, and there are still budget cuts at Meta. New waves of layoffs are underway, and some of them affect the moderation and safety teams. I fear that the mental health of teenagers is still not their priority, simply because it is not profitable.

Stricter regulation of the platforms may cause them to change… You recently promoted a law called the Age-Appropriate Design Code Act. What is it about?

F. H.: In the UK, companies must now take children into account when designing a product. This is already the case for physical products in the United States. Applying this rule to digital services means making the most protective settings the default, and then allowing users to adjust them. For example, one could imagine that children cannot, by default, communicate with adults via a messaging service. The idea is to think of users’ well-being before the application’s growth, from the design stage onward. This law was passed in the UK, then in California, and chances are it will be passed elsewhere.

You also praise the merits of the Digital Services Act (DSA), a recent European regulation…

F. H.: Many psychologists and pediatricians have raised the fact that Meta poses a problem for the youngest users, but the company has still managed to evade the question. For me, the DSA goes in the right direction, because it recognizes the existence of the opaque systems I mentioned earlier and tries to change that. The DSA forces these companies to share everything they have been hiding for years, including the risks associated with their products. If a government asks them questions, they will have to answer. The lack of answers to basic questions is one of the things that shocks me most about these platforms. On many occasions, governments say they have asked Meta how many of its moderators speak French, German, or Spanish, and they never got an answer. Furthermore, with the DSA, citizens will be able to question a platform about a risk they themselves have identified.

In France, we are starting to regulate the way parents expose their children on the Web. And in the United States, voices are also rising to demand more oversight. What is your take on this phenomenon?

F. H.: I think this is an example showing that we are entering an era with very different standards. Laws emerge because children are being exploited. I strongly support the idea of giving children rights over their own image, the right to be treated as they deserve, and the benefit of workers’ rights.

What changes should the platforms make?

F. H.: In my opinion, we do not need radical changes. We simply need to rethink our collective relationship with these technologies: a relationship in which we would ask Facebook or TikTok to show us respect, to show us that the platform is actively trying to protect us. Today, we expect nothing from these companies, because we feel grateful to use their products for free. The truth is that they are not free. My mission is to explain that a more desirable future is possible, one in which power is distributed differently between users and platforms. That is what we are going to start seeing in the next five years, and I am pushing for it to happen.

Concretely, what forms could this new relationship take?

F. H.: It could take the form of new rights for users. For the moment, for example, we do not have the right to reset our recommendation algorithms. These systems give the impression of having learned our desires, but those may differ from what we really want. When I was in my twenties, I had to learn to walk again, because I was paralyzed. At that point in my life, I watched some pretty depressing movies and series on Netflix. And then I got tired of it, because that type of content depressed me even more. I moved in with new people who had their own Netflix account, and I was very surprised: I realized that Netflix could also be fun, and that there was great content. The algorithm “thought” I was a person I no longer was, and it kept feeding me content that no longer corresponded to me. Imagine being able to say to the algorithm: we start from scratch, I am no longer the same person.

Today, the way these problems are addressed is often through control. That is what we see in China, in particular, with its censorship mechanisms. What would a design that pushes us toward more autonomy and freedom look like? The platforms treat us as products, the goal being to keep us online as long as possible. They should instead see us as people worthy of respect. For example, when Instagram identifies that certain content depresses teenagers and accentuates their complexes about their bodies, the platform could alert them: “You are watching more and more content that can depress you. Do you want to continue? Do you want to see less of it?” This is technically possible, since the platform already knows how to sort Instagram images into categories: cat, breakfast, clothes, and so on.
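To make the idea concrete, such an alert could be driven by something as simple as the sketch below; the category labels, window size, and threshold are assumptions for illustration, since Instagram’s actual classifiers and signals are not public:

```python
from collections import deque

# Hypothetical category labels that a content classifier might flag as
# potentially harmful for a given teenager (illustrative only).
FLAGGED_CATEGORIES = {"extreme dieting", "self-harm", "body comparison"}

class WellbeingMonitor:
    """Track the share of recently viewed posts that fall in flagged categories."""

    def __init__(self, window: int = 200, threshold: float = 0.3):
        self.recent = deque(maxlen=window)  # rolling window of recent views
        self.threshold = threshold          # share that triggers a prompt

    def record_view(self, category: str):
        """Record one viewed post; return an alert message if warranted."""
        self.recent.append(category in FLAGGED_CATEGORIES)
        if len(self.recent) < self.recent.maxlen:
            return None                     # not enough history yet
        share = sum(self.recent) / len(self.recent)
        if share >= self.threshold:
            return ("You are seeing more and more content of this kind. "
                    "Do you want to keep seeing it, or see less of it?")
        return None
```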

Last summer you launched an NGO called “Beyond The Screen”. What is it working on?

F. H.: In particular, we are trying to define a duty of care, a kind of standard of conduct for social networks, so that they protect users as well as possible. We bring together technologists, legislators, and organizations to pool their knowledge, because when it comes to social media regulation, not everyone has the same understanding of what is technically possible, which makes it difficult to reach a consensus on what the law should be. Our idea is therefore to catalogue the different types of violence that exist online and the many solutions available to address them.

TO READ

Frances Haugen, The Power of One, Little, Brown and Company, 2023 (in English)

This interview is taken from issue 33 of the magazine L’ADN, “Children for sale”, an investigation into what the Internet has done to our children, published on June 19, 2023 and available for pre-order.

