Leaked internal communications have given us a rare, unadorned look at Facebook’s self-assessments and deliberations about the influence the company’s product designs and decisions have on people.
Perhaps the public and Facebook would benefit if these little glimpses weren’t so unusual. Facebook and other internet powers could help us understand the world by showing us a little more of the messy reality of hosting virtual meetings for billions of humans.
Something that has pleasantly surprised me in the reports on the documents compiled by Frances Haugen, the former Facebook product manager, is the attention and care that Facebook employees seem to have devoted to evaluating the company’s apps and the ways they shape what people do and how communities and societies behave. Facebook, show us this side of you!
Casey Newton, a technology writer, made an argument about this last month: “What if Facebook routinely published its findings and allowed its data to be audited? What if the company made it much easier for qualified researchers to study the platform independently?”
And what if other companies in the tech sector did the same?
Imagine what would have happened if Facebook had clearly explained the difficulties it faced in restricting posts with false claims of fraud after the 2020 US presidential election, and how that process risked silencing legitimate political debate.
What if Facebook had publicly shared its private assessments of how features that make posts easy to share amplified hateful or bullying posts?
Imagine what would happen if Facebook employees involved in major product design changes could – just like US Supreme Court justices – write dissenting opinions and explain their disagreements to the public.
I know that part of it, or all of it, sounds like fantasy. Organizations have legitimate reasons for keeping secrets, including protecting their employees and customers.
But Facebook is not just any organization. It is part of a small number of corporations whose products help shape the way humans behave and what we believe.
Learning more about what Facebook knows about the world would improve our understanding of ourselves and of Facebook. It would give outsiders a chance to validate, challenge, and contribute to Facebook’s self-assessments. And it could make the company a little more trusted and better understood.
Facebook has already said that it believes the reports on its internal communications lack context and nuance. Part of its response has been to clamp down on internal deliberations in order to minimize leaks. In my conversations with people in the tech industry this week, a fear emerged: that the reaction of Facebook, YouTube, Twitter and other companies to these weeks of incessant coverage will be to research the effects of their products less, or to hide more of what they discover.
However, another option would be to become more transparent and reveal much more. And that wouldn’t be crazy for Facebook.
In 2015, the company published and discussed research by its data scientists that found that the social network did not worsen the problem of “filter bubbles,” in which people see only information that confirms their beliefs. In 2018, Mark Zuckerberg published a lengthy post detailing the company’s analysis of how people on Facebook responded to obscene or offensive material. That same year, Facebook unveiled an ambitious plan to share vast amounts of user data and posts with outside researchers studying harmful information.
These efforts were far from perfect. In particular, the independent research consortium was hampered by erroneous data and disputes over protecting people’s privacy. But the initiative shows that Facebook, at times, has been willing to be more transparent.
Nathaniel Persily, a professor at Stanford University School of Law who was previously co-chair of the research consortium, recently drafted text for legislation that could give independent researchers access to information about the inner workings of internet companies.
Persily told me that he considered the research consortium “roadkill on the road to something glorious.” That glory would be both voluntary and mandated transparency from the big internet companies. Persily praised Twitter, which last week published an analysis of the ways its computer systems in some cases amplified right-leaning political content more than left-leaning content.
Twitter’s analysis was incomplete. The company said it did not know why some messages circulated more than others. But Twitter was honest about what it knew and what it didn’t, and gave the public and researchers openings to dig further. In other words, it showed us the mess.