Bringhurst: Facebook Regulation is Long Overdue


By Maggie Bringhurst, Opinion Writer

 

In September 2021, former Facebook employee Frances Haugen leaked thousands of the company's internal documents to The Wall Street Journal. The leak sparked a bipartisan conversation about the harms of social media.

“Facebook consistently resolved those conflicts in favor of its own profits. The result has been a system that amplifies division, extremism and polarization — and undermining societies around the world,” Haugen said in her written testimony to Congress.

Facebook had felt heat from Congress even before this incident. This time, legislation seems more feasible: many now consider Facebook a legitimate threat to democracy, making it an issue that crosses party lines.

Facebook leverages hate speech and misinformation to drive engagement on the platform, creating a polarizing environment that has spilled into real-world politics.

The United States government should impose regulations on big tech companies to limit harmful algorithms and promote safer social media usage.

No Regulation Means Valuing Profit over People

Facebook's engagement-based ranking algorithm consistently prioritizes content that provokes user engagement through shares and comments. Unfortunately, people are more likely to engage with inflammatory content. And because Facebook values engagement over possible negative ramifications, such content is more likely to show up in someone's feed.

“Facebook’s own research says they cannot adequately identify dangerous content,” Haugen said. Facebook is aware that its algorithm promotes divisive content and that its efforts to remove such content have been largely unsuccessful.

Though much of this knowledge is new to the public, it isn’t new to Facebook. In 2016, an internal presentation reported that Facebook wasn’t only home to many online extremist groups — it actively recruited users to join them. “64% of all extremist group joins are due to our recommendation tools,” the presentation said of the “Groups You Should Join” and “Discover” features.

“Misinformation, toxicity, and violent content are inordinately prevalent among reshares,” Facebook researchers said in internal memos.

“It is clear that Facebook is incapable of holding itself accountable,” Sens. Richard Blumenthal (D-Conn.) and Marsha Blackburn (R-Tenn.) said in a joint statement, attributing to Facebook’s leadership a “growth-at-all-costs mindset that valued profits over the health and lives of children and teens.”

The Need for Government Intervention

Mark Zuckerberg approved the creation of the Oversight Board in 2018, and it is now the source of most of Facebook's content regulation. The board makes rules and sets precedent for content moderation on the platform.

In May 2021, the Oversight Board responded to the Jan. 6 insurrection at the Capitol by upholding Facebook's decision to suspend Donald Trump and suggesting additional changes. Facebook implemented some of these suggestions, but not all.

One suggestion was a request that Facebook release reports on the error rates and consistency of its cross-check system compared with ordinary enforcement procedures. Facebook rejected this change, saying it wasn't feasible.

Thanks to the whistleblower, we now know the cross-check system allowed accounts to share claims that Facebook’s fact-checkers deemed false, including that vaccines are deadly and that Hillary Clinton covered up “pedophile rings.”

Facebook implemented a third-party fact-checking system after being criticized for allowing misinformation to spread during the 2016 election. This also hasn't been entirely successful. “Facebook failed to put fact-check labels on 60% of the most viral posts containing Georgia election misinformation that its own fact-checkers had debunked,” Business Insider reported in 2020.

If Facebook can simply deny requests from its supposed regulators, then this type of regulation isn't enough. We cannot rely on companies to value people over profit — we need enforceable intervention, such as a government agency.

What Regulation Should Look Like

Congress has debated Facebook's role in Muslim genocide and COVID-19 misinformation, but had difficulty landing legislation because those issues were partisan. The new information showing Facebook's disregard for children's safety has incited a conversation that transcends party lines.

Currently, Section 230 of the Communications Decency Act shields websites from liability for the content posted on their platforms. Legislators are now considering reforms to this section.

Companies shouldn't shoulder the blame for unlawful posts, because it's impossible to review every post on a widely used site such as Facebook. But they should be held liable for the algorithms they create and sustain.

We need a new government agency dedicated to monitoring algorithms and enforcing standards for them. Companies would still develop their own algorithms, but the agency would set the behavioral standards those algorithms must meet.

A new agency designated to regulate social media would invite transparency and “address the worst negative externalities of virality,” said Renée DiResta, research manager at the Stanford Internet Observatory.

Others argue a government agency wouldn’t be enough. “Lawmakers would be more effective in going after the company’s primary source of profit: personal data,” said Siva Vaidhyanathan, director of the Center for Media and Citizenship at the University of Virginia. Requiring companies to delete data after a certain time would limit targeted ads and algorithms.

Some worry about writing legislation for today's problems, since those problems may be irrelevant by the time a bill passes. But a new government agency could research current social media behaviors and adapt its standards accordingly.

 

Facebook has proven its incapability to regulate itself time and time again. Its rapid growth has left lawmakers stunned and the public vulnerable. Given the impact Facebook and other tech companies have on our personal lives and political world, we must create regulations to ensure a safer online future.