SXSW 2022: Whistleblower Says Decentralization May Fix Facebook’s Fake News Problem

Frances Haugen compares Facebook’s practices to those of Google and Twitter

Deborah Yao, Editor, AI Business

March 16, 2022


Frances Haugen made waves in 2021 when she stepped forward to reveal the extent to which Facebook allegedly prioritized profits over materially curbing misinformation on its platform. The former Facebook product manager said this torrent of fake news has real, harmful consequences for people and society.

She laid the blame at the feet of founder and CEO Mark Zuckerberg: “He knows he has tools he can use today to stop misinformation.”

But using those tools could mean reducing the amount of content and user engagement on the platform, which translates into lower profits that would disappoint shareholders and Wall Street. “The system today is more profitable,” she said at the SXSW 2022 conference in Austin, Texas.

Asked if AI will solve this problem, she said “at some point, AI will be able to,” but the issue is teaching computers about nuance. “It’s hard for us to teach nuance to computers,” such as how to discern what is hate speech and what content could inflame people.

Haugen said she used to be skeptical of decentralization (the core feature of Web 3.0, in which myriad centers of control replace a few dominant mega-platforms) and of DAOs (decentralized autonomous organizations) as a possible solution.

Now, she thinks “this is doable” if social media carried only content from family and friends.

Facebook vs. Google and Twitter

Haugen, who used to work at Google, said the tech giant does not have Facebook’s problem of misinformation because it is more open about its data and activities.

For example, users can download Google search results to discover what is included and what is excluded. The company has full-time engineers on search who write blog posts explaining how it works.

“Facebook is different. Facebook is a closed system,” she said. “We can’t easily download all the results. That means a lot of personalization. … We don’t know if the experience we are having on Facebook is the same as everyone else’s. Facebook uses this to its advantage.”

Twitter, which also has been accused of being a platform for inflammatory content, at least has a tool that would reduce the amount of misinformation on its platform, Haugen asserted.

“They put a human in the loop. If you want to share a link on Twitter, you have to click on it,” she said. “That little piece of friction lowers misinformation by 10% to 15%.”

Haugen mentioned another solution Facebook could easily deploy: “If you require someone to cut and paste instead of click to share, it’s like having a third-party fact checker.”

But these changes could reduce content and user engagement, which would hurt Facebook’s metrics, such as monthly or daily active users, when it is time to report quarterly earnings.

Haugen made a point of saying that employees at Facebook are not “bad.” Rather, “it’s about product choices that give the most reach to the most extreme ideas.”

About the Author

Deborah Yao

Editor, AI Business

Deborah Yao is an award-winning journalist who has worked at The Associated Press, Amazon and the Wharton School. A graduate of Stanford University, she is a business and tech news veteran with particular expertise in finance. She loves writing stories at the intersection of AI and business.
