From political manipulation to dangerous addiction, Roger McNamee insists that Facebook needs to change its business model to be less harmful.
Last month it was a former Facebook executive saying the social media behemoth is "ripping apart the social fabric." This month it is Facebook founder and CEO Mark Zuckerberg's former mentor, Roger McNamee, saying in an interview with CBC Radio last week that Facebook's current business model is "parasitic," that it "harms users and can't go on."
Facebook has faced strong criticism over the past year. The two main issues are its role in news manipulation during the 2016 U.S. presidential election and growing public health concerns stemming from the site's addictiveness.
Zuckerberg responded by promising on January 4 to fix Facebook:
"Facebook has a lot of work to do -- whether it's protecting our community from abuse and hate, defending against interference by nation states, or making sure that time spent on Facebook is time well spent."
Zuckerberg said he will convene a group of experts to discuss and work through "questions of history, civics, political philosophy, media, government, and of course technology."
It sounds vaguely noble, but McNamee, who is a managing director at Elevation Partners and was an early investor in Facebook, said it's likely to fail for two reasons:
First, Zuckerberg and Facebook view the recent criticisms of Facebook as a PR problem, rather than a reflection of a systemic issue in the product itself.
Second, problems like addiction and 'fake news' are embedded in the very architecture of Facebook. Russians did not hack Facebook; they simply used it in a way Facebook allows it to be used, which itself raises important ethical questions. As McNamee says,
"The design makes it exceptionally vulnerable to manipulation by bad actors... There is no way to address them without making the business significantly less profitable than it is today, or by scrapping the model in favor of a different one."
McNamee suspects that Facebook, as a company, is having difficulty processing negative feedback after 12 years of being loved and adored. It is tough to admit there's a dark side to something that is so fun and good in other ways. But Facebook has a responsibility to address this dark side, as other companies have had to do in the past, even if they're not the ones directly at fault for the way in which their product has been used.
For example, consider the Tylenol poisonings of 1982. As McNamee argues in an opinion piece for the Washington Post, the poisonings were not the fault of Tylenol maker Johnson & Johnson, but the company still took "immediate and aggressive action. It took every bottle of Tylenol off every retail shelf and redesigned the packaging to make it tamper-proof."
McNamee wants Facebook to do something similar. It must change the model that rewards advertisers for promoting fear and anger, which currently fuel the most shared and, therefore, most profitable posts on Facebook. Not surprisingly, calm and constructive information does poorly by comparison.
Facebook is the only entity that can break through the arguably dangerous filter bubbles it has created for its users: "When Facebook says we give people what they want, what they're really saying is that their goal is to reinforce existing beliefs, to make them more extreme."
It will be interesting to see how Zuckerberg and his team proceed throughout 2018. Facebook is losing respect and appeal at a rapid rate, particularly among younger generations, who view it as somewhat dated and gravitate toward other social media platforms. Without a doubt, the company has a tough road ahead.