Facebook Redefines Social Platform

A social platform? A news platform? A brand safe platform? Can Facebook really be all of these things? And most importantly, can it do all of these things well?

Zuckerberg took a clear stand in his testimony this week about what he wants Facebook to be. It will be interesting to see where the rubber meets the road.

He said, “My top priority has always been our social mission of connecting people, building community and bringing the world closer together. Advertisers and developers will never take priority over that as long as I’m running Facebook.”

Yes, users have to come first in order to have a platform with enough reach and engagement to warrant significant advertising spend (i.e., generate revenue). That priority makes perfect sense. The real test will be enabling a social environment that drives the engagement he’s looking for while ALSO remaining brand safe, so that advertisers stay confident.

What does it mean to put the user first? It means providing a valuable experience and maintaining trust. Facebook understands the valuable experience part of the equation: it has 1.7B active users on the platform worldwide (eMarketer 2018).

Maintaining user trust is where they certainly have work to do. Users need confidence that they control their own information and that the terms they agreed to are upheld. User control translates to easy-to-understand, easy-to-navigate settings, and Facebook is working to improve this. User data terms also need improvement: terms need to be written in a fashion that most people can understand. Unless you’re an attorney or work in the data/tech field, user terms are not user-friendly, and that’s not an issue exclusive to Facebook.

The #deletefacebook movement is not something to ignore either, even though Zuckerberg noted that it has had no “meaningful” impact. In a representative sample of 1,000 U.S. Facebook users, 17% of respondents said they deleted the Facebook app from their phone over privacy concerns, and 35% said they were using Facebook less than they used to over the privacy issue (Business Insider).

If Facebook wants to be a news platform, or at the very least a platform that doesn’t spread misinformation (or, as Zuckerberg also notes in his testimony, “divisive” information), it will need an editorial staff just like a news organization. Policing misinformation is in the realm of operating as a news organization. So we’re all on the same page: misinformation is defined as “false or inaccurate information, especially that which is deliberately intended to deceive.” Teams can fact-check, hold content to balanced and fair treatment, and run an editorial process before content is allowed to be published. There could also be a reactive review process for published content. How will this hamper the timeliness and richness of social sharing?

This notion of mitigating “divisive” information treads dangerously close to infringing on free speech if we consider its full ramifications. How does this support or subvert Facebook’s core mission? Divisive is defined as “tending to cause disagreement or hostility between people.” Facebook is proud of its role in social change and activism, which, by virtue of driving change, is divisive. Zuckerberg also stated, “It’s not enough to just connect people, we have to make sure those connections are positive.” News is not always positive. Social activism is not always positive, though it may ultimately lead to a positive social outcome. How will Facebook draw the line, and then walk it, when deciding what types of information are divisive yet worth the risk of publishing?

This point of being a news-like platform goes hand-in-hand with being a brand safe platform. If Facebook has an editorial staff, or a content review staff, it will be able to police user-generated content to the degree that it can provide a reasonably brand safe environment. Human review, intervention, and input are essential to augment the AI systems that review content.

The big question is, will Facebook invest enough in this review team to bring about the change it’s looking for? And will this activity truly put users first, maintaining a valuable social communications platform as well as an environment that supports advertising? Zuckerberg stated in his testimony that Facebook now has 15,000 people working on security and content review, and will have 20,000 by the end of this year. He closed this statement on security by saying, “I want to be clear about what our priority is: protecting our community is more important than maximizing our profits.” There is certainly a balance to be struck, but this is no simple task.

Zuckerberg's 4/11/18 Testimony

Photo Credit: AP

Malinda Gagnon

Malinda is CEO at Uprise. She has more than 20 years of experience in business strategy and technology at companies including Google and WPP, and has advised clients such as Procter & Gamble, General Electric, VW, BlackRock, and Walmart.
