A common complaint levied against the social media landscape is that there are lots of “bad actors” spreading disinformation. While some social media platforms lay the blame on these individuals or groups, much of the problem is driven by the platforms’ own business models and algorithms.
The Advertising Model is Broken
The biggest problem with the current social media industry is the internal architecture and design of the platforms themselves. While algorithms are ostensibly meant to optimize user engagement, in practice they collect user data and feed it into highly profitable advertising models. As outlined in the Netflix documentary “The Social Dilemma,” the business model for social media platforms tracks people’s activities and then serves them more of the same content. This creates a feedback loop in which the most emotionally provocative content wins out, even when it is factually misleading and potentially dangerous. This is the most socially damaging dimension: interest in a fringe idea spreads exponentially, lending it “validity” as it enters mainstream thought.
Users are manipulated by these algorithms because they are designed to increase time spent on the platform. It is an addiction-based model that doesn’t account for any negative impacts of a user’s participation, only the calculus of time spent on the site and its monetary conversion into advertising. Interviews throughout “The Social Dilemma” discuss the downsides of the social ad model, especially the ways these platforms harvest personal user data to serve targeted ads and keep users addicted to scrolling and clicking.
For an example of the content algorithm problem in action, consider Facebook. To shape the Facebook algorithm toward the content they want, users can “game the system” by liking a lot of related content. Or, if a company hosts a big annual event, interested users who want to attend will likely see a paid advertisement for the event, surfaced by the Facebook algorithm, before they receive an invitation from the host.
Re-establishing A Basis for Reality Online
The feedback loops created by social media’s algorithms keep people away from the content they actually want to see. This problem presents an opportunity for a new type of social platform built on quality, vetted content and sharing. For example, new platforms could better enable the sharing of social activism messages about environmental issues or causes such as BLM. The current social media landscape prevents people from engaging in constructive discourse about a given issue because they aren’t leveraging verified, fact-based content as a sounding board. There’s a pressing need for a new type of social media platform that serves networks of professionals, such as journalists, policy makers, and concerned citizens.
Ideally, relevant information would flow through social networks in ways that help people work together to solve difficult problems, such as climate change or income inequality. Groups formed by people with similar interests and areas of expertise could perform content filtering themselves, rather than being at the mercy of a revenue-focused algorithm. Together, groups can create and publish credible, sourced information, establishing a system based on trust and accountability. Continuing to share useful information, in turn, builds trust in the group.
Tru Social, a newly developed social network still in private beta, appears to be the only social platform available that uses this sort of design approach. The company created technology that allows people to see the chain of publishing history for content and to verify, ignore, or dispute information published on the platform. Although the company does not use a blockchain, it does use some of the same underlying technology, applied in an entirely different paradigm. Tru’s approach to content sharing allows connected groups of people to separate credible sources of information from trolls, bad actors, or well-meaning but confused people who promote misinformation or propaganda.
People build networks based on trust, enabling them to engage and make better-informed decisions based on information vetted by networks of other parties with credible reputations. This in turn allows the broader community to learn what information to accept. People and groups using Tru develop a sort of “reputational currency” based on their track record, with third-party validations and verifications accumulating as they publish more content. Instead of sharing emotionally charged content in a single step, Tru adds group curators who act more like editorial publishers with a reputation at stake. This drives discussion and conversation within the group centered on credibility and facts.
An Opportunity for Change
Changing how social media platforms operate won’t happen overnight. The widely used platforms can’t simply remove misleading content and ship small feature updates that adjust algorithms, because the model itself is broken. While the prominent platforms are free to use and many companies and people benefit from them, most have strayed far from their mission of connecting people and have instead shifted their focus to the bottom line. They are using people’s personal data to generate massive sums of money. Beyond paying lip service to moderation or “fake news,” there’s little financial incentive for the platforms to make widespread changes to the way they function. And even individuals who leave Facebook or Twitter are still “there” in a sense, because the data they left behind continues to shape the content other people see.
A growing population of people want more transparency and control over their social media engagement, and stronger commitments from companies about where their data goes and what it is used for. The attention garnered by “The Social Dilemma” brought broader awareness to the tactics used by social platforms. It encouraged people to reconsider their participation in mainstream social media environments and to actively use some of the newer social outlets already out there that are designed around factual trust instead of advertisements and algorithms.