We need to talk... About social media.
By Jedai Saboteur
December 14, 2019 11:42 AM
The rights people have on the internet, specifically on social media platforms, have become a contested topic lately. In watching the discourse, I can’t help but feel that many entering the discussion don’t have as firm a grasp of the relationship between platform operators and end users as they believe they do. I’d like to clarify some things along those lines.
With our ability to move seamlessly throughout the web, the true nature of how things work in the background can easily get lost. We can think of the internet as a public space in a sense— like roads and highways leading to different buildings. Those buildings are the websites we access, and like the real world, those buildings have owners that determine who is and isn’t allowed access to the space. You can think of having an account as permission to enter and participate in whatever’s going on inside— permission granted by the operators of the platform.
What’s important to take away here is that these buildings— sites— are privately owned and while many must follow the ‘laws of the land’, they each also have their own sets of rules that they may or may not enforce.
So, with that out of the way, I want to make one thing clear: No one is entitled to an account on a web platform. When you access a website/service, you are (in keeping with our analogy) in someone else’s building and you are there of your own will. You may leave at any time or may also be told to leave.
At this point in time, that’s reality. That’s how it works. That brings me to moderation.
In that regard, the operators of the site have the final say. It is, again, their space. Their ‘digital building’ that’s being occupied.
Claims of bad moderation abound on Twitter and Facebook— and sometimes for good reason. Moderation isn’t always done right, especially when it involves hundreds of thousands (or millions) of users. Bad calls will be made. People will be unfairly terminated, suspended, or otherwise see consequences. Moderation tools are even utilized by bad actors to remove people whose ideas they disagree with. Moderation hasn’t yet ‘scaled up’ like much of the tech that needs it has.
Even with its failings, moderation is a necessary tool. Social media platforms are just as much communities as they are services. As these communities grow ever larger, bad actors will seek to use them to exploit people, amplify hate, and stoke violence (though, often large platforms tend to move far too late). When your personal door is so wide open, so to speak, that anyone in the world can just walk in on a whim from wherever they are, you need a form of keeping the community you have safe.
The most contentious point I’ve observed is the recent claim that conservative/right-leaning voices are being discriminated against on large platforms. This is a claim that hasn’t been proven, but it’s stoked enough anger in people to protest, whether true or not. Often cited are figures like Alex Jones (whose conspiracies have led to real-life ramifications for the survivors of shootings and their families) and Milo Yiannopoulos (who really just peddled hateful provocation at anyone to the left of himself and very possibly stealth-nazism), as well as some of the other well-known far-right/alt-right names (who have their own various baggage many other people have gone over at length, so I will not).
One point of frustration I’ve seen from this group of angry folks is the claim that their freedom of speech has been taken away from them. That they’ve been censored. That sites like Facebook and Twitter should operate with neutrality in terms of moderation or be punished.
But that’s not how it works. Social media platforms aren’t government entities (they should not be). They aren’t utilities (they should not be). They’re private entities that allow people to use their services (well, you pay with your personal data but… that’s another story). They make the rules. The rules they make might not be fair— though in the aforementioned case, there’s still no proof of unfairness to begin with.
The point is that ‘fair’ and ‘unfair’ are moot here. Fair, for the end user, is a state that can change at any second on any platform. A platform's terms of service can always be updated, and they don't even have to follow those to ban or suspend people. The only fair that matters is what they say is fair in the end.
That does also hit people who actually do mean well and get unfairly terminated or suspended— and ideally there should be good appeals processes, though the ideal and reality often differ. It still, unfortunately, comes down to the issue of who owns the site and has the final say.
On a deeply personal level, I don’t believe Facebook or Twitter is on a campaign against conservatives, but I do believe that they’re (very slowly) taking instances of hate speech, threats, and blatant attempts to recruit people into violent ideologies more seriously. People are realizing they can put pressure on social media companies to take action against some of the forces driving hateful agendas…
At least sometimes.