In the recent past we have heard how social media has been used to drive out governments in different parts of the world. Some governments censor social media to prevent such a fate. We employ some of the same censorship practices in education as well; at least that is true where I live.
The bigger question is what role social media platform operators and creators should play in censoring content. For example, Facebook has over 800 million users, more than the population of most countries. Should Facebook decide what those 800 million users can say?

Since we in the U.S. believe we can say whatever we want, I decided to investigate whether this was really true. In my graduate class this week we were discussing whether employers should moderate social media posts by their employees. Since most social media tools are designed to be self-governing, we assume it is entirely up to the general public to police content and that things will turn out fine. It works for Wikipedia, so why not for other tools?

What I found was that the tool creators routinely censor what people say when it does not conform to their ideals. One example I found this week came from Facebook, where a disgruntled employee released a censorship list that Facebook maintains. Some of the content they censor seems aimed at keeping Facebook clean, but when they censor politically charged content, isn't that somewhat dangerous? Should Facebook decide what ideology we subscribe to, or should we decide what we consume? This goes against the very nature of a social network policing itself.