Thinking About User Generated Censorship
by chris on Jun.14, 2012, under general
This fall I will be taking a leave from my job to be a full-time graduate student in CMS at MIT. More on that later. For now, this post lays out the contours of my proposed master’s thesis, both to help me organize my own thoughts and also in the hopes others will help me think about them.
In 2009 a loosely-knit group of conservative Diggers founded the Digg Patriots, a highly active “bury brigade.” Hosted in a Yahoo! Group and facilitated by a variety of post-tracking technologies, the Digg Patriots would link each other to what they deemed unacceptably “liberal” posts or posters so that they could team up to “bury” them by downvoting them into obscurity. According to Phoenixtx, a founder of the Digg Patriots, “The more liberal stories that were buried the better chance conservative stories have to get to the front page. I’ll continue to bury their submissions until they change their ways and become conservatives.”
In 2008, a conservative blogger accused “the left” of a similar strategy: flagging conservative YouTube videos as spam or abusive to get them taken down. And, almost a year ago today, links to a U.K. strike site began to be blocked as spam on Facebook under strange and unexplained circumstances.
These incidents differ in important respects, but they share a common dynamic: end-users repurposing certain algorithms to remove content from the stream of conversation.
It is my argument that today’s dominant information ecosystem, which has widely distributed the means of information production, has also widely distributed the means of information removal: as Internet intermediaries have designed and deployed tools to incorporate “social” feedback into quality assurance algorithms, users have begun to strategically repurpose those tools to silence speech with which they disagree. The goal of my research is to document and define user generated censorship as an emergent practice in relation to the mediating technologies which enable it.
Why “user generated censorship”?
For one, it nicely mirrors and invokes user generated content. Beyond the rhetorical flourish, the invocation has an intellectual purpose: the technological affordances and social practices associated with user generated content are the same affordances and practices which allow for its opposite. Put more plainly: the design of reddit lends itself to the earnest upvote but also to the strategic downvote. The sorts of end-user power and input which characterize social production / Web 2.0 / whatever empower users not only to produce content but also to remove it.
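To make that concrete, here is a minimal sketch in Python; the numbers and the simple net-score ranking are invented stand-ins, not any real site’s algorithm, but they show how a small coordinated bloc of strategic downvoters can bury a post that an earnest audience favors:

```python
# Minimal sketch: a toy net-score feed, NOT any real site's ranking.
# All numbers are invented for illustration.

def net_score(upvotes: int, downvotes: int) -> int:
    """Toy ranking: posts are ordered by upvotes minus downvotes."""
    return upvotes - downvotes

# An earnest audience slightly favors the post: 60 up, 40 down.
organic_up, organic_down = 60, 40
print(net_score(organic_up, organic_down))  # 20 -> ranks well

# A bury brigade of 25 coordinated accounts downvotes it.
brigade = 25
print(net_score(organic_up, organic_down + brigade))  # -5 -> buried

# The brigade is small relative to the audience (25 vs. 100 voters),
# but because every vote is weighted equally, coordination is enough
# to flip the post from promoted to buried.
```

The same arithmetic underwrites both the earnest and the strategic vote; the algorithm cannot tell them apart.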
For another, the word “censorship” is controversial and contested, and I am going to try to use that historical weight to hammer home why this matters. Censorship – as opposed to repression – is something that we think of as being an exercise of centralized power. A pope censors. A king censors. Even a local autocrat draws their power ex officio.
But the reason we worry about censorship has nothing to do with the structure of power which enables it but rather with the results which obtain: the silencing of ideas, of culture, of alternative perspectives.
“Internet censorship” has been done to death in the academic (and popular) literature. But it is all the old dynamic in a new medium. One worries about Google in China – or just plain China or Google alone – because of the power that large centralized authorities can wield over their constituents (and each other).
The Digg Patriots, on the other hand, have no office and no formal power which exceeds that of any other individual user. But through their strategic behavior they were able to repurpose the power usually reserved by and for centralized authority towards their own ends.
This is interesting and new and different, I think. Facebook has a lot of centralized power over the links shared in its news feed. It would never, I think, explicitly put content up for vote: “should we allow people to link to J30Strike?” Nor would it, I believe, allow its engineers to block content with which they politically disagree. But by allowing end users to make a nominally neutral decision (“is this spam”) and then enforcing that decision with the full power of a centralized network, Facebook – and everyplace else – has effectively delegated the power associated with the center of a network to a subset of the nodes at the edges.
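To illustrate that delegation in the abstract, here is a hypothetical Python sketch; the function names, the threshold, and the URL are all invented for illustration, and nothing here describes Facebook’s actual system:

```python
# Hypothetical sketch of the delegation pattern described above: users
# make a nominally neutral judgment ("is this spam?"), and once enough
# of them agree, the platform enforces the verdict network-wide.
# The threshold and names are invented; this is not any real system.

from collections import Counter

SPAM_REPORT_THRESHOLD = 100  # assumed cutoff, purely illustrative

spam_reports = Counter()   # url -> number of spam reports
blocked_urls = set()       # urls no one on the network may share

def report_spam(url):
    """A node at the edge (a user) casts one 'this is spam' vote."""
    spam_reports[url] += 1
    if spam_reports[url] >= SPAM_REPORT_THRESHOLD:
        # The center enforces the edge's verdict for everyone:
        # once blocked, no user on the network can share the link.
        blocked_urls.add(url)

def can_share(url):
    return url not in blocked_urls

# A coordinated group the size of the threshold is enough to censor
# a link for the entire network, whatever the other users think.
for _ in range(SPAM_REPORT_THRESHOLD):
    report_spam("http://strike-site.example.org")

print(can_share("http://strike-site.example.org"))  # False
```

The point of the sketch is the asymmetry: each individual report looks like neutral quality-assurance feedback, but the aggregate is enforced with all the power of the center.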
So there is my project as a series of concentric circles. At its core, it is a journalistic enterprise, documenting what I believe to be an emergent dynamic between users and technology. But that dynamic operates within a larger context, not only of why information matters but of how this dynamic constitutes an entirely new configuration of user power within networked social intermediaries.
June 15th, 2012 on 11:37 pm
Something similar happens on Tumblr. There, when you block someone, you get the option to report them for spam or harassment, and it seems that when a user accumulates a certain number of such reports, their account is (apparently automatically) semi-deactivated (we call it “ghosted”): their posts won’t appear on public tag pages, their messages aren’t received, their notes don’t show up anywhere, and people won’t notice when the ghosted user follows them. They’re effectively secluded from the rest of the community, and only their previously existing followers know they exist.
It’s a good way to take care of spammers, I’m sure, but too many folks try to take advantage of the system to silence people who say stuff they don’t like. Lately I’ve been seeing at least one post every couple of weeks where someone makes a public call to block and report a specific user for harassment. They never say it’s for the purpose of getting that user ghosted, but it’s implicit; otherwise they’d just block the user and not invite others to dogpile on them. Serious offenses like (actual) hate speech and aggressive bullying of minors should of course be reported, I won’t argue that at all, but most of the time it’s like “this user disagrees with me on political issues, block for harassment” or “this user insulted me and hurt my feelings, block for harassment” or “this user is saying things I’m uncomfortable with, block for harassment”. And it’s not an empty threat either: I’ve seen plenty of otherwise okay people get their accounts semi-deactivated because they offended someone. It’s not a rare occurrence.
June 18th, 2012 on 6:38 pm
Fascinating topic, I’m looking forward to reading more. Have you considered researching the similarities and differences between how user generated censorship operates in the online realm and how a subject or idea might become taboo in the offline world? Both are proxies for the judgement of a community, I think, but the examples of user generated censorship you’ve given are clearly more strategic and organised.
Once again, it’s a thrilling idea. Best of luck with your graduate studies. I’m about to leave my full time job in Australia to do an MA at Columbia; it’s inspiring to see your direction.
June 18th, 2012 on 11:28 pm
Thanks for the thoughts, Fergus. I’ve thought a bit about this taboo problem. My sense is generally that these sorts of taboos fall more closely under Foucault’s idea of repression: the things that are unsaid and unthought. This is a bit different, I think, because here things are said and thought but then removed through algorithmic intervention.
But I’m glad you like the idea, and good luck in NYC!