Charismatic Megatrolls

What we miss when we focus only on the hardest problems in online toxicity

John Bauer's illustration of Walter Stenström's The Boy and the Trolls, or The Adventure, from Among Pixies and Trolls, a children's story anthology, 1915. Via Wikimedia Commons.

I just read a characteristically wonderful essay from Whitney Phillips and Ryan Milner in which they talk about the need for ethics, not civility, on our online platforms. They used an analogy/framework that makes a ton of sense. Here's what they wrote:

When considering how ethical reflection can cultivate civility and help stymie information disorder, biomass pyramids provide a helpful, if unexpected, entry point.

In biology, biomass pyramids chart the relative number or weight of one class of organism compared to another organism within the same ecosystem. For a habitat to support one lion, the biomass pyramid shows, it needs a whole lot of insects. When applied to questions of online toxicity, biomass pyramids speak to the fact that there are far more everyday, relatively low-level cases of harmful behavior than there are apex predator cases — the kinds of actions that are explicitly and wilfully harmful, from coordinated hate and harassment campaigns to media manipulation tactics designed to sow chaos and confusion.

When people talk about online toxicity, they tend to focus on these apex predator cases. With good reason: these attacks have profound personal and professional implications for those targeted.

But apex predators aren’t the only creatures worth considering. The bottom strata is just as responsible for the rancor, negativity, and mis-, dis- and mal- information [MDMI, as Seema Yasmin calls it] that clog online spaces, causing a great deal of cumulative harm.

This is totally right, and it got me thinking about all the conversations I've personally been having in the MDMI space, where there's a strong gravitational pull toward the hardest problems: Russian trolls, state-sponsored botnets, violent white supremacists, etc. What's frustrated me about these conversations is how they suck up all the air in the room when, for those of us who've been studying and critiquing internet dynamics for a while now, the problems are much more systemic.

And now, thanks to their framework, I suddenly thought up a term that encapsulates this: charismatic megatrolls. They're what Phillips and Milner call the "apex predators" of the online toxicity world, and they consequently draw attention, resources, and funding away from the deeper systemic issues, such as design, interactions, algorithms, interfaces, and business models, that enable MDMI to exist in the first place. Some of these are innocuous at first and then manipulated, while others are toxic from the outset and then amplified by the megatrolls.

Here's what else they wrote about the "bottom strata" of the biomass pyramid. It appeared as a single paragraph, but I'm bulleting it out because I think it's a valuable and important list:

* posting snarky jokes about an unfolding news story, tragedy, or controversy;

* retweeting hoaxes and other misleading narratives ironically, to condemn them, make fun of the people involved, or otherwise assert superiority over those who take the narratives seriously;

* making ambivalent inside jokes because your friends will know what you mean (and for white people in particular, that your friends will know you’re not a real racist);

* @mentioning the butts of jokes, critiques, or collective mocking, thus looping the target of the conversation into the discussion;

* and easiest of all, jumping into conversations mid-thread without knowing what the issues are.

* Regarding visual media, impactive everyday behaviors include responding to a thread with a GIF or reaction image featuring random everyday strangers, or posting (and/or remixing) the latest meme to comment on the news of the day.

One of their final paragraphs is the kicker (emphasis mine):

The biomass pyramid shows that the distinction between big harm and small harm is, in fact, highly permeable. The big harms perpetrated by apex predators are exactly that: big and dangerous. Smaller harms are, by definition, smaller, and on their own, less dangerous. But the harm at that lower strata can still be harmful. It is also cumulative; it adds up to something massive. So massive, in fact, that these smaller harms implicate all of us — not just as potential victims, but as potential perpetrators.

One of my go-to analogies for the internet is that it's like the open ocean: exciting, expansive, and filled with hidden danger. But focusing on the large, spectacular dangers misses the fact that the ecosystems themselves can be quite dangerous, or can contain the very conditions that enable the big-harm problems. As with charismatic megafauna, focusing on charismatic megatrolls (or charismatic mega trolls?) captures headlines, wins awards, and raises funds for big projects; they have a lot of symbolic value for people trying to lessen MDMI in our online spaces. But that focus can come at the cost of exploring the deeper issues. As Nat Gyenes and I have written, we need to look at ecosystems, too (more on that soon).

Read the entire article from Whitney and Ryan here: The Internet Doesn’t Need Civility, It Needs Ethics.

