Intellectual denial of service attacks

We live in an era that devalues conformity, while simultaneously preserving it in many interesting ways. Everyone is allowed to have an opinion. Divergent views produce conflict, however, and disagreement, argument, and debate define our current moment.

If we merely disagreed on matters of taste – our favorite color, music, movies, etc. – we could avoid such conflicts. Increasingly, though, we disagree on more fundamental ideas. Some deny the spherical shape of the Earth and the heliocentric model of the solar system (I highly recommend Behind the Curve, a movie about this movement). Arguments of all shapes and sizes spring up everywhere: capitalism vs. socialism, humanity’s role in climate change, on and on.

The democratization of virality amplifies these disagreements. Previously obscure ideas can quickly become widely known. Competing ideological camps endlessly try to score points on one another. The internet rewards this behavior with fame and other social capital. Various forms of what I’ll call “intellectual denial of service” act to reinforce this dynamic. I’ll describe one of these attack vectors in this post.

Bad infinitum

Say that you stumble upon an idea, X, that contradicts widespread consensus views. X explains something you previously didn’t understand or doubted, in a way that now makes perfect sense. The consensus believers have their own idea, Y. They may have degrees in a relevant field, popular best-selling books, or any number of other indicators of social cachet and expertise.

You take your idea, and you present it to one or more of them as a challenge: “here is why you’re wrong about Y.” They’re likely to respond indignantly, as you’ve just attacked their competence and expertise, perhaps even their livelihoods. Sadly, defensiveness rarely produces the best arguments.

They might quote-tweet you with a snarky comment: “get a load of this rube.” Or maybe they simply ignore you. If you’re lucky, they will attempt to engage with you and present evidence for their beliefs that contradicts your premises. This can go back and forth for a while, but it seldom ends with someone changing their mind.

Once we have publicly attached our name to an idea, the path of least resistance is to continue believing it. We love to preserve self-consistency and hate to admit when we’re wrong. [Insert obligatory list of cognitive biases like confirmation bias, the Dunning-Kruger effect, the availability heuristic, etc.]

Now you each go your separate ways, likely more convinced of your respective positions than before. Triumphant, you post a tweet about your idea and go to bed. When you wake up, you see that others have found your idea compelling. They, too, have decided to confront the other side with X and evidence supporting it. The X crowd inundates the Y crowd with more demands for evidence and challenges of their expertise.

The Y’s may start off by responding politely to each challenger, but they will run out of energy at some point. They’ll say “I’m done talking about X,” or merely shut down and stop responding. Tired of treading over the same ground repeatedly, they simply give up in exhaustion.

This is the ultimate coup! The opposing army has thrown down its arms and the castle is undefended! The conversation becomes more and more one-sided. Lots of proponents of idea X shouting on one side, annoyed silence or open hostility on the Y side.

The bad infinitum cycle has started. As experts in the Y camp become increasingly defensive and hostile, the X camp gains prominence through attrition. Non-experts deride experts as weak, corrupt, or misguided. A feedback loop forms: pro-X people attack the experts, who eventually get exhausted and give up. The pro-X people present this as further evidence for X. More people flock to X based on this supposed victory, and so on. New X proponents rehash the same arguments over and over again, frustrating and bogging down the Y’s. Returning to harmony requires breaking this feedback loop.

Alberto Brandolini, originator of the bullshit asymmetry principle

Asymmetric warfare

This dynamic contains an important asymmetry. Far more people can grok simple, intuitive (but wrong) ideas than can grasp nuanced and complex ones in any given field. Something like a Pareto distribution can form, with maybe 20% of people having 80% of the understanding and expertise. Further, if you believe in the Dunning-Kruger effect, experts become increasingly unsure of their expertise as they gain more of it, weakening their defenses.

Quickly, the 80% can overwhelm the 20% with demands for explanations and evidence. Such demands require little effort, while placing a large burden on the other side to carefully craft a counter-argument and assemble available data. Once assembled, counter-arguments may be misinterpreted or simply ignored (remember, this is a conversation between experts and non-experts), representing wasted effort on the part of the expert.
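To make that asymmetry concrete, here is a toy sketch in Python. Every number in it is invented purely for illustration: the point is only that when demands are cheap and careful rebuttals are expensive, a small crowd of challengers exhausts an expert long before they run out of demands.

```python
# Toy model of the effort asymmetry: demands are cheap, rebuttals are costly.
# All effort numbers are invented purely for illustration.

CHALLENGE_COST = 1   # effort for a non-expert to demand evidence
REBUTTAL_COST = 20   # effort for an expert to craft a careful reply

def remaining_energy(expert_budget, challenges):
    """Return (energy left, replies given) after a stream of challenges."""
    answered = 0
    while answered < challenges and expert_budget >= REBUTTAL_COST:
        expert_budget -= REBUTTAL_COST
        answered += 1
    return expert_budget, answered

# Ten challengers spend 10 units of effort between them...
challenger_effort = 10 * CHALLENGE_COST
# ...and exhaust a 100-unit expert after only five replies.
expert_left, replies = remaining_energy(expert_budget=100, challenges=10)
print(challenger_effort, expert_left, replies)  # 10 0 5
```

The ratio of `REBUTTAL_COST` to `CHALLENGE_COST` is the asymmetry Brandolini's law gestures at; tilt it further and the expert gives up even sooner.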

Every minute spent refuting X takes away energy that could be spent refining Y. Experts, rejecting this bargain, concede the commons to non-experts and retreat to their own insulated communities. The X crowd cheers its victory, and Veritas, the goddess of truth, weeps.

Anyone familiar with internet denial-of-service attacks may recognize the similarity: as unproductive traffic overwhelms available bandwidth, productive requests time out. As experts waste intellectual horsepower refuting bad arguments, they have less and less time for more productive endeavors.

Bad infinitum is but one form of this attack, and I will cover others in future posts. I hope you’ll bear with me, as I’m likely to spawn even more neologisms in the process 🙂

Part 2: The Map to Nowhere

21 Replies to “Intellectual denial of service attacks”

  1. Hey Charles, thanks for the nice, thought-provoking write-up. The image is of Alberto Brandolini, from XP2014, Rome.



  2. Brilliant! The denial-of-service analogy is so true. There will probably be an uneducated group that will refute the existence of DOS attacks! 😉


  3. lokendra sharma March 3, 2019 at 3:44 am

    Let’s say the internet denial of service is controlled by a hacker who owns bots, and is the mastermind. It is possible that in an intellectual denial of service by non-experts there is an expert mastermind.


    1. There exist botnets of people, with the command and control channel being social media.


      1. Social botnets, aka cults.


  4. You’re mistaken. Why? You assume that “experts” (as a social group) are the same as “experts” (as the holders of truth). Examples from history: “Experts” believed in the geocentric theory and actively punished heliocentrists. “Experts” believed that washing hands before surgery was unnecessary. “Experts” believed eugenics was the right way to go. “Experts” believed that smoking tobacco was healthy. And so on throughout human history. In effect, you mistake the “social acceptance” scale for the “real truth” scale. It is manipulation, because those are two metrics, not one.


    1. Far from believing that experts are always right, what I wanted to convey with this post is that there are unhelpful ways of expressing disagreement with experts. I’m well aware of many failures of experts: “scientific” agricultural policies leading to famines and misery in many places in the 20th century, financial models that ignore hard-to-quantify risks, on and on.

      To challenge the experts, you have to be capable of interpreting what they’re telling you and building on top of it, rejecting only the parts that are _wrong_ and not merely rejecting experts out of hand as elitist/etc. The failure mode I describe here is a stubborn insistence on denying their expertise without seriously engaging with it.


  5. This is great. And I think an essential part of really understanding the information economy. Misinformation and flawed beliefs are key elements, but so are the flaws in our systems of exchange and resolution. Many incentives are aligned against collective progress and a wiser humanity.


  6. Interesting how you write about people who actually believe in X.

    My favourite example of this kind of attack is the whole “there are no Russian troops in Crimea” thing. Putin said it over and over again, it was repeated online thousands of times, and yet years later when asked about it he laughs and says “of course there were troops in Crimea”.

    It doesn’t matter that the statement is clearly false simply because it takes more effort to refute it. And then when our world is filled with plain mistruths, the truth becomes indistinguishable from the background noise.


  7. *They* say the first step on the road to success (or the road less taken) is identifying a problem, and you’ve done a marvelous job in summarizing such a pox. I tried and failed to do the same before, calling the phenomenon “bar sinister diatribes”. Something especially curious, though, is that the old adage of “agreeing to disagree” doesn’t exist in practice, to the extent of being absolutely meaningless. It’s not just that parties X, Y, and Z disagree; they are obliged to then disparage one another with the ultimate aim of either converting or totally destroying differing ideas. I think that online we do not truly accept differences, and our respective media bubbles, collegiate safe spaces, and social media echo chambers all support this. Maybe this is an older, deeper psychological issue enabled greatly by communicating without pheromones, as folks like Spanish film-maker Nicolas Alacala have suggested.


  8. I feel lucky in that (1) I remain significantly unknown on the Internet and (2) I don’t manage to get my back up, or get offended, by what other people say, and how they say it, on the Internet. The prevalence of misunderstanding, running bidirectionally, is, er, prevalent. If more of us manage to take more deep breaths (as necessary) and more steps back, we might be able to nip this DOS in the bud—

    Well, no. Probably not. It’s just a lazy, rainy, Sunday-afternoon thought.

    Carry on.

    (($; -)}™


  9. […] working my way through a couple of texts and a travel piece, but I wanted to briefly comment on this blog post by Techiavellian.  Whether you know what a denial of service attack is or you don’t, he makes a couple of […]


  10. I wonder how many smug people will read this and automatically apply it to confirm their bias against people who dispute that man-made activity releasing CO2 into the atmosphere will imminently lead to a global apocalypse which can only be prevented through global governance and massive implementation of artificial scarcity.

    Flat earthers and ‘electric universe’ types are another set of fools altogether. It really depends on what kind of bullshit you are talking about.

    There is defensible bullshit, and then there is outrageously stupid bullshit. And then there is critical thinking delivered in an off-the-cuff fashion, combined with basic wisdoms, that is easily dismissed or smeared by others as bullshit. And eventually we like to pray to the goddess Veritas that the truth prevails. Funny that in Greek mythology the spirit of truth was possibly born of Prometheus, a god of technology and light.


  11. Excellent characterization of how online (and offline) deliberation is currently suffering some major challenges!

    If you get the chance, check out our white paper on the Canonical Debate. Both of your DOS attacks reference the problem of needing to support or refute arguments multiple times. The Canonical Debate is all about de-duping arguments so we can spend our time on new information, rather than on repetition.


  12. Chavista rhetoric (Venezuela). “The empire is evil,” but they love to travel there and their children live outside the country, in the empire. “We have the best health system,” but they get treated in other countries. “Being rich is bad,” but the accusations of corruption and theft of state resources are endless. So much was refuted that, at last, those who believed in that discourse no longer applaud it like trained seals. The theory worked.


  13. “If you believe in the Dunning-Kruger effect”: Good call.

    Going back to the original paper, the graphs are very different from what popularizers present. I am not a statistician, but it is hard for me to see any difference between the actual measurements presented there, and ordinary regression to the mean. The incompetents were more certain than they should have been, and the experts less, but the slope was still positive. Ignorant pigheadedness abounds, but it is not clear that Dunning-Kruger accounts for much of it.

    Details aside, I am happy for evidence of someone thinking deeply about this problem. It may be the challenge of our age, that must be solved before we can tackle global climate disruption. May heaven have mercy on us.


    1. I didn’t want to trot out the same well-worn list of cognitive biases as definitive, but enough people believe strongly in them that I decided to include several. Caveat emptor 🙂


  14. Reminds me of this observation by Isaac Asimov:

    Anti-intellectualism has been a constant thread winding its way through our political and cultural life, nurtured by the false notion that democracy means that ‘my ignorance is just as good as your knowledge.’


  15. “humanity’s role in climate change”

    This is the dog whistle right there.

    No one is really debating humanity’s role in it; people are debating how supposedly catastrophic this change is, regardless of humanity’s role in it. And catastrophe is, of course, a prediction of the timing and magnitude of change in many places, averaged into an overall prediction for the whole world.

    The climate of the ‘world’ is a complex system, and there are many, many people who rationally and sensibly oppose baseless, nonscientific predictions of total destruction and catastrophe resulting from ‘global’ climate change over the next 50 years, let alone the next few centuries.

    There is nothing at all scientific about many of the predictions of total disaster occurring soon. They are actually not even ascientific; they are worse: they are religion masquerading as science. You can tell this by the way people are led toward reaffirmation of their eschatological thinking, a psychological pattern and social mess often seen in the wake of many religious movements and cults.

    ‘The world is ending’ eschatological cults often rise up in the face of many legitimate problems and challenges facing humanity. From diseases, to eclipses, to unexplained or hard-to-explain phenomena of all types, you get eschatological cults led by a worshipper celebrating doom and preaching nothing but the same old answer: do as I tell you to do, not as I do.

    Cults are about control; that is their calling card. And someone dropping their dog whistle on this makes me very suspicious of their otherwise reasonable arguments about ‘intellectual DDoS’. Perhaps this dialogue contains a little bit of an apology for cultish, unintellectual thinking in itself, and is guilty of the very accusation it chooses to hurl at others.

    And as for experts, they have their place. However, if you read Nassim Taleb’s Antifragile: experts are used to justify systems, like the one we have now based on creating limitless amounts of debt and debasement cycles, and are simply paid to toe the line whether they know it or not. So experts are part of justifying the problem and hiding it by jargonizing, distracting, and flat-out lying much of the time. If you don’t believe we have a choice but to do that to ourselves, then you think the problem is so bad we cannot solve it, or you don’t want to solve it. Otherwise, this disservice can be easily remedied by accepting that experts can be totally full of malarkey at times and are a huge problem.

    This is doubly so when they opine on complex, unpredictable systems whose predictive outcomes may influence the system itself; in particular, economics and finance experts’ predictions can be self-serving lies.

    Of course, experts in every industry have something to gain, but some areas are more prone to bias than others.


  16. […] [16] See also the work of Alberto Brandolini, originator of the bullshit asymmetry principle; Brandolini’s law emphasizes the difficulty of debunking bullshit and, development of the ‘intellectual denial of service’ concept and ‘bad infinitum,’ “a tendency for non-experts to overwhelm experts with repetitive costly, and often unproductive demands for evidence or counter-argument to oft-debunked or misleading claims.” ( […]

