When Isaac Asimov penned "A Cult of Ignorance" in 1980, he was documenting what he saw as a troubling American pathology. What he couldn't have fully anticipated was how this cult would evolve into something far more insidious: a deliberately engineered system of mass confusion that serves as the perfect substrate for an emerging technofeudal order.
Asimov identified the core problem with surgical precision: "The strain of anti-intellectualism has been a constant thread winding its way through our political and cultural life, nurtured by the false notion that democracy means that 'my ignorance is just as good as your knowledge'". But what was then a cultural tendency has since been weaponized and systematized in ways that would have horrified even a pessimist like him.
The cult that Asimov described has transformed into an "industry of ignorance". It's no longer just a cultural strain but a carefully marketed product. The anti-intellectual sentiment that once bubbled up organically has been captured, refined, and fed back to us through algorithms that are designed to maximize engagement rather than understanding.
The phenomenon of hyperindividualization has supercharged this transformation. Asimov's critique centered on the notion that "my ignorance is just as good as your knowledge", a fundamentally individualistic claim that rejects communal standards of truth and expertise. Today's digital landscape has weaponized this tendency to devastating effect.
What's particularly perverse about our current moment is that the very technological tools that could have realized Asimov's dream of universal access to knowledge have instead been deployed to create epistemological chaos. The internet didn't make us all "members of the intellectual elite" as Asimov hoped; it created siloed realities where facts became matters of tribal identity rather than empirical reality. Our media ecosystems now optimize for personalized content delivery, leading to what we might call "personalized epistemologies" where even basic facts differ between individuals based on their digital consumption habits.
The polycrisis as a feature, not a bug
The polycrisis thrives in this setting of manufactured confusion and hyperindividualization. When Asimov wrote about people who "can't read and don't read", he was describing a largely passive problem of his time. Today's broken information systems actively discourage coherent understanding of the complexity around us while atomizing us into isolated information silos.
The polycrisis isn't just happening alongside this mass confusion; it's enabled by it. Climate denialism doesn't need to convince people that climate science is wrong. It just needs to create enough doubt to prevent coordinated action. Authoritarians don't need to persuade you of their rightness; they just need you to believe that truth itself is unattainable.
Here, the tech oligarchs present themselves as enlightened problem-solvers, but their business models fundamentally depend on the confusion Asimov warned against and the hyperindividualization that has followed. What is surveillance capitalism if not the exploitation of people who don't understand the value of their own data? The commodification of knowledge treats information not as a collective good but as a consumer product to be selected based on personal preference and comfort. As Asimov observed, many take the view that they are "better off without any of that tripe", like science and mathematics.
The supreme irony is that even as AI promises to make knowledge more accessible, it ends up further eroding the conditions for actual understanding. AI systems trained on the collective output of human knowledge now regurgitate plausible-sounding nonsense that blurs the distinction between expertise and bullshit. When Asimov lamented that "hardly anyone can read", he was worried about literacy. Today's problem is that we're drowning in content while starving for meaningful comprehension. We've built machines that can produce infinite text but haven't solved the human problem of discernment.
Reclaiming knowledge
What would Asimov think of this today? He'd likely see that his warning wasn't dire enough. We need to recognize that the degradation of shared knowledge isn't accidental but a structural feature of late-stage capitalism. The tech bros who position themselves as our cognitive saviors are often the very ones who profit from our confusion and isolation. Their platforms are designed not to elevate the intellectual capacity of humanity but to capture and monetize our attention regardless of the social cost. They've fractured the social learning mechanisms that Asimov argued were essential.
Addressing this requires creating social structures that actually enable collective intelligence, and treating information not as a commodity to be enclosed but as a commons to be protected. Asimov was right that "every human being with a physically normal brain can learn a great deal and can be surprisingly intellectual". But he underestimated the forces that would align against this possibility. The cult has become an industry, the industry has become an empire, and the high priests of this new order are men who build rockets while dismantling the social foundations of shared understanding.
If there's hope to be found, it's in the persistent human desire to comprehend the world rather than merely consume information about it. This desire remains our most potent weapon against both the old cult of ignorance and its new incarnation. The question is whether we can organize this desire into something that challenges not just individual ignorance but the structural conditions that make ignorance so profitable for those at the top.