
The Acid and The Glue: Science and the Paradox of Progress

  • didiermoretti
  • Dec 28, 2025
  • 14 min read

Updated: Jan 8



For 2,000 years, Europe believed women had fewer teeth than men. Why? Because Aristotle said so, and nobody bothered to count. Questioning Aristotle was heresy. (1)


That is how most of human history worked. The ancients—Aristotle, Confucius, the Prophets—knew everything worth knowing. Our job was to preserve their wisdom, not question it. Progress wasn't a consideration because the golden age lay in the past.


Then something extraordinary happened. We developed Science, a systematic process for discovering truth through empirical testing of claims about how the world works. Like an acid, Science dissolves false claims and dogma to reveal empirical facts (women don't have fewer teeth, smoking does cause cancer, the sun doesn't orbit the Earth).


But here's the problem: civilization also needs shared myths and beliefs to function. Science (the acid of empirical truth) requires questioning everything. Myth (the glue of belief) demands faith in something; it is the necessary social glue that enables people to cooperate at scale.


These two forces are fundamentally at odds, and they are unstable in proximity. When the acid corrodes indiscriminately, it attacks the social myths we need. When the glue hardens and resists all questioning, it suffocates adaptation. The miracle of the last 400 years is that we kept them in a productive but incredibly shaky balance. This gave us steam engines and vaccines, but it also upended economies, toppled kings, and transformed progress from hope into expectation.


The crisis today is that we're losing the discipline required to sustain this balance.


This is the story of how we got here—how we learned to apply the Acid and the Glue selectively, and what happens when they turn against each other.



The Antecedents: Giants in the Dark

It is a common error to believe that science sprang fully formed from the minds of Francis Bacon and Galileo in the 1600s. In reality, the "method" had been flickering in and out of existence for millennia, usually suffocated by the heavy blanket of dogma. As with so much in human culture, this was a cumulative effort across civilizations.


Babylon and Egypt achieved sophistication in observation and prediction. The Babylonians kept daily astronomical diaries for seven centuries (747 BCE–61 BCE), the longest continuous data set in history. They predicted eclipses with remarkable accuracy. Yet these efforts yielded no explanatory theories; knowledge was owned by priests keen on preserving cosmic order, not questioning it.


Ancient Greece: Aristotle was brilliant but clearly not a scientist by modern standards. He did insist knowledge should come from observing particulars, then generalizing. He just rarely bothered with the experimentation part. He believed heavier objects fall faster—he never dropped two stones to check.


Tellingly, the greatest minds of medieval Europe spent their energy reconciling Aristotle with Christian theology, not testing whether his claims were true. John Philoponus, a 6th-century Byzantine scholar, actually performed the falling-stone experiment and published results contradicting Aristotle—1,000 years before Galileo. (2) The data was ignored. Aristotle's authority prevailed over empirical evidence, a pattern that would echo through centuries.


Still, the Greeks laid important foundations in mathematics, logical reasoning, classification, and observation, which the Islamic world would then build upon.


The Islamic Golden Age: Key contributions to the modern scientific mind came from the Arab world. From the 8th to the 13th centuries, the real action was in Baghdad, Cairo, Cordoba, and Samarkand.

  • Ibn al-Haytham (Alhazen) wrote the Book of Optics, in which he described something remarkably close to the modern scientific method, made breakthroughs in optics, and gave the first rigorous analysis of the camera obscura. (3)

  • Al-Khwarizmi gave us algebra (the word derives from his treatise on al-jabr) and his name to the algorithm. Al-Biruni calculated the Earth's circumference with remarkable accuracy for his time, using trigonometry and the dip of the horizon (a sketch of his method follows this list).

  • Ibn Sina (Avicenna) pioneered evidence-based medicine and experimental methodology and systematized medical knowledge into an encyclopedia. Al-Razi (Rhazes) arguably conducted the first controlled medical trials in history. Their books became part of the medical curriculum in Western universities. (4)
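
As an aside, Al-Biruni's horizon-dip method is simple enough to reconstruct. Here is a minimal sketch in Python; the hill height and dip angle are commonly cited reconstructions of his figures, assumed here for illustration rather than taken from his text:

```python
import math

# Al-Biruni's horizon-dip method (a sketch; the input values below are
# assumptions based on commonly cited reconstructions of his measurements).
# From a hill of known height h, the sea horizon appears dipped below the
# horizontal by a small angle theta. The tangent-line geometry gives:
#   cos(theta) = R / (R + h)   =>   R = h * cos(theta) / (1 - cos(theta))

h = 305.0                      # hill height in metres (assumed)
theta = math.radians(34 / 60)  # horizon dip of 34 arcminutes (assumed)

R = h * math.cos(theta) / (1 - math.cos(theta))
print(f"radius ≈ {R / 1000:,.0f} km, circumference ≈ {2 * math.pi * R / 1000:,.0f} km")
# Prints roughly 6,236 km and 39,183 km, within about 2% of modern values.
```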


But these sparks lacked institutional tinder. Islamic learning relied on royal patronage; when the patronage dried up due to Mongol invasions or political fragmentation, the infrastructure collapsed. Unlike European universities, which developed legal rights that made them somewhat independent of rulers, Islamic scholarship remained tethered to the state, vulnerable to the winds of dogma.


Why Not China? By any objective metric, China should have led the world in science. Throughout the Middle Ages, China was the technological hyperpower, gifting the world the compass, gunpowder, papermaking, and printing. It had a sophisticated bureaucracy and a unified empire while Europe was a muddy backwater of warring tribes.


Part of the answer lies in the distinction between Technology (solving practical problems) and Science (understanding abstract laws). China mastered technology to address specific needs. But the Chinese intellectual tradition, dominated by Confucianism, focused on social order, not abstract laws of nature.


Crucially, the relationship between knowledge and power differed. In China, knowledge was the domain of the scholar-bureaucrat, whose authority rested on mastery of classical texts. Innovation came from craftsmen who had low social status. The people with prestige weren't studying nature, and the people studying nature lacked prestige. Furthermore, the Chinese state was a monolith. If the Emperor decided that ocean exploration was a waste of money (as happened in the 1430s), the ships were decommissioned and allowed to rot. There was no alternative center of power to continue the work. (5)


Europe's Lucky Break

In 17th-century Europe, a rare alignment of circumstances created the conditions for a lucky break.


The discovery of the Americas shattered ancient authority (Aristotle knew nothing of these continents; what else was he wrong about?). The Reformation shattered Church authority. The printing press shattered the monopoly on information. And Europe's political fragmentation meant that new ideas, even heretical ones, had places to hide and grow. Early successes (navigation, mining) attracted patronage, which funded instruments, which enabled further discoveries, which attracted more patronage, creating a positive feedback loop. What made Europe different may have been less cultural superiority than historical accident compounded by self-reinforcing success: a contingent path, not an inevitable destiny.


Into this breach stepped the thinkers who synthesized prior work and firmly established the scientific method:


  • Francis Bacon (The Empiricist): Argued for data over logic. He declared "Knowledge is power," reframing science as a tool to master nature rather than contemplate divine wisdom. (6)

  • René Descartes (The Skeptic): Championed the idea that nature operates like a vast clockwork. His Discourse on Method (1637) articulated a systematic approach: doubt everything, break problems down, and use mathematics to model phenomena.

  • Galileo Galilei (The Measurer): United observation and mathematics, the two strands Bacon and Descartes championed, to defend Copernicus' heliocentric theory. Galileo's crime was not claiming the Earth moved but claiming he could prove it with his own eyes, independent of scripture.

  • Isaac Newton (The Synthesizer): His Principia Mathematica (1687) demonstrated that the universe followed knowable, mathematical rules.


As Yuval Noah Harari puts it, this was a Revolution of Ignorance. (7) Pre-modern maps were filled with dragons to hide ignorance. Modern maps left vast empty spaces, a psychological invitation: come and find out.


This admission of ignorance was the engine of the "Scale Amplifier" we call Learning. Once you admit you don't know, you are free to look. And once you look, you find. This is precisely why the scientific method scales so effectively. It turned learning from a preservation act into a discovery engine.


But geniuses die; institutions survive. A critical breakthrough was the creation of the Royal Society of London (1660). Its motto was Nullius in verba—"Take nobody's word for it"—a direct rejection of the medieval reverence for ancient authority. The Society pioneered the peer-reviewed scientific journal (Philosophical Transactions, 1665). Ideas could be published, critiqued, and replicated. This turned individual genius into cumulative knowledge. Similar academies followed in Paris and Berlin. The social practice of science emerged—not just a method, but a community organized around that method. Yet this triumph created an unexpected tension.


The Great Paradox: The Acid and the Glue

Here we arrive at the tension that haunts us to this day. Civilization requires two opposing forces to function.



To cooperate at scale, we need Myths (shared beliefs) and Values—nations, religious narratives, money, and human rights. These act as "social glue," allowing millions of strangers to trust one another and work toward common goals. Without shared beliefs, there is no basis for cooperation beyond kinship or direct reciprocity.


But to advance technology, we need Science (shared reality). Science acts as an acid, dissolving any belief that can be proven wrong by empirical testing. It questions everything. It privileges evidence over authority. (8)


Like any acid, science draws its power from selectivity. Hydrochloric acid dissolves steel but not glass. Science dissolves false claims about how the world is, without necessarily dissolving claims about what matters or how the world ought to be. (9)


Science cannot—by itself—tell us whether justice is more important than mercy, whether beauty has intrinsic value, or what makes a life worth living. This doesn't mean science is irrelevant to morality. Science can demolish moral claims built on false facts ("homosexuality is unnatural", "women are inferior", "capital punishment deters crime"). And once we state our values, science reveals whether our actions achieve them (this policy will increase suffering; that one won't). But the values themselves—the choice of what to care about—are ones we derive from philosophy, culture, and lived experience. (Humans have deeply held shared values, as we saw in Homo: an Ultra-Social Animal.)


This balance is inherently unstable. Maintaining it requires constant adjustment. History provides instructive examples with widely different outcomes.


When the Acid Spills Over: it can dissolve the myths that hold society together, leading to nihilism or social fragmentation. The French Revolution demonstrated this vividly. Enlightenment rationalism dissolved the absolute authority of the Church and King—a genuine achievement. But it kept dissolving, attacking guilds and local traditions until nothing remained but abstract 'Reason' and the guillotine.


More recent examples of overreach are scientism and attempts to use Science to define what is ultimately good: eugenics (which led to the forced sterilization of those deemed "inferior") and Social Darwinism (applying natural selection to social groups, often used to justify racism, inequality, and imperialism).


The acid of Science and the glue of Philosophy need each other. The acid needs something to preserve (values, meaning, purpose). Moral reasoning requires more than just factual premises; it necessitates a framework of values and ideals. Likewise, the glue needs something to prevent it from hardening into delusion (empirical testing).


When the Glue Hardens into dogma, the results are equally catastrophic. Stalin's USSR chose ideological purity over empirical reality. Trofim Lysenko’s agricultural theories fit Marxist dogma but contradicted genetics. Stalin declared genetics 'bourgeois pseudoscience.' Crop yields collapsed and famines killed millions.


Nikolai Vavilov, Lysenko's former mentor and one of the world's greatest botanists, was imprisoned for defending genetics. He died of starvation in 1943 in a Soviet prison, ironically while studying famine-resistant crops. The man who could have saved millions starved to death because ideology trumped evidence. Meanwhile, in the West, the same genetic science that the Soviets rejected as ideological poison enabled the Green Revolution, which saved hundreds of millions from starvation.


Nazi Germany made similar mistakes, crippling its own physics program by expelling scientists for the crime of doing "Jewish science." The talent that fled—Einstein, Fermi, Szilard, and others—built the atomic bomb for the Allies instead. Ideological purity made for satisfying rallies but terrible engineering.


Moral philosophy without empirical input is blind; empirical data without moral framing is mute. Philosophy and science need each other. The danger is science claiming to replace morality (scientism) or morality refusing to be informed by science (dogma).


When the Balance Holds: The societies that thrived found ways to keep both forces in productive tension. The US Apollo space program embodied this. When NASA engineers calculated trajectories, no one invoked patriotic slogans—they used physics. The laws of orbital mechanics don't care about flags. But when Kennedy sold the moon mission to the public, he didn't cite delta-v calculations; he invoked national glory and the frontier spirit. The engineers needed truth; the taxpayers needed meaning.


We are constantly walking a tightrope between the useful myths that unite us and the inconvenient truths that save us.

Pitfall: When the Acid Crystallizes. There is another danger, subtler and more insidious: when science itself hardens into unquestionable authority. The scientific method promises liberation through empirical truth, but if that truth becomes as unquestionable as papal decree, the Acid has merely crystallized into a new Glue, replacing one priesthood with another.


History provides stark examples:

  • The Lobotomy Tragedy: In 1949, Egas Moniz won a Nobel Prize for the lobotomy. For two decades, thousands were mutilated despite early evidence of terrible harm. Why? Because questioning the procedure meant questioning the Nobel committee and the medical consensus. The Acid of inquiry had been replaced by the Glue of professional hierarchy. (10)

  • The Tobacco Deception: By the 1950s, the tobacco industry knew smoking caused cancer. Yet they used the apparatus of science—funding studies, publishing in journals—to manufacture doubt. They weaponized the scientific method's demand for skepticism against emerging truth. (11)

  • The Opioid Crisis: More recently, regulatory capture allowed Purdue Pharma to use the veneer of scientific consensus to push addictive drugs. (12)

  • The mRNA Near-Miss: For decades, the biochemist Katalin Karikó was demoted and repeatedly denied grants because the academic consensus held that mRNA therapeutics were unstable and a dead end. The "Glue" of institutional orthodoxy nearly suffocated the technology that would later save millions during the COVID-19 pandemic. Her eventual success wasn't due to the grant system working as intended, but to her sheer stubbornness in the face of it. (13)


The lesson? The scientific method has no built-in immunity to corruption. Left to institutional drift, scientific consensus calcifies just like any other orthodoxy. The human tendencies that created medieval scholasticism—deference to authority and fear of ostracism—don't disappear just because you replace theology with peer review. Scientific progress requires active maintenance of skepticism. Vigilance is exhausting. Institutions are lazy. The method is fragile.


Consequences: The Unintended Revolution

The expected results from the scientific revolution came swiftly. Within a century of Newton's Principia, humanity understood more about the physical world than in all previous history combined. Technology followed, eventually igniting the Industrial Revolution. Serendipity played a role; many discoveries arose from accidents, demonstrating the method’s openness to surprise.


As important as the material achievements, the scientific method fundamentally rewired our views on human society.


The Invention of "Progress": The Scientific Method introduced a linear concept of time. If we know more today than yesterday, the future can be better than the past. This optimism is the bedrock of modern capitalism and liberal democracy. For the first time in history, large numbers of people believed tomorrow could be better than today. This wasn't just psychology—it was economics. Why invest in the future if the golden age lay in the past?


The Challenge to Authority: Science is inherently democratic. In the laboratory, a pauper can prove a prince wrong if he has the data. This corrosive idea leaked into politics. If the King isn't the ultimate authority on physics, why is he the authority on law? If religious texts can be wrong about the movement of planets, what else might they be wrong about? This is the acid/glue paradox in action. It is no coincidence that the Scientific Revolution preceded the Age of Revolutions.


The Fallout: Scientific power also came with costs its pioneers never imagined. The belief that "there's always a technical fix" led to ecological hubris. Fossil fuels and synthetic chemistry created environmental crises that took more than a century to recognize. We treated the atmosphere as an infinite dumping ground because the harm was invisible, diffuse, and delayed. The scientific method struggles to address problems where the experiments take generations to complete.


Meanwhile, scientific superiority became a tool of empire. Superior navigation and weaponry facilitated colonization, while the scientific worldview was weaponized to dismiss non-Western knowledge traditions as "primitive." Science promised universal truth; it delivered unequal power. The irony is that science itself is genuinely universal—the laws of thermodynamics don't care about your culture—but access to scientific education and institutions was anything but universal.


The Paradox Today

The acid/glue tension isn't just historical philosophy—it's the defining fracture of our time.


Consider climate change. The acid has done its work: thousands of peer-reviewed studies, multiple independent lines of evidence, predictive models confirmed by observation. By any rational standard, the empirical case is settled. Yet this empirical truth runs headlong into the glue of economic myths and political tribalism. "Climate science is a hoax" isn't a scientific position—it's a tribal marker, a badge of loyalty. Addressing climate change requires rigorous science and collective belief—a task that demands the acid and the glue work together.


Consider COVID-19. Masks and vaccines became tribal markers rather than public health measures. When scientific recommendations threaten social myths (liberty, distrust of government, religion), the acid meets the glue head-on. The result was a society that struggled to respond effectively while tearing itself apart over competing loyalties.


Most critically, our shared epistemology—agreement on how to determine truth—is under severe strain. We are retreating into tribal universes, selecting facts to reinforce belief rather than test reality. Social media algorithms accelerate this drift by optimizing for engagement, not accuracy: outrage spreads faster than nuance, conspiracy theories cluster in recommendation feeds, and the distinction between fact and fabrication collapses into "what my tribe believes."


This creates a pernicious paradox. In the medieval world, authority was transparent: the Pope claimed truth on faith, Aristotle on ancient wisdom. Today, we fragment into competing realities where everything claims the authority of science. Pharmaceutical companies deploy "studies show." Political tribes cherry-pick data. Internet communities manufacture "alternative facts." The language of empirical inquiry becomes a costume for tribal performance.


The danger isn't just disagreement—it's the loss of a common method for resolving disagreement. A society where nothing is true cannot maintain social trust. But a society where everything masquerades as scientific truth while nothing can be genuinely tested may be worse. At least medieval peasants knew the Pope's authority rested on faith. We've lost even that clarity.


The Question That Remains

The scientific method taught us that progress is possible, that the future could surpass the past through systematic inquiry and accumulation of knowledge. This was revolutionary.


But it created a paradox: progress requires questioning everything, yet civilization requires believing in something.


The scientific method is not a permanent achievement—it rests on a fragile scaffold. It is humanity's most unnatural practice, requiring us to suppress our deepest instincts for tribal loyalty and certainty. It requires us to protect dissenters and fund research that contradicts our beliefs. It means distinguishing between myths that enable cooperation and myths that prevent adaptation—a line that keeps moving as some of yesterday's essential myths become tomorrow's dangerous delusions. And like any discipline, if you stop practicing it, you lose it.


Sustaining this balance demands constant vigilance—skepticism about our own certainties, consensus that remains open to challenge, and authority that doubts itself. This is exhausting and unnatural. It's why the scientific method remains one of humanity's most fragile achievements.


The scientific revolution proved we could do it once. Whether we can sustain it remains an open question.


As always, progress is optional.


Next Article: Coming Soon


(1) Aristotle, The History of Animals (The Internet Classics Archive): "Males have more teeth than females in the case of men, sheep, goats, and swine; in the case of other animals observations have not yet been made."

(7) Yuval Noah Harari, Sapiens: A Brief History of Humankind

(8) Credit to Daniel Dennett for his concept of "universal acid" in Darwin's Dangerous Idea. The "danger" Dennett refers to is the total loss of teleology (purpose). The universe has no goal. We were not the plan; we are just the current result. For Dennett, this isn't depressing—it's liberating. It means we are the only species capable of fully realizing how we got here, and perhaps, defying the algorithm that created us.

(9) The is–ought problem, articulated by David Hume (see: Is–ought problem, Wikipedia). This was later dubbed "Hume's guillotine" to emphasize that any attempt to derive a moral conclusion from purely factual premises is fundamentally flawed, akin to being "beheaded" logically. Eugenics and Social Darwinism take note!

(10) António Egas Moniz won the Nobel Prize in Physiology or Medicine in 1949 "for his discovery of the therapeutic value of lobotomy in certain psychoses." By the 1960s, the procedure was widely recognized as harmful. The Nobel Prize has never been revoked. See: Lobotomy - Wikipedia

(11) Internal tobacco industry documents revealed companies knew of cancer risks by the early 1950s but publicly denied them until approximately 2000. See: "Everyone knew but no one had proof": tobacco industry use of medical history expertise in US courts, 1990-2002

(12) OxyContin was approved by the FDA in 1995 based on a single two-week clinical trial. Purdue Pharma marketed it as having low addiction risk due to controlled-release formulation—a claim not supported by adequate evidence. See: How FDA Failures Contributed to the Opioid Crisis

(13) See Chapter 4 in "Abundance" by Ezra Klein and Derek Thompson.

 
 
