Prior to the invention of writing — which is to say, for more than 90 percent of the time Homo sapiens have existed — people learned mainly by interacting with the world around them. Spoken words helped, of course, but to a considerable degree our distant ancestors must have learned how to hunt and fish, and how to make axes and baskets, by watching their elders do it and trying it for themselves. In short, they learned by doing.
Writing and printing changed that. Books made it possible to learn a great deal without physically doing much of anything.
A new class emerged — the intellectuals.
Being an intellectual had more to do with fashioning fresh ideas than with finding fresh facts. Facts were thin on the ground anyway, so it was easy to skirt or ignore them while constructing an argument. The wildly popular 18th-century thinker Jean-Jacques Rousseau, whose disciples range from Robespierre and Hitler to the anti-vaccination crusaders currently bringing San Francisco to the brink of a public health crisis, built an entire philosophy (nature good, civilization bad) on almost no facts at all. Karl Marx studiously ignored the improving living standards of working-class Londoners — he visited no factories and interviewed not a single worker — while writing Das Kapital, which declared it an “iron law” that the lot of the proletariat must be getting worse. Freud discovered nothing and cured nobody; Marx was a hypocrite whose theories failed in about as hideously spectacular a way as can be imagined. The 20th-century philosopher of science Paul Feyerabend boasted of having lectured on cosmology “without mentioning a single fact.”
Eventually it became fashionable in intellectual circles to assert that there was no such thing as a fact, or at least not an objective fact. Instead, many intellectuals maintained, facts depend on the perspective from which they are adduced. Millions were taught as much in schools; many still believe it today.
Reform-minded intellectuals found the low-on-facts, high-on-ideas diet well suited to formulating the socially prescriptive systems that came to be called ideologies. The beauty of being an ideologue was (and is) that the real world with all its imperfections could be criticized by comparing it, not to what had actually happened or is happening, but to one’s utopian visions of future perfection. As perfection exists neither in human society nor anywhere else in the material universe, the ideologues were obliged to settle into postures of sustained indignation. “Blind resentment of things as they were was thereby given principle, reason, and eschatological force, and directed to definite political goals,” as the sociologist Daniel Bell observed.
While the intellectuals were busy with all that, the world’s scientists and engineers took a very different path. They judged ideas (“hypotheses”) not by their brilliance but by whether they survived experimental tests. Hypotheses that failed such tests were eventually discarded, no matter how wonderful they might have seemed. In this, the careers of scientists and engineers resemble those of batters in major-league baseball: Everybody fails most of the time; the great ones fail a little less often.
So it could be said that humanity ran an experiment, over the past couple of centuries, in two competing approaches — intellectualism and ideology on one hand, science and technology on the other. The stark differences between the two were the subject of C. P. Snow’s influential 1959 essay, “The Two Cultures.”
Their outcomes proved starkly different, too.