I shall here insert a problem of that very ingenious and studious promoter of real knowledge, the learned and worthy Mr. Molyneux, which he was pleased to send me in a letter some months since; and it is this:- "Suppose a man born blind, and now adult, and taught by his touch to distinguish between a cube and a sphere of the same metal, and nighly of the same bigness, so as to tell, when he felt one and the other, which is the cube, which the sphere. Suppose then the cube and sphere placed on a table, and the blind man be made to see: quaere, whether by his sight, before he touched them, he could now distinguish and tell which is the globe, which the cube?" To which the acute and judicious proposer answers, "Not. For, though he has obtained the experience of how a globe, how a cube affects his touch, yet he has not yet obtained the experience, that what affects his touch so or so, must affect his sight so or so; or that a protuberant angle in the cube, that pressed his hand unequally, shall appear to his eye as it does in the cube."- I agree with this thinking gentleman, whom I am proud to call my friend, in his answer to this problem; and am of opinion that the blind man, at first sight, would not be able with certainty to say which was the globe, which the cube, whilst he only saw them; though he could unerringly name them by his touch, and certainly distinguish them by the difference of their figures felt. This I have set down, and leave with my reader, as an occasion for him to consider how much he may be beholden to experience, improvement, and acquired notions, where he thinks he had not the least use of, or help from them. And the rather, because this observing gentleman further adds, that "having, upon the occasion of my book, proposed this to divers very ingenious men, he hardly ever met with one that at first gave the answer to it which he thinks true, till by hearing his reasons they were convinced."
We shall expose the origin and trace the history of general errors, which have more or less contributed to retard or suspend the advance of reason, and sometimes even, as much as political events, have been the cause of man’s taking a retrograde course towards ignorance.
Those operations of the mind that lead to or retain us in error, from the subtle paralogism, by which the most penetrating mind may be deceived, to the mad reveries of enthusiasts, belong equally, with that just mode of reasoning that conducts us to truth, to the theory of the development of our individual faculties; and for the same reason, the manner in which general errors are introduced, propagated, transmitted, and rendered permanent among nations, forms a part of the picture of the progress of the human mind. Like truths which improve and enlighten it, they are the consequence of its activity, and of the disproportion that always exists between what it actually knows, what it has the desire to know, and what it conceives there is a necessity of acquiring.
It is even apparent, that, from the general laws of the development of our faculties, certain prejudices must necessarily spring up in each stage of our progress, and extend their seductive influence beyond that stage; because men retain the errors of their infancy, their country, and the age in which they live, long after the truths necessary to the removal of those errors are acknowledged.
In short, there exist, at all times and in all countries, different prejudices, according to the degree of illumination of the different classes of men, and according to their professions. If the prejudices of philosophers be impediments to new acquisitions of truth, those of the less enlightened classes retard the propagation of truths already known, and those of esteemed and powerful professions oppose like obstacles. These are the three kinds of enemies which reason is continually obliged to encounter, and over which she frequently does not triumph till after a long and painful struggle. The history of these contests, together with that of the rise, triumph, and fall of prejudice, will occupy a considerable place in this work, and will by no means form the least important or least useful part of it.
One property of Bayes’s theorem, in fact, is that our beliefs should converge toward one another—and toward the truth—as we are presented with more evidence over time. In figure 8-8, I’ve worked out an example wherein three investors are trying to determine whether they are in a bull market or a bear market. They start out with very different beliefs about this—one of them is optimistic, and believes there’s a 90 percent chance of a bull market from the outset, while another one is bearish and says there’s just a 10 percent chance. Every time the market goes up, the investors become a little more bullish relative to their prior, while every time it goes down the reverse occurs. However, I set the simulation up such that, although the fluctuations are random on a day-to-day basis, the market increases 60 percent of the time over the long run. Although it is a bumpy road, eventually all the investors correctly determine that they are in a bull market with almost (although not exactly, of course) 100 percent certainty.
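The convergence described above can be sketched in a few lines of code. This is not Silver's actual simulation; it is a minimal reconstruction under assumed parameters: the market rises on 60 percent of days in a bull market (as the passage states) and, by assumption, on only 40 percent of days in a bear market, with the three investors starting from priors of 0.9, 0.5, and 0.1.

```python
import random

def update(prior, went_up, p_up_bull=0.6, p_up_bear=0.4):
    """One Bayesian update of P(bull market) after observing a daily move."""
    like_bull = p_up_bull if went_up else 1 - p_up_bull
    like_bear = p_up_bear if went_up else 1 - p_up_bear
    numer = prior * like_bull
    return numer / (numer + (1 - prior) * like_bear)

random.seed(42)
beliefs = [0.9, 0.5, 0.1]          # optimistic, neutral, and bearish priors
for day in range(1000):
    went_up = random.random() < 0.6  # the market is truly bullish
    beliefs = [update(b, went_up) for b in beliefs]

print([round(b, 4) for b in beliefs])  # all three converge near 1.0
```

Despite starting far apart, all three posteriors end up essentially at certainty of a bull market, because each day's evidence multiplies the odds by the same likelihood ratio regardless of the prior.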
In theory, science should work this way. The notion of scientific consensus is tricky, but the idea is that the opinion of the scientific community converges toward the truth as ideas are debated and new evidence is uncovered. Just as in the stock market, the steps are not always forward or smooth. The scientific community is often too conservative about adapting its paradigms to new evidence, although there have certainly also been times when it was too quick to jump on the bandwagon. Still, provided that everyone is on the Bayesian train, even incorrect beliefs and quite wrong priors are revised toward the truth in the end.
Our own view of what is and is not possible in reality affects how we perceive identical evidence. But that view shifts with time, and thus, evidence that might at one point seem meaningless can come to hold a great deal of meaning. Think of how many ideas seemed outlandish when first put forward, seemed so impossible that they couldn’t be true: the earth being round; the earth going around the sun; the universe being made up almost entirely of something that we can’t see, dark matter and energy. And don’t forget that magical things did keep happening all around as Conan Doyle came of age: the invention of the X-ray (or the Röntgen ray, as it was called), the discovery of the germ, the microbe, radiation—all things that went from invisible and thus nonexistent to visible and apparent. Unseen things that no one had suspected were there were, in fact, very there indeed.
In that context, is it so crazy that Arthur Conan Doyle became a spiritualist? When he officially embraced Spiritualism in 1918, he was hardly alone in his belief— or knowledge, as he would have it. Spiritualism itself, while never mainstream, had prominent supporters on both sides of the ocean. William James, for one, felt that it was essential for the new discipline of psychology to test the possibilities of psychical research, writing: “Hardly, as yet, has the surface of the facts called ‘psychic’ begun to be scratched for scientific purposes. It is through following these facts, I am persuaded, that the greatest scientific conquests of the coming generation will be achieved.” The psychic was the future, he thought, of the knowledge of the century. It was the way forward, not just for psychology, but for all of scientific conquest.
This from the man considered the father of modern psychology. Not to mention some of the other names who filled out the ranks of the psychical community. Physiologist and comparative anatomist William B. Carpenter, whose work included influential writings on comparative neurology; the renowned astronomer and mathematician Simon Newcomb; naturalist Alfred Russel Wallace, who proposed the theory of evolution simultaneously with Charles Darwin; chemist and physicist William Crookes, discoverer of new elements and new methods for studying them; physicist Oliver Lodge, closely involved in the development of the wireless telegraph; psychologist Gustav Theodor Fechner, founder of one of the most precisely scientific areas of psychological research, psychophysics; physiologist Charles Richet, awarded the Nobel Prize for his work on anaphylaxis; and the list goes on.
My definition of the word believe means to accept an explanation of physical phenomena without any experiential evidence. At the outset of my resolve not only to do my own thinking but to keep that thinking concerned only with directly experienced evidence, I resolved to abandon completely all that I ever had been taught to believe. Experience had demonstrated to me that most people had an authority-trusting sense that persuaded them to believingly accept the dogma and legends of one religious group or another and to join that group's formalized worship of God.
I asked myself whether I had any direct experiences in life that made me have to assume a greater intellect than that of humans to be operative in Universe. I immediately referred back to my good education in the sciences and my directly experienced learning of the operation of a plurality of physical laws—such as the interattraction of celestial bodies, varying inversely as the second power of the arithmetical distances intervening—which laws could only be expressed in the purely intellectual terms of mathematics, which plurality of laws always and only related to eternal relationships existing between and not in any one of the interrelated phenomena when considered only separately. None of the eternal and always concurrently operative laws had ever been found to contradict one another—ergo, they were all designedly interaccommodative like a train of gears. Many also were interaugmentative. I said that when we use the word design in contradistinction to randomness, we immediately infer an intellect that sorts out a complex of potentials and interarranges components in complementary ways—ergo, human mind in discovering a plurality of these only mathematically expressible eternal laws, all of which are interaccommodative, is also discovering the intellectually designed scenario Universe, whose designing requires the a priori eternal existence of an intellectual integrity of eternally self-regenerative Universe. I said to myself, I am o'erwhelmed by the only experientially discovered evidence of an a priori eternal, omnicomprehensive, infinitely and exquisitely concerned, intellectual integrity that we may call God, though knowing that in whatever way we humans refer to this integrity, it will always be an inadequate expression of its cosmic omniscience and omnipotence.
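The law Fuller alludes to, "varying inversely as the second power of the distances," is Newton's law of universal gravitation, F = G·m₁·m₂/r². A minimal numerical illustration of the inverse-square relationship, using the Earth and Moon as arbitrary example bodies:

```python
G = 6.674e-11  # gravitational constant, N·m²/kg²

def gravitational_force(m1, m2, r):
    """Newton's law: attraction varies inversely as the square of distance."""
    return G * m1 * m2 / r**2

earth_mass = 5.972e24  # kg
moon_mass = 7.348e22   # kg
distance = 3.844e8     # m, mean Earth-Moon distance

f1 = gravitational_force(earth_mass, moon_mass, distance)
f2 = gravitational_force(earth_mass, moon_mass, 2 * distance)
print(f1 / f2)  # doubling the distance quarters the force: 4.0
```

As Fuller notes, the law describes a relationship *between* bodies; the force ratio depends only on the ratio of distances, not on the bodies themselves.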
For much of the modern era, scientists followed Nicolaus Copernicus, Galileo Galilei, and Isaac Newton in believing the cosmos to be eternal and unchanging. But in 1917, when Albert Einstein applied his theory of relativity to space-time as a whole, his equations implied that the universe could not be static; it must be either expanding or contracting. This struck Einstein as grotesque, so he added to his theory a fiddle factor called the "cosmological constant" that eliminated the implication and held the universe still.
It was an ordained priest who took relativity to its logical conclusion. In 1927, Georges Lemaître of the University of Louvain in Belgium worked out an expanding model of the universe. Reasoning backward, he proposed that at some definite point in the past it must have originated from a primeval atom of infinitely concentrated energy. Two years later, Lemaître's model was confirmed by the American astronomer Edwin Hubble, who had observed that the galaxies everywhere around us were receding. Both theory and empirical evidence pointed to the same verdict: The universe had an abrupt beginning in time.
Einstein overcame his metaphysical scruples about the big bang not long before his death in 1955, referring to his earlier attempt to dodge it by an ad hoc theoretical device as "the greatest blunder of my career."
Scientists are trained to avoid rhetorical arguments, the "vulgar Induction" Bacon warned against, and let the chips of reality fall where they may. They highly prize this intellectual honesty because the stakes for them are very high. They know how value judgments, prejudices, and habits of thought can blind you to the truth you are seeking, which will limit or end your career as a scientist.
The lay public does just the opposite. They form frames of reference, prejudices, and value judgments as guides for navigating life and then make rhetorical arguments to get what they need. What feels good is good. The idea that the ignorant or stupid public just needs to be better educated in order to see the light is called the deficit model—the assumption by scientists that the public thinks the same way they do, and therefore that the public's differences with science are because of a knowledge deficit. If that's the case, it makes perfect sense for scientists to simply try to pour in more knowledge—fill the deficit—to win support and eradicate the willful inculcation of stupidity that Michael Webber bemoans in Texas.
Georges Lemaitre was a pudgy, pinkish Belgian Jesuit abbé—a Catholic priest—who also happened to be a skilled astronomer. Lemaitre had noticed that Einstein's general theory of relativity would have implied that the universe was expanding but for a troublesome little mathematical term called the cosmological constant that Einstein had inserted into his equations. Lemaitre saw no convincing reason why the cosmological constant should be there. In fact, Einstein himself had originally calculated that the universe was expanding, but he was a theoretician, not an astronomer. When he turned to astronomers for verification, they assured him that the universe existed in a steady state and there was no motion on a grand scale. So in deference to their observational experience, Einstein adjusted his general theory calculations with a mathematical "fudge factor"—the cosmological constant—that made the universe seem to be steady.
Lemaitre had independently been working off the same mathematical principles that Einstein had originally laid out, and in 1927 he wrote a dissenting paper in which he argued that the universe must be expanding, and that if it was, the redshifted light from stars was the result of this expansion. This redshift had been observed by a number of astronomers, but until then there had been no consensus on what the cause could be.
Lemaitre saw Hubble's self-evident observations and clear logic and immediately realized that they confirmed his math and refuted Einstein's general theory. Furthermore, he deduced, if the universe was expanding equally in all directions, it must have originated in a massive explosion from a single point. This meant to him that the universe is not infinitely old; it has a certain age, and that the moment of creation—which British astronomer Fred Hoyle later mockingly called the "big bang"—was analogous to God's first command at the beginning of the good abbé's most cherished book, the Bible: Let there be light.
Hubble's meticulously reported logic and observations convinced Einstein that he had been wrong about the cosmological constant. He made a pilgrimage from Germany to Mount Wilson Observatory outside of Pasadena, where he joined Hubble, Humason, Lemaitre, and others to make a stunning public announcement. Unlike Shapley, Einstein changed his position and removed the cosmological constant from his general theory of relativity, later calling it "the biggest blunder of my life." The universe was indeed expanding.
We base our understanding of the world on what we can perceive with our senses and comprehend with our minds. Anything that is said to make sense should make sense to us as humans; else there is no reason for it to be the basis of our decisions and actions. Supposed transcendent knowledge or intuitions that are said to reach beyond human comprehension cannot instruct us because we cannot relate concretely to them. The way in which humans accept supposed transcendent or religious knowledge is by arbitrarily taking a leap of faith and abandoning reason and the senses. We find this course unacceptable, since all the supposed absolute moral rules that are adopted as a result of this arbitrary leap are themselves rendered arbitrary by the baselessness of the leap itself. Furthermore, there is no rational way to test the validity or truth of transcendent or religious knowledge or to comprehend the incomprehensible. As a result, we are committed to the position that the only thing that can be called knowledge is that which is firmly grounded in the realm of human understanding and verification.
One of the great advantages of the naturalist worldview is that it serves as a basis for bringing people together under a common set of ground rules. Knowledge in science is public, not private, because it must be submitted to others for verification or falsification. A naturalist believes that the empirical truth is waiting to be discovered, and that we can all agree on the empirical truth so long as we believe in a few important criteria. Science can exist in any culture and any nation. It is a worldwide enterprise where people with radically different backgrounds can converge on the same truth. In an age when disagreements on issues of truth and opinion loom so large, the ability of naturalism to forge agreement on hard issues is one of its great attractions.