Cargo Cult Science

Ever hear of the term “Cargo Cult Science”? It was coined by physicist Richard Feynman in his Caltech commencement address back in 1974. He explained the term as follows:

In the South Seas there is a cargo cult of people. During the war they saw airplanes land with lots of good materials, and they want the same thing to happen now. So they’ve arranged to imitate things like runways, to put fires along the sides of the runways, to make a wooden hut for a man to sit in, with two wooden pieces on his head like headphones and bars of bamboo sticking out like antennas–he’s the controller–and they wait for the airplanes to land. They’re doing everything right. The form is perfect. It looks exactly the way it looked before. But it doesn’t work. No airplanes land. So I call these things cargo cult science, because they follow all the apparent precepts and forms of scientific investigation, but they’re missing something essential, because the planes don’t land.

So cargo cult science is a process that looks like science but does not work. And how is science supposed to work? It is supposed to deliver the truth, or at least a close approximation of the truth, about the natural world around us.

So why is it that cargo cult science does not work? Feynman explains as follows:

But there is one feature I notice that is generally missing in cargo cult science. That is the idea that we all hope you have learned in studying science in school–we never explicitly say what this is, but just hope that you catch on by all the examples of scientific investigation. It is interesting, therefore, to bring it out now and speak of it explicitly. It’s a kind of scientific integrity, a principle of scientific thought that corresponds to a kind of utter honesty–a kind of leaning over backwards. For example, if you’re doing an experiment, you should report everything that you think might make it invalid–not only what you think is right about it: other causes that could possibly explain your results; and things you thought of that you’ve eliminated by some other experiment, and how they worked–to make sure the other fellow can tell they have been eliminated.

Details that could throw doubt on your interpretation must be given, if you know them. You must do the best you can–if you know anything at all wrong, or possibly wrong–to explain it. If you make a theory, for example, and advertise it, or put it out, then you must also put down all the facts that disagree with it, as well as those that agree with it. There is also a more subtle problem. When you have put a lot of ideas together to make an elaborate theory, you want to make sure, when explaining what it fits, that those things it fits are not just the things that gave you the idea for the theory; but that the finished theory makes something else come out right, in addition.

In summary, the idea is to try to give all of the information to help others to judge the value of your contribution; not just the information that leads to judgment in one particular direction or another.

It sounds to me like cargo cult science is simply the act of allowing confirmation bias to run free in science. Confirmation bias is essentially cherry-picking data to support preconceived or desired conclusions. We would like to believe that science is largely immune from confirmation bias, given its reliance on experimental controls and the repeatability of its experiments. Yet is it?

Feynman provides some examples of confirmation bias nested in the everyday practice of science:

We have learned a lot from experience about how to handle some of the ways we fool ourselves. One example: Millikan measured the charge on an electron by an experiment with falling oil drops, and got an answer which we now know not to be quite right. It’s a little bit off, because he had the incorrect value for the viscosity of air. It’s interesting to look at the history of measurements of the charge of the electron, after Millikan. If you plot them as a function of time, you find that one is a little bigger than Millikan’s, and the next one’s a little bit bigger than that, and the next one’s a little bit bigger than that, until finally they settle down to a number which is higher.

Why didn’t they discover that the new number was higher right away? It’s a thing that scientists are ashamed of–this history—because it’s apparent that people did things like this: When they got a number that was too high above Millikan’s, they thought something must be wrong–and they would look for and find a reason why something might be wrong. When they got a number closer to Millikan’s value they didn’t look so hard. And so they eliminated the numbers that were too far off, and did other things like that. We’ve learned those tricks nowadays, and now we don’t have that kind of a disease.

Is it really true that “now we don’t have that kind of a disease?”

And then there is this:

When I was at Cornell, I often talked to the people in the psychology department. One of the students told me she wanted to do an experiment that went something like this–it had been found by others that under certain circumstances, X, rats did something, A. She was curious as to whether, if she changed the circumstances to Y, they would still do A. So her proposal was to do the experiment under circumstances Y and see if they still did A.

I explained to her that it was necessary first to repeat in her laboratory the experiment of the other person–to do it under condition X to see if she could also get result A, and then change to Y and see if A changed. Then she would know that the real difference was the thing she thought she had under control.

She was very delighted with this new idea, and went to her professor. And his reply was, no, you cannot do that, because the experiment has already been done and you would be wasting time. This was in about 1947 or so, and it seems to have been the general policy then to not try to repeat psychological experiments, but only to change the conditions and see what happens.

But is this just a problem from back in the 1940s? We’ll answer that in the next posting.


3 Responses to Cargo Cult Science

  1. Bilbo says:

    I would be curious how long the natives tried their cargo cult experiments and whether they tried refining them: “They had metal boxes with lots of wires inside them. Perhaps we need to make some of those.”

  2. Carl Degrasse Dawtchins says:

    If you weren’t brainwashed into your beliefs, how did you come by them?
    It seems you’re not one of the millions (if not billions) of people around the world who are raised from birth already within a particular religious belief system. Clearly you didn’t have one particular religion that shaped your life for many of your early, impressionable years. As a child you must have been different than so many other children. You must not have been swayed by religious beliefs impressed upon you before you learned how to assess those beliefs critically. I’ve got to say I’m impressed. How do you do it? Given that so many people are reared within a particular religion from birth…given that children don’t develop the essential critical thinking skills before having their parents’ religion a constant presence in their life (similar to Santa, though a bit more significant)…you must have some secret. Tell us, how do you avoid believing the same stuff as your parents, thus avoiding the brainwashing effect (perhaps indoctrination would work better for you)?

    P.S. Since you weren’t brainwashed/indoctrinated into a religious belief system, did you take the time as you grew to examine the various world religions to determine which one worked best for you? What criteria did you use to determine which religion was truthful and which ones were not? Am I correct in thinking that you chose a belief system with a more progressive mentality? I mean, no one in their right mind would pick a belief system that advocates stoning teenagers, chopping off the hand of a woman, blaming people for the actions of their ancestors, casting off one’s family to follow a prophet, killing homosexuals, owning slaves, raping women, or committing worldwide genocide by way of a flood. Those who are part of such a system must obviously have been brainwashed from a young age. They were force-fed a particular belief before they were able to think about it critically. Sadly, unlike with Santa Claus, these children are never informed that said belief system was made up a long time ago by some mysterious men in a faraway land.

  3. Bilbo says:

    Hi Carl,

    John Loftus would be very proud of you! But perhaps we should take this discussion to a thread of its own. I don’t know if Mike wishes to pursue it here. If not, I’ll try to write something up at my own blog.
