Up to this time, the oldest reliable dates went back no further than the First Dynasty in Egypt from about 3000 B.C. No one could confidently say, for instance, when the last ice sheets had retreated or at what time in the past the Cro-Magnon people had decorated the caves of Lascaux in France.
Libby's idea was so useful that he would be awarded a Nobel Prize for it in 1960. It was based on the realization that all living things have within them an isotope of carbon called carbon-14, which begins to decay at a measurable rate the instant they die. Carbon-14 has a half-life—that is, the time it takes for half of any sample to disappear—of about 5,600 years, so by working out how much a given sample of carbon had decayed, Libby could get a good fix on the age of an object—though only up to a point. After eight half-lives, only 1/256 of the original radioactive carbon remains, which is too little to make a reliable measurement, so radiocarbon dating works only for objects up to forty thousand or so years old.
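The arithmetic behind that forty-thousand-year ceiling can be sketched in a few lines. This is only an illustration of the halving rule, not Libby's actual working formula; the 5,600-year figure is the approximate value quoted above:

```python
import math

HALF_LIFE_YEARS = 5_600  # approximate carbon-14 half-life used in the text

def remaining_fraction(age_years):
    """Fraction of the original carbon-14 left after a given time."""
    return 0.5 ** (age_years / HALF_LIFE_YEARS)

def estimate_age(fraction_left):
    """Invert the decay curve: infer age from the measured carbon-14 fraction."""
    return -HALF_LIFE_YEARS * math.log2(fraction_left)

# After eight half-lives only 1/256 of the carbon-14 survives —
# too little to measure reliably, hence the ~40,000-year limit.
print(remaining_fraction(8 * HALF_LIFE_YEARS))  # 0.00390625, i.e. 1/256
print(estimate_age(1 / 256))                    # 44800.0 years
```

Eight half-lives of 5,600 years each come to roughly 44,800 years, which is why "forty thousand or so" marks the practical edge of the method.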
Curiously, just as the technique was becoming widespread, certain flaws within it became apparent. To begin with, it was discovered that one of the basic components of Libby's formula, known as the decay constant, was off by about 3 percent. By this time, however, thousands of measurements had been taken throughout the world.