Until the late 1940s, archaeologists and anthropologists faced an uphill battle when it came to dating early modern human remains and other prehistoric artifacts. At the time, the most reliable technique was a relative dating method devised by geologists, known as stratigraphy. The basic premise of stratigraphy is that fossils and artifacts found between two undisturbed layers of rock must be younger than the rocks beneath them but older than the rocks deposited above them. Based on the rate of decay of uranium and other radioactive isotopes, a ballpark estimate could be obtained for the age of some rock formations and, by extension, the approximate age of fossils found within them.
Although stratigraphy was obviously better than random speculation, it was almost useless for dating fossils found at geologically recent sites, especially those less than 100,000 years old. In human terms, 100,000 years is a vast stretch of time; in the grand scheme of Earth's natural history, however, it represents the proverbial blink of an eye. Matters stood at an impasse until 1947, when Willard Libby developed a new dating technique based on the half life of carbon 14, a radioactive isotope discovered in 1940 by Kamen and Ruben.
At this juncture, a brief explanation of isotopes and half life is in order. The vast majority of carbon atoms in the universe occur as carbon 12, meaning their nuclei contain 6 protons and 6 neutrons. From a nuclear standpoint, carbon 12 atoms are completely stable and do not undergo any form of radioactive decay. This pattern holds true for the vast majority of atomic elements, with the exception of very heavy ones such as uranium, all of whose isotopes are inherently radioactive.
Unlike stable isotopes, radioactive nuclei decay at a constant rate over time. That rate may be very rapid, about six hours in the case of technetium-99m, a form of a synthetic element, or unimaginably slow, 4.5 billion years in the case of uranium-238. A half life is defined as the amount of time it takes for half of the atomic nuclei of a radioactive element to decay into some other form.
A crucial point to remember is that only half of a sample of a radioactive isotope decays during each successive half life. To illustrate this point, suppose you start with 100 grams of a radioactive isotope whose half life is 100 days. 50g of the isotope would remain after 100 days; 25g after 200 days; 12.5g after 300 days; 6.25g after 400 days, and so on. After ten half lives (1,000 days in this hypothetical scenario), less than 0.1% of the original sample would remain, and its radioactivity would fall to a nearly undetectable level. Scientists consider ten half lives to be the maximum length of time for which a given radioactive isotope can yield an accurate date.
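The halving sequence above follows directly from the decay law N = N₀ · (1/2)^(t/T). The short sketch below reproduces the 100-gram example; the function name `remaining` is a hypothetical helper chosen for illustration.

```python
def remaining(initial_grams: float, elapsed_days: float, half_life_days: float) -> float:
    """Amount of isotope left after elapsed time: N = N0 * (1/2) ** (t / T)."""
    return initial_grams * 0.5 ** (elapsed_days / half_life_days)

# 100 g sample with a 100-day half life, as in the example above
for days in (100, 200, 300, 400):
    print(days, remaining(100, days, 100))   # 50.0, 25.0, 12.5, 6.25

# After ten half lives (1,000 days), under 0.1 g of the original 100 g remains
print(remaining(100, 1000, 100))
```

Note that the formula also works for times that are not whole multiples of the half life, since decay is continuous rather than stepwise.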
In the case of carbon, only a minute fraction of this element (roughly one atom in a trillion) exists as carbon 14, which contains 6 protons and 8 neutrons in its nucleus. The two extra neutrons destabilize carbon 14, so that at some point in time, its nucleus decays by a process known as beta emission. In this form of radioactive decay, a neutron is converted into a proton and a high energy electron, which is ejected from the atom as a beta particle. In this case, beta emission produces an atom of nitrogen 14, whose nucleus consists of 7 protons and 7 neutrons. (Electrons have virtually no mass; only the mass of protons and neutrons is counted in radioactive decay equations.)
Carbon 14 is an incredibly significant isotope for two reasons. First, carbon is ubiquitous in the cells of humans and all other organisms. Proteins, carbohydrates, lipids, and DNA are all carbon based macromolecules. In fact, the only elements present in humans in greater abundance than carbon are hydrogen and oxygen (considering that almost 60% of human body weight consists of water). As such, a small amount of carbon 14 is likely to be trapped inside a person's bones for centuries or millennia after that person's death. Under certain circumstances, like burial sites in the desert or permanently frozen regions, body parts other than bone may be preserved. As a general rule, however, only bones and teeth are likely to be found intact.
Second, the half life of carbon 14 is estimated at 5,700 years, an ideal span of time for dating both ancient and prehistoric artifacts. Because the amount of carbon 14 present in a sample becomes indistinguishable from levels of background radiation after ten half lives have elapsed, objects can be dated accurately to a maximum age of 57,000 years. Except for prehistoric sites in Africa (and possibly the Middle East), all other modern human remains, and by extension human artifacts, fall within this age range. This knowledge has proved extremely valuable in dating prehistoric human sites outside of Africa, especially those in the Americas, which humans colonized toward the end of the last Ice Age, roughly 12,000 years ago. Artifacts from the ancient world, including Egyptian mummies and the Dead Sea Scrolls, have also undergone carbon 14 dating, accurately establishing or corroborating their ages to within a few centuries.
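The dating arithmetic described above can be inverted: given the fraction of the original carbon 14 still present in a sample, the elapsed time is t = T · log₂(1/f). A minimal sketch, using the article's round 5,700-year half life and a hypothetical `radiocarbon_age` helper:

```python
import math

HALF_LIFE_YEARS = 5_700  # round figure used in this article

def radiocarbon_age(fraction_remaining: float) -> float:
    """Years elapsed given the fraction of carbon 14 left: t = T * log2(1 / f)."""
    return HALF_LIFE_YEARS * math.log2(1 / fraction_remaining)

print(radiocarbon_age(0.5))        # one half life: 5,700 years
print(radiocarbon_age(0.25))       # two half lives: 11,400 years
print(radiocarbon_age(0.5 ** 10))  # ten half lives: 57,000 years, the practical limit
```

In practice, laboratories compare the sample's carbon 14 to carbon 12 ratio against the ratio in living organisms to obtain the fraction remaining.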