Radioactive decay and fossil dating

Radiometric dating, often called radioactive dating, is a technique used to determine the age of materials such as rocks.

It is based on a comparison between the observed abundance of a naturally occurring radioactive isotope and its decay products, using known decay rates.

Each radioactive nuclide decays at a constant, well-characterized rate. This predictability allows the relative abundances of related nuclides to be used as a clock to measure the time it takes for the parent atom to decay into its daughter atom(s).
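To make the clock concrete, here is a minimal Python sketch of the standard decay-clock relation t = ln(1 + D/P) / λ, with λ = ln 2 / t_half. It assumes a closed system in which every daughter atom came from decay of the parent; the sample numbers and the choice of the uranium-238 half-life are illustrative, not measurements from the text.

```python
import math

def age_from_ratio(daughter_atoms: float, parent_atoms: float,
                   half_life_years: float) -> float:
    """Age of a closed-system sample from the decay-clock relation.

    N_parent(t) = N0 * exp(-lambda * t), and every decayed parent yields
    one daughter, so D/P = exp(lambda * t) - 1, which rearranges to
    t = ln(1 + D/P) / lambda, with lambda = ln(2) / half_life.
    """
    decay_constant = math.log(2) / half_life_years
    return math.log(1 + daughter_atoms / parent_atoms) / decay_constant

# Illustrative only: a hypothetical sample with equal parent and daughter
# abundances, dated with the uranium-238 half-life (~4.47e9 years), ignoring
# the intermediate decay chain, which is short-lived by comparison.
# With D/P = 1 the inferred age is exactly one half-life.
print(age_from_ratio(daughter_atoms=1.0, parent_atoms=1.0,
                     half_life_years=4.47e9))  # ~4.47e9 years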

By establishing geological timescales, radiometric dating provides a significant source of information about the ages of fossils and rates of evolutionary change, and it is also used to date archaeological materials, including ancient artifacts.

The different methods of radiometric dating are accurate over different timescales, and they are useful for different materials.

According to the principle of superposition, a fossil will always be younger than fossils in the beds beneath it.

In an undisturbed sequence of rocks, such as in a cliff face, this makes it easy to get a rough idea of the relative ages of the individual strata: the oldest lies at the bottom and the youngest at the top.
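As a toy illustration of that ordering rule, the sketch below sorts observed beds by depth to recover their relative ages; the bed names and depths are invented for the example, and it assumes an undisturbed sequence with no folding or faulting.

```python
# Toy model of the principle of superposition: in an undisturbed sequence,
# deeper beds are older, so sorting by depth yields relative (not absolute)
# ages. Bed names and depths below are made up for illustration.
beds = [("bed C", 2.0), ("bed A", 12.5), ("bed B", 7.3)]  # (name, depth in m)

for name, depth in sorted(beds, key=lambda bed: bed[1]):
    print(f"{name}: {depth} m below surface")
# Prints beds from youngest (shallowest) to oldest (deepest).
```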

Radiometric dating, by contrast, is the principal source of information about the absolute age of rocks and other geological features, including the age of the Earth itself, and it can be used to date a wide range of natural and man-made materials.

The best-known radiometric dating techniques include radiocarbon dating, potassium-argon dating, and uranium-lead dating.
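As a rough sketch of those differing timescales, the snippet below pairs approximate textbook half-lives for these three systems with a common rule of thumb, assumed here rather than taken from the text, that a method is practical from about a tenth of a half-life out to about ten half-lives.

```python
# Approximate textbook half-lives for the best-known radiometric systems.
# The 0.1x-to-10x half-life "usable range" is a common rule of thumb and an
# assumption of this sketch, not a statement from the article.
HALF_LIVES_YEARS = {
    "radiocarbon (C-14 -> N-14)": 5.73e3,
    "potassium-argon (K-40 -> Ar-40)": 1.25e9,
    "uranium-lead (U-238 -> Pb-206)": 4.47e9,
}

for method, t_half in HALF_LIVES_YEARS.items():
    print(f"{method}: usable roughly {0.1 * t_half:.3g} "
          f"to {10 * t_half:.3g} years")
```

This is why radiocarbon dating suits archaeological material from the last tens of thousands of years, while uranium-lead dating suits rocks billions of years old.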

Figure: Example of a radioactive decay chain from lead-212 (212Pb) to lead-208 (208Pb).
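As a sketch of how such a chain evolves over time, the Python example below applies the two-step Bateman solution, collapsing the very short-lived intermediates (212Bi branches through 212Po or 208Tl, both of which reach 208Pb within minutes or less). The half-lives are approximate literature values quoted from memory, so treat the numbers as illustrative.

```python
import math

# Minimal sketch of the 212Pb -> 212Bi -> ... -> 208Pb chain, collapsing the
# very short-lived intermediates (212Po, 208Tl) into the final step.
# Approximate half-lives: 212Pb ~10.64 h, 212Bi ~60.55 min (from memory).
L1 = math.log(2) / 10.64          # decay constant of 212Pb, per hour
L2 = math.log(2) / (60.55 / 60)   # decay constant of 212Bi, per hour

def chain_abundances(t_hours: float, n0: float = 1.0):
    """Bateman solution for a two-step chain starting from pure 212Pb."""
    n_pb212 = n0 * math.exp(-L1 * t_hours)
    n_bi212 = (n0 * L1 / (L2 - L1)
               * (math.exp(-L1 * t_hours) - math.exp(-L2 * t_hours)))
    n_pb208 = n0 - n_pb212 - n_bi212  # everything else has reached stability
    return n_pb212, n_bi212, n_pb208

for t in (0.0, 5.0, 10.64, 24.0, 72.0):
    pb212, bi212, pb208 = chain_abundances(t)
    print(f"t = {t:5.1f} h: 212Pb {pb212:.3f}  "
          f"212Bi {bi212:.3f}  208Pb {pb208:.3f}")
```

After a few days essentially all of the initial 212Pb has accumulated as stable 208Pb, which is what makes the end member of a chain a useful daughter product.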
