Has radiometric dating ever been wrong?


Actually, the assignment of a certain number of millions of years to a rock formation does not derive from the strata themselves.

The standard Geological Column became the reference point, even though it does not appear anywhere on Earth except in textbooks.

It caused massive sedimentary layering, sorting, and fossilization of the creatures buried therein.

These dating methods rely on a series of assumptions: the initial amounts of the parent and daughter elements, and a constant rate of decay. It is accepted that a rock forms when it first cools from a molten or semi-molten state, and that it may include a variety of elements, including radioactive ones. For the last 100 years we have been able to measure the decay rate, and during that time it has been very steady and very consistent.
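
For illustration, here is a minimal sketch of the exponential decay law that the constant-rate assumption refers to. The isotope and half-life used (uranium-238, roughly 4.47 billion years) are illustrative values, not figures taken from this article:

    import math

    # Standard decay law assumed by radiometric dating: N(t) = N0 * exp(-lambda * t),
    # where lambda = ln(2) / half-life. The half-life below (U-238, ~4.47 billion
    # years) is only an illustrative value.
    HALF_LIFE_U238 = 4.468e9  # years
    DECAY_CONSTANT = math.log(2) / HALF_LIFE_U238

    def remaining_fraction(years, decay_constant=DECAY_CONSTANT):
        # Fraction of the original parent isotope left after `years`,
        # assuming the decay rate has been constant the whole time.
        return math.exp(-decay_constant * years)

    # After exactly one half-life, half of the parent should remain.
    print(remaining_fraction(HALF_LIFE_U238))  # ~0.5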

Radioisotope dating, using the trace amounts of radioactive elements within a rock, was quickly accepted as proof that the earth is millions and millions of years old. Radioactive elements decay from heavier parent atoms into lighter, more stable daughter atoms. The rate of decay and the amounts of parent and daughter elements present today in a rock sample are used to calculate back to the estimated age at which the rock first formed.
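
To make that back-calculation concrete, here is a minimal sketch of the standard accumulation formula, assuming zero daughter element at formation and a closed system. The function name and the sample counts are hypothetical:

    import math

    def age_from_ratio(parent_atoms, daughter_atoms, half_life_years):
        # Estimated age, assuming: (1) zero daughter isotope at formation,
        # (2) a closed system (nothing added or removed), and
        # (3) an unchanging decay constant.
        # Formula: t = ln(1 + D/P) / lambda, with lambda = ln(2) / half-life.
        decay_constant = math.log(2) / half_life_years
        return math.log(1 + daughter_atoms / parent_atoms) / decay_constant

    # Hypothetical sample with equal parent and daughter counts: exactly one half-life.
    print(age_from_ratio(1000, 1000, 4.468e9))  # ~4.468e9 years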

To assume the rock starts with only uranium (U) and no lead (Pb) is a big assumption.
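
The sketch below, using the same formula with hypothetical numbers, shows how the computed age shrinks if part of the measured daughter element is assumed to have been present at formation rather than produced by decay:

    import math

    HALF_LIFE = 4.468e9  # illustrative U-238 half-life, in years
    lam = math.log(2) / HALF_LIFE

    # One hypothetical measurement: 1000 parent atoms and 1000 daughter atoms today.
    measured_parent, measured_daughter = 1000, 1000

    for inherited in (0, 250, 500, 750):
        # Only the daughter atoms NOT present at formation count as radiogenic.
        radiogenic = measured_daughter - inherited
        age = math.log(1 + radiogenic / measured_parent) / lam
        print(f"assumed initial daughter = {inherited:3d} -> age ~ {age / 1e9:.2f} billion years")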

(Isochron dating, which relies on multiple rock samples, is an attempt to correct this, but it still rests on the assumptions described above; a rough sketch of the approach follows the example below.)

Examples of problems with radiometric dating of rocks:

Grand Canyon lava flows: Sedimentary rocks make up the layers of the Grand Canyon, and these are not dateable by radiometric dating.
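
For reference, here is a rough sketch of how the isochron approach mentioned above is usually described: several samples are plotted, a line is fitted through them, the slope gives the age, and the intercept gives the inferred initial ratio. The isotope system, half-life, and data points below are hypothetical:

    import math

    # Hypothetical whole-rock isochron (Rb-87 -> Sr-87; half-life roughly 49 billion years).
    # Each sample gives x = Rb87/Sr86 and y = Sr87/Sr86. If the samples formed together
    # from the same source, the points fall on a line whose slope m = exp(lambda*t) - 1
    # and whose intercept is the initial Sr87/Sr86 ratio.
    HALF_LIFE_RB87 = 4.88e10  # years, illustrative
    lam = math.log(2) / HALF_LIFE_RB87

    x = [0.10, 0.50, 1.00, 2.00]          # hypothetical Rb87/Sr86 measurements
    y = [0.7014, 0.7070, 0.7140, 0.7280]  # hypothetical Sr87/Sr86 measurements

    # Ordinary least-squares fit of y = intercept + slope * x.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)
    intercept = my - slope * mx

    age = math.log(1 + slope) / lam
    print(f"slope = {slope:.4f}, initial ratio = {intercept:.4f}, age ~ {age / 1e9:.2f} billion years")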
