Could radiometric dating be wrong?
In a related article on geologic ages (Ages), we presented a chart with the various geologic eras and their ages.
In a separate article (Radiometric dating), we sketched in some technical detail how these dates are calculated using radiometric dating techniques.
Radiometric dating of rocks and minerals using naturally occurring, long-lived radioactive isotopes is troublesome for young-earth creationists because the techniques have provided overwhelming evidence of the antiquity of the earth and life. Some so-called creation scientists have attempted to show that radiometric dating does not work on theoretical grounds (for example, Arndts and Overn 1981; Gill 1996), but such attempts invariably have fatal flaws (see Dalrymple 1984; York and Dalrymple 2000). First, this approach provides no evidence whatsoever to support the claim that the earth is very young. If the earth were only 6,000–10,000 years old, then surely there should be some scientific evidence to confirm that hypothesis; yet the creationists have produced not a shred of it so far.

The rules are the same in all cases; the assumptions are different for each method. To explain those rules, I'll need to talk about some basic atomic physics. Hydrogen-1's nucleus, for example, consists of only a single proton. Atoms of radioactive isotopes are unstable and decay over time by shooting off particles at a fixed rate, transmuting the material into a more stable substance. For instance, half the mass of carbon-14, an unstable isotope of carbon, will decay into nitrogen-14 over a period of 5,730 years.

Recent puzzling observations of tiny variations in nuclear decay rates have led some to question the science of using decay rates to determine the relative ages of rocks and organic materials. However, scientists from the National Institute of Standards and Technology (NIST), working with researchers from Purdue University, the University of Tennessee, Oak Ridge National Laboratory, and Wabash College, tested the hypothesis that solar radiation might affect the rate at which radioactive elements decay and found no detectable effect.
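To make the fixed-rate decay concrete, here is a minimal sketch of the exponential decay law in Python. The function names and sample values are illustrative assumptions, not any laboratory's actual code; the only figure taken from the text above is the 5,730-year carbon-14 half-life.

import math

# Half-life of carbon-14 in years (the value cited above).
C14_HALF_LIFE = 5730.0

def remaining_fraction(elapsed_years: float, half_life: float) -> float:
    """Fraction of the parent isotope left after elapsed_years,
    using the standard decay law N(t)/N0 = (1/2)**(t / half_life)."""
    return 0.5 ** (elapsed_years / half_life)

def age_from_fraction(fraction: float, half_life: float) -> float:
    """Invert the decay law: given the fraction of the parent isotope
    remaining, return the elapsed time t = half_life * log2(N0/N)."""
    return half_life * math.log(1.0 / fraction, 2)

# After one half-life, half the carbon-14 remains, as stated above.
print(remaining_fraction(5730, C14_HALF_LIFE))   # 0.5

# A sample retaining 25% of its original carbon-14 is two half-lives old.
print(age_from_fraction(0.25, C14_HALF_LIFE))    # 11460.0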
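For the long-lived isotopes used to date rocks, the same law is conventionally rewritten in terms of a decay constant, lambda = ln 2 / half-life, applied to the measured ratio of daughter to parent atoms. The sketch below assumes a simple closed-system rubidium-87/strontium-87 pair with an illustrative half-life of roughly 48.8 billion years; the isotope choice and the sample ratio are my assumptions, not figures from this article.

import math

def age_from_daughter_parent_ratio(ratio: float, half_life_years: float) -> float:
    """Standard radiometric age equation t = (1/lambda) * ln(1 + D/P),
    where D/P is the measured daughter-to-parent isotope ratio and
    lambda = ln(2) / half_life is the decay constant."""
    decay_constant = math.log(2) / half_life_years
    return math.log(1.0 + ratio) / decay_constant

# Rubidium-87 decays to strontium-87 with a half-life of roughly
# 48.8 billion years (illustrative value). A daughter-to-parent ratio
# of about 0.066 then gives an age near 4.5 billion years, which is
# why such long-lived pairs are used on the oldest rocks.
RB87_HALF_LIFE = 48.8e9
print(age_from_daughter_parent_ratio(0.0662, RB87_HALF_LIFE))  # ~4.5e9 years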