Difference between radiometric and radiocarbon dating
Radiometric dating involves a number of assumptions when it is applied over long time periods.

One key assumption is that the initial quantity of the parent element can be determined. In the case of carbon dating, it is not the initial quantity that is important but the initial ratio of ¹⁴C to ¹²C, though the same principle otherwise applies. A related assumption is that the rock did not already contain the decay product when it formed; recognizing this problem, scientists try to focus on rocks that do not contain the decay product originally.

Another assumption is that the rate of decay is constant over long periods of time. Radiometric dating requires that the decay rates of the isotopes involved be accurately known, and that there is confidence that these decay rates are constant. The physical constants involved in radioactive decay (nucleon masses, the fine-structure constant) are well characterized, and the processes are well understood. Careful astronomical observations show that these constants have not changed significantly in billions of years: spectral lines from distant galaxies would have shifted perceptibly if they had. In some cases radioactive decay itself can be observed and measured in distant galaxies, when a supernova explodes and ejects unstable nuclei.

There are a few effects that can alter radioactive half-lives, but they are mostly well understood and in any case would not materially affect radiometric dating results. The major reason that decay rates can change is that the electric field from the atom's electron cloud can change due to chemical changes: electrons can move closer to or farther from the nucleus depending on the chemical bonds. This affects the Coulomb barrier involved in alpha decay, and therefore changes the height and width of the barrier through which the alpha particle must tunnel. Even so, the analysis of the isotopic and chemical composition of the sample has far greater uncertainty than any uncertainty in the decay rate itself.
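The arithmetic behind both methods follows from the exponential decay law N(t) = N₀·e^(−λt), with λ = ln 2 / t½. A minimal sketch (the function names are illustrative, not from any particular library) of the two age calculations described above — from the remaining fraction of the parent isotope, as in carbon dating, and from the accumulated daughter-to-parent ratio, as in rock dating:

```python
import math

def age_from_parent_fraction(half_life_years, remaining_fraction):
    """Age of a sample given the fraction of the parent isotope remaining.

    From N(t) = N0 * exp(-lambda * t):  t = -ln(N/N0) / lambda,
    where lambda = ln(2) / t_half.
    """
    decay_constant = math.log(2) / half_life_years
    return -math.log(remaining_fraction) / decay_constant

def age_from_daughter_ratio(half_life_years, daughter_to_parent_ratio):
    """Age of a rock given the measured daughter/parent ratio D/P.

    Assumes no daughter product was present initially (the very
    assumption discussed above).  Since D = N0 - N = N*(e^(lambda*t) - 1),
    D/P = e^(lambda*t) - 1, so t = ln(1 + D/P) / lambda.
    """
    decay_constant = math.log(2) / half_life_years
    return math.log(1 + daughter_to_parent_ratio) / decay_constant

# Radiocarbon example: the half-life of C-14 is about 5730 years, so a
# sample retaining 25% of its original C-14/C-12 ratio is two half-lives old.
carbon_age = age_from_parent_fraction(5730, 0.25)   # ~11460 years

# Rock example: a daughter/parent ratio of 1.0 means half the parent has
# decayed, i.e. the rock is exactly one half-life old.
rock_age = age_from_daughter_ratio(1.25e9, 1.0)     # one half-life
```

Note that both formulas take the half-life as given; as the text explains, the dominant uncertainty in practice comes from measuring the isotopic composition of the sample, not from the decay constant.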