Wednesday, October 8, 2014

Changing Land of Promise—Part XIV – So How Old Are Earth Rocks? Part II – The Assumptions

As shown in the previous post, the geologic long-term dating methods are fraught with error, though no geologist is going to say so. The method used for dating rocks, which claims the Earth to be 4.54 billion years old, rests on erroneous beliefs. In the last post we presented the three major, insurmountable, and unprovable assumptions geologists make in order to date the rocks they use to date the Earth. Let’s take a deeper look at these three assumptions.
Who would have been around in the geologist’s beginning to know what rocks were like, what they contained, and how they were affected by other factors?
    Assumption 1: Conditions at Time Zero: No geologists were present when the vast majority of Earth rocks formed, so they cannot test whether the original rocks already contained daughter isotopes alongside their parent radioisotopes. For example, with regard to the volcanic lavas that erupted, flowed, and cooled to form rocks in the unobserved past, evolutionary geologists simply assume that no daughter argon-40 atoms were present in the lavas when they cooled.
    For the other radioactive “clocks,” it is assumed that by analyzing multiple samples of a rock body, or unit, today, it is possible to determine how much of each daughter isotope (lead, strontium, or neodymium) was present when the rock formed (via the so-called isochron technique, which still rests on unproven assumptions 2 and 3).
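    All of these “clocks” rest on the same basic relationship: the daughter amount measured today equals whatever daughter was there at the start plus whatever the parent has produced since, D_now = D_initial + P × (e^(λt) − 1). Solving for the age therefore requires plugging in a value for D_initial that no one was present to measure. The short sketch below, using purely hypothetical sample amounts and the published rubidium-87 decay constant, simply shows how strongly the calculated “age” swings with whatever initial daughter amount is assumed:

```python
import math

# Simple parent->daughter "age" equation that underlies (in more elaborate forms)
# the Rb-Sr, Sm-Nd, U-Pb, and K-Ar methods:
#   D_now = D_initial + P * (exp(lambda * t) - 1)
# Solving for t requires assuming D_initial, which was never observed.
def radiometric_age(parent, daughter, assumed_initial_daughter, decay_constant):
    """Return the calculated 'age' in years for an assumed initial daughter amount."""
    radiogenic = daughter - assumed_initial_daughter
    return math.log(1.0 + radiogenic / parent) / decay_constant

# Hypothetical measured amounts (arbitrary units) and the published
# rubidium-87 decay constant (~1.42e-11 per year).
PARENT = 1000.0
DAUGHTER = 15.0
LAMBDA_RB87 = 1.42e-11

for d0 in (0.0, 5.0, 10.0, 14.0):
    age = radiometric_age(PARENT, DAUGHTER, d0, LAMBDA_RB87)
    print(f"assumed initial daughter = {d0:5.1f}  ->  calculated age = {age/1e6:8.1f} million years")
```

    The measurements are identical in every case; only the unmeasurable assumption about the starting conditions changes, and the “age” changes with it.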
    Yet, lava flows that have occurred in the present have been tested soon after they erupted, and they invariably contained much more argon-40 than expected. For example, when a sample of the lava in the Mt. St. Helens crater (that had been observed to form and cool in 1986) was analyzed in 1996, it contained so much argon-40 that it had a calculated “age” of 350,000 years! Similarly, lava flows on the sides of Mt. Ngauruhoe, New Zealand, known to be less than 50 years old, yielded “ages” of up to 3.5 million years.
    So it is logical to conclude that if recent lava flows of known age yield incorrect, old potassium-argon ages because of the extra argon-40 they inherited from the erupting volcanoes, then ancient lava flows of unknown age could likewise have inherited extra argon-40 and would likewise yield excessively old ages.
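    To get a feel for how little inherited argon it takes, the standard potassium-argon age equation (with the commonly published decay constants) can be run against a few hypothetical traces of excess argon-40 in a freshly cooled lava. This is only a sketch, not a re-analysis of the actual Mt. St. Helens or Ngauruhoe samples:

```python
import math

# Standard K-Ar age equation (published decay constants):
#   t = (1/lambda_total) * ln(1 + (lambda_total/lambda_ec) * Ar40/K40)
LAMBDA_TOTAL = 5.543e-10   # total decay constant of K-40, per year
LAMBDA_EC    = 0.581e-10   # branch of K-40 decay that produces Ar-40, per year

def k_ar_age(ar40_per_k40):
    """Apparent K-Ar age in years for a given Ar-40/K-40 atom ratio."""
    return math.log(1.0 + (LAMBDA_TOTAL / LAMBDA_EC) * ar40_per_k40) / LAMBDA_TOTAL

# Hypothetical traces of inherited ("excess") argon-40 in a freshly cooled lava:
for excess in (1e-6, 1e-5, 2e-5, 1e-4):
    print(f"Ar-40/K-40 = {excess:.0e}  ->  apparent age = {k_ar_age(excess):,.0f} years")
```

    On this arithmetic, an argon-40 to potassium-40 ratio of only about two parts per hundred thousand is enough to mimic an “age” in the neighborhood of the 350,000-year figure quoted above.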
    There are similar problems with the other radioactive “clocks.” For example, consider the dating of Grand Canyon’s basalts (rocks formed by lava cooling at the earth’s surface). We find places on the North Rim where volcanoes erupted after the Canyon was formed, sending lavas cascading over the walls and down into the Canyon. Obviously, these eruptions took place very recently, after the Canyon’s layers were deposited.
    However, these basalts yield ages of up to 1 million years based on the amounts of potassium and argon isotopes in the rocks. But when we date the rocks using the rubidium and strontium isotopes, we get an age of 1.143 billion years. This is the same age that we get for the basalt layers deep below the walls of the eastern Grand Canyon.
    How could both lavas—one at the top and one at the bottom of the Canyon—be the same age based on these parent and daughter isotopes? One solution is that both the recent and early lava flows inherited the same rubidium-strontium chemistry—not age—from the same source, deep in the earth’s upper mantle. This source already had both rubidium and strontium.
    To make matters even worse for the claimed reliability of these radiometric dating methods, these same basalts that flowed from the top of the Canyon yield a samarium-neodymium age of about 916 million years, and a uranium-lead age of about 2.6 billion years!
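    The isochron arithmetic makes it easy to see how this could happen. The Rb-Sr “age” is computed from nothing but the slope of a line through today’s measured isotope ratios, so two flows that inherited the same ratio pattern from the same mantle source must return the same number, no matter when they actually erupted. Here is a minimal sketch, with a hypothetical inherited slope and the published rubidium-87 decay constant:

```python
import math

LAMBDA_RB87 = 1.42e-11  # published decay constant of rubidium-87, per year

def isochron_age(slope):
    """'Age' implied by the slope of an Sr-87/Sr-86 vs Rb-87/Sr-86 line."""
    return math.log(slope + 1.0) / LAMBDA_RB87

# Hypothetical: both the recent rim flows and the deep basalts inherited sample
# points that already fall on a line with this slope from their mantle source.
inherited_slope = 0.0164
print(f"isochron 'age' = {isochron_age(inherited_slope)/1e9:.2f} billion years "
      "(identical for both flows, because only today's ratios enter the formula)")
```

    Whether the lava cooled last century or deep in the unobserved past, the formula returns the age implied by the slope, which on this view reflects the chemistry of the source, not the timing of the flow.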
Who would have been around millions of years ago to know what kind of contaminants might have affected the rocks, from ground water to other factors?
    Assumption 2: No Contamination: The problems with contamination, as with inheritance, are already well-documented in the textbooks on radioactive dating of rocks. The radioactive “clock” in rocks is open to contamination, by gain or loss of parent or daughter isotopes, from groundwater percolating down from rainfall and from the molten rock beneath volcanoes. Similarly, as molten lava rises through a conduit from deep inside the earth to be erupted through a volcano, pieces of the conduit wall rocks and their isotopes can mix into the lava and contaminate it.
    Because of such contamination, the less than 50-year-old lava flows at Mt. Ngauruhoe, New Zealand, yield a rubidium-strontium “age” of 133 million years, a samarium-neodymium “age” of 197 million years, and a uranium-lead “age” of 3.908 billion years!
    Assumption 3: Constant Decay Rate: Physicists have carefully measured the radioactive decay rates of parent radioisotopes in laboratories over the last 100 or so years and have found them to be essentially constant (within the measurement error margins). Furthermore, they have not been able to significantly change these decay rates by heat, pressure, or electrical and magnetic fields. So geologists have assumed these radioactive decay rates have been constant for billions of years.
    However, this is an enormous extrapolation of seven orders of magnitude back through immense spans of unobserved time without any concrete proof that such an extrapolation is credible. Nevertheless, geologists insist the radioactive decay rates have always been constant, since it makes these radioactive clocks “work”!
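    The “seven orders of magnitude” figure is simple arithmetic: roughly a century of direct laboratory observation is being stretched to cover a claimed 4.5 billion years of unobserved time:

```python
measured_span = 100       # years over which decay rates have actually been measured
claimed_span = 4.54e9     # claimed age of the Earth, in years
print(f"extrapolation factor ~ {claimed_span / measured_span:.1e}")  # ~4.5e7, on the order of 10^7
```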
    New evidence, however, has recently been discovered that can only be explained by the radioactive decay rates not having been constant in the past. For example, the radioactive decay of uranium in tiny crystals in a New Mexico granite yields a uranium-lead “age” of 1.5 billion years. Yet the same uranium decay also produced abundant helium, but only 6,000 years’ worth of that helium was found to have leaked out of the tiny crystals.
    This means that the uranium must have decayed very rapidly over the same 6,000 years that the helium was leaking. The rate of uranium decay must have been at least 250,000 times faster than today’s measured rate! 
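    The 250,000-fold figure follows directly from the two numbers just cited:

```python
uranium_lead_age = 1.5e9   # years of decay recorded by the uranium-lead "clock"
helium_leak_age = 6_000    # years of helium leakage inferred for the same crystals
print(f"implied speed-up ~ {uranium_lead_age / helium_leak_age:,.0f} times today's rate")  # ~250,000
```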
    The assumptions on which radioactive dating is based are not only unprovable but plagued with problems. As this article has illustrated, rocks may have inherited parent and daughter isotopes from their sources, or they may have been contaminated when they moved through other rocks to their current locations, or inflowing water may have mixed isotopes into them. In addition, the radioactive decay rates have not always been constant.
The Geologic Time Clock, which shows the Quaternary Period, the time man has been on the geologic earth—a mere 17 seconds on the geologic clock
    So if these clocks are based on faulty assumptions and yield unreliable results, then scientists should not trust or promote the claimed radioactive “ages” of countless millions of years, especially since they contradict the true history of the universe as recorded in God’s Word. Yet they do so constantly and with great vigor.
    What is really disheartening about all this is that geologists will not even consider the negative side of their assumptions, but cling to them as though they themselves were infallible and their assumptions unquestionable!
(See the next post, “Changing Land of Promise—Part XV and Our Changing World,” for an understanding of the Assumptions made by Geologists to date the Earth)
