Another explanation that creationists use to account for light reaching us from places more than 6,000 light-years from Earth is that the speed of light was much higher in the past. There are many problems with this explanation as well; I will address one of them.
The speed of light is more than just the speed at which light travels through a vacuum. It is one of the most fundamental constants in the universe. Most people are familiar with the equation that Albert Einstein derived as part of the Theory of Relativity: E = mc². While most people have heard of that formula, not everyone knows that the ‘c’ in it stands for the speed of light.
That equation describes the amount of energy (E) that results when a quantity of mass (m) is converted. Because the speed of light is such a large number, even a small amount of mass yields an enormous amount of energy. Most famously, the atomic bombs dropped on Hiroshima and Nagasaki released very large amounts of energy from relatively small amounts of mass.
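To get a feel for the numbers, here is a small illustrative calculation in Python. The figures are deliberately rough (a round value for c, an arbitrary one gram of mass), so treat it as a sketch of the scale involved rather than the yield of any particular bomb.

    # Rough illustration of E = m * c**2, using round numbers.
    c = 3.0e8          # speed of light, in meters per second (approximate)
    m = 0.001          # one gram of mass, in kilograms

    energy = m * c**2  # energy in joules
    print(f"Converting 1 gram of mass releases about {energy:.1e} joules")
    # ~9e13 J, roughly the energy released by about 20 kilotons of TNT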
Atomic bombs are not the only place where this equation is relevant, however. Mass changes at the atomic level as well. Every time a radioactive decay occurs, such as when an atom of uranium decays and is detected by a Geiger counter, energy is given off, and the amount can be calculated from this same equation. Because the mass involved in a single atom’s decay is so tiny, the amount of energy is relatively small as well. Nonetheless, the total energy released by such radioactive decays in the Earth’s crust is enough to keep the Earth from turning into a cold and frozen rock[1].
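As a concrete (and approximate) example, the same equation gives the energy of a single uranium-238 alpha decay from the difference between the atomic masses before and after the decay. The masses below are rounded standard values; the point is the order of magnitude, not the last decimal place.

    # Approximate energy of one U-238 alpha decay, from the change in mass.
    # Atomic masses in unified atomic mass units (u), rounded standard values.
    m_U238  = 238.050788   # uranium-238 (before the decay)
    m_Th234 = 234.043601   # thorium-234 (the decay product)
    m_He4   = 4.002602     # helium-4 (the alpha particle)

    mass_defect_u = m_U238 - (m_Th234 + m_He4)  # ~0.0046 u of mass "disappears"
    energy_MeV = mass_defect_u * 931.494        # 1 u of mass is equivalent to 931.494 MeV
    energy_J = energy_MeV * 1.602e-13           # convert MeV to joules

    print(f"About {energy_MeV:.2f} MeV, or {energy_J:.2e} J, per decay")
    # ~4.27 MeV (~6.8e-13 J): tiny per atom, but it adds up across the whole crust.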
So what would happen if the speed of light were significantly higher than it is now? Let’s say, hypothetically, that it was formerly ten times its present value. What effects would that have?
Since Einstein’s equation converts mass to energy in proportion to the square of the speed of light (c²), the amount of energy released by each radioactive decay in the Earth’s crust would be 10², or 100, times greater than it is now.
That would melt the Earth’s crust.
Furthermore, the Sun’s energy is also nuclear energy: fusion energy released as hydrogen is converted into helium. That energy is likewise proportional to the square of the speed of light, so the Sun would shine 100 times brighter than it does now.
That would kill all life on Earth – as would the melting of the Earth’s crust.
The bottom line: if the speed of light were ten times what it is now, all life on Earth would die.
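Here is a small sketch of that scaling argument in Python. The per-decay energy is an approximate figure for uranium-238; the detail that matters is that the amount of mass converted stays the same while c², and therefore the energy, grows a hundredfold.

    # If c were 10 times larger, E = m * c**2 says each decay releases 10**2 = 100x the energy.
    c_now = 3.0e8                    # current speed of light, m/s (approximate)
    c_fast = 10 * c_now              # the hypothetical faster speed of light

    energy_per_decay_now = 6.8e-13   # ~4.3 MeV per uranium-238 decay, in joules (approximate)
    mass_defect = energy_per_decay_now / c_now**2    # mass converted per decay (unchanged)
    energy_per_decay_fast = mass_defect * c_fast**2  # same mass, much bigger c

    print(f"Energy multiplier: {energy_per_decay_fast / energy_per_decay_now:.0f}x")
    # 100x: every decay in the crust, and every fusion reaction in the Sun, scaled up 100-fold.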
To see just how absurd this can get, consider that some creationists insist the speed of light may have been a billion times higher than it is now during and soon after the creation week! That would multiply the energy released by every nuclear process by a factor of 10¹⁸ (one billion squared). Each individual radioactive decay would release a billion billion times as much energy as it does now, and the Sun and the other stars would shine so brightly that there would never be any darkness anywhere in the universe.
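The same arithmetic, again with an approximate per-decay energy purely to convey the scale:

    # A billion-fold increase in c multiplies every mass-to-energy conversion by (1e9)**2 = 1e18.
    factor = (1e9) ** 2
    energy_per_decay_now = 6.8e-13              # joules per uranium-238 decay today (approximate)
    energy_per_decay_then = energy_per_decay_now * factor

    print(f"Multiplier: {factor:.0e}")                   # 1e+18
    print(f"Per decay:  {energy_per_decay_then:.1e} J")  # ~6.8e+05 J
    # Roughly 680,000 joules, about the energy of 160 grams of TNT, from every single
    # decaying atom, with countless such decays happening in the crust every second.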
(I can’t resist mentioning that most creationists are fond of pointing out how finely tuned the universal constants are and of suggesting that this “fine-tuning” is evidence of a creator God. It is amusing, therefore, how quickly they will suggest that one of the most fundamental of those constants could be changed willy-nilly with no ill effects on the universe! Consistency is not a common characteristic among creationists.)
[1] Wikipedia, “Age of the Earth,” http://en.wikipedia.org/wiki/Age_of_the_Earth (accessed December 30, 2008), which includes a discussion of this effect.