Sunday, May 9, 2010

Decaying Speed-of-light?

A creationist I was debating recommended that we look at the web site '' and comment on it.

I looked at it.

Like all other creationist web sites, it is a fraud. It is actually worse than some others.

The creationist didn't reference any specific page on that site, so I got to choose which one to comment on. One of the pages discussed one of my favorite topics - the speed-of-light.

Specifically, one of the pages there talks about 'evidence' that the speed-of-light has been diminishing historically. For support it cites the work of Barry Setterfield. A diminishing speed-of-light would help to explain how light from stars many light-years away could have reached us if the Earth is only a few thousand years old.

I like this example because it not only shows that creationism is wrong, it also shows what a fraud creationism is.

Setterfield himself is a fraud. Anyone who would use his data is also a fraud. I can support those claims.

First, a bit of background.

The way that you measure the velocity of anything is to see how long it takes for that thing to move a known distance. You then divide the distance by the measured time and you thereby calculate the velocity.

The speed-of-light is VERY fast - 300,000 kilometers per second (186,000 miles per second).

If you are using a distance of 1 mile to measure that velocity, then you must be able to measure a time interval of about 5 microseconds accurately in order to calculate the speed-of-light correctly. So you either need very long distances to measure over or you need a clock capable of measuring very small time increments. Clearly, with older technologies, measuring the speed-of-light accurately was very, very difficult.
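The arithmetic behind that "about 5 microseconds" can be checked in a couple of lines (a sketch only; the constants are the standard values):

```python
# How long does light take to cross a 1-mile baseline?
# Illustrative arithmetic only, using the standard constants.

C_KM_PER_S = 299_792.458   # speed of light in km/s
MILE_KM = 1.609344         # kilometers in one statute mile

travel_time_s = MILE_KM / C_KM_PER_S
print(f"{travel_time_s * 1e6:.2f} microseconds")  # about 5.37 microseconds
```

Any clock error much larger than a few microseconds makes the measurement meaningless at that distance, which is why early experimenters needed either astronomical baselines or very fast mechanical choppers.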

When people first tried to measure the speed-of-light they didn't have technology adequate to measure it anywhere close to accurately. Galileo tried to measure it by putting people with lanterns on mountain peaks fairly far apart and timing how long it took for one person to uncover his lantern after he saw the distant lantern uncovered. But human reaction time swamped the actual time required for the light to travel from peak to peak. Galileo's result, in effect, was that the speed-of-light was infinite.

Was it?

No. (Even Setterfield says that it was much less than infinite during Galileo's lifetime.)

It's just that with Galileo's methods, the margin of error was larger than the thing that he was trying to measure.

It would be like measuring the speed-of-light as 300,000 kmps +/- 10,000,000 kmps. The measurement gets lost in the error of the measurement.

As new technologies - particularly those for measuring small time increments - have improved, the error margin has diminished. But until fairly recently the error margin was still significant.

But you can use data points with significant error margins to show any sort of trend, depending on the points that you pick.

For example, if the error margin is +/- 10,000 kmps then you might expect to see readings between 290,000 kmps and 310,000 kmps. If the error margin diminishes later to +/- 1000 kmps then you might expect to see readings between 299,000 kmps and 301,000 kmps.

If you look at ONLY the highest readings you see the measured speed-of-light going from 310,000 kmps to 301,000 kmps to 300,000 kmps. In that case it seems to be decreasing.

If you look at the lowest readings you see the speed-of-light going from 290,000 kmps to 299,000 kmps to 300,000 kmps. So it seems to be INCREASING.

Within that narrow set of data, you can see all sorts of different trends due really to nothing but the margin of error of the particular data points that you decide to use.
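That cherry-picking can be sketched in a few lines. The "measurements" below are hypothetical, chosen only to mirror the error ranges discussed above:

```python
# Sketch: how cherry-picking within error bars manufactures a trend.
# The ranges are the hypothetical error margins from the text, not real data.

eras = [
    ("early",  290_000, 310_000),   # +/- 10,000 kmps
    ("later",  299_000, 301_000),   # +/- 1,000 kmps
    ("modern", 300_000, 300_000),   # negligible error
]

highest = [hi for _, lo, hi in eras]   # pick only the top of each range
lowest  = [lo for _, lo, hi in eras]   # pick only the bottom of each range

print("highest picks:", highest)  # [310000, 301000, 300000] -> looks decreasing
print("lowest picks: ", lowest)   # [290000, 299000, 300000] -> looks increasing
```

Same underlying quantity, same error bars - opposite "trends," depending entirely on which end of each error bar you choose to keep.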

Enter Barry Setterfield.

What does he do?

He looks at the historical measurements for the speed-of-light and tries to find a trend. But he does so in a very fraudulent way.

There is a long explanation for Setterfield's fraud at but in summary Setterfield:

1. Ignores many data points

For example, in the 17th century Christiaan Huygens measured the speed-of-light as 220,000 kmps. In that same century, in 1675, Ole Roemer measured it at 200,000 kmps. Setterfield doesn't use those data points. Why not? Because they are LOWER than the current speed-of-light, so they don't match what he is trying to prove. As I showed above, if you arbitrarily ignore any set of data points you can show any trend that you want.

2. Setterfield can't match his own data to his claims.

Setterfield claims to have found a perfect match to a proposed curve showing c-decay. But even he admits that not one of his 38 data points (selected after ignoring those that he doesn't like) falls on his "perfect fit" curve.

Setterfield also has problems with statistics and other things. The web page I referenced goes into the details.

In addition, there are some common-sense reasons to reject Setterfield's data.

1. Setterfield has no objectivity whatsoever. He actually states in his paper that one of his goals is to reconcile "the observational problems of astronomy and Genesis creation .."

2. Both the Institute for Creation Research (ICR) and Answers in Genesis (AiG), two of the most prominent young Earth creationist organizations, say that this proposal has a number of problems that have not been satisfactorily answered and they both advise young Earth creationists against advocating the idea.

3. If Setterfield's curve is accurate then we should continue to see additional decay. We don't. The measured speed-of-light has not changed since at least 1975, and in all that time technology has been sufficient to measure it to within a fraction of a kmps. (In fact, since 1983 the meter has been defined in terms of the speed-of-light, which is fixed at exactly 299,792.458 kmps.)

4. Relativity prevents any speed-of-light much larger than the one we measure now - even in the past.

Einstein famously calculated that E = M * C^2. The 'C' in that equation is the speed-of-light.

Of course the atomic bombs dropped on Hiroshima and Nagasaki show the validity of that formula on a macro scale. But the formula applies to ANY atomic change. So when a radioactive particle decays, it releases energy according to E = M * C^2. The energy released from each such reaction is very small because the mass change is so small - sub-atomic. But the sum of all radioactive decay in the Earth's crust generates enough heat, by that same relation, to help keep the Earth from turning into a snowball.

Note that the energy released is proportional to the square of the speed-of-light. So **IF** the speed-of-light was ten times higher at some time in the past, then the amount of energy released from each radioactive decay was 100 times (C^2) higher. Simply put, any significant increase in the speed-of-light (and Setterfield actually says that it was 'infinite' in 4000 BC) would melt the Earth's crust.
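The square-law scaling is easy to verify directly. The mass change below is an arbitrary illustrative value; only the ratio matters:

```python
# Energy from a mass change scales with c squared. Compare today's c
# with a hypothetical 10x c (the Setterfield-style assumption).
# delta_m is an arbitrary illustrative mass defect, not a real decay value.

C = 2.99792458e8          # speed of light, m/s
delta_m = 1.0e-30         # hypothetical mass change in one decay, kg

e_now  = delta_m * C**2
e_fast = delta_m * (10 * C)**2   # if c were ten times larger

print(e_fast / e_now)  # 100.0 -> each decay releases 100x the energy
```

The ratio is independent of the mass defect chosen: whatever decays were happening, multiplying c by ten multiplies every energy release by a hundred.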

This is especially ironic because another claim that some creationists make in order to explain away radiometric dating is that radioactive decays were SIGNIFICANTLY more frequent in the past. That claim would only increase the negative effects of a larger speed-of-light.

The bottom line: creationists cannot be trusted, and the web site supplied by Jeebs is worse than most. At least the ICR and AiG don't use this particular fraudulent argument (though they use many others).
