The story goes that without taking both the special and general theory of relativity into account, the Global Positioning System (GPS) would not work: it would be off by 11 km per day. This is stated, e.g., in Stephen Hawking's bestseller A Brief History of Time, and in some books about GPS, but not all. In particular, the official U.S. government web site about GPS and related topics says nothing about the use of relativity theory. Is the US hiding something?
So what are the facts? The satellites all use the same cesium atomic clocks, which at satellite launch are slowed down by a fixed amount of 38,000 nanoseconds per day, which is about 0.5 nanoseconds per second.
Some sources present the offset as a tribute to both the special (+7,000) and the general theory of relativity (−45,000), but not all, and not the official U.S. government GPS web site.
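The quoted corrections can be checked with a few lines of arithmetic. The sketch below uses only the numbers given above; the variable names are mine.

```python
# Check that the two quoted relativistic corrections sum to the fixed
# pre-launch clock offset, and convert the offset to a per-second rate.
SR_CORRECTION_NS_PER_DAY = +7_000    # special relativity term quoted in the text
GR_CORRECTION_NS_PER_DAY = -45_000   # general relativity term quoted in the text

total_ns_per_day = SR_CORRECTION_NS_PER_DAY + GR_CORRECTION_NS_PER_DAY
print(total_ns_per_day)              # -38000: the clock is slowed 38,000 ns/day

SECONDS_PER_DAY = 24 * 60 * 60       # 86,400
print(abs(total_ns_per_day) / SECONDS_PER_DAY)  # ~0.44, i.e. about 0.5 ns per second
```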
Other sources say that the initial offset is made because the environment of the satellite makes the clocks tick a little faster than the master ground clock: not because of relativity, but because the mechanics of the clock are influenced by temperature, pressure and gravitation. In any case, the satellite clocks are continuously synchronized to an Earth-based master clock to within 1 nanosecond per second, allowing Coordinated Universal Time (UTC) to be set with that precision.
From this information alone, we understand that the initial offset of 0.5 nanoseconds per second plays no role, whatever the reason for it may be: even without the initial offset the system would work fine thanks to the continuous synchronization.
Let us now see what effect 1 nanosecond per second, that is a relative precision of $10^{-9}$, can have on the precision of the system. Both the position and the clock reading of a GPS satellite at the moment it sends a signal are encoded in the signal received by the GPS receiver in your hand. Simultaneous readings from 4 satellites allow the receiver both to synchronize its own clock with the synchronized satellite clocks and, from that, to determine the signal transit times, then the distances to the different satellites, and finally its own position. The travel time of the signal is less than 0.1 seconds, so the timing effect of a 1 nanosecond per second offset is at most $10^{-10}$ seconds, which with a speed of light of $3\times 10^8$ meters per second corresponds to 3 centimeters. In any case, the effect of a full difference of 1 nanosecond in the travel time of light is 3 decimeters = 1 foot as possible GPS precision.
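The two error figures in the paragraph above can be spelled out explicitly. This is a minimal sketch using the rounded constants from the text; the variable names are mine.

```python
# Position-error arithmetic for a GPS ranging signal.
C = 3e8                  # speed of light in m/s, rounded as in the text

transit_time = 0.1       # upper bound on signal travel time, seconds
clock_drift = 1e-9       # synchronization precision: 1 ns per second

# A drift of 1 ns/s accumulated over a 0.1 s transit gives a 1e-10 s
# timing error, hence a ~0.03 m = 3 cm ranging error:
print(clock_drift * transit_time * C)

# A full 1 ns error in the measured travel time corresponds to
# ~0.3 m = 3 decimeters, about 1 foot:
print(1e-9 * C)
```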
From where does Hawking then pick his 11 km/day? Well, it comes from multiplying $38{,}000\times 10^{-9}$ seconds/day by the speed of light $3\times 10^5$ km/second to get $11\approx 3.8\times 3$ km/day. But this has nothing to do with reality. It is deceptive fiction propagated by physicists like Clifford Will to sell the message that Einstein's theory of relativity has massive experimental support.
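The 11 km figure can be reproduced in one line: it treats the whole daily accumulated clock offset as if it were a one-way ranging error. A minimal sketch, using the numbers above:

```python
# Reconstruct the 11 km/day figure: multiply the full daily clock
# offset (38,000 ns accumulated over one day) by the speed of light.
offset_s_per_day = 38_000e-9   # seconds of clock offset per day
c_km_per_s = 3e5               # speed of light in km/s

print(offset_s_per_day * c_km_per_s)   # ~11.4 km/day
```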
Why do physicists deceive us with fake physics? How can 3 centimeters (or decimeters) be twisted into 11 km?
Maybe the master clock on Earth also has an adjusted rate?
Very interesting, Claes. I agree with you. There is no need to adjust the clocks to compensate for the effect of relativity theory (if any) when we can just calibrate and synchronize them constantly. However, the usual error of atomic clocks is not more than 10 ns per year. With this accuracy in mind, slowing down the atomic clocks by 0.5 ns per second is not a small adjustment. Isn't it striking that this large value coincides with the claimed effect of relativity theory? I hate to be mainstream, so I really hope you can help me on track :)