An interview with TeleNav’s Bob Rennard
Most of us have had times when GPS reception was challenging, and it’s easy to make a leap in logic and blame it on current environmental conditions. Today, I’m happy to say that Bob Rennard, TeleNav co-founder and Chief Technical Officer, has agreed to tackle these issues head-on in an interview with GPS Tracklog. So let’s see if we can clear up some GPS reception myths and misconceptions…
Bob, can you tell our readers a bit about your background and what role you played in the development of GPS?
I was initially assigned responsibility for the concept validation phase ground control system that collects GPS observations, and computes the satellite orbit and satellite clock behavioral parameters that are then broadcast on the downlink to the mobile receivers. Unlike most large systems that are integrated by a defense contractor, GPS system integration was performed by the Air Force. Accordingly, most of my time was spent working on the behavioral and performance specifications for the different system segments, and the interface control documents that describe how the segments interact. I was also responsible for the radio frequency registration filings that passed through the US government to the UN World Administrative Radio Conference in 1979.
My understanding has been that the primary things that block GPS signals are water and metal, but that foliage and rain should not impact reception; yet many people report problems with the last two. Can you set us straight on this? Will a thin film of water on canopy, your car, the antenna of a handheld GPS receiver, etc., become an effective block?
Rainfall should not cause noticeable degradation of GPS signals, unlike its impact on satellite television, which operates in a different part of the radio spectrum. Dense foliage can totally block the signals. Some amount of water on the antenna is not an issue, but a deposit of ice or snow could be. The signals are strong enough to overcome the expected levels of attenuation from rainfall or snowfall. GPS is, after all, a military system.
To what extent does a moving GPS receiver impact its ability to lock onto satellites?
The impact on GPS signal acquisition is negligible. A car speeding at 112 miles per hour is moving at 50 meters per second. The satellites are moving at 3,873 meters per second. The relative speed along the line of sight between the satellite and the receiver determines how much the GPS downlink signal is Doppler-shifted in frequency. This line-of-sight speed is dominated by the satellite speed, and the receiver’s speed is inconsequential.
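To put numbers on this, here is a quick back-of-the-envelope sketch (mine, not from the interview) converting those line-of-sight speeds into Doppler shifts on the GPS L1 carrier; the function name is illustrative:

```python
# Back-of-the-envelope Doppler shift on the GPS L1 carrier.
# Uses the speeds quoted above as if each were the full
# line-of-sight closing speed (an upper bound).
C = 299_792_458.0   # speed of light, m/s
F_L1 = 1_575.42e6   # GPS L1 carrier frequency, Hz

def doppler_hz(los_speed_mps: float) -> float:
    """Doppler shift (Hz) for a given line-of-sight closing speed."""
    return los_speed_mps / C * F_L1

print(round(doppler_hz(3873)))  # satellite orbital speed: ~20,353 Hz
print(round(doppler_hz(50)))    # car at 50 m/s (112 mph): ~263 Hz
```

The roughly 20 kHz contribution from the satellite dwarfs the few hundred hertz a fast car adds, which is the point Bob is making.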
Does holding GPS receivers close together really interfere with reception? How close do they need to be for this to happen and how big of an impact is one likely to see on accuracy? Do other things cause interference, such as other RF receivers?
Two receivers of the same electrical design could interfere with each other if placed in close proximity, since processing signals from one receiver could radiate from its antenna and interfere with the other. It is hard to say what the exact level of degradation would be, since it would likely be a function of the designs and their mutual proximity. Combinations of other signals in the environment could produce intermodulation that would be disruptive to a GPS receiver, but I have never witnessed it in person. The high-speed digital circuits in PCs can also radiate interference to GPS receivers.
What level of increased accuracy will consumers see from GPS Block IIF and III satellites, and how soon (i.e., how many of each will need to be operational before we see those benefits)?
The presence of the L2C signal will allow consumer receivers to calibrate the ionospheric delay better than the current Klobuchar model can predict it, improving accuracy. The current SPS (Standard Positioning Service) specification is pretty loose when compared to the actual performance we see from GPS with WAAS augmentation. We tested one receiver over several hours, and it never produced a fix outside of a 4-meter by 8-meter rectangle. I am hesitant to suggest that with L2C, and without WAAS, accuracy will be much improved. In any case, at least four of the visible satellites will have to be modernized satellites to fully see the benefits. Given the reliability of the satellites, they only need to be replaced at the rate of 3 to 4 per year. Accordingly, it will take a few years before we see the benefits.
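For readers curious how a second civil frequency removes ionospheric delay, here is a minimal sketch of the textbook ionosphere-free pseudorange combination; the pseudorange and delay values below are made up purely for illustration:

```python
# Dual-frequency ionosphere-free pseudorange combination.
# Ionospheric delay scales as 1/f^2, so measuring the same range on
# two frequencies lets the receiver solve the delay out entirely.
F1 = 1_575.42e6  # L1 carrier, Hz
F2 = 1_227.60e6  # L2 carrier, Hz

def iono_free(p1: float, p2: float) -> float:
    """Ionosphere-free combination of L1/L2 pseudoranges (meters)."""
    g = (F1 / F2) ** 2
    return (g * p1 - p2) / (g - 1)

# Made-up example: true range 20,000,000 m, +5 m of ionospheric delay
# on L1; the L2 delay is larger by the factor (F1/F2)^2.
true_range = 20_000_000.0
delay_l1 = 5.0
p1 = true_range + delay_l1
p2 = true_range + delay_l1 * (F1 / F2) ** 2

print(iono_free(p1, p2) - true_range)  # residual ≈ 0: ionosphere removed
```

A single-frequency receiver, by contrast, can only subtract the Klobuchar model’s prediction of that delay, which is where the accuracy gap Bob describes comes from.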
Do CDMA phones have any advantage over GSM phones, due to fine time information transmitted over the network? Does this impact anything other than acquisition time?
The coordination of the CDMA network with GPS time does allow the receivers in CDMA phones to acquire satellite signals more quickly than receivers on other networks. In addition, CDMA networks offer a “hybrid” mode that uses base station trilateration as a supplement to GPS when an insufficient number of GPS signals are available.
I realize that this will vary greatly by model and design, but why is it so difficult for smartphones to lock onto satellites without a cellular connection?
Aiding is very important for getting a fix quickly. Satellites that are visible at a cellular base station will also be visible at any handset the base station services, and the Doppler offsets in the signals will be almost identical. This information radically reduces the time and frequency uncertainty that an unaided receiver must search. A receiver that has acquired GPS signals must then spend 18 to 30 seconds demodulating the orbit and satellite clock behavioral parameters, but an aided receiver can get this same data (about 25 kilobits for all satellites on orbit) in a couple of seconds. A GPS receiver with a lot of signal processing power, such as one of the Bluetooth-connected GPS “pucks”, is able to start navigating from a cold start in about 40 seconds with no aiding.
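The 18-to-30-second figure follows directly from the structure of the legacy GPS navigation message, which can be sketched with a little arithmetic; the assist-channel data rate below is my assumption for illustration, not a number from the interview:

```python
# Why a cold receiver needs 18-30 s of signal but an aided one ~2 s.
# The legacy GPS nav message runs at 50 bits/s in 6-second subframes;
# the satellite clock and orbit (ephemeris) data occupy subframes 1-3
# of each 30-second, five-subframe frame.
SUBFRAME_S = 6
FRAME_S = 5 * SUBFRAME_S     # one full frame: 30 s

best_case = 3 * SUBFRAME_S   # tune in right at subframe 1: 18 s
worst_case = FRAME_S         # tune in mid-frame: wait out a full frame
print(best_case, worst_case)  # 18 30

# Aided: the same data for the whole constellation ("about 25 kilobits")
# arrives over the network instead. 14.4 kbit/s is an assumed,
# illustrative cellular data rate.
ALL_SATS_BITS = 25_000
ASSIST_BPS = 14_400
print(round(ALL_SATS_BITS / ASSIST_BPS, 1))  # 1.7 -> "a couple of seconds"
```

And that is for one satellite over the air versus the entire constellation over the network, which is why aided starts feel so much faster.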
Bob, thank you for taking the time to answer our questions!