This is part 1 of a 3-part blog.
Since California Surveying & Drafting Supply was founded 30 years ago, GPS has evolved from a handful of NAVSTAR satellites into part of a worldwide set of constellations now referred to as GNSS, or Global Navigation Satellite Systems, used to pinpoint positions anywhere in the world.
GPS has changed the way we live and work. But the real revolution isn’t asking your smartphone for directions to the nearest Mexican restaurant. It’s the professional-grade positioning solutions that save time and money for surveyors, engineers, builders – even farmers.
Recently, we sat down with CSDS President Tom Cardenas and Systems Administrator Ed Morrison, who oversees California’s largest GNSS network, covering 93,000 square miles from Redding to Bakersfield. In a three-part Q&A series, Tom and Ed will explore:
- How has GPS changed the surveying industry?
- Who uses the CSDS real-time network – and why?
- What are the challenges – now and in the future?
Q: What did the workflow used to look like for surveyors and other professionals who needed precision positioning?
TOM: Before the advent of cellular-based internet technology, real-time, high-accuracy GPS could only be achieved by the end user operating two GPS receivers:
One is a base, or reference, receiver that is typically set up on a tripod in a fixed position. The other, the roving receiver, is carried around to measure points and relies on a relative correction from the tripod-based receiver. So to achieve high-accuracy GPS data collection, within a centimeter or two, you were required to purchase a base receiver. You also needed a UHF radio to communicate the corrections from that base receiver to the roving receiver. The roving receiver took those corrections over the air from that radio, and that's how you achieved real-time precision and accuracy on the roving side.
When these systems first came out, you were looking at about $60,000 to $70,000 for the two receivers, plus the radio that created the communication between those two receivers, your data collector and processing software, and several other costs. Obviously, if you had a stationary base receiver, you had to be cognizant of where you were setting it up because you didn't want to leave it unsupervised. So, you had to factor in either having a fenced-in location to keep that receiver running while you were out with the rover, or you'd have to hire somebody to sit and watch the receiver. And for these high-powered UHF radios, the user had to obtain proper FCC licensing, which came with a pretty limited band of frequencies and power restrictions, too.
There was also a limit to the distance that you could go from that base station. Using the rule of thumb of a one-part-per-million error rate, that meant for every six miles from the base receiver, you were introducing about 1 centimeter of error into your solution. Beyond 7 to 10 miles, you'd potentially lose radio reception between yourself and the base. That's just the way it was. But technology evolves fairly rapidly.
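Tom's rule of thumb is simple arithmetic: at one part per million, every mile of baseline adds roughly 1.6 millimeters of error. A quick sketch in Python (a hypothetical illustration, not part of any surveying software):

```python
# Illustration of the 1 ppm rule of thumb for single-base RTK.
# Assumption: distance-dependent error grows linearly at one part
# per million of the baseline length.

METERS_PER_MILE = 1609.344
PPM = 1e-6  # one part per million

def baseline_error_cm(baseline_miles: float) -> float:
    """Approximate distance-dependent error, in centimeters."""
    baseline_m = baseline_miles * METERS_PER_MILE
    return baseline_m * PPM * 100  # meters -> centimeters

for miles in (6, 12, 30):
    print(f"{miles:2d} miles from base -> ~{baseline_error_cm(miles):.1f} cm of error")
```

At 6 miles the result lands right at the "about 1 centimeter" Tom describes, and the error keeps growing linearly from there.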
From about the mid-'90s to the early 2000s, Real-Time Kinematic GPS, or RTK, was in its infancy. As cell phones became more popular and the whole wireless revolution came about, Trimble saw the opportunity to provide these corrections over a cellular link – over the internet – to the rover, instead of from a radio over the air. Trimble started utilizing what are referred to as CORS stations (Continuously Operating Reference Stations), which are permanently installed GPS receivers, typically attached to buildings or other ruggedized, permanent mounts.
Around 2004, we installed our first CORS station, and we were so excited. We got the antenna mounted on our roof, and we had the receiver down here in the office. So now, we had a reference receiver permanently mounted on our building, and we were offering up this correction over the single base station. That eliminated the need for customers working within, say, 20 miles of our office to purchase their own base station.
As the technology evolved, Trimble developed a system where instead of using just one reference station, we were able to create a network of multiple reference stations, as long as those stations were within 45 kilometers, or about 28 miles, of each other. As our network grew, we eliminated the range factor, because as long as you were working within the perimeter of that network, you were able to get those corrections.
That brings us up to today, with the CSDS real-time network.
ED: It’s a high-accuracy, Real-Time Kinematic (RTK) GNSS network. It’s the largest network of its kind in the state of California, running from Redding to Bakersfield, from the Nevada border over to the coast. We have 42 stations in the network, split into two subnets: one serving the valleys and the other serving the coast. Our network is fully redundant and constantly monitored, and it works with virtually all major surveying, GIS, machine control, or agricultural GPS systems.
Q: How do your customers use the network?
ED: They’ll take their GPS equipment out into the field and establish an internet connection with some kind of internet device, usually a Wi-Fi-enabled smartphone or a MiFi device. Customers can access the network from anywhere that has cell phone coverage. So, we’re talking about two different networks here – our survey network versus the cellular network. The way our survey network has been designed, so long as you’re within its boundaries, you can get centimeter-level precision. But since those corrections are delivered over the cellular network, the limitation for this service is whether there’s actually cell service in the area where you’re trying to survey.
TOM: As long as you’re working within the perimeter of our network – and, of course, as long as you have cellphone coverage – you can receive corrections. The largest source of error in GPS is ionospheric – essentially, the atmosphere causing a distortion in the signal from the satellite to the receiver itself. But the Trimble software models the effects of the ionosphere on the GPS signal.
ED: The signal from the satellite is based on time and range. If your distance from point A to point B is 10 miles, but in between point A and point B there is some kind of atmospheric noise or obstruction pushing that signal off its direct path, then the signal's travel time from each GPS satellite isn't what it would be along a straight line, and that causes a deviation in the range the receiver measures.
TOM: We’re dealing with the speed of light. Any minor interruption or distortion of that signal can cause a significant change in the calculated distance between the satellites and receivers.
ED: Now, we have all these reference stations around the state capturing GPS or GNSS signals from satellites. All this distortion is happening everywhere, at every location. So each station takes that raw data and ships it via the internet to servers in our data center. Our servers then take all these feeds and mathematically model out all these distortions. So now the signals become theoretically straight again, and that’s what gives us the high precision over such a large area. That’s when somebody can go out there with a rover, tie into the network, and know where they are within a centimeter, without the parts-per-million error that you would have when you were dealing with a single base.
TOM: If you’re using only one base or fixed reference station, the further you go, the more variation you’re going to have in the atmosphere between those two locations, which creates a higher potential for error. But with multiple stations running software to model out the ionosphere, we’ve essentially eliminated that parts-per-million error over our entire network.
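The idea of surrounding a rover with reference stations can be sketched with a toy model: if several stations each measure the local atmospheric delay, a rover inside the network can use an estimate blended from its neighbors instead of trusting one distant base. This is only a conceptual illustration using inverse-distance weighting with made-up numbers; real network RTK software (such as Trimble's) uses far more sophisticated ionospheric models:

```python
# Toy sketch: estimating an atmospheric correction at a rover's position
# from surrounding reference stations via inverse-distance weighting.
# Hypothetical station layout and delay values, for illustration only.

import math

# (x_km, y_km, measured ionospheric delay in cm) for four stations
STATIONS = [(0, 0, 3.0), (40, 0, 3.6), (0, 40, 2.8), (40, 40, 3.4)]

def interpolated_delay(x: float, y: float) -> float:
    """Inverse-distance-weighted delay estimate at rover position (x, y)."""
    num = den = 0.0
    for sx, sy, delay in STATIONS:
        d = math.hypot(x - sx, y - sy)
        if d < 1e-9:           # rover is at a station: use its value directly
            return delay
        w = 1.0 / d
        num += w * delay
        den += w
    return num / den

# Inside the network perimeter, the estimate tracks local conditions
# rather than the conditions at one far-away base:
print(f"estimated delay at (20, 20): ~{interpolated_delay(20, 20):.2f} cm")
```

The point of the sketch: a rover in the middle of the network gets a correction shaped by all of its nearby stations, which is why the distance-dependent error of a single base largely disappears inside the network perimeter.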
Stay tuned for Part 2 of our GPS series:
Who uses the real-time network, and why?
In the meantime, leave your questions for Tom and Ed in the comments below, and we’ll answer them.