We use a GPS data analysis system based on the GIPSY software developed at JPL. Most of the GIPSY programs are called by shell scripts written by Jeff Freymueller. Using these scripts, we can analyze a large amount of data either as part of network solutions or in Precise Point Positioning (PPP) mode.
Where do you put RINEX files?
RINEX files should be put into the hopper, $RAWDATA/hopper. What, you don't have RINEX files yet? See RINEXing. Once files are in the hopper, you can either let the first processing stages happen automatically overnight (see next section), or run the autofront and autoclean programs manually.
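For example, if you have a RINEX observation file in hand (the filename below is hypothetical) and autofront and autoclean are on your path, the manual route looks like this:

    cp fair1400.06o $RAWDATA/hopper/   # drop the RINEX file into the hopper
    autofront                          # run the first processing stage by hand
    autoclean                          # then the automated cleaning stage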
What happens automatically?
Autoftp
Autoftp is an efficient data-fetching tool that uses wget to automatically retrieve data from any of several internet GPS data archives. It reads a list of desired sites from a request file, whose filename contains the date, and attempts to find and download data for as many sites as possible. It is intended to run automatically every day under cron, and when accompanied by another simple program that generates a standard request file each day, it can fetch a standard set of sites for analysis. Because it records in the request file which sites it has already found, autoftp can be run multiple times with the same request file without repeatedly fetching the same data. This is ideal for the real world, in which data from some sites are available rapidly while data from other sites may take many hours or days to become available.
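A minimal shell sketch of the bookkeeping idea follows; it is not the real script. It assumes a request file named with a 2-digit year and day of year (e.g. request.06.140), one 4-character site code per line, a placeholder archive URL, and UNIX-compressed RINEX short names like fair1400.06o.Z; the real autoftp's request-file format and archive list may differ.

    #!/bin/sh
    # Sketch of autoftp-style fetching: try each unfetched site with wget
    # and mark successes as "found" so reruns skip them.
    req=$1                                   # e.g. request.06.140
    yy=$(echo "$req" | cut -d. -f2)          # 2-digit year from the filename
    doy=$(echo "$req" | cut -d. -f3)         # day of year from the filename
    archive=ftp://archive.example.org/pub/rinex   # placeholder archive URL
    while read site status; do
        if [ "$status" = "found" ]; then
            echo "$site found"               # fetched earlier; keep the mark
            continue
        fi
        if wget -q "$archive/$yy/$doy/${site}${doy}0.${yy}o.Z"; then
            echo "$site found"               # mark it so later runs skip this site
        else
            echo "$site"                     # not available yet; try again later
        fi
    done < "$req" > "$req.new" && mv "$req.new" "$req"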
Autofront
Autoclean
Where do the data files go?
Data files from each station are stored in the QM format that is native to GIPSY. QM files (and all other files) are stored in directories by GPS week. For each week directory there are several subdirectories; qm files are stored in $ANALYSIS/wwww/qm, where wwww is the 4-character GPS week number (with a leading zero if needed).
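The GPS week counts from the GPS epoch of 6 January 1980, so the week directory for any date can be computed directly. A small example, assuming GNU date:

    # Print the qm directory for today's data. 604800 = seconds per week.
    epoch=$(date -u -d 1980-01-06 +%s)       # GPS epoch
    now=$(date -u +%s)
    wwww=$(printf '%04d' $(( (now - epoch) / 604800 )))
    echo "$ANALYSIS/$wwww/qm"                # e.g. $ANALYSIS/1375/qm in May 2006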
Running Solutions
Solve, a very flexible script. (link to detailed help)
Philosophy of solve
Subnet files and campaign files
Standard solutions
(text of standard_Alaska_solution)
Running several days at once: make-make-flt and make-alaska
(text of a standard make-alaska file and variant)
Running several weeks at once
(text of sample rerun-* file)
Cleaning Solutions
Initial explanation of terms
Expected residuals from a clean solution
Automated screening: postfit, the point file, postbreak
Checking for bad pseudorange data: badp, allbadp
Removing biased pseudorange data: del_pcode_arc
Automatically identified cycle slips: breaks
Quickly scanning through residuals: short_hand
Limitations of short_hand
Manually checking residuals and fixing problems (link)