<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
		<id>http://gps.alaska.edu/internal/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Jeff</id>
		<title>GeodesyLab - User contributions [en]</title>
		<link rel="self" type="application/atom+xml" href="http://gps.alaska.edu/internal/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Jeff"/>
		<link rel="alternate" type="text/html" href="http://gps.alaska.edu/internal/index.php/Special:Contributions/Jeff"/>
		<updated>2026-04-18T14:15:01Z</updated>
		<subtitle>User contributions</subtitle>
		<generator>MediaWiki 1.25.1</generator>

	<entry>
		<id>http://gps.alaska.edu/internal/index.php?title=Kinematic_Processing&amp;diff=2896</id>
		<title>Kinematic Processing</title>
		<link rel="alternate" type="text/html" href="http://gps.alaska.edu/internal/index.php?title=Kinematic_Processing&amp;diff=2896"/>
				<updated>2013-10-24T17:30:06Z</updated>
		
		<summary type="html">&lt;p&gt;Jeff: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;==Workflow==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;define subnet file (including the clock site, a couple of non-kinematically estimated sites, and the ones you are really interested in)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;define namelist (nml) file (RANDOM-WALK!); ideally put them into /home/akda/Kinematic . That's also where examples live&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;adapt script that calls solve (check whether you need a rapid solution -- when orbits aren't in yet)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;run solution&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;extract data from .trop files&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;plot results.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==Temporal Resolution below 5 min==&lt;br /&gt;
The standard solutions run with a temporal resolution of 5 minutes (6 minutes for data older than 1995). This means the standard qm files do not contain enough information if you need a finer resolution. Here is what to do:&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;create campaign directory: &amp;lt;code&amp;gt;newcamp /gps/data/CAMPAIGN_NAME-interval&amp;lt;/code&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;create qm files for all rinex files from sites that go into that campaign for the respective time span:&amp;lt;br/&amp;gt;&lt;br /&gt;
    &amp;lt;center&amp;gt;&lt;br /&gt;
        &amp;lt;code&amp;gt;&lt;br /&gt;
         rnx2qm RINEX-file -dir $procdir -interval 30&lt;br /&gt;
&lt;br /&gt;
         or&lt;br /&gt;
&lt;br /&gt;
         reprocess_rinex station-name start-day end-day rinexdir outdir interval&lt;br /&gt;
        &amp;lt;/code&amp;gt;&lt;br /&gt;
    &amp;lt;/center&amp;gt;&lt;br /&gt;
where $procdir=/gps/data/CAMPAIGN_NAME-interval, i.e. the campaign directory you just created.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;copy the orbits for the respective day(s) into /gps/data/CAMPAIGN_NAME-interval/orbit&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;add the &amp;lt;code&amp;gt;-nosvclkfile&amp;lt;/code&amp;gt; flag to your &amp;lt;code&amp;gt;solve&amp;lt;/code&amp;gt; call, to make sure the satellite clocks are not used ...&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;or get high resolution clocks for the respective days from [ftp://sideshow.jpl.nasa.gov/pub/gipsy_products/hrclocks/ ftp://sideshow.jpl.nasa.gov/pub/gipsy_products/hrclocks/] and copy those into /gps/data/CAMPAIGN_NAME-interval/orbit &amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;if you got the high resolution clocks, make sure to rename them: &amp;lt;code&amp;gt;mv *.tdp *.tdpc&amp;lt;/code&amp;gt; &amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;make sure to update the $CAMP variable in the make-xxxxx file that you copy into your /gps/data/CAMPAIGN_NAME-interval/flt directory&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
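The list above can be collected into one shell helper (a sketch only: newcamp and rnx2qm are the scripts named above, while the wrapper itself, the rinex filename, and the 30-second interval are illustrative placeholders):

```shell
# Sketch of the sub-5-minute setup described above. newcamp and rnx2qm are the
# scripts named in the text; the rinex name and 30 s interval are placeholders.
setup_highrate_campaign() {
  procdir=$1        # e.g. /gps/data/CAMPAIGN_NAME-30s
  newcamp "$procdir"                                  # campaign directory
  rnx2qm site12340.07o -dir "$procdir" -interval 30   # 30-second qm file
  cp orbits/* "$procdir/orbit"        # orbits (and any hrclocks) for the days
  cd "$procdir/orbit"
  for f in *.tdp; do mv "$f" "${f}c"; done   # rename hrclock *.tdp to *.tdpc
}
```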
&lt;br /&gt;
==Standard/Automated Processing for Kinematic PPP Solutions==&lt;br /&gt;
&lt;br /&gt;
There are some new scripts available to make it quick and easy to do high rate static and kinematic&lt;br /&gt;
solutions. These implement a standard solution using 1 sample per second data, otherwise following the 3.0&lt;br /&gt;
solution strategy.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ol&amp;gt;&lt;br /&gt;
  &amp;lt;li&amp;gt;Create a new campaign directory using the &amp;lt;code&amp;gt;newcamp&amp;lt;/code&amp;gt; script. Copy the orbit files&lt;br /&gt;
       you will need for whatever days you are processing. Also, collect the 1 sample per second rinex files&lt;br /&gt;
       in some directory (not in the campaign data structure).&amp;lt;/li&amp;gt;&lt;br /&gt;
  &amp;lt;li&amp;gt;Run the script &amp;lt;code&amp;gt;f-e.kinematic&amp;lt;/code&amp;gt; to create 1 sample per second qm files. The full command line is&lt;br /&gt;
       &amp;lt;code&amp;gt;f-e.kinematic rinexdir [campdir]&amp;lt;/code&amp;gt;. The first argument is the directory where your&lt;br /&gt;
       rinex files (compressed .Z or .gz files) are, and the optional second directory is the&lt;br /&gt;
       top-level campaign directory (default is &amp;lt;code&amp;gt;$CAMP&amp;lt;/code&amp;gt;).&amp;lt;/li&amp;gt;&lt;br /&gt;
  &amp;lt;li&amp;gt;You will need to check the data for cycle slips and other problems using a high-rate static&lt;br /&gt;
       solution. Run the script &amp;lt;code&amp;gt;make_static_pppsolve_script&amp;lt;/code&amp;gt; in &amp;lt;code&amp;gt;$CAMP/flt&amp;lt;/code&amp;gt;&lt;br /&gt;
       to generate a script that will run such a solution for all qm files. You may&lt;br /&gt;
       want to edit the resulting script &amp;lt;code&amp;gt;make-static&amp;lt;/code&amp;gt; to add lines defining the $CAMP&lt;br /&gt;
       variable so that you can run it more automatically.&amp;lt;/li&amp;gt;&lt;br /&gt;
  &amp;lt;li&amp;gt;Check the residuals, add ambiguity parameters as needed. Same as a regular solution.&amp;lt;/li&amp;gt;&lt;br /&gt;
  &amp;lt;li&amp;gt;Run PPP solutions for all stations using the script &amp;lt;code&amp;gt;run_3.0_kinematic_solution&amp;lt;/code&amp;gt;.&lt;br /&gt;
       Full syntax is &amp;lt;code&amp;gt;run_3.0_kinematic_solution -date yymmmdd [-update]&amp;lt;/code&amp;gt;. The script&lt;br /&gt;
       will run all qm files for that date, putting the solutions in the sub-directory PPP3.0kin.&lt;br /&gt;
       The time-dependent parameter files (TDP files) are in the $CAMP/trop directory. With the -update flag,&lt;br /&gt;
       the script will only run qm files that have not been run before.&amp;lt;/li&amp;gt;&lt;br /&gt;
  &amp;lt;li&amp;gt;Extract the time-dependent solutions for each station and make SAC files using the script&lt;br /&gt;
       &amp;lt;trop2enu&amp;gt;. Give it the .trop file as an argument, and it will create corresponding .enu and .SAC files.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ol&amp;gt;&lt;br /&gt;
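The six numbered steps might be driven by a wrapper along these lines (a hedged sketch: the script names come from the text, but this wrapper, its arguments, and the directory handling are invented; step 1, newcamp plus copying orbits, is assumed done by hand):

```shell
# Hypothetical end-to-end driver for the kinematic PPP workflow above.
# All script names are from the text; the wrapper and arguments are invented.
kinematic_ppp_day() {
  rinexdir=$1       # 1 Hz rinex files, outside the campaign tree
  date=$2           # yymmmdd, e.g. 06jul01
  f-e.kinematic "$rinexdir" "$CAMP"        # 2. make 1 Hz qm files
  cd "$CAMP/flt"
  make_static_pppsolve_script              # 3. builds make-static
  ./make-static                            #    high-rate static solutions
  # 4. check residuals / add ambiguities here, as for a regular solution
  run_3.0_kinematic_solution -date "$date" -update   # 5. PPP solutions
  for t in "$CAMP"/trop/*.trop; do trop2enu "$t"; done   # 6. .enu / .SAC
}
```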
&lt;br /&gt;
==Defining Custom Solution Parameters==&lt;br /&gt;
Use an NML file to define the solution parameters.   Here is an example to start with.&lt;br /&gt;
&lt;br /&gt;
 $PREP                       !namelist input for preprefilter           &lt;br /&gt;
 ! reference clock is REQUIRED.  you need to use one of &lt;br /&gt;
 ! your sites, generally one with a stable clock.  Mostly&lt;br /&gt;
 ! it means anything except a trimble.  If an external&lt;br /&gt;
 ! oscillator is used, then it can be a reference clock.&lt;br /&gt;
   REFCLOCK = 'THU1' &lt;br /&gt;
 ! just says whether the troposphere will be estimated stochastically.&lt;br /&gt;
 ! except for very short baselines, this is always true.&lt;br /&gt;
   WETZTROP  = .TRUE.                                                    &lt;br /&gt;
 ! This is the white noise sigma applied to phase ambiguity parameters &lt;br /&gt;
 ! whenever a new ambiguity needs to be estimated (.1 seems to work).&lt;br /&gt;
   yaddsg = 0.1&lt;br /&gt;
 ! this is the random walk constraint for the troposphere, in km/sqrt(sec)&lt;br /&gt;
   TROPDRIFT = 1.70D-7                                                   &lt;br /&gt;
 ! if you want to estimate additional parameters stochastically, you&lt;br /&gt;
 ! put them here.  You don't have to put clocks here, because GIPSY&lt;br /&gt;
 ! assumes that all clocks will be estimated stochastically.&lt;br /&gt;
   SSTCH = 'STA E   SCOB','STA N   SCOB', 'STA V   SCOB'&lt;br /&gt;
 ! in this example they are the east, north, and vertical components &lt;br /&gt;
 ! for station SCOB.  Generally station locations are not&lt;br /&gt;
 ! estimated stochastically&lt;br /&gt;
 ! This tells you how to constrain the parameters you named above.&lt;br /&gt;
   SPSIG = 3*1.0D-6,  ! random walk constraint&lt;br /&gt;
   SMTAU = 3*'RANDOMWALK', &lt;br /&gt;
 ! how often do you want to estimate these stochastic parameters&lt;br /&gt;
   SDELT = 3*'/00:05', ! compute a position every 5  minutes &lt;br /&gt;
 &lt;br /&gt;
                                                                        &lt;br /&gt;
 $END                                                                   &lt;br /&gt;
                                                                        &lt;br /&gt;
                                                                        &lt;br /&gt;
 $INIT                       !namelist input for filter                 &lt;br /&gt;
   YDEL(1) = 'STABIAS THU1' ! same for the reference clock&lt;br /&gt;
  ! STABIAS is the GIPSY name for a station clock. SATBIAS is &lt;br /&gt;
  ! for a satellite clock&lt;br /&gt;
 &lt;br /&gt;
  ! this is if you are estimating station positions stochastically.&lt;br /&gt;
  ! just do it.&lt;br /&gt;
   grounddelete = 'SCOB',&lt;br /&gt;
  STAINT    = .FALSE.                                                   &lt;br /&gt;
  IDIGIT    = 9                                                         &lt;br /&gt;
                                                                        &lt;br /&gt;
   ! this is a list of all the parameters you DO NOT                         &lt;br /&gt;
   ! want GIPSY to estimate.  look at the cartoon book&lt;br /&gt;
    ! for more info.  X, Y, Z, DX, DY, DZ are the satellite orbits.&lt;br /&gt;
    ! you don't want to estimate them if you are using someone&lt;br /&gt;
   ! else's orbits, which I would assume you are all going to do&lt;br /&gt;
  YDEL(2)      = 'UT1-UTC', 'UT1-UTC RATE', 'X POLE *', 'Y POLE *',     &lt;br /&gt;
              'GEOCENTER*', 'V LOVE', 'H LOVE', 'DRYZTROP*',            &lt;br /&gt;
              'LMPZTROP*', 'STA DRFT*', 'STA ACCL*', 'SAT DRFT*',       &lt;br /&gt;
              'SAT ACCL*', 'BIASPSR*', 'SOLARSCLTOP*','Y_BIAS  TOP*',   &lt;br /&gt;
              'XPOLEMOTION','XPOLERATE','YPOLEMOTION','YPOLERATE',      &lt;br /&gt;
              'X*',  'Y*', 'Z*',  'DX*', 'DY*', 'DZ*','TRPAZ*', &lt;br /&gt;
               'SOLSCL_X*', 'Y_BIAS*', 'SOLSCL_Z*', 'SOLARSCL*'&lt;br /&gt;
                                                                        &lt;br /&gt;
  SOLPRT    = .TRUE.                                                    &lt;br /&gt;
  NOMEAS    = .FALSE.                                                   &lt;br /&gt;
  SOLVE     = .TRUE.                                                    &lt;br /&gt;
  DEBUG     = 1                                                         &lt;br /&gt;
  OUTNAME   = 'OASIS'                                                   &lt;br /&gt;
 $END                                                                   &lt;br /&gt;
                                                                        &lt;br /&gt;
 $APRIORI                    !namelist input for filter                 &lt;br /&gt;
  APALL     = .TRUE.                                                    &lt;br /&gt;
  !  The following are NOT currently estimated                          &lt;br /&gt;
  !                                                                     &lt;br /&gt;
  ! Parameters actually estimated start here                            &lt;br /&gt;
  !                                                                     &lt;br /&gt;
     APNAMS( 1) =      'WETZTROP*',        APSIGS( 1) = 10.0D-5  !10 cm &lt;br /&gt;
     APNAMS( 2) =      'STABIAS*',         APSIGS( 2) = 3.0D5   !1 sec  &lt;br /&gt;
     APNAMS( 3) =      'SATBIAS*',         APSIGS( 3) = 3.0D5   !1 sec  &lt;br /&gt;
     APNAMS( 4) =      'PHASE*',           APSIGS( 4) = 1.0D-1   !1 micr&lt;br /&gt;
    ! I am using ENV here, but you should set these for XYZ if you are doing&lt;br /&gt;
    ! it in the Cartesian frame&lt;br /&gt;
     APNAMS( 5) =      'STAE*',            APSIGS( 5) = 1.  !1 km&lt;br /&gt;
     APNAMS( 6) =      'STAN*',            APSIGS( 6) = 1.  !1 km&lt;br /&gt;
     APNAMS( 7) =      'STAV*',            APSIGS( 7) = 1.  !1 km&lt;br /&gt;
     APNAMS( 8) =      'STA E   KOKB',     APSIGS( 8) = 10.0D-3  !10 m&lt;br /&gt;
     APNAMS( 9) =      'STA N   KOKB',     APSIGS( 9) = 10.0D-3  !10 m&lt;br /&gt;
     APNAMS(10) =      'STA V   KOKB',     APSIGS(10) = 10.0D-3  !10 m&lt;br /&gt;
                                                                       &lt;br /&gt;
                                                                        &lt;br /&gt;
                                                                        &lt;br /&gt;
                                                                        &lt;br /&gt;
 $END                                                                   &lt;br /&gt;
                                                                        &lt;br /&gt;
 $DATAWGHT                 !namelist input for filter, edtpnt2, and post&lt;br /&gt;
  ELMINSTA  = 15.0D0 ! elevation angle minimum                          &lt;br /&gt;
  ! 110 is pseudorange, 120 is phase&lt;br /&gt;
  DATYPE    = 110,    120,   &lt;br /&gt;
  DATSIG    = 1.0D-3, 1.0D-5, ! these are in km&lt;br /&gt;
  XRECNM(1) = 'HOLE' ! this is to delete a specific receiver&lt;br /&gt;
  XXMTNM(1) = 'GPS04' ! this is to delete a specific satellite&lt;br /&gt;
 $END                                                                   &lt;br /&gt;
                                                                        &lt;br /&gt;
 $SMINPUT                  !namelist input for smapper                  &lt;br /&gt;
  SAVEUD     = 'LAST'                                                   &lt;br /&gt;
  SAVESIGMA  = .TRUE.                                                  &lt;br /&gt;
  IDIGIT     = 9                                                        &lt;br /&gt;
  MAPTYP = 'CARTESIAN'&lt;br /&gt;
  OUTNAME    = 'OASIS'                                                  &lt;br /&gt;
   TDPFULLPRECISION = .TRUE.&lt;br /&gt;
  ! write out these parameters for stochastic parameters&lt;br /&gt;
  ! GIPSY only creates this file if asked for (it is a TDP&lt;br /&gt;
  ! - time-dependent-parameter file).  If you want to know&lt;br /&gt;
  ! clock values, then you need to write their names here.&lt;br /&gt;
  WRTTDP  =  'STA E   SCOB',  'STA N   SCOB',  'STA V   SCOB',&lt;br /&gt;
  TDPTYP  = 'TIMEVARY',&lt;br /&gt;
  TDPTOL  = 1.0d00,&lt;br /&gt;
  STAINT     = .FALSE.                                                  &lt;br /&gt;
 $END                                                                   &lt;br /&gt;
                                                                        &lt;br /&gt;
 $APRIORINML              !namelist input for smapper                   &lt;br /&gt;
 $END                                                                   &lt;br /&gt;
                                                                        &lt;br /&gt;
 &lt;br /&gt;
 $LIMITS                 !namelist input for postfit                    &lt;br /&gt;
  ! outlier criterion, i.e. flag all phase data greater than 5 cm.&lt;br /&gt;
  STAINT = .FALSE.                                                      &lt;br /&gt;
  DATYPE  = 120                                                         &lt;br /&gt;
  WINDOW  = 5.0D-5,                                                     &lt;br /&gt;
 $END                                                                   &lt;br /&gt;
                                                                        &lt;br /&gt;
 $EDTINIT                !namelist input for edtpnt2                    &lt;br /&gt;
  STAINT = .FALSE.                                                      &lt;br /&gt;
  DEBUG  = .TRUE.                                                       &lt;br /&gt;
 $END&lt;/div&gt;</summary>
		<author><name>Jeff</name></author>	</entry>

	<entry>
		<id>http://gps.alaska.edu/internal/index.php?title=GPS_analysis_system&amp;diff=2895</id>
		<title>GPS analysis system</title>
		<link rel="alternate" type="text/html" href="http://gps.alaska.edu/internal/index.php?title=GPS_analysis_system&amp;diff=2895"/>
				<updated>2013-10-24T17:07:07Z</updated>
		
		<summary type="html">&lt;p&gt;Jeff: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;We use a GPS data analysis system based on the [http://sideshow.jpl.nasa.gov GIPSY] software developed at JPL. Most of the GIPSY programs are called by [[Special:Call/Include_info%2C/export/ftpweb/htdocs/sh2doc/index.html | shell scripts]] written by Jeff Freymueller. Using these scripts, we can analyze a large amount of data either as part of network solutions or in Precise Point Positioning (PPP) mode.&lt;br /&gt;
&lt;br /&gt;
===Documentation of Solution Strategies (new)===&lt;br /&gt;
The links below describe our solution strategy as it has evolved over time.&lt;br /&gt;
{|&lt;br /&gt;
|[[Solution_Strategy_1.0 | 1.0 1990s strategy.]]&lt;br /&gt;
|-&lt;br /&gt;
|[[Solution_Strategy_2.0 | 2.0 Network solutions used from ~2002 through 2008.]]&lt;br /&gt;
|-&lt;br /&gt;
|[[Solution_Strategy_2.5 | 2.5 PPP solutions but otherwise strategy 2.0]]&lt;br /&gt;
|-&lt;br /&gt;
|[[Solution_Strategy_3.0 | 3.0 2010 new solution strategy]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
===Where do you put RINEX files?===&lt;br /&gt;
RINEX files should be put into the hopper, &amp;lt;code&amp;gt;$RAWDATA/hopper&amp;lt;/code&amp;gt;. What, you don't have RINEX files yet? See [[RINEXing]]. Once files are in the hopper, you can either let the first processing stages happen automatically overnight (see next section), or run the &amp;lt;code&amp;gt;autofront&amp;lt;/code&amp;gt; and &amp;lt;code&amp;gt;autoclean&amp;lt;/code&amp;gt; programs manually.&lt;br /&gt;
&lt;br /&gt;
===What happens automatically?===&lt;br /&gt;
&lt;br /&gt;
Quite a lot.&lt;br /&gt;
&lt;br /&gt;
Autoftp runs every night beginning at 6pm local time and fetches data files. These are placed into the '''hopper''' ($RAWDATA/hopper), a directory where all data files are put for entry into the system and processing. Autofront then runs at midnight to process all files in the hopper, including any placed there manually (from campaigns, for example). Finally, autoclean runs at 4am local time to carry out automated screening for cycle slips and other bad data.&lt;br /&gt;
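As a sketch, that schedule corresponds to crontab entries along these lines (the script paths and exact times are local details, shown here only for illustration):

```shell
# Hypothetical crontab for the nightly pipeline described above;
# the script paths are placeholders for wherever the tools actually live.
# min hour * * *  command
#  0   18  * * *  /usr/local/gipsy-scripts/autoftp     # 6pm: fetch data
#  0    0  * * *  /usr/local/gipsy-scripts/autofront   # midnight: hopper to qm
#  0    4  * * *  /usr/local/gipsy-scripts/autoclean   # 4am: automated cleaning
```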
&lt;br /&gt;
====Autoftp====&lt;br /&gt;
&lt;br /&gt;
Autoftp is an efficient data-fetching tool that uses wget to automatically get data from any of several internet GPS data archives. It reads a list of desired sites from a ''request file'', which contains the date in the filename, and attempts to find and download data from as many sites as possible. It is intended to run automatically on a daily basis under cron, and when accompanied by another simple program to generate a standard request file every day, it can easily fetch a standard set of sites for analysis. Because it keeps track in the request file of sites that it has found already, autoftp can be run multiple times with the same request file and it will not repeatedly fetch data. This is ideal for the real world, in which data from some sites are available rapidly while data from other sites may require many hours or days to become available.&lt;br /&gt;
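The bookkeeping idea can be illustrated with a toy function (the request-file format and the "done" marker here are invented for illustration; real autoftp runs wget where the echo is):

```shell
# Toy sketch of autoftp's idempotent bookkeeping: keep a request file of
# site IDs, mark each one off once its data has been fetched, so repeated
# runs do not re-fetch data. The two-column file format is made up.
fetch_pending() {
  reqfile=$1
  cat "$reqfile" | while read -r site status; do
    if [ "$status" != "done" ]; then
      echo "fetching $site"          # real autoftp runs wget here
      sed -i "s/^$site .*/$site done/" "$reqfile"   # mark the site off
    fi
  done
}
```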
&lt;br /&gt;
====Autofront====&lt;br /&gt;
&lt;br /&gt;
Autofront is a script intended to run under cron that carries out the initial &amp;quot;front end&amp;quot; processing on a set of GPS data files. When executed, it will process all files in the '''hopper''' directory, and will place each resulting qm file into the appropriate week directory.&lt;br /&gt;
&lt;br /&gt;
Autofront runs the following steps:&lt;br /&gt;
1. Checks the validity of the RINEX file and repairs some common problems.&lt;br /&gt;
2. Depending on receiver type, runs &amp;lt;code&amp;gt;clockprep -fixtags&amp;lt;/code&amp;gt;.&lt;br /&gt;
3. (optional, presently not the default) Runs PhasEdit.&lt;br /&gt;
4. Runs ninja.&lt;br /&gt;
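Sketched per RINEX file, the steps look roughly like this (check_rinex and the wrapper itself are invented placeholder names; only clockprep, PhasEdit, and ninja come from the text):

```shell
# Hypothetical per-file wrapper for the four autofront steps above. clockprep,
# PhasEdit and ninja are the GIPSY programs named in the text; check_rinex and
# this wrapper itself are invented placeholders.
frontend_one() {
  rnx=$1
  check_rinex "$rnx" || return 1    # step 1: validity check and repairs
  clockprep -fixtags "$rnx"         # step 2: receiver-dependent clock fixups
  # PhasEdit "$rnx"                 # step 3: optional, presently not default
  ninja "$rnx"                      # step 4: produce the qm file
}
```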
&lt;br /&gt;
====Autoclean====&lt;br /&gt;
&lt;br /&gt;
Autoclean carries out automated cleaning of cycle slips, based on point positioning solutions. It is quite effective and at present it rarely misses cycle slips unless they are smaller than its minimum tolerance (10 cm). Autoclean operates on an ''edit-request'' file, which contains the name of the directory (week directory) and a list of qm files that need to be cleaned. It will clean all files on the list as long as orbits and clocks are available, and it marks off files that have been cleaned so that it can safely be run multiple times.&lt;br /&gt;
&lt;br /&gt;
Autoclean operates in an iterative mode. Its zeroth iteration is to do a pseudorange-only solution and identify and delete extremely bad pseudorange data. In this step it uses a tolerance that catches only grossly biased data. It then carries out 1 or more iterations of screening the phase data. In each iteration, it uses postbreak to identify discontinuities in the residuals of a point positioning solution. Postbreak is run with an adaptive tolerance (minimum 10 cm), and it is critical that my slightly modified version of postbreak be used. If any cycle slips are discovered, they are flagged and another iteration is run. Autoclean runs a maximum of 4 iterations on the phase data.&lt;br /&gt;
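Schematically, the iteration logic reads like this (a sketch only; count_new_breaks is an invented stand-in for parsing postbreak's output):

```shell
# Schematic of autoclean's iteration strategy as described above.
# count_new_breaks is a made-up helper standing in for "did postbreak
# flag any new cycle slips"; real autoclean parses postbreak output.
autoclean_sketch() {
  qm=$1
  echo "iter 0: pseudorange-only solution, delete grossly biased pseudorange"
  i=1
  while [ "$i" -le 4 ]; do          # at most 4 iterations on the phase data
    echo "iter $i: PPP solution, postbreak with adaptive tolerance (min 10 cm)"
    nbreaks=$(count_new_breaks "$qm")
    if [ "$nbreaks" -eq 0 ]; then
      break                         # no new slips flagged: converged
    fi
    echo "  flagged $nbreaks new slips, iterating"
    i=$((i+1))
  done
}
```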
&lt;br /&gt;
===Where do the data files go?===&lt;br /&gt;
Data files from each station are stored in the QM format that is native to GIPSY. QM files (and all other files) are stored in directories by GPS week. For each [[week_directory|week directory]] there are several [[subdirectories]]; qm files are stored in &amp;lt;code&amp;gt;$ANALYSIS/wwww/qm&amp;lt;/code&amp;gt;, where &amp;lt;code&amp;gt;wwww&amp;lt;/code&amp;gt; is the 4-character GPS week number (with a leading zero if needed).&lt;br /&gt;
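For reference, the wwww week number for a calendar date can be computed with GNU date (a sketch only; GIPSY has its own tools for this):

```shell
# Quick way to get the 4-character GPS week number wwww for a calendar date
# (assumes GNU date for -d). GPS week 0000 began on 1980-01-06.
gps_week() {
  t0=$(date -u -d 1980-01-06 +%s)      # start of GPS week 0000
  t=$(date -u -d "$1" +%s)
  printf '%04d\n' $(( (t - t0) / 604800 ))   # 604800 seconds per week
}
```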
&lt;br /&gt;
===Running Static Solutions===&lt;br /&gt;
In the [[subdirectories|flt]] directory for each [[week directory|week]] there will (hopefully) be a UNIX script called&amp;lt;br&amp;gt;&lt;br /&gt;
''make-*''&amp;lt;br&amp;gt;&lt;br /&gt;
This script runs another script called&amp;lt;br&amp;gt;&lt;br /&gt;
''standard_*_solution''&amp;lt;br&amp;gt;&lt;br /&gt;
which again runs another script called&amp;lt;br&amp;gt;&lt;br /&gt;
''solve''&amp;lt;br&amp;gt;&lt;br /&gt;
for each [[subnets | network of stations]] (see each subnet with [[Subnets_GoogleEarth | Google Earth]]) for each day. The solve script runs solutions for each of these networks based on the data from sites in the network that are available in the qm directory.&lt;br /&gt;
&lt;br /&gt;
To run solutions, copy the ''make'' script to a file called ''make-flt'' (for example).&lt;br /&gt;
&lt;br /&gt;
Check that the ''make-flt'' script contains all the days that you want to run.&lt;br /&gt;
&lt;br /&gt;
To check which [[Linux Computer System | computer]] is free to run the script, type&lt;br /&gt;
   check-solves&lt;br /&gt;
then log on to a free computer and type&lt;br /&gt;
   submit make-flt&lt;br /&gt;
&lt;br /&gt;
As the script runs, files will appear in the [[subdirectories|flt]] directory for each [[subnets|network]] for each day. Usually&lt;br /&gt;
this script will have been run once [[#What happens automatically? | automatically]], so there will often already be files in the [[subdirectories|flt]]&lt;br /&gt;
directory ready to be cleaned and then re-run.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Solve, a very flexible script. (link to detailed help)&lt;br /&gt;
&lt;br /&gt;
Philosophy of solve&lt;br /&gt;
&lt;br /&gt;
Subnet files and campaign files&lt;br /&gt;
&lt;br /&gt;
Standard solutions&lt;br /&gt;
&lt;br /&gt;
(text of standard_Alaska_solution)&lt;br /&gt;
&lt;br /&gt;
Running several days at once: make-make-flt and make-alaska&lt;br /&gt;
&lt;br /&gt;
(text of a standard make-alaska file and variant)&lt;br /&gt;
&lt;br /&gt;
Running several weeks at once&lt;br /&gt;
&lt;br /&gt;
(text of sample rerun-* file)&lt;br /&gt;
&lt;br /&gt;
===Cleaning Static Solutions===&lt;br /&gt;
&lt;br /&gt;
Sometimes bad data (outliers and cycle slips) make it past the automatic editors. When this&lt;br /&gt;
happens the bad data are removed from the qm files by either deleting points or by inserting&lt;br /&gt;
new phase ambiguities to deal with cycle slips. The steps, commands and scripts to use are somewhat explained [[How to clean a solution? | here]]. Once the data are cleaned, the files should be deleted from the [[subdirectories|flt]] directory and the solutions re-run (run &amp;lt;code&amp;gt;make-flt&amp;lt;/code&amp;gt; again). Usually you will have to go through 2-3&lt;br /&gt;
iterations of the cleaning-rerunning cycle. A solution is clean when its .point files either do not&lt;br /&gt;
exist or are small (below 1000 bytes). Once a solution is clean the files should remain in&lt;br /&gt;
the [[subdirectories|flt]] directory and the lines to rerun that solution should be deleted from the &amp;lt;code&amp;gt;make-flt&amp;lt;/code&amp;gt; file.&lt;br /&gt;
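That cleanliness criterion is easy to check mechanically; for example, in a flt directory:

```shell
# List .point files of 1000 bytes or more in the current directory,
# i.e. solutions that are NOT yet clean by the criterion above.
find . -maxdepth 1 -name '*.point' -size +999c
```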
&lt;br /&gt;
Initial Explanation of terms&lt;br /&gt;
&lt;br /&gt;
Expected residuals from a clean solution&lt;br /&gt;
&lt;br /&gt;
Automated screening: postfit, the point file, postbreak&lt;br /&gt;
&lt;br /&gt;
Checking for bad pseudorange data: badp, [[How to clean a solution?|allbadp]]&lt;br /&gt;
&lt;br /&gt;
Removing biased pseudorange data: [[How to clean a solution?|del_pcode_arc]]&lt;br /&gt;
&lt;br /&gt;
Automatically identified cycle slips: breaks, [[How to clean a solution?|allbreaks]]&lt;br /&gt;
&lt;br /&gt;
Quickly scanning through residuals: short_hand&amp;lt;br&amp;gt;&lt;br /&gt;
The data problems can be identified and&lt;br /&gt;
fixed using the program&lt;br /&gt;
   [[short_hand]] &lt;br /&gt;
(follow the link to read about this program and to ask for help getting started).&amp;lt;br&amp;gt; &lt;br /&gt;
Limitations of short_hand&lt;br /&gt;
&lt;br /&gt;
Manually checking residuals and fixing [[problem stations | problems]]&lt;br /&gt;
&lt;br /&gt;
===Procedure for Running Solutions for a week for the first time===&lt;br /&gt;
&lt;br /&gt;
A few special things need to be done the very first time solutions in a week are run. First, you need to make up a script to run all days of the week. This may need to be edited if the JPL final orbits are not available at the time. The &amp;lt;code&amp;gt;standard_Alaska_solution&amp;lt;/code&amp;gt; uses the non-fiducial orbits and thus requires that the final JPL orbits be present. If they are not, you can run &amp;lt;code&amp;gt;rapid_Alaska_solution&amp;lt;/code&amp;gt; instead. &lt;br /&gt;
&lt;br /&gt;
Then, the log files from autoclean should be moved away to a subdirectory, and any problem stations identified by autoclean should be checked. Then, you are ready to run the solutions.&lt;br /&gt;
&lt;br /&gt;
First, make a script to run solutions. For example, to make a script to run all days of the week for the Alaska solution:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd $ANALYSIS&lt;br /&gt;
make-make-flt 1381&lt;br /&gt;
cd 1381/flt&lt;br /&gt;
vi make-alaska&lt;br /&gt;
#  Edit the file if needed so that you are ready to run the rapid solutions.&lt;br /&gt;
&lt;br /&gt;
cat make-alaska&lt;br /&gt;
#!/bin/csh -f&lt;br /&gt;
#&lt;br /&gt;
setenv CAMP $ANALYSIS/1381&lt;br /&gt;
#&lt;br /&gt;
#standard_Alaska_solution 06jul01&lt;br /&gt;
#standard_Alaska_solution 06jun25&lt;br /&gt;
#standard_Alaska_solution 06jun26&lt;br /&gt;
#standard_Alaska_solution 06jun27&lt;br /&gt;
#standard_Alaska_solution 06jun28&lt;br /&gt;
#standard_Alaska_solution 06jun29&lt;br /&gt;
#standard_Alaska_solution 06jun30&lt;br /&gt;
#&lt;br /&gt;
rapid_Alaska_solution 06jul01&lt;br /&gt;
rapid_Alaska_solution 06jun25&lt;br /&gt;
rapid_Alaska_solution 06jun26&lt;br /&gt;
rapid_Alaska_solution 06jun27&lt;br /&gt;
rapid_Alaska_solution 06jun28&lt;br /&gt;
rapid_Alaska_solution 06jun29&lt;br /&gt;
rapid_Alaska_solution 06jun30&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The script &amp;lt;code&amp;gt;make-make-flt&amp;lt;/code&amp;gt; finds all unique dates for qm files in that week's directory, and uses that to generate the script, so if you run it before the end of the week you will get a partial script. If the final JPL orbits are not yet present, you will need to edit the script to change &amp;quot;standard&amp;quot; to &amp;quot;rapid&amp;quot;. Or better yet, copy all the lines and comment one set out, then modify the others to read &amp;quot;rapid_Alaska_solution &amp;lt;date&amp;gt;&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
Next call &amp;lt;code&amp;gt;mv_logfiles&amp;lt;/code&amp;gt; (being in the WEEK/flt directory!) which creates a subdirectory called &amp;lt;code&amp;gt;logfiles&amp;lt;/code&amp;gt; and moves &lt;br /&gt;
all of autoclean's logfiles of the format &amp;lt;code&amp;gt;*____*.i*&amp;lt;/code&amp;gt; into this directory:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
HOSTNAME WWWW/flt&amp;gt; mv_logfiles&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now look for a file called &amp;lt;code&amp;gt;make-problems&amp;lt;/code&amp;gt;, which lists all files that autoclean had a problem with. Sometimes these files are almost clean, but sometimes they are full of junk or horribly mangled by the automated editing. There should be PPP solutions already run for these files, so they are ready to be checked.&lt;br /&gt;
&lt;br /&gt;
Set the &amp;lt;code&amp;gt;CAMP&amp;lt;/code&amp;gt; variable (if not set): &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
setenv CAMP $ANALYSIS/wwww&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Again, &amp;lt;code&amp;gt;wwww&amp;lt;/code&amp;gt; is the 4-digit GPS-week.&lt;br /&gt;
&lt;br /&gt;
Now run the solutions. The first time you run the solutions, look at the residuals very carefully before trying &amp;lt;code&amp;gt;short_hand&amp;lt;/code&amp;gt;. Uncompress the postlog, postfit and postbreak files, and then use &amp;lt;code&amp;gt;allbadp&amp;lt;/code&amp;gt; to check the pseudorange and &amp;lt;code&amp;gt;allbreaks&amp;lt;/code&amp;gt; to check for major cycle slips.&lt;br /&gt;
&lt;br /&gt;
A very common problem is that for several stations per week, there will be one satellite arc of pseudorange data that all have residuals of roughly 2000 cm. If you see these, don't delete the data, but instead run del_pcode_arc to remove only the pseudorange data. I am not sure why these show up, but it could be either a hardware channel bias or a pre-processing glitch. They happen much more often with Ashtechs than any other receivers, and are particularly common with the US Coast Guard CORS sites. In the &amp;lt;code&amp;gt;qm&amp;lt;/code&amp;gt; directory,&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
del_pcode_arc *02gus2* GUS2 GPS41&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you just run &amp;lt;code&amp;gt;short_hand&amp;lt;/code&amp;gt; without looking, it will probably either throw out all the pseudorange for one site, or delete a lot of data (phase and pseudorange) where only the pseudorange needs to be deleted. So don't do that. Instead, I delete a batch of bad pseudorange data first to get the number of pseudorange outliers under control for the next run.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd $ANALYSIS/1381/flt&lt;br /&gt;
gunzip *alaska*post*&lt;br /&gt;
allbadp&lt;br /&gt;
allbreaks&lt;br /&gt;
# Based on this, run del_pcode_arc as above, and add ambiguities manually if needed.&lt;br /&gt;
#&lt;br /&gt;
delete_allbadp 50&lt;br /&gt;
#  This creates a file called delete.&lt;br /&gt;
vi delete&lt;br /&gt;
#    Remove lines for any points for which you have already run del_pcode_arc.&lt;br /&gt;
cd ../qm&lt;br /&gt;
sh ../flt/delete&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
At this point, don't worry too much about phase outliers. Basically we are trying to get the number of pseudorange outliers down into the range where &amp;lt;code&amp;gt;short_hand&amp;lt;/code&amp;gt; will do the right thing when we run it later. Now may be a good time to run &amp;lt;code&amp;gt;Alaska_cleaning_solution $date&amp;lt;/code&amp;gt;, which runs a smaller and much faster solution including the sites that most often need some cleaning.&lt;br /&gt;
&lt;br /&gt;
===Data Backup===&lt;br /&gt;
&lt;br /&gt;
RINEX file backups.&lt;br /&gt;
There are either one or two separate backups of the raw RINEX files. For data we collected&lt;br /&gt;
ourselves, a copy of the original rinex files can be found in either the campaign&lt;br /&gt;
directory (/gps/akda/Campaigns/Data.2007/&amp;lt;project&amp;gt;, where &amp;lt;project&amp;gt; is the project name,&lt;br /&gt;
and Data.2007 will change with the year), or in the continuous site ftp area&lt;br /&gt;
(/gps/akda/Permanent/2007/260/, where 2007 is the year and 260 is the day of year). Also,&lt;br /&gt;
every rinex file put through the hopper is moved to a directory like $RAWDATA/2007/260/&lt;br /&gt;
(again, year and day of year may change). However, the $RAWDATA/2007/260/ directories are&lt;br /&gt;
not really archived and eventually they will be deleted. But in practice we have most of&lt;br /&gt;
the last few years of RINEX files online in case something goes wrong.&lt;br /&gt;
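As a concrete illustration of the path convention above, a year and day of year map onto the archive directory like this (the /gps/rawdata fallback is a placeholder assumption, not the real location of $RAWDATA):

```shell
# Build a $RAWDATA-style backup path from a year and day of year.
# The /gps/rawdata fallback is a placeholder, not the real archive root.
year=2007
doy=260
backup_path=$(printf '%s/%04d/%03d/' "${RAWDATA:-/gps/rawdata}" "$year" "$doy")
echo "$backup_path"
```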
&lt;br /&gt;
QM file backups.&lt;br /&gt;
Before autoclean makes any changes to a qm file, it copies the file to a subdirectory&lt;br /&gt;
called &amp;quot;original&amp;quot; in the qm directory. So if you completely destroy a qm file by&lt;br /&gt;
accident, you can still go back to the original version. Of course, that loses all editing&lt;br /&gt;
done to the file, but at least the original data can be recovered easily. In general, it&lt;br /&gt;
is not a good idea to go back to the version in the original subdirectory unless you know&lt;br /&gt;
what you are doing, because doing that can make a lot more work for everyone. Mostly we&lt;br /&gt;
do that when files have been mangled by autoclean. It is actually hard to mangle data&lt;br /&gt;
files using our usual editing procedures.&lt;br /&gt;
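For the common recovery case, here is a minimal sketch; the demo directory and file names are made up so the real qm directories are not touched:

```shell
# Recover a mangled qm file from the "original" backup subdirectory.
# The demo directory and file names below are made up for illustration.
mkdir -p demo/qm/original
echo "pristine qm data" > demo/qm/original/02jan01site.qm
echo "mangled" > demo/qm/02jan01site.qm        # the damaged working copy
# Restore the pristine copy; this discards any later manual editing.
cp demo/qm/original/02jan01site.qm demo/qm/02jan01site.qm
cat demo/qm/02jan01site.qm                     # prints: pristine qm data
```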
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Customized Solutions===&lt;br /&gt;
&lt;br /&gt;
Sometimes customized solutions are required for various reasons.  The links below describe some strategies that may improve your situation.&lt;br /&gt;
&amp;lt;br&amp;gt;[[Kinematic Processing]]&lt;br /&gt;
&amp;lt;br&amp;gt;[[Ambiguity Resolution]]&lt;br /&gt;
&lt;br /&gt;
===Products / file contents===&lt;br /&gt;
&lt;br /&gt;
Where can certain information be found, and what are the structure and content of the output files? This is being summarized on the [[files | files]] page.&lt;br /&gt;
&lt;br /&gt;
=== Velocity solutions ===&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/&amp;gt; su akda&lt;br /&gt;
/&amp;gt; cd $ANALYZED&lt;br /&gt;
/&amp;gt; mkdir &amp;lt;USEFUL NEW PROJECTNAME&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Find another project and copy the following files into your new directory:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/&amp;gt; cp *.nml $ANALYZED/new_project&lt;br /&gt;
/&amp;gt; cp make_vel* $ANALYZED/new_project&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Rename the *.nml file to something reasonable for your project and edit it: it needs to know the stations that you want to include&lt;br /&gt;
in your velocity solution as well as the locations of the input files the data comes from. To get the input files in the correct syntax you might want to use:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/&amp;gt; grep_vel_infiles.pl --from-week=WWWW --to-week=WWWW --infile-index=x --sum-id=&amp;lt;alaska2.0_nfxigs03 | NEAsia2.0_nfxigs03 | ...&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;quot;WWWW&amp;quot; stands for the week you want to start / end with, and &amp;quot;x&amp;quot; is the starting value for the id-counter. These IDs are useful later to reference certain input files for additional editing (see below). The from-week, to-week, and infile-index options are optional. &amp;quot;sum-id&amp;quot; is basically the solution name. Copy all lines of the format &amp;quot;   infile(x) = '...'&amp;quot; into your namelist (nml) file (insert them before the &amp;amp;amp;end). ([http://www.gps.alaska.edu/internal/index.php/Special:Call/Include_info%2C/export/ftpweb/htdocs/sh2doc/grep_vel_infiles.pl.html grep_vel_infiles.pl]  documentation)&lt;br /&gt;
&lt;br /&gt;
Once the editing of the namelist file is finished:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/&amp;gt; refresh_zebu yournamelist.nml outfile.ref&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This is necessary to order and re-number the entries in your namelist file, which can contain comments. Rather than making you go through the namelist file and renumber everything whenever you want to throw out a station or some data, refresh_zebu does that for you.&lt;br /&gt;
&lt;br /&gt;
Once you have a nice reference file (.ref):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/&amp;gt; rzebu2 outfile.ref &amp;gt; &amp;amp; out&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
You should redirect the output to a file so you can look at it later :).&lt;br /&gt;
&lt;br /&gt;
As soon as the solution has finished running, look at the total Chi_squared value at the bottom of the output in &amp;quot;out&amp;quot;. It should be &amp;quot;1&amp;quot;. If that's not the case, which is likely the first couple of times you run a solution, look for the sites that cause the deviation from chi_squared=1. Note the site names with the largest chi squared values. Then you can do three things:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
A) /&amp;gt; grep SITE outlier.inf&lt;br /&gt;
B) /&amp;gt; grep SITE residual.inf&lt;br /&gt;
C) /&amp;gt; vi $ANALYSIS/solution_timeseries/SITE.pfiles&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In all three cases you want to find dates on which the sigmas for this site are rather large. Note down the date and find it in the namelist &lt;br /&gt;
file (*.nml) and remove the respective site from the velocity solution for that day by adding a line:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
removedat(a, infile_id) = 'SITENAME'&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;quot;a&amp;quot; is the id that simply counts how many removedats have been invoked on that one infile_id. &amp;quot;infile_id&amp;quot; is the counter I mentioned above. An example is probably best:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
   infile(161) = '/gps/analysis/1222/post/03jun10NEAsia2.0_nfxigs03.sum'&lt;br /&gt;
    removedat(1,161) = 'ELD '&lt;br /&gt;
    removedat(2,161) = 'PETP'&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here I assume that both sites, &amp;quot;ELD&amp;quot; and &amp;quot;PETP&amp;quot;, misbehave on June 10, 2003 for the North East Asia solution, hence I remove them from the velocity solution for that day.&lt;br /&gt;
&lt;br /&gt;
Before re-running, remove all the files created by &amp;lt;code&amp;gt;rzebu2&amp;lt;/code&amp;gt;. A Makefile of the form:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
clean:&lt;br /&gt;
        rm *.ref ATWA ATY solution.* *.inf out nnr.* *.dat *.gmtvec *.vel fort.* gmt.format STACOV argus.weights&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
might be useful.&lt;br /&gt;
&lt;br /&gt;
Repeat the above until your reduced chi squared is &amp;lt;= 1.0. If you can't get there, change the fudge_factor as follows:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
new_fudge = old_fudge * Chi_squared&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
and rerun the solution one last time. The reduced chi squared value should then be 1.0.&lt;br /&gt;
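The rescaling is plain multiplication; a quick sketch with made-up numbers:

```shell
# Rescale the fudge factor by the total reduced chi-squared.
# The old_fudge and chi_squared values here are made up for illustration.
old_fudge=2.5
chi_squared=1.44
new_fudge=$(echo "$old_fudge $chi_squared" | awk '{printf "%.3f", $1 * $2}')
echo "$new_fudge"    # prints 3.600
```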
&lt;br /&gt;
Once you have achieved that, you can go to the next level and run one of the make_vel files you copied to your directory:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
make_vel_Sella: velocities relative to a stable North America&lt;br /&gt;
make_vel_ITRF: velocities in ITRF &lt;br /&gt;
make_vel_EURA: velocities relative to a stable Eurasia &lt;br /&gt;
make_vel_XXXX: velocities with reference station XXXX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
There might be others, or you could go ahead and edit these files to adapt them to your needs. The make files create *.gmtvec output which you can use with, e.g., &amp;lt;code&amp;gt;psvelo&amp;lt;/code&amp;gt; in a GMT script.&lt;/div&gt;</summary>
		<author><name>Jeff</name></author>	</entry>

	<entry>
		<id>http://gps.alaska.edu/internal/index.php?title=Solution_Strategy_2.0&amp;diff=1895</id>
		<title>Solution Strategy 2.0</title>
		<link rel="alternate" type="text/html" href="http://gps.alaska.edu/internal/index.php?title=Solution_Strategy_2.0&amp;diff=1895"/>
				<updated>2010-01-13T07:42:46Z</updated>
		
		<summary type="html">&lt;p&gt;Jeff: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;page not yet prepared.&lt;br /&gt;
&lt;br /&gt;
[[GPS_analysis_system | Back to GPS Analysis System]]&lt;br /&gt;
&lt;br /&gt;
More stuff goes here.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;table border=&amp;quot;1&amp;quot;&amp;gt;&lt;br /&gt;
 &amp;lt;tr&amp;gt;&lt;br /&gt;
  &amp;lt;td width=&amp;quot;200&amp;quot;&amp;gt;Parameter&amp;lt;/td&amp;gt;&lt;br /&gt;
  &amp;lt;td width=&amp;quot;800&amp;quot;&amp;gt;Explanation&amp;lt;/td&amp;gt;&lt;br /&gt;
 &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
 &amp;lt;tr&amp;gt;&lt;br /&gt;
  &amp;lt;td width=&amp;quot;200&amp;quot;&amp;gt;Parameter&amp;lt;/td&amp;gt;&lt;br /&gt;
  &amp;lt;td width=&amp;quot;800&amp;quot;&amp;gt;Explanation&amp;lt;/td&amp;gt;&lt;br /&gt;
 &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/table&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[GPS_analysis_system | Back to GPS Analysis System]]&lt;/div&gt;</summary>
		<author><name>Jeff</name></author>	</entry>

	<entry>
		<id>http://gps.alaska.edu/internal/index.php?title=Solution_Strategy_2.0&amp;diff=1894</id>
		<title>Solution Strategy 2.0</title>
		<link rel="alternate" type="text/html" href="http://gps.alaska.edu/internal/index.php?title=Solution_Strategy_2.0&amp;diff=1894"/>
				<updated>2010-01-13T07:42:20Z</updated>
		
		<summary type="html">&lt;p&gt;Jeff: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;page not yet prepared.&lt;br /&gt;
&lt;br /&gt;
[[GPS_analysis_system | Back to GPS Analysis System]]&lt;br /&gt;
&lt;br /&gt;
More stuff goes here.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;table border=&amp;quot;1&amp;quot;&amp;gt;&lt;br /&gt;
 &amp;lt;tr&amp;gt;&lt;br /&gt;
  &amp;lt;td width=&amp;quot;100&amp;quot;&amp;gt;Parameter&amp;lt;/td&amp;gt;&lt;br /&gt;
  &amp;lt;td width=&amp;quot;500&amp;quot;&amp;gt;Explanation&amp;lt;/td&amp;gt;&lt;br /&gt;
 &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
 &amp;lt;tr&amp;gt;&lt;br /&gt;
  &amp;lt;td width=&amp;quot;100&amp;quot;&amp;gt;Parameter&amp;lt;/td&amp;gt;&lt;br /&gt;
  &amp;lt;td width=&amp;quot;500&amp;quot;&amp;gt;Explanation&amp;lt;/td&amp;gt;&lt;br /&gt;
 &amp;lt;/tr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/table&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[GPS_analysis_system | Back to GPS Analysis System]]&lt;/div&gt;</summary>
		<author><name>Jeff</name></author>	</entry>

	<entry>
		<id>http://gps.alaska.edu/internal/index.php?title=GPS_analysis_system&amp;diff=1893</id>
		<title>GPS analysis system</title>
		<link rel="alternate" type="text/html" href="http://gps.alaska.edu/internal/index.php?title=GPS_analysis_system&amp;diff=1893"/>
				<updated>2010-01-13T07:38:50Z</updated>
		
		<summary type="html">&lt;p&gt;Jeff: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;We use a GPS data analysis system based on the [http://sideshow.jpl.nasa.gov GIPSY] software developed at JPL. Most of the GIPSY programs are called by [[Special:Call/Include_info%2C/export/ftpweb/htdocs/sh2doc/index.html | shell scripts]] written by Jeff Freymueller. Using these scripts, we can analyze a large amount of data either as part of network solutions or in Precise Point Positioning (PPP) mode.&lt;br /&gt;
&lt;br /&gt;
===Documentation of Solution Strategies (new)===&lt;br /&gt;
The links below describe our solution strategy as it has evolved over time.&lt;br /&gt;
{|&lt;br /&gt;
|[[Solution_Strategy_1.0 | 1.0 1990s strategy.]]&lt;br /&gt;
|-&lt;br /&gt;
|[[Solution_Strategy_2.0 | 2.0 Network solutions used from ~2002 through 2008.]]&lt;br /&gt;
|-&lt;br /&gt;
|[[Solution_Strategy_2.5 | 2.5 PPP solutions but otherwise strategy 2.0]]&lt;br /&gt;
|-&lt;br /&gt;
|[[Solution_Strategy_3.0 | 3.0 2010 new solution strategy]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
===Where do you put RINEX files?===&lt;br /&gt;
RINEX files should be put into the hopper, &amp;lt;code&amp;gt;$RAWDATA/hopper&amp;lt;/code&amp;gt;. What, you don't have RINEX files yet? See [[RINEXing]]. Once files are in the hopper, you can either let the first processing stages happen automatically overnight (see next section), or run the &amp;lt;code&amp;gt;autofront&amp;lt;/code&amp;gt; and &amp;lt;code&amp;gt;autoclean&amp;lt;/code&amp;gt; programs manually.&lt;br /&gt;
&lt;br /&gt;
===What happens automatically?===&lt;br /&gt;
&lt;br /&gt;
Quite a lot.&lt;br /&gt;
&lt;br /&gt;
Autoftp runs every night beginning at 6pm local time and fetches data files. These are placed into the '''hopper''' ($RAWDATA/hopper), a directory where all data files are put for entry into the system and processing. Autofront then runs at midnight to process all files in the hopper, including any placed there manually (from campaigns, for example). Finally, autoclean runs at 4am local to carry out automated screening for cycle slips and other bad data.&lt;br /&gt;
&lt;br /&gt;
====Autoftp====&lt;br /&gt;
&lt;br /&gt;
Autoftp is an efficient data-fetching tool that uses wget to automatically get data from any of several internet GPS data archives. It reads a list of desired sites from a ''request file'', which contains the date in the filename, and attempts to find and download data from as many sites as possible. It is intended to run automatically on a daily basis under cron, and when accompanied by another simple program to generate a standard request file every day, it can easily fetch a standard set of sites for analysis. Because it keeps track in the request file of sites that it has found already, autoftp can be run multiple times with the same request file and it will not repeatedly fetch data. This is ideal for the real world, in which data from some sites are available rapidly while data from other sites may require many hours or days to become available.&lt;br /&gt;
&lt;br /&gt;
====Autofront====&lt;br /&gt;
&lt;br /&gt;
Autofront is a script intended to run under cron that carries out the initial &amp;quot;front end&amp;quot; processing on a set of GPS data files. When executed, it will process all files in the '''hopper''' directory, and will place each resulting qm file into the appropriate week directory.&lt;br /&gt;
&lt;br /&gt;
Autofront runs the following steps:&lt;br /&gt;
1. Checks the validity of each RINEX file and repairs some common problems.&lt;br /&gt;
2. Runs clockprep -fixtags, depending on receiver type.&lt;br /&gt;
3. (Optional, presently not the default) runs PhasEdit.&lt;br /&gt;
4. Runs ninja.&lt;br /&gt;
&lt;br /&gt;
====Autoclean====&lt;br /&gt;
&lt;br /&gt;
Autoclean carries out automated cleaning of cycle slips, based on point positioning solutions. It is quite effective and at present it rarely misses cycle slips unless they are smaller than its minimum tolerance (10 cm). Autoclean operates on an ''edit-request'' file, which contains the name of the directory (week directory) and a list of qm files that need to be cleaned. It will clean all files on the list as long as orbits and clocks are available, and it marks off files that have been cleaned so that it can safely be run multiple times.&lt;br /&gt;
&lt;br /&gt;
Autoclean operates in an iterative mode. Its zeroth iteration is to do a pseudorange-only solution and identify and delete extremely bad pseudorange data. In this step it uses a tolerance that catches only grossly biased data. (Explain it). It then carries out 1 or more iterations of screening the phase data. In each iteration, it uses postbreak to identify discontinuities in the residuals of a point positioning solution. Postbreak is run with an adaptive tolerance (minimum 10 cm), and it is critical that my slightly modified version of postbreak be used. If any cycle slips are discovered, they are flagged and another iteration is run. Autoclean runs a maximum of 4 iterations on the phase data.&lt;br /&gt;
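The iteration logic can be sketched roughly as follows; detect_slips here is a mock stand-in for the postbreak-based check, not a real program:

```shell
# Conceptual sketch of autoclean's phase-screening loop (maximum 4 iterations).
# detect_slips is a mock stand-in, pretending the first two passes find slips.
detect_slips() {
  [ "$1" -lt 2 ]
}
iter=0
while [ "$iter" -lt 4 ]; do
  if detect_slips "$iter"; then
    echo "iteration $iter: cycle slip flagged, rerunning the PPP solution"
  else
    echo "iteration $iter: no new slips, done"
    break
  fi
  iter=$((iter + 1))
done
```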
&lt;br /&gt;
===Where do the data files go?===&lt;br /&gt;
Data files from each station are stored in the QM format that is native to GIPSY. QM files (and all other files) are stored in directories by GPS week. For each [[week_directory|week directory]] there are several [[subdirectories]]; qm files are stored in &amp;lt;code&amp;gt;$ANALYSIS/wwww/qm&amp;lt;/code&amp;gt;, where &amp;lt;code&amp;gt;wwww&amp;lt;/code&amp;gt; is the 4 character GPS week number (with a leading zero if needed).&lt;br /&gt;
&lt;br /&gt;
===Running Solutions===&lt;br /&gt;
In the [[subdirectories|flt]] directory for each [[week directory|week]] there will (hopefully) be a UNIX script called&amp;lt;br&amp;gt;&lt;br /&gt;
''make-*''&amp;lt;br&amp;gt;&lt;br /&gt;
This script runs another script called&amp;lt;br&amp;gt;&lt;br /&gt;
''standard_*_solution''&amp;lt;br&amp;gt;&lt;br /&gt;
which again runs another script called&amp;lt;br&amp;gt;&lt;br /&gt;
''solve''&amp;lt;br&amp;gt;&lt;br /&gt;
for each [[subnets | network of stations]] (see each subnet with [[Subnets_GoogleEarth | Google Earth]]) for each day. The solve script runs solutions for each of these networks based on the data from sites in the network that are available in the qm directory.&lt;br /&gt;
&lt;br /&gt;
To run solutions, copy the ''make'' script to a file called ''make-flt'' (for example).&lt;br /&gt;
&lt;br /&gt;
Check that the ''make-flt'' script contains all the days that you want to run.&lt;br /&gt;
&lt;br /&gt;
Check which [[Linux Computer System | computer]] is free to run the script (to do so, type:)&lt;br /&gt;
   check-solves&lt;br /&gt;
log on to a free computer and type&lt;br /&gt;
   submit make-flt&lt;br /&gt;
&lt;br /&gt;
As the script runs, files will appear in the [[subdirectories|flt]] directory for each [[subnets|network]] for each day. Usually&lt;br /&gt;
this script will have been run once [[#What happens automatically? | automatically]], so there will often already be files in the [[subdirectories|flt]]&lt;br /&gt;
directory ready to be cleaned and then re-run.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Solve, a very flexible script. (link to detailed help)&lt;br /&gt;
&lt;br /&gt;
Philosophy of solve&lt;br /&gt;
&lt;br /&gt;
Subnet files and campaign files&lt;br /&gt;
&lt;br /&gt;
Standard solutions&lt;br /&gt;
&lt;br /&gt;
(text of standard_Alaska_solution)&lt;br /&gt;
&lt;br /&gt;
Running several days at once: make-make-flt and make-alaska&lt;br /&gt;
&lt;br /&gt;
(text of a standard make-alaska file and variant)&lt;br /&gt;
&lt;br /&gt;
Running several weeks at once&lt;br /&gt;
&lt;br /&gt;
(text of sample rerun-* file)&lt;br /&gt;
&lt;br /&gt;
===Cleaning Solutions===&lt;br /&gt;
&lt;br /&gt;
Sometimes bad data (outliers and cycle slips) make it past the automatic editors. When this&lt;br /&gt;
happens the bad data are removed from the qm files by either deleting points or by inserting&lt;br /&gt;
new phase ambiguities to deal with cycle slips. The steps, commands and scripts to use are explained [[How to clean a solution? | here]]. Once the data are cleaned, the files should be deleted from the [[subdirectories|flt]] directory and the solutions re-run (run &amp;lt;code&amp;gt;make-flt&amp;lt;/code&amp;gt; again). Usually you will have to go through 2-3&lt;br /&gt;
iterations of the cleaning-rerunning cycle. A solution is clean when its .point files either do not&lt;br /&gt;
exist or are small (below 1000 bytes). Once a solution is clean, the files should remain in&lt;br /&gt;
the [[subdirectories|flt]] directory and the lines to rerun that solution should be deleted from the &amp;lt;code&amp;gt;make-flt&amp;lt;/code&amp;gt; file.&lt;br /&gt;
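A quick way to spot solutions that still need cleaning is to look for oversized .point files; the demo directory and file names below are made up:

```shell
# List .point files of 1000 bytes or more, i.e. solutions not yet clean.
# The demo directory and file names are made up for illustration.
mkdir -p demo/flt
head -c 1500 /dev/zero > demo/flt/02jan01alaska.point   # still needs cleaning
head -c 200 /dev/zero > demo/flt/02jan02alaska.point    # effectively clean
find demo/flt -name '*.point' -size +999c               # lists only the first file
```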
&lt;br /&gt;
Initial Explanation of terms&lt;br /&gt;
&lt;br /&gt;
Expected residuals from a clean solution&lt;br /&gt;
&lt;br /&gt;
Automated screening: postfit, the point file, postbreak&lt;br /&gt;
&lt;br /&gt;
Checking for bad pseudorange data: badp, [[How to clean a solution?|allbadp]]&lt;br /&gt;
&lt;br /&gt;
Removing biased pseudorange data: [[How to clean a solution?|del_pcode_arc]]&lt;br /&gt;
&lt;br /&gt;
Automatically identified cycle slips: breaks, [[How to clean a solution?|allbreaks]]&lt;br /&gt;
&lt;br /&gt;
Quickly scanning through residuals: short_hand&amp;lt;br&amp;gt;&lt;br /&gt;
The data problems can be identified and&lt;br /&gt;
fixed using the program&lt;br /&gt;
   [[short_hand]] &lt;br /&gt;
(follow the link to read about and ask for help to get started with this program).&amp;lt;br&amp;gt; &lt;br /&gt;
Limitations of short_hand&lt;br /&gt;
&lt;br /&gt;
Manually checking residuals and fixing [[problem stations | problems]]&lt;br /&gt;
&lt;br /&gt;
===Procedure for Running Solutions for a week for the first time===&lt;br /&gt;
&lt;br /&gt;
A few special things need to be done the very first time solutions in a week are run. First, you need to make up a script to run all days of the week. This may need to be edited if the JPL final orbits are not available at the time. The &amp;lt;code&amp;gt;standard_Alaska_solution&amp;lt;/code&amp;gt; uses the non-fiducial orbits and thus requires that the final JPL orbits be present. If they are not, you can run &amp;lt;code&amp;gt;rapid_Alaska_solution&amp;lt;/code&amp;gt; instead. &lt;br /&gt;
&lt;br /&gt;
Then, the log files from autoclean should be moved away to a subdirectory, and any problem stations identified by autoclean should be checked. Then, you are ready to run the solutions.&lt;br /&gt;
&lt;br /&gt;
First, make a script to run solutions. For example, to make a script to run all days of the week for the Alaska solution:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd $ANALYSIS&lt;br /&gt;
make-make-flt 1381&lt;br /&gt;
cd 1381/flt&lt;br /&gt;
vi make-alaska&lt;br /&gt;
#  Edit the file if needed so that you are ready to run the rapid solutions.&lt;br /&gt;
&lt;br /&gt;
cat make-alaska&lt;br /&gt;
#!/bin/csh -f&lt;br /&gt;
#&lt;br /&gt;
setenv CAMP $ANALYSIS/1381&lt;br /&gt;
#&lt;br /&gt;
#standard_Alaska_solution 06jul01&lt;br /&gt;
#standard_Alaska_solution 06jun25&lt;br /&gt;
#standard_Alaska_solution 06jun26&lt;br /&gt;
#standard_Alaska_solution 06jun27&lt;br /&gt;
#standard_Alaska_solution 06jun28&lt;br /&gt;
#standard_Alaska_solution 06jun29&lt;br /&gt;
#standard_Alaska_solution 06jun30&lt;br /&gt;
#&lt;br /&gt;
rapid_Alaska_solution 06jul01&lt;br /&gt;
rapid_Alaska_solution 06jun25&lt;br /&gt;
rapid_Alaska_solution 06jun26&lt;br /&gt;
rapid_Alaska_solution 06jun27&lt;br /&gt;
rapid_Alaska_solution 06jun28&lt;br /&gt;
rapid_Alaska_solution 06jun29&lt;br /&gt;
rapid_Alaska_solution 06jun30&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The script &amp;lt;code&amp;gt;make-make-flt&amp;lt;/code&amp;gt; finds all unique dates for qm files in that week's directory, and uses that to generate the script, so if you run it before the end of the week you will get a partial script. If the final JPL orbits are not yet present, you will need to edit the script to change &amp;quot;standard&amp;quot; to &amp;quot;rapid&amp;quot;. Or better yet, copy all the lines and comment one set out, then modify the others to read &amp;quot;rapid_Alaska_solution &amp;lt;date&amp;gt;&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
Next, call &amp;lt;code&amp;gt;mv_logfiles&amp;lt;/code&amp;gt; (from within the WEEK/flt directory!), which creates a subdirectory called &amp;lt;code&amp;gt;logfiles&amp;lt;/code&amp;gt; and moves &lt;br /&gt;
all of autoclean's logfiles of the format &amp;lt;code&amp;gt;*____*.i*&amp;lt;/code&amp;gt; into this directory:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
HOSTNAME WWWW/flt&amp;gt; mv_logfiles&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now look for a file called &amp;lt;code&amp;gt;make-problems&amp;lt;/code&amp;gt;, which lists all files that autoclean had a problem with. Sometimes these files are almost clean, but sometimes they are full of junk or horribly mangled by the automated editing. There should be PPP solutions already run for these files, so they are ready to be checked.&lt;br /&gt;
&lt;br /&gt;
Set the &amp;lt;code&amp;gt;CAMP&amp;lt;/code&amp;gt; variable (if not set): &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
setenv CAMP $ANALYSIS/wwww&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Again, &amp;lt;code&amp;gt;wwww&amp;lt;/code&amp;gt; is the 4-digit GPS-week.&lt;br /&gt;
&lt;br /&gt;
Now run the solutions. The first time you run the solutions, look at the residuals very carefully before trying &amp;lt;code&amp;gt;short_hand&amp;lt;/code&amp;gt;. Uncompress the postlog, postfit and postbreak files, and then use &amp;lt;code&amp;gt;allbadp&amp;lt;/code&amp;gt; to check the pseudorange and &amp;lt;code&amp;gt;allbreaks&amp;lt;/code&amp;gt; to check for major cycle slips.&lt;br /&gt;
&lt;br /&gt;
A very common problem is that for several stations per week, there will be one satellite arc of pseudorange data whose residuals are all roughly 2000 cm. If you see these, don't delete the data; instead run del_pcode_arc to remove only the pseudorange data. I am not sure why these show up, but it could be either a hardware channel bias or a pre-processing glitch. They happen much more often with Ashtech receivers than with any others, and are particularly common at the US Coast Guard CORS sites. In the &amp;lt;code&amp;gt;qm&amp;lt;/code&amp;gt; directory,&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
del_pcode_arc *02gus2* GUS2 GPS41&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you just run &amp;lt;code&amp;gt;short_hand&amp;lt;/code&amp;gt; without looking, it will probably either throw out all the pseudorange for one site, or delete a lot of data (phase and pseudorange) where only the pseudorange needs to be deleted. So don't do that. Instead, first delete a batch of bad pseudorange data to get the number of pseudorange outliers under control for the next run:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd $ANALYSIS/1381/flt&lt;br /&gt;
gunzip *alaska*post*&lt;br /&gt;
allbadp&lt;br /&gt;
allbreaks&lt;br /&gt;
# Based on this, run del_pcode_arc as above, and add ambiguities manually if needed.&lt;br /&gt;
#&lt;br /&gt;
delete_allbadp 50&lt;br /&gt;
#  This creates a file called delete.&lt;br /&gt;
vi delete&lt;br /&gt;
#    Remove lines for any points for which you have already run del_pcode_arc.&lt;br /&gt;
cd ../qm&lt;br /&gt;
sh ../flt/delete&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
At this point, don't worry too much about phase outliers. Basically we are trying to get the number of pseudorange outliers down into the range where &amp;lt;code&amp;gt;short_hand&amp;lt;/code&amp;gt; will do the right thing when we run it later. Now may be a good time to run &amp;lt;code&amp;gt;Alaska_cleaning_solution $date&amp;lt;/code&amp;gt;, which runs a smaller and much faster solution including the sites that most often need some cleaning.&lt;br /&gt;
&lt;br /&gt;
===Data Backup===&lt;br /&gt;
&lt;br /&gt;
RINEX file backups.&lt;br /&gt;
There are either one or two separate backups of the raw RINEX files. For data we collected&lt;br /&gt;
ourselves, a copy of the original rinex files can be found in either the campaign&lt;br /&gt;
directory (/gps/akda/Campaigns/Data.2007/&amp;lt;project&amp;gt;, where &amp;lt;project&amp;gt; is the project name,&lt;br /&gt;
and Data.2007 will change with the year), or in the continuous site ftp area&lt;br /&gt;
(/gps/akda/Permanent/2007/260/, where 2007 is the year and 260 is the day of year). Also,&lt;br /&gt;
every rinex file put through the hopper is moved to a directory like $RAWDATA/2007/260/&lt;br /&gt;
(again, year and day of year may change). However, the $RAWDATA/2007/260/ directories are&lt;br /&gt;
not really archived and eventually they will be deleted. But in practice we have most of&lt;br /&gt;
the last few years of RINEX files online in case something goes wrong.&lt;br /&gt;
&lt;br /&gt;
QM file backups.&lt;br /&gt;
Before autoclean makes any changes to a qm file, it copies the file to a subdirectory&lt;br /&gt;
called &amp;quot;original&amp;quot; in the qm directory. So if you completely destroy a qm file by&lt;br /&gt;
accident, you can still go back to the original version. Of course, that loses all editing&lt;br /&gt;
done to the file, but at least the original data can be recovered easily. In general, it&lt;br /&gt;
is not a good idea to go back to the version in the original subdirectory unless you know&lt;br /&gt;
what you are doing, because doing that can make a lot more work for everyone. Mostly we&lt;br /&gt;
do that when files have been mangled by autoclean. It is actually hard to mangle data&lt;br /&gt;
files using our usual editing procedures.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Customized Solutions===&lt;br /&gt;
&lt;br /&gt;
Sometimes customized solutions are required for various reasons.  The links below describe some strategies that may improve your situation.&lt;br /&gt;
&amp;lt;br&amp;gt;[[Kinematic Processing]]&lt;br /&gt;
&amp;lt;br&amp;gt;[[Ambiguity Resolution]]&lt;br /&gt;
&lt;br /&gt;
===Products / file contents===&lt;br /&gt;
&lt;br /&gt;
Where can certain information be found, and what are the structure and content of the output files? This is being summarized on the [[files | files]] page.&lt;br /&gt;
&lt;br /&gt;
=== Velocity solutions ===&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/&amp;gt; su akda&lt;br /&gt;
/&amp;gt; cd $ANALYZED&lt;br /&gt;
/&amp;gt; mkdir &amp;lt;USEFUL NEW PROJECTNAME&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Find another project and copy the following files into your new directory:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/&amp;gt; cp *.nml $ANALYZED/new_project&lt;br /&gt;
/&amp;gt; cp make_vel* $ANALYZED/new_project&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Rename the *.nml file to something reasonable for your project and edit it: it needs to know the stations that you want to include&lt;br /&gt;
in your velocity solution as well as the locations of the input files the data comes from. To get the input files in the correct syntax you might want to use:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/&amp;gt; grep_vel_infiles.pl --from-week=WWWW --to-week=WWWW --infile-index=x --sum-id=&amp;lt;alaska2.0_nfxigs03 | NEAsia2.0_nfxigs03 | ...&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;quot;WWWW&amp;quot; stands for the week you want to start / end with, and &amp;quot;x&amp;quot; is the starting value for the id-counter. These IDs are useful later to reference certain input files for additional editing (see below). The from-week, to-week, and infile-index options are optional. &amp;quot;sum-id&amp;quot; is basically the solution name. Copy all lines of the format &amp;quot;   infile(x) = '...'&amp;quot; into your namelist (nml) file (insert them before the &amp;amp;amp;end). ([http://www.gps.alaska.edu/internal/index.php/Special:Call/Include_info%2C/export/ftpweb/htdocs/sh2doc/grep_vel_infiles.pl.html grep_vel_infiles.pl]  documentation)&lt;br /&gt;
&lt;br /&gt;
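For example, to collect the input files for GPS weeks 1222 through 1381 into a North East Asia solution (this particular week range is purely illustrative):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/&amp;gt; grep_vel_infiles.pl --from-week=1222 --to-week=1381 --infile-index=1 --sum-id=NEAsia2.0_nfxigs03&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;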
Once the editing of the namelist file is finished:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/&amp;gt; refresh_zebu yournamelist.nml outfile.ref&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This orders and re-numbers the entries in your namelist file, which may also contain comments. Rather than making you go through the namelist file and renumber everything by hand whenever you want to throw out a station or some data, refresh_zebu does that for you.&lt;br /&gt;
&lt;br /&gt;
Once you have a nice reference file (.ref):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/&amp;gt; rzebu2 outfile.ref &amp;gt; &amp;amp; out&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
You should redirect the output to a file so you can look at it later :).&lt;br /&gt;
&lt;br /&gt;
As soon as the solution has finished running, look at the total Chi_squared value at the bottom of the output in &amp;quot;out&amp;quot;. It should be &amp;quot;1&amp;quot;. If it is not (which is likely the first few times you run a solution), look for the sites that cause the deviation from chi_squared=1 and note the site names with the largest chi-squared values. Then you can do three things:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
A) /&amp;gt; grep SITE outlier.inf&lt;br /&gt;
B) /&amp;gt; grep SITE residual.inf&lt;br /&gt;
C) /&amp;gt; vi $ANALYSIS/solution_timeseries/SITE.pfiles&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In all three cases you want to find dates on which the sigmas for this site are rather large. Note down the date, find it in the namelist&lt;br /&gt;
file (*.nml), and remove the respective site from the velocity solution for that day by adding a line:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
removedat(a, infile_id) = 'SITENAME'&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;quot;a&amp;quot; is an id that simply counts how many removedat entries have been applied to that one infile_id; &amp;quot;infile_id&amp;quot; is the counter mentioned above. An example is probably best:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
   infile(161) = '/gps/analysis/1222/post/03jun10NEAsia2.0_nfxigs03.sum'&lt;br /&gt;
    removedat(1,161) = 'ELD '&lt;br /&gt;
    removedat(2,161) = 'PETP'&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here I assume that both sites, &amp;quot;ELD&amp;quot; and &amp;quot;PETP&amp;quot;, misbehave on June 10, 2003 in the North East Asia solution, so I remove them from the velocity solution for that day.&lt;br /&gt;
&lt;br /&gt;
Remove all the files created by &amp;lt;code&amp;gt;rzebu2&amp;lt;/code&amp;gt;. Creating a Makefile of the form:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
clean:&lt;br /&gt;
        rm *.ref ATWA ATY solution.* *.inf out nnr.* *.dat *.gmtvec *.vel fort.* gmt.format STACOV argus.weights&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
might be useful.&lt;br /&gt;
&lt;br /&gt;
Repeat the above until your reduced chi squared is &amp;lt;= 1.0. If you can't get there, change the fudge_factor as follows:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
new_fudge = old_fudge * Chi_squared&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
and rerun the solution one last time. The reduced chi-squared value should now be 1.0.&lt;br /&gt;
&lt;br /&gt;
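To make the fudge-factor update concrete (the numbers here are purely illustrative): if the current fudge factor is 0.0025 and the final solution reports Chi_squared = 1.44, then&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
new_fudge = 0.0025 * 1.44 = 0.0036&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;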
Once you have achieved that, you can go to the next level and run one of the make_vel files you copied to your directory:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
make_vel_Sella: velocities relative to a stable North America&lt;br /&gt;
make_vel_ITRF: velocities in ITRF &lt;br /&gt;
make_vel_EURA: velocities relative to a stable Eurasia &lt;br /&gt;
make_vel_XXXX: velocities with reference station XXXX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
There might be others, or you can edit these files to adapt them to your needs. The make files create *.gmtvec output which you can use with, e.g., &amp;lt;code&amp;gt;psvelo&amp;lt;/code&amp;gt; in a GMT script.&lt;/div&gt;</summary>
		<author><name>Jeff</name></author>	</entry>

	<entry>
		<id>http://gps.alaska.edu/internal/index.php?title=Solution_Strategy_2.5&amp;diff=1699</id>
		<title>Solution Strategy 2.5</title>
		<link rel="alternate" type="text/html" href="http://gps.alaska.edu/internal/index.php?title=Solution_Strategy_2.5&amp;diff=1699"/>
				<updated>2010-01-12T20:03:50Z</updated>
		
		<summary type="html">&lt;p&gt;Jeff: New page: page not yet prepared.   Back to GPS Analysis System  More stuff goes here.   Back to GPS Analysis System&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;page not yet prepared.&lt;br /&gt;
&lt;br /&gt;
[[GPS_analysis_system | Back to GPS Analysis System]]&lt;br /&gt;
&lt;br /&gt;
More stuff goes here.&lt;br /&gt;
&lt;br /&gt;
[[GPS_analysis_system | Back to GPS Analysis System]]&lt;/div&gt;</summary>
		<author><name>Jeff</name></author>	</entry>

	<entry>
		<id>http://gps.alaska.edu/internal/index.php?title=Solution_Strategy_2.0&amp;diff=1698</id>
		<title>Solution Strategy 2.0</title>
		<link rel="alternate" type="text/html" href="http://gps.alaska.edu/internal/index.php?title=Solution_Strategy_2.0&amp;diff=1698"/>
				<updated>2010-01-12T20:03:41Z</updated>
		
		<summary type="html">&lt;p&gt;Jeff: New page: page not yet prepared.   Back to GPS Analysis System  More stuff goes here.   Back to GPS Analysis System&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;page not yet prepared.&lt;br /&gt;
&lt;br /&gt;
[[GPS_analysis_system | Back to GPS Analysis System]]&lt;br /&gt;
&lt;br /&gt;
More stuff goes here.&lt;br /&gt;
&lt;br /&gt;
[[GPS_analysis_system | Back to GPS Analysis System]]&lt;/div&gt;</summary>
		<author><name>Jeff</name></author>	</entry>

	<entry>
		<id>http://gps.alaska.edu/internal/index.php?title=Solution_Strategy_1.0&amp;diff=1697</id>
		<title>Solution Strategy 1.0</title>
		<link rel="alternate" type="text/html" href="http://gps.alaska.edu/internal/index.php?title=Solution_Strategy_1.0&amp;diff=1697"/>
				<updated>2010-01-12T20:03:21Z</updated>
		
		<summary type="html">&lt;p&gt;Jeff: New page: page not yet prepared   Back to GPS Analysis System  More stuff goes here.   Back to GPS Analysis System&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;page not yet prepared&lt;br /&gt;
&lt;br /&gt;
[[GPS_analysis_system | Back to GPS Analysis System]]&lt;br /&gt;
&lt;br /&gt;
More stuff goes here.&lt;br /&gt;
&lt;br /&gt;
[[GPS_analysis_system | Back to GPS Analysis System]]&lt;/div&gt;</summary>
		<author><name>Jeff</name></author>	</entry>

	<entry>
		<id>http://gps.alaska.edu/internal/index.php?title=Solution_Strategy_3.0&amp;diff=1696</id>
		<title>Solution Strategy 3.0</title>
		<link rel="alternate" type="text/html" href="http://gps.alaska.edu/internal/index.php?title=Solution_Strategy_3.0&amp;diff=1696"/>
				<updated>2010-01-12T20:02:55Z</updated>
		
		<summary type="html">&lt;p&gt;Jeff: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This solution strategy is still under development.&lt;br /&gt;
&lt;br /&gt;
[[GPS_analysis_system | Back to GPS Analysis System]]&lt;br /&gt;
&lt;br /&gt;
More stuff goes here.&lt;br /&gt;
&lt;br /&gt;
[[GPS_analysis_system | Back to GPS Analysis System]]&lt;/div&gt;</summary>
		<author><name>Jeff</name></author>	</entry>

	<entry>
		<id>http://gps.alaska.edu/internal/index.php?title=Solution_Strategy_3.0&amp;diff=1695</id>
		<title>Solution Strategy 3.0</title>
		<link rel="alternate" type="text/html" href="http://gps.alaska.edu/internal/index.php?title=Solution_Strategy_3.0&amp;diff=1695"/>
				<updated>2010-01-12T20:02:25Z</updated>
		
		<summary type="html">&lt;p&gt;Jeff: New page: This solution strategy is still under development.   Back to GPS Analysis System&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This solution strategy is still under development.&lt;br /&gt;
&lt;br /&gt;
[[GPS_analysis_system | Back to GPS Analysis System]]&lt;/div&gt;</summary>
		<author><name>Jeff</name></author>	</entry>

	<entry>
		<id>http://gps.alaska.edu/internal/index.php?title=GPS_analysis_system&amp;diff=1694</id>
		<title>GPS analysis system</title>
		<link rel="alternate" type="text/html" href="http://gps.alaska.edu/internal/index.php?title=GPS_analysis_system&amp;diff=1694"/>
				<updated>2010-01-12T20:00:39Z</updated>
		
		<summary type="html">&lt;p&gt;Jeff: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;We use a GPS data analysis system based on the [http://sideshow.jpl.nasa.gov GIPSY] software developed at JPL. Most of the GIPSY programs are called by [[Special:Call/Include_info%2C/export/ftpweb/htdocs/sh2doc/index.html | shell scripts]] written by Jeff Freymueller. Using these scripts, we can analyze a large amount of data either as part of network solutions or in Precise Point Positioning (PPP) mode.&lt;br /&gt;
&lt;br /&gt;
===Documentation of Solution Strategies (new)===&lt;br /&gt;
The links below describe our solution strategy as it has evolved over time.&lt;br /&gt;
[[Solution_Strategy_1.0 | 1.0 1990s strategy.]]&lt;br /&gt;
[[Solution_Strategy_2.0 | 2.0 Network solutions used from ~2002 through 2008.]]&lt;br /&gt;
[[Solution_Strategy_2.5 | 2.5 PPP solutions but otherwise strategy 2.0]]&lt;br /&gt;
[[Solution_Strategy_3.0 | 3.0 2010 new solution strategy]]&lt;br /&gt;
&lt;br /&gt;
===Where do you put RINEX files?===&lt;br /&gt;
RINEX files should be put into the hopper, &amp;lt;code&amp;gt;$RAWDATA/hopper&amp;lt;/code&amp;gt;. What, you don't have RINEX files yet? See [[RINEXing]]. Once files are in the hopper, you can either let the first processing stages happen automatically overnight (see next section), or run the &amp;lt;code&amp;gt;autofront&amp;lt;/code&amp;gt; and &amp;lt;code&amp;gt;autoclean&amp;lt;/code&amp;gt; programs manually.&lt;br /&gt;
&lt;br /&gt;
===What happens automatically?===&lt;br /&gt;
&lt;br /&gt;
Quite a lot.&lt;br /&gt;
&lt;br /&gt;
Autoftp runs every night beginning at 6pm local time and fetches data files. These are placed into the '''hopper''' ($RAWDATA/hopper), a directory where all data files are put for entry into the system and processing. Autofront then runs at midnight to process all files in the hopper, including any placed there manually (from campaigns, for example). Finally, autoclean runs at 4am local time to carry out automated screening for cycle slips and other bad data.&lt;br /&gt;
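&lt;br /&gt;
That schedule corresponds to a crontab along these lines (the command names follow the text above, but the exact paths and arguments of the installed cron jobs are assumptions, not the real crontab):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# min hour dom mon dow   command&lt;br /&gt;
0   18   *   *   *       autoftp&lt;br /&gt;
0    0   *   *   *       autofront&lt;br /&gt;
0    4   *   *   *       autoclean&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;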
&lt;br /&gt;
====Autoftp====&lt;br /&gt;
&lt;br /&gt;
Autoftp is an efficient data-fetching tool that uses wget to automatically get data from any of several internet GPS data archives. It reads a list of desired sites from a ''request file'', which contains the date in the filename, and attempts to find and download data from as many sites as possible. It is intended to run automatically on a daily basis under cron, and when accompanied by another simple program to generate a standard request file every day, it can easily fetch a standard set of sites on a daily basis for analysis. Because it keeps track in the request file of sites that it has found already, autoftp can be run multiple times with the same request file and it will not repeatedly fetch data. This is ideal for the real world, in which data from some sites are available rapidly while data from other sites may require many hours or days to become available.&lt;br /&gt;
&lt;br /&gt;
====Autofront====&lt;br /&gt;
&lt;br /&gt;
Autofront is a script intended to run under cron that carries out the initial &amp;quot;front end&amp;quot; processing on a set of GPS data files. When executed, it will process all files in the '''hopper''' directory, and will place each resulting qm file into the appropriate week directory.&lt;br /&gt;
&lt;br /&gt;
Autofront runs the following steps:&lt;br /&gt;
1. Checks the validity of each RINEX file and repairs some common problems.&lt;br /&gt;
2. Depending on receiver type, runs clockprep -fixtags.&lt;br /&gt;
3. (optional, presently not the default) PhasEdit&lt;br /&gt;
4. ninja&lt;br /&gt;
&lt;br /&gt;
====Autoclean====&lt;br /&gt;
&lt;br /&gt;
Autoclean carries out automated cleaning of cycle slips, based on point positioning solutions. It is quite effective and at present it rarely misses cycle slips unless they are smaller than its minimum tolerance (10 cm). Autoclean operates on an ''edit-request'' file, which contains the name of the directory (week directory) and a list of qm files that need to be cleaned. It will clean all files on the list as long as orbits and clocks are available, and it marks off files that have been cleaned so that it can safely be run multiple times.&lt;br /&gt;
&lt;br /&gt;
Autoclean operates in an iterative mode. Its zeroth iteration is to do a pseudorange-only solution and identify and delete extremely bad pseudorange data. In this step it uses a tolerance that catches only grossly biased data. (Explain it). It then carries out 1 or more iterations of screening the phase data. In each iteration, it uses postbreak to identify discontinuities in the residuals of a point positioning solution. Postbreak is run with an adaptive tolerance (minimum 10 cm), and it is critical that my slightly modified version of postbreak be used. If any cycle slips are discovered, they are flagged and another iteration is run. Autoclean runs a maximum of 4 iterations on the phase data.&lt;br /&gt;
&lt;br /&gt;
===Where do the data files go?===&lt;br /&gt;
Data files from each station are stored in the QM format that is native to GIPSY. QM files (and all other files) are stored in directories by GPS week. For each [[week_directory|week directory]] there are several [[subdirectories]]; qm files are stored in &amp;lt;code&amp;gt;$ANALYSIS/wwww/qm&amp;lt;/code&amp;gt;, where &amp;lt;code&amp;gt;wwww&amp;lt;/code&amp;gt; is the 4-character GPS week number (with a leading zero if needed).&lt;br /&gt;
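&lt;br /&gt;
For example, data collected on 21 May 2006 falls in GPS week 1376, so its qm files would be listed with:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/&amp;gt; ls $ANALYSIS/1376/qm&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;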
&lt;br /&gt;
===Running Solutions===&lt;br /&gt;
In the [[subdirectories|flt]] directory for each [[week directory|week]] there will (hopefully) be a UNIX script called&amp;lt;br&amp;gt;&lt;br /&gt;
''make-*''&amp;lt;br&amp;gt;&lt;br /&gt;
This script runs another script called&amp;lt;br&amp;gt;&lt;br /&gt;
''standard_*_solution''&amp;lt;br&amp;gt;&lt;br /&gt;
which again runs another script called&amp;lt;br&amp;gt;&lt;br /&gt;
''solve''&amp;lt;br&amp;gt;&lt;br /&gt;
for each [[subnets | network of stations]] (see each subnet with [[Subnets_GoogleEarth | Google Earth]]) for each day. The solve script runs solutions for each of these networks based on the data from sites in the network that are available in the qm directory.&lt;br /&gt;
&lt;br /&gt;
To run solutions, copy the ''make'' script to a file called ''make-flt'' (for example).&lt;br /&gt;
&lt;br /&gt;
Check that the ''make-flt'' script contains all the days that you want to run.&lt;br /&gt;
&lt;br /&gt;
Check which [[Linux Computer System | computer]] is free to run the script; to do so, type&lt;br /&gt;
   check-solves&lt;br /&gt;
log on to a free computer and type&lt;br /&gt;
   submit make-flt&lt;br /&gt;
&lt;br /&gt;
As the script runs, files will appear in the [[subdirectories|flt]] directory for each [[subnets|network]] for each day. Usually&lt;br /&gt;
this script will have been run once [[#What happens automatically? | automatically]], so there will often already be files in the [[subdirectories|flt]]&lt;br /&gt;
directory ready to be cleaned and then re-run.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Solve, a very flexible script. (link to detailed help)&lt;br /&gt;
&lt;br /&gt;
Philosophy of solve&lt;br /&gt;
&lt;br /&gt;
Subnet files and campaign files&lt;br /&gt;
&lt;br /&gt;
Standard solutions&lt;br /&gt;
&lt;br /&gt;
(text of standard_Alaska_solution)&lt;br /&gt;
&lt;br /&gt;
Running several days at once: make-make-flt and make-alaska&lt;br /&gt;
&lt;br /&gt;
(text of a standard make-alaska file and variant)&lt;br /&gt;
&lt;br /&gt;
Running several weeks at once&lt;br /&gt;
&lt;br /&gt;
(text of sample rerun-* file)&lt;br /&gt;
&lt;br /&gt;
===Cleaning Solutions===&lt;br /&gt;
&lt;br /&gt;
Sometimes bad data (outliers and cycle slips) make it past the automatic editors. When this&lt;br /&gt;
happens the bad data are removed from the qm files by either deleting points or by inserting&lt;br /&gt;
new phase ambiguities to deal with cycle slips. The steps, commands and scripts to use are somewhat explained [[How to clean a solution? | here]]. Once the data are cleaned, the files should be deleted from the [[subdirectories|flt]] directory and the solutions re-run (run &amp;lt;code&amp;gt;make-flt&amp;lt;/code&amp;gt; again). Usually you will have to go through 2-3&lt;br /&gt;
iterations of the cleaning-rerunning cycle. A solution counts as clean when its .point files either do not&lt;br /&gt;
exist or are small (below 1000 bytes). Once a solution is clean the files should remain in&lt;br /&gt;
the [[subdirectories|flt]] directory and the lines to rerun that solution should be deleted from the &amp;lt;code&amp;gt;make-flt&amp;lt;/code&amp;gt; file.&lt;br /&gt;
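&lt;br /&gt;
A quick way to spot solutions that are not yet clean is to list any .point files larger than 1000 bytes (the exact filename pattern is an assumption; adjust it to match your solution names):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/&amp;gt; find $ANALYSIS/wwww/flt -name '*.point*' -size +1000c&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;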
&lt;br /&gt;
Initial Explanation of terms&lt;br /&gt;
&lt;br /&gt;
Expected residuals from a clean solution&lt;br /&gt;
&lt;br /&gt;
Automated screening: postfit, the point file, postbreak&lt;br /&gt;
&lt;br /&gt;
Checking for bad pseudorange data: badp, [[How to clean a solution?|allbadp]]&lt;br /&gt;
&lt;br /&gt;
Removing biased pseudorange data: [[How to clean a solution?|del_pcode_arc]]&lt;br /&gt;
&lt;br /&gt;
Automatically identified cycle slips: breaks, [[How to clean a solution?|allbreaks]]&lt;br /&gt;
&lt;br /&gt;
Quickly scanning through residuals: short_hand&amp;lt;br&amp;gt;&lt;br /&gt;
The data problems can be identified and&lt;br /&gt;
fixed using the program&lt;br /&gt;
   [[short_hand]] &lt;br /&gt;
(follow the link to read about and ask for help to get started with this program).&amp;lt;br&amp;gt; &lt;br /&gt;
Limitations of short_hand&lt;br /&gt;
&lt;br /&gt;
Manually checking residuals and fixing [[problem stations | problems]]&lt;br /&gt;
&lt;br /&gt;
===Procedure for Running Solutions for a week for the first time===&lt;br /&gt;
&lt;br /&gt;
A few special things need to be done the very first time solutions in a week are run. First, you need to make up a script to run all days of the week. This may need to be edited if the JPL final orbits are not available at the time. The &amp;lt;code&amp;gt;standard_Alaska_solution&amp;lt;/code&amp;gt; uses the non-fiducial orbits and thus requires that the final JPL orbits be present. If they are not, you can run &amp;lt;code&amp;gt;rapid_Alaska_solution&amp;lt;/code&amp;gt; instead. &lt;br /&gt;
&lt;br /&gt;
Then, the log files from autoclean should be moved away to a subdirectory, and any problem stations identified by autoclean should be checked. Then, you are ready to run the solutions.&lt;br /&gt;
&lt;br /&gt;
First, make a script to run solutions. For example, to make a script to run all days of the week for the Alaska solution:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd $ANALYSIS&lt;br /&gt;
make-make-flt 1381&lt;br /&gt;
cd 1381/flt&lt;br /&gt;
vi make-alaska&lt;br /&gt;
#  Edit the file if needed so that you are ready to run the rapid solutions.&lt;br /&gt;
&lt;br /&gt;
cat make-alaska&lt;br /&gt;
#!/bin/csh -f&lt;br /&gt;
#&lt;br /&gt;
setenv CAMP $ANALYSIS/1381&lt;br /&gt;
#&lt;br /&gt;
#standard_Alaska_solution 06jul01&lt;br /&gt;
#standard_Alaska_solution 06jun25&lt;br /&gt;
#standard_Alaska_solution 06jun26&lt;br /&gt;
#standard_Alaska_solution 06jun27&lt;br /&gt;
#standard_Alaska_solution 06jun28&lt;br /&gt;
#standard_Alaska_solution 06jun29&lt;br /&gt;
#standard_Alaska_solution 06jun30&lt;br /&gt;
#&lt;br /&gt;
rapid_Alaska_solution 06jul01&lt;br /&gt;
rapid_Alaska_solution 06jun25&lt;br /&gt;
rapid_Alaska_solution 06jun26&lt;br /&gt;
rapid_Alaska_solution 06jun27&lt;br /&gt;
rapid_Alaska_solution 06jun28&lt;br /&gt;
rapid_Alaska_solution 06jun29&lt;br /&gt;
rapid_Alaska_solution 06jun30&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The script &amp;lt;code&amp;gt;make-make-flt&amp;lt;/code&amp;gt; finds all unique dates for qm files in that week's directory and uses them to generate the script, so if you run it before the end of the week you will get a partial script. If the final JPL orbits are not yet present, you will need to edit the script to change &amp;quot;standard&amp;quot; to &amp;quot;rapid&amp;quot;. Or better yet, copy all the lines and comment one set out, then modify the others to read &amp;quot;rapid_Alaska_solution &amp;lt;date&amp;gt;&amp;quot;.&lt;br /&gt;
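&lt;br /&gt;
One way to make that edit in bulk (just a sketch; keep a backup copy, and adjust the script name if yours differs):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/&amp;gt; cp make-alaska make-alaska.orig&lt;br /&gt;
/&amp;gt; sed 's/^standard_Alaska_solution/rapid_Alaska_solution/' make-alaska.orig &amp;gt; make-alaska&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;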
&lt;br /&gt;
Next call &amp;lt;code&amp;gt;mv_logfiles&amp;lt;/code&amp;gt; (being in the WEEK/flt directory!) which creates a subdirectory called &amp;lt;code&amp;gt;logfiles&amp;lt;/code&amp;gt; and moves &lt;br /&gt;
all of autoclean's logfiles of the format &amp;lt;code&amp;gt;*____*.i*&amp;lt;/code&amp;gt; into this directory:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
HOSTNAME WWWW/flt&amp;gt; mv_logfiles&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now look for a file called &amp;lt;code&amp;gt;make-problems&amp;lt;/code&amp;gt;, which lists all files that autoclean had a problem with. Sometimes these files are almost clean, but sometimes they are full of junk or horribly mangled by the automated editing. There should be PPP solutions already run for these files, so they are ready to be checked.&lt;br /&gt;
&lt;br /&gt;
Set the &amp;lt;code&amp;gt;CAMP&amp;lt;/code&amp;gt; variable (if not set): &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
setenv CAMP $ANALYSIS/wwww&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Again, &amp;lt;code&amp;gt;wwww&amp;lt;/code&amp;gt; is the 4-character GPS week number.&lt;br /&gt;
&lt;br /&gt;
Now run the solutions. The first time you run the solutions, look at the residuals very carefully before trying &amp;lt;code&amp;gt;short_hand&amp;lt;/code&amp;gt;. Uncompress the postlog, postfit and postbreak files, and then use &amp;lt;code&amp;gt;allbadp&amp;lt;/code&amp;gt; to check the pseudorange and &amp;lt;code&amp;gt;allbreaks&amp;lt;/code&amp;gt; to check for major cycle slips.&lt;br /&gt;
&lt;br /&gt;
A very common problem is that for several stations per week, there will be one satellite arc of pseudorange data that all have residuals of roughly 2000 cm. If you see these, don't delete the data, but instead run del_pcode_arc to remove only the pseudorange data. I am not sure why these show up, but it could be either a hardware channel bias or a pre-processing glitch. They happen much more often with Ashtechs than any others, and are particularly common with the US Coast Guard CORS sites. In the &amp;lt;code&amp;gt;qm&amp;lt;/code&amp;gt; directory,&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
del_pcode_arc *02gus2* GUS2 GPS41&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you just run &amp;lt;code&amp;gt;short_hand&amp;lt;/code&amp;gt; without looking, it will probably either throw out all the pseudorange for one site, or delete a lot of data (phase and pseudorange) where only the pseudorange needs to be deleted, so don't do that. Instead, first delete a batch of bad pseudorange data to get the number of pseudorange outliers under control for the next run.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd $ANALYSIS/1381/flt&lt;br /&gt;
gunzip *alaska*post*&lt;br /&gt;
allbadp&lt;br /&gt;
allbreaks&lt;br /&gt;
# Based on this, run del_pcode_arc as above, and add ambiguities manually if needed.&lt;br /&gt;
#&lt;br /&gt;
delete_allbadp 50&lt;br /&gt;
#  This creates a file called delete.&lt;br /&gt;
vi delete&lt;br /&gt;
#    Remove lines for any points for which you have already run del_pcode_arc.&lt;br /&gt;
cd ../qm&lt;br /&gt;
sh ../flt/delete&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
At this point, don't worry too much about phase outliers. Basically we are trying to get the number of pseudorange outliers down into the range where &amp;lt;code&amp;gt;short_hand&amp;lt;/code&amp;gt; will do the right thing when we run it later. Now may be a good time to run &amp;lt;code&amp;gt;Alaska_cleaning_solution $date&amp;lt;/code&amp;gt;, which runs a smaller and much faster solution including the sites that most often need some cleaning.&lt;br /&gt;
&lt;br /&gt;
===Data Backup===&lt;br /&gt;
&lt;br /&gt;
RINEX file backups.&lt;br /&gt;
There are either 1 or 2 separate backups of the raw rinex files. For data we collected&lt;br /&gt;
ourselves, a copy of the original rinex files can be found in either the campaign&lt;br /&gt;
directory (/gps/akda/Campaigns/Data.2007/&amp;lt;project&amp;gt;, where &amp;lt;project&amp;gt; is the project name,&lt;br /&gt;
and Data.2007 will change with the year), or in the continuous site ftp area&lt;br /&gt;
(/gps/akda/Permanent/2007/260/, where 2007 is the year and 260 is the day of year). Also,&lt;br /&gt;
every rinex file put through the hopper is moved to a directory like $RAWDATA/2007/260/&lt;br /&gt;
(again, year and day of year may change). However, the $RAWDATA/2007/260/ directories are&lt;br /&gt;
not really archived and eventually they will be deleted. But in practice we have most of&lt;br /&gt;
the last few years of rinex files online in case something goes wrong.&lt;br /&gt;
&lt;br /&gt;
QM file backups.&lt;br /&gt;
Before autoclean makes any changes to a qm file, it copies the file to a subdirectory&lt;br /&gt;
called &amp;quot;original&amp;quot; in the qm directory. So if you completely destroy a qm file by&lt;br /&gt;
accident, you can still go back to the original version. Of course, that loses all editing&lt;br /&gt;
done to the file, but at least the original data can be recovered easily. In general, it&lt;br /&gt;
is not a good idea to go back to the version in the original subdirectory unless you know&lt;br /&gt;
what you are doing, because doing that can make a lot more work for everyone. Mostly we&lt;br /&gt;
do that when files have been mangled by autoclean. It is actually hard to mangle data&lt;br /&gt;
files using our usual editing procedures.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Customized Solutions===&lt;br /&gt;
&lt;br /&gt;
Sometimes customized solutions are required for various reasons.  This link will provide some strategies that may improve your situation.&lt;br /&gt;
&amp;lt;br&amp;gt;[[Kinematic Processing]]&lt;br /&gt;
&amp;lt;br&amp;gt;[[Ambiguity Resolution]]&lt;br /&gt;
&lt;br /&gt;
===Products / file contents===&lt;br /&gt;
&lt;br /&gt;
Where can you find certain information, and what are the structure and content of the output files? This is beginning to be summarized on the [[files | files]] page.&lt;br /&gt;
&lt;br /&gt;
=== Velocity solutions ===&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/&amp;gt; su akda&lt;br /&gt;
/&amp;gt; cd $ANALYZED&lt;br /&gt;
/&amp;gt; mkdir &amp;lt;USEFUL NEW PROJECTNAME&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Find another project and copy the following files into your new directory:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/&amp;gt; cp *.nml $ANALYZED/new_project&lt;br /&gt;
/&amp;gt; cp make_vel* $ANALYZED/new_project&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Rename the *.nml file to something reasonable for your project and edit it: it needs to know the stations that you want to include&lt;br /&gt;
in your velocity solution as well as the locations of the input files the data comes from. To get the input file entries in the correct syntax you might want to use:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/&amp;gt; grep_vel_infiles.pl --from-week=WWWW --infile-index=x --sum-id=&amp;lt;alaska2.0_nfxigs03 | NEAsia2.0_nfxigs03 | ...&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;quot;WWWW&amp;quot; stands for the week you want to start with, and &amp;quot;x&amp;quot; is the starting value for the id counter. These ids are useful later for referencing particular input files for additional editing (see below). &amp;quot;sum-id&amp;quot; is essentially the solution name. Copy all lines of the format &amp;quot;   infile(x) = '...'&amp;quot; into your namelist (nml) file (insert them before the &amp;amp;amp;end).&lt;br /&gt;
&lt;br /&gt;
Once the editing of the namelist file is finished:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/&amp;gt; refresh_zebu yournamelist.nml outfile.ref&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This orders and re-numbers the entries in your namelist file, which may also contain comments. Rather than making you go through the namelist file and renumber everything by hand whenever you want to throw out a station or some data, refresh_zebu does that for you.&lt;br /&gt;
&lt;br /&gt;
Once you have a nice reference file (.ref):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/&amp;gt; rzebu2 outfile.ref &amp;gt; &amp;amp; out&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
You should redirect the output to a file so you can look at it later :).&lt;br /&gt;
&lt;br /&gt;
As soon as the solution has finished running, look at the total Chi_squared value at the bottom of the output in &amp;quot;out&amp;quot;. It should be &amp;quot;1&amp;quot;. If it is not (which is likely the first few times you run a solution), look for the sites that cause the deviation from chi_squared=1 and note the site names with the largest chi-squared values. Then you can do three things:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
A) /&amp;gt; grep SITE outlier.inf&lt;br /&gt;
B) /&amp;gt; grep SITE residual.inf&lt;br /&gt;
C) /&amp;gt; vi $ANALYSIS/solution_timeseries/SITE.pfiles&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In all three cases you want to find dates on which the sigmas for this site are rather large. Note down the date, find it in the namelist&lt;br /&gt;
file (*.nml), and remove the respective site from the velocity solution for that day by adding a line:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
removedat(a, infile_id) = 'SITENAME'&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;quot;a&amp;quot; is an id that simply counts how many removedat entries have been applied to that one infile_id; &amp;quot;infile_id&amp;quot; is the counter mentioned above. An example is probably best:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
   infile(161) = '/gps/analysis/1222/post/03jun10NEAsia2.0_nfxigs03.sum'&lt;br /&gt;
    removedat(1,161) = 'ELD '&lt;br /&gt;
    removedat(2,161) = 'PETP'&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here I assume that both sites, &amp;quot;ELD&amp;quot; and &amp;quot;PETP&amp;quot;, misbehave on June 10, 2003 in the North East Asia solution, so I remove them from the velocity solution for that day.&lt;br /&gt;
&lt;br /&gt;
Remove all the files created by &amp;lt;code&amp;gt;rzebu2&amp;lt;/code&amp;gt;. Creating a Makefile of the form:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
clean:&lt;br /&gt;
        rm *.ref ATWA ATY solution.* *.inf out nnr.* *.dat *.gmtvec *.vel fort.* gmt.format STACOV argus.weights&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
might be useful.&lt;br /&gt;
&lt;br /&gt;
Repeat the above until your reduced chi squared is &amp;lt;= 1.0. If you can't get there, change the fudge_factor as follows:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
new_fudge = old_fudge * Chi_squared&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
and rerun the solution one last time. The reduced chi-squared value should now be 1.0.&lt;br /&gt;
&lt;br /&gt;
Once you have achieved that, you can go to the next level and run one of the make_vel files you copied to your directory:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
make_vel_Sella: velocities relative to a stable North America&lt;br /&gt;
make_vel_ITRF: velocities in ITRF &lt;br /&gt;
make_vel_EURA: velocities relative to a stable Eurasia &lt;br /&gt;
make_vel_XXXX: velocities with reference station XXXX&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
There might be others, or you could go ahead and edit these files to adapt them to your needs. The make files create *.gmtvec output, which you can use with, e.g., &amp;lt;code&amp;gt;psvelo&amp;lt;/code&amp;gt; in a GMT script.&lt;/div&gt;</summary>
		<author><name>Jeff</name></author>	</entry>

	<entry>
		<id>http://gps.alaska.edu/internal/index.php?title=WHOS&amp;diff=214</id>
		<title>WHOS</title>
		<link rel="alternate" type="text/html" href="http://gps.alaska.edu/internal/index.php?title=WHOS&amp;diff=214"/>
				<updated>2006-08-01T00:47:50Z</updated>
		
		<summary type="html">&lt;p&gt;Jeff: /* Recent weeks */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The WHOS list tracks assigned GPS weeks for processing.&lt;br /&gt;
&lt;br /&gt;
More details about analysis at [[GPS analysis system]].&lt;br /&gt;
&lt;br /&gt;
===Recent weeks===&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
week #  start date      last processed  who     status&lt;br /&gt;
****************************************************************&lt;br /&gt;
&lt;br /&gt;
1374     05/07/2006                      Venkat&lt;br /&gt;
1375     05/14/2006                      Trivikram/trivikram 1/&lt;br /&gt;
1376     05/21/2006                      Trivikram/trivikram 1/&lt;br /&gt;
1377     05/28/2006    07/20/2006        Trivikram 1&lt;br /&gt;
1378     06/04/2006    07/20/2006        Trivikram 1        &lt;br /&gt;
1379     06/11/2006    07/20/2006        Trivikram 1        &lt;br /&gt;
1380     06/18/2006    07/20/2006        Trivikram 1                 &lt;br /&gt;
1381     06/25/2006    07/12/2006        Ryan&lt;br /&gt;
1382     07/02/2006                      Julie&lt;br /&gt;
1383     07/09/2006                      Julie&lt;br /&gt;
1384     07/16/2006                      Julie&lt;br /&gt;
1385     07/23/2006                      Julie&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===2006===&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
week #  start date      last processed  who     status&lt;br /&gt;
****************************************************************&lt;br /&gt;
1356     01/01/2006                      Sandeep   1 2/Venkat&lt;br /&gt;
1357     01/08/2006                      Sandeep   1 2/Venkat&lt;br /&gt;
1358     01/15/2006                      Sandeep   1 2/Venkat&lt;br /&gt;
1359     01/22/2006                      Sandeep   1 2/Venkat&lt;br /&gt;
1360     01/29/2006                      Sandeep   1 2/Venkat&lt;br /&gt;
1361     02/05/2006                      Trivikram/sandeep 1/ trivikram 2/&lt;br /&gt;
1362     02/12/2006                      Trivikram/sandeep 1/ trivikram 2/ CLEAN except 06feb12&lt;br /&gt;
1363     02/19/2006                      Venkat      CLEAN &lt;br /&gt;
1364     02/26/2006                      Trivikram/trivikram 1/&lt;br /&gt;
1365     03/05/2006                      Venkat&lt;br /&gt;
1366     03/12/2006                      Trivikram/trivikram 1/&lt;br /&gt;
1367     03/19/2006                      Trivikram/trivikram 1/&lt;br /&gt;
1368     03/26/2006                      Trivikram/trivikram 1/&lt;br /&gt;
1369     04/02/2006                      Trivikram/trivikram 1/&lt;br /&gt;
1370     04/09/2006                      Tom&lt;br /&gt;
1371     04/16/2006                      Venkat&lt;br /&gt;
1372     04/23/2006                      Venkat&lt;br /&gt;
1373     04/30/2006                      Venkat&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===2005===&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
week #  start date      last processed  who     status&lt;br /&gt;
****************************************************************&lt;br /&gt;
1304    01/02/2005      09/16/2005      Ryan    CLEAN&lt;br /&gt;
1305    01/09/2005      08/30/2005      Pravee  CLEAN&lt;br /&gt;
1306    01/16/2005      07/15/2005      Jill    WORKING   Trivikram(tibet) 07/26/2006  CLEAN&lt;br /&gt;
1307    01/23/2005      08/31/2005      Pravee  CLEAN     Trivikram(tibet) 07/26/2006  CLEAN&lt;br /&gt;
1308    01/30/2005      04/05/2005      Jill    CLEAN     Trivikram(tibet) 07/26/2006  CLEAN&lt;br /&gt;
1309    02/06/2005      08/31/2005      Pravee  CLEAN&lt;br /&gt;
1310    02/13/2005      03/12/2005      Julie   CLEAN&lt;br /&gt;
1311    02/20/2005      03/12/2005      Pravee  CLEANING&lt;br /&gt;
1312    02/27/2005      03/31/2005      Tom     CLEAN&lt;br /&gt;
1313    03/06/2005      03/31/2005      Pravee  CLEANING&lt;br /&gt;
1314    03/13/2005      03/31/2005      Tom     CLEAN&lt;br /&gt;
1315    03/20/2005      06/13/2005      Samik   CLEAN pt file size&amp;lt;500&lt;br /&gt;
1316    03/27/2005      06/13/2005      Samik   CLEAN pt file size&amp;lt;400&lt;br /&gt;
1317    04/03/2005      06/16/2005      Samik   CLEAN pt file size&amp;lt;400&lt;br /&gt;
1318    04/10/2005      9/15/2005       Ryan    not sure what's going on?&lt;br /&gt;
1319    04/17/2005      10/4/2005       Ryan    ak SUBMITTED&lt;br /&gt;
1347    10/30/2005                      Sandeep   Done 1/tr&lt;br /&gt;
1320    04/24/2005                      Sandeep   Done 1 2    Trivikram(tibet) 07/27/2006 soln running&lt;br /&gt;
1321    05/01/2005                      Sandeep   Done 1 2    Trivikram(tibet) 07/28/2006 soln running    &lt;br /&gt;
1322    05/08/2005                      Sandeep   Done 1 2    Trivikram(tibet) 07/28/2006&lt;br /&gt;
1323    05/15/2005                      Sandeep   Done 1 2&lt;br /&gt;
1324    05/22/2005                      Sandeep   Done 1 2&lt;br /&gt;
1325    05/29/2005                      Sigrun    DONE &lt;br /&gt;
1326    06/05/2005                      Sigrun    DONE&lt;br /&gt;
1327    06/12/2005                      Sandeep   Done 1 2  &lt;br /&gt;
1328    06/19/2005                      Sandeep   Done 1 2&lt;br /&gt;
1329    06/26/2005                      Sandeep   Done 1 2 &lt;br /&gt;
1330    07/03/2005      9/14/2005       Ryan      DONE&lt;br /&gt;
1331    07/10/2005      11/10/2005      Tom       Submitted&lt;br /&gt;
1332    07/17/2005                      Sandeep   CLEAN 1-2-3/Venkat  (Point File Size &amp;lt; 500)&lt;br /&gt;
1333    07/24/2005                      Sandeep   CLEAN 1-2-3/Trivikram&lt;br /&gt;
1334    07/31/2005                      Sandeep   CLEAN 1-2-3/Trivikram &lt;br /&gt;
1335    08/07/2005      9/12/2005       Tom       Done 1&lt;br /&gt;
1336    08/14/2005                      Sandeep   Done 1 2/Venkat&lt;br /&gt;
1337    08/21/2005      10/17/2005      Ryan    ak CLEAN pt file size&amp;lt;500&lt;br /&gt;
1338    08/28/2005      10/17/2005      Ryan    ak CLEAN&lt;br /&gt;
1339    09/04/2005      11/08/2005      Ryan      CLEAN 1-2-3/Trivikram&lt;br /&gt;
1340    09/11/2005                      Sandeep   Done 1-2/Venkat&lt;br /&gt;
1341    09/18/2005                      Sandeep   Done 1-2/Venkat   &lt;br /&gt;
1342    09/25/2005                      Sandeep   Done 1-2/Venkat&lt;br /&gt;
1343    10/02/2005                      Sandeep   CLEAN 1-2-3/Venkat  (Point File Size &amp;lt; 300)&lt;br /&gt;
1344    10/09/2005                      Sandeep   Done 1-2/Venkat&lt;br /&gt;
1345    10/16/2005                      Sandeep   Done 1-2/Venkat&lt;br /&gt;
1346    10/23/2005                      Sandeep   Done 1-2/Venkat &lt;br /&gt;
1347    10/30/2005                      Sandeep   Done 1-2/Venkat&lt;br /&gt;
1348    11/06/2005                      Sandeep   Done 1-2/Venkat&lt;br /&gt;
1349    11/13/2005                      Sandeep   Done 1-2/Venkat&lt;br /&gt;
1350    11/20/2005                      Sandeep   Done 1-2/Venkat &lt;br /&gt;
1351    11/27/2005                      Sandeep   Done 1-2/Venkat&lt;br /&gt;
1352    12/04/2005                      Sandeep   Done 1 2/Venkat&lt;br /&gt;
1353    12/11/2005                      Sandeep   Done 1 2/Venkat&lt;br /&gt;
1354    12/18/2005                      Sandeep   Done 1 2/Venkat&lt;br /&gt;
1356    12/25/2005                      Sandeep   Done 1 2/Venkat&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Older Stuff===&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
week #  start date      last processed  who     status&lt;br /&gt;
****************************************************************&lt;br /&gt;
****************************************************************&lt;br /&gt;
***************************2002*********************************&lt;br /&gt;
**************************EARTHQUAKE****************************&lt;br /&gt;
1189    10/20/02        02/26/05        Josh    PENDING&lt;br /&gt;
1190    10/27/02                        Josh    PENDING&lt;br /&gt;
1191    11/03/02        03/04/05        Josh    DONE&lt;br /&gt;
1192    11/10/02        03/11/05        Josh    DONE&lt;br /&gt;
1193    11/17/02        03/11/05        Josh    DONE&lt;br /&gt;
1194    11/24/02        03/11/05        Josh    DONE&lt;br /&gt;
1195    12/01/02        03/11/05        Josh    RUNNING&lt;br /&gt;
1196    12/08/02        03/11/05        Josh    DONE&lt;br /&gt;
1197    12/15/02        03/11/05        Josh    DONE&lt;br /&gt;
1198    12/22/02        03/12/05        Josh    DONE&lt;br /&gt;
1199    12/29/02        03/12/05        Josh    DONE&lt;br /&gt;
&lt;br /&gt;
*************************2003***********************************&lt;br /&gt;
&lt;br /&gt;
1200    01/05/03        03/12/05        Josh    DONE&lt;br /&gt;
1201    01/12/03        03/12/05        Josh    DONE&lt;br /&gt;
1202    01/19/03        03/15/05        Josh    RUNNING&lt;br /&gt;
1203    01/26/03        03/14/05        Josh    DONE&lt;br /&gt;
1204    02/02/03        03/14/05        Josh    DONE&lt;br /&gt;
1205    02/09/03                        Josh    PENDING&lt;br /&gt;
1206    02/16/03                        Josh    PENDING&lt;br /&gt;
1207    02/23/03                        Josh    PENDING&lt;br /&gt;
1208    03/02/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1209    03/09/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1210    03/16/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1211    03/23/03        02/27/05        Josh    DONE&lt;br /&gt;
1212    03/30/03        02/26/05        Josh    DONE&lt;br /&gt;
1213    04/06/03        03/12/05        Josh    RUNNING&lt;br /&gt;
1214    04/13/03        02/26/05        Josh    CLEANED&lt;br /&gt;
1215    04/20/03        02/27/05        Josh    CLEANED&lt;br /&gt;
1216    04/27/03        02/26/05        Josh    PENDING&lt;br /&gt;
1217    05/04/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1218    05/11/03        02/27/05        Josh    NEEDS CLEAN&lt;br /&gt;
1219    05/18/03        02/27/05        Josh    WORKING&lt;br /&gt;
1220    05/25/03        02/27/05        Josh    PENDING&lt;br /&gt;
1221    06/01/03                        Josh    PENDING&lt;br /&gt;
1222    06/08/03                        Josh    PENDING&lt;br /&gt;
1223    06/15/03                        Josh    PENDING&lt;br /&gt;
1224    06/22/03                        Josh    PENDING&lt;br /&gt;
1225    06/29/03        02/26/05        Josh    FAILED&lt;br /&gt;
1226    07/06/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1227    07/13/03        02/26/05        Josh    FAILED&lt;br /&gt;
1228    07/20/03                        Josh    FAILED&lt;br /&gt;
1229    07/27/03        11/10/05        Tom     Submitted  &lt;br /&gt;
1230    08/03/03                        Josh    FAILED&lt;br /&gt;
1231    08/10/03                        Josh    FAILED&lt;br /&gt;
1232    08/17/03        11/10/05        Tom     Submitted     &lt;br /&gt;
1233    08/24/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1234    08/31/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1235    09/07/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1236    09/14/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1237    09/21/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1238    09/28/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1239    10/05/03        02/26/05        Josh    WORKING&lt;br /&gt;
1240    10/12/03        02/26/05        Josh    PENDING&lt;br /&gt;
1241    10/19/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1242    10/26/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1243    11/02/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1244    11/09/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1245    11/16/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1246    11/23/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1247    11/30/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1248    12/07/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1249    12/14/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1250    12/21/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1251    12/28/03        02/26/05        Josh    WORKING&lt;br /&gt;
&lt;br /&gt;
***************************2004*********************************&lt;br /&gt;
&lt;br /&gt;
1253    01/11/04        02/26/05        Josh    PENDING&lt;br /&gt;
1254    01/18/04                        Josh    PENDING&lt;br /&gt;
1255    01/25/04                        Josh    NEEDS&lt;br /&gt;
1256    02/01/04                        Josh    PENDING&lt;br /&gt;
1257    02/08/04                        Josh    PENDING&lt;br /&gt;
1258    02/15/04                        Josh    PENDING&lt;br /&gt;
1259    02/22/04                        Josh    PENDING&lt;br /&gt;
1260    02/29/04                        Josh    PENDING&lt;br /&gt;
1261    03/07/04        02/26/05        Josh    PENDING&lt;br /&gt;
1262    03/14/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1263    03/21/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1264    03/28/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1265    04/04/04        02/26/05        Josh    FAILED  No files found, needs rerun&lt;br /&gt;
1266    04/11/04        02/26/05        Josh    FAILED  No files found, needs rerun&lt;br /&gt;
1267    04/18/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1268    04/25/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1269    05/02/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1270    05/09/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1271    05/16/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1272    05/23/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1273    05/30/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1274    06/06/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1275    06/13/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1276    06/20/04        02/27/05        Samik   DONE&lt;br /&gt;
1277    06/27/04        03/19/05        Jeff    NEEDS CLEAN&lt;br /&gt;
1278    07/04/04        03/19/05        Jeff    NEEDS CLEAN&lt;br /&gt;
1279    07/11/04        03/19/05        Jeff    NEEDS CLEAN&lt;br /&gt;
1280    07/18/04        03/19/05        Jeff    NEEDS CLEAN&lt;br /&gt;
1281    07/25/04        03/19/05        Jeff    NEEDS CLEAN&lt;br /&gt;
1282    08/01/04                        Josh    PENDING&lt;br /&gt;
1283    08/08/04                        Josh    PENDING&lt;br /&gt;
1284    08/15/04                        Josh    PENDING&lt;br /&gt;
1285    08/22/04                        Josh    PENDING&lt;br /&gt;
1286    08/29/04                        Josh    PENDING&lt;br /&gt;
1287    09/05/04                        Josh    PENDING&lt;br /&gt;
1288    09/12/04                        Josh    PENDING&lt;br /&gt;
1289    09/19/04        03/19/05        Jeff    ??&lt;br /&gt;
1290    09/26/04                        Josh    PENDING&lt;br /&gt;
1291    10/03/04                        Josh    PENDING&lt;br /&gt;
1292    10/10/04        03/19/05        Jeff    ??&lt;br /&gt;
1293    10/17/04                        Josh    PENDING&lt;br /&gt;
1294    10/24/04        03/19/05        Jeff    ??&lt;br /&gt;
1295    10/31/04        03/25/05        Tom     CLEAN&lt;br /&gt;
1296    11/07/04        04/07/05        Tom     Cleaning&lt;br /&gt;
1297    11/14/04        03/19/05        Jeff    ??&lt;br /&gt;
1298    11/21/04                                NEEDS CLEAN&lt;br /&gt;
1299    11/28/04                                NEEDS CLEAN&lt;br /&gt;
1300    12/05/04        08/16/05        Pravee  CLEAN&lt;br /&gt;
1301    12/12/04        08/19/05        Pravee  CLEAN&lt;br /&gt;
1302    12/19/04        08/19/05        Pravee  CLEAN   trivikram(tibet) CLEAN 07/26/2006&lt;br /&gt;
1303    12/26/04        08/26/05        Pravee  CLEAN   &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Archaic (pre-1996)===&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
week #  start date      last processed  who     status&lt;br /&gt;
****************************************************************&lt;br /&gt;
At the moment, Jeff is dealing with these.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Jeff</name></author>	</entry>

	<entry>
		<id>http://gps.alaska.edu/internal/index.php?title=GPS_analysis_system&amp;diff=291</id>
		<title>GPS analysis system</title>
		<link rel="alternate" type="text/html" href="http://gps.alaska.edu/internal/index.php?title=GPS_analysis_system&amp;diff=291"/>
				<updated>2006-07-04T22:34:20Z</updated>
		
		<summary type="html">&lt;p&gt;Jeff: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;We use a GPS data analysis system based on the [http://sideshow.jpl.nasa.gov GIPSY] software developed at JPL. Most of the GIPSY programs are called by shell scripts written by Jeff Freymueller. Using these scripts, we can analyze a large amount of data either as part of network solutions or in Precise Point Positioning (PPP) mode.&lt;br /&gt;
&lt;br /&gt;
===Where do you put RINEX files?===&lt;br /&gt;
RINEX files should be put into the hopper, &amp;lt;code&amp;gt;$RAWDATA/hopper&amp;lt;/code&amp;gt;. What, you don't have RINEX files yet? See [[RINEXing]]. Once files are in the hopper, you can either let the first processing stages happen automatically overnight (see next section), or run the &amp;lt;code&amp;gt;autofront&amp;lt;/code&amp;gt; and &amp;lt;code&amp;gt;autoclean&amp;lt;/code&amp;gt; programs manually.&lt;br /&gt;
&lt;br /&gt;
===What happens automatically?===&lt;br /&gt;
&lt;br /&gt;
Quite a lot.&lt;br /&gt;
&lt;br /&gt;
Autoftp runs every night beginning at 6pm local time and fetches data files. These are placed into the '''hopper''' ($RAWDATA/hopper), a directory where all data files are put for entry into the system and processing. Autofront then runs at midnight to process all files in the hopper, including any placed there manually (from campaigns, for example). Finally, autoclean runs at 4am local time to carry out automated screening for cycle slips and other bad data.&lt;br /&gt;
&lt;br /&gt;
====Autoftp====&lt;br /&gt;
&lt;br /&gt;
Autoftp is an efficient data-fetching tool that uses wget to automatically get data from any of several internet GPS data archives. It reads a list of desired sites from a ''request file'', which contains the date in the filename, and attempts to find and download data from as many sites as possible. It is intended to run automatically on a daily basis under cron, and when accompanied by another simple program that generates a standard request file every day, it can easily fetch a standard set of sites daily for analysis. Because it keeps track in the request file of sites that it has found already, autoftp can be run multiple times with the same request file and it will not repeatedly fetch data. This is ideal for the real world, in which data from some sites are available rapidly while data from other sites may require many hours or days to become available.&lt;br /&gt;
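The bookkeeping idea can be sketched as follows; the request-file format and the DONE marker are assumptions for illustration only, not autoftp's actual format:&lt;br /&gt;

```shell
# Sites already fetched are marked DONE in the request file, so reruns
# skip them instead of downloading again. (A real run would invoke wget
# against an archive where the comment indicates.)
reqfile=$(mktemp)
printf 'fair\nwhit DONE\nkodk\n' > "$reqfile"

fetched=0
while read -r site status; do
    [ "$status" = "DONE" ] && continue          # fetched on an earlier run
    # ... wget would download data for $site here ...
    sed -i "s/^$site\$/$site DONE/" "$reqfile"  # mark so reruns skip it
    fetched=$((fetched + 1))
done < "$reqfile"
echo "fetched $fetched new sites"
rm -f "$reqfile"
```

Running the same loop a second time on the updated request file would fetch nothing, which is exactly the property that makes repeated autoftp runs safe.&lt;br /&gt;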
&lt;br /&gt;
====Autofront====&lt;br /&gt;
&lt;br /&gt;
Autofront is a script intended to run under cron that carries out the initial &amp;quot;front end&amp;quot; processing on a set of GPS data files. When executed, it will process all files in the '''hopper''' directory, and will place each resulting qm file into the appropriate week directory.&lt;br /&gt;
&lt;br /&gt;
Autofront runs the following steps:&lt;br /&gt;
1. Checks the validity of the RINEX file and repairs some common problems.&lt;br /&gt;
2. Runs &amp;lt;code&amp;gt;clockprep -fixtags&amp;lt;/code&amp;gt;, depending on receiver type.&lt;br /&gt;
3. (Optional, presently not the default) Runs PhasEdit.&lt;br /&gt;
4. Runs ninja.&lt;br /&gt;
&lt;br /&gt;
====Autoclean====&lt;br /&gt;
&lt;br /&gt;
Autoclean carries out automated cleaning of cycle slips, based on point positioning solutions. It is quite effective and at present it rarely misses cycle slips unless they are smaller than its minimum tolerance (10 cm). Autoclean operates on an ''edit-request'' file, which contains the name of the directory (week directory) and a list of qm files that need to be cleaned. It will clean all files on the list as long as orbits and clocks are available, and it marks off files that have been cleaned so that it can safely be run multiple times.&lt;br /&gt;
&lt;br /&gt;
Autoclean operates in an iterative mode. Its zeroth iteration is a pseudorange-only solution that identifies and deletes extremely bad pseudorange data; in this step it uses a tolerance that catches only grossly biased data. It then carries out one or more iterations of screening the phase data. In each iteration, it uses postbreak to identify discontinuities in the residuals of a point positioning solution. Postbreak is run with an adaptive tolerance (minimum 10 cm), and it is critical that my slightly modified version of postbreak be used. If any cycle slips are discovered, they are flagged and another iteration is run. Autoclean runs a maximum of 4 iterations on the phase data.&lt;br /&gt;
&lt;br /&gt;
===Where do the data files go?===&lt;br /&gt;
Data files from each station are stored in the QM format that is native to GIPSY. QM files (and all other files) are stored in directories by GPS week. For each [[week_directory|week directory]] there are several subdirectories; qm files are stored in &amp;lt;code&amp;gt;$ANALYSIS/wwww/qm&amp;lt;/code&amp;gt;, where &amp;lt;code&amp;gt;wwww&amp;lt;/code&amp;gt; is the 4-character GPS week number (with a leading zero if needed).&lt;br /&gt;
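For example, the leading-zero convention for the week directory name can be produced with printf; the week number below is made up for illustration:&lt;br /&gt;

```shell
# GPS week numbers below 1000 get a leading zero in the directory name.
week=987
weekdir=$(printf '%04d' "$week")
echo "qm files for week $week live in \$ANALYSIS/$weekdir/qm"
```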
&lt;br /&gt;
===Running Solutions===&lt;br /&gt;
&lt;br /&gt;
Solve, a very flexible script. (link to detailed help)&lt;br /&gt;
&lt;br /&gt;
Philosophy of solve&lt;br /&gt;
&lt;br /&gt;
Subnet files and campaign files&lt;br /&gt;
&lt;br /&gt;
Standard solutions&lt;br /&gt;
&lt;br /&gt;
(text of standard_Alaska_solution)&lt;br /&gt;
&lt;br /&gt;
Running several days at once: make-make-flt and make-alaska&lt;br /&gt;
&lt;br /&gt;
(text of a standard make-alaska file and variant)&lt;br /&gt;
&lt;br /&gt;
Running several weeks at once&lt;br /&gt;
&lt;br /&gt;
(text of sample rerun-* file)&lt;br /&gt;
&lt;br /&gt;
===Cleaning Solutions===&lt;br /&gt;
&lt;br /&gt;
Initial Explanation of terms&lt;br /&gt;
&lt;br /&gt;
Expected residuals from a clean solution&lt;br /&gt;
&lt;br /&gt;
Automated screening: postfit, the point file, postbreak&lt;br /&gt;
&lt;br /&gt;
Checking for bad pseudorange data: badp, allbadp&lt;br /&gt;
&lt;br /&gt;
Removing biased pseudorange data: del_pcode_arc&lt;br /&gt;
&lt;br /&gt;
Automatically identified cycle slips: breaks&lt;br /&gt;
&lt;br /&gt;
Quickly scanning through residuals: short_hand&lt;br /&gt;
&lt;br /&gt;
Limitations of short_hand&lt;br /&gt;
&lt;br /&gt;
Manually checking residuals and fixing problems (link)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Procedure for Running Solutions for a week for the first time===&lt;br /&gt;
&lt;br /&gt;
A few special things need to be done the very first time solutions in a week are run. First, you need to make up a script to run all days of the week. This may need to be edited if the JPL final orbits are not available at the time. The &amp;lt;code&amp;gt;standard_Alaska_solution&amp;lt;/code&amp;gt; uses the non-fiducial orbits and thus requires that the final JPL orbits be present. If they are not, you can run &amp;lt;code&amp;gt;rapid_Alaska_solution&amp;lt;/code&amp;gt; instead. &lt;br /&gt;
&lt;br /&gt;
Then, the log files from autoclean should be moved away to a subdirectory, and any problem stations identified by autoclean should be checked. Then, you are ready to run the solutions.&lt;br /&gt;
&lt;br /&gt;
First, make a script to run solutions. For example, to make a script to run all days of the week for the Alaska solution:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd $ANALYSIS&lt;br /&gt;
make-make-flt 1381&lt;br /&gt;
cd 1381/flt&lt;br /&gt;
vi make-alaska&lt;br /&gt;
#  Edit the file if needed so that you are ready to run the rapid solutions.&lt;br /&gt;
&lt;br /&gt;
cat make-alaska&lt;br /&gt;
#!/bin/csh -f&lt;br /&gt;
#&lt;br /&gt;
setenv CAMP $ANALYSIS/1381&lt;br /&gt;
#&lt;br /&gt;
#standard_Alaska_solution 06jul01&lt;br /&gt;
#standard_Alaska_solution 06jun25&lt;br /&gt;
#standard_Alaska_solution 06jun26&lt;br /&gt;
#standard_Alaska_solution 06jun27&lt;br /&gt;
#standard_Alaska_solution 06jun28&lt;br /&gt;
#standard_Alaska_solution 06jun29&lt;br /&gt;
#standard_Alaska_solution 06jun30&lt;br /&gt;
#&lt;br /&gt;
rapid_Alaska_solution 06jul01&lt;br /&gt;
rapid_Alaska_solution 06jun25&lt;br /&gt;
rapid_Alaska_solution 06jun26&lt;br /&gt;
rapid_Alaska_solution 06jun27&lt;br /&gt;
rapid_Alaska_solution 06jun28&lt;br /&gt;
rapid_Alaska_solution 06jun29&lt;br /&gt;
rapid_Alaska_solution 06jun30&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The script &amp;lt;code&amp;gt;make-make-flt&amp;lt;/code&amp;gt; finds all unique dates for qm files in that week's directory, and uses that to generate the script, so if you run it before the end of the week you will get a partial script. If the final JPL orbits are not yet present, you will need to edit the script to change &amp;quot;standard&amp;quot; to &amp;quot;rapid&amp;quot;. Or better yet, copy all the lines and comment one set out, then modify the others to read &amp;quot;rapid_Alaska_solution &amp;lt;date&amp;gt;&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
Next, make a subdirectory called &amp;lt;code&amp;gt;logfiles&amp;lt;/code&amp;gt;, and move all of autoclean's logfiles into it. You will need to use two separate &amp;lt;code&amp;gt;mv&amp;lt;/code&amp;gt; commands because the list of files is too big for one.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
mkdir logfiles&lt;br /&gt;
mv *____*.i1* logfiles&lt;br /&gt;
mv *____*.i* logfiles&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
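If you prefer a single command, find can sidestep the argument-list limit instead of splitting the glob by hand; this is a sketch with made-up filenames, not part of the standard procedure:&lt;br /&gt;

```shell
# Move everything matching the logfile pattern without expanding one huge
# glob on the mv command line. Demo files stand in for real autoclean logs.
mkdir -p demo/logfiles
touch demo/fair____060625.i1 demo/kodk____060625.i2
find demo -maxdepth 1 -name '*____*.i*' -exec mv {} demo/logfiles/ \;
moved=$(ls demo/logfiles | wc -l | tr -d ' ')
echo "moved $moved logfiles"
rm -rf demo
```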
&lt;br /&gt;
Now look for a file called &amp;lt;code&amp;gt;make-problems&amp;lt;/code&amp;gt;, which lists all files that autoclean had a problem with. Sometimes these files are almost clean, but sometimes they are full of junk or horribly mangled by the automated editing. There should be PPP solutions already run for these files, so they are ready to be checked.&lt;br /&gt;
&lt;br /&gt;
Now run the solutions. The first time you run the solutions, look at the residuals very carefully before trying &amp;lt;code&amp;gt;short_hand&amp;lt;/code&amp;gt;. Uncompress the postlog, postfit and postbreak files, and then use &amp;lt;code&amp;gt;allbadp&amp;lt;/code&amp;gt; to check the pseudorange and &amp;lt;code&amp;gt;allbreaks&amp;lt;/code&amp;gt; to check for major cycle slips.&lt;br /&gt;
&lt;br /&gt;
A very common problem is that for several stations per week, there will be one satellite arc of pseudorange data whose residuals are all roughly 2000 cm. If you see these, don't delete the data; instead run del_pcode_arc to remove only the pseudorange data. I am not sure why these show up, but it could be either a hardware channel bias or a pre-processing glitch. They happen much more often with Ashtechs than with any other receivers, and are particularly common with the US Coast Guard CORS sites. In the &amp;lt;code&amp;gt;qm&amp;lt;/code&amp;gt; directory,&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
del_pcode_arc *02gus2* GUS2 GPS41&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you just run &amp;lt;code&amp;gt;short_hand&amp;lt;/code&amp;gt; without looking, it will probably either throw out all the pseudorange for one site, or delete a lot of data (phase and pseudorange) where only the pseudorange needs to be deleted. So don't do that. Once those cases are handled, delete a batch of bad pseudorange data to get the number of pseudorange outliers under control for the next run.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd $ANALYSIS/1381/flt&lt;br /&gt;
gunzip *alaska*post*&lt;br /&gt;
allbadp&lt;br /&gt;
allbreaks&lt;br /&gt;
# Based on this, run del_pcode_arc as above, and add ambiguities manually if needed.&lt;br /&gt;
#&lt;br /&gt;
delete_allbadp 50&lt;br /&gt;
#  This creates a file called delete.&lt;br /&gt;
vi delete&lt;br /&gt;
#    Remove lines for any points for which you have already run del_pcode_arc.&lt;br /&gt;
cd ../qm&lt;br /&gt;
sh ../flt/delete&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
At this point, don't worry too much about phase outliers. Basically we are trying to get the number of pseudorange outliers down into the range where &amp;lt;code&amp;gt;short_hand&amp;lt;/code&amp;gt; will do the right thing when we run it later. Now may be a good time to run &amp;lt;code&amp;gt;Alaska_cleaning_solution $date&amp;lt;/code&amp;gt;, which runs a smaller and much faster solution including the sites that most often need some cleaning.&lt;/div&gt;</summary>
		<author><name>Jeff</name></author>	</entry>

	<entry>
		<id>http://gps.alaska.edu/internal/index.php?title=WHOS&amp;diff=187</id>
		<title>WHOS</title>
		<link rel="alternate" type="text/html" href="http://gps.alaska.edu/internal/index.php?title=WHOS&amp;diff=187"/>
				<updated>2006-07-04T22:01:15Z</updated>
		
		<summary type="html">&lt;p&gt;Jeff: /* 2006 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The WHOS list tracks assigned GPS weeks for processing.&lt;br /&gt;
&lt;br /&gt;
More details about analysis at [[GPS analysis system]].&lt;br /&gt;
&lt;br /&gt;
===Recent weeks===&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
week #  start date      last processed  who     status&lt;br /&gt;
****************************************************************&lt;br /&gt;
&lt;br /&gt;
1374     05/07/2006                      Venkat&lt;br /&gt;
1375     05/14/2006                      Trivikram&lt;br /&gt;
1376     05/21/2006                      Trivikram&lt;br /&gt;
1377     05/28/2006                      &lt;br /&gt;
1378     06/04/2006                      &lt;br /&gt;
1379     06/11/2006                      &lt;br /&gt;
1380     06/18/2006                      &lt;br /&gt;
1381     06/25/2006                      &lt;br /&gt;
1382     07/02/2006                      &lt;br /&gt;
1383     07/09/2006                      &lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===2006===&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
week #  start date      last processed  who     status&lt;br /&gt;
****************************************************************&lt;br /&gt;
1356     01/01/2006                      &lt;br /&gt;
1357     01/08/2006                      &lt;br /&gt;
1358     01/15/2006                      &lt;br /&gt;
1359     01/22/2006                      Sandeep   1&lt;br /&gt;
1360     01/29/2006                      Sandeep   1&lt;br /&gt;
1361     02/05/2006                      Trivikram/sandeep 1/ trivikram 2/soln running&lt;br /&gt;
1362     02/12/2006                      Trivikram/sandeep 1/ trivikram 2/soln running&lt;br /&gt;
1363     02/19/2006                      Venkat      CLEAN &lt;br /&gt;
1364     02/26/2006                      Trivikram/trivikram 1/&lt;br /&gt;
1365     03/05/2006                      Venkat&lt;br /&gt;
1366     03/12/2006                      Trivikram/trivikram 1/&lt;br /&gt;
1367     03/19/2006                      Trivikram/trivikram 1/&lt;br /&gt;
1368     03/26/2006                      Trivikram/trivikram 1/&lt;br /&gt;
1369     04/02/2006                      Trivikram&lt;br /&gt;
1370     04/09/2006                      Tom&lt;br /&gt;
1371     04/16/2006                      Venkat&lt;br /&gt;
1372     04/23/2006                      Venkat&lt;br /&gt;
1373     04/30/2006                      Venkat&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===2005===&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
week #  start date      last processed  who     status&lt;br /&gt;
****************************************************************&lt;br /&gt;
1304    01/02/2005      09/16/2005      Ryan    CLEAN&lt;br /&gt;
1305    01/09/2005      08/30/2005      Pravee  CLEAN&lt;br /&gt;
1306    01/16/2005      07/15/2005      Jill    WORKING&lt;br /&gt;
1307    01/23/2005      08/31/2005      Pravee  CLEAN&lt;br /&gt;
1308    01/30/2005      04/05/2005      Jill    CLEAN&lt;br /&gt;
1309    02/06/2005      08/31/2005      Pravee  CLEAN&lt;br /&gt;
1310    02/13/2005      03/12/2005      Julie   CLEAN&lt;br /&gt;
1311    02/20/2005      03/12/2005      Pravee  CLEANING&lt;br /&gt;
1312    02/27/2005      03/31/2005      Tom     CLEAN&lt;br /&gt;
1313    03/06/2005      03/31/2005      Pravee  CLEANING&lt;br /&gt;
1314    03/13/2005      03/31/2005      Tom     CLEAN&lt;br /&gt;
1315    03/20/2005      06/13/2005      Samik   CLEAN pt file size&amp;lt;500&lt;br /&gt;
1316    03/27/2005      06/13/2005      Samik   CLEAN pt file size&amp;lt;400&lt;br /&gt;
1317    04/03/2005      06/16/2005      Samik   CLEAN pt file size&amp;lt;400&lt;br /&gt;
1318    04/10/2005      9/15/2005       Ryan    not sure what's going on?&lt;br /&gt;
1319    04/17/2005      10/4/2005       Ryan    ak SUBMITTED&lt;br /&gt;
1320    04/24/2005                      Sandeep   Done 1 2&lt;br /&gt;
1321    05/01/2005                      Sandeep   Done 1 2&lt;br /&gt;
1322    05/08/2005                      Sandeep   Done 1 2&lt;br /&gt;
1323    05/15/2005                      Sandeep   Done 1 2&lt;br /&gt;
1324    05/22/2005                      Sandeep   Done 1 2&lt;br /&gt;
1325    05/29/2005                      Sigrun    DONE &lt;br /&gt;
1326    06/05/2005                      Sigrun    DONE&lt;br /&gt;
1327    06/12/2005                      Sandeep   Done 1 2  &lt;br /&gt;
1328    06/19/2005                      Sandeep   Done 1 2&lt;br /&gt;
1329    06/26/2005                      Sandeep   Done 1 2 &lt;br /&gt;
1330    07/03/2005      9/14/2005       Ryan      DONE&lt;br /&gt;
1331    07/10/2005      11/10/2005      Tom       Submitted&lt;br /&gt;
1332    07/17/2005                      Sandeep   Done 1 2/Venkat&lt;br /&gt;
1333    07/24/2005                      Sandeep   Done 1/trivikram soln running&lt;br /&gt;
1334    07/31/2005                      Sandeep   Done 1/trivikram soln running &lt;br /&gt;
1335    08/07/2005      9/12/2005       Tom       Done 1/trivikram fatal error problem(2 days)&lt;br /&gt;
1336    08/14/2005                      Sandeep   Done 1 2/Venkat&lt;br /&gt;
1337    08/21/2005      10/17/2005      Ryan    ak CLEAN pt file size&amp;lt;500&lt;br /&gt;
1338    08/28/2005      10/17/2005      Ryan    ak CLEAN&lt;br /&gt;
1339    09/04/2005      11/08/2005      Ryan    working on it /trivikram&lt;br /&gt;
1340    09/11/2005                      Sandeep   Done 1-2/Venkat&lt;br /&gt;
1341    09/18/2005                      Sandeep   Done 1-2/Venkat   &lt;br /&gt;
1342    09/25/2005                      Sandeep   Done 1-2/Venkat&lt;br /&gt;
1343    10/02/2005                      Sandeep   CLEAN 1-2/Venkat&lt;br /&gt;
1344    10/09/2005                      Sandeep   Done 1-2/Venkat&lt;br /&gt;
1345    10/16/2005                      Sandeep   Done 1-2/Venkat&lt;br /&gt;
1346    10/23/2005                      Sandeep   Done 1-2/Venkat &lt;br /&gt;
1347    10/30/2005                      Sandeep   Done 1-2/Venkat&lt;br /&gt;
1348    11/06/2005                      Sandeep   Done 1-2/Venkat&lt;br /&gt;
1349    11/13/2005                      Sandeep   Done 1-2/Venkat&lt;br /&gt;
1350    11/20/2005                      Sandeep   Done 1-2/Venkat &lt;br /&gt;
1351    11/27/2005                      Sandeep   Done 1-2/Venkat&lt;br /&gt;
1352    12/04/2005                      Sandeep   Done 1 2&lt;br /&gt;
1353    12/11/2005                      Sandeep   Done 1 2&lt;br /&gt;
1354    12/18/2005                      Sandeep   Done 1&lt;br /&gt;
1355    12/25/2005                      Sandeep   Done 1&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Older Stuff===&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
week #  start date      last processed  who     status&lt;br /&gt;
****************************************************************&lt;br /&gt;
****************************************************************&lt;br /&gt;
***************************2002*********************************&lt;br /&gt;
**************************EARTHQUAKE****************************&lt;br /&gt;
1189    10/20/02        02/26/05        Josh    PENDING&lt;br /&gt;
1190    10/27/02                        Josh    PENDING&lt;br /&gt;
1191    11/03/02        03/04/05        Josh    DONE&lt;br /&gt;
1192    11/10/02        03/11/05        Josh    DONE&lt;br /&gt;
1193    11/17/02        03/11/05        Josh    DONE&lt;br /&gt;
1194    11/24/02        03/11/05        Josh    DONE&lt;br /&gt;
1195    12/01/02        03/11/05        Josh    RUNNING&lt;br /&gt;
1196    12/08/02        03/11/05        Josh    DONE&lt;br /&gt;
1197    12/15/02        03/11/05        Josh    DONE&lt;br /&gt;
1198    12/22/02        03/12/05        Josh    DONE&lt;br /&gt;
1199    12/29/02        03/12/05        Josh    DONE&lt;br /&gt;
&lt;br /&gt;
*************************2003***********************************&lt;br /&gt;
&lt;br /&gt;
1200    01/05/03        03/12/05        Josh    DONE&lt;br /&gt;
1201    01/12/03        03/12/05        Josh    DONE&lt;br /&gt;
1202    01/19/03        03/15/05        Josh    RUNNING&lt;br /&gt;
1203    01/26/03        03/14/05        Josh    DONE&lt;br /&gt;
1204    02/02/03        03/14/05        Josh    DONE&lt;br /&gt;
1205    02/09/03                        Josh    PENDING&lt;br /&gt;
1206    02/16/03                        Josh    PENDING&lt;br /&gt;
1207    02/23/03                        Josh    PENDING&lt;br /&gt;
1208    03/02/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1209    03/09/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1210    03/16/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1211    03/23/03        02/27/05        Josh    DONE&lt;br /&gt;
1212    03/30/03        02/26/05        Josh    DONE&lt;br /&gt;
1213    04/06/03        03/12/05        Josh    RUNNING&lt;br /&gt;
1214    04/13/03        02/26/05        Josh    CLEANED&lt;br /&gt;
1215    04/20/03        02/27/05        Josh    CLEANED&lt;br /&gt;
1216    04/27/03        02/26/05        Josh    PENDING&lt;br /&gt;
1217    05/04/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1218    05/11/03        02/27/05        Josh    NEEDS CLEAN&lt;br /&gt;
1219    05/18/03        02/27/05        Josh    WORKING&lt;br /&gt;
1220    05/25/03        02/27/05        Josh    PENDING&lt;br /&gt;
1221    06/01/03                        Josh    PENDING&lt;br /&gt;
1222    06/08/03                        Josh    PENDING&lt;br /&gt;
1223    06/15/03                        Josh    PENDING&lt;br /&gt;
1224    06/22/03                        Josh    PENDING&lt;br /&gt;
1225    06/29/03        02/26/05        Josh    FAILED&lt;br /&gt;
1226    07/06/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1227    07/13/03        02/26/05        Josh    FAILED&lt;br /&gt;
1228    07/20/03                        Josh    FAILED&lt;br /&gt;
1229    07/27/03        11/10/05        Tom     Submitted  &lt;br /&gt;
1230    08/03/03                        Josh    FAILED&lt;br /&gt;
1231    08/10/03                        Josh    FAILED&lt;br /&gt;
1232    08/17/03        11/10/05        Tom     Submitted     &lt;br /&gt;
1233    08/24/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1234    08/31/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1235    09/07/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1236    09/14/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1237    09/21/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1238    09/28/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1239    10/05/03        02/26/05        Josh    WORKING&lt;br /&gt;
1240    10/12/03        02/26/05        Josh    PENDING&lt;br /&gt;
1241    10/19/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1242    10/26/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1243    11/02/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1244    11/09/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1245    11/16/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1246    11/23/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1247    11/30/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1248    12/07/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1249    12/14/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1250    12/21/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1251    12/28/03        02/26/05        Josh    WORKING&lt;br /&gt;
&lt;br /&gt;
***************************2004*********************************&lt;br /&gt;
&lt;br /&gt;
1253    01/11/04        02/26/05        Josh    PENDING&lt;br /&gt;
1254    01/18/04                        Josh    PENDING&lt;br /&gt;
1255    01/25/04                        Josh    NEEDS CLEAN&lt;br /&gt;
1256    02/01/04                        Josh    PENDING&lt;br /&gt;
1257    02/08/04                        Josh    PENDING&lt;br /&gt;
1258    02/15/04                        Josh    PENDING&lt;br /&gt;
1259    02/22/04                        Josh    PENDING&lt;br /&gt;
1260    02/29/04                        Josh    PENDING&lt;br /&gt;
1261    03/07/04        02/26/05        Josh    PENDING&lt;br /&gt;
1262    03/14/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1263    03/21/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1264    03/28/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1265    04/04/04        02/26/05        Josh    FAILED  No files found, needs rerun&lt;br /&gt;
1266    04/11/04        02/26/05        Josh    FAILED  No files found, needs rerun&lt;br /&gt;
1267    04/18/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1268    04/25/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1269    05/02/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1270    05/09/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1271    05/16/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1272    05/23/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1273    05/30/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1274    06/06/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1275    06/13/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1276    06/20/04        02/27/05        Samik   DONE&lt;br /&gt;
1277    06/27/04        03/19/05        Jeff    NEEDS CLEAN&lt;br /&gt;
1278    07/04/04        03/19/05        Jeff    NEEDS CLEAN&lt;br /&gt;
1279    07/11/04        03/19/05        Jeff    NEEDS CLEAN&lt;br /&gt;
1280    07/18/04        03/19/05        Jeff    NEEDS CLEAN&lt;br /&gt;
1281    07/25/04        03/19/05        Jeff    NEEDS CLEAN&lt;br /&gt;
1282    08/01/04                        Josh    PENDING&lt;br /&gt;
1283    08/08/04                        Josh    PENDING&lt;br /&gt;
1284    08/15/04                        Josh    PENDING&lt;br /&gt;
1285    08/22/04                        Josh    PENDING&lt;br /&gt;
1286    08/29/04                        Josh    PENDING&lt;br /&gt;
1287    09/05/04                        Josh    PENDING&lt;br /&gt;
1288    09/12/04                        Josh    PENDING&lt;br /&gt;
1289    09/19/04        03/19/05        Jeff    ??&lt;br /&gt;
1290    09/26/04                        Josh    PENDING&lt;br /&gt;
1291    10/03/04                        Josh    PENDING&lt;br /&gt;
1292    10/10/04        03/19/05        Jeff    ??&lt;br /&gt;
1293    10/17/04                        Josh    PENDING&lt;br /&gt;
1294    10/24/04        03/19/05        Jeff    ??&lt;br /&gt;
1295    10/31/04        03/25/05        Tom     CLEAN&lt;br /&gt;
1296    11/07/04        04/07/05        Tom     Cleaning&lt;br /&gt;
1297    11/14/04        03/19/05        Jeff    ??&lt;br /&gt;
1298    11/21/04                                NEEDS CLEAN&lt;br /&gt;
1299    11/28/04                                NEEDS CLEAN&lt;br /&gt;
1300    12/05/04        08/16/05        Pravee  CLEAN&lt;br /&gt;
1301    12/12/04        08/19/05        Pravee  CLEAN&lt;br /&gt;
1302    12/19/04        08/19/05        Pravee  CLEAN&lt;br /&gt;
1303    12/26/04        08/26/05        Pravee  CLEAN&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Archaic (pre-1996)===&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
week #  start date      last processed  who     status&lt;br /&gt;
****************************************************************&lt;br /&gt;
At the moment, Jeff is dealing with these.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Jeff</name></author>	</entry>

	<entry>
		<id>http://gps.alaska.edu/internal/index.php?title=WHOS&amp;diff=185</id>
		<title>WHOS</title>
		<link rel="alternate" type="text/html" href="http://gps.alaska.edu/internal/index.php?title=WHOS&amp;diff=185"/>
				<updated>2006-07-04T22:00:14Z</updated>
		
		<summary type="html">&lt;p&gt;Jeff: /* 2006 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The WHOS list tracks assigned GPS weeks for processing.&lt;br /&gt;
&lt;br /&gt;
More details about the analysis are at [[GPS analysis system]].&lt;br /&gt;
&lt;br /&gt;
===Recent weeks===&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
week #  start date      last processed  who     status&lt;br /&gt;
****************************************************************&lt;br /&gt;
&lt;br /&gt;
1374     05/07/2006                      Venkat&lt;br /&gt;
1375     05/14/2006                      Trivikram&lt;br /&gt;
1376     05/21/2006                      Trivikram&lt;br /&gt;
1377     05/28/2006                      &lt;br /&gt;
1378     06/04/2006                      &lt;br /&gt;
1379     06/11/2006                      &lt;br /&gt;
1380     06/18/2006                      &lt;br /&gt;
1381     06/25/2006                      &lt;br /&gt;
1382     07/02/2006                      &lt;br /&gt;
1383     07/09/2006                      &lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===2006===&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
week #  start date      last processed  who     status&lt;br /&gt;
****************************************************************&lt;br /&gt;
1357&lt;br /&gt;
1358&lt;br /&gt;
1359     01/22/2006                      Sandeep   1&lt;br /&gt;
1360     01/29/2006                      Sandeep   1&lt;br /&gt;
1361     02/05/2006                      Trivikram/sandeep 1/ trivikram 2/soln running&lt;br /&gt;
1362     02/12/2006                      Trivikram/sandeep 1/ trivikram 2/soln running&lt;br /&gt;
1363     02/19/2006                      Venkat      CLEAN &lt;br /&gt;
1364     02/26/2006                      Trivikram/trivikram 1/&lt;br /&gt;
1365     03/05/2006                      Venkat&lt;br /&gt;
1366     03/12/2006                      Trivikram/trivikram 1/&lt;br /&gt;
1367     03/19/2006                      Trivikram/trivikram 1/&lt;br /&gt;
1368     03/26/2006                      Trivikram/trivikram 1/&lt;br /&gt;
1369     04/02/2006                      Trivikram&lt;br /&gt;
1370     04/09/2006                      Tom&lt;br /&gt;
1371     04/16/2006                      Venkat&lt;br /&gt;
1372     04/23/2006                      Venkat&lt;br /&gt;
1373     04/30/2006                      Venkat&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===2005===&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
week #  start date      last processed  who     status&lt;br /&gt;
****************************************************************&lt;br /&gt;
1304    01/02/2005      09/16/2005      Ryan    CLEAN&lt;br /&gt;
1305    01/09/2005      08/30/2005      Pravee  CLEAN&lt;br /&gt;
1306    01/16/2005      07/15/2005      Jill    WORKING&lt;br /&gt;
1307    01/23/2005      08/31/2005      Pravee  CLEAN&lt;br /&gt;
1308    01/30/2005      04/05/2005      Jill    CLEAN&lt;br /&gt;
1309    02/06/2005      08/31/2005      Pravee  CLEAN&lt;br /&gt;
1310    02/13/2005      03/12/2005      Julie   CLEAN&lt;br /&gt;
1311    02/20/2005      03/12/2005      Pravee  CLEANING&lt;br /&gt;
1312    02/27/2005      03/31/2005      Tom     CLEAN&lt;br /&gt;
1313    03/06/2005      03/31/2005      Pravee  CLEANING&lt;br /&gt;
1314    03/13/2005      03/31/2005      Tom     CLEAN&lt;br /&gt;
1315    03/20/2005      06/13/2005      Samik   CLEAN pt file size&amp;lt;500&lt;br /&gt;
1316    03/27/2005      06/13/2005      Samik   CLEAN pt file size&amp;lt;400&lt;br /&gt;
1317    04/03/2005      06/16/2005      Samik   CLEAN pt file size&amp;lt;400&lt;br /&gt;
1318    04/10/2005      9/15/2005       Ryan    not sure what's going on?&lt;br /&gt;
1319    04/17/2005      10/4/2005       Ryan    ak SUBMITTED&lt;br /&gt;
1320    04/24/2005                      Sandeep   Done 1 2&lt;br /&gt;
1321    05/01/2005                      Sandeep   Done 1 2&lt;br /&gt;
1322    05/08/2005                      Sandeep   Done 1 2&lt;br /&gt;
1323    05/15/2005                      Sandeep   Done 1 2&lt;br /&gt;
1324    05/22/2005                      Sandeep   Done 1 2&lt;br /&gt;
1325    05/29/2005                      Sigrun    DONE &lt;br /&gt;
1326    06/05/2005                      Sigrun    DONE&lt;br /&gt;
1327    06/12/2005                      Sandeep   Done 1 2  &lt;br /&gt;
1328    06/19/2005                      Sandeep   Done 1 2&lt;br /&gt;
1329    06/26/2005                      Sandeep   Done 1 2 &lt;br /&gt;
1330    07/03/2005      9/14/2005       Ryan      DONE&lt;br /&gt;
1331    07/10/2005      11/10/2005      Tom       Submitted&lt;br /&gt;
1332    07/17/2005                      Sandeep   Done 1 2/Venkat&lt;br /&gt;
1333    07/24/2005                      Sandeep   Done 1/trivikram soln running&lt;br /&gt;
1334    07/31/2005                      Sandeep   Done 1/trivikram soln running &lt;br /&gt;
1335    08/07/2005      9/12/2005       Tom       Done 1/trivikram fatal error problem(2 days)&lt;br /&gt;
1336    08/14/2005                      Sandeep   Done 1 2/Venkat&lt;br /&gt;
1337    08/21/2005      10/17/2005      Ryan    ak CLEAN pt file size&amp;lt;500&lt;br /&gt;
1338    08/28/2005      10/17/2005      Ryan    ak CLEAN&lt;br /&gt;
1339    09/04/2005      11/08/2005      Ryan    working on it /trivikram&lt;br /&gt;
1340    09/11/2005                      Sandeep   Done 1-2/Venkat&lt;br /&gt;
1341    09/18/2005                      Sandeep   Done 1-2/Venkat   &lt;br /&gt;
1342    09/25/2005                      Sandeep   Done 1-2/Venkat&lt;br /&gt;
1343    10/02/2005                      Sandeep   CLEAN 1-2/Venkat&lt;br /&gt;
1344    10/09/2005                      Sandeep   Done 1-2/Venkat&lt;br /&gt;
1345    10/16/2005                      Sandeep   Done 1-2/Venkat&lt;br /&gt;
1346    10/23/2005                      Sandeep   Done 1-2/Venkat &lt;br /&gt;
1347    10/30/2005                      Sandeep   Done 1-2/Venkat&lt;br /&gt;
1348    11/06/2005                      Sandeep   Done 1-2/Venkat&lt;br /&gt;
1349    11/13/2005                      Sandeep   Done 1-2/Venkat&lt;br /&gt;
1350    11/20/2005                      Sandeep   Done 1-2/Venkat &lt;br /&gt;
1351    11/27/2005                      Sandeep   Done 1-2/Venkat&lt;br /&gt;
1352    12/04/2005                      Sandeep   Done 1 2&lt;br /&gt;
1353    12/11/2005                      Sandeep   Done 1 2&lt;br /&gt;
1354    12/18/2005                      Sandeep   Done 1&lt;br /&gt;
1355    12/25/2005                      Sandeep   Done 1&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Older Stuff===&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
week #  start date      last processed  who     status&lt;br /&gt;
****************************************************************&lt;br /&gt;
****************************************************************&lt;br /&gt;
***************************2002*********************************&lt;br /&gt;
**************************EARTHQUAKE****************************&lt;br /&gt;
1189    10/20/02        02/26/05        Josh    PENDING&lt;br /&gt;
1190    10/27/02                        Josh    PENDING&lt;br /&gt;
1191    11/03/02        03/04/05        Josh    DONE&lt;br /&gt;
1192    11/10/02        03/11/05        Josh    DONE&lt;br /&gt;
1193    11/17/02        03/11/05        Josh    DONE&lt;br /&gt;
1194    11/24/02        03/11/05        Josh    DONE&lt;br /&gt;
1195    12/01/02        03/11/05        Josh    RUNNING&lt;br /&gt;
1196    12/08/02        03/11/05        Josh    DONE&lt;br /&gt;
1197    12/15/02        03/11/05        Josh    DONE&lt;br /&gt;
1198    12/22/02        03/12/05        Josh    DONE&lt;br /&gt;
1199    12/29/02        03/12/05        Josh    DONE&lt;br /&gt;
&lt;br /&gt;
*************************2003***********************************&lt;br /&gt;
&lt;br /&gt;
1200    01/05/03        03/12/05        Josh    DONE&lt;br /&gt;
1201    01/12/03        03/12/05        Josh    DONE&lt;br /&gt;
1202    01/19/03        03/15/05        Josh    RUNNING&lt;br /&gt;
1203    01/26/03        03/14/05        Josh    DONE&lt;br /&gt;
1204    02/02/03        03/14/05        Josh    DONE&lt;br /&gt;
1205    02/09/03                        Josh    PENDING&lt;br /&gt;
1206    02/16/03                        Josh    PENDING&lt;br /&gt;
1207    02/23/03                        Josh    PENDING&lt;br /&gt;
1208    03/02/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1209    03/09/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1210    03/16/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1211    03/23/03        02/27/05        Josh    DONE&lt;br /&gt;
1212    03/30/03        02/26/05        Josh    DONE&lt;br /&gt;
1213    04/06/03        03/12/05        Josh    RUNNING&lt;br /&gt;
1214    04/13/03        02/26/05        Josh    CLEANED&lt;br /&gt;
1215    04/20/03        02/27/05        Josh    CLEANED&lt;br /&gt;
1216    04/27/03        02/26/05        Josh    PENDING&lt;br /&gt;
1217    05/04/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1218    05/11/03        02/27/05        Josh    NEEDS CLEAN&lt;br /&gt;
1219    05/18/03        02/27/05        Josh    WORKING&lt;br /&gt;
1220    05/25/03        02/27/05        Josh    PENDING&lt;br /&gt;
1221    06/01/03                        Josh    PENDING&lt;br /&gt;
1222    06/08/03                        Josh    PENDING&lt;br /&gt;
1223    06/15/03                        Josh    PENDING&lt;br /&gt;
1224    06/22/03                        Josh    PENDING&lt;br /&gt;
1225    06/29/03        02/26/05        Josh    FAILED&lt;br /&gt;
1226    07/06/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1227    07/13/03        02/26/05        Josh    FAILED&lt;br /&gt;
1228    07/20/03                        Josh    FAILED&lt;br /&gt;
1229    07/27/03        11/10/05        Tom     Submitted  &lt;br /&gt;
1230    08/03/03                        Josh    FAILED&lt;br /&gt;
1231    08/10/03                        Josh    FAILED&lt;br /&gt;
1232    08/17/03        11/10/05        Tom     Submitted     &lt;br /&gt;
1233    08/24/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1234    08/31/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1235    09/07/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1236    09/14/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1237    09/21/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1238    09/28/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1239    10/05/03        02/26/05        Josh    WORKING&lt;br /&gt;
1240    10/12/03        02/26/05        Josh    PENDING&lt;br /&gt;
1241    10/19/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1242    10/26/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1243    11/02/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1244    11/09/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1245    11/16/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1246    11/23/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1247    11/30/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1248    12/07/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1249    12/14/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1250    12/21/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1251    12/28/03        02/26/05        Josh    WORKING&lt;br /&gt;
&lt;br /&gt;
***************************2004*********************************&lt;br /&gt;
&lt;br /&gt;
1253    01/11/04        02/26/05        Josh    PENDING&lt;br /&gt;
1254    01/18/04                        Josh    PENDING&lt;br /&gt;
1255    01/25/04                        Josh    NEEDS CLEAN&lt;br /&gt;
1256    02/01/04                        Josh    PENDING&lt;br /&gt;
1257    02/08/04                        Josh    PENDING&lt;br /&gt;
1258    02/15/04                        Josh    PENDING&lt;br /&gt;
1259    02/22/04                        Josh    PENDING&lt;br /&gt;
1260    02/29/04                        Josh    PENDING&lt;br /&gt;
1261    03/07/04        02/26/05        Josh    PENDING&lt;br /&gt;
1262    03/14/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1263    03/21/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1264    03/28/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1265    04/04/04        02/26/05        Josh    FAILED  No files found, needs rerun&lt;br /&gt;
1266    04/11/04        02/26/05        Josh    FAILED  No files found, needs rerun&lt;br /&gt;
1267    04/18/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1268    04/25/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1269    05/02/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1270    05/09/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1271    05/16/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1272    05/23/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1273    05/30/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1274    06/06/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1275    06/13/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1276    06/20/04        02/27/05        Samik   DONE&lt;br /&gt;
1277    06/27/04        03/19/05        Jeff    NEEDS CLEAN&lt;br /&gt;
1278    07/04/04        03/19/05        Jeff    NEEDS CLEAN&lt;br /&gt;
1279    07/11/04        03/19/05        Jeff    NEEDS CLEAN&lt;br /&gt;
1280    07/18/04        03/19/05        Jeff    NEEDS CLEAN&lt;br /&gt;
1281    07/25/04        03/19/05        Jeff    NEEDS CLEAN&lt;br /&gt;
1282    08/01/04                        Josh    PENDING&lt;br /&gt;
1283    08/08/04                        Josh    PENDING&lt;br /&gt;
1284    08/15/04                        Josh    PENDING&lt;br /&gt;
1285    08/22/04                        Josh    PENDING&lt;br /&gt;
1286    08/29/04                        Josh    PENDING&lt;br /&gt;
1287    09/05/04                        Josh    PENDING&lt;br /&gt;
1288    09/12/04                        Josh    PENDING&lt;br /&gt;
1289    09/19/04        03/19/05        Jeff    ??&lt;br /&gt;
1290    09/26/04                        Josh    PENDING&lt;br /&gt;
1291    10/03/04                        Josh    PENDING&lt;br /&gt;
1292    10/10/04        03/19/05        Jeff    ??&lt;br /&gt;
1293    10/17/04                        Josh    PENDING&lt;br /&gt;
1294    10/24/04        03/19/05        Jeff    ??&lt;br /&gt;
1295    10/31/04        03/25/05        Tom     CLEAN&lt;br /&gt;
1296    11/07/04        04/07/05        Tom     Cleaning&lt;br /&gt;
1297    11/14/04        03/19/05        Jeff    ??&lt;br /&gt;
1298    11/21/04                                NEEDS CLEAN&lt;br /&gt;
1299    11/28/04                                NEEDS CLEAN&lt;br /&gt;
1300    12/05/04        08/16/05        Pravee  CLEAN&lt;br /&gt;
1301    12/12/04        08/19/05        Pravee  CLEAN&lt;br /&gt;
1302    12/19/04        08/19/05        Pravee  CLEAN&lt;br /&gt;
1303    12/26/04        08/26/05        Pravee  CLEAN&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Archaic (pre-1996)===&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
week #  start date      last processed  who     status&lt;br /&gt;
****************************************************************&lt;br /&gt;
At the moment, Jeff is dealing with these.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Jeff</name></author>	</entry>

	<entry>
		<id>http://gps.alaska.edu/internal/index.php?title=WHOS&amp;diff=184</id>
		<title>WHOS</title>
		<link rel="alternate" type="text/html" href="http://gps.alaska.edu/internal/index.php?title=WHOS&amp;diff=184"/>
				<updated>2006-07-04T21:58:53Z</updated>
		
		<summary type="html">&lt;p&gt;Jeff: /* Recent weeks */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The WHOS list tracks assigned GPS weeks for processing.&lt;br /&gt;
&lt;br /&gt;
More details about the analysis are at [[GPS analysis system]].&lt;br /&gt;
&lt;br /&gt;
===Recent weeks===&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
week #  start date      last processed  who     status&lt;br /&gt;
****************************************************************&lt;br /&gt;
&lt;br /&gt;
1374     05/07/2006                      Venkat&lt;br /&gt;
1375     05/14/2006                      Trivikram&lt;br /&gt;
1376     05/21/2006                      Trivikram&lt;br /&gt;
1377     05/28/2006                      &lt;br /&gt;
1378     06/04/2006                      &lt;br /&gt;
1379     06/11/2006                      &lt;br /&gt;
1380     06/18/2006                      &lt;br /&gt;
1381     06/25/2006                      &lt;br /&gt;
1382     07/02/2006                      &lt;br /&gt;
1383     07/09/2006                      &lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===2006===&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
week #  start date      last processed  who     status&lt;br /&gt;
****************************************************************&lt;br /&gt;
1357&lt;br /&gt;
1358&lt;br /&gt;
1359    1/22/2006                       Sandeep&lt;br /&gt;
1360    1/29/2006                       Sandeep&lt;br /&gt;
1361    2/05/2006                       Trivikram&lt;br /&gt;
1362    2/12/2006                       Trivikram&lt;br /&gt;
1363    2/19/2006                       Venkat    CLEAN &lt;br /&gt;
1364    2/26/2006                       Trivikram &lt;br /&gt;
1365    3/05/2006                       Venkat&lt;br /&gt;
1366    3/12/2006                       Trivikram&lt;br /&gt;
1367    3/19/2006                       Trivikram&lt;br /&gt;
1368    3/26/2006                       Trivikram&lt;br /&gt;
1369    4/02/2006        06/08/06       Trivikram   &lt;br /&gt;
1370    04/09/2006       05/31/2006     Tom     running&lt;br /&gt;
1371    4/16/2006                       Venkat&lt;br /&gt;
1372    4/23/2006                       Venkat&lt;br /&gt;
1373    4/30/2006                       Venkat&lt;br /&gt;
1374    5/07/2006                       Venkat&lt;br /&gt;
1375    5/14/2006                       Trivikram&lt;br /&gt;
1376    5/21/2006                       Trivikram&lt;br /&gt;
1377    &lt;br /&gt;
1378    &lt;br /&gt;
1379&lt;br /&gt;
1380&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===2005===&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
week #  start date      last processed  who     status&lt;br /&gt;
****************************************************************&lt;br /&gt;
1304    01/02/2005      09/16/2005      Ryan    CLEAN&lt;br /&gt;
1305    01/09/2005      08/30/2005      Pravee  CLEAN&lt;br /&gt;
1306    01/16/2005      07/15/2005      Jill    WORKING&lt;br /&gt;
1307    01/23/2005      08/31/2005      Pravee  CLEAN&lt;br /&gt;
1308    01/30/2005      04/05/2005      Jill    CLEAN&lt;br /&gt;
1309    02/06/2005      08/31/2005      Pravee  CLEAN&lt;br /&gt;
1310    02/13/2005      03/12/2005      Julie   CLEAN&lt;br /&gt;
1311    02/20/2005      03/12/2005      Pravee  CLEANING&lt;br /&gt;
1312    02/27/2005      03/31/2005      Tom     CLEAN&lt;br /&gt;
1313    03/06/2005      03/31/2005      Pravee  CLEANING&lt;br /&gt;
1314    03/13/2005      03/31/2005      Tom     CLEAN&lt;br /&gt;
1315    03/20/2005      06/13/2005      Samik   CLEAN pt file size&amp;lt;500&lt;br /&gt;
1316    03/27/2005      06/13/2005      Samik   CLEAN pt file size&amp;lt;400&lt;br /&gt;
1317    04/03/2005      06/16/2005      Samik   CLEAN pt file size&amp;lt;400&lt;br /&gt;
1318    04/10/2005      9/15/2005       Ryan    not sure what's going on?&lt;br /&gt;
1319    04/17/2005      10/4/2005       Ryan    ak SUBMITTED&lt;br /&gt;
1347    10/30/2005                      Sandeep   Done 1/tr&lt;br /&gt;
1320    04/24/2005                      Sandeep   Done 1 2&lt;br /&gt;
1321    05/01/2005                      Sandeep   Done 1 2&lt;br /&gt;
1322    05/08/2005                      Sandeep   Done 1 2&lt;br /&gt;
1323    05/15/2005                      Sandeep   Done 1 2&lt;br /&gt;
1324    05/22/2005                      Sandeep   Done 1 2&lt;br /&gt;
1325    05/29/2005                      Sigrun    DONE &lt;br /&gt;
1326    06/05/2005                      Sigrun    DONE&lt;br /&gt;
1327    06/12/2005                      Sandeep   Done 1 2  &lt;br /&gt;
1328    06/19/2005                      Sandeep   Done 1 2&lt;br /&gt;
1329    06/26/2005                      Sandeep   Done 1 2 &lt;br /&gt;
1330    07/03/2005      9/14/2005       Ryan      DONE&lt;br /&gt;
1331    07/10/2005      11/10/2005      Tom       Submitted&lt;br /&gt;
1332    07/17/2005                      Sandeep   Done 1 2/Venkat&lt;br /&gt;
1333    07/24/2005                      Sandeep   Done 1/trivikram soln running&lt;br /&gt;
1334    07/31/2005                      Sandeep   Done 1/trivikram soln running &lt;br /&gt;
1335    08/07/2005      9/12/2005       Tom       Done 1/trivikram fatal error problem(2 days)&lt;br /&gt;
1336    08/14/2005                      Sandeep   Done 1 2/Venkat&lt;br /&gt;
1337    08/21/2005      10/17/2005      Ryan    ak CLEAN pt file size&amp;lt;500&lt;br /&gt;
1338    08/28/2005      10/17/2005      Ryan    ak CLEAN&lt;br /&gt;
1339    09/04/2005      11/08/2005      Ryan    working on it /trivikram&lt;br /&gt;
1340    09/11/2005                      Sandeep   Done 1-2/Venkat&lt;br /&gt;
1341    09/18/2005                      Sandeep   Done 1-2/Venkat   &lt;br /&gt;
1342    09/25/2005                      Sandeep   Done 1-2/Venkat&lt;br /&gt;
1343    10/02/2005                      Sandeep   CLEAN 1-2/Venkat&lt;br /&gt;
1344    10/09/2005                      Sandeep   Done 1-2/Venkat&lt;br /&gt;
1345    10/16/2005                      Sandeep   Done 1-2/Venkat&lt;br /&gt;
1346    10/23/2005                      Sandeep   Done 1-2/Venkat &lt;br /&gt;
1347    10/30/2005                      Sandeep   Done 1-2/Venkat&lt;br /&gt;
1348    11/06/2005                      Sandeep   Done 1-2/Venkat&lt;br /&gt;
1349    11/13/2005                      Sandeep   Done 1-2/Venkat&lt;br /&gt;
1350    11/20/2005                      Sandeep   Done 1-2/Venkat &lt;br /&gt;
1351    11/27/2005                      Sandeep   Done 1-2/Venkat&lt;br /&gt;
1352    12/04/2005                      Sandeep   Done 1 2&lt;br /&gt;
1353    12/11/2005                      Sandeep   Done 1 2&lt;br /&gt;
1354    12/18/2005                      Sandeep   Done 1&lt;br /&gt;
1356    12/25/2005                      Sandeep   Done 1&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Older Stuff===&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
week #  start date      last processed  who     status&lt;br /&gt;
****************************************************************&lt;br /&gt;
****************************************************************&lt;br /&gt;
***************************2002*********************************&lt;br /&gt;
**************************EARTHQUAKE****************************&lt;br /&gt;
1189    10/20/02        02/26/05        Josh    PENDING&lt;br /&gt;
1190    10/27/02                        Josh    PENDING&lt;br /&gt;
1191    11/03/02        03/04/05        Josh    DONE&lt;br /&gt;
1192    11/10/02        03/11/05        Josh    DONE&lt;br /&gt;
1193    11/17/02        03/11/05        Josh    DONE&lt;br /&gt;
1194    11/24/02        03/11/05        Josh    DONE&lt;br /&gt;
1195    12/01/02        03/11/05        Josh    RUNNING&lt;br /&gt;
1196    12/08/02        03/11/05        Josh    DONE&lt;br /&gt;
1197    12/15/02        03/11/05        Josh    DONE&lt;br /&gt;
1198    12/22/02        03/12/05        Josh    DONE&lt;br /&gt;
1199    12/29/02        03/12/05        Josh    DONE&lt;br /&gt;
&lt;br /&gt;
*************************2003***********************************&lt;br /&gt;
&lt;br /&gt;
1200    01/05/03        03/12/05        Josh    DONE&lt;br /&gt;
1201    01/12/03        03/12/05        Josh    DONE&lt;br /&gt;
1202    01/19/03        03/15/05        Josh    RUNNING&lt;br /&gt;
1203    01/26/03        03/14/05        Josh    DONE&lt;br /&gt;
1204    02/02/03        03/14/05        Josh    DONE&lt;br /&gt;
1205    02/09/03                        Josh    PENDING&lt;br /&gt;
1206    02/16/03                        Josh    PENDING&lt;br /&gt;
1207    02/23/03                        Josh    PENDING&lt;br /&gt;
1208    03/02/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1209    03/09/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1210    03/16/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1211    03/23/03        02/27/05        Josh    DONE&lt;br /&gt;
1212    03/30/03        02/26/05        Josh    DONE&lt;br /&gt;
1213    04/06/03        03/12/05        Josh    RUNNING&lt;br /&gt;
1214    04/13/03        02/26/05        Josh    CLEANED&lt;br /&gt;
1215    04/20/03        02/27/05        Josh    CLEANED&lt;br /&gt;
1216    04/27/03        02/26/05        Josh    PENDING&lt;br /&gt;
1217    05/04/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1218    05/11/03        02/27/05        Josh    NEEDS CLEAN&lt;br /&gt;
1219    05/18/03        02/27/05        Josh    WORKING&lt;br /&gt;
1220    05/25/03        02/27/05        Josh    PENDING&lt;br /&gt;
1221    06/01/03                        Josh    PENDING&lt;br /&gt;
1222    06/08/03                        Josh    PENDING&lt;br /&gt;
1223    06/15/03                        Josh    PENDING&lt;br /&gt;
1224    06/22/03                        Josh    PENDING&lt;br /&gt;
1225    06/29/03        02/26/05        Josh    FAILED&lt;br /&gt;
1226    07/06/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1227    07/13/03        02/26/05        Josh    FAILED&lt;br /&gt;
1228    07/20/03                        Josh    FAILED&lt;br /&gt;
1229    07/27/03        11/10/05        Tom     Submitted  &lt;br /&gt;
1230    08/03/03                        Josh    FAILED&lt;br /&gt;
1231    08/10/03                        Josh    FAILED&lt;br /&gt;
1232    08/17/03        11/10/05        Tom     Submitted     &lt;br /&gt;
1233    08/24/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1234    08/31/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1235    09/07/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1236    09/14/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1237    09/21/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1238    09/28/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1239    10/05/03        02/26/05        Josh    WORKING&lt;br /&gt;
1240    10/12/03        02/26/05        Josh    PENDING&lt;br /&gt;
1241    10/19/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1242    10/26/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1243    11/02/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1244    11/09/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1245    11/16/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1246    11/23/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1247    11/30/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1248    12/07/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1249    12/14/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1250    12/21/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1251    12/28/03        02/26/05        Josh    WORKING&lt;br /&gt;
&lt;br /&gt;
***************************2004*********************************&lt;br /&gt;
&lt;br /&gt;
1253    01/11/04        02/26/05        Josh    PENDING&lt;br /&gt;
1254    01/18/04                        Josh    PENDING&lt;br /&gt;
1255    01/25/04                        Josh    NEEDS&lt;br /&gt;
1256    02/01/04                        Josh    PENDING&lt;br /&gt;
1257    02/08/04                        Josh    PENDING&lt;br /&gt;
1258    02/15/04                        Josh    PENDING&lt;br /&gt;
1259    02/22/04                        Josh    PENDING&lt;br /&gt;
1260    02/29/04                        Josh    PENDING&lt;br /&gt;
1261    03/07/04        02/26/05        Josh    PENDING&lt;br /&gt;
1262    03/14/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1263    03/21/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1264    03/28/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1265    04/04/04        02/26/05        Josh    FAILED  No files found, needs rerun&lt;br /&gt;
1266    04/11/04        02/26/05        Josh    FAILED  No files found, needs rerun&lt;br /&gt;
1267    04/18/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1268    04/25/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1269    05/02/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1270    05/09/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1271    05/16/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1272    05/23/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1273    05/30/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1274    06/06/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1275    06/13/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1276    06/20/04        02/27/05        Samik   DONE&lt;br /&gt;
1277    06/27/04        03/19/05        Jeff    NEEDS CLEAN&lt;br /&gt;
1278    07/04/04        03/19/05        Jeff    NEEDS CLEAN&lt;br /&gt;
1279    07/11/04        03/19/05        Jeff    NEEDS CLEAN&lt;br /&gt;
1280    07/18/04        03/19/05        Jeff    NEEDS CLEAN&lt;br /&gt;
1281    07/25/04        03/19/05        Jeff    NEEDS CLEAN&lt;br /&gt;
1282    08/01/04                        Josh    PENDING&lt;br /&gt;
1283    08/08/04                        Josh    PENDING&lt;br /&gt;
1284    08/15/04                        Josh    PENDING&lt;br /&gt;
1285    08/22/04                        Josh    PENDING&lt;br /&gt;
1286    08/29/04                        Josh    PENDING&lt;br /&gt;
1287    09/05/04                        Josh    PENDING&lt;br /&gt;
1288    09/12/04                        Josh    PENDING&lt;br /&gt;
1289    09/19/04        03/19/05        Jeff    ??&lt;br /&gt;
1290    09/26/04                        Josh    PENDING&lt;br /&gt;
1291    10/03/04                        Josh    PENDING&lt;br /&gt;
1292    10/10/04        03/19/05        Jeff    ??&lt;br /&gt;
1293    10/17/04                        Josh    PENDING&lt;br /&gt;
1294    10/24/04        03/19/05        Jeff    ??&lt;br /&gt;
1295    10/31/04        03/25/05        Tom     CLEAN&lt;br /&gt;
1296    11/07/04        04/07/05        Tom     Cleaning&lt;br /&gt;
1297    11/14/04        03/19/05        Jeff    ??&lt;br /&gt;
1298    11/21/04                                NEEDS CLEAN&lt;br /&gt;
1299    11/28/04                                NEEDS CLEAN&lt;br /&gt;
1300    12/05/04        08/16/05        Pravee  CLEAN&lt;br /&gt;
1301    12/12/04        08/19/05        Pravee  CLEAN&lt;br /&gt;
1302    12/19/04        08/19/05        Pravee  CLEAN&lt;br /&gt;
1303    12/26/04        08/26/05        Pravee  CLEAN&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Archaic (pre-1996)===&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
week #  start date      last processed  who     status&lt;br /&gt;
****************************************************************&lt;br /&gt;
At the moment, Jeff is dealing with these.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Jeff</name></author>	</entry>

	<entry>
		<id>http://gps.alaska.edu/internal/index.php?title=GPS_analysis_system&amp;diff=186</id>
		<title>GPS analysis system</title>
		<link rel="alternate" type="text/html" href="http://gps.alaska.edu/internal/index.php?title=GPS_analysis_system&amp;diff=186"/>
				<updated>2006-05-20T13:55:14Z</updated>
		
		<summary type="html">&lt;p&gt;Jeff: /* What happens automatically? */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;We use a GPS data analysis system based on the [http://sideshow.jpl.nasa.gov GIPSY] software developed at JPL. Most of the GIPSY programs are called by shell scripts written by Jeff Freymueller. Using these scripts, we can analyze a large amount of data either as part of network solutions or in Precise Point Positioning (PPP) mode.&lt;br /&gt;
&lt;br /&gt;
===Where do you put RINEX files?===&lt;br /&gt;
RINEX files should be put into the hopper, &amp;lt;code&amp;gt;$RAWDATA/hopper&amp;lt;/code&amp;gt;. What, you don't have RINEX files yet? See [[RINEXing]]. Once files are in the hopper, you can either let the first processing stages happen automatically overnight (see next section), or run the &amp;lt;code&amp;gt;autofront&amp;lt;/code&amp;gt; and &amp;lt;code&amp;gt;autoclean&amp;lt;/code&amp;gt; programs manually.&lt;br /&gt;
&lt;br /&gt;
===What happens automatically?===&lt;br /&gt;
&lt;br /&gt;
Quite a lot.&lt;br /&gt;
&lt;br /&gt;
Autoftp runs every night beginning at 6pm local time and fetches data files. These are placed into the '''hopper''' ($RAWDATA/hopper), the directory where all data files go for entry into the system and processing. Autofront then runs at midnight to process all files in the hopper, including any placed there manually (from campaigns, for example). Finally, autoclean runs at 4am local time to carry out automated screening for cycle slips and other bad data.&lt;br /&gt;
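The nightly schedule above can be written as an ordinary crontab. The entries below are illustrative only; the installed paths and exact minutes are placeholders, not the real configuration:&lt;br /&gt;

```
# Illustrative crontab for the nightly pipeline (paths are placeholders)
0 18 * * *  /usr/local/bin/autoftp      # fetch data starting at 6pm local
0  0 * * *  /usr/local/bin/autofront    # front-end processing at midnight
0  4 * * *  /usr/local/bin/autoclean    # automated cleaning at 4am
```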
&lt;br /&gt;
====Autoftp====&lt;br /&gt;
&lt;br /&gt;
Autoftp is an efficient data-fetching tool that uses wget to automatically fetch data from any of several internet GPS data archives. It reads a list of desired sites from a ''request file'', which contains the date in its filename, and attempts to find and download data from as many of them as possible. It is intended to run daily under cron; when accompanied by another simple program that generates a standard request file each day, it can fetch a standard set of sites for analysis with no manual intervention. Because it records in the request file which sites it has already found, autoftp can be run multiple times with the same request file without re-fetching data. This is ideal for the real world, in which data from some sites are available rapidly while data from other sites may take many hours or days to become available.&lt;br /&gt;
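The request-file bookkeeping can be sketched in a few lines of shell. Everything here is illustrative (the site names and the site:status layout are stand-ins, not the real request-file format):&lt;br /&gt;

```shell
#!/bin/sh
# Hypothetical sketch of autoftp's request-file bookkeeping: sites
# already marked "found" are skipped, so rerunning with the same
# request file never re-fetches data.
entries="fair:found whit: clgo:found"   # stand-in for a request file
fetched=""
for entry in $entries; do
  site=${entry%%:*}                     # site name before the colon
  status=${entry#*:}                    # status after the colon
  if [ "$status" = "found" ]; then
    echo "skip $site (already fetched on an earlier run)"
  else
    echo "fetch $site"                  # the real script would call wget here
    fetched="$fetched$site"
  fi
done
```

Only the unmarked site is fetched; a second run with all sites marked would fetch nothing.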
&lt;br /&gt;
====Autofront====&lt;br /&gt;
&lt;br /&gt;
Autofront is a script, intended to run under cron, that carries out the initial &amp;quot;front end&amp;quot; processing on a set of GPS data files. When executed, it will process all files in the '''hopper''' directory and place each resulting qm file into the appropriate week directory.&lt;br /&gt;
&lt;br /&gt;
Autofront runs the following steps:&lt;br /&gt;
1. Checks the validity of each RINEX file and repairs some common problems.&lt;br /&gt;
2. Depending on receiver type, clockprep -fixtags&lt;br /&gt;
3. (optional, presently not default) PhasEdit&lt;br /&gt;
4. ninja&lt;br /&gt;
&lt;br /&gt;
====Autoclean====&lt;br /&gt;
&lt;br /&gt;
Autoclean carries out automated cleaning of cycle slips, based on point positioning solutions. It is quite effective and at present it rarely misses cycle slips unless they are smaller than its minimum tolerance (10 cm). Autoclean operates on an ''edit-request'' file, which contains the name of the directory (week directory) and a list of qm files that need to be cleaned. It will clean all files on the list as long as orbits and clocks are available, and it marks off files that have been cleaned so that it can safely be run multiple times.&lt;br /&gt;
&lt;br /&gt;
Autoclean operates in an iterative mode. Its zeroth iteration is a pseudorange-only solution that identifies and deletes extremely bad pseudorange data; in this step it uses a tolerance that catches only grossly biased data. (Explain it). It then carries out one or more iterations of screening the phase data. In each iteration, it uses postbreak to identify discontinuities in the residuals of a point positioning solution. Postbreak is run with an adaptive tolerance (minimum 10 cm), and it is critical that my slightly modified version of postbreak be used. If any cycle slips are discovered, they are flagged and another iteration is run. Autoclean runs a maximum of 4 iterations on the phase data.&lt;br /&gt;
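The iteration structure reads roughly like this in shell; the slip counts below are faked stand-ins for real postbreak output, so this is a schematic of the loop, not the actual script:&lt;br /&gt;

```shell
#!/bin/sh
# Schematic of autoclean's phase-screening loop: keep re-screening
# until no new cycle slips turn up, capped at 4 iterations.
max_iter=4
iter=0
new_slips=1
while [ "$new_slips" -gt 0 ]; do
  if [ "$iter" -ge "$max_iter" ]; then break; fi
  iter=$((iter + 1))
  # pretend postbreak-style screening flags slips in the first
  # two passes only (fake data for illustration)
  case $iter in
    1|2) new_slips=1 ;;
    *)   new_slips=0 ;;
  esac
  echo "iteration $iter: $new_slips new slip(s) flagged"
done
```

With these fake counts the loop stops after the third pass, the first clean one.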
&lt;br /&gt;
===Where do the data files go?===&lt;br /&gt;
Data files from each station are stored in the QM format that is native to GIPSY. QM files (and all other files) are stored in directories by GPS week. For each [[week_directory|week directory]] there are several subdirectories; qm files are stored in &amp;lt;code&amp;gt;$ANALYSIS/wwww/qm&amp;lt;/code&amp;gt;, where &amp;lt;code&amp;gt;wwww&amp;lt;/code&amp;gt; is the 4-character GPS week number (with a leading zero if needed).&lt;br /&gt;
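For example, building the qm path for a pre-1000 week shows why the leading zero matters; the $ANALYSIS value below is a placeholder, not the real installation root:&lt;br /&gt;

```shell
#!/bin/sh
# Sketch of the week-directory path rule described above.
ANALYSIS=/home/akda/analysis        # placeholder analysis root
week=987                            # a pre-1000 GPS week
wwww=$(printf '%04d' "$week")       # 4-character week number, zero-padded
qmdir="$ANALYSIS/$wwww/qm"
echo "$qmdir"
```

This prints the directory with the padded week number, 0987 rather than 987.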
&lt;br /&gt;
===Running Solutions===&lt;br /&gt;
&lt;br /&gt;
Solve, a very flexible script. (link to detailed help)&lt;br /&gt;
&lt;br /&gt;
Philosophy of solve&lt;br /&gt;
&lt;br /&gt;
Subnet files and campaign files&lt;br /&gt;
&lt;br /&gt;
Standard solutions&lt;br /&gt;
&lt;br /&gt;
(text of standard_Alaska_solution)&lt;br /&gt;
&lt;br /&gt;
Running several days at once: make-make-flt and make-alaska&lt;br /&gt;
&lt;br /&gt;
(text of a standard make-alaska file and variant)&lt;br /&gt;
&lt;br /&gt;
Running several weeks at once&lt;br /&gt;
&lt;br /&gt;
(text of sample rerun-* file)&lt;br /&gt;
&lt;br /&gt;
===Cleaning Solutions===&lt;br /&gt;
&lt;br /&gt;
Initial Explanation of terms&lt;br /&gt;
&lt;br /&gt;
Expected residuals from a clean solution&lt;br /&gt;
&lt;br /&gt;
Automated screening: postfit, the point file, postbreak&lt;br /&gt;
&lt;br /&gt;
Checking for bad pseudorange data: badp, allbadp&lt;br /&gt;
&lt;br /&gt;
Removing biased pseudorange data: del_pcode_arc&lt;br /&gt;
&lt;br /&gt;
Automatically identified cycle slips: breaks&lt;br /&gt;
&lt;br /&gt;
Quickly scanning through residuals: short_hand&lt;br /&gt;
&lt;br /&gt;
Limitations of short_hand&lt;br /&gt;
&lt;br /&gt;
Manually checking residuals and fixing problems (link)&lt;/div&gt;</summary>
		<author><name>Jeff</name></author>	</entry>

	<entry>
		<id>http://gps.alaska.edu/internal/index.php?title=GPS_analysis_system&amp;diff=42</id>
		<title>GPS analysis system</title>
		<link rel="alternate" type="text/html" href="http://gps.alaska.edu/internal/index.php?title=GPS_analysis_system&amp;diff=42"/>
				<updated>2006-05-20T13:46:17Z</updated>
		
		<summary type="html">&lt;p&gt;Jeff: /* Autofront */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;We use a GPS data analysis system based on the [http://sideshow.jpl.nasa.gov GIPSY] software developed at JPL. Most of the GIPSY programs are called by shell scripts written by Jeff Freymueller. Using these scripts, we can analyze a large amount of data either as part of network solutions or in Precise Point Positioning (PPP) mode.&lt;br /&gt;
&lt;br /&gt;
===Where do you put RINEX files?===&lt;br /&gt;
RINEX files should be put into the hopper, &amp;lt;code&amp;gt;$RAWDATA/hopper&amp;lt;/code&amp;gt;. What, you don't have RINEX files yet? See [[RINEXing]]. Once files are in the hopper, you can either let the first processing stages happen automatically overnight (see next section), or run the &amp;lt;code&amp;gt;autofront&amp;lt;/code&amp;gt; and &amp;lt;code&amp;gt;autoclean&amp;lt;/code&amp;gt; programs manually.&lt;br /&gt;
&lt;br /&gt;
===What happens automatically?===&lt;br /&gt;
&lt;br /&gt;
====Autoftp====&lt;br /&gt;
&lt;br /&gt;
Autoftp is an efficient data-fetching tool that uses wget to automatically fetch data from any of several internet GPS data archives. It reads a list of desired sites from a ''request file'', which contains the date in its filename, and attempts to find and download data from as many of them as possible. It is intended to run daily under cron; when accompanied by another simple program that generates a standard request file each day, it can fetch a standard set of sites for analysis with no manual intervention. Because it records in the request file which sites it has already found, autoftp can be run multiple times with the same request file without re-fetching data. This is ideal for the real world, in which data from some sites are available rapidly while data from other sites may take many hours or days to become available.&lt;br /&gt;
&lt;br /&gt;
====Autofront====&lt;br /&gt;
&lt;br /&gt;
Autofront is a script, intended to run under cron, that carries out the initial &amp;quot;front end&amp;quot; processing on a set of GPS data files. When executed, it will process all files in the '''hopper''' directory and place each resulting qm file into the appropriate week directory.&lt;br /&gt;
&lt;br /&gt;
Autofront runs the following steps:&lt;br /&gt;
1. Checks the validity of each RINEX file and repairs some common problems.&lt;br /&gt;
2. Depending on receiver type, clockprep -fixtags&lt;br /&gt;
3. (optional, presently not default) PhasEdit&lt;br /&gt;
4. ninja&lt;br /&gt;
&lt;br /&gt;
====Autoclean====&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Where do the data files go?===&lt;br /&gt;
Data files from each station are stored in the QM format that is native to GIPSY. QM files (and all other files) are stored in directories by GPS week. For each [[week_directory|week directory]] there are several subdirectories; qm files are stored in &amp;lt;code&amp;gt;$ANALYSIS/wwww/qm&amp;lt;/code&amp;gt;, where &amp;lt;code&amp;gt;wwww&amp;lt;/code&amp;gt; is the 4-character GPS week number (with a leading zero if needed).&lt;br /&gt;
&lt;br /&gt;
===Running Solutions===&lt;br /&gt;
&lt;br /&gt;
Solve, a very flexible script. (link to detailed help)&lt;br /&gt;
&lt;br /&gt;
Philosophy of solve&lt;br /&gt;
&lt;br /&gt;
Subnet files and campaign files&lt;br /&gt;
&lt;br /&gt;
Standard solutions&lt;br /&gt;
&lt;br /&gt;
(text of standard_Alaska_solution)&lt;br /&gt;
&lt;br /&gt;
Running several days at once: make-make-flt and make-alaska&lt;br /&gt;
&lt;br /&gt;
(text of a standard make-alaska file and variant)&lt;br /&gt;
&lt;br /&gt;
Running several weeks at once&lt;br /&gt;
&lt;br /&gt;
(text of sample rerun-* file)&lt;br /&gt;
&lt;br /&gt;
===Cleaning Solutions===&lt;br /&gt;
&lt;br /&gt;
Initial Explanation of terms&lt;br /&gt;
&lt;br /&gt;
Expected residuals from a clean solution&lt;br /&gt;
&lt;br /&gt;
Automated screening: postfit, the point file, postbreak&lt;br /&gt;
&lt;br /&gt;
Checking for bad pseudorange data: badp, allbadp&lt;br /&gt;
&lt;br /&gt;
Removing biased pseudorange data: del_pcode_arc&lt;br /&gt;
&lt;br /&gt;
Automatically identified cycle slips: breaks&lt;br /&gt;
&lt;br /&gt;
Quickly scanning through residuals: short_hand&lt;br /&gt;
&lt;br /&gt;
Limitations of short_hand&lt;br /&gt;
&lt;br /&gt;
Manually checking residuals and fixing problems (link)&lt;/div&gt;</summary>
		<author><name>Jeff</name></author>	</entry>

	<entry>
		<id>http://gps.alaska.edu/internal/index.php?title=GPS_analysis_system&amp;diff=41</id>
		<title>GPS analysis system</title>
		<link rel="alternate" type="text/html" href="http://gps.alaska.edu/internal/index.php?title=GPS_analysis_system&amp;diff=41"/>
				<updated>2006-05-20T13:41:42Z</updated>
		
		<summary type="html">&lt;p&gt;Jeff: /* Autoftp */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;We use a GPS data analysis system based on the [http://sideshow.jpl.nasa.gov GIPSY] software developed at JPL. Most of the GIPSY programs are called by shell scripts written by Jeff Freymueller. Using these scripts, we can analyze a large amount of data either as part of network solutions or in Precise Point Positioning (PPP) mode.&lt;br /&gt;
&lt;br /&gt;
===Where do you put RINEX files?===&lt;br /&gt;
RINEX files should be put into the hopper, &amp;lt;code&amp;gt;$RAWDATA/hopper&amp;lt;/code&amp;gt;. What, you don't have RINEX files yet? See [[RINEXing]]. Once files are in the hopper, you can either let the first processing stages happen automatically overnight (see next section), or run the &amp;lt;code&amp;gt;autofront&amp;lt;/code&amp;gt; and &amp;lt;code&amp;gt;autoclean&amp;lt;/code&amp;gt; programs manually.&lt;br /&gt;
&lt;br /&gt;
===What happens automatically?===&lt;br /&gt;
&lt;br /&gt;
====Autoftp====&lt;br /&gt;
&lt;br /&gt;
Autoftp is an efficient data-fetching tool that uses wget to automatically fetch data from any of several internet GPS data archives. It reads a list of desired sites from a ''request file'', which contains the date in its filename, and attempts to find and download data from as many of them as possible. It is intended to run daily under cron; when accompanied by another simple program that generates a standard request file each day, it can fetch a standard set of sites for analysis with no manual intervention. Because it records in the request file which sites it has already found, autoftp can be run multiple times with the same request file without re-fetching data. This is ideal for the real world, in which data from some sites are available rapidly while data from other sites may take many hours or days to become available.&lt;br /&gt;
&lt;br /&gt;
====Autofront====&lt;br /&gt;
&lt;br /&gt;
====Autoclean====&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Where do the data files go?===&lt;br /&gt;
Data files from each station are stored in the QM format that is native to GIPSY. QM files (and all other files) are stored in directories by GPS week. For each [[week_directory|week directory]] there are several subdirectories; qm files are stored in &amp;lt;code&amp;gt;$ANALYSIS/wwww/qm&amp;lt;/code&amp;gt;, where &amp;lt;code&amp;gt;wwww&amp;lt;/code&amp;gt; is the 4-character GPS week number (with a leading zero if needed).&lt;br /&gt;
&lt;br /&gt;
===Running Solutions===&lt;br /&gt;
&lt;br /&gt;
Solve, a very flexible script. (link to detailed help)&lt;br /&gt;
&lt;br /&gt;
Philosophy of solve&lt;br /&gt;
&lt;br /&gt;
Subnet files and campaign files&lt;br /&gt;
&lt;br /&gt;
Standard solutions&lt;br /&gt;
&lt;br /&gt;
(text of standard_Alaska_solution)&lt;br /&gt;
&lt;br /&gt;
Running several days at once: make-make-flt and make-alaska&lt;br /&gt;
&lt;br /&gt;
(text of a standard make-alaska file and variant)&lt;br /&gt;
&lt;br /&gt;
Running several weeks at once&lt;br /&gt;
&lt;br /&gt;
(text of sample rerun-* file)&lt;br /&gt;
&lt;br /&gt;
===Cleaning Solutions===&lt;br /&gt;
&lt;br /&gt;
Initial Explanation of terms&lt;br /&gt;
&lt;br /&gt;
Expected residuals from a clean solution&lt;br /&gt;
&lt;br /&gt;
Automated screening: postfit, the point file, postbreak&lt;br /&gt;
&lt;br /&gt;
Checking for bad pseudorange data: badp, allbadp&lt;br /&gt;
&lt;br /&gt;
Removing biased pseudorange data: del_pcode_arc&lt;br /&gt;
&lt;br /&gt;
Automatically identified cycle slips: breaks&lt;br /&gt;
&lt;br /&gt;
Quickly scanning through residuals: short_hand&lt;br /&gt;
&lt;br /&gt;
Limitations of short_hand&lt;br /&gt;
&lt;br /&gt;
Manually checking residuals and fixing problems (link)&lt;/div&gt;</summary>
		<author><name>Jeff</name></author>	</entry>

	<entry>
		<id>http://gps.alaska.edu/internal/index.php?title=GPS_analysis_system&amp;diff=40</id>
		<title>GPS analysis system</title>
		<link rel="alternate" type="text/html" href="http://gps.alaska.edu/internal/index.php?title=GPS_analysis_system&amp;diff=40"/>
				<updated>2006-05-20T13:35:31Z</updated>
		
		<summary type="html">&lt;p&gt;Jeff: /* Running Solutions */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;We use a GPS data analysis system based on the [http://sideshow.jpl.nasa.gov GIPSY] software developed at JPL. Most of the GIPSY programs are called by shell scripts written by Jeff Freymueller. Using these scripts, we can analyze a large amount of data either as part of network solutions or in Precise Point Positioning (PPP) mode.&lt;br /&gt;
&lt;br /&gt;
===Where do you put RINEX files?===&lt;br /&gt;
RINEX files should be put into the hopper, &amp;lt;code&amp;gt;$RAWDATA/hopper&amp;lt;/code&amp;gt;. What, you don't have RINEX files yet? See [[RINEXing]]. Once files are in the hopper, you can either let the first processing stages happen automatically overnight (see next section), or run the &amp;lt;code&amp;gt;autofront&amp;lt;/code&amp;gt; and &amp;lt;code&amp;gt;autoclean&amp;lt;/code&amp;gt; programs manually.&lt;br /&gt;
&lt;br /&gt;
===What happens automatically?===&lt;br /&gt;
&lt;br /&gt;
====Autoftp====&lt;br /&gt;
&lt;br /&gt;
====Autofront====&lt;br /&gt;
&lt;br /&gt;
====Autoclean====&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Where do the data files go?===&lt;br /&gt;
Data files from each station are stored in the QM format that is native to GIPSY. QM files (and all other files) are stored in directories by GPS week. For each [[week_directory|week directory]] there are several subdirectories; qm files are stored in &amp;lt;code&amp;gt;$ANALYSIS/wwww/qm&amp;lt;/code&amp;gt;, where &amp;lt;code&amp;gt;wwww&amp;lt;/code&amp;gt; is the 4-character GPS week number (with a leading zero if needed).&lt;br /&gt;
&lt;br /&gt;
===Running Solutions===&lt;br /&gt;
&lt;br /&gt;
Solve, a very flexible script. (link to detailed help)&lt;br /&gt;
&lt;br /&gt;
Philosophy of solve&lt;br /&gt;
&lt;br /&gt;
Subnet files and campaign files&lt;br /&gt;
&lt;br /&gt;
Standard solutions&lt;br /&gt;
&lt;br /&gt;
(text of standard_Alaska_solution)&lt;br /&gt;
&lt;br /&gt;
Running several days at once: make-make-flt and make-alaska&lt;br /&gt;
&lt;br /&gt;
(text of a standard make-alaska file and variant)&lt;br /&gt;
&lt;br /&gt;
Running several weeks at once&lt;br /&gt;
&lt;br /&gt;
(text of sample rerun-* file)&lt;br /&gt;
&lt;br /&gt;
===Cleaning Solutions===&lt;br /&gt;
&lt;br /&gt;
Initial Explanation of terms&lt;br /&gt;
&lt;br /&gt;
Expected residuals from a clean solution&lt;br /&gt;
&lt;br /&gt;
Automated screening: postfit, the point file, postbreak&lt;br /&gt;
&lt;br /&gt;
Checking for bad pseudorange data: badp, allbadp&lt;br /&gt;
&lt;br /&gt;
Removing biased pseudorange data: del_pcode_arc&lt;br /&gt;
&lt;br /&gt;
Automatically identified cycle slips: breaks&lt;br /&gt;
&lt;br /&gt;
Quickly scanning through residuals: short_hand&lt;br /&gt;
&lt;br /&gt;
Limitations of short_hand&lt;br /&gt;
&lt;br /&gt;
Manually checking residuals and fixing problems (link)&lt;/div&gt;</summary>
		<author><name>Jeff</name></author>	</entry>

	<entry>
		<id>http://gps.alaska.edu/internal/index.php?title=GPS_analysis_system&amp;diff=39</id>
		<title>GPS analysis system</title>
		<link rel="alternate" type="text/html" href="http://gps.alaska.edu/internal/index.php?title=GPS_analysis_system&amp;diff=39"/>
				<updated>2006-05-20T13:30:40Z</updated>
		
		<summary type="html">&lt;p&gt;Jeff: /* Cleaning Solutions */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;We use a GPS data analysis system based on the [http://sideshow.jpl.nasa.gov GIPSY] software developed at JPL. Most of the GIPSY programs are called by shell scripts written by Jeff Freymueller. Using these scripts, we can analyze a large amount of data either as part of network solutions or in Precise Point Positioning (PPP) mode.&lt;br /&gt;
&lt;br /&gt;
===Where do you put RINEX files?===&lt;br /&gt;
RINEX files should be put into the hopper, &amp;lt;code&amp;gt;$RAWDATA/hopper&amp;lt;/code&amp;gt;. What, you don't have RINEX files yet? See [[RINEXing]]. Once files are in the hopper, you can either let the first processing stages happen automatically overnight (see next section), or run the &amp;lt;code&amp;gt;autofront&amp;lt;/code&amp;gt; and &amp;lt;code&amp;gt;autoclean&amp;lt;/code&amp;gt; programs manually.&lt;br /&gt;
&lt;br /&gt;
===What happens automatically?===&lt;br /&gt;
&lt;br /&gt;
====Autoftp====&lt;br /&gt;
&lt;br /&gt;
====Autofront====&lt;br /&gt;
&lt;br /&gt;
====Autoclean====&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Where do the data files go?===&lt;br /&gt;
Data files from each station are stored in the QM format that is native to GIPSY. QM files (and all other files) are stored in directories by GPS week. For each [[week_directory|week directory]] there are several subdirectories; qm files are stored in &amp;lt;code&amp;gt;$ANALYSIS/wwww/qm&amp;lt;/code&amp;gt;, where &amp;lt;code&amp;gt;wwww&amp;lt;/code&amp;gt; is the 4-character GPS week number (with a leading zero if needed).&lt;br /&gt;
&lt;br /&gt;
===Running Solutions===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Cleaning Solutions===&lt;br /&gt;
&lt;br /&gt;
Initial Explanation of terms&lt;br /&gt;
&lt;br /&gt;
Expected residuals from a clean solution&lt;br /&gt;
&lt;br /&gt;
Automated screening: postfit, the point file, postbreak&lt;br /&gt;
&lt;br /&gt;
Checking for bad pseudorange data: badp, allbadp&lt;br /&gt;
&lt;br /&gt;
Removing biased pseudorange data: del_pcode_arc&lt;br /&gt;
&lt;br /&gt;
Automatically identified cycle slips: breaks&lt;br /&gt;
&lt;br /&gt;
Quickly scanning through residuals: short_hand&lt;br /&gt;
&lt;br /&gt;
Limitations of short_hand&lt;br /&gt;
&lt;br /&gt;
Manually checking residuals and fixing problems (link)&lt;/div&gt;</summary>
		<author><name>Jeff</name></author>	</entry>

	<entry>
		<id>http://gps.alaska.edu/internal/index.php?title=WHOS&amp;diff=43</id>
		<title>WHOS</title>
		<link rel="alternate" type="text/html" href="http://gps.alaska.edu/internal/index.php?title=WHOS&amp;diff=43"/>
				<updated>2006-05-19T21:16:03Z</updated>
		
		<summary type="html">&lt;p&gt;Jeff: /* Recent weeks */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The WHOS list tracks assigned GPS weeks for processing.&lt;br /&gt;
&lt;br /&gt;
More details about analysis at [[GPS analysis system]].&lt;br /&gt;
&lt;br /&gt;
===Recent weeks===&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
week #  start date      last processed  who     status&lt;br /&gt;
****************************************************************&lt;br /&gt;
1321    05/01/2005                      Jill&lt;br /&gt;
1322    05/08/2005                      Jill&lt;br /&gt;
1323    05/15/2005                      Jill&lt;br /&gt;
1324    05/22/2005                      Julie&lt;br /&gt;
1325    05/29/2005                      Sigrun   DONE &lt;br /&gt;
1326    06/05/2005                      Sigrun   DONE&lt;br /&gt;
1327    06/12/2005                      Sigrun&lt;br /&gt;
1328    06/19/2005                      Tom&lt;br /&gt;
1329    06/26/2005                      Tom&lt;br /&gt;
1330    07/03/2005      9/14/2005       Ryan    ak CLEAN pt file size&amp;lt;500&lt;br /&gt;
1331    07/10/2005      11/10/2005      Tom     Submitted&lt;br /&gt;
1332    07/17/2005                      Samik&lt;br /&gt;
1333    07/24/2005                      Julie&lt;br /&gt;
1334    07/31/2005                      Jill&lt;br /&gt;
1335    08/07/2005      9/12/2005       Tom     SUBMITTED&lt;br /&gt;
1336    08/14/2005                      Jill&lt;br /&gt;
1337    08/21/2005      10/17/2005      Ryan    ak CLEAN pt file size&amp;lt;500&lt;br /&gt;
1338    08/28/2005      10/17/2005      Ryan    ak CLEAN&lt;br /&gt;
1339    09/04/2005      11/08/2005      Ryan    working on it&lt;br /&gt;
1340    09/11/2005                      Julie&lt;br /&gt;
1341    09/18/2005                      Julie&lt;br /&gt;
1342    09/25/2005                      Tom&lt;br /&gt;
1343    10/02/2005                      Tom&lt;br /&gt;
1344    10/09/2005                      Tom&lt;br /&gt;
1345    10/16/2005      01/10/06        Ryan     Working on it&lt;br /&gt;
1346    10/23/2005                      (nobody)&lt;br /&gt;
&lt;br /&gt;
(skipping some weeks)&lt;br /&gt;
&lt;br /&gt;
1359     1/22/2006                      Sandeep&lt;br /&gt;
1360     1/29/2006                      Sandeep&lt;br /&gt;
1361     2/05/2006                      Trivikram&lt;br /&gt;
1362     2/12/2006                      Venkat&lt;br /&gt;
&lt;br /&gt;
(skipping some weeks)&lt;br /&gt;
&lt;br /&gt;
1374     5/07/2006                      (nobody)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===2005===&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
week #  start date      last processed  who     status&lt;br /&gt;
****************************************************************&lt;br /&gt;
1304    01/02/2005      09/16/2005      Ryan    CLEAN&lt;br /&gt;
1305    01/09/2005      08/30/2005      Pravee  CLEAN&lt;br /&gt;
1306    01/16/2005      07/15/2005      Jill    WORKING&lt;br /&gt;
1307    01/23/2005      08/31/2005      Pravee  CLEAN&lt;br /&gt;
1308    01/30/2005      04/05/2005      Jill    CLEAN&lt;br /&gt;
1309    02/06/2005      08/31/2005      Pravee  CLEAN&lt;br /&gt;
1310    02/13/2005      03/12/2005      Julie   CLEAN&lt;br /&gt;
1311    02/20/2005      03/12/2005      Pravee  CLEANING&lt;br /&gt;
1312    02/27/2005      03/31/2005      Tom     CLEAN&lt;br /&gt;
1313    03/06/2005      03/31/2005      Pravee  CLEANING&lt;br /&gt;
1314    03/13/2005      03/31/2005      Tom     CLEAN&lt;br /&gt;
1315    03/20/2005      06/13/2005      Samik   CLEAN pt file size&amp;lt;500&lt;br /&gt;
1316    03/27/2005      06/13/2005      Samik   CLEAN pt file size&amp;lt;400&lt;br /&gt;
1317    04/03/2005      06/16/2005      Samik   CLEAN pt file size&amp;lt;400&lt;br /&gt;
1318    04/10/2005      9/15/2005       Ryan    not sure what's going on?&lt;br /&gt;
1319    04/17/2005      10/4/2005       Ryan    ak SUBMITTED&lt;br /&gt;
1320    04/24/2005                      Jill&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Older Stuff===&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
week #  start date      last processed  who     status&lt;br /&gt;
****************************************************************&lt;br /&gt;
****************************************************************&lt;br /&gt;
***************************2002*********************************&lt;br /&gt;
**************************EARTHQUAKE****************************&lt;br /&gt;
1189    10/20/02        02/26/05        Josh    PENDING&lt;br /&gt;
1190    10/27/02                        Josh    PENDING&lt;br /&gt;
1191    11/03/02        03/04/05        Josh    DONE&lt;br /&gt;
1192    11/10/02        03/11/05        Josh    DONE&lt;br /&gt;
1193    11/17/02        03/11/05        Josh    DONE&lt;br /&gt;
1194    11/24/02        03/11/05        Josh    DONE&lt;br /&gt;
1195    12/01/02        03/11/05        Josh    RUNNING&lt;br /&gt;
1196    12/08/02        03/11/05        Josh    DONE&lt;br /&gt;
1197    12/15/02        03/11/05        Josh    DONE&lt;br /&gt;
1198    12/22/02        03/12/05        Josh    DONE&lt;br /&gt;
1199    12/29/02        03/12/05        Josh    DONE&lt;br /&gt;
&lt;br /&gt;
*************************2003***********************************&lt;br /&gt;
&lt;br /&gt;
1200    01/05/03        03/12/05        Josh    DONE&lt;br /&gt;
1201    01/12/03        03/12/05        Josh    DONE&lt;br /&gt;
1202    01/19/03        03/15/05        Josh    RUNNING&lt;br /&gt;
1203    01/26/03        03/14/05        Josh    DONE&lt;br /&gt;
1204    02/02/03        03/14/05        Josh    DONE&lt;br /&gt;
1205    02/09/03                        Josh    PENDING&lt;br /&gt;
1206    02/16/03                        Josh    PENDING&lt;br /&gt;
1207    02/23/03                        Josh    PENDING&lt;br /&gt;
1208    03/02/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1209    03/09/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1210    03/16/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1211    03/23/03        02/27/05        Josh    DONE&lt;br /&gt;
1212    03/30/03        02/26/05        Josh    DONE&lt;br /&gt;
1213    04/06/03        03/12/05        Josh    RUNNING&lt;br /&gt;
1214    04/13/03        02/26/05        Josh    CLEANED&lt;br /&gt;
1215    04/20/03        02/27/05        Josh    CLEANED&lt;br /&gt;
1216    04/27/03        02/26/05        Josh    PENDING&lt;br /&gt;
1217    05/04/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1218    05/11/03        02/27/05        Josh    NEEDS CLEAN&lt;br /&gt;
1219    05/18/03        02/27/05        Josh    WORKING&lt;br /&gt;
1220    05/25/03        02/27/05        Josh    PENDING&lt;br /&gt;
1221    06/01/03                        Josh    PENDING&lt;br /&gt;
1222    06/08/03                        Josh    PENDING&lt;br /&gt;
1223    06/15/03                        Josh    PENDING&lt;br /&gt;
1224    06/22/03                        Josh    PENDING&lt;br /&gt;
1225    06/29/03        02/26/05        Josh    FAILED&lt;br /&gt;
1226    07/06/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1227    07/13/03        02/26/05        Josh    FAILED&lt;br /&gt;
1228    07/20/03                        Josh    FAILED&lt;br /&gt;
1229    07/27/03        11/10/05        Tom     Submitted  &lt;br /&gt;
1230    08/03/03                        Josh    FAILED&lt;br /&gt;
1231    08/10/03                        Josh    FAILED&lt;br /&gt;
1232    08/17/03        11/10/05        Tom     Submitted     &lt;br /&gt;
1233    08/24/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1234    08/31/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1235    09/07/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1236    09/14/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1237    09/21/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1238    09/28/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1239    10/05/03        02/26/05        Josh    WORKING&lt;br /&gt;
1240    10/12/03        02/26/05        Josh    PENDING&lt;br /&gt;
1241    10/19/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1242    10/26/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1243    11/02/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1244    11/09/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1245    11/16/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1246    11/23/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1247    11/30/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1248    12/07/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1249    12/14/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1250    12/21/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1251    12/28/03        02/26/05        Josh    WORKING&lt;br /&gt;
&lt;br /&gt;
***************************2004*********************************&lt;br /&gt;
&lt;br /&gt;
1253    01/11/04        02/26/05        Josh    PENDING&lt;br /&gt;
1254    01/18/04                        Josh    PENDING&lt;br /&gt;
1255    01/25/04                        Josh    NEEDS&lt;br /&gt;
1256    02/01/04                        Josh    PENDING&lt;br /&gt;
1257    02/08/04                        Josh    PENDING&lt;br /&gt;
1258    02/15/04                        Josh    PENDING&lt;br /&gt;
1259    02/22/04                        Josh    PENDING&lt;br /&gt;
1260    02/29/04                        Josh    PENDING&lt;br /&gt;
1261    03/07/04        02/26/05        Josh    PENDING&lt;br /&gt;
1262    03/14/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1263    03/21/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1264    03/28/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1265    04/04/04        02/26/05        Josh    FAILED  No files found, needs rerun&lt;br /&gt;
1266    04/11/04        02/26/05        Josh    FAILED  No files found, needs rerun&lt;br /&gt;
1267    04/18/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1268    04/25/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1269    05/02/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1270    05/09/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1271    05/16/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1272    05/23/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1273    05/30/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1274    06/06/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1275    06/13/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1276    06/20/04        02/27/05        Samik   DONE&lt;br /&gt;
1277    06/27/04        03/19/05        Jeff    NEEDS CLEAN&lt;br /&gt;
1278    07/04/04        03/19/05        Jeff    NEEDS CLEAN&lt;br /&gt;
1279    07/11/04        03/19/05        Jeff    NEEDS CLEAN&lt;br /&gt;
1280    07/18/04        03/19/05        Jeff    NEEDS CLEAN&lt;br /&gt;
1281    07/25/04        03/19/05        Jeff    NEEDS CLEAN&lt;br /&gt;
1282    08/01/04                        Josh    PENDING&lt;br /&gt;
1283    08/08/04                        Josh    PENDING&lt;br /&gt;
1284    08/15/04                        Josh    PENDING&lt;br /&gt;
1285    08/22/04                        Josh    PENDING&lt;br /&gt;
1286    08/29/04                        Josh    PENDING&lt;br /&gt;
1287    09/05/04                        Josh    PENDING&lt;br /&gt;
1288    09/12/04                        Josh    PENDING&lt;br /&gt;
1289    09/19/04        03/19/05        Jeff    ??&lt;br /&gt;
1290    09/26/04                        Josh    PENDING&lt;br /&gt;
1291    10/03/04                        Josh    PENDING&lt;br /&gt;
1292    10/10/04        03/19/05        Jeff    ??&lt;br /&gt;
1293    10/17/04                        Josh    PENDING&lt;br /&gt;
1294    10/24/04        03/19/05        Jeff    ??&lt;br /&gt;
1295    10/31/04        03/25/05        Tom     CLEAN&lt;br /&gt;
1296    11/07/04        04/07/05        Tom     Cleaning&lt;br /&gt;
1297    11/14/04        03/19/05        Jeff    ??&lt;br /&gt;
1298    11/21/04                                NEEDS CLEAN&lt;br /&gt;
1299    11/28/04                                NEEDS CLEAN&lt;br /&gt;
1300    12/05/04        08/16/05        Pravee  CLEAN&lt;br /&gt;
1301    12/12/04        08/19/05        Pravee  CLEAN&lt;br /&gt;
1302    12/19/04        08/19/05        Pravee  CLEAN&lt;br /&gt;
1303    12/26/04        08/26/05        Pravee  CLEAN&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Archaic (pre-1996)===&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
week #  start date      last processed  who     status&lt;br /&gt;
****************************************************************&lt;br /&gt;
At the moment, Jeff is dealing with these.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Jeff</name></author>	</entry>

	<entry>
		<id>http://gps.alaska.edu/internal/index.php?title=WHOS&amp;diff=37</id>
		<title>WHOS</title>
		<link rel="alternate" type="text/html" href="http://gps.alaska.edu/internal/index.php?title=WHOS&amp;diff=37"/>
				<updated>2006-05-19T20:47:25Z</updated>
		
		<summary type="html">&lt;p&gt;Jeff: /* Older Stuff */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The WHOS list tracks assigned GPS weeks for processing.&lt;br /&gt;
&lt;br /&gt;
More details about analysis at [[GPS analysis system]].&lt;br /&gt;
&lt;br /&gt;
===Recent weeks===&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
week #  start date      last processed  who     status&lt;br /&gt;
****************************************************************&lt;br /&gt;
1321    05/01/2005                      Jill&lt;br /&gt;
1322    05/08/2005                      Jill&lt;br /&gt;
1323    05/15/2005                      Jill&lt;br /&gt;
1324    05/22/2005                      Julie&lt;br /&gt;
1325    05/29/2005                      Sigrun   DONE &lt;br /&gt;
1326    06/05/2005                      Sigrun   DONE&lt;br /&gt;
1327    06/12/2005                      Sigrun&lt;br /&gt;
1328    06/19/2005                      Tom&lt;br /&gt;
1329    06/26/2005                      Tom&lt;br /&gt;
1330    07/03/2005      9/14/2005       Ryan    ak CLEAN pt file size&amp;lt;500&lt;br /&gt;
1331    07/10/2005      11/10/2005      Tom     Submitted&lt;br /&gt;
1332    07/17/2005                      Samik&lt;br /&gt;
1333    07/24/2005                      Julie&lt;br /&gt;
1334    07/31/2005                      Jill&lt;br /&gt;
1335    08/07/2005      9/12/2005       Tom     SUBMITTED&lt;br /&gt;
1336    08/14/2005                      Jill&lt;br /&gt;
1337    08/21/2005      10/17/2005      Ryan    ak CLEAN pt file size&amp;lt;500&lt;br /&gt;
1338    08/28/2005      10/17/2005      Ryan    ak CLEAN&lt;br /&gt;
1339    09/04/2005      11/08/2005      Ryan    working on it&lt;br /&gt;
1340    09/11/2005                      Julie&lt;br /&gt;
1341    09/18/2005                      Julie&lt;br /&gt;
1342    09/25/2005                      Tom&lt;br /&gt;
1343    10/02/2005                      Tom&lt;br /&gt;
1344    10/09/2005                      Tom&lt;br /&gt;
1345    10/16/2005      01/10/06        Ryan     Working on it&lt;br /&gt;
1346    10/23/2005                      (nobody)&lt;br /&gt;
&lt;br /&gt;
(skipping some weeks)&lt;br /&gt;
&lt;br /&gt;
1360     1/29/2006                      Sandeep&lt;br /&gt;
1361     2/05/2006                      Trivikram&lt;br /&gt;
1362     2/12/2006                      Venkat&lt;br /&gt;
&lt;br /&gt;
(skipping some weeks)&lt;br /&gt;
&lt;br /&gt;
1374     5/07/2006                      (nobody)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===2005===&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
week #  start date      last processed  who     status&lt;br /&gt;
****************************************************************&lt;br /&gt;
1304    01/02/2005      09/16/2005      Ryan    CLEAN&lt;br /&gt;
1305    01/09/2005      08/30/2005      Pravee  CLEAN&lt;br /&gt;
1306    01/16/2005      07/15/2005      Jill    WORKING&lt;br /&gt;
1307    01/23/2005      08/31/2005      Pravee  CLEAN&lt;br /&gt;
1308    01/30/2005      04/05/2005      Jill    CLEAN&lt;br /&gt;
1309    02/06/2005      08/31/2005      Pravee  CLEAN&lt;br /&gt;
1310    02/13/2005      03/12/2005      Julie   CLEAN&lt;br /&gt;
1311    02/20/2005      03/12/2005      Pravee  CLEANING&lt;br /&gt;
1312    02/27/2005      03/31/2005      Tom     CLEAN&lt;br /&gt;
1313    03/06/2005      03/31/2005      Pravee  CLEANING&lt;br /&gt;
1314    03/13/2005      03/31/2005      Tom     CLEAN&lt;br /&gt;
1315    03/20/2005      06/13/2005      Samik   CLEAN pt file size&amp;lt;500&lt;br /&gt;
1316    03/27/2005      06/13/2005      Samik   CLEAN pt file size&amp;lt;400&lt;br /&gt;
1317    04/03/2005      06/16/2005      Samik   CLEAN pt file size&amp;lt;400&lt;br /&gt;
1318    04/10/2005      9/15/2005       Ryan    not sure what's going on?&lt;br /&gt;
1319    04/17/2005      10/4/2005       Ryan    ak SUBMITTED&lt;br /&gt;
1320    04/24/2005                      Jill&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Older Stuff===&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
week #  start date      last processed  who     status&lt;br /&gt;
****************************************************************&lt;br /&gt;
****************************************************************&lt;br /&gt;
***************************2002*********************************&lt;br /&gt;
**************************EARTHQUAKE****************************&lt;br /&gt;
1189    10/20/02        02/26/05        Josh    PENDING&lt;br /&gt;
1190    10/27/02                        Josh    PENDING&lt;br /&gt;
1191    11/03/02        03/04/05        Josh    DONE&lt;br /&gt;
1192    11/10/02        03/11/05        Josh    DONE&lt;br /&gt;
1193    11/17/02        03/11/05        Josh    DONE&lt;br /&gt;
1194    11/24/02        03/11/05        Josh    DONE&lt;br /&gt;
1195    12/01/02        03/11/05        Josh    RUNNING&lt;br /&gt;
1196    12/08/02        03/11/05        Josh    DONE&lt;br /&gt;
1197    12/15/02        03/11/05        Josh    DONE&lt;br /&gt;
1198    12/22/02        03/12/05        Josh    DONE&lt;br /&gt;
1199    12/29/02        03/12/05        Josh    DONE&lt;br /&gt;
&lt;br /&gt;
*************************2003***********************************&lt;br /&gt;
&lt;br /&gt;
1200    01/05/03        03/12/05        Josh    DONE&lt;br /&gt;
1201    01/12/03        03/12/05        Josh    DONE&lt;br /&gt;
1202    01/19/03        03/15/05        Josh    RUNNING&lt;br /&gt;
1203    01/26/03        03/14/05        Josh    DONE&lt;br /&gt;
1204    02/02/03        03/14/05        Josh    DONE&lt;br /&gt;
1205    02/09/03                        Josh    PENDING&lt;br /&gt;
1206    02/16/03                        Josh    PENDING&lt;br /&gt;
1207    02/23/03                        Josh    PENDING&lt;br /&gt;
1208    03/02/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1209    03/09/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1210    03/16/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1211    03/23/03        02/27/05        Josh    DONE&lt;br /&gt;
1212    03/30/03        02/26/05        Josh    DONE&lt;br /&gt;
1213    04/06/03        03/12/05        Josh    RUNNING&lt;br /&gt;
1214    04/13/03        02/26/05        Josh    CLEANED&lt;br /&gt;
1215    04/20/03        02/27/05        Josh    CLEANED&lt;br /&gt;
1216    04/27/03        02/26/05        Josh    PENDING&lt;br /&gt;
1217    05/04/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1218    05/11/03        02/27/05        Josh    NEEDS CLEAN&lt;br /&gt;
1219    05/18/03        02/27/05        Josh    WORKING&lt;br /&gt;
1220    05/25/03        02/27/05        Josh    PENDING&lt;br /&gt;
1221    06/01/03                        Josh    PENDING&lt;br /&gt;
1222    06/08/03                        Josh    PENDING&lt;br /&gt;
1223    06/15/03                        Josh    PENDING&lt;br /&gt;
1224    06/22/03                        Josh    PENDING&lt;br /&gt;
1225    06/29/03        02/26/05        Josh    FAILED&lt;br /&gt;
1226    07/06/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1227    07/13/03        02/26/05        Josh    FAILED&lt;br /&gt;
1228    07/20/03                        Josh    FAILED&lt;br /&gt;
1229    07/27/03        11/10/05        Tom     Submitted  &lt;br /&gt;
1230    08/03/03                        Josh    FAILED&lt;br /&gt;
1231    08/10/03                        Josh    FAILED&lt;br /&gt;
1232    08/17/03        11/10/05        Tom     Submitted     &lt;br /&gt;
1233    08/24/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1234    08/31/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1235    09/07/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1236    09/14/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1237    09/21/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1238    09/28/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1239    10/05/03        02/26/05        Josh    WORKING&lt;br /&gt;
1240    10/12/03        02/26/05        Josh    PENDING&lt;br /&gt;
1241    10/19/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1242    10/26/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1243    11/02/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1244    11/09/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1245    11/16/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1246    11/23/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1247    11/30/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1248    12/07/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1249    12/14/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1250    12/21/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1251    12/28/03        02/26/05        Josh    WORKING&lt;br /&gt;
&lt;br /&gt;
***************************2004*********************************&lt;br /&gt;
&lt;br /&gt;
1253    01/11/04        02/26/05        Josh    PENDING&lt;br /&gt;
1254    01/18/04                        Josh    PENDING&lt;br /&gt;
1255    01/25/04                        Josh    NEEDS&lt;br /&gt;
1256    02/01/04                        Josh    PENDING&lt;br /&gt;
1257    02/08/04                        Josh    PENDING&lt;br /&gt;
1258    02/15/04                        Josh    PENDING&lt;br /&gt;
1259    02/22/04                        Josh    PENDING&lt;br /&gt;
1260    02/29/04                        Josh    PENDING&lt;br /&gt;
1261    03/07/04        02/26/05        Josh    PENDING&lt;br /&gt;
1262    03/14/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1263    03/21/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1264    03/28/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1265    04/04/04        02/26/05        Josh    FAILED  No files found, needs rerun&lt;br /&gt;
1266    04/11/04        02/26/05        Josh    FAILED  No files found, needs rerun&lt;br /&gt;
1267    04/18/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1268    04/25/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1269    05/02/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1270    05/09/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1271    05/16/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1272    05/23/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1273    05/30/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1274    06/06/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1275    06/13/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1276    06/20/04        02/27/05        Samik   DONE&lt;br /&gt;
1277    06/27/04        03/19/05        Jeff    NEEDS CLEAN&lt;br /&gt;
1278    07/04/04        03/19/05        Jeff    NEEDS CLEAN&lt;br /&gt;
1279    07/11/04        03/19/05        Jeff    NEEDS CLEAN&lt;br /&gt;
1280    07/18/04        03/19/05        Jeff    NEEDS CLEAN&lt;br /&gt;
1281    07/25/04        03/19/05        Jeff    NEEDS CLEAN&lt;br /&gt;
1282    08/01/04                        Josh    PENDING&lt;br /&gt;
1283    08/08/04                        Josh    PENDING&lt;br /&gt;
1284    08/15/04                        Josh    PENDING&lt;br /&gt;
1285    08/22/04                        Josh    PENDING&lt;br /&gt;
1286    08/29/04                        Josh    PENDING&lt;br /&gt;
1287    09/05/04                        Josh    PENDING&lt;br /&gt;
1288    09/12/04                        Josh    PENDING&lt;br /&gt;
1289    09/19/04        03/19/05        Jeff    ??&lt;br /&gt;
1290    09/26/04                        Josh    PENDING&lt;br /&gt;
1291    10/03/04                        Josh    PENDING&lt;br /&gt;
1292    10/10/04        03/19/05        Jeff    ??&lt;br /&gt;
1293    10/17/04                        Josh    PENDING&lt;br /&gt;
1294    10/24/04        03/19/05        Jeff    ??&lt;br /&gt;
1295    10/31/04        03/25/05        Tom     CLEAN&lt;br /&gt;
1296    11/07/04        04/07/05        Tom     Cleaning&lt;br /&gt;
1297    11/14/04        03/19/05        Jeff    ??&lt;br /&gt;
1298    11/21/04                                NEEDS CLEAN&lt;br /&gt;
1299    11/28/04                                NEEDS CLEAN&lt;br /&gt;
1300    12/05/04        08/16/05        Pravee  CLEAN&lt;br /&gt;
1301    12/12/04        08/19/05        Pravee  CLEAN&lt;br /&gt;
1302    12/19/04        08/19/05        Pravee  CLEAN&lt;br /&gt;
1303    12/26/04        08/26/05        Pravee  CLEAN&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Archaic (pre-1996)===&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
week #  start date      last processed  who     status&lt;br /&gt;
****************************************************************&lt;br /&gt;
At the moment, Jeff is dealing with these.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Jeff</name></author>	</entry>

	<entry>
		<id>http://gps.alaska.edu/internal/index.php?title=WHOS&amp;diff=36</id>
		<title>WHOS</title>
		<link rel="alternate" type="text/html" href="http://gps.alaska.edu/internal/index.php?title=WHOS&amp;diff=36"/>
				<updated>2006-05-19T20:41:37Z</updated>
		
		<summary type="html">&lt;p&gt;Jeff: /* Recent weeks */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The WHOS list tracks assigned GPS weeks for processing.&lt;br /&gt;
&lt;br /&gt;
More details about analysis at [[GPS analysis system]].&lt;br /&gt;
&lt;br /&gt;
===Recent weeks===&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
week #  start date      last processed  who     status&lt;br /&gt;
****************************************************************&lt;br /&gt;
1321    05/01/2005                      Jill&lt;br /&gt;
1322    05/08/2005                      Jill&lt;br /&gt;
1323    05/15/2005                      Jill&lt;br /&gt;
1324    05/22/2005                      Julie&lt;br /&gt;
1325    05/29/2005                      Sigrun   DONE &lt;br /&gt;
1326    06/05/2005                      Sigrun   DONE&lt;br /&gt;
1327    06/12/2005                      Sigrun&lt;br /&gt;
1328    06/19/2005                      Tom&lt;br /&gt;
1329    06/26/2005                      Tom&lt;br /&gt;
1330    07/03/2005      9/14/2005       Ryan    ak CLEAN pt file size&amp;lt;500&lt;br /&gt;
1331    07/10/2005      11/10/2005      Tom     Submitted&lt;br /&gt;
1332    07/17/2005                      Samik&lt;br /&gt;
1333    07/24/2005                      Julie&lt;br /&gt;
1334    07/31/2005                      Jill&lt;br /&gt;
1335    08/07/2005      9/12/2005       Tom     SUBMITTED&lt;br /&gt;
1336    08/14/2005                      Jill&lt;br /&gt;
1337    08/21/2005      10/17/2005      Ryan    ak CLEAN pt file size&amp;lt;500&lt;br /&gt;
1338    08/28/2005      10/17/2005      Ryan    ak CLEAN&lt;br /&gt;
1339    09/04/2005      11/08/2005      Ryan    working on it&lt;br /&gt;
1340    09/11/2005                      Julie&lt;br /&gt;
1341    09/18/2005                      Julie&lt;br /&gt;
1342    09/25/2005                      Tom&lt;br /&gt;
1343    10/02/2005                      Tom&lt;br /&gt;
1344    10/09/2005                      Tom&lt;br /&gt;
1345    10/16/2005      01/10/06        Ryan     Working on it&lt;br /&gt;
1346    10/23/2005                      (nobody)&lt;br /&gt;
&lt;br /&gt;
(skipping some weeks)&lt;br /&gt;
&lt;br /&gt;
1360     1/29/2006                      Sandeep&lt;br /&gt;
1361     2/05/2006                      Trivikram&lt;br /&gt;
1362     2/12/2006                      Venkat&lt;br /&gt;
&lt;br /&gt;
(skipping some weeks)&lt;br /&gt;
&lt;br /&gt;
1374    05/07/2006                      (nobody)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===2005===&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
week #  start date      last processed  who     status&lt;br /&gt;
****************************************************************&lt;br /&gt;
1304    01/02/2005      09/16/2005      Ryan    CLEAN&lt;br /&gt;
1305    01/09/2005      08/30/2005      Pravee  CLEAN&lt;br /&gt;
1306    01/16/2005      07/15/2005      Jill    WORKING&lt;br /&gt;
1307    01/23/2005      08/31/2005      Pravee  CLEAN&lt;br /&gt;
1308    01/30/2005      04/05/2005      Jill    CLEAN&lt;br /&gt;
1309    02/06/2005      08/31/2005      Pravee  CLEAN&lt;br /&gt;
1310    02/13/2005      03/12/2005      Julie   CLEAN&lt;br /&gt;
1311    02/20/2005      03/12/2005      Pravee  CLEANING&lt;br /&gt;
1312    02/27/2005      03/31/2005      Tom     CLEAN&lt;br /&gt;
1313    03/06/2005      03/31/2005      Pravee  CLEANING&lt;br /&gt;
1314    03/13/2005      03/31/2005      Tom     CLEAN&lt;br /&gt;
1315    03/20/2005      06/13/2005      Samik   CLEAN pt file size&amp;lt;500&lt;br /&gt;
1316    03/27/2005      06/13/2005      Samik   CLEAN pt file size&amp;lt;400&lt;br /&gt;
1317    04/03/2005      06/16/2005      Samik   CLEAN pt file size&amp;lt;400&lt;br /&gt;
1318    04/10/2005      9/15/2005       Ryan    not sure what's going on?&lt;br /&gt;
1319    04/17/2005      10/4/2005       Ryan    ak SUBMITTED&lt;br /&gt;
1320    04/24/2005                      Jill&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Older Stuff===&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
week #  start date      last processed  who     status&lt;br /&gt;
****************************************************************&lt;br /&gt;
****************************************************************&lt;br /&gt;
***************************2002*********************************&lt;br /&gt;
**************************EARTHQUAKE****************************&lt;br /&gt;
1189    10/20/02        02/26/05        Josh    PENDING&lt;br /&gt;
1190    10/27/02                        Josh    PENDING&lt;br /&gt;
1191    11/03/02        03/04/05        Josh    DONE&lt;br /&gt;
1192    11/10/02        03/11/05        Josh    DONE&lt;br /&gt;
1193    11/17/02        03/11/05        Josh    DONE&lt;br /&gt;
1194    11/24/02        03/11/05        Josh    DONE&lt;br /&gt;
1195    12/01/02        03/11/05        Josh    RUNNING&lt;br /&gt;
1196    12/08/02        03/11/05        Josh    DONE&lt;br /&gt;
1197    12/15/02        03/11/05        Josh    DONE&lt;br /&gt;
1198    12/22/02        03/12/05        Josh    DONE&lt;br /&gt;
1199    12/29/02        03/12/05        Josh    DONE&lt;br /&gt;
&lt;br /&gt;
*************************2003***********************************&lt;br /&gt;
&lt;br /&gt;
1200    01/05/03        03/12/05        Josh    DONE&lt;br /&gt;
1201    01/12/03        03/12/05        Josh    DONE&lt;br /&gt;
1202    01/19/03        03/15/05        Josh    RUNNING&lt;br /&gt;
1203    01/26/03        03/14/05        Josh    DONE&lt;br /&gt;
1204    02/02/03        03/14/05        Josh    DONE&lt;br /&gt;
1205    02/09/03                        Josh    PENDING&lt;br /&gt;
1206    02/16/03                        Josh    PENDING&lt;br /&gt;
1207    02/23/03                        Josh    PENDING&lt;br /&gt;
1208    03/02/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1209    03/09/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1210    03/16/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1211    03/23/03        02/27/05        Josh    CLEANING&lt;br /&gt;
1212    03/30/03        02/26/05        Josh    DONE&lt;br /&gt;
1213    04/06/03        03/12/05        Josh    RUNNING&lt;br /&gt;
1214    04/13/03        02/26/05        Josh    CLEANED&lt;br /&gt;
1215    04/20/03        02/27/05        Josh    CLEANED&lt;br /&gt;
1216    04/27/03        02/26/05        Josh    PENDING&lt;br /&gt;
1217    05/04/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1218    05/11/03        02/27/05        Josh    NEEDS CLEAN&lt;br /&gt;
1219    05/18/03        02/27/05        Josh    WORKING&lt;br /&gt;
1220    05/25/03        02/27/05        Josh    PENDING&lt;br /&gt;
1221    06/01/03                        Josh    PENDING&lt;br /&gt;
1222    06/08/03                        Josh    PENDING&lt;br /&gt;
1223    06/15/03                        Josh    PENDING&lt;br /&gt;
1224    06/22/03                        Josh    PENDING&lt;br /&gt;
1225    06/29/03        02/26/05        Josh    FAILED&lt;br /&gt;
1226    07/06/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1227    07/13/03        02/26/05        Josh    FAILED&lt;br /&gt;
1228    07/20/03                        Josh    FAILED&lt;br /&gt;
1229    07/27/03        11/10/05        Tom     Submitted  &lt;br /&gt;
1230    08/03/03                        Josh    FAILED&lt;br /&gt;
1231    08/10/03                        Josh    FAILED&lt;br /&gt;
1232    08/17/03        11/10/05        Tom     Submitted     &lt;br /&gt;
1233    08/24/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1234    08/31/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1235    09/07/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1236    09/14/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1237    09/21/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1238    09/28/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1239    10/05/03        02/26/05        Josh    WORKING&lt;br /&gt;
1240    10/12/03        02/26/05        Josh    PENDING&lt;br /&gt;
1241    10/19/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1242    10/26/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1243    11/02/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1244    11/09/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1245    11/16/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1246    11/23/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1247    11/30/03        02/26/05        Josh    ALMOST CLEAN&lt;br /&gt;
1248    12/07/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1249    12/14/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1250    12/21/03        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1251    12/28/03        02/26/05        Josh    WORKING&lt;br /&gt;
&lt;br /&gt;
***************************2004*********************************&lt;br /&gt;
&lt;br /&gt;
1253    01/11/04        02/26/05        Josh    PENDING&lt;br /&gt;
1254    01/18/04                        Josh    PENDING&lt;br /&gt;
1255    01/25/04                        Josh    NEEDS CLEAN&lt;br /&gt;
1256    02/01/04                        Josh    PENDING&lt;br /&gt;
1257    02/08/04                        Josh    PENDING&lt;br /&gt;
1258    02/15/04                        Josh    PENDING&lt;br /&gt;
1259    02/22/04                        Josh    PENDING&lt;br /&gt;
1260    02/29/04                        Josh    PENDING&lt;br /&gt;
1261    03/07/04        02/26/05        Josh    PENDING&lt;br /&gt;
1262    03/14/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1263    03/21/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1264    03/28/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1265    04/04/04        02/26/05        Josh    FAILED  No files found, needs rerun&lt;br /&gt;
1266    04/11/04        02/26/05        Josh    FAILED  No files found, needs rerun&lt;br /&gt;
1267    04/18/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1268    04/25/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1269    05/02/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1270    05/09/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1271    05/16/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1272    05/23/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1273    05/30/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1274    06/06/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1275    06/13/04        02/26/05        Josh    NEEDS CLEAN&lt;br /&gt;
1276    06/20/04        02/27/05        Samik   DONE&lt;br /&gt;
1277    06/27/04        03/19/05        Jeff    NEEDS CLEAN&lt;br /&gt;
1278    07/04/04        03/19/05        Jeff    NEEDS CLEAN&lt;br /&gt;
1279    07/11/04        03/19/05        Jeff    NEEDS CLEAN&lt;br /&gt;
1280    07/18/04        03/19/05        Jeff    NEEDS CLEAN&lt;br /&gt;
1281    07/25/04        03/19/05        Jeff    NEEDS CLEAN&lt;br /&gt;
1282    08/01/04                        Josh    PENDING&lt;br /&gt;
1283    08/08/04                        Josh    PENDING&lt;br /&gt;
1284    08/15/04                        Josh    PENDING&lt;br /&gt;
1285    08/22/04                        Josh    PENDING&lt;br /&gt;
1286    08/29/04                        Josh    PENDING&lt;br /&gt;
1287    09/05/04                        Josh    PENDING&lt;br /&gt;
1288    09/12/04                        Josh    PENDING&lt;br /&gt;
1289    09/19/04        03/19/05        Jeff    ??&lt;br /&gt;
1290    09/26/04                        Josh    PENDING&lt;br /&gt;
1291    10/03/04                        Josh    PENDING&lt;br /&gt;
1292    10/10/04        03/19/05        Jeff    ??&lt;br /&gt;
1293    10/17/04                        Josh    PENDING&lt;br /&gt;
1294    10/24/04        03/19/05        Jeff    ??&lt;br /&gt;
1295    10/31/04        03/25/05        Tom     CLEAN&lt;br /&gt;
1296    11/07/04        04/07/05        Tom     Cleaning&lt;br /&gt;
1297    11/14/04        03/19/05        Jeff    ??&lt;br /&gt;
1298    11/21/04                                NEEDS CLEAN&lt;br /&gt;
1299    11/28/04                                NEEDS CLEAN&lt;br /&gt;
1300    12/05/04        08/16/05        Pravee  CLEAN&lt;br /&gt;
1301    12/12/04        08/19/05        Pravee  CLEAN&lt;br /&gt;
1302    12/19/04        08/19/05        Pravee  CLEAN&lt;br /&gt;
1303    12/26/04        08/26/05        Pravee  CLEAN&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===Archaic (pre-1996)===&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
week #  start date      last processed  who     status&lt;br /&gt;
****************************************************************&lt;br /&gt;
At the moment, Jeff is dealing with these.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Jeff</name></author>	</entry>

	<entry>
		<id>http://gps.alaska.edu/internal/index.php?title=RINEXing&amp;diff=1566</id>
		<title>RINEXing</title>
		<link rel="alternate" type="text/html" href="http://gps.alaska.edu/internal/index.php?title=RINEXing&amp;diff=1566"/>
				<updated>2005-10-21T04:58:05Z</updated>
		
		<summary type="html">&lt;p&gt;Jeff: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The program we now use to convert most files to RINEX is called teqc, which is written and maintained by Lou Estey at UNAVCO. Teqc can handle data from many different receiver types. This page describes how to convert the data from Trimble receivers into RINEX format.&lt;br /&gt;
&lt;br /&gt;
===Converting Trimble .t00/.t01 files to .dat===&lt;br /&gt;
Normally you don't have to do this. The exception is the data from the Net-RS receiver, which under some circumstances we will only get in .t01 format. Or if you copy .t00 files straight off a flash card.&lt;br /&gt;
&lt;br /&gt;
====.t00 files====&lt;br /&gt;
There are two ways to convert these files to .dat format. You can do it on a Windows PC by right-clicking on the file and selecting &amp;quot;Convert to DAT format&amp;quot; from the contextual menu. Or you can convert the files on our Linux system using:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/usr/local/UNAVCO/bin/runpkr00 -d file.t00&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This will create the file &amp;lt;code&amp;gt;file.dat&amp;lt;/code&amp;gt;.&lt;br /&gt;
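If there are many .t00 files, the conversion can be scripted. A minimal sketch, assuming the runpkr00 path given above (the RUNPKR00 variable is just a convenience for this example):

```shell
# Batch-convert every .t00 file in the current directory to .dat.
# RUNPKR00 defaults to the installed path mentioned above; override it
# if runpkr00 lives elsewhere on your system.
RUNPKR00=${RUNPKR00:-/usr/local/UNAVCO/bin/runpkr00}
for f in *.t00; do
  [ -e "$f" ] || continue   # skip the literal pattern when nothing matches
  "$RUNPKR00" -d "$f"
done
```

Each converted file keeps its base name, as with the single-file command above.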
&lt;br /&gt;
====.t01 files====&lt;br /&gt;
At the moment, these files can only be converted on a Windows PC that has the latest version of Trimble Data Transfer installed. Right-click on the file and choose &amp;quot;Convert to DAT format&amp;quot; from the contextual menu.&lt;br /&gt;
&lt;br /&gt;
===Converting dat files to RINEX===&lt;br /&gt;
You can convert a single file to RINEX using teqc. We have a script that will create RINEX files and also add appropriate information to the RINEX headers. The easiest way to convert multiple files is to make up a script and then execute it.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/bin/ls *.dat | awk '{print &amp;quot;newrinex_teqc -5700 tr_d &amp;quot; $0}' &amp;gt; rinexem&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
There are some options you can add to this:&lt;br /&gt;
* The -5700 flag really means to use the Zephyr Geodetic antenna; you can use it for the Net-RS as well&lt;br /&gt;
* For data from a Trimble 4000 receiver, use -4000 instead.&lt;br /&gt;
* You can add the operator name using -oper Name (no spaces)&lt;br /&gt;
* If the data are from a non-UAF group, add -agency Agency&lt;br /&gt;
* For 4000 receivers, add -slant 0.2334 -0.0591 to convert from slant to vertical height&lt;br /&gt;
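Putting several of these options together for Trimble 4000 data, the command list might be generated like this (the operator and agency names are placeholders, not real values from our processing):

```shell
# Build a rinexem script for Trimble 4000 data, combining the -4000,
# -oper, -agency and -slant options from the list above.
# "YourName" and "UAF" are example values only.
/bin/ls *.dat | awk '{print "newrinex_teqc -4000 -oper YourName -agency UAF -slant 0.2334 -0.0591 tr_d " $0}' > rinexem
```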
&lt;br /&gt;
Then execute the script:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
sh rinexem&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
There should be one RINEX observation file (&amp;lt;code&amp;gt;*.YYo&amp;lt;/code&amp;gt;, YY = last 2 digits of the year) and one RINEX navigation file (&amp;lt;code&amp;gt;*.YYn&amp;lt;/code&amp;gt;) for each .dat file.&lt;br /&gt;
&lt;br /&gt;
To make a fully complete RINEX file, you would have to edit the file in a text editor and fill in any header items that are blank. That is important for files that we might send to someone else, but it doesn't matter for our own processing.&lt;br /&gt;
&lt;br /&gt;
The final step is to compress the files using gzip.&lt;br /&gt;
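As a sketch, the compression step for a directory of 2005 RINEX files could look like this (the 05 year suffix is only an example):

```shell
# Compress all RINEX observation (.05o) and navigation (.05n) files.
for f in *.05o *.05n; do
  if [ -e "$f" ]; then gzip "$f"; fi
done
```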
&lt;br /&gt;
===Splitting multi-day dat files===&lt;br /&gt;
Normally we set our receivers to start a new file every day at UTC midnight and track for 24 hours. Every so often we get a data file that has several days all in one file, because the receiver was not programmed correctly. Most often this happens with borrowed receivers that we forget to reprogram.&lt;br /&gt;
&lt;br /&gt;
If the file contains just a few days of data, you can create a RINEX file as above, then use &amp;lt;code&amp;gt;rinexwin&amp;lt;/code&amp;gt; to split it into daily files. For example,&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
rinexwin warr2061.05o -ge &amp;quot;25-jul-2005 00:00&amp;quot; -lt &amp;quot;26-jul-2005 00:00&amp;quot; &amp;gt; ../warr2060.05o&lt;br /&gt;
rinexwin warr2061.05o -ge &amp;quot;26-jul-2005 00:00&amp;quot; -lt &amp;quot;27-jul-2005 00:00&amp;quot; &amp;gt; ../warr2070.05o&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
However, once a file spans a long enough time it requires some special handling. For files spanning several days you can generally add the +smtt flag to the teqc command (at the moment you have to edit scripte or run it manually to do this). For longer files you need to split the dat file up into daily dat files. The reason for this has to do with the convention typically used in RINEX for handling the 1 millisecond clock offsets in the Trimble and other receivers. These files require some manual fiddling.&lt;br /&gt;
&lt;br /&gt;
There is a 4 step process:&lt;br /&gt;
1. Use teqc to create a diagnostic file that contains information about the time and byte position of each data record in the file.&lt;br /&gt;
2. Run a perl program that processes this file to make a list of epochs around UTC midnight along with their byte locations.&lt;br /&gt;
3. Edit this file, or a copy, to keep only those lines for the epochs at midnight.&lt;br /&gt;
4. Use a simple awk command to convert this file into a script to run the program chop, which chops bytes out of a file.&lt;br /&gt;
5. Edit this script to make a final simple change, and run it. OK, that is 5 steps.&lt;br /&gt;
&lt;br /&gt;
All of these steps are needed because of the millisecond offsets in the data, which make it difficult, for a long file, to predict exactly which observation is the one at UTC midnight. Doing that would require figuring out which way the clock is drifting; that would not be too hard, but it is too much programming hassle.&lt;br /&gt;
&lt;br /&gt;
This was developed before the teqc +smtt option, and I believe that by using teqc +smtt it should be possible to create a single perl program that would do all of the above steps in one.&lt;br /&gt;
&lt;br /&gt;
For example, suppose the file tk5e1440.dat contains a couple of months of data. The sequence of commands to split the file is given below.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
teqc +diag -O.obs - tk5e1440.dat &amp;gt;&amp;amp; tk5e1440.dat.diag&lt;br /&gt;
~jeff/bin/split_big_dat tk5e1440.dat.diag &amp;gt; rectimes&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The file tk5e1440.dat.diag has the RINEX file lines with the observation times, plus a bunch of lines that refer to Trimble download frames. There is one of these with type=17 for each data record.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
 05  5 24  2 41 15.0000000  0  4G 4G13G16G23&lt;br /&gt;
Trimble download frame t @ 0o00004373 = 0x000008fb = 00002299   type= 0x15 = 21&lt;br /&gt;
Trimble download frame t @ 0o00004671 = 0x000009b9 = 00002489   type= 0x15 = 21&lt;br /&gt;
Trimble download frame t @ 0o00005167 = 0x00000a77 = 00002679   type= 0x15 = 21&lt;br /&gt;
Trimble download frame t @ 0o00005465 = 0x00000b35 = 00002869   type= 0x11 = 17&lt;br /&gt;
 05  5 24  2 41 30.0000000  0  4G 4G13G16G23&lt;br /&gt;
Trimble download frame t @ 0o00006152 = 0x00000c6a = 00003178   type= 0x15 = 21&lt;br /&gt;
Trimble download frame t @ 0o00006450 = 0x00000d28 = 00003368   type= 0x15 = 21&lt;br /&gt;
Trimble download frame t @ 0o00006746 = 0x00000de6 = 00003558   type= 0x15 = 21&lt;br /&gt;
Trimble download frame t @ 0o00007244 = 0x00000ea4 = 00003748   type= 0x15 = 21&lt;br /&gt;
Trimble download frame t @ 0o00007542 = 0x00000f62 = 00003938   type= 0x11 = 17&lt;br /&gt;
 05  5 24  2 41 45.0000000  0  8G 2G 4G 8G10G13G16G23G20&lt;br /&gt;
Trimble download frame t @ 0o00010667 = 0x000011b7 = 00004535   type= 0x11 = 17&lt;br /&gt;
 05  5 24  2 42  0.0000000  0  8G 2G 4G 8G10G13G16G23G20&lt;br /&gt;
Trimble download frame t @ 0o00011764 = 0x000013f4 = 00005108   type= 0x15 = 21&lt;br /&gt;
Trimble download frame t @ 0o00012262 = 0x000014b2 = 00005298   type= 0x11 = 17&lt;br /&gt;
 05  5 24  2 42 15.0000000  0  8G 2G 4G 8G10G13G16G23G20&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The split_big_dat program is written in perl; it processes this file and boils it down to the essentials, one line per data record, giving the byte number and the observation time:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
5097 at byte number 03354160 : 05 5 24 23 55 14.9710000&lt;br /&gt;
5098 at byte number 03354829 : 05 5 24 23 55 29.9710000&lt;br /&gt;
5099 at byte number 03355498 : 05 5 24 23 55 44.9710000&lt;br /&gt;
5100 at byte number 03356167 : 05 5 24 23 55 59.9710000&lt;br /&gt;
5101 at byte number 03356836 : 05 5 24 23 56 14.9710000&lt;br /&gt;
5102 at byte number 03357505 : 05 5 24 23 56 29.9710000&lt;br /&gt;
5103 at byte number 03358174 : 05 5 24 23 56 44.9710000&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
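The reduction that split_big_dat performs can be imitated, very roughly, with awk. This is only a sketch of the idea, not the real program; it assumes each type=17 frame line carries the decimal byte number in its tenth field and pairs it with the observation line that follows:

```shell
# Sample lines from the .diag file shown above, piped through a rough
# awk imitation of the reduction: remember the decimal byte number of
# each type=17 frame, then print it with the next observation epoch.
printf '%s\n' \
  ' 05  5 24  2 41 15.0000000  0  4G 4G13G16G23' \
  'Trimble download frame t @ 0o00005465 = 0x00000b35 = 00002869   type= 0x11 = 17' \
  ' 05  5 24  2 41 30.0000000  0  4G 4G13G16G23' |
awk '/type= 0x11 = 17/ { byte = $10 }
     /^ [0-9][0-9] / { if (byte != "") printf "record at byte %s : %s\n", byte, $0 }'
# prints one line pairing byte 00002869 with the 2:41:30 epoch
```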
&lt;br /&gt;
Now the trick is to edit this file and keep only the lines that correspond to the day boundaries. Be careful about the millisecond offsets. The split_big_dat program only prints out the lines that are near to midnight, so it will be pretty easy to do this. A trick is that once you have found the first one, you can delete 43 lines and the line after that will be the next one. The first few lines of the edited file look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
5116 at byte number 03366871 : 05 5 24 23 59 59.9710000&lt;br /&gt;
10876 at byte number 07145513 : 05 5 25 23 59 59.9390000&lt;br /&gt;
16636 at byte number 10925649 : 05 5 26 23 59 59.9060000&lt;br /&gt;
22396 at byte number 14704101 : 05 5 27 23 59 59.8720000&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In this file, every few days there was some problem with the data point at 0000 UTC, so I picked the one at 0015 for those days. This example file had data from 43 different days (42 lines in the file).&lt;br /&gt;
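When no epochs are missing, the delete-43-lines trick is the same as keeping every 44th line, which awk can do directly. A sketch only; it will not handle the problem days described above, where the 0015 epoch has to be picked by hand:

```shell
# Keeping line 1 and then every 44th line is equivalent to the
# "delete 43 lines" editing trick. Demonstrated on numbered stand-in
# lines; on the real data you would run it on the rectimes file.
seq 1 100 | awk 'NR % 44 == 1'
# prints 1, 45 and 89
```

On the real file that would be awk 'NR % 44 == 1' rectimes, with the output saved as the edited list.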
&lt;br /&gt;
Next we make up a script that will run a program called chop over and over again to chop out pieces of the file. Chop needs to know the byte numbers of the start and end of each piece.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
mkdir split&lt;br /&gt;
awk 'BEGIN {last = 0} {print &amp;quot;~jeff/bin/chop +&amp;quot; last &amp;quot; -&amp;quot; $5 &amp;quot; tk5e1440.dat &amp;gt; split/tk5exxx0.dat&amp;quot; ; last = $5}' rectimes.edited &amp;gt; splits&lt;br /&gt;
&lt;br /&gt;
more splits&lt;br /&gt;
~jeff/bin/chop +0 -03366871  tk5e1440.dat &amp;gt; split/tk5exxx0.dat&lt;br /&gt;
~jeff/bin/chop +03366871 -07145513  tk5e1440.dat &amp;gt; split/tk5exxx0.dat&lt;br /&gt;
~jeff/bin/chop +07145513 -10925649  tk5e1440.dat &amp;gt; split/tk5exxx0.dat&lt;br /&gt;
~jeff/bin/chop +10925649 -14704101  tk5e1440.dat &amp;gt; split/tk5exxx0.dat&lt;br /&gt;
~jeff/bin/chop +14704101 -18486658  tk5e1440.dat &amp;gt; split/tk5exxx0.dat&lt;br /&gt;
~jeff/bin/chop +18486658 -22266015  tk5e1440.dat &amp;gt; split/tk5exxx0.dat&lt;br /&gt;
~jeff/bin/chop +22266015 -26042019  tk5e1440.dat &amp;gt; split/tk5exxx0.dat&lt;br /&gt;
~jeff/bin/chop +26042019 -29821755  tk5e1440.dat &amp;gt; split/tk5exxx0.dat&lt;br /&gt;
~jeff/bin/chop +29821755 -33716237  tk5e1440.dat &amp;gt; split/tk5exxx0.dat&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The last step is to edit the splits file and put in the correct day number for each of the output RINEX files. You have to add a final line for the last day by hand. It should look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
~jeff/bin/chop +157859599 -161760773  tk5e1440.dat &amp;gt; split/tk5e1840.dat&lt;br /&gt;
~jeff/bin/chop +161760773  tk5e1440.dat &amp;gt; split/tk5e1850.dat&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Then run the script, and you have daily .dat files! One catch is that some of the header meta-data may be missing from the files, unless it is repeated every day. So you may need to provide more meta-data when running teqc to have a complete RINEX header.&lt;/div&gt;</summary>
		<author><name>Jeff</name></author>	</entry>

	<entry>
		<id>http://gps.alaska.edu/internal/index.php?title=RINEXing&amp;diff=25</id>
		<title>RINEXing</title>
		<link rel="alternate" type="text/html" href="http://gps.alaska.edu/internal/index.php?title=RINEXing&amp;diff=25"/>
				<updated>2005-10-21T00:56:23Z</updated>
		
		<summary type="html">&lt;p&gt;Jeff: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The program we now use to convert most files to RINEX is called teqc, which is written and maintained by Lou Estey at UNAVCO. Teqc can handle data from many different receiver types. This page describes how to convert the data from Trimble receivers into RINEX format.&lt;br /&gt;
&lt;br /&gt;
===Converting Trimble .t00/.t01 files to .dat===&lt;br /&gt;
Normally you don't have to do this. The exception is the data from the Net-RS receiver, which under some circumstances we will only get in .t01 format. Or if you copy .t00 files straight off a flash card.&lt;br /&gt;
&lt;br /&gt;
====.t00 files====&lt;br /&gt;
There are two ways to convert these files to .dat format. You can do it on a Windows PC by right-clicking on the file and selecting &amp;quot;Convert to DAT format&amp;quot; from the contextual menu. Or you can convert the files on our Linux system using:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/usr/local/UNAVCO/bin/runpkr00 -d file.t00&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This will create the file &amp;lt;code&amp;gt;file.dat&amp;lt;/code&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
====.t01 files====&lt;br /&gt;
At the moment, these files can only be converted on a Windows PC that has the latest version of Trimble Data Transfer installed. Right-click on the file and choose &amp;quot;Convert to DAT format&amp;quot; from the contextual menu.&lt;br /&gt;
&lt;br /&gt;
===Converting dat files to RINEX===&lt;br /&gt;
You can convert a single file to RINEX using teqc. We have a script that will create RINEX files and also add appropriate information to the RINEX headers. The easiest way to convert multiple files is to make up a script and then execute it.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/bin/ls *.dat | awk '{print &amp;quot;newrinex_teqc -5700 tr_d &amp;quot; $0}' &amp;gt; rinexem&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
There are some options you can add to this:&lt;br /&gt;
* The -5700 flag really means to use the Zephyr Geodetic antenna; you can use it for the Net-RS as well&lt;br /&gt;
* For data from a Trimble 4000 receiver, use -4000 instead.&lt;br /&gt;
* You can add the operator name using -oper Name (no spaces)&lt;br /&gt;
* If the data are from a non-UAF group, add -agency Agency&lt;br /&gt;
* For 4000 receivers, add -slant 0.2334 -0.0591 to convert from slant to vertical height&lt;br /&gt;
&lt;br /&gt;
Then execute the script:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
sh rinexem&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
There should be one RINEX observation file (&amp;lt;code&amp;gt;*.YYo&amp;lt;/code&amp;gt;, YY = last 2 digits of the year) and one RINEX navigation file (&amp;lt;code&amp;gt;*.YYn&amp;lt;/code&amp;gt;) for each .dat file.&lt;br /&gt;
&lt;br /&gt;
To make a fully complete RINEX file, you would have to edit the file in a text editor and fill in any header items that are blank. That is important for files that we might send to someone else, but it doesn't matter for our own processing.&lt;br /&gt;
&lt;br /&gt;
The final step is to compress the files using gzip.&lt;br /&gt;
&lt;br /&gt;
===Splitting multi-day dat files===&lt;br /&gt;
Normally we set our receivers to start a new file every day at UTC midnight and track for 24 hours. Every so often we get a data file that has several days all in one file, because the receiver was not programmed correctly. Most often this happens with borrowed receivers that we forget to reprogram.&lt;br /&gt;
&lt;br /&gt;
If the file contains just a few days of data, you can create a RINEX file as above, then use &amp;lt;code&amp;gt;rinexwin&amp;lt;/code&amp;gt; to split it into daily files. For example,&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
rinexwin warr2061.05o -ge &amp;quot;25-jul-2005 00:00&amp;quot; -lt &amp;quot;26-jul-2005 00:00&amp;quot; &amp;gt; ../warr2060.05o&lt;br /&gt;
rinexwin warr2061.05o -ge &amp;quot;26-jul-2005 00:00&amp;quot; -lt &amp;quot;27-jul-2005 00:00&amp;quot; &amp;gt; ../warr2070.05o&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
However, once a file spans a long enough time it requires some special handling. For files spanning several days you can generally add the +smtt flag to the teqc command (at the moment you have to edit scripte or run it manually to do this). For longer files you need to split the dat file up into daily dat files. The reason for this has to do with the convention typically used in RINEX for handling the 1 millisecond clock offsets in the Trimble and other receivers. These files require some manual fiddling.&lt;br /&gt;
&lt;br /&gt;
There is a 4 step process:&lt;br /&gt;
1. Use teqc to create a diagnostic file that contains information about the time and byte position of each data record in the file.&lt;br /&gt;
2. Run a perl program that processes this file to make a list of epochs around UTC midnight along with their byte locations.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
For example, suppose the file tk5e1440.dat contains a couple of months of data. The sequence of commands to split the file is given below.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
teqc +diag -O.obs - tk5e1440.dat &amp;gt;&amp;amp; tk5e1440.dat.diag&lt;br /&gt;
~jeff/bin/split_big_dat tk5e1440.dat.diag &amp;gt; rectimes&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The file tk5e1440.dat.diag has the RINEX file lines with the observation times, plus a bunch of lines that refer to Trimble download frames. There is one of these with type=17 for each data record.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
 05  5 24  2 41 15.0000000  0  4G 4G13G16G23&lt;br /&gt;
Trimble download frame t @ 0o00004373 = 0x000008fb = 00002299   type= 0x15 = 21&lt;br /&gt;
Trimble download frame t @ 0o00004671 = 0x000009b9 = 00002489   type= 0x15 = 21&lt;br /&gt;
Trimble download frame t @ 0o00005167 = 0x00000a77 = 00002679   type= 0x15 = 21&lt;br /&gt;
Trimble download frame t @ 0o00005465 = 0x00000b35 = 00002869   type= 0x11 = 17&lt;br /&gt;
 05  5 24  2 41 30.0000000  0  4G 4G13G16G23&lt;br /&gt;
Trimble download frame t @ 0o00006152 = 0x00000c6a = 00003178   type= 0x15 = 21&lt;br /&gt;
Trimble download frame t @ 0o00006450 = 0x00000d28 = 00003368   type= 0x15 = 21&lt;br /&gt;
Trimble download frame t @ 0o00006746 = 0x00000de6 = 00003558   type= 0x15 = 21&lt;br /&gt;
Trimble download frame t @ 0o00007244 = 0x00000ea4 = 00003748   type= 0x15 = 21&lt;br /&gt;
Trimble download frame t @ 0o00007542 = 0x00000f62 = 00003938   type= 0x11 = 17&lt;br /&gt;
 05  5 24  2 41 45.0000000  0  8G 2G 4G 8G10G13G16G23G20&lt;br /&gt;
Trimble download frame t @ 0o00010667 = 0x000011b7 = 00004535   type= 0x11 = 17&lt;br /&gt;
 05  5 24  2 42  0.0000000  0  8G 2G 4G 8G10G13G16G23G20&lt;br /&gt;
Trimble download frame t @ 0o00011764 = 0x000013f4 = 00005108   type= 0x15 = 21&lt;br /&gt;
Trimble download frame t @ 0o00012262 = 0x000014b2 = 00005298   type= 0x11 = 17&lt;br /&gt;
 05  5 24  2 42 15.0000000  0  8G 2G 4G 8G10G13G16G23G20&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The split_big_dat program is written in perl; it processes this file and boils it down to the essentials, one line per data record, giving the byte number and the observation time:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
5097 at byte number 03354160 : 05 5 24 23 55 14.9710000&lt;br /&gt;
5098 at byte number 03354829 : 05 5 24 23 55 29.9710000&lt;br /&gt;
5099 at byte number 03355498 : 05 5 24 23 55 44.9710000&lt;br /&gt;
5100 at byte number 03356167 : 05 5 24 23 55 59.9710000&lt;br /&gt;
5101 at byte number 03356836 : 05 5 24 23 56 14.9710000&lt;br /&gt;
5102 at byte number 03357505 : 05 5 24 23 56 29.9710000&lt;br /&gt;
5103 at byte number 03358174 : 05 5 24 23 56 44.9710000&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now the trick is to edit this file and keep only the lines that correspond to the day boundaries. Be careful about the millisecond offsets. The split_big_dat program only prints out the lines that are near to midnight, so it will be pretty easy to do this. A trick is that once you have found the first one, you can delete 43 lines and the line after that will be the next one. The first few lines of the edited file look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
5116 at byte number 03366871 : 05 5 24 23 59 59.9710000&lt;br /&gt;
10876 at byte number 07145513 : 05 5 25 23 59 59.9390000&lt;br /&gt;
16636 at byte number 10925649 : 05 5 26 23 59 59.9060000&lt;br /&gt;
22396 at byte number 14704101 : 05 5 27 23 59 59.8720000&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In this file, every few days there was some problem with the data point at 0000 UTC, so I picked the one at 0015 for those days. This example file had data from 43 different days (42 lines in the file).&lt;br /&gt;
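The day-boundary selection above can be sketched with awk, assuming the field layout shown in the rectimes listing (the hour and minute are fields 10 and 11); the sample lines below are made up for illustration:&lt;br /&gt;

```shell
# Build a small sample in the rectimes format (hypothetical lines for illustration)
cat > rectimes.sample <<'EOF'
5116 at byte number 03366871 : 05 5 24 23 59 59.9710000
5117 at byte number 03367540 : 05 5 25 0 0 14.9710000
10876 at byte number 07145513 : 05 5 25 23 59 59.9390000
EOF
# Keep only the last epoch before each midnight (hour 23, minute 59)
awk '$10 == 23 && $11 == 59' rectimes.sample > rectimes.edited
cat rectimes.edited
```

Days where the 23:59 epoch is missing or bad still have to be patched by hand, as described above.&lt;br /&gt;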
&lt;br /&gt;
Next we make up a script that runs a program called chop over and over to cut pieces out of the file. Chop needs to know the byte numbers of the start and end of each piece.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
mkdir split&lt;br /&gt;
awk 'BEGIN {last = 0} {print &amp;quot;~jeff/bin/chop +&amp;quot; last &amp;quot; -&amp;quot; $5 &amp;quot;  tk5e1440.dat &amp;gt; split/tk5exxx0.dat&amp;quot;; last = $5}' rectimes.edited &amp;gt; splits&lt;br /&gt;
&lt;br /&gt;
more splits&lt;br /&gt;
~jeff/bin/chop +0 -03366871  tk5e1440.dat &amp;gt; split/tk5exxx0.dat&lt;br /&gt;
~jeff/bin/chop +03366871 -07145513  tk5e1440.dat &amp;gt; split/tk5exxx0.dat&lt;br /&gt;
~jeff/bin/chop +07145513 -10925649  tk5e1440.dat &amp;gt; split/tk5exxx0.dat&lt;br /&gt;
~jeff/bin/chop +10925649 -14704101  tk5e1440.dat &amp;gt; split/tk5exxx0.dat&lt;br /&gt;
~jeff/bin/chop +14704101 -18486658  tk5e1440.dat &amp;gt; split/tk5exxx0.dat&lt;br /&gt;
~jeff/bin/chop +18486658 -22266015  tk5e1440.dat &amp;gt; split/tk5exxx0.dat&lt;br /&gt;
~jeff/bin/chop +22266015 -26042019  tk5e1440.dat &amp;gt; split/tk5exxx0.dat&lt;br /&gt;
~jeff/bin/chop +26042019 -29821755  tk5e1440.dat &amp;gt; split/tk5exxx0.dat&lt;br /&gt;
~jeff/bin/chop +29821755 -33716237  tk5e1440.dat &amp;gt; split/tk5exxx0.dat&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The last step is to edit the splits file and put in the correct day number for each of the output .dat files. You also have to add a final line for the last day by hand; with no end byte, chop copies through to the end of the file. The last two lines should look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
~jeff/bin/chop +157859599 -161760773  tk5e1440.dat &amp;gt; split/tk5e1840.dat&lt;br /&gt;
~jeff/bin/chop +161760773  tk5e1440.dat &amp;gt; split/tk5e1850.dat&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
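The DDD in each output name tk5eDDD0.dat is the UTC day of year of the data ending at that boundary, so the day numbers can be checked with GNU date (assuming its -d flag is available):&lt;br /&gt;

```shell
# Day of year for 24 May 2005, the first day in the example file
doy=$(date -u -d "2005-05-24" +%j)
echo "tk5e${doy}0.dat"   # prints tk5e1440.dat
```

Day 185 of 2005 is 4 July, which is how the last file in the example gets the name tk5e1850.dat.&lt;br /&gt;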
&lt;br /&gt;
Then run the script, and you have daily .dat files! One catch is that some of the header meta-data may be missing from the split files unless it was repeated every day, so you may need to supply extra meta-data when running teqc to get a complete RINEX header.&lt;/div&gt;</summary>
		<author><name>Jeff</name></author>	</entry>

	<entry>
		<id>http://gps.alaska.edu/internal/index.php?title=RINEXing&amp;diff=24</id>
		<title>RINEXing</title>
		<link rel="alternate" type="text/html" href="http://gps.alaska.edu/internal/index.php?title=RINEXing&amp;diff=24"/>
				<updated>2005-10-20T01:13:54Z</updated>
		
		<summary type="html">&lt;p&gt;Jeff: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The program we now use to convert most files to RINEX is called teqc, which is written and maintained by Lou Estey at UNAVCO. Teqc can handle data from many different receiver types. This page describes how to convert data from Trimble receivers into RINEX format.&lt;br /&gt;
&lt;br /&gt;
===Converting Trimble .t00/.t01 files to .dat===&lt;br /&gt;
Normally you don't have to do this. The exceptions are data from the Net-RS receiver, which under some circumstances we only get in .t01 format, and .t00 files copied straight off a flash card.&lt;br /&gt;
&lt;br /&gt;
====.t00 files====&lt;br /&gt;
There are two ways to convert these files to .dat format. You can do it on a Windows PC by right-clicking on the file and selecting &amp;quot;Convert to DAT format&amp;quot; from the contextual menu. Or you can convert the files on our Linux system using:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/usr/local/UNAVCO/bin/runpkr00 -d file.t00&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This will create the file &amp;lt;code&amp;gt;file.dat&amp;lt;/code&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
====.t01 files====&lt;br /&gt;
At the moment, these files can only be converted on a Windows PC that has the latest version of Trimble Data Transfer installed. Right-click on the file and choose &amp;quot;Convert to DAT format&amp;quot; from the contextual menu.&lt;br /&gt;
&lt;br /&gt;
===Converting dat files to RINEX===&lt;br /&gt;
You can convert a single file to RINEX using teqc. We have a script that creates the RINEX files and also adds appropriate information to the RINEX headers. The easiest way to convert multiple files is to make up a script and then execute it.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/bin/ls *.dat | awk '{print &amp;quot;newrinex_teqc -5700 tr_d &amp;quot; $0}' &amp;gt; rinexem&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
There are some options you can add to this:&lt;br /&gt;
* The -5700 flag really means to use the Zephyr Geodetic antenna; you can use it for the Net-RS as well&lt;br /&gt;
* For data from a Trimble 4000 receiver, use -4000 instead.&lt;br /&gt;
* You can add the operator name using -oper Name (no spaces)&lt;br /&gt;
* If the data are from a non-UAF group, add -agency Agency&lt;br /&gt;
* For 4000 receivers, add -slant 0.2334 -0.0591 to convert from slant to vertical height&lt;br /&gt;
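Putting the options together, a generated command might look like the sketch below. The file names, operator, and agency here are made up, and newrinex_teqc is the lab's local wrapper, so the loop only echoes the commands into the script file rather than running them:&lt;br /&gt;

```shell
# Generate one conversion command per .dat file, with the optional flags
# (hypothetical 4000-receiver data collected for an outside agency)
for f in abcd1230.dat abcd1240.dat; do
  echo "newrinex_teqc -4000 -oper JDoe -agency USGS -slant 0.2334 -0.0591 tr_d $f"
done > rinexem
cat rinexem
```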
&lt;br /&gt;
Then execute the script:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
sh rinexem&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
There should be one RINEX observation file (&amp;lt;code&amp;gt;*.YYo&amp;lt;/code&amp;gt;, YY = last 2 digits of the year) and one RINEX navigation file (&amp;lt;code&amp;gt;*.YYn&amp;lt;/code&amp;gt;) for each .dat file.&lt;br /&gt;
&lt;br /&gt;
To make a fully complete RINEX file, you would have to edit it in a text editor and fill in any header items that are blank. That is important for files we might send to someone else, but it doesn't matter for our own processing.&lt;br /&gt;
&lt;br /&gt;
The final step is to compress the files using gzip.&lt;br /&gt;
&lt;br /&gt;
===Splitting multi-day dat files===&lt;br /&gt;
Normally we set our receivers to start a new file every day at UTC midnight and track for 24 hours. Every so often we get a data file that has several days all in one file, because the receiver was not programmed correctly. Most often this happens with borrowed receivers that we forget to reprogram.&lt;br /&gt;
&lt;br /&gt;
If the file contains just a few days of data you can just create a RINEX file as above, then use &amp;lt;code&amp;gt;rinexwin&amp;lt;/code&amp;gt; to split it into daily files. For example,&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
rinexwin warr2061.05o -ge &amp;quot;25-jul-2005 00:00&amp;quot; -lt &amp;quot;26-jul-2005 00:00&amp;quot; &amp;gt; ../warr2060.05o&lt;br /&gt;
rinexwin warr2061.05o -ge &amp;quot;26-jul-2005 00:00&amp;quot; -lt &amp;quot;27-jul-2005 00:00&amp;quot; &amp;gt; ../warr2070.05o&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
However, once a file spans a long enough time it requires some special handling. For files spanning several days you can generally add the +smtt flag to the teqc command (at the moment you have to edit scripte or run it manually to do this). For longer files you need to split the dat file up into daily dat files. The reason for this has to do with the convention typically used in RINEX for handling the 1 millisecond clock offsets in the Trimble and other receivers. These files require some manual fiddling.&lt;br /&gt;
&lt;br /&gt;
There is a 4-step process:&lt;br /&gt;
1. Use teqc to create a diagnostic file that contains the time and byte position of each data record in the file.&lt;br /&gt;
2. Run a perl program (split_big_dat) that boils the diagnostic file down to a list of the records near each midnight.&lt;br /&gt;
3. Edit that list down to the day boundaries and use it to build a script that runs chop to cut the file into daily .dat files.&lt;br /&gt;
4. Convert each daily .dat file to RINEX as usual.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
For example, suppose the file tk5e1440.dat contains a couple of months of data. The sequence of commands to split the file is given below.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
teqc +diag -O.obs - tk5e1440.dat &amp;gt;&amp;amp; tk5e1440.dat.diag&lt;br /&gt;
~jeff/bin/split_big_dat tk5e1440.dat.diag &amp;gt; rectimes&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The file tk5e1440.dat.diag has the RINEX-style lines with the observation times, plus a bunch of lines that refer to Trimble download frames. There is one frame with type=17 for each data record.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
 05  5 24  2 41 15.0000000  0  4G 4G13G16G23&lt;br /&gt;
Trimble download frame t @ 0o00004373 = 0x000008fb = 00002299   type= 0x15 = 21&lt;br /&gt;
Trimble download frame t @ 0o00004671 = 0x000009b9 = 00002489   type= 0x15 = 21&lt;br /&gt;
Trimble download frame t @ 0o00005167 = 0x00000a77 = 00002679   type= 0x15 = 21&lt;br /&gt;
Trimble download frame t @ 0o00005465 = 0x00000b35 = 00002869   type= 0x11 = 17&lt;br /&gt;
 05  5 24  2 41 30.0000000  0  4G 4G13G16G23&lt;br /&gt;
Trimble download frame t @ 0o00006152 = 0x00000c6a = 00003178   type= 0x15 = 21&lt;br /&gt;
Trimble download frame t @ 0o00006450 = 0x00000d28 = 00003368   type= 0x15 = 21&lt;br /&gt;
Trimble download frame t @ 0o00006746 = 0x00000de6 = 00003558   type= 0x15 = 21&lt;br /&gt;
Trimble download frame t @ 0o00007244 = 0x00000ea4 = 00003748   type= 0x15 = 21&lt;br /&gt;
Trimble download frame t @ 0o00007542 = 0x00000f62 = 00003938   type= 0x11 = 17&lt;br /&gt;
 05  5 24  2 41 45.0000000  0  8G 2G 4G 8G10G13G16G23G20&lt;br /&gt;
Trimble download frame t @ 0o00010667 = 0x000011b7 = 00004535   type= 0x11 = 17&lt;br /&gt;
 05  5 24  2 42  0.0000000  0  8G 2G 4G 8G10G13G16G23G20&lt;br /&gt;
Trimble download frame t @ 0o00011764 = 0x000013f4 = 00005108   type= 0x15 = 21&lt;br /&gt;
Trimble download frame t @ 0o00012262 = 0x000014b2 = 00005298   type= 0x11 = 17&lt;br /&gt;
 05  5 24  2 42 15.0000000  0  8G 2G 4G 8G10G13G16G23G20&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The split_big_dat program is written in Perl; it processes this file and boils it down to one line per data record, giving the byte number and the observation time:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
5097 at byte number 03354160 : 05 5 24 23 55 14.9710000&lt;br /&gt;
5098 at byte number 03354829 : 05 5 24 23 55 29.9710000&lt;br /&gt;
5099 at byte number 03355498 : 05 5 24 23 55 44.9710000&lt;br /&gt;
5100 at byte number 03356167 : 05 5 24 23 55 59.9710000&lt;br /&gt;
5101 at byte number 03356836 : 05 5 24 23 56 14.9710000&lt;br /&gt;
5102 at byte number 03357505 : 05 5 24 23 56 29.9710000&lt;br /&gt;
5103 at byte number 03358174 : 05 5 24 23 56 44.9710000&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Now the trick is to edit this file and keep only the lines that correspond to the day boundaries. Be careful about the millisecond offsets. The split_big_dat program only prints out the lines that are near midnight, so this is pretty easy to do. One shortcut: once you have found the first boundary line, you can delete the next 43 lines and the line after that will be the next boundary. The first few lines of the edited file look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
5116 at byte number 03366871 : 05 5 24 23 59 59.9710000&lt;br /&gt;
10876 at byte number 07145513 : 05 5 25 23 59 59.9390000&lt;br /&gt;
16636 at byte number 10925649 : 05 5 26 23 59 59.9060000&lt;br /&gt;
22396 at byte number 14704101 : 05 5 27 23 59 59.8720000&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In this file, every few days there was some problem with the data point at 0000 UTC, so I picked the one at 0015 for those days. This example file had data from 43 different days (42 lines in the file).&lt;br /&gt;
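The day-boundary selection above can be sketched with awk, assuming the field layout shown in the rectimes listing (the hour and minute are fields 10 and 11); the sample lines below are made up for illustration:&lt;br /&gt;

```shell
# Build a small sample in the rectimes format (hypothetical lines for illustration)
cat > rectimes.sample <<'EOF'
5116 at byte number 03366871 : 05 5 24 23 59 59.9710000
5117 at byte number 03367540 : 05 5 25 0 0 14.9710000
10876 at byte number 07145513 : 05 5 25 23 59 59.9390000
EOF
# Keep only the last epoch before each midnight (hour 23, minute 59)
awk '$10 == 23 && $11 == 59' rectimes.sample > rectimes.edited
cat rectimes.edited
```

Days where the 23:59 epoch is missing or bad still have to be patched by hand, as described above.&lt;br /&gt;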
&lt;br /&gt;
Next we make up a script that runs a program called chop over and over to cut pieces out of the file. Chop needs to know the byte numbers of the start and end of each piece.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
mkdir split&lt;br /&gt;
awk 'BEGIN {last = 0} {print &amp;quot;~jeff/bin/chop +&amp;quot; last &amp;quot; -&amp;quot; $5 &amp;quot;  tk5e1440.dat &amp;gt; split/tk5exxx0.dat&amp;quot;; last = $5}' rectimes.edited &amp;gt; splits&lt;br /&gt;
&lt;br /&gt;
more splits&lt;br /&gt;
~jeff/bin/chop +0 -03366871  tk5e1440.dat &amp;gt; split/tk5exxx0.dat&lt;br /&gt;
~jeff/bin/chop +03366871 -07145513  tk5e1440.dat &amp;gt; split/tk5exxx0.dat&lt;br /&gt;
~jeff/bin/chop +07145513 -10925649  tk5e1440.dat &amp;gt; split/tk5exxx0.dat&lt;br /&gt;
~jeff/bin/chop +10925649 -14704101  tk5e1440.dat &amp;gt; split/tk5exxx0.dat&lt;br /&gt;
~jeff/bin/chop +14704101 -18486658  tk5e1440.dat &amp;gt; split/tk5exxx0.dat&lt;br /&gt;
~jeff/bin/chop +18486658 -22266015  tk5e1440.dat &amp;gt; split/tk5exxx0.dat&lt;br /&gt;
~jeff/bin/chop +22266015 -26042019  tk5e1440.dat &amp;gt; split/tk5exxx0.dat&lt;br /&gt;
~jeff/bin/chop +26042019 -29821755  tk5e1440.dat &amp;gt; split/tk5exxx0.dat&lt;br /&gt;
~jeff/bin/chop +29821755 -33716237  tk5e1440.dat &amp;gt; split/tk5exxx0.dat&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The last step is to edit the splits file and put in the correct day number for each of the output .dat files. You also have to add a final line for the last day by hand; with no end byte, chop copies through to the end of the file. The last two lines should look like this:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
~jeff/bin/chop +157859599 -161760773  tk5e1440.dat &amp;gt; split/tk5e1840.dat&lt;br /&gt;
~jeff/bin/chop +161760773  tk5e1440.dat &amp;gt; split/tk5e1850.dat&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
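The DDD in each output name tk5eDDD0.dat is the UTC day of year of the data ending at that boundary, so the day numbers can be checked with GNU date (assuming its -d flag is available):&lt;br /&gt;

```shell
# Day of year for 24 May 2005, the first day in the example file
doy=$(date -u -d "2005-05-24" +%j)
echo "tk5e${doy}0.dat"   # prints tk5e1440.dat
```

Day 185 of 2005 is 4 July, which is how the last file in the example gets the name tk5e1850.dat.&lt;br /&gt;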
&lt;br /&gt;
Then run the script, and you have daily .dat files! One catch is that some of the header meta-data may be missing from the split files unless it was repeated every day, so you may need to supply extra meta-data when running teqc to get a complete RINEX header.&lt;/div&gt;</summary>
		<author><name>Jeff</name></author>	</entry>

	<entry>
		<id>http://gps.alaska.edu/internal/index.php?title=RINEXing&amp;diff=23</id>
		<title>RINEXing</title>
		<link rel="alternate" type="text/html" href="http://gps.alaska.edu/internal/index.php?title=RINEXing&amp;diff=23"/>
				<updated>2005-10-04T07:01:22Z</updated>
		
		<summary type="html">&lt;p&gt;Jeff: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The program we now use to convert most files to RINEX is called teqc, which is written and maintained by Lou Estey at UNAVCO. Teqc can handle data from many different receiver types. This page describes how to convert data from Trimble receivers into RINEX format.&lt;br /&gt;
&lt;br /&gt;
===Converting Trimble .t00/.t01 files to .dat===&lt;br /&gt;
Normally you don't have to do this. The exceptions are data from the Net-RS receiver, which under some circumstances we only get in .t01 format, and .t00 files copied straight off a flash card.&lt;br /&gt;
&lt;br /&gt;
====.t00 files====&lt;br /&gt;
There are two ways to convert these files to .dat format. You can do it on a Windows PC by right-clicking on the file and selecting &amp;quot;Convert to DAT format&amp;quot; from the contextual menu. Or you can convert the files on our Linux system using:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/usr/local/UNAVCO/bin/runpkr00 -d file.t00&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This will create the file &amp;lt;code&amp;gt;file.dat&amp;lt;/code&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
====.t01 files====&lt;br /&gt;
At the moment, these files can only be converted on a Windows PC that has the latest version of Trimble Data Transfer installed. Right-click on the file and choose &amp;quot;Convert to DAT format&amp;quot; from the contextual menu.&lt;br /&gt;
&lt;br /&gt;
===Converting dat files to RINEX===&lt;br /&gt;
You can convert a single file to RINEX using teqc. We have a script that creates the RINEX files and also adds appropriate information to the RINEX headers. The easiest way to convert multiple files is to make up a script and then execute it.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/bin/ls *.dat | awk '{print &amp;quot;newrinex_teqc -5700 tr_d &amp;quot; $0}' &amp;gt; rinexem&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
There are some options you can add to this:&lt;br /&gt;
* The -5700 flag really means to use the Zephyr Geodetic antenna; you can use it for the Net-RS as well&lt;br /&gt;
* For data from a Trimble 4000 receiver, use -4000 instead.&lt;br /&gt;
* You can add the operator name using -oper Name (no spaces)&lt;br /&gt;
* If the data are from a non-UAF group, add -agency Agency&lt;br /&gt;
* For 4000 receivers, add -slant 0.2334 -0.0591 to convert from slant to vertical height&lt;br /&gt;
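Putting the options together, a generated command might look like the sketch below. The file names, operator, and agency here are made up, and newrinex_teqc is the lab's local wrapper, so the loop only echoes the commands into the script file rather than running them:&lt;br /&gt;

```shell
# Generate one conversion command per .dat file, with the optional flags
# (hypothetical 4000-receiver data collected for an outside agency)
for f in abcd1230.dat abcd1240.dat; do
  echo "newrinex_teqc -4000 -oper JDoe -agency USGS -slant 0.2334 -0.0591 tr_d $f"
done > rinexem
cat rinexem
```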
&lt;br /&gt;
Then execute the script:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
sh rinexem&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
There should be one RINEX observation file (&amp;lt;code&amp;gt;*.YYo&amp;lt;/code&amp;gt;, YY = last 2 digits of the year) and one RINEX navigation file (&amp;lt;code&amp;gt;*.YYn&amp;lt;/code&amp;gt;) for each .dat file.&lt;br /&gt;
&lt;br /&gt;
To make a fully complete RINEX file, you would have to edit it in a text editor and fill in any header items that are blank. That is important for files we might send to someone else, but it doesn't matter for our own processing.&lt;br /&gt;
&lt;br /&gt;
The final step is to compress the files using gzip.&lt;/div&gt;</summary>
		<author><name>Jeff</name></author>	</entry>

	<entry>
		<id>http://gps.alaska.edu/internal/index.php?title=GeodesyLab:About&amp;diff=421</id>
		<title>GeodesyLab:About</title>
		<link rel="alternate" type="text/html" href="http://gps.alaska.edu/internal/index.php?title=GeodesyLab:About&amp;diff=421"/>
				<updated>2005-10-04T06:31:15Z</updated>
		
		<summary type="html">&lt;p&gt;Jeff: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The UAF Geophysical Institute Geodesy Lab is a research group also associated with the Department of Geology and Geophysics. We conduct research into a variety of crustal deformation problems using modern space geodetic techniques. Primarily we use the Global Positioning System (GPS) and Interferometric Synthetic Aperture Radar (InSAR).&lt;br /&gt;
&lt;br /&gt;
Dr. Jeff Freymueller, Professor of Geophysics&lt;br /&gt;
&lt;br /&gt;
Dr. Chris Larsen, Research Associate (also with Glaciology group)&lt;br /&gt;
&lt;br /&gt;
Max Kaufman, Research Technician&lt;br /&gt;
&lt;br /&gt;
Graduate Students&lt;br /&gt;
* Ryan Cross, M.S. candidate&lt;br /&gt;
* Julie Elliott, Ph.D. candidate&lt;br /&gt;
* Tom Fournier, Ph.D. candidate&lt;br /&gt;
* Jill Shipman, Ph.D. candidate&lt;br /&gt;
* Samik Sil, M.S. candidate&lt;/div&gt;</summary>
		<author><name>Jeff</name></author>	</entry>

	<entry>
		<id>http://gps.alaska.edu/internal/index.php?title=Main_Page&amp;diff=7</id>
		<title>Main Page</title>
		<link rel="alternate" type="text/html" href="http://gps.alaska.edu/internal/index.php?title=Main_Page&amp;diff=7"/>
				<updated>2005-10-04T05:27:18Z</updated>
		
		<summary type="html">&lt;p&gt;Jeff: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;'''Welcome to the GI Geodesy internal page'''&lt;br /&gt;
&lt;br /&gt;
This internal page is based on a wiki, which is a web-based system for creating collaborative content. We can all edit the content on this site. The particular wiki engine used here is [http://www.mediawiki.org MediaWiki], which is the engine used for [http://www.wikipedia.org Wikipedia], the multi-lingual free encyclopedia. Once it is fully set up, a username and password will be required to edit content, although others can browse it.&lt;br /&gt;
:For more information on how this works, see http://meta.wikimedia.org/wiki/MediaWiki_User's_Guide&lt;br /&gt;
&lt;br /&gt;
If all goes according to plan, all of us will contribute useful information to this site, which will then become a useful reference for the rest of us.&lt;br /&gt;
&lt;br /&gt;
One quick set-up note: links shown in red are links to non-existent pages (add some content there!). Links shown in blue are to existing pages. This is all in the early stages of construction. -- Jeff Freymueller, 10/3/2005.&lt;br /&gt;
&lt;br /&gt;
=== GPS Fieldwork ===&lt;br /&gt;
A simple guide to GPS [[fieldwork]], from finding site information to&lt;br /&gt;
setting up the receiver and antenna, and converting data files to rinex format.&lt;br /&gt;
&lt;br /&gt;
=== Working with the Geodesy Lab linux computer system ===&lt;br /&gt;
General information about our computer system will go here.&lt;br /&gt;
&lt;br /&gt;
=== The WHOS data analysis list ===&lt;br /&gt;
The [[WHOS]] list keeps track of which GPS weeks are assigned to&lt;br /&gt;
which people.&lt;br /&gt;
&lt;br /&gt;
=== GPS Data Analysis ===&lt;br /&gt;
We use the GIPSY software for GPS data analysis. Jeff Freymueller has written&lt;br /&gt;
a set of scripts around GIPSY that automate data acquisition and much&lt;br /&gt;
of the data analysis tasks. Documentation for the [[GPS analysis system]] is&lt;br /&gt;
under construction here.&lt;br /&gt;
&lt;br /&gt;
=== InSAR software ===&lt;br /&gt;
We have the ROI_PAC software installed. More information will follow.&lt;/div&gt;</summary>
		<author><name>Jeff</name></author>	</entry>

	</feed>