
Hourly Sea Level Data Processing
and Quality Control Software:
Version for
Linux Operating Systems
SLP64 User Manual
(Version 4.0 of SLP Series)
Patrick Caldwell
Joint Archive for Sea Level of the
National Oceanographic Data Center and
University of Hawaii Sea Level Center
January 2015
JIMAR CONTRIBUTION NO. 15-390
Abstract
This manual describes a Linux version of the SLP64 software, which was originally
designed for Microsoft operating systems.
The software was originally developed in 1988 for capacity building of hydrographic
agencies that collect and process tidal (sea level) data. The package was designed for IBM-compatible personal computers since they were inexpensive and readily available. It was
referred to as SLPRC, which was the name of the primary working directory. An upgrade to the
package was made in 1998 to address year 2000 issues, referred to as SLPR2 (new directory name).
The original Fortran software was compiled under 16- or 32-bit DOS operating systems. When
the 64-bit operating systems arrived, the programs from the original package would not execute.
This required a re-compilation. The plotting libraries of the original set are not available for 64-bit systems. Thus, the new plotting software uses Python/Matplotlib.
The updated version is referred to as SLP64 (new directory name) in order to clearly
distinguish its 64-bit compatibility. Additional improvements were made to several routines
moving from SLPR2 to SLP64. The working data file format remains the same, which is
referred to as the SLP64 processing format, though also known by other names, such as the
SLPR2, TOGA, or the University of Hawaii Sea Level Center (UHSLC) processing format. This
format is retained because it facilitates the quality control steps that require viewing the observed
and residual files simultaneously in text editors, an essential component of processing. The
format can readily be changed to a more contemporary format where each row or line has date,
time, and a single sea level value or missing data flag.
This manual serves as a user's guide for processing and quality control of hourly sea level
data. The goal is to prepare a scientifically-valid set for enhancing the application of the data.
The package was assembled and documented by the Joint Archive for Sea Level (JASL), a
collaboration between the UHSLC and the National Oceanographic Data Center (NODC), a
branch of the National Oceanic and Atmospheric Administration (NOAA). The software
includes the Canadian Institute for Ocean Sciences Tidal Package for tidal analysis and
prediction routines. The tidal predictions are intended for quality control only; they should not be
used for nautical or coastal engineering purposes.
Table of Contents
1  Introduction and Setup ............................................................ 1-1
   1.1  Important Points ............................................................. 1-2
   1.2  System Requirements .......................................................... 1-3
   1.3  Installation and Software Organization ....................................... 1-3
        1.3.1  Python ................................................................ 1-3
        1.3.2  SLP64 ................................................................. 1-4
   1.4  Station Information Setup .................................................... 1-4
   1.5  Filename Conventions and Data Formats ........................................ 1-4
        1.5.1  Files with Hourly Sampling Intervals Grouped by Year .................. 1-5
        1.5.2  Files with Hourly or Daily Sampling Intervals Grouped by Month ........ 1-6
        1.5.3  Files of Daily and Monthly Means ...................................... 1-7
2  Utilities ......................................................................... 2-1
   2.1  Convert to SLP64 Format for Observed Hourly Data ............................. 2-1
   2.2  Convert Units ................................................................ 2-3
   2.3  List Missing Data ............................................................ 2-3
   2.4  Make Blocks of Missing Data Flags ............................................ 2-4
   2.5  Shift Hourly Data in Time by Increments of An Hour ........................... 2-4
   2.6  Convert SLP64 Format to CSV .................................................. 2-5
   2.7  Add Constant to Each Hourly Value ............................................ 2-6
3  Plotting Routines ................................................................. 3-1
   3.1  Hourly Files Grouped by Year ................................................. 3-1
   3.2  Expanded Residual Plot ....................................................... 3-2
   3.3  Scatter Diagram for Tide Staff-Gauge Pairs ................................... 3-2
   3.4  Daily and Monthly Data Plots ................................................. 3-2
4  Tidal Analysis and Prediction ..................................................... 4-1
   4.1  Tidal Analysis ............................................................... 4-1
   4.2  Tidal Prediction ............................................................. 4-4
5  Quality Control ................................................................... 5-1
   5.1  Hourly Residuals ............................................................. 5-1
   5.2  Reference Level Stability .................................................... 5-2
   5.3  Timing Errors ................................................................ 5-6
   5.4  Short Gaps and Data Spikes ................................................... 5-8
6  Filtering ......................................................................... 6-1
   6.1  The Filter ................................................................... 6-1
   6.2  Running the Filter Program ................................................... 6-2
   6.3  Examine the Output ........................................................... 6-2
7  Final Comment ..................................................................... 7-1
   7.1  Acknowledgements ............................................................. 7-1
   7.2  References ................................................................... 7-2
Appendices ........................................................................... A-1
   A  Summary of Upgrades from SLPR2 to SLP64 ........................................ A-1
   B  Access to Package .............................................................. B-1
   C  Directory Tree ................................................................. C-1
   D  Formats ........................................................................ D-1
   E  Station Information File ....................................................... E-1
   F  Notes on Foreman Tidal Analysis Program ........................................ F-1
   G  Harmonic Constants: Analysis Output ............................................ G-1
   H  Make Predictions other than Hourly ............................................. H-1
   I  Test for Magnitude of Timing Error ............................................. I-1
   J  Figures ........................................................................ J-1
1 Introduction and Setup
The Joint Archive for Sea Level (JASL) has prepared a user-friendly, interactive software
package for hourly sea level data processing and quality control under Microsoft (MS) operating
systems (Caldwell, 2014). A version for Linux was adapted in December 2014 and is explained
in this manual. The JASL software utilizes the Canadian Institute for Ocean Sciences (IOS) Tidal
Package. The IOS tidal analysis and prediction routines were written in Fortran by Dr. Mike
Foreman. See http://www.pac.dfo-mpo.gc.ca/science/oceans/tidal-marees/index-eng.html for
more details.
The package includes three principal tasks: 1) tidal analysis and prediction, 2) quality
control, and 3) filtering hourly into daily and monthly values. The software is geared to those
with prior knowledge of computer basics as well as a general understanding of sea level
processing. The goal is to share software on commonly available platforms that will enhance the
quality of sea level data sets. This in turn will promote the growth of the international archives of
sea level, thus, providing scientists with greater opportunities to understand the ocean. JASL
encourages contributions to its expanding global archive (Caldwell and Merrifield, 2013).
This package is derived from existing routines that were commonly used for data
processing at the Tropical Ocean Global Atmosphere (TOGA) Sea Level Center (TSLC), under
direction of Dr. Klaus Wyrtki (International TOGA Project Office, 1992). Mr. Bernie Kilonsky
and Mrs. Shikiko Nakahara wrote the earlier versions of some core routines, many of which date
back to the 1970s under the North Pacific Experiment (NORPAX) (Wyrtki, 1979). Under the
TOGA program, major efforts were made to centralize hourly sea level for a scientific archive.
Mr. Patrick Caldwell was hired by the NODC in 1987 to support this task through the JASL.
With the goal of transferring the latest information technology and enhancing the quality of data
for various agencies in South and Central America, Mr. Caldwell put together the first public
domain package in 1988 (Version 1.0). In 1990, the experts of Intergovernmental Oceanographic
Commission (IOC) Global Sea Level Observing System (GLOSS) suggested improvements for
the first version, resulting in a more complete second version, which was finalized in 1991
(Version 2.0), with copies available in English and Spanish. Over 150 copies have been
distributed and the package has been demonstrated at several GLOSS technical workshops
(UNESCO, 1993 and 1995). The TOGA program ended in 1995 but the JASL program
continued at the UHSLC. Under UHSLC guidance, modifications to the 1991 package were
made for Year 2000 compliance (Version 3.0, 1998). Since 1998, the software has
been distributed through FTP and the Internet to a large user base.
Versions 1-3 of the JASL software package were compiled under 16- or 32-bit MS
operating systems. In recent years, 64-bit operating systems have become the standard. Since the original
plotting routines used PLOT88 libraries, which are unavailable for 64-bit systems, a new method
of plotting was necessary. The solution was to adapt Python/Matplotlib, which is open source
software. The new version of the JASL sea level processing and quality control package is
Version 4.0 and designated “SLP64”. While updating the plotting routines, other improvements
were made to SLPR2 routines. A summary of changes from Version 3.0 (SLPR2) to Version 4.0
(SLP64) is given in Appendix A.
With the success under Microsoft operating systems and the versatile nature of Python
across computer platforms, a version of SLP64 was made for Linux operating systems. This
manual contains minor edits to the original SLP64 manual to describe the differences under Linux.
1.1 Important Points
The package has been designed to work with hourly data sets. Data stored at higher
sampling intervals must be filtered to the hourly level and placed in the SLP64 data processing
format. The choice of filtering method is left up to the individual. At the UHSLC, we have used
a three-point Hanning filter centered on the hour with weights of 0.25, 0.50, and 0.25,
respectively. Utilities are provided to facilitate the conversion of originator’s hourly data files
into SLP64 format.
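As an illustration only, the short Python sketch below applies such a three-point Hanning filter to an equally-spaced high-frequency series and keeps the value centered on each hour; the function name and the assumption of gap-free input are hypothetical, and this is not an SLP64 routine.

    # Sketch: three-point Hanning filter (weights 0.25, 0.50, 0.25) centered
    # on the hour. Assumes 'times' (datetime objects) and 'values' are equally
    # spaced with no gaps; not part of the SLP64 package.
    def hanning_hourly(times, values):
        hourly = []
        for i in range(1, len(values) - 1):
            if times[i].minute == 0:           # keep only samples exactly on the hour
                sl = 0.25 * values[i - 1] + 0.50 * values[i] + 0.25 * values[i + 1]
                hourly.append((times[i], sl))
        return hourly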
This package contains quality control procedures which require modification of data;
therefore, make copies of all original data prior to application of this software. Store the original
data in a safe location.
Data files provided with this package and examples in this manual are partially
fabricated. These files were purposely altered to show examples of data errors. Do not use the
sample data for scientific or engineering purposes.
An option is available for the creation of tide tables (times and heights of low and high
tides). These are to be used as a general reference only. Do not use these for navigational
purposes. For official nautical tidal predictions, please refer to the National Ocean Service
(NOS) of the National Oceanic and Atmospheric Administration (NOAA).
Python, an open source programming language, is required to run some SLP64 routines.
The user must ensure Python is properly set up within the Linux operating system. Matplotlib
can be acquired and set up concurrently with Python, since it is part of the overall Python
package.
Users should be familiar with basic computer concepts, such as directories (folders) and
files. Under Linux, the programs are run from command windows (consoles, terminals) by
typing the name of the program at the command prompt. All programs are interactive so there
are no control parameters needed when typing the program name at the prompt. Python
programs (files with “.py” as filename extension) are executed by typing at the command prompt
“python program.py”, where “program” is any python program file. A text editor is required
for some processing steps. Linux is case sensitive for file and directory names, which have
been set up in lower case. There are a few constant files in /slp64/din that are in upper case,
though the user does not need to manipulate these files.
The software in this package contains some internal consistency checks for the
interactive input, though it is impossible to predict all possible erroneous input. The user is
expected to provide input data files in the correct format, and to answer all interactive prompts
correctly. The programs will respond to incorrect answers and formats in an unpredictable
fashion, including occasionally aborting the program. If you are having trouble with the package,
first check the correctness of your formats and interactive responses. Ensure that output files
from previous runs of the same program do not exist. If so, delete them. Then execute the
program again. One can experiment with the sample data files provided under slp64/samp.
The Fortran programs were compiled using “gfortran -static filename.for -o
filename.exe”, where filename is the Fortran source code file. One undesirable characteristic of
Fortran is its inability to overwrite existing files. Thus, if a program must be run more than once,
be sure to eliminate the output files from the prior attempt. This is the most common cause for
programs to crash. Some of the new Python programs take care of this problem, but others do
not.
1.2 System Requirements
This package has been developed for use on computers with Linux operating systems. It
has been successfully tested under Red Hat (RHEL5) and Ubuntu 14.04.
1.3 Installation and Software Organization
1.3.1 Python
Python can be freely downloaded from https://www.python.org/. Code within SLP64
was prepared in 2014 with Python 2.6. It is stated on the Python “About” web tab, “Python is
developed under an OSI-approved open source license, making it freely usable and distributable,
even for commercial use. Python's license is administered by the Python Software Foundation.”
If one chooses not to download the entire Python package, ensure Matplotlib is one of the
selected module options. As a minimum requirement, Python 2.6 and Matplotlib 1.3 or higher
versions are necessary.
1.3.2 SLP64
1. Download the zipped package “slp64.tar.gz” (Appendix B), which will place a copy of the
file in your download folder. The software package has been designed to have relative
paths below directory /slp64, so the package can work under any level of the directory tree.
Place slp64.tar.gz into your desired directory.
2. Unzip by using “gzip -d slp64.tar.gz”. Unpack using “tar -xvf slp64.tar”. You can delete
slp64.tar once it is unpacked.
1.4 Station Information Setup
The first step after loading the software is to set up the STAtion INFOrmation Data INput
(stainfo.din) file (Appendix E). One must assign a three-digit number to each unique tide
station, along with other information about the station. Ensure columns of each parameter line
up exactly; Fortran programs read this file and require fixed column positions for each field.
The stainfo.din file provided with SLP64 has stations and number assignments that exist
at the UHSLC. Users of SLP64 have the liberty to change or remove any of the stations in the
UHSLC list to cater to your own set of stations.
A summary of the fields in stainfo.din is given in Appendix E. Mandatory fields are the
Station Number and Name, Latitude, and Longitude.
The following field is the time-of-day schema, with 0 denoting hours 1-24 (British
system) and 1 denoting hours 0-23. This should be fixed as 1.
The remaining fields can be left blank.
1.5 Filename Conventions and Data Formats
The file naming convention for SLP64 data files conforms to old standards; that is, 8
characters for the root of the name and 3 characters for the file extension. All formats are ASCII
and can be viewed with a text editor (Appendix D). Within this package, the format is referred to
as the SLP64 processing format.
All hourly sea level observations and residuals files are kept in subdirectory /slp64/dat.
It also holds the daily and monthly data files. The predicted tides are placed in subdirectory
/slp64/prd. Calibration files are maintained in /slp64/cal.
1.5.1 Files with Hourly Sampling Intervals Grouped by Year
This format was chosen because it makes examination of the data by eye within an editor
easier. There is a utility under /slp64/util to convert from SLP64 hourly format to a simple
format (each row with one hour of tide data) in comma-separated values (CSV) format, which is
compatible with spreadsheets and can be viewed with a text editor.
The observed hourly data, the predicted tides, and the residuals (defined as observed data
minus predicted tides) have a similar SLP64 format and file naming convention. Each file
consists of a year of values at an hourly sampling interval.
The SLP64 processing format of the OBSERVED HOURLY DATA is described in
Appendix D and the file names have the following form (all letters are lower case):
CVSSSYY.dat
C: century (u:1800-1899, v:1900-1999, w:2000-2099)
V: file version (letters a-z)
SSS: station number
YY: the last two digits of the year
For example, va00385.dat is the observed hourly data file (version A) for
station 003 and year 1985. wa00385.dat would be the same for year 2085.
The file version is useful for keeping track of modifications to a data file. For instance,
the original data could be named version A. Then, after editing and interpolation, the modified
version could be called version B, and so on. A useful convention at the UHSLC is to rename the
"final" data as file version "Z" to denote that the data have received as much processing as
possible.
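As a quick illustration of this naming rule, a small Python helper (hypothetical, not part of the package) can assemble the observed hourly file name from the year, file version, and station number:

    # Sketch: build an SLP64 observed hourly file name, e.g. va00385.dat
    def observed_name(year, version, station):
        century = {18: "u", 19: "v", 20: "w"}[year // 100]   # 1800s, 1900s, 2000s
        return "%s%s%03d%02d.dat" % (century, version.lower(), station, year % 100)

    print(observed_name(1985, "a", 3))   # va00385.dat
    print(observed_name(2085, "a", 3))   # wa00385.dat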
A routine is available (Section 2.1) to facilitate conversion of observed data into the
SLP64 observed data format.
The PREDICTED TIDES files have the same format as the observed hourly data files,
except columns 7-9 of the monthly headers are filled with the letters "PRD". Each predicted tides
file has one day from the previous year prepended and one day from the following year appended.
This is to accommodate some routines which require
time shifts. The file naming convention for the predicted tides is as follows:
CVSSSYY.dat
C: century (o:1800-1899, p:1900-1999, q:2000-2099)
V: file version (letters a-z)
SSS: station number
YY: the last two digits of the year
For example, pa00385.dat is the predicted tides file (version A) for
station 003 and year 1985. qa00385.dat would be the same for year 2085.
The file version for the predicted files can be used to keep track of which version of
harmonic constants was used, as discussed in Section 4.2. Versions of harmonic constants
depend on the time span which was input into the tidal analysis routine.
The RESIDUALS FILES also have the same format as the observed hourly data files,
except columns 7-9 of the monthly headers are filled with the letters "RES". The first letter of the
file name is always "R" and the remaining letters are the same as the observed data file (file
version, station number, and two-byte year). For example, ra00385.dat is the residuals file
(version A) for station 003 and year 1985. ra00385.dat would also be the file name for year 2085.
A convention to denote the century was not incorporated because the residuals files are
temporary and one normally only works with a few years at a time during quality control, so the
need for multiple centuries would not arise.
The file version for the residuals file normally mirrors the file version of the observed
hourly data file from which it was computed. For example, the observed hourly data file,
va00385.dat, would have a corresponding residuals file, ra00385.dat.
1.5.2 Files with Hourly or Daily Sampling Intervals Grouped by Month
As will be described in Section 5.2 on reference level quality control, monthly files can
be maintained for preliminary incoming data. The file naming convention is as follows:
vSSSYYMM.dat
v: fixed (always "v")
SSS: station number
YY: the last two digits of the year
MM: month
For example, v0038807.dat denotes the file for station 003 during July of
1988.
Because these files are temporary, a convention to account for the next century was not
required.
Related to the monthly file is the PAIR file (Section 5.2) as described in Appendix D.4.
It has a file name the same as the monthly file above, except the first letter is an "X". For
example, x0038807.dat denotes the file for station 003 during July of 1988. The PAIR file is
maintained in subdirectory /slp64/cal.
1.5.3 Files of Daily and Monthly Means
Daily and monthly means are obtained through use of a sophisticated filter as described
in Section 6. All years for a given site are kept in a single file per type (daily or monthly). The
daily data file has an ASCII format (Appendix D.2) and a file naming convention as follows:
dVSSS.dat
d: fixed (always "d")
V: file version (letters a-z)
SSS: station number
For example, da003.dat is the daily data file for station 003, version a.
The monthly data file has an ASCII format (Appendix D.3) and a file naming convention
as follows:
mVSSS.dat
m: fixed (always "m")
V: file version (letters a-z)
SSS: station number
For example, ma003.dat is the monthly data file for station 003, version A.
The daily and monthly SLP64 formats are awkward. To make a more convenient
format that is compatible with spreadsheets, Matlab, etc., see programs dsl2csv and msl2csv in
/slp64/util.
2 Utilities
Several utilities are available for a variety of applications. The programs can be found
under subdirectory /slp64/util. To execute a program, type the name of the executable file at the
command prompt of a terminal window, for example, “gapcou.exe”. For python programs, type
“python “ before the python program filename, for example, “python convert.py”.
2.1 Convert to SLP64 Format for Observed Hourly Data
The package has been designed to work with hourly data sets in the SLP64 processing
format. If originator data are stored at higher sampling intervals, they must be filtered to the
hourly intervals. The choice of filtering method is left up to the individual. At the UHSLC, we
have used a three-point Hanning filter centered on the hour with weights of 0.25, 0.50, and 0.25,
respectively.
SLP64 provides a Python program, /slp64/util/convert.py, to generate the SLP64 hourly
data file format. This may involve a few steps. The goal is to change the original format to a
simple format with each row in the file containing a single hourly sea level value along with the
Date and Time. The software to generate this simple format is not provided and must be written
by the user. Examples of the simple format are given below.
Once a simple Date-Time Hourly-value format is created, there is an additional
requirement. The positions of each field must line up vertically. If the format is an output from a
spreadsheet (comma-separated values, CSV), and the fields do not line up, then there is a utility
to make the fields align. This is program /slp64/util/csv_to_fixed_columns.py.
As an example of CSV formats, the UHSLC provides this option. Go to web site
http://uhslc.soest.hawaii.edu/ and click on the Data tab. Under Download, click on either Fast
Delivery or Research Quality. Choose a station and click on hourly under the 2nd to last
column labelled CSV. Save the data file to your computer and place the file within /slp64/dat.
Open the file in an editor or utility. The top rows in the file look like:
2001,12,16,6,1333
2001,12,16,7,1130
2001,12,16,8,867
2001,12,16,9,553
2001,12,16,10,308
The data are year, month, day, hour, sea-level-value (in time zone GMT and units of mm).
When the month, day, or hour value is 1-9, it is only a single character. Also, the number of
characters of the sea-level-value varies. For SLP64 convert.py to run, the columns need to be
fixed for each field. Thus, one can use csv_to_fixed_columns.py to prepare the file for
convert.py. A sample of the fixed-column CSV format (output of csv_to_fixed_columns.py):
2001,12,16,06,1333
2001,12,16,07,1130
2001,12,16,08,0867
2001,12,16,09,0553
2001,12,16,10,0308
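For readers who need to write their own preparation step, the sketch below shows the general idea behind this fixed-column conversion: zero-pad the month, day, hour, and sea level fields so every row lines up. It is a simplified stand-in, assuming non-negative values in millimeters and hypothetical file names, and is not the distributed csv_to_fixed_columns.py.

    # Sketch: pad a ragged year,month,day,hour,value CSV into fixed columns.
    # Assumes non-negative sea level values in mm.
    with open("ragged.csv") as fin, open("fixed.csv", "w") as fout:
        for line in fin:
            y, m, d, h, v = line.strip().split(",")
            fout.write("%s,%02d,%02d,%02d,%04d\n" % (y, int(m), int(d), int(h), int(v)))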
Another example of an hourly format with Date-Time Value per row (or line) is the
NOAA National Ocean Service format for hourly data. A sample is found in
/slp64/samp/kwaj2014.txt, which is Kwajalein Atoll hourly data for 2014. Here are a few rows:
2014/01/01  00:00  1.114  -  1.269
2014/01/01  01:00  1.53   -  1.671
2014/01/01  02:00  1.96   -  2.09
2014/01/01  03:00  2.299  -  2.548
Fields are year/month/day hour:min predicted-tide dash observed-tide. It is the last column with
the observed hourly data that is desired. It is not fixed-column and needs to be corrected.
Not clear above is the fact that tab characters separate some columns. If no data exists
for the given Date-Time, then the missing flag is a dash. This complicated format is prepared for
convert.py using the utility nos_to_fixed_columns.py. This example is chosen to encourage use
of this Python program as a guideline for tailoring your own Python code to your situation. The
code in the Python program has extensive comments to help you adapt it to your own data.
Output of nos_to_fixed_columns.py can be viewed at /slp64/samp/kwaj14fix.txt,
which would become the input to convert.py.
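As a rough illustration of the kind of parsing nos_to_fixed_columns.py performs, the sketch below splits a simplified tab-separated NOS row, keeps the last (observed) column, and substitutes a flag when a dash marks missing data. The exact field handling and the 9.999 flag choice are assumptions for illustration, not the distributed utility.

    # Sketch: pull the observed value (last column) from a simplified NOS row
    # "yyyy/mm/dd <tab> hh:mm <tab> predicted <tab> - <tab> observed".
    with open("kwaj2014.txt") as fin, open("kwaj14fix.txt", "w") as fout:
        for line in fin:
            parts = [p.strip() for p in line.split("\t") if p.strip()]
            if not parts:
                continue
            date, hhmm, obs = parts[0], parts[1], parts[-1]
            value = 9.999 if obs == "-" else float(obs)   # 9.999 as the missing flag
            y, m, d = date.split("/")
            fout.write("%s %s %s %s %5.3f\n" % (y, m, d, hhmm[:2], value))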
Prior to running convert.py, one must define the start and end columns of each field with
file dtdcnv.din in directory /slp64/din. When convert.py is executed, there are comments
regarding options for the input file sea level value:
Code for sea level data format and missing data flag, if used
  1: 5-digits in mm, range -9999 to 9998, flag 9999
     e.g. -1223 or 3214 (read in fortran F5.0 or I5)
  2: 6-positions in cm, range -999.9 to 999.8, flag 999.9
     e.g. -122.3 or 321.4 (read in fortran F6.1)
  3: 7-positions in m, range -99.999 to 99.998, flag 99.999
     e.g. -12.823 or 3.214 (read in fortran F7.3)
  4: 5-positions in m, range -.999 to 9.998, flag 9.999
     e.g. 2.823 or 3.214 (read in fortran F5.3)
For each of these options, there are sample files and pre-defined dtdcnv files given in
/slp64/din. See the 1readme.txt file for more notes. This gives you the option to map your
originator format onto one of these 4 options. Additional comments about convert.py:
•  The dates must be numeric. The year must be complete (4 characters, e.g. 1998).
•  Date can be yyyy mm dd hh or any combination, for example, hh mm dd yyyy.
•  Delimiter positions between the dates or values can be any character (comma, space, etc.).
•  This input file must not have headers.
•  Multiple years within the input file are acceptable.
•  If data are missing, there are two options: a) leave out the record, or b) use a missing data flag.
•  Output is in millimeters (mm) and placed in /slp64/dat.
•  convert.py reads /slp64/din/dtdcnv.din, so if you want to use one of the 4 options
   defined in dtdcnv.op[1-4], copy the desired one to dtdcnv.din.
2.2 Convert Units
It is recommended to use millimeters as the scientific units while processing data with
SLP64. However, one can change to other units if desired. The options allowed by the SLP64
formats are:
1. feet (in hundredths, e.g. UHSLC format 132 = 1.32 feet)
2. millimeters (in whole digits, 1141 = 1141.0 mm)
3. centimeters (in whole digits, 556 = 556.0 cm)
One can convert among these three forms using the program, chunit.exe. To run the program, go
to directory /slp64/util and enter at the command prompt:
chunit.exe
The program is interactive; you will be prompted to supply information about the input and
output files and other options.
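The arithmetic behind the conversion is straightforward; the sketch below (not the chunit.exe logic) shows the relationships among the three stored forms, using 1 foot = 304.8 mm:

    # Sketch of the unit arithmetic: feet are stored in hundredths,
    # millimeters and centimeters as whole digits.
    def to_millimeters(stored, units):
        if units == "feet":          # e.g. 132 -> 1.32 ft -> about 402 mm
            return int(round(stored / 100.0 * 304.8))
        if units == "centimeters":   # e.g. 556 -> 5560 mm
            return stored * 10
        return stored                # already millimeters

    print(to_millimeters(132, "feet"))          # 402
    print(to_millimeters(556, "centimeters"))   # 5560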
2.3 List Missing Data
As described in Section 4, the next step in SLP64 processing will be to choose a time
span of observed hourly data for tidal analysis. The output from gapcou.exe is helpful for this
task since one prefers a time span of minimal missing data values (gaps).
One can obtain a list of gaps in an observed hourly data file using program GAPCOU in
directory /slp64/util. To run the program, execute
gapcou.exe
The program is interactive and prompts the user for file name attributes. The output is
placed in file MISsssv.TXT (where sss is the station number and v is the version) in the same
directory from which the program was executed.
This utility is also useful for identifying short gaps (see Section 5.4) and for maintaining
a log of missing data.
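As a loose illustration of what such a listing involves (not the GAPCOU algorithm or its MISsssv.TXT layout), the sketch below scans a simple year,month,day,hour,value series and reports consecutive runs of the 9999 missing data flag; the input layout and file name are hypothetical.

    # Sketch: report runs of the 9999 missing data flag in a simple CSV series.
    gap_start, count = None, 0
    with open("fixed.csv") as fin:
        for line in fin:
            y, m, d, h, v = line.strip().split(",")
            if int(v) == 9999:
                if gap_start is None:
                    gap_start = (y, m, d, h)
                count += 1
            elif gap_start is not None:
                print("gap of %d hours starting %s-%s-%s hour %s" % ((count,) + gap_start))
                gap_start, count = None, 0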
2.4 Make Blocks of Missing Data Flags
Some programs, such as the time shifting program (Section 2.5) and the filtering program
(Section 6) require the hourly data files to have either data or missing data flags for each hour of
the year. To facilitate the creation of missing data flags, one can use the program fillmonths.py.
To run the interactive program, go to directory /slp64/util and execute:
python fillmonths.py
The output is placed in the directory /slp64/dat within file, fill.dat. One must then use
an editor to paste the missing data flags into the appropriate place within the hourly data file.
The python program fillmonths.py is used to run Fortran program fillvm.exe. The python
program ensures that if fill.dat already exists in /slp64/dat, then it is deleted before executing
fillvm.exe.
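To show the idea of a fill block (the real fillvm.exe output may differ in headers and column spacing), the sketch below writes a month of 9999 flags as rows of twelve values, two rows per day, matching the twelve-values-per-line layout of the hourly files:

    # Sketch: write a month of missing data flags (9999), twelve per line,
    # two lines per day (24 hours). SLP64 header records are not produced here.
    import calendar

    def fill_month(year, month, out):
        days = calendar.monthrange(year, month)[1]
        for _ in range(days * 2):
            out.write(" ".join(["9999"] * 12) + "\n")

    with open("fill.dat", "w") as f:
        fill_month(1985, 2, f)     # 28 days -> 56 lines of flags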
2.5 Shift Hourly Data in Time by Increments of An Hour
The hourly data files can be moved forward or backward in time (maximum of 12 hours
in each direction) by exact increments of an hour using the utility, TSALL. This program is
applied to change the time zone reference of the data or to correct timing errors, as will be
discussed in Section 5.3. This package does NOT provide a utility to shift the data by fractional
hours in time. For example, one cannot shift the data by 15 minutes. If a time zone is not
exactly on the hour (for example, locations in India have half-hour offsets), one cannot use this
program to shift the data to GMT. The program can run on multiple consecutive years. However,
one must ensure that each observed hourly data file has either a data flag or a data value for each
hour of each year. To create blocks of missing data flags, see utility FILLMONTHS (Section
2.4). To run this interactive program, execute:
tsall.exe
The following prompts will be given:
•  File Version. Give the version of the input observed data file. For multiple years, each
   file must have the same version.
•  Station Number. Give the station number; for example, 003.
•  Start Year. Provide complete 4-byte year; e.g. 1998.
•  Number of Years. Give number of years.
•  Shift Hours. Supply number of hours to shift (from -12 to +12).
•  File Version of Output. Give the file version for the output file, which must be different
   from the input file version. Be sure a file version for this station and year does not
   already exist. Otherwise, the program will abort. If so, remove or rename those output
   versions or rerun TSALL and select a different file version.
•  Update Monthly Header in Output. If the time zone is being changed, choose option N
   then provide the new time zone, which will be automatically placed in the monthly
   headers of the output files. If one is using TSALL to correct a short-lived timing error,
   then choose option Y, since one does not need to update the headers of the output.
•  Obtain Extra Hours from Other Years. If you are shifting the data forward (positive), if
   the data start in January, and if December of the previous year is available (and the file
   has the same file version), one has the option to grab hours from this file so the output
   will not have missing data flags for the first few shifted hours. Reply Y or N. Conversely,
   if the data are shifted backwards (negative), if the last year ends in December, and if
   January of the following year is available (and has the same file version), then one can
   optionally grab the first hours off the following year.
The output is placed in subdirectory /slp64/dat.
2.6 Convert SLP64 Format to CSV
Most contemporary data analysis is performed with packages such as MS Excel or
Matlab. When SLP64 processing is complete, the users will likely want to convert the format.
To facilitate this task, there are programs for the hourly, daily, and monthly SLP64 format
conversions: HSL2CSV, DSL2CSV, and MSL2CSV, respectively.
2.7 Add Constant to Each Hourly Value
In some cases, one would like to add a constant level adjustment to all hours within a
given series. This is possible with program addval.exe.
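The operation itself is a constant offset applied to every valid hour; as a sketch (not the addval.exe implementation), using the simple year,month,day,hour,value layout and skipping the 9999 missing flag, with a hypothetical offset and file names:

    # Sketch: add a constant offset (in mm) to every non-missing hourly value.
    OFFSET = 250

    with open("in.csv") as fin, open("out.csv", "w") as fout:
        for line in fin:
            y, m, d, h, v = line.strip().split(",")
            v = int(v)
            if v != 9999:              # leave missing data flags untouched
                v = v + OFFSET
            fout.write("%s,%s,%s,%s,%d\n" % (y, m, d, h, v))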
3 Plotting Routines
Plotting programs supplement many of the core functions in this package. The routines
are introduced here, but one will refer back to them many times in the sections that follow. The
plotting programs are found in subdirectory /slp64/pyplot. The plotting routines are all coded
in Python/Matplotlib. It is assumed the input files have scientific units of millimeters. Output
figures are placed in /slp64/pyplot/plots with a root name based on the input data file and file
extension .png (PNG image).
3.1 Hourly Files Grouped by Year
One of the first tasks after conversion of your local format to the SLP64 format is to plot
the observed hourly data. The hourly data plot programs, hplot.py and hplot_yrs.py, are the
most commonly used programs of this package and supplement the tidal analysis and prediction
routines, as well as the quality control routines. These programs can plot either the observed
hourly values or the residuals. To run this program, execute:
python hplot.py
Carefully answer the interactive input options. The input data files are read from
/slp64/dat. One can either read files in the SLP64 processing format (*.dat files) or in CSV
format (*.csv). The CSV files must have fields as follows: year, month, day, hour, value.
HPLOT plots one year of hourly data for one station at a time. The output graph appears
on the computer screen and is saved in /slp64/pyplot/plots. Alternatively, one can run
HPLOT_YRS, which allows multiple consecutive years to be plotted. In this case, graphs are
not sent to the computer screen. The output figures are placed in /slp64/pyplot/plots.
If one desires to plot a year of predicted tides:
1. Copy the data file from /slp64/prd to /slp64/dat.
2. Edit the file and take off the additional days at the top and bottom of the file that occur
   before and after the pertinent year. Make sure you do not take off the last two delimiter
   rows at the end of December (all 9s).
3. Use HPLOT (it does not work for HPLOT_YRS).
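For users who want to adapt the plotting to their own files, a minimal Matplotlib sketch along the lines of what hplot.py produces could look like the following; it reads the simple CSV layout (not the SLP64 *.dat format), assumes the 0-23 hour convention, and the file name is hypothetical.

    # Sketch: plot one year of hourly values from a year,month,day,hour,value
    # CSV file; 9999 missing data flags are dropped.
    import csv
    from datetime import datetime
    import matplotlib.pyplot as plt

    times, levels = [], []
    with open("va00385.csv") as fin:
        for y, m, d, h, v in csv.reader(fin):
            if int(v) != 9999:
                times.append(datetime(int(y), int(m), int(d), int(h)))
                levels.append(int(v))

    plt.plot(times, levels, linewidth=0.5)
    plt.ylabel("sea level (mm)")
    plt.title("Station 003, hourly observed")
    plt.savefig("va00385.png")
    plt.show()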
3.2 Expanded Residual Plot
SLPR2 provided a plot program to expand a section of time of the residuals to look more
closely at details. This program is not provided with SLP64, since the figure that is sent to the
screen with HPLOT allows the user to zoom.
3.3 Scatter Diagram for Tide Staff-Gauge Pairs
A basic scatter diagram of the tide staff and tide gauge pairs (Appendix D.4) (found in
/slp64/cal) can be plotted by typing:
python scat.py
Further details are described in /slp64/cal/1readme.txt as well as in Section 5.2.
3.4 Daily and Monthly Data Plots
The daily and monthly plot programs are DPLOT and MPLOT, respectively, and work
similarly. Both are interactive and require information regarding file version and station number.
As with HPLOT, the user can input SLP64 processing format (*.dat) or CSV if fields are year,
month, day, daily-value or year, month, monthly-value.
These programs allow one to plot all years in the daily or monthly data files, to plot select
years, or to plot a single year, which is the smallest time increment for plotting.
Once successfully executed, a figure is sent to the computer screen and saved to a file
within /slp64/pyplot/plots. The filename is based on the input data file name.
There are new programs with SLP64 that did not exist with SLPR2. These are d2diff.py
and m2diff.py, for daily and monthly data, respectively. Each execution produces two graphs in
a single figure. The first graph allows two series to be plotted simultaneously. This is useful for
comparing neighboring sea level stations. The mean is removed for easier comparisons. The
second graph is the difference between two series. More discussion is given in Section 6.
4 Tidal Analysis and Prediction
The goal of this package is to quality control the hourly data files. The techniques have
been well documented (Caldwell and Kilonsky, 1992; IOC, 1992). The basis for quality control
is the inspection of plots of residuals, defined as observed data minus predicted tides. In the
following sections, the tidal analysis and prediction routines, the plotting programs, and the
various quality control procedures are outlined. Lastly, a filter routine for the creation of daily
and monthly means from the hourly data is described. Many of these routines must be performed
in a cyclical fashion; thus, one will need to refer backwards and forwards within the manual as
necessary.
In this manual, a majority of the routines are fixed to work with the SLP64 hourly
formats grouped by year (one year per file). For one section of the manual (Section 5.2), the
software works with hourly data grouped by month (one month per file). This is for a special
case for those that have tide staff observations available to calibrate the tide gauge values. If so,
please jump to this section and review the material. When calibration is complete, these monthly
files in turn are concatenated to form SLP64 hourly files grouped by year, on which the
remaining routines can be applied.
This software package utilizes the tidal height analysis and prediction programs of Dr. M.
Foreman of the Institute of Ocean Sciences, Victoria, British Columbia, Canada (Foreman,
1977). The most recent news and versions are listed at http://www.pac.dfo-mpo.gc.ca/science/oceans/tidal-marees/index-eng.html. Various notes regarding his procedures
are found in Appendix F. The analysis program applies a linear least squares fit for specified
harmonic constituents (astronomical arguments), which are based on earth, moon, and sun
astronomical configurations. The output of the analysis is a set of harmonic constants (Appendix
G). These in turn become the input for the tidal prediction function, which produce predicted
tides for any year of any century. The UHSLC/NODC package has facilitated the use of the
Foreman routines by converting SLP64 formats to (from) the Foreman formats. For this package under
Linux, the Foreman programs were compiled with “gfortran -static -fno-automatic file.for”,
where file is FITTIDE or TIDEP.
4.1 Tidal Analysis
This program analyzes the hourly tide gauge data for a given period of time. Amplitudes
and Greenwich phase lags are calculated via a least squares fit method coupled with nodal
modulation for only those constituents that can be resolved over the length of the input record.
The output is a file of harmonic constants which become input for the tidal height prediction
program.
The first step in use of the tidal analysis is to determine a suitable time span as input.
One picks a time span with minimal gaps. To obtain a listing of missing data, use
/slp64/util/gapcou.exe. Another means of inspecting the hourly data is plotting with
/slp64/pyplot/hplot.py or hplot_yrs.py. Plot and inspect the available years of observed data.
This package is fixed to allow a maximum of 68 constituents (Appendices F and G),
which can be achieved through analysis of a record length of 366 days. The program will not
work (it will abort) if an input record greater than 13 months is requested. If one would like to
apply Foreman software for longer periods of analysis, then one has the liberty to use the
Foreman routines, but you will need to provide input in the Foreman format (Foreman, 1977).
The program will function for record lengths less than 366 days; however, the number of output
harmonic constituents will be fewer and subsequently the quality of the constituents will be less.
For instance, a record length of 30 days will produce 30 constituents and a record length of 14
days will produce 11 constituents.
From the hourly plots choose a time span of at least 366 consecutive days for a period
with apparently good data (no obvious spikes or corruption) and with minimal data gaps. From
the example shown in Figures 1 and 2, a time span from the first hour of 01 April 1985 to
the last hour of 01 April, 1986 would satisfy the criteria. The Foreman Analysis program can
handle data gaps, but it is preferable to choose a period without them. The time span can begin
on any hour of any day and can end on any hour of any day in the following year, as long as one
does not exceed 13 months. One restriction is that the time span CAN NOT cross the century
boundary; that is, the input span cannot include the consecutive months of December 1999 and
January 2000.
Once a time period for analysis has been selected, one is ready to run the Foreman
program. The procedures are found in subdirectory, /slp64/tide/ana. Go to this directory and
execute:
python tideanl.py
Notes:
•  Review STAINFO.DIN Parameters. The program will display information found in the
   stainfo.din file from the /slp64/din subdirectory. If this information is incorrect, hit
   CONTROL-C to abort the program. Then correct the stainfo.din file and rerun the
   TIDEANL job.
•  Time Zone of Observed Hourly Data. One must choose if harmonic constants will be
   created relative to the local time meridian or relative to Greenwich Mean Time (GMT).
   In turn the predicted tides will be relative to the time zone referenced in the harmonic
   constants file. The time zone of your observed hourly data files can be changed using
   the utility TSALL (see Section 2.5). The time zone of the input (observed hourly data)
   determines the time zone reference of the harmonic constants. If you would like to
   change the time zone of the hourly data, abort the program by typing CONTROL-C, then
   run TSALL. If you are ready to continue, choose either Y for data relative to local time or
   N for data relative to GMT.
•  Start of Analysis Period. One must enter the start time and date of the analysis period
   in form HHDDMMYYYY where HH is hour (using the British convention, that is, the first
   hour of the day is 01 and the last hour of the day is 24), DD is day, MM is month, and
   YYYY is 4-byte year (e.g. 1978). For example: 0101011978 (1st hour of 01 January 1978).
•  End of Analysis Period. Same as above for end of the period. It can be in the same or
   the following year, but do not exceed 13 months. For example 2401011979 (last hour
   of 01 Jan 1979).
•  File Version. Version of observed hourly data file; e.g. A.
The output is placed in subdirectory /slp64/tide/harm in files inpSSS.prd (where SSS
denotes the three-digit station number), and a file harmSSS.lis (Appendices F and G). The
amplitude units are in centimeters and the phase units in degrees. File inpSSS.prd is used for the
tidal prediction program, and file harmSSS.lis contains a listing of the harmonic constants
formatted for a hardcopy printout. If the tidal analysis program is run again for the same station,
these files will be replaced. Thus if one wants to save different versions of the harmonic
constants, then one will have to rename them. For instance, one could have a version of the
harmonic constituents relative to GMT by running the analysis on a time period of data relative
to GMT. Then rename inpSSS.prd to inpSSS.gmt. Conversely, for local time, one could rename
the output to inpSSS.loc. Then one can control the time zone of predictions by copying the
desired harmonic constant file to inpSSS.prd, which is input to the Foreman prediction program,
prior to running the prediction.
If a grossly bad value exists in the input hourly data file within the time span provided to the
TIDEANL job, then the Foreman program will abort. A useful way to find the input record with
the bad data is to do the following.
1) Edit tideanl.py. Comment out execution of the routines following fittide.exe.
2) Rerun tideanl.py.
3) Edit file TGOUTPUT. If a bad record was encountered, it will be identified.
4) Go to the input data file on subdirectory /slp64/dat and replace the questionable
   value with a missing data flag, 9999. If the questionable record is not identifiable,
   replace all values on that line (12 values) with 9999s.
5) Restore tideanl.py to original form and rerun tideanl.py.
4.2 Tidal Prediction
Tidal prediction utilizes the harmonic constituents calculated by the tidal analysis
(Section 4.1) to compute predictions in the form of either equally-spaced (in time) hourly values
in the SLP64 processing format or high-low heights in the form of a tide table.
Prior to running the Foreman tidal prediction program, one may need to alter some
default settings that are set in file prdvp.din located in the /slp64/din subdirectory:
1) *units of equally-spaced predictions  = 2
2) units of harmonic constants           = 3
3) start month of predictions            = 12
4) interval between predictions          = 60
5) time scheme (0:1-24  1:0-23)          = 1

*units code 1: feet (hundredths)
            2: millimeters (whole digits)
            3: centimeters (whole digits)
For example, prdvp.din contains one record in the following
format: 2 3 12 60 1
As seen above for parameter 1, the scientific units for the predictions defaults to
millimeters. For the quality control routines that follow, one should choose the units of the
predicted tides to be the same as the units of your observed hourly data. Thus, if you do not have
your units as millimeters, YOU MUST MODIFY PRDVP.DIN. It is recommended to process
your data in millimeters.
The parameters 2 - 4 above should not be altered. An explanation for obtaining predicted
tides for equally-spaced intervals less than one hour is given in Appendix H. If the user's data
files have a time scheme convention of 1 to 24 hours, then parameter 5 should be changed to a
value of 0, so that the raw data files and the predicted tides files will be compatible.
To execute, from /slp64/prd, enter:
python tideprd.py
Interactive Parameters:
- file version for output predictions file (A-Z)
- station number (i.e. 005)
- start year to be predicted (i.e. 2014)
- end year of prediction
- time reference meridian of the predicted data, which should coincide with the reference
  meridian of the raw hourly data file that is input into the tidal analysis program
  (note the hemisphere (E or W) is not needed) (i.e. 075 or 000 for GMT)
- numerical code for the form of predicted data (0:high-low  1:equally-spaced)
The output of the program will be placed in subdirectory /slp64/prd either file
CVSSSYY.DAT (Section 1.5) or HLsssyy.DAT for equally-spaced (hourly) or tide tables,
respectively. For convenience, the file name specifics are repeated below:
C: century (o:1800-1899, p:1900-1999, q:2000-2099)
V: file version (letters a-z)
SSS: station number
YY: the last two digits of the year
For example, pa00385.dat is the predicted tides file (version A) for station 003 and year
1985. qa00385.dat would be the same for year 2085. hl00385.lis is the tide table for station 003
in year 1885, 1985, or 2085. The filename for the tide table option does not distinguish century
although the actual predictions do.
Special consideration must be given to the change of century years. If one is running this
program for the last year of a century (e.g., 1999), then the TIDEPRD python job will send a
message to the screen showing the last hour of that year. This value should be written down and
saved. Then when making predictions for the following year (e.g. 2000), the value just
mentioned should be hand-edited into the year 2000 predictions to replace the missing data flag
(9999) of the first hour of January. This step is essential for use of the gap interpolation programs
(Section 5.4).
5 Quality Control
Quality control ensures the scientific validity of the data. Three main aspects are
emphasized: 1) the linking of the data to a reference level (tidal datum), 2) the inspection of the
timing quality, and 3) the replacement of short gaps and spikes. Technical aspects of quality
control procedures have been well documented (Caldwell and Kilonsky, 1992; IOC, 1992).
The tidal analysis and prediction programs (Section 4) and the plotting routines (Section
3.1) are an integral part of the quality control task. Many of the routines run in a cyclical fashion.
In the discussion below, one will be referred to various sections in this manual.
Prior to initiating quality control of the data, make copies of the original observed data
files. It is not uncommon to make mistakes during editing; thus, it is very important to preserve
the original version of the data. Also, keep a log of modifications as you proceed. It is good
practice to keep editing notes in a text file, which can be saved with the final version of the data
to clearly indicate what changes were made to the original measurements.
5.1 Hourly Residuals
The basis of quality control is the inspection of residuals, defined as observed data minus
predicted tides. Residuals are made with program RESIDM in subdirectory /slp64/qc.
The interactive program will ask the following questions:
•  Input Observed File Version. This is the version of the observed hourly data file in the
   UHSLC processing format (Section 1.5).
•  Input Predicted Tides Version. This is the version of the predicted tides. Normally there
   is no need for more than one version.
•  Output Residuals Version. This is the version of the output. Normally one uses the same
   version as used for the input observed file.
•  Station Number. Provide three-digit number, e.g. 003.
•  Start Year. Provide complete 4-byte year; e.g. 1998.
•  Number of Years. Give number of years over which to calculate residuals. There must
   be a file of predictions and a file of observations for each desired year.
The output is placed in subdirectory /slp64/dat. Make a residuals file for each year of observed
data.
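The residual itself is an hour-by-hour difference; as a simplified illustration of the calculation (not the RESIDM file handling), using two aligned hourly series in millimeters:

    # Sketch: residual = observed - predicted, keeping the 9999 missing data
    # flag wherever either input value is missing.
    def residuals(observed, predicted):
        res = []
        for obs, prd in zip(observed, predicted):
            if obs == 9999 or prd == 9999:
                res.append(9999)
            else:
                res.append(obs - prd)
        return res

    print(residuals([1333, 1130, 9999], [1300, 1155, 1020]))   # [33, -25, 9999]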
Next, the residuals are visually inspected through use of the hplot.py or hplot_yrs.py
plotting programs (Section 3.1).
If the observed data are of good quality and the tidal analysis has estimated the tidal
species at the site well (i.e., good predictions), then the residual plots will have a smooth appearance
(Figure 3). In this case, there are no obvious problems with the data.
If the observed data have erroneous values but the predicted tides are accurate, then the
residuals plot will clearly show the faulty values (Figure 4). In this case, some periods within the
plot are smooth, while others show significant fluctuations, gaps, and spikes. Techniques for
correcting these problems are discussed in Sections 5.3 and 5.4.
In locations with highly non-linear tides due to shallow waters, influence of rivers or
estuaries, or complex basin configurations, the Foreman tidal analysis routines will not be able to
completely resolve all the tidal species. At such a location, assuming the observed data are of
good quality, the residuals will show the tidal species that have not been resolved by the tidal
analysis. This situation is shown in Figure 8 for a location on the western coast of Malaysia.
Thus, quality control is limited at this site. Only grossly extreme spikes in the residuals could
potentially be identified in the observed data file.
For most locations, the harmonic constants, which were produced by the tidal analysis,
are applicable for prediction of any year. However, at some locations, the tidal species are not
completely resolved and thus the quality of the predictions decreases as the predicted year moves
further away in time from the time span used in the tidal analysis. Note the residuals in Figures 9
and 10. The first figure shows residuals of 1995 observed data based on a tidal analysis of
December 1991 - December 1992. Note the significant fluctuations. In the second figure, the
residuals are based on the same year as plotted. In this case, the residuals are smoother. Since the
goal of the residuals is to identify potential errors in the data, then for some sites with tidal
species similar to this example, it would be best to refresh the harmonic constants file using time
spans of tidal analysis close to the year in review.
It is also possible that the harmonic constants could have been created from a time span
that is less ideal than another available time span of the same time series. For instance, assume
one creates a series of residual plots for various years, and notes in the residual plots that the
time period selected for the tidal analysis had timing errors and other data problems. In this case,
it would be best to recreate a harmonic constants file based on a more suitable time span as input
to the tidal analysis. It is good practice to recreate a harmonic constants file after quality control
is complete so that one has the best possible set of constituents on which to base future inspections.
5.2 Reference Level Stability
The calibration of sea level data is necessary to relate the measurements recorded by a
sensor to a stable datum (Mitchum et al., 1994). Often called the tide gauge datum, this
relationship defines the horizontal plane from which heights are measured at the tide gauge. For
most float and stilling well gauges using analog or punched paper tape recorders, the tide staff
zero is the fundamental calibration point, which is geodetically linked to a primary Tide Gauge
Bench Mark and several auxiliary bench marks. The procedures for the establishment of a tide
staff and the associated network of vertical control points are well described (Hicks et al., 1987).
Visual staff readings are used in the calculations of the tide gauge datum as shown below. Many
modern acoustic gauges have internal calibration systems so the following steps are unnecessary
(IOC, 1992). Also, automated calibration devices, such as the UHSLC switch, provide
supplementary calibration information. Use of switch data is not discussed here (see Mitchum et
al., 1994).
Improper calibration leads to reference level shifts. Most shifts are readily identifiable in
hourly residual plots (see Figure 4 at end of December). They can also be seen in plots of the
daily and monthly data (Section 6) and in plots of differences of daily or monthly values with
nearby tide stations or with redundant sensors at the site in question. If a shift is identified, the
proper means of correction is through analysis of the tide staff readings and corresponding tide
gauge values.
Normally the link to the tide staff is made with the observed data of the highest frequency
of samples; for instance, most data are digitized in 4-, 6-, 10-, 15-, or 60-minute intervals. For
punched paper tape recorders, the tide staff reading is made exactly at the time of the punch. For
analog data, the tide staff reading is taken exactly on the hour.
Since the tide staff readings and most digital and analog punch paper tape or strip chart
rolls come in on a monthly basis, this procedure is performed each month. For this reason the
UHSLC processing format is designed for files consisting of either one month of data or up to
one year of data in monthly increments. The monitoring of the reference level stability is most
naturally performed on monthly files. These monthly files (Section 1.5.2) in turn are
concatenated into yearly files at a later date, at which time the core functions of this package can
be applied. These monthly files of hourly data must be kept in /slp64/dat. An important field
in these files is the REF= value in the header. The value given is the value that must be subtracted
from each hourly value in order for the series to be linked to the station tide gauge datum. The
reference level quality control program SCAT.PY is found in /slp64/pyplot.
The method for calibration using tide staff readings is as follows.
- Build PAIR File. Within subdirectory /slp64/cal, construct a "PAIR" file containing paired
  tide staff readings and the corresponding tide gauge measurements (Appendix D.4 and
  Section 1.5.2). No utility is provided to build this file. One option is to enter the
  values into the PAIR file by hand using a text editor. An example is found in
  /slp64/samp/x0038807.dat.
- Plot PAIR values. Plot the pairs using program SCAT (Section 3.3). Summary statistics
  are calculated within SCAT and placed in /slp64/cal in a file with the same root name as
  the PAIR file, but with file extension OUT (e.g. X0038807.OUT). A plot is sent to the
  computer screen and also placed in a file of the same root name with file extension JPG
  (jpeg image). One uses this output to help identify outliers by inspecting the last column
  (DIFF, the difference for each pair).
- Inspect PAIR values. If the tide staff readings were accurate and the tide gauge was
  functioning well, then the points on the graph will be nearly linear with slope equal to 1.
  It is common for a few points to be well off the linear trend due to bad staff readings
  (Figure 5). Note the one extreme outlier. If the gauge was malfunctioning, for instance,
  due to a clogged stilling well, then the scatter will be random (Figure 6). If the gauge
  zero was altered by a technician, then a reference level shift would be evident in the
  scatter (Figure 7).
- Correct PAIR File. If outliers were evident in the scatter diagram, then remove them
  from the PAIR file using a text editor. Next, replot the scatter diagram (SCAT.PY) to
  verify correction.
- File Organization. If an identifiable reference level shift was found (as in Figure 7),
  create two PAIR files separating the time periods before and after the shift. There is no
  convenient file naming convention associated with this package to do this. A suggestion
  would be to make temporary files. For instance, rename the PAIR file (Xsssyymm.DAT)
  to PAIR.ORI (i.e. ORI denotes Original). Then use an editor to create two files, BEFORE
  (Bsssyymm.DAT) and AFTER (Asssyymm.DAT), corresponding to the tide staff/gauge pairs
  before and after the shift. Be sure to keep a log of temporary filenames so as not to
  become confused. Run the program SCAT (discussed immediately below) separately for the
  BEFORE and AFTER files. More instructions on applying a correction for a level shift are
  given later in this section.
- Compute Statistics for Pairs. Run SCAT for each section. The output is placed in
  /slp64/cal with the same filename as the input but with file extension .out. These files
  should be saved. If a level shift occurred, one will have to rename the output prior to
  executing the program for the second input file. (A minimal sketch of the statistic that
  SCAT summarizes follows this list.)
- Determine Preliminary Calibration Constant. The last column of the last row of the
  *.out file is the Average Difference, which is the best approximation of the MEAN ZERO
  REFERENCE LEVEL CORRECTION VALUE, or zero level correction, for the given month. It has
  the same units as the data in the PAIR file. This is a preliminary calibration constant
  only. The final calibration constant is determined after an extended time period of no
  changes to the instrument, staff, or foundation.
- Update Calibration Log. Build a file, calSSS.log (SSS is the station number), in the
  /slp64/cal directory with the form shown in Appendix D.5. A sample is given in
  /slp64/samp/cal003.log. Make note of the MEAN ZERO REFERENCE LEVEL CORRECTION
  VALUE and the NUMBER OF OBSERVATIONS (tide staff/gauge pairs) for the given time
  period. This log forms the basic reference text for monitoring the long term stability of
  the datum.
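As an illustration of the statistic summarized in the *.out file, the following minimal sketch (it is not scat.py) reads a PAIR file laid out as in Appendix D.4 and computes a mean staff/gauge difference. Averaging the two staff readings and the sign of the difference are assumptions made for this sketch only; the file name in the usage comment is the sample file mentioned above.

    # Minimal sketch of the PAIR-file statistic (mean staff/gauge difference).
    # This is NOT scat.py; it only illustrates what the last column of the
    # *.out file summarizes, assuming the fixed-width layout of Appendix D.4.
    # The sign convention and the averaging of the two staff readings are
    # assumptions made for this sketch.

    def mean_pair_difference(pair_path):
        diffs = []
        with open(pair_path) as f:
            next(f)                          # skip the header record
            for line in f:
                if not line.strip():
                    continue
                high = int(line[0:4])        # bytes 1-4: highest staff reading
                low = int(line[4:8])         # bytes 5-8: lowest staff reading
                gauge = int(line[8:13])      # bytes 9-13: gauge value
                staff = (high + low) / 2.0   # average the two staff readings
                diffs.append(gauge - staff)
        return sum(diffs) / len(diffs), len(diffs)

    # Example, using the sample PAIR file:
    # mean_diff, n = mean_pair_difference("/slp64/samp/x0038807.dat")
    # print(f"mean difference {mean_diff:.1f} from {n} pairs")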
One does not apply the calibration constant to the data unless there is a clearly evident
reason, such as the resetting of the gauge zero by a technician, a malfunction of the gauge (e.g.
the slippage of the gauge cable), or the vertical displacement of the tide staff. The basic
assumption is that the gauge is stable and month to month variations in the MEAN ZERO
REFERENCE LEVEL CORRECTION VALUE are caused by noise in the tide staff readings. At
the UHSLC, annual reviews of the monthly calibration constants are made, and one value is
assigned as the tide gauge datum correction for an extended period over which no obvious
changes occurred to the instrumentation or the tide staff. These extended periods normally
coincide with the periods between technical visits. Thus, one avoids introducing noise into the
sea level time series due to month to month changes in the MEAN ZERO REFERENCE LEVEL
CORRECTION VALUE.
As a rule of thumb, for float/well analog and punched paper tape gauges, the UHSLC
considers changes greater than 0.05 feet (1.5 cm) as significant and worthy of an investigation
into the level stability. Most acoustic gauges have greater accuracy. With the float/well systems,
the UHSLC employs redundant gauges (Mitchum et al., 1994) at a given tide station to bring the
accuracy of datum correction to finer levels.
If a reference level shift has occurred, follow these guidelines:
- Identify Date-Time of Shift. Verify the date that the shift occurred by looking at a plot of
  residuals (see Section 5.1) for the time period of interest. Unfortunately, the residual
  and plotting programs require yearly input files in the SLP64 hourly format. If one is
  presently working with the monthly files of hourly data, concatenate these into a file
  using the yearly file naming convention. Then make and plot residuals. An example of a
  shift is seen in Figure 4 at the end of December.
- Compute the Correction. Calculate the difference between the old zero reference level
  correction and the new zero reference level correction based on information provided
  by the statistics of the staff/gauge pairs. The old zero reference level correction can be
  found in the file header of the monthly file of hourly values (Section 1.5.2). For example,
  if this value is 1030 and the new zero reference level correction is 1050, then the
  difference is 20.
- Apply Correction to Data. Go to subdirectory /slp64/qc and run the program COREF to
  apply this difference to the individual monthly UHSLC file of hourly data. In the example
  used above, provide the value 20. This means that each hour in the file will be increased
  by 20. Note this program is only for hourly data files in monthly blocks. Also, this
  program corrects the entire month to the desired reference level. The output is placed
  in the /slp64/dat directory in file COREF.OUT. (A minimal sketch of this arithmetic
  follows this list.)
- Use Editor to Apply Final Correction. Go to the /slp64/dat directory. When a reference
  level change has occurred in the middle of a month, the UHSLC recommends that the
  data for that month be corrected to the new zero reference level. A text editor is used
  to cut from the output of COREF and paste into the monthly file of hourly values. Be
  sure a copy of the original monthly file is preserved. This is very important. As an
  example, assume the reference level shift occurs on the 1200 hour of the 12th day:
  replace all the data BEFORE this point in time in the monthly file of hourly values with
  the corrected data from the file COREF.OUT, and replace the old zero reference level
  value (REF=) in the header record with the new one. As in the example of the previous
  paragraph, replace REF=1030 with REF=1050. When editing is complete, delete
  COREF.OUT prior to executing the program again; otherwise, an error will result.
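The arithmetic performed by COREF can be pictured with the following minimal sketch; it is not the COREF program. It assumes the SLP64 hourly processing format of Appendix D.1 and simply adds a constant offset to every non-missing hourly value in the data records of one monthly file; the file names in the usage comment are hypothetical.

    # Minimal sketch of a constant reference-level correction applied to the
    # data records of one monthly file in the SLP64 hourly processing format.
    # This is NOT the COREF program; it only illustrates the arithmetic.

    MISSING = 9999

    def apply_offset(in_path, out_path, offset):
        with open(in_path) as fin, open(out_path, "w") as fout:
            for raw in fin:
                line = raw.rstrip("\n")
                stripped = line.strip()
                is_terminator = stripped != "" and set(stripped) == {"9"}
                is_data = (len(line) >= 80 and not is_terminator
                           and line[11:15].strip().isdigit())
                if is_data:
                    # Data records carry 12 hourly values in Fortran 12i5
                    # (bytes 21-80); headers and terminators pass unchanged.
                    prefix, data = line[:20], line[20:80]
                    values = [int(data[i:i + 5]) for i in range(0, 60, 5)]
                    values = [v if v == MISSING else v + offset for v in values]
                    line = prefix + "".join(f"{v:5d}" for v in values)
                fout.write(line + "\n")

    # Example (hypothetical file names): raise every hourly value by 20 units.
    # apply_offset("/slp64/dat/wa0030801.dat", "/slp64/dat/coref_sketch.out", 20)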
Linking the observed data to a datum is one of the most important steps in quality
control. The following sections describe other important aspects of controlling the quality of the
data.
5.3 Timing Errors
Timing errors are introduced into the data due to mistakes during data processing, to
incorrect setting of the timer on the tide gauge, or to inaccuracies in the gauge time clock. The
errors are evident in the plot of residuals as periodic fluctuations. An example can be seen in
Figure 4. Note the large fluctuations in November and December, and the mild fluctuations
during other months.
As discussed in Section 5.1, residual plots can reflect inaccuracies in either the observed
data or the tide predictions. The ability for the tidal analysis to fully model the tidal species
depends greatly on the location. For gauges in ports adjacent to deep waters, the analysis is
usually very good. For gauges in regions with influence of rivers, shallow coastal shelves,
narrow basins, or complex basin bathymetry, the analysis is of low to moderate quality. In this
case, assuming the observed data are good, the residuals will show periodic fluctuations for tidal
species that were not resolved by the tidal analysis (Figure 8).
With this software package, one can only adjust the timing of observed hourly data files
by exact increments of an hour. Once a timing error is identified in the residuals, there are three
ways to determine the magnitude of the timing error:
- Inspect Original Data Tables. If the data were digitized from analog rolls, then inspect
  the data tables in comparison with the observed hourly data file to ensure that an hour
  was not accidentally added or removed during digitization.
- Compare Residual to Observed Data Plot. Place a paper copy of a residual plot on top
  of a paper copy of an observed hourly data plot. One can use a light table or hold both
  papers, with borders aligned, up to a light. Note whether the peaks in the fluctuations of
  the residuals are to the left or to the right of the observed hourly peaks. If the residuals
  are to the right, then assume a negative shift is necessary. One will have to guess, but
  start with a guess of magnitude one. If that does not work, increment to greater values
  (2, 3, etc.).
- Compare Observed to Predicted Files. Using a split screen editor (or two windows),
  place the observed hourly data file centered on the time span with the shift in one
  window, and the corresponding predicted tides file in the other window (Appendix I).
  Identify for a given semi-diurnal (or diurnal) tide cycle the hour with maximum
  amplitude in both files for the same day. Note if the peak (hour with largest value) in the
  observed file is behind or ahead in time of the corresponding peak in the predicted tides
  file. If the observed peak is before the predicted peak, then the observed file will need
  to be shifted forward by the number of hours of lag. Conversely, if the observed peak
  follows the predicted peak, then the observed file will need to be shifted backwards
  (negative). One may want to examine peaks in both files for a separate tidal cycle to
  verify the magnitude.
Once one has determined the magnitude and direction of the shift, one uses the program
tsall.exe, as described in Section 2.5. This program is in the /slp64/util directory. The program
shifts the entire year and places the output in a file with a different file version in directory
/slp64/dat. For example, choose the output file version "T", so if the input was va00386.dat, the
output would be vt00386.dat. To verify if the estimated shift is correct, use the residm.exe
program (Section 5.1) to calculate residuals for the shifted file. Next, use the hplot.py or
hplot_yrs.py plot programs (Section 3.1) to graph the residuals.
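The shift itself is a simple reindexing of the hourly series. The sketch below (not tsall.exe) illustrates a whole-hour shift applied to a plain list of hourly values, padding the exposed end with the missing-data flag; the sign convention shown is an assumption for this example only, and tsall.exe should be used for actual processing since it handles the file format and shifts the entire year.

    # Minimal sketch of a whole-hour time shift applied to hourly values.
    # This is NOT tsall.exe; the sign convention here is illustrative only.

    MISSING = 9999

    def shift_hours(values, n):
        """Shift the series n hours later (n > 0) or earlier (n < 0),
        padding the exposed end with the missing-data flag."""
        if n > 0:
            return [MISSING] * n + list(values[:-n])
        if n < 0:
            return list(values[-n:]) + [MISSING] * (-n)
        return list(values)

    # Example: a +1 hour shift.
    series = [1243, 1389, 1684, 2055, 2416, 2683, 2767, 2645]
    print(shift_hours(series, 1))
    # [9999, 1243, 1389, 1684, 2055, 2416, 2683, 2767]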
As an example, see the timing error in November 1986 of Figure 4 and the inspection of
magnitude in Appendix I. After applying the steps above for a positive one hour shift, the
residuals are shown in Figure 11. Note that the November period is now smooth. Thus the choice
of a positive one hour shift was correct. Also note in Figure 11 that other time periods now have
significant periodic fluctuations. These can be ignored.
To know exactly which hour the shift began, one can look at the residual file
(ra00386.dat) of the original data file (va00386.dat). For the example given above (shift in
November), there are gaps on both sides of the shift, so it is easy to identify where the time
period with the shift begins and ends. When there is not a gap adjacent to a shift, then one uses the
plot (Figure 11) to identify the day of the shift. For instance, there is a timing error on
December 4-7 of this same example. It is obvious where the timing error begins because there is
a gap preceding it. But it is unclear on which hour the timing error ends. Looking at the residual
file for the 6-8 December (first 11 columns of each line of file removed below to fit in text):
198612 61-1784-1449 -774   54  822 1335 1468 1188  571 -243-1039-1609
198612 62-1807-1602-1028 -209  646 1336 1689 1629 1177  447 -387-1116
198612 71-1566-1645-1339 -733    0  675 1120 1224  975  441 -254 -947
198612 72-1438-1601-1404 -882 -161  577 1176 1478 1415 1006  361 -372
198612 81  371 -149 -160 -159 -137 -109  -88  -75  -73  -81 -101 -124
198612 82 -135 -151 -158 -154 -135 -106  -77  -55  -41  -30  -33  -47
Note the dramatic difference from value to value during the period with the timing error,
and the small difference between adjacent values for the periods without a timing error. In the
example above, it appears that the timing error ends on hour 00 of 08 December. Make a note of
the hour and date of the start and end of the period with the timing error.
The next step is to make a copy of the original file. For instance, if the file is
va00386.dat, then copy it to vb00386.dat. This will preserve the original in case the following
edits need to be removed. As pointed out, original file versions must be preserved.
Use a text editor to modify the copy of the original (vb00386.dat). One deletes the hourly
values for the time period with the timing error. In a separate window, load the output of
tsall.exe (for example, vt00386.dat). Cut the corresponding period with the corrected shift from
vt00386.dat and paste into vb00386.dat for the deleted time period. At UHSLC, we normally
replace the start and end hour of the shift with a missing data flag (9999) since the exact hour of
when the timing error began or ended is usually not clear. The missing data can later be
interpolated as discussed in Section 5.4.
To verify the correction, create residuals for the edited file (create rb00386.dat) and plot.
If timing errors are still present, repeat the process above.
Discovering the magnitude and direction of a shift is sometimes a matter of trial and error. One
may have to iterate through the steps above several times before a correction is found. Recall
that this package can only correct shifts in exact increments of an hour. If timing errors exist
due to shifts of partial hours (such as 15 minutes, etc.), then this package is not applicable. Note
in Figure 4 the small periodic fluctuations throughout January to October. It is suspected that
these fluctuations are due to minor timing inaccuracies of the tide gauge clock (note how the
amplitudes of fluctuations increase with time). This package cannot correct this timing error. A
note regarding these timing uncertainties would be made in the final document file that
accompanies the data in the final archive. If the originally sampled data are at intervals less than
an hour (6-, 10-, or 15-minutes are common), then one should try the technique mentioned but
for exact increments of the sampling interval of those data. Unfortunately, one will have to
develop one's own routines, since this package is geared to hourly values.
5.4 Short Gaps and Data Spikes
Short gaps and data spikes are a common problem in most sea level time series records
(See example in Figure 4 in December). This section describes a technique for using
interpolation to correct such problems.
Program gapcou.exe (Section 2.3) is a utility that lists the length (hours) and dates of
missing data. Run this program to obtain this information.
The best procedure for filling gaps is to replace the missing data flags (i.e. 9999) with
quality controlled data from an auxiliary sea level gauge that is linked to the same datum. If a
redundant sensor is not available, then another good method is linear interpolation via the
predicted tide method, applied through the Python program gapfall.py (program fallptm.exe)
in directory /slp64/qc. Technical aspects of running this program will be discussed shortly, but
first, a note about the methodology.
The predicted tide method for filling gaps requires yearly files of observed and
corresponding predicted data. The predicted tides are shifted in time to match the timing
characteristics of the observed series. The residuals between the predicted tides and the observed
data are calculated. Then, a linear interpolation between the end points of the gap in the residual
series is performed and each interpolation constant is added to the shifted predicted tides over the
span of the gap.
The UHSLC recommends using this procedure only for gaps less than or equal to 24
hours. This is essential for the integrity of the daily data which can be calculated from the hourly
data (Section 6).
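The interpolation at the heart of the method can be illustrated with the following minimal sketch. It is not gapfall.py: it omits the time shift of the predicted tides that the real program performs, and the series and gap location in the example are hypothetical.

    # Minimal sketch of the predicted tide method for filling a short gap:
    # the residual (observed - predicted) is interpolated linearly across the
    # gap and added back to the predicted tide at each missing hour.
    # This is NOT gapfall.py.
    import numpy as np

    MISSING = 9999

    def fill_gap(obs, prd, start, end):
        """Fill obs[start:end] (flagged 9999) from the predicted tides prd.
        Requires good observed values at start-1 and at end."""
        obs = np.array(obs, dtype=float)
        prd = np.asarray(prd, dtype=float)
        r0 = obs[start - 1] - prd[start - 1]     # residual just before the gap
        r1 = obs[end] - prd[end]                 # residual just after the gap
        n = end - start + 1
        for k, i in enumerate(range(start, end), start=1):
            r = r0 + (r1 - r0) * k / n           # linearly interpolated residual
            obs[i] = round(prd[i] + r)           # filled hourly value
        return obs

    # Hypothetical example: a 3-hour gap within a 10-hour stretch.
    prd = [1000, 1100, 1250, 1400, 1500, 1520, 1450, 1300, 1150, 1050]
    obs = [1020, 1125, 1270, MISSING, MISSING, MISSING, 1485, 1330, 1175, 1070]
    print(fill_gap(obs, prd, 3, 6))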
To run the program, go to the /slp64/qc directory (mandatory), and enter:
python gapfall.py
The program will query the user for the following information:
- station number (three digits, e.g. 003)
- the file version of the observed data file with the gap (i.e. A)
- the file version for the predicted data (typically A); ensure the scientific units of the
  observed data and predicted tides are the same
- the file version for the output data with the gap filled, which must be different from the
  original version (i.e. B, C, etc.)
- the year with the gap in form yyyy (i.e. 1986)
- if a gap occurs on the last few days of December, one can optionally acquire data from
  January of the following year; however, the observed and predicted files of the following
  year must exist and have the same file versions
The Python job gapfall.py also performs file management, ensuring that pre-existing output files
cause the program to abort rather than be overwritten. The following output is produced:
- Interpolated Data. The program places the interpolated data file in directory /slp64/dat.
- List of Interpolated Data. The gapSSSYYYY.txt (mentioned above) is output to directory
  /slp64/qc/gapnotes, where SSS is station number and YYYY is year. At the UHSLC,
  interpolated gaps greater than six hours are listed in the station document file which
  accompanies the data in the final archive.
- Timing Drift. An output file, timing.txt, is placed in directory /slp64/qc/gapnotes. This
  file lists the drift (fractional hours) in timing between the observed data and predicted
  tides for each month.
- Extra Data. File extra.dat is placed in directory /slp64/dat. It contains data for the first
  two days of the following year relative to the input of gapfall.py. This is for a special
  case: missing data at the first hour(s) of a year. If data from the end of the previous year
  are available, then this gap can be interpolated. One runs gapfall.py for the preceding
  year, choosing affirmative (1) the option that data from the following year exist (step 5
  above). Then go to /slp64/dat and replace the missing data flags at the start of the year
  (year after which the program was executed) with the output of extra.dat.
The predicted tide method for filling gaps can also be used to correct data spikes and
glitches. A data spike is an obviously wrong data point. A glitch is one or more, but less than or
equal to 24, consecutive obviously wrong data points. These features are easily identified on
plots of residuals (Figure 4).
To correct, first identify the date and time by looking in the residuals data file. The spike
should stand out as an outlier in the data. Second, use a text editor and replace the corresponding
value in the observed data file for the date and time of the spike with a missing data flag, 9999.
Finally, run the gap filling program as described above.
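A minimal sketch of this kind of screening is shown below; it is not part of the package, and the five-standard-deviation threshold is purely illustrative. In practice, outliers are identified by inspecting the residual plots and files as described above.

    # Minimal sketch of spike flagging: find hours whose residual is an extreme
    # outlier and replace the corresponding observed value with the 9999 flag.
    # Not part of the SLP64 package; the threshold is purely illustrative.
    import numpy as np

    MISSING = 9999

    def flag_spikes(obs, prd, nsigma=5.0):
        obs = np.array(obs, dtype=float)
        prd = np.asarray(prd, dtype=float)
        good = (obs != MISSING) & (prd != MISSING)
        res = np.where(good, obs - prd, np.nan)
        limit = nsigma * np.nanstd(res)
        spikes = good & (np.abs(res - np.nanmean(res)) > limit)
        obs[spikes] = MISSING          # hand the flagged hours to gap filling
        return obs, np.flatnonzero(spikes)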
One should verify the output of the gap interpolation by making and plotting residuals.
6 Filtering
Software is provided for obtaining daily values from hourly sea level data by use of a
sophisticated, two-step filter and for obtaining monthly values from the daily values with a
simple average. Plotting software is included for the daily and monthly data. Also, programs are
available for calculating the difference between two daily or monthly files. A subdirectory
/slp64/filt has been created for these purposes. An overview of the filtering method, the
input and output files, the plotting routines, and other important items are discussed below.
6.1 The Filter
Daily values are obtained using a two-step filtering operation. First, the dominant diurnal
and semi-diurnal tidal components are removed from the quality controlled hourly values.
Second, a 119-point convolution filter (Bloomfield, 1976) centered on noon is applied to
remove the remaining high-frequency energy and to prevent aliasing when the data are reduced
to daily values. The 95, 50, and 5% amplitude points are 124.0, 60.2, and 40.2 hours,
respectively. The Nyquist frequency of the daily data is at a period of 48 hours which has a
response of about 5% amplitude, thus, aliasing is minimal. The primary tidal periods have a
response of less than 0.1% amplitude.
The filtering operation incorporates an objective procedure to handle gaps. This objective
technique simply replaces the filter weight at any missing observation with a zero and
renormalizes the sum of the modified weight function to unity. This technique is equivalent to
interpolating the missing observation with an estimate of the local mean of the time series. The
local mean is defined as the mean of a given segment of length equal to the length of the filter.
The error associated with this technique can be estimated objectively and is used as a criterion
for accepting or rejecting a daily value computed in an area of the time series which contains a
gap or gaps. This error depends on the ratio of the standard deviations of the input (hourly) and
the output (daily) data. Thus in order to keep the ratio low, it is essential to apply this technique
to the residual series as defined above.
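The gap handling can be illustrated with a short sketch. It is not the filter program: the actual 119-point weights are not reproduced here, and a simple Hann window of the same length stands in for them.

    # Minimal sketch of the gap handling used by the daily filter: the weight
    # at any missing hour is set to zero and the remaining weights are
    # renormalized to sum to one. A Hann window stands in for the real
    # 119-point filter weights, which are not reproduced here.
    import numpy as np

    MISSING = 9999

    def filtered_value(segment, weights):
        """Weighted average of one filter-length segment, ignoring gaps."""
        seg = np.asarray(segment, dtype=float)
        w = np.array(weights, dtype=float)
        w[seg == MISSING] = 0.0                 # zero the weight at each gap
        if w.sum() == 0.0:
            return MISSING
        w /= w.sum()                            # renormalize to unity
        return float(np.sum(np.where(seg == MISSING, 0.0, seg) * w))

    weights = np.hanning(119)                   # stand-in weight function
    segment = 1000.0 + 50.0 * np.random.randn(119)
    segment[60:66] = MISSING                    # a 6-hour gap near the center
    print(filtered_value(segment, weights))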
The monthly values are calculated from the daily data with a simple average of all the
daily values in a month. If seven or fewer values are missing, the monthly value is calculated.
The monthly value is stored in the file with the daily data and in a separate file as noted below.
6.2 Running the Filter Program
The input files for this program are in the UHSLC processing format (Appendix D.1). A
requirement of the filtering program is that each year of input have 12 consecutive months of
hourly data or missing data flags in the SLP64 processing format. If some months are missing in
a year, one may use program fillmonths.py in the /slp64/util directory (Section 2.4) to create
complete months of missing data flags, which can then be inserted by hand using a text editor
into the data file so that the file has twelve consecutive months.
The filter program will ask the user for the start year and number of years to be filtered.
This defines the time window of the data to be filtered. The program also asks if data prior to and
after the time window are available. This is because the filter is 5 days long; thus, the first
and last days of the time window cannot be calculated without data from before and after the
time window. For example, suppose one runs the program with a time window of 1982 to 1984. If
data (not missing data flags) begin on 1 January, 1982 and data are available for 30-31
December, 1981, then the program can automatically access this data and thus produce a filtered
daily value for 1 January, 1982. Note for the prior and post year files, the SLP64 processing
format must also be used with the same file version yet the file DOES NOT have to have 12
consecutive months (an exception to the requirement above).
To run the program, go to the /slp64/filt directory and type:
filthr.exe
The output of the program is in the UHSLC processing format for daily data (Appendix
D.2) and monthly data (Appendix D.3) and is placed in directory /slp64/dat. File names are
dVSSS.dat and mVSSS.dat (V:version, SSS:station number) for the daily and monthly data,
respectively. All years for a given station are kept in the same file.
The output daily and monthly file formats can be changed to a more contemporary format
(date plus one value per row) in CSV text using programs dsl2csv.exe and msl2csv.exe
(Section 2.6) in directory /slp64/util.
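For the monthly file the conversion amounts to reading the fixed columns of Appendix D.3 and writing one row per month. The sketch below is not msl2csv.exe; it only illustrates the idea, and the file names in the usage comment are hypothetical.

    # Minimal sketch of a monthly-file to CSV conversion (year,month,value per
    # row). This is NOT msl2csv.exe; it assumes the record layout of Appendix
    # D.3: station id/name in columns 1-10, year in 11-14, a fixed "M" in
    # column 15, then twelve 4-byte monthly values, with 9999 = missing.
    import csv

    def monthly_to_csv(in_path, out_path):
        with open(in_path) as fin, open(out_path, "w", newline="") as fout:
            writer = csv.writer(fout)
            writer.writerow(["year", "month", "sea_level"])
            for line in fin:
                if len(line) < 63 or line[14] != "M":   # skip header/terminator
                    continue
                year = int(line[10:14])
                for m in range(12):
                    value = int(line[15 + 4 * m: 19 + 4 * m])
                    writer.writerow([year, m + 1, value])

    # Example (hypothetical file names):
    # monthly_to_csv("/slp64/dat/ma003.dat", "/slp64/dat/ma003.csv")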
6.3 Examine the Output
Python/Matplotlib programs dplot.py and mplot.py (Section 3.4) on the /slp64/pyplot
directory are provided to plot the daily and monthly data, respectively.
The daily and monthly values are useful for studying the low frequency variability of sea
level. In addition, they provide another opportunity to judge the quality of the reference level
stability (Section 5.2). Large level shifts are clearly apparent in plots of daily and monthly values
as step-functions.
A common way to analyze oceanic variability or to judge the quality of the reference
levels is to compare one tide gauge with another gauge at the same or nearby site. Separate plot
programs, d2diff.py and m2diff.py can be found in /slp64/pyplot.
The programs simply prompt the user for the station numbers and file versions of the two
stations for the input data (daily or monthly) files. The program automatically displays the time
span of each station. One must choose an overlapping set of years. The program generates two
graphs per figure. Figure 12 shows the daily difference plots and Figure 13 the monthly
difference plots. The top graph depicts both series, which individually have had their average
subtracted (de-meaned). The bottom graph plots the difference.
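The layout of the figure can be sketched as follows; this is not d2diff.py or m2diff.py, and the two daily series in the example are synthetic.

    # Minimal sketch of the two-panel comparison: both series de-meaned on the
    # top panel, their difference on the bottom. NOT d2diff.py or m2diff.py.
    import numpy as np
    import matplotlib.pyplot as plt

    days = np.arange(365)
    a = 1000 + 80 * np.sin(2 * np.pi * days / 365) + 10 * np.random.randn(365)
    b = 1200 + 80 * np.sin(2 * np.pi * days / 365) + 10 * np.random.randn(365)

    fig, (top, bottom) = plt.subplots(2, 1, sharex=True)
    top.plot(days, a - a.mean(), label="station A (de-meaned)")
    top.plot(days, b - b.mean(), label="station B (de-meaned)")
    top.legend()
    bottom.plot(days, (a - a.mean()) - (b - b.mean()))
    bottom.set_xlabel("day")
    bottom.set_ylabel("difference")
    plt.show()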
If a step-function signature is seen in the difference, then there is a possibility that a
reference level shift occurred at one of the stations. One must return to the procedures discussed
previously (Section 5.2) to investigate and correct the problem.
7 Final Comment
The JASL Sea Level Processing and Quality Control Software package for 64-bit
Linux operating systems is available for enhancing the use and quality of sea level data.
More data centers and oceanographic institutes now have access to personal computers and the
JASL would like to share its software products and data processing techniques. The goal is to
support international capacity building and to enhance data quality. Data collectors are
encouraged to share their hourly sea level holdings with the Global Sea Level Observing System
(GLOSS). The JASL is a GLOSS archive. Your contributions are welcome. Any questions or
comments regarding this manual or software should be directed to:
Mr. Patrick Caldwell
Joint Archive for Sea Level
Dept. of Oceanography
University of Hawaii at Manoa
1000 Pope Rd. MSB 318
Honolulu, Hawaii 96822 USA
phone: 808-956-4105
fax: 808-956-2352
Internet: caldwell@hawaii.edu
Web: http://ilikai.soest.hawaii.edu/UHSLC/jasl.html
7.1 Acknowledgements
I acknowledge Mike Foreman of Fisheries and Oceans, Canada, Institute of Ocean
Sciences, for making his tidal analysis and prediction programs publicly available. Also, thanks
go out to Shikiko Nakahara, Bernie Kilonsky, and Brent Miyamoto of the University of Hawaii
Sea Level Center for their contributions to the development of some routines in this package.
Thanks also to Brent for his comments regarding calibration. For helpful examples in learning
Python/Matplotlib, thanks are given to Jules Hummon of the University of Hawaii. Appreciation
is addressed to Glen Rowe of Land Information New Zealand who provided edits to the SLP64
manual. The views expressed herein are those of the author and do not necessarily reflect the
view of NOAA or any of its sub-agencies.
7.2 References
Bloomfield, P. 1976. Fourier Analysis of Time Series: An Introduction. New York: John Wiley and
Sons. pp 129-137.
Caldwell, P., and B. Kilonsky, 1992. Data processing and quality control at the TOGA Sea Level
Center. Joint IAPSO-IOC Workshop on Sea Level Measurements and Quality Control, Paris, 12-13
October, 1992. IOC Workshop Report No. 81, UNESCO. pp. 122-135.
Caldwell, P. 1998. Sea Level Data Processing on IBM-PC Compatible Computers, Version 3.0
(Year 2000 Compliant). JIMAR Contribution No. 319, SOEST, University of Hawaii, 72 pp.
Caldwell, P. and M. Merrifield, 2013. Joint Archive for Sea Level: Annual Report 2013. JIMAR
Contribution No. 13-385, Data Report No. 23, SOEST, University of Hawaii, 73 pp.
Caldwell, P. 2014. Hourly sea level data processing and quality control software: Update for 64-bit
Microsoft operating systems. Version 4.0. October 2014. JIMAR Contribution No. 14-389,
SOEST, University of Hawaii, Honolulu, 72 pp. Online:
http://ilikai.soest.hawaii.edu/UHSLC/jasl/slp64/slp64.html
Foreman, M.G.G., 1977. Manual for Tidal Heights Analysis and Prediction. Pacific Marine Science
Report 77-10, Institute of Ocean Sciences, Patricia Bay, Sidney, B.C., 97 pp.
Hicks, D., P. Morris, H. Lippincott, and M. O'Hargan, 1987. User's Guide for the Installation of
Bench Marks and Leveling Requirements for Water Levels. National Ocean Service, National
Oceanic and Atmospheric Administration, U.S. Dept. of Commerce.
Intergovernmental Oceanographic Commission, 1992. Joint IAPSO-IOC Workshop on Sea Level
Measurements and Quality Control. Workshop Report No. 81. Paris, 12-13 October, 1992. page
16.
International TOGA Project Office, 1992. TOGA international implementation plan. Geneva,
Switzerland. Fourth Edition, 01 October, 1992. ITPO-No. 1, 73 pp.
Kilonsky, B. J. and P. Caldwell, 1991. In pursuit of high-quality sea level data. IEEE Oceans
Proceedings. Vol. 2, October 1-3, 1991.
Mitchum, G.T., B.J. Kilonsky, and B.M. Miyamoto, 1994. Methods for Maintaining a Stable
Datum in a Sea Level Monitoring System. IEEE Oceans Proceedings. 0-7803-2056-5, 1994.
UNESCO, 1993. IOC Training Course for the Global Sea Level Observing System (GLOSS) directed
to the African and South American Portuguese- and Spanish-speaking Countries. Sao Paulo,
Brazil, 1-19 February, 1993. IOC Training Course Reports No. 20.
UNESCO, 1995. IOC/GLOSS-GOOS Training Workshop on Sea-Level Data Analysis. Geodetic and
Research Branch, Survey of India. Dehra Dun, India. 21 November - 1 December, 1995. IOC
Training Course Reports No. 39, 17 p.
Wyrtki, K., 1979. The Response of Sea Surface Topography to the 1976 El Nino, J. Phys.
Oceanogr., 9, pp. 1223-1231.
Appendices
A
Summary of Upgrades from SLPR2 to SLP64 Pertinent to Linux Version
- Linux version: relative directory location
One can place the SLP64 package within any level of the directory tree. The paths within
the programs are relative to the SLP64 directory tree.
- Year 1900 Bug Fixed
In SLPR2, year 1900 was mistakenly taken as a leap year. The new SLP64 correctly identifies
1900 as not a leap year.
- Easier Conversion from Originator’s Format to SLP64
In SLP64, a set of Python programs is available to more easily convert hourly data into
the SLP64 hourly sea level processing format. These tools provide a means of converting the
CSV-formatted hourly files of the UHSLC data set and the tab-delimited text hourly sets of the
NOAA National Ocean Service COOPS Water Levels Data Access System to the SLP64 format.
- Multiple Consecutive Years from Single Execution
The Foreman Tidal Prediction task was executed one year at a time for SLPR2. For
SLP64, a Python program facilitates execution of multiple years. Similarly, the plotting
programs for the hourly observations or residuals can run over multiple consecutive years.
- Enhanced Plotting Programs
SLP64 uses Python/Matplotlib software for plotting. A new program was added for
comparing daily and monthly series between two stations. It creates two plots per page. The
first plot has the de-meaned daily or monthly series of two stations for overlapping years on the
same graph and the second plot depicts the difference.
B
Access to Package
Web site: http://ilikai.soest.hawaii.edu/UHSLC/jasl.html
Under Software, select the SLP64 Linux version.
C
Directory Tree
/slp64
|
|--/din        {data information files}
|
|--/dat        {data files}
|
|--/cal        {calibration data files}
|
|--/manual     {manual}
|
|--/filt       {filtering software}
|  |--/src     {source code}
|
|--/pyplot     {Python/Matplotlib plotting software}
|  |--/plots   {holds plots generated from pyPLOT programs}
|  |--/src     {ancillary Fortran programs}
|
|--/prd        {predicted tide files}
|
|--/qc         {quality control programs}
|  |--/src     {source code}
|
|--/samp       {sample data}
|
|--/tide
|  |---/ana    {tidal analysis}
|  |---/prd    {tidal prediction}
|  |---/harm   {harmonic constituents}
|  |---/src    {source code}
|
|--/util       {utility programs}
   |--/src     {source code}
D
Formats
D.1 Hourly Data Processing Format
Each observed hourly data file is given a name, CVSSSYY.dat, where
C: century (u:1800-1899, v:1900-1999, w:2000-2099)
V: file version (letters a-z)
SSS: station number
YY: the last two digits of the year
Example: va00385.dat is the observed hourly data file (version A) for
station 003 and year 1985.
The file contains header records, data records and terminator records.
Each month begins with a one-record header and ends with two terminator
records. Each data record contains one half day of data or 12 hourly values.
The time scheme convention centers the data on hours 0-11 and 12-23 for the
respective data records.
The header for each month is coded as:
field              bytes    comment
-----------------  -------  ---------------------------------------------
station id         1-3      station number - must be 3 digits
station name       4-14     station name - note special cases:
                   7-9      file type; PRD for predicted and RES for residual
latitude           19-26    latitude in degrees, minutes, tenths of a
                            minute, and hemisphere
longitude          33-41    longitude as above
time zone          50-55    time meridian reference for data,
                            i.e. GMT or 090W
reference offset   61-65    constant value that is subtracted from each
                            data point in the month in order for the data
                            to be relative to the station tide staff zero
time interval      67-68    time in minutes between each data point
month              70-72    first 3 characters of month, i.e. JAN
year               74-75    year without century indication, i.e. 78
units              77       scientific units of data as:
                            F : hundredths of feet (1121 = 11.21 ft)
                            M : integer millimeters (1121 = 1121.0 mm)
                            C : integer centimeters (1121 = 1121.0 cm)
# days in month    79-80    not a mandatory field
For example:

003BALTRA      LAT=00 26.8S LONG=090 17.2W TMZONE=GMT REF=00000   60 APR 85 M 30

(reading left to right: station id and name, latitude and longitude, time zone,
reference level offset, time interval between values, month, year, units, and
number of days in the month)
The data records are coded as:
field           bytes    comment
--------------  -------  ---------------------------------------------
station id      1-3      station number - must be 3 digits
station name    4-10     station name
year            12-15    year, i.e. 1988
month           16-17    numerical month, i.e. 12
day             18-19    numerical day, i.e. 31
record count    20       1 for 1st record; 2 for 2nd record of day
data values     21-80    Fortran format 12i5
For example:

003BALTRA 1985 4 11 1492 1668 1879 2054 2152 2161 2067 1890 1663 1445 1285 1218
003BALTRA 1985 4 12 1266 1431 1667 1921 2123 2235 2230 2097 1893 1650 1432 1299

(reading left to right: station id and name, year, month, day, record number,
and then the 12 hourly values)
An extraction from an hourly file for Baltra is shown below.
003BALTRA      LAT=00 26.8S LONG=090 17.2W TMZONE=GMT REF=00000   60 APR 85 M 30
003BALTRA 1985 4 11 1492 1668 1879 2054 2152 2161 2067 1890 1663 1445 1285 1218
003BALTRA 1985 4 12 1266 1431 1667 1921 2123 2235 2230 2097 1893 1650 1432 1299
003BALTRA 1985 4 21 1277 1385 1604 1868 2101 2268 2303 2201 1974 1681 1397 1185
003BALTRA 1985 4 22 1103 1180 1392 1695 2023 2295 2430 2401 2239 1971 1634 1337
003BALTRA 1985 4 31 1165 1141 1288 1581 1922 2236 2426 2451 2306 2027 1663 1303
003BALTRA 1985 4 32 1057  987 1112 1413 1813 2216 2533 2681 2604 2347 1961 1537
.
.
.
003BALTRA 1985 4291 1823 2014 2145 2191 2124 1964 1740 1519 1345 1260 1294 1432
003BALTRA 1985 4292 1649 1896 2110 2245 2274 2195 2020 1787 1561 1392 1333 1387
003BALTRA 1985 4301 1549 1773 1997 2166 2231 2183 2032 1796 1545 1341 1234 1252
003BALTRA 1985 4302 1396 1638 1930 2186 2359 2402 2314 2108 1828 1550 1341 1251
99999999999999999999999999999999999999999999999999999999999999999999999999999999
99999999999999999999999999999999999999999999999999999999999999999999999999999999
003BALTRA      LAT=00 26.8S LONG=090 17.2W TMZONE=GMT REF=00000   60 MAY 85 M 31
003BALTRA 1985 5 11 1301 1480 1742 2021 2242 2344 2302 2117 1840 1537 1278 1142
003BALTRA 1985 5 12 1161 1338 1638 1983 2302 2510 2555 2423 2147 1796 1456 1200
003BALTRA 1985 5 21 1093 1167 1405 1736 2076 2345 2460 2400 2173 1834 1470 1175
003BALTRA 1985 5 22 1024 1069 1307 1676 2094 2457 2679 2704 2524 2174 1736 1322
.
.
.
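For users who need to read these files from their own scripts, the following minimal sketch parses one data record using the byte positions given above. It is only an illustration (not part of the package), and the example record string is padded out to those byte positions.

    # Minimal sketch of parsing one SLP64 hourly data record (Appendix D.1):
    # station id in bytes 1-3, name in 4-10, year 12-15, month 16-17,
    # day 18-19, record count in byte 20, then 12 hourly values in 12i5.

    def parse_data_record(line):
        return {
            "station": line[0:3],
            "name": line[3:10].strip(),
            "year": int(line[11:15]),
            "month": int(line[15:17]),
            "day": int(line[17:19]),
            "half": int(line[19]),        # 1 = hours 0-11, 2 = hours 12-23
            "values": [int(line[20 + 5 * i:25 + 5 * i]) for i in range(12)],
        }

    record = ("003BALTRA  1985 4 11"
              " 1492 1668 1879 2054 2152 2161 2067 1890 1663 1445 1285 1218")
    print(parse_data_record(record))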
D.2 Daily Data Processing Format
The filename consists of the letter "d" followed by the file version and
the station number. For example, for station 027, file version a, the file
name is "da027.dat". All years for the station are in this file.
Header Record:
NNNxxxxxxxxxxxx Daily values 119-point filter
where NNN is the station number
xxxxxxxxxxxx is station name
The output data record format reserves the first eleven spaces for
station identification, the next four for the year of the data record, the
next two spaces for the month of the data record, the next space for a
sequential number that positions the record within the month.
The next space is a blank. The next 55 spaces contain the daily
filtered values in groups of five spaces per observation, 11 per record. The
data are in units of mm. The averages are centered on the 12th hour GMT of the
day. There are three records per month, and the last record contains the
monthly average in the 11th data group. Missing data are flagged by
four nines (9999). For example:
(reading left to right: station id, year, month, an indicator for the 1st, 2nd,
or 3rd part of the month, and then the daily values in mm; the last group of
the third record is the monthly average)

01PONAPE   1983 21   538  562  574  567  573  547  555  558  548  532  538
01PONAPE   1983 22   582  600  586  587  590  606  620  592  589  592  569
01PONAPE   1983 23   574  581  590  614  638  638                         580
                                                                             ^
                                                              monthly average
D.3 Monthly Data Processing Format
The filename consists of the letter "m" followed by the file version and
the station number. For example, for station 027, file version a, the file
name is "ma027.dat". All years for the station are in this file.
Header Record:
Consists of the station number, name, and starting year (bytes 55-58).
Example:

003Baltra                     MONTHLY MEANS           1985-

Data Records:
Contain a record header (station number and abbreviated name in columns 1-10),
year (columns 11-14), a fixed "M" in column 15, and 12 sea level values of
four bytes each in the remaining columns. Missing data are given by 9999.
Last Record: string of 9s.
Example (reading left to right: station id and abbreviated name, year, the
fixed "M", and then the 12 monthly values in mm):

003Baltr  1985M999999991785179317791801177017551771175517541746
003Baltr  1986M180417981802183318141842182817921813185218971831
9999999999999999999999999999999999999999999999999999999999999999999999999999999
D.4 Tide Staff-Gauge Pair: File Format
PAIR filenaming convention: xsssyymm.dat
where
x: fixed (always "x")
sss is station number, e.g. 003
yy is the year (last two digits, e.g. 78)
mm is the month
The PAIR file holds the tide staff readings (highs and lows) and
corresponding gauge observations.
Record 1: Header with station name and time span
Records 2 to N:

   byte     contents
   -------  -----------------------------------
   1-4      highest readings
   5-8      lowest readings
            (note: if only one reading, both are the same)
   9-13     gauge value

Example:

STATION 30 SANTA CRUZ          01 JUL - 01 AUG 1988
 250 250 1281
 330 330 1373
 .
 .
 570 570 1602
 360 360 1400
D.5 Calibration Log File
STATION = Santa Cruz, 003
GAUGE   = Fischer and Porter Analog-to-Digital Recorder (ADR)
TIDE DATUM = TIDE STAFF ZERO
TIDE GAUGE BENCH MARK = BM UH 1
AUXILIARY BENCH MARKS = BM UH 2, BM UH3, BM II, BM III, BM 4, BM 5,
                        BM INOCAR 1, BM WS2

                         MEAN ZERO REFERENCE LEVEL    NUMBER OF
DATES                    VALUE (units=feet*)          OBSERVATIONS
--------------------     -------------------          ------------
01 JAN - 01 FEB 1988     1036                         32
01 FEB - 01 MAR 1988     1037                         28
01 MAR - 01 APR 1988     1035                         26
01 APR - 01 MAY 1988     1036                         30
01 MAY - 01 JUN 1988     1037                         32
01 JUN - 01 JUL 1988     1035                         31
01 JUL - 01 AUG 1988     1036                         32

*units in feet to implied hundredths, i.e. 1036 = 10.36'.
(the numbers above are made up--this is only an example)
E
Station Information File
The station information file contains specifications for a given
station that can be utilized by application programs. Each recipient of the
UHSLC/NODC package has the liberty of using one’s own station numbers with
the limit of three digits per number.
The first record is a header to define the fields. The fields are:
field          bytes    comment
-------------  -------  -----------------------------------------
###            4-6      station number - must be exactly 3 digits
station name   8-27     station name - limit to 20 bytes
latitu         29-34    latitude in degrees and minutes
longitu        36-42    longitude in degrees and minutes
mer            44-46    hour meridian of station
S              48       time scheme (0:1-24 1:0-23) (Ignore)
strtdate       50-57    start date of station data history (Ignore)
country        59-73    country
P              75-75    plot factor (obsolete, ignore)

Example:

### station name     latitu  longitu  mer S strtdate country     P
001 Ponape           06 59N  158 14E  165 1 Apr 1974 Micronesia  2
002 Tarawa,Betio     01 22N  172 56E  180 1 May 1974 Kiribati    2
003 Baltra           00 27S  090 17W  090 1 Aug 1974 Ecuador     2
004 Nauru            00 32S  166 54E  165 1 May 1974 Nauru       2
F
Notes on Foreman Tidal Analysis Program
1) Interpretation of the output file of the tidal analysis program.
For each tidal component there are four columns (after the time span for analysis which is
repeated on every line) corresponding to A, G, AL, and GL. The last two are the fitted amplitude
(AL) and phase (GL) direct from the least squares analysis. The phase is relative to a reference
time origin at 0000 01 Jan 1976. The first two are the amplitude (A) and phase (G) after
applying the so-called nodal corrections (see next paragraph).
Applying nodal corrections allows the fitted components to be used further from the actual time
period used to fit the components. Each fitted component is in reality a sum of a major
component and several satellite components that are too nearby in frequency to resolve in the
least squares analysis. Therefore the AL and GL values do not accurately reflect the amplitude
and phase of the major component, but show a modulation due to the satellites. By assuming
that the ratio of each satellite to the major component is equal to the ratio of the corresponding
equilibrium components, it is possible to correct the
major component to obtain the amplitude and phase that would be fit if the satellites were
resolved.
2) Foreman updated his programs in 1992 to comply with change of century and to allow
analysis and prediction for previous centuries. Two main changes have been made: 1) the
calendar was extended for calculations outside of the 20th century, and 2) the position of the
moon and sun (i.e., the astronomical arguments) are now approximated by higher order
polynomials (in time) so that sufficient accuracy is maintained for dates well outside the 20th
century.
3) Doodson number conventions (from Geoff Lennon)
The Doodson numbers define the tidal component frequency and also determine the astronomical
argument. The astronomical argument is the phase of the equilibrium tide at some reference time
origin. Unfortunately, the Doodson numbers for a number of low frequency components and a
few high frequency ones are not the same as those used in other programs. This causes the fit to
be done at a slightly different frequency and with a different reference phase (the astronomical
argument). This will not affect prediction, but will cause disparities between our analysis and
those done at the IOS (UK) and the IHO. There is no simple fix for this situation since we do not
know exactly which components are affected and what the preferred Doodson numbers are for
each of them.
G Harmonic Constants: Analysis Output
003 Baltra                                0  00 26S 090 17W

 1 Z0     .00000000  0  785/ 786  178.6955     .00  178.6955     .00
 2 SA     .00011407  0  785/ 786    3.7802  109.26    3.7802  112.59
 3 SSA    .00022816  0  785/ 786     .3527  117.53     .3527  278.78
 4 MSM    .00130978  0  785/ 786     .7544   62.29     .7544  239.82
 5 MM     .00151215  0  785/ 786    1.0036   13.21    1.0036   99.16
 6 MSF    .00282193  0  785/ 786     .6430   25.45     .6430  288.92
 7 MF     .00305009  0  785/ 786    2.0005   32.02    2.0005   96.75
 8 ALP1   .03439657  0  785/ 786     .0856  107.06     .1100  214.71
 9 2Q1    .03570635  0  785/ 786     .1114  122.13     .1575   42.03
10 SIG1   .03590872  0  785/ 786     .0962  148.30     .1145  344.99
11 Q1     .03721850  0  785/ 786     .3442  170.27     .4370  179.94
12 RHO1   .03742087  0  785/ 786     .1030  150.67     .1169   74.11
13 O1     .03873065  0  785/ 786    1.0145   47.51    1.1655  148.29
14 TAU1   .03895881  0  785/ 786     .3383   24.71     .2187  113.63
15 BET1   .04004043  0  785/ 786     .0398  310.27     .0474   47.30
16 NO1    .04026859  0  785/ 786     .2466  344.70     .2599  311.91
17 CHI1   .04047097  0  785/ 786     .0600  339.19     .0698  159.43
18 PI1    .04143851  0  785/ 786     .1035   64.26     .1028   70.57
19 P1     .04155259  0  785/ 786    2.6462   51.36    2.6214   61.10
20 S1     .04166667  0  785/ 786    1.3144   16.16     .8941  349.24
21 K1     .04178075  0  785/ 786    7.8056   49.21    8.5533   44.58
22 PSI1   .04189482  0  785/ 786     .1121  202.07     .1138  196.65
23 PHI1   .04200891  0  785/ 786     .1356   40.90     .1264  192.38
24 THE1   .04309053  0  785/ 786     .1683   52.94     .1881  231.27
25 J1     .04329290  0  785/ 786     .5376   69.84     .8075  165.32
26 SO1    .04460268  0  785/ 786     .2183  167.59     .2512   66.73
27 OO1    .04483084  0  785/ 786     .6323   87.87     .9812  171.10
28 UPS1   .04634299  0  785/ 786     .1186  113.25     .2156  278.08
29 OQ2    .07597494  0  785/ 786     .1910  173.92     .1815   14.83
30 EPS2   .07617731  0  785/ 786     .5028  164.52     .4839  273.35
31 2N2    .07748710  0  785/ 786    1.7823  188.99    1.6968  115.44
32 MU2    .07768947  0  785/ 786    2.5555  193.17    2.4735   27.63
33 N2     .07899925  0  785/ 786   14.9349  217.48   14.4488  229.16
34 NU2    .07920162  0  785/ 786    3.0899  222.69    2.9874  142.91
35 H1     .08039733  0  785/ 786     .7249   86.90     .6907  358.86
36 M2     .08051140  0  785/ 786   70.0426  246.43   67.9384  344.16
37 H2     .08062547  0  785/ 786     .6465   79.72     .6352  180.32
38 MKS2   .08073957  0  785/ 786     .2083   84.68     .2550  353.46
39 LDA2   .08182118  0  785/ 786     .5082  269.73     .4899    5.34
40 L2     .08202355  0  785/ 786    1.5764  280.41    1.6704  303.40
41 T2     .08321926  0  785/ 786    1.3464  301.44    1.3464  298.11
42 S2     .08333334  0  785/ 786   18.7974  292.37   18.8300  292.29
43 R2     .08344740  0  785/ 786     .3483  284.44     .4250  102.53
44 K2     .08356149  0  785/ 786    4.9431  286.66    6.2274   97.63
45 MSN2   .08484548  0  785/ 786     .0401  202.46     .0377  288.43
46 ETA2   .08507364  0  785/ 786     .2544  289.95     .3487  189.60
47 MO3    .11924210  0  785/ 786     .0436  218.71     .0486   57.22
48 M3     .12076710  0  785/ 786     .1659  356.64     .1585  143.42
49 SO3    .12206400  0  785/ 786     .0158  274.13     .0182   14.83
50 MK3    .12229210  0  785/ 786     .0193  109.26     .0205  202.36
51 SK3    .12511410  0  785/ 786     .0507   14.73     .0557   10.01
52 MN4    .15951060  0  785/ 786     .0261  359.37     .0245  108.78
53 M4     .16102280  0  785/ 786     .0837   64.54     .0787  260.00
54 SN4    .16233260  0  785/ 786     .0304  322.51     .0295  334.11
55 MS4    .16384470  0  785/ 786     .0394  116.72     .0383  214.37
56 MK4    .16407290  0  785/ 786     .0116  100.18     .0142    8.89
57 S4     .16666670  0  785/ 786     .0614  209.11     .0616  208.95
58 SK4    .16689480  0  785/ 786     .0182  114.89     .0230  285.78
59 2MK5   .20280360  0  785/ 786     .0124  143.05     .0128  333.88
60 2SK5   .20844740  0  785/ 786     .0068  238.65     .0075  233.86
61 2MN6   .24002200  0  785/ 786     .0311  357.93     .0283  205.08
62 M6     .24153420  0  785/ 786     .0407  337.75     .0371  270.95
63 2MS6   .24435610  0  785/ 786     .0819   37.54     .0771  232.93
64 2MK6   .24458430  0  785/ 786     .0306   17.82     .0362   24.25
65 2SM6   .24717810  0  785/ 786     .0326   64.01     .0317  161.58
66 MSK6   .24740620  0  785/ 786     .0212  108.36     .0259   16.98
67 3MK7   .28331490  0  785/ 786     .0023  348.03     .0023  276.60
68 M8     .32204560  0  785/ 786     .0205  302.46     .0182  333.39
H Make Predictions other than Hourly
1) Edit tideprd.py. Just before tidep.exe is run (before the "run tide prediction" step),
place three quotes (''') and place another ''' at the end of that block. This changes all
of those commands into comments, which are not executed.
2) Run tideprd.py.
3) Edit TIDEDATA. On the last non-blank record, in the field just after EQUI, enter the
fraction of an hour (the default is hourly = 1.000; e.g., for 15 minutes change it to
0.25000). The file is read by Fortran, so be byte-specific (do not delete any zeros).
4) Now type tidep.exe, which runs the prediction.
The output will be in eqsprd.dat in the Foreman format.
Remember to remove the comment quotes (''') from tideprd.py afterwards.
I
Test for Magnitude of Timing Error
One way to determine the magnitude of a timing error is to compare observed
data to predicted tides. In separate windows for the same time span with the
suspect timing error, place the observed data and predicted files. For
example:
Window 1: Observed File, va00386.dat, 14-16 November, 1986

003BALTRA  198611141 1243 1389 1684 2055 2416 2683 2767 2645 2360 1983 1591 1274
003BALTRA  198611142 1120 1166 1399 1751 2131 2454 2637 2627 2437 2124 1761 1439
003BALTRA  198611151 1258 1269 1463 1793 2176 2523 2730 2748 2559 2211 1799 1405
003BALTRA  198611152 1129 1050 1177 1477 1867 2262 2564 2683 2604 2356 2002 1638
003BALTRA  198611161 1357 1227 1301 1547 1905 2291 2596 2733 2660 2402 2016 1596
003BALTRA  198611162 1239 1029 1026 1230 1573 1977 2351 2592 2651 2511 2209 1841

(Note the high tide at hour 20 of 11-16-1986: the value 2651 in record 198611162.)
Window 2: Predicted Tides File, pa00386.dat, 14-16 November, 1986

003Baltra  198611141 1277 1238 1384 1682 2059 2420 2675 2752 2627 2333 1939 1536
003Baltra  198611142 1219 1068 1120 1358 1718 2110 2439 2621 2608 2412 2088 1720
003Baltra  198611151 1402 1223 1235 1437 1778 2174 2525 2737 2747 2552 2199 1773
003Baltra  198611152 1372 1093 1010 1140 1446 1850 2254 2559 2685 2605 2348 1985
003Baltra  198611161 1607 1314 1190 1268 1528 1903 2300 2615 2759 2689 2423 2027
003Baltra  198611162 1592 1222 1011 1014 1226 1588 2014 2401 2651 2701 2547 2236

(Note the high tide at hour 21 of 11-16-1986: the value 2701 in record 198611162.)
As can be seen above, the observed high tide occurs one hour earlier than the predicted
high tide. Thus, one can apply the timing error correction technique to shift the observed
file for this time span by +1 hour.
J
Figures
Figure 1
Observed hourly data showing typical semi-diurnal tides.
Figure 2
Observed hourly data showing data gaps and suspicious spikes in December. In choosing a time
span for tidal analysis, it is best to choose 366 days of continuous gap-free, apparently good data.
In review of the previous plot for 1985 in addition to this plot, a suitable span for analysis would
be from 00Z April 1, 1985 to 23Z April 1, 1986.
Figure 3
Residuals are computed from observed minus predicted values. This plot shows smooth
residuals, which indicates a good tidal analysis and clean data. Minor phase shifts in time are
responsible for the small fluctuations during April and May.
Figure 4
Residuals revealing timing errors of varying magnitude, which are especially obvious in
November and December. Also, short gaps, data spikes, and a reference level shift are apparent
in December.
Figure 5
Scatter diagram for tide staff readings and tide gauge measurements. In this case, the pairs are
good except for one outlier due to a bad staff reading. This point will be removed from the pair
file prior to computing the mean difference.
Figure 6
Scatter diagram for tide staff readings and tide gauge measurements for case with clogged
stilling well.
Figure 7
Scatter diagram for tide staff readings and tide gauge measurements. In this case, a reference
level shift occurred due to resetting of the gauge zero during a site maintenance visit.
Figure 8
Residuals of hourly data for Kelang, Malaysia. At this site, the tides are highly complex due to
the shallow shelf and narrow basin configuration. The tidal analysis was unable to resolve many
of the higher frequency tidal species. The questionable fluctuations in the residuals are caused by
poor predicted tides. The observed data are of good quality.
Figure 9
Residuals based on tidal analysis of observed data from 12/1991-12/1992. The fluctuations are
due to poor tide predictions.
Figure 10
Residuals based on tidal analysis of observed data from 1/1995-1/1996. In this case, the residuals
are smoother because the tidal analysis is modelling the same time period as plotted above. It
appears that the complex tides at this site are not fully resolved by the tidal analysis; thus, the
quality of the predicted tides drops as the predicted year moves away from the analyzed year.
This site requires frequent updates of the harmonic constants.
Figure 11
The observed hourly data file of Figure 4 was shifted by positive one hour (Appendix I above)
using program TSALL. (The program shifts the entire year). The resultant residuals are shown
above. The time spans of the original data file without a timing error now have fluctuations, but
the time span of the suspect timing shift in November is now smooth.
Figure 12
Daily difference plots between version/station a076 (Taranaki) and r071(Wellington). The good
agreement suggests high data quality in the calibration of each series to its reference zero. There
is also interesting oceanic variability seen on various time scales in the difference plot.
Figure 13
Monthly difference plots between version/station a076 (Taranaki) and r071(Wellington).