SEACOOS Data Management Code Repository
Table of contents
This is a place to post your OOS-related code developments or links. I'm not sure of the best way to organize this; currently I'd say by individual, institution, or project/program.
Developed for use by SeaCOOS, but the functionality is general enough that the application and the source code are being made available for general use by the community.
The link to the earlier bulletin board post is still good, but I've added a smaller tar file here showing how I go from SQL query to dataset to graph.
The code at the above link is designed to be used stand-alone and/or as a web service to help transform the DODS client HTTP ASCII output (the 'Get ASCII' button) to something more column-oriented (CSV, comma-separated values). This can help the data be more easily viewed and understood, as well as fed into other analysis tools, web services, and databases. General and technical descriptions are available.
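The transformation described above can be sketched roughly as follows. This is a minimal illustration, not the actual tool: the real 'Get ASCII' layout varies by dataset and server, so here I assume a simple layout with one variable-name line followed by one comma-separated value line per block.

```python
import csv, io

# Hypothetical DODS 'Get ASCII' style output: a name line (with a [n]
# dimension suffix) followed by a line of comma-separated values.
# Real DODS output layouts differ by dataset; adjust the parsing to match.
dods_ascii = """\
time[3]
0, 3600, 7200
water_temp[3]
22.1, 22.3, 22.0
"""

def dods_to_csv(text):
    """Pivot row-oriented DODS ASCII blocks into column-oriented CSV."""
    lines = [l.strip() for l in text.splitlines() if l.strip()]
    names, columns = [], []
    for name_line, value_line in zip(lines[0::2], lines[1::2]):
        names.append(name_line.split('[')[0])        # drop the [n] suffix
        columns.append([v.strip() for v in value_line.split(',')])
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(names)                           # header row
    writer.writerows(zip(*columns))                  # transpose to columns
    return out.getvalue()

print(dods_to_csv(dods_ascii))
```

The same column-oriented result can then be loaded directly into spreadsheets or databases.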
A basic HTML form processed by PHP, with mail notification and appending to corresponding HTML and CSV documents. The zip file should contain all the files; change the form fields, the personnel notified by mail, and the site references as needed. Article reference: http://www.thesitewizard.com/archive/feedbackphp.shtml
Some basic scripts showing the processing of platform data streams (read directly via telnet) to populate the relational database, plus some other side products.
Some basic scripts showing the process of screen/web scraping to create SQL statements for populating a relational database.
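The scrape-to-SQL step can be sketched like this. The HTML row format and the table/column names here are made up for illustration; the real pages and schema will differ, so the regex and INSERT shape would need adjusting.

```python
import re

# Hypothetical scraped HTML rows (real pages differ; adapt the regex).
html = """
<tr><td>CAP2</td><td>2006-01-15 12:00</td><td>21.4</td></tr>
<tr><td>FRP2</td><td>2006-01-15 12:00</td><td>19.8</td></tr>
"""

def rows_to_sql(page, table="water_temperature"):
    """Turn scraped <td> triples into SQL INSERT statements."""
    pattern = re.compile(
        r"<tr><td>(\w+)</td><td>([^<]+)</td><td>([\d.]+)</td></tr>")
    statements = []
    for platform, stamp, value in pattern.findall(page):
        statements.append(
            "INSERT INTO %s (platform_id, measurement_date, value) "
            "VALUES ('%s', '%s', %s);" % (table, platform, stamp, value))
    return statements

for stmt in rows_to_sql(html):
    print(stmt)
```

The emitted statements can be piped straight into the database client, which keeps the scraper decoupled from any particular database driver.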
Generating netCDF files for Seacoos aggregation
This Perl script takes a platform_id and measurement_date, which guide the query against the previously populated database rows for that measurement timestamp. The returned query values are substituted into a netCDF text template containing substitutable macros. ncgen is then run to produce a timestamp-prefixed 'latest.nc' file.
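The template-substitution step described above looks roughly like this. The CDL template, macro names, and file naming are illustrative assumptions, not the actual script; the real Perl version queries the database for the values first.

```python
import subprocess

# A trimmed CDL template with a substitutable macro; the real template
# and macro names are assumptions for this sketch.
cdl_template = """netcdf latest {
dimensions: time = 1 ;
variables: float water_temp(time) ;
data: water_temp = @WATER_TEMP@ ;
}
"""

def fill_template(template, values):
    """Substitute @MACRO@ placeholders with values returned by the query."""
    for macro, value in values.items():
        template = template.replace("@%s@" % macro, str(value))
    return template

def write_latest(values, stamp):
    """Write the filled CDL and run ncgen to produce <stamp>_latest.nc."""
    cdl_name = "%s_latest.cdl" % stamp
    with open(cdl_name, "w") as f:
        f.write(fill_template(cdl_template, values))
    # ncgen compiles the CDL text into a binary netCDF file
    subprocess.run(["ncgen", "-o", "%s_latest.nc" % stamp, cdl_name],
                   check=True)

filled = fill_template(cdl_template, {"WATER_TEMP": 21.4})
print(filled)
```

Keeping the query, the template fill, and the ncgen call as separate steps makes each piece easy to test on its own.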
Similar process to the previous example, but with the input data screen-scraped from NDBC.
Redeveloped some initial scripts authored by Peter Bense into the following bits of shell and Perl code to get a better idea of the load on our server and when things might need more immediate attention.
NDBC Message Encoding for FM64, FM13 type messages
I use the attached Perl scripts (ndbc.pl, ndbc_fm13.pl), which take the data we get from the platforms and convert them to the required ASCII message format. The scripts then call the NDBC-supplied encoder function to encode the message into a binary format, which is FTP'd to a Carocoops directory.
I'm also attaching the two sample message formats we use with NDBC. FM13 is the message format for weather observations and surface water temperature, while FM64 is a newer message format for ADCP current information at different bin depths, plus another three variables for depth, temperature, and salinity.
Rana wrote the original 'ndbc.pl' script, which runs a query against our database to populate an FM64-type message. This was back when we didn't have any weather data coming in from our buoys and I had tables set up by platform instead of by variable. This script could be adapted to take the necessary data arguments from the command line or a file instead of from a database query.
The script ndbc_fm13.pl is a second script I wrote when the weather instrumentation was added to our buoys. It populates the FM13-type message from the data arguments supplied when it is called.
Some sample code demonstrating how data returned from the WFS URL request for the latest station data is processed and merged into an HTML web page.
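A minimal sketch of that processing step, assuming a heavily trimmed feature-collection response: real WFS GetFeature responses carry XML namespaces and many more properties, and the element names here (Station, platform_id, water_temp) are made up for illustration.

```python
import xml.etree.ElementTree as ET

# Hypothetical trimmed WFS GetFeature response (real ones are namespaced).
wfs_response = """<FeatureCollection>
  <featureMember><Station>
    <platform_id>CAP2</platform_id><water_temp>21.4</water_temp>
  </Station></featureMember>
  <featureMember><Station>
    <platform_id>FRP2</platform_id><water_temp>19.8</water_temp>
  </Station></featureMember>
</FeatureCollection>"""

def stations_to_html(xml_text):
    """Merge the latest station readings into HTML table rows."""
    root = ET.fromstring(xml_text)
    rows = []
    for station in root.iter("Station"):
        rows.append("<tr><td>%s</td><td>%s</td></tr>" % (
            station.findtext("platform_id"),
            station.findtext("water_temp")))
    return "\n".join(rows)

print(stations_to_html(wfs_response))
```

The returned rows can then be spliced into a page template server-side.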
The code at the above link is designed to be used stand-alone and/or as a web service to allow data providers with column-oriented ASCII data to create a descriptive Seacoos-convention XML document, which can then be used to transform those datasets into Seacoos-compatible netCDF files for aggregation or archival. The XML document can also be used by other groups or software in other aggregation or processing efforts.
Recently got into a discussion with a data provider about DODS servers for ASCII data. Researched the DODS FreeForm and JGOFS servers a bit and ended up trying the FreeForm server install for Windows. Posting some notes below on the install process. The only sticking point seems to be that I believe both servers prefer the ASCII data columns rigidly fixed in their layout, so CSV or other similarly delimited files are problematic. My suggestion, in the absence of a DODS/OPeNDAP-developed fix, is to use or develop tools/services that filter from delimited to fixed-column format.
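The delimited-to-fixed-column filter suggested above is a small amount of code. A minimal sketch, assuming a uniform column width is acceptable (field names and the width are illustrative):

```python
def csv_to_fixed(lines, width=12):
    """Pad each comma-delimited field to a fixed column width so servers
    that read columns positionally (FreeForm/JGOFS style) can parse them."""
    out = []
    for line in lines:
        fields = [f.strip() for f in line.split(",")]
        out.append("".join(f.ljust(width) for f in fields).rstrip())
    return out

sample = ["station,temp,salinity", "CAP2,21.4,35.1"]
for row in csv_to_fixed(sample):
    print(row)
```

A production version would also need to handle quoted fields and fields longer than the chosen width, but the core idea is just positional padding.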
Lately (January 2006) I've been rethinking some of the table-structure issues. While the one-table-per-observation approach has been working fine, my temptation is to collapse these similarly structured observation tables into one mega-table with an extra index of observation type. The hoped-for advantage is easier code and database maintenance, since fewer individual table references are involved; the disadvantage is that a single table can become an all-or-nothing scenario when performance or problems arise at the database-table level.
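The mega-table idea can be sketched in SQL. Column names and the index choice here are assumptions for illustration (shown via SQLite for a self-contained example; the real database and schema will differ):

```python
import sqlite3

# Sketch of a single 'mega-table' with an observation-type discriminator,
# replacing one table per observation type. Names are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE multi_obs (
        platform_id      TEXT NOT NULL,
        measurement_date TEXT NOT NULL,
        obs_type         TEXT NOT NULL,   -- e.g. 'water_temp', 'wind_speed'
        value            REAL
    )""")
# The extra index on observation type keeps per-type queries fast
conn.execute(
    "CREATE INDEX idx_obs_type ON multi_obs (obs_type, measurement_date)")
conn.executemany(
    "INSERT INTO multi_obs VALUES (?, ?, ?, ?)",
    [("CAP2", "2006-01-15 12:00", "water_temp", 21.4),
     ("CAP2", "2006-01-15 12:00", "wind_speed", 5.2)])
# One query shape now serves every observation type
rows = conn.execute(
    "SELECT value FROM multi_obs WHERE obs_type = 'water_temp'").fetchall()
print(rows)
```

The single query shape is where the code-maintenance saving comes from; the trade-off, as noted above, is that every observation type now shares one table's fate.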
I've been trying to reduce and simplify what we're doing in terms of data collection and sharing for ocean observations down to a single set of relational database tables and scripts. The name I'm choosing for this effort is 'Xenia', after a kind of coral found in many marine aquarium collections that waves through the water with many palms or hands.
This wiki page describes a simple observation catalog (ObsCatalog) and an initial Google Earth visualization product, created by transforming the input data to KML.
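The data-to-KML transformation amounts to wrapping each observation row in a Placemark. A minimal sketch, with made-up station coordinates and values (the real product carries more styling and description content):

```python
# Illustrative observation rows: (station, lon, lat, water_temp).
observations = [("CAP2", -79.34, 32.80, 21.4),
                ("FRP2", -78.92, 33.84, 19.8)]

def to_kml(obs):
    """Wrap each observation row in a KML Placemark for Google Earth."""
    placemarks = []
    for name, lon, lat, temp in obs:
        placemarks.append(
            "<Placemark><name>%s</name>"
            "<description>water_temp: %s</description>"
            "<Point><coordinates>%s,%s</coordinates></Point>"
            "</Placemark>" % (name, temp, lon, lat))
    return ('<?xml version="1.0" encoding="UTF-8"?>'
            '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>'
            + "".join(placemarks) + "</Document></kml>")

print(to_kml(observations))
```

Note that KML coordinates go longitude first, then latitude, which is easy to get backwards.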
Caro-COOPS is supporting the development of unique data management solutions for coastal ocean observing systems. For example, the "Meta-Door" second-generation online metadata entry tool is now available via the link below. This tool evolved from the "Cast-Net" initiative of the Southern Association of Marine Laboratories (SAML) and the Southeastern Universities Research Association (SURA), with funding from NSF EPSCoR. Development of the Cast-Net tool resulted from a cooperative effort among five regional sites: the Baruch Institute for Marine and Coastal Sciences at the University of South Carolina, the Dauphin Island Sea Lab in Alabama, the Louisiana Universities Marine Consortium, the Skidaway Institute of Oceanography of the University System of Georgia, and the University of Southern Mississippi.
The Cast-Net metadata tool was developed to provide a convenient and user-friendly mechanism for generating FGDC-compliant metadata. Increased and broad-based use of Cast-Net demonstrated the need for enhanced capabilities and more comprehensive capacities for data and system documentation. Meta-Door is a complete regeneration of the Cast-Net metadata tool designed to handle the needs of a broad range of contemporary users. Meta-Door resembles Cast-Net in appearance and functionality while providing an enhanced user-driven file management system and an easier, more reliable application interface. It has been completely re-programmed to work with a relational database that will eventually allow for more sophisticated indexing services. Future versions of Meta-Door will allow users to incorporate additional metadata standards as new standards emerge and become widely used in the coastal ocean observing communities.
Meta-Door has been rebuilt from the ground up with Java Struts to be database- and OS-independent. New features include:
- User administration with simplified registration and profile management
- Additional metadata formatting options for review
- Option to remove published metadata from ISite
- Lightweight GIS for input of Geo-location information
- User interface improvements, including collapsible form layout
Users may register and use the Meta-Door system for entering their metadata records at the following link.
Meta-Door demos and source code download