JCNotes - Jeremy Cothran (Main.JCNotes r1.83 - 19 Jan 2009)

This page is just a compilation of some notes and links to get some better communication going.

December 1, 2004

Generally I'd like to see us move in more of an XML-ish direction with regard to data models/schemas and documentation that is programmatically complete. See the links below for some future OGC specs to focus on - it's particularly helpful to start with the UML diagrams and read outward from there. We'll be trying to capture aspects of that in our database, and hopefully Metadoor will be a tool for capturing metadata and normalizing/exporting it to other formats for further tool or application use. There's going to be some overlap between Metadoor, Sensor Inventory and OGC efforts. My current idea is that MetaDoor can focus on initial metadata capture, sensor inventory can be a kind of customized query builder or application interface for collected data (either accessing the database directly or via xml/web service type access), and OGC will be focused at a much broader level, helping promote protocol or data specs we can further leverage off of. Attaching a database schema (might need to magnify to make out names) of the current metadoor database tables below.

General Sensor Web article

SensorML website

SensorML listserv (just started)

SensorML spec

Note that the two specs below are intended to be used for the data itself - we'll probably try to support both, with the O&M spec being a more general superset of MarineXML.

Observations & Measurements spec

MarineXML schema

XML Direction

Reading through the OGC specs and MarineXML specs, what I think will help our efforts better line up with the outside community is better knowledge and use of XML infrastructure for programmatic reference and documentation (XSD (XML Schema Definition) for data description, XML for the data itself). We will continue to use netCDF and relational databases for storage and application optimization needs, but for data interchange (data discovery and access) we need to have a parallel XML track. The OGC WMS and WFS services Charlton has set up partly satisfy this. See also RSSWeather OGCInfo

I'd like to keep an XML version of our seacoos data dictionary, but I'm still fuzzy on the correct way of doing this. The SensorML site has a general version listed, but I'm guessing we'll add a lot more attribute or element fields, as well as whatever tags need to be in place to allow this to be pulled into one or more larger ontologies.

I'd like to have an XML version of the seacoos netCDF standard, say in two sections or files covering the metadata and data. Initially I started with the idea of a 'flatfile version', trying to simplify down some from netCDF, but I think an XML version wouldn't be too far of an offshoot. Either version would probably require tools to get people using it without being familiar with it, while an XML version would hopefully be better at being self-documenting.

Others have also mentioned the need to keep data and metadata documentation physically together as part of the same file rather than linked, which sounds good to me as long as file size or readability doesn't become an issue.

I'm still fuzzy on ontologies (crosswalks between data vocabularies) and putting all the xml components together across systems, but hopefully that'll become clearer over time and with use.

XML Links

XML Schema Tutorial

Using W3C XML Schema

(Excuse the surrounding page advertisements, feel free to add other sites here)

Exporting Access schema using XML
(didn't realize Access could do this, useful for getting a handle on going from an existing database to the XSD, XML representation)

Excellent xml starting descriptions/links

Comparing ontologies and XML Schemas(2001)

Seacoos netCDF convention

The current documentation looks good to me (thanks Sara!) and I'll try to add an attachment in the next few days describing some addenda regarding extra attributes for quality control. Liz's suggested attributes work for me; I'd also like to add a few other NDBC-ish type things like internal consistency (for redundant sensors) and more descriptive flagging. Also an issue within the quality control and data dictionary is the need to specify regional lookup tables (see attachment below) which apply to the definitions or qc. I'd like to convert these lookup tables to an XML representation and have them referenced as a document attribute (similar to the data dictionary) within the netCDF. This would be similar to namespace declarations at the top of an xml file, and we could probably parallel that in the netCDF file by listing all the namespaces within the 'Conventions' attribute (this follows the unidata recommendations at ).


Water level is something we could probably move forward on, although I'm still unsure of how benchmarks (or lack thereof) affect what we collect. How will this be presented as a product? One version as an unbenchmarked reference and another for each benchmark of interest? CSC has a USGS contact they are working with as well whom we'll probably want to pull in for discussion. IOOS is actively pinging Charlton for water level data also.

MMI (Marine Metadata Initiative) is asking questions about water currents (ADCP, etc) and chlorophyll - their data representation and collection. I don't know if the groups they deal with have existing data dictionaries or data formats already under consideration, or whether they'd want to adopt whatever we plan to do.

In general I'd like to think that we could go ahead and start experimentally collecting some of these data, figuring out what additional attribute exceptions need to be captured as we go. Questions will probably continually linger about the ongoing cost of collection and value presented in the quality of the aggregations.

Also talked with Charlie Barans about his fishcam film clips and mentioned to Charlton that if Charlie could provide timestamped, geolocated metadata and a URL, Charlton might be able to add a 'video' or 'camera' data layer which would provide a URL link to those clips within a given time window. This could be a model for other binary types of data like video and audio which don't really fit into our current in situ variable or raster image scheme.

DODS/OPeNDAP interfaces

We provide access to our datasets with both the netCDF and relational database servers here at USC, but most of the data transfer focus has been through the GIS or OGC WMS/WFS protocols. Several folks I talked with at the seacoos conference mentioned issues they were having with their DODS servers - I don't have very good answers, but I wanted to list them in case others want to chime in.

  • For modelers or others retrieving a large resultset, there may (if memory serves) be an artificial initial limit on the size of the resultset at 3000 records. This should be changeable within the DODS configuration files.
  • Xiaoyan seemed to notice problems with some of the returned resultsets using different comparison operators (<, >, etc); not sure if this is documented on the DODS boards or what exactly the problem is
  • Trent was looking for an existing program/process (from the DODS group or otherwise) to convert/transform the row-oriented http return from a DODS query into a row/column resultset. For example, instead of a station listing one long string of values for temperature and another long string for salinity, you'd get a separate row per timestamp including both (a rough sketch of such a pivot is below).
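
Here's a rough perl sketch of the kind of pivot Trent described - this assumes each variable already comes back as its own (timestamp, value) series, so the variable names, values and input handling below are only illustrative and not the actual DODS return format:

use strict;

# illustrative input: one (timestamp, value) series per variable
my %series = (
    temperature => { '2004-11-22 15:00' => 27.00, '2004-11-22 16:00' => 27.23 },
    salinity    => { '2004-11-22 15:00' => 28.11, '2004-11-22 16:00' => 28.17 },
);

my @vars = sort keys %series;
my %timestamps = map { %{ $series{$_} } } @vars;   # collect every timestamp seen in any series

# print one comma-separated row per timestamp with a column per variable
print join(',', 'time', @vars), "\n";
foreach my $t (sort keys %timestamps) {
    print join(',', $t, map { defined $series{$_}{$t} ? $series{$_}{$t} : '' } @vars), "\n";
}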


There are probably some other points I'm forgetting; I'll tweak this page as I remember them. I'd like to use our and OGC data standards, protocols and tools to help formalize what start as (necessarily) experimental processes. From my view there are 3 levels our products go through: the first being our initial aggregation and technical review, the second being the seacoos community review, and the third being a general community review. Each of these levels brings different elements into play and requires changes in emphasis and support as they are addressed.

Links of interest


Protégé is an ontology editor and a knowledge-base editor.

Protégé is also an open-source, Java tool that provides an extensible architecture for the creation of customized knowledge-based applications.

Protégé's OWL Plug-in now provides support for editing Semantic Web ontologies.

EMS PostgreSQL Manager

(database GUI manager, what we're currently using and would recommend for around $200 per license)

EMS PostgreSQL Manager is a powerful graphical tool for PostgreSQL administration and development. It makes creating and editing PostgreSQL database objects easy and fast, and allows you to run SQL scripts, manage users and their privileges, build SQL queries visually, extract, print and search metadata, export data to 14 available formats and import them from most popular formats, view and edit BLOB fields, and more.

December 9, 2004

Adding the attachment below regarding my responses to some Marine Metadata Initiative questions.

January 18, 2005

Adding the attachments here (Creating and Publishing Metadata in Support of Geospatial One-Stop and the NSDI) and here, provided in response to a query about supplying metadata to the portal (note that the portal can currently harvest metadata from ArcIMS, z39.50, OAI-PMH or Web folders). Thought it was helpful in explaining some of the metadata harvesting processes as well as the ISO 19115 metadata keywords.

February 1, 2005

Seacoos data import flatfile format (xml metadata header + column oriented data)

To accommodate other potential data providers who may wish to provide their data for aggregation, but without the effort of utilizing netCDF formats or tools, a secondary ‘flatfile’ format is specified which might be more convenient or expedient for data providers or aggregators to use.

The example below, for the fixed-point data format category, describes an xml metadata header file which applies to the data (.dat) files sharing the same filename prefix. I'd like to provide similar examples for the other 5 data format categories (fixed-profiler, fixed-map, moving-point-2D, moving-point-3D and moving-profiler). The header file is a derivation of the existing seacoos netCDF header file, and the xml elements and attributes could mirror netCDF development.

For data providers with Excel/column oriented data, a web form based tool could be used to help create the xml header file. This would be associated with data upload streams using a similar filename prefix.

The metadata header file could tie into lookup tables on meta-door (or other similarly designed databases) for platform, sensor_group, sensor, observed variables (standard_names) and units. The header file being in xml has the advantage that an xml schema and xml namespaces could be developed for documentation, validation and exchange.

The following website was used to generate the attached xml schema for the sample xml metadata header document.

Data uploaded could be organized as separate tables by observation type or as a singular table with an observation type index. It would probably be good to split either implementation into recent and archived tables so that queries on the current or recent state respond faster and re-indexing of the archived observations can be scheduled/buffered separately. Table design and spatial indexing to accommodate GIS presentation is also a consideration.
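
A minimal sketch of that recent/archived split using perl DBI against PostgreSQL - the database, table and column names here are only illustrative, not a settled schema:

use strict;
use DBI;

# assumes *_recent and *_archive tables with identical structure
my $dbh = DBI->connect('dbi:Pg:dbname=seacoos_obs', '', '', { RaiseError => 1 });

# copy observations older than 30 days into the archive table, then trim the recent table
$dbh->do(q{insert into obs_archive select * from obs_recent where obs_time < now() - interval '30 days'});
$dbh->do(q{delete from obs_recent where obs_time < now() - interval '30 days'});

$dbh->disconnect;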


Example xml header file

<?xml version="1.0" encoding="UTF-8"?>
<!-- Format for header filename is <provider_id>_<platform_id>_<sensor_group_type>_hdr.xml

     This metadata header file applies to all submitted data files (.dat) with the same <provider_id>_<platform_id>_<sensor_group_type> prefix
     Valid filename suffixes for .dat files are _<iso8601_style_format_datetime_of_last_measurement>.dat or _latest.dat
     For example, the metadata header file carocoops_buoy6_weatherpak_hdr.xml would apply against any data files sharing that prefix. -->

<!-- For standard_name and units elements reference similar columns in the seacoos draft data dictionary at

    Units are the same as the units supported by udunits, listed at -->

        <!-- conventions listing may evolve to an xml schema namespace reference, for now represents expected elements and possible values -->
        <!-- repeatable element -->

        <!-- format_category list [fixed-point,fixed-profiler,fixed-map,moving-point-2D,moving-point-3D,moving-profiler] -->


        <!-- longitude +/-180.0 degrees west -->

        <!-- latitude +/-90.0 degrees east -->

        <!-- unique descriptive reference to platform_id, could be autogenerated from platform lookup table using format <provider_id>_<platform_id> -->

        <!-- aggregate_quality_level list [0,1,2,3,-9] where 0=quality not evaluated, 1=bad, 2=questionable/suspect, 3=good, -9=missing values -->

        <!-- url listings for further qc documentation -->
        <!-- repeatable element -->


                <reference>mean sea level (MSL)</reference>

                <!-- positive list [up,down] -->

                <!-- see 'units' column in seacoos data dictionary, same as udunits listing -->


        <!-- repeatable element listed in same order as data file columns-->

                <!-- see 'standard name' column in seacoos data dictionary -->

                <!-- see 'units' column in seacoos data dictionary, same as udunits listing -->
                <units>m s-1</units>

                <!-- height above or depth below z reference point for this observation -->

                <!-- unique descriptive reference to sensor_id, could be autogenerated from sensor lookup table using format <sensor_type>_<sensor_id> -->
                <!-- qcflag element optional: see notes at -->


                <units>degrees Celsius</units>

                <units>degrees Celsius</units>


Example data file

Note that the time (measurement time) field is traditionally listed first in an ISO 8601 format (see ) including the time zone, but not requiring a 'T' character to indicate the beginning of the time element (2003-11-11 20:00:00-04 for example).

2004-11-22 15:00:00-4,9.8,'O:R:C:G',103.6,'O:R:C:G',27.00,'O:R:C:G',28.11,'O:R:C:G'
2004-11-22 16:00:00-4,12.63,'O:R:C:G',112.7,'O:R:C:G',27.23,'O:R:C:G',28.17,'O:R:C:G'
2004-11-22 17:00:00-4,16.52,'O:R:C:G',105.0,'O:R:C:G',27.05,'O:R:C:G',28.17,'O:R:C:G'
2004-11-22 18:00:00-4,18.97,'O:R:C:G',88.8,'O:R:C:G',26.97,'O:R:C:G',28.35,'O:R:C:G'

Update (12 May 2005): I've moved forward on the format of a Seacoos XML for fixed-point platforms and the resulting code, services and documentation are at


Example xml header file

Example data file

February 10, 2005

Here's the link to some of the powerpoint presentations given at the recent LTER (Long Term Ecological Research) meeting I attended in Albuquerque, NM this week (about 20 attendees). I was the only one there focused on oceans and near real-time data as applied initially to coastal hazards; most of the folks there are focused, domain-wise, on long-term archival and reference issues concerning individual or institutional ecological research (as could be guessed from the LTER acronym). Technically they are focused on the use of EML (a format), Metacat (a registry) and Morpho (an entry tool), on bringing their registries in line/cross-walked with other existing registries like FGDC, NBII, etc, and on the use of controlled vocabularies (GCMD). I had a chance to share some links to our existing ocean observing work, metadata, tools and documentation with several members of the group who were interested.

The presentation labeled 'mad.ppt' I thought was fairly informative on pages 3-5(the powerpoint animates page 3, with acronym descriptions on page 5). The graph is fairly intimidating, but there's a lot of stuff out there. And there are some jokes on page 8 :)

Also of interest:

Kepler, which is a java based, component oriented platform for modeling scientific workflows

NEON (National Ecological Observatory Network) - terrestrial biologically oriented observation network

March 24, 2005

Below are some new seacoos data dictionary suggestions I'm putting up for feedback from the group. I'm proposing these because they cover the full range of observations I currently collect via the Caro-COOPS array and from working with George Voulgaris and his ADCPs, which also include wave measurements.

The udunits listing that I'm using as a lookup is at

Added code to the repository showing how I'm currently generating latest netCDF files for aggregation here.

corrected standard names

significant_wave_to_direction - corrected a spelling error in the standard name from 'signficant' to 'significant'. Added definition - average wave to direction for the highest one-third of the waves.

significant_wave_height - corrected a spelling error in the standard name from 'signficant' to 'significant'

relative humidity - corrected a spelling error in the units from 'precent' to 'percent'

visibility - changed units from m (meters) to nautical_miles (listed in udunits)

new standard names

wave_peak_period - (definition from peak period, in seconds; inverse of the frequency with the highest energy in the reported spectrum.

wave_peak_direction - (definition from mean direction from which energy is coming at the peak period, in degrees clockwise from true North.

wave_maximum_height - statistically, the maximum wave height in the record is Hmax ~ 9*Hs/5 where Hs represents the significant wave height. Units meters.

wave_average_period - average period, in seconds; derived from the zeroth moment divided by the first moment of the reported energy spectrum.

solar_radiation - sunlight measured in watts per meter squared(watt/meter2)

  • note that udunits lists the 'langley' which is joule/meter2, but not watt/meter2

Note for the following I'm using sea_surface_ and sea_bottom_ prefixes with surface and bottom both being within 3 meters of those boundaries.

sea_surface_salinity - Measure of salt content of a seawater sample follows UNESCO standards; approximately parts per thousand within 3 meters of the sea surface.

sea_bottom_salinity - Measure of salt content of a seawater sample follows UNESCO standards; approximately parts per thousand within 3 meters of the sea floor.

sea_surface_temperature - in situ temperature of the ocean within 3 meters of the sea surface. Units degree_Celsius.

sea_bottom_temperature - in situ temperature of the ocean within 3 meters of the sea floor. Units degree_Celsius.

sea_surface_chl_concentration - concentration of chlorophyll-a in a defined volume of water within 3 meters of the sea surface. Units ug L-1.

sea_surface_current_speed - Magnitude of velocity of ocean current within 3 meters of the sea surface. Units cm s-1.

sea_bottom_current_speed - Magnitude of velocity of ocean current within 3 meters of the sea floor. Units cm s-1.

sea_surface_current_to_direction - Direction towards which ocean current is going within 3 meters of the sea surface. Units degrees.

sea_bottom_current_to_direction - Direction towards which ocean current is going within 3 meters of the sea floor. Units degrees.

sea_surface_northward_current - North/South component of ocean current within 3 meters of the sea surface. Northward is positive. Units cm s-1.

sea_bottom_northward_current - North/South component of ocean current within 3 meters of the sea floor. Northward is positive. Units cm s-1.

sea_surface_eastward_current - East/West component of ocean current within 3 meters of the sea surface. Eastward is positive. Units cm s-1.

sea_bottom_eastward_current - East/West component of ocean current within 3 meters of the sea floor. Eastward is positive. Units cm s-1.

water_column_average_current_speed - average magnitude of velocity of ocean current for a given water column measurement. Units cm s-1.

water_column_average_current_to_direction - average direction towards which ocean current is going for a given water column measurement. Units degrees.

water_column_average_northward_current - average North/South component of ocean current for a given water column measurement. Northward is positive. Units cm s-1.

water_column_average_eastward_current - average East/West component of ocean current for a given water column measurement. Eastward is positive. Units cm s-1.

existing standard names we could collect [with suggested relational database tablename to collect to, using the table structure described here]

wind_gust [could be another column on existing wind table]

air_pressure [air pressure table]

air_temperature [air temperature table]

relative humidity [relative humidity table]

visibility [visibility table]

depth [depth table]

significant_wave_height [waves table]

new standard names we could collect

wave_peak_period [waves table]

wave_peak_direction [waves table]

wave_maximum_height [waves table]

wave_average_period [waves table]

solar_radiation [solar radiation table]

sea_surface_salinity [salinity table, describing z_desc as 'surface']
sea_bottom_salinity [salinity table, describing z_desc as 'bottom']

sea_surface_temperature [water temperature table, describing z_desc as 'surface']
sea_bottom_temperature [water temperature table, describing z_desc as 'bottom']

sea_surface_chl_concentration [chlorophyll table, describing z_desc as 'surface']

sea_surface_current_speed [currents table, describing z_desc as 'surface']
sea_bottom_current_speed [currents table, describing z_desc as 'bottom']

sea_surface_current_to_direction [currents table, describing z_desc as 'surface']
sea_bottom_current_to_direction [currents table, describing z_desc as 'bottom']

sea_surface_northward_current [currents table, describing z_desc as 'surface']
sea_bottom_northward_current [currents table, describing z_desc as 'bottom']

sea_surface_eastward_current [currents table, describing z_desc as 'surface']
sea_bottom_eastward_current [currents table, describing z_desc as 'bottom']

water_column_average_current_speed [currents table, leaving z null and describing z_desc as 'average']
water_column_average_current_to_direction [currents table, leaving z null and describing z_desc as 'average']
water_column_average_northward_current [currents table, leaving z null and describing z_desc as 'average']
water_column_average_eastward_current [currents table, leaving z null and describing z_desc as 'average']

April 15, 2005

Just reporting that all the Carocoops data, George Voulgaris' ADCP waves/currents data, and NDBC data for salinity (just stations fbis1 and ducn7 - thanks Vembu for getting the wave_significant_height and dominant_wave_period as well as the met stuff) are made available to the Seacoos aggregation scripts or others at;O=D

Sample Carocoops 10m buoy

Sample Carocoops water level station (wls)

Sample George Voulgaris ADCP waves&currents

Sample NDBC with salinity

Using package types 'adcp' and 'wls'

I've used our online Seacoos data dictionary standard_name and units at adding several more definitions where needed. Vembu also reminded me of NDBC's netCDF standard_name and units they are using at

I'm providing the same measurements under more general(water_temperature) and more fully qualified/prefixed standard names(sea_surface_temperature) for variables like water_temperature, salinity, chlorophyll and currents.

I relaxed the definition of 'sea_surface' and 'sea_bottom' to 'near' surface or bottom, since other technical processes can more strictly enforce a certain limit based on the included depth information.

Macro substitution continues to work well, both with sql query arguments and with netCDF generation.
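
A rough sketch of the macro substitution idea - the template text, table and macro names here are only illustrative:

use strict;

my %macro = (
    platform_id => 'carocoops_CAP2',
    start_time  => '2005-04-01 00:00:00',
);

# the same substitution routine can be pointed at sql templates or netCDF CDL templates
my $template = "select m_time,m_value from multi_obs where platform_id = '%platform_id%' and m_time >= '%start_time%'";
(my $query = $template) =~ s/%(\w+)%/$macro{$1}/g;
print "$query\n";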

I plan to do the same type of monthly report scripts for our new additional in situ variable types as we have for the winds and sst used by the modelers.

July 9, 2005

Google Earth

Wow! This is a really cool 3D GIS browser application that is free to use and develop with and has lots of existing functionality with the potential for lots more.

An article write-up,1895,1831878,00.asp

A listing of features is at

Downloading and installing a browser client (about 10 megabytes) at is required. A listing of desired system specs is given.

After installing, try turning on different data layers and tilting the view(mouse wheel controls zoom). Turning on 3d buildings in larger city areas like New York and Atlanta and flying around is pretty cool. Make sure the 'terrain' layer is turned on to see 3d mapping of surface elevation. Going from one address to another via the search entry has a fun flying effect as well. Some of the satellite imagery is several years old and there's a small fee involved to get the top quality imagery.

Development with OOS sources

While it's very easy to spend a LOT of time just playing with the existing features and maps, there is also a reference manual and tutorial for adding your own static and dynamic content to the application. The immediate thought that comes to mind is to provide the data which we are currently collecting (point in-situ observations and satellite or other area coverage maps) via Google Earth.

The discussion forum for new products, data layers and development is at

and a good resource for featured developments and data is at

Dynamic Data Layers (weather, USGS, precipitation, etc)

KML is an acronym for Keyhole Markup Language. Keyhole was the company, acquired by Google, that developed the technology. KMZ is a zipped (compressed) KML file.

The tutorial is well written giving some good short starting examples with static KML and more examples using php code to dynamically generate KML (for dynamic content).

Dynamic content such as latest readings or animations can be generated by swapping the referenced readings or images at the server, which are then refreshed at the client.
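
For example, a small cgi-style perl script could hand Google Earth a NetworkLink that re-fetches a server-generated KML on an interval - the URL below is just a placeholder, and the Url element follows the KML reference of this era (later renamed Link):

#!/usr/bin/perl
use strict;

print "Content-Type: application/vnd.google-earth.kml+xml\n\n";
print <<"KML";
<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="">
 <NetworkLink>
  <name>latest observations</name>
  <Url>
   <href>http://example.com/cgi-bin/latest_obs_kml.pl</href>
   <refreshMode>onInterval</refreshMode>
   <refreshInterval>300</refreshInterval>
  </Url>
 </NetworkLink>
</kml>
KML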

The application also includes the ability to package and time index events so that packaged animations can also be created.

Postings on the keyhole bulletin board by 'ink_polaroid' have been very useful. He describes (click on 'Attachment' at the top of the post for the KML to run in google earth) an animation of NEXRAD radar here

and pulling data from the national weather service into google earth here

KML Tutorial

KML Reference Manual

Code Examples

Creating a placemark

From the reference manual

<kml xmlns="">
  <description><![CDATA[<a href="">Google Search!</a>]]></description>
  <name>Google Headquarters</name>

Here's a simple script that will take a text file (sample here) with columns ordered name, description, longitude, latitude and produce a kml file (sample here).
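
I won't paste the attached script here, but a minimal sketch of that kind of conversion, assuming a comma-separated input with the columns in that order (the attached script and sample files are the real reference), could look like:

use strict;

# input columns: name, description, longitude, latitude (one record per line)
my $infile = shift || 'test1.txt';
open my $in, '<', $infile or die "$infile: $!";

print qq{<?xml version="1.0" encoding="UTF-8"?>\n<kml xmlns="">\n<Document>\n};
while (my $line = <$in>) {
    chomp $line;
    next if $line =~ /^\s*$/;
    my ($name, $description, $lon, $lat) = split /,/, $line;
    print " <Placemark>\n";
    print "  <name>$name</name>\n";
    print "  <description>$description</description>\n";
    print "  <Point><coordinates>$lon,$lat,0</coordinates></Point>\n";
    print " </Placemark>\n";
}
print "</Document>\n</kml>\n";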

UPDATE December 19, 2006: see the Csv2Kml entry later on this page for a utility that converts a CSV (Comma Separated Value) file into a KML file for use with Google Earth.

Sample transect and point

The script below demonstrates how to plot a line transect with a starting point, plus another independent point (the coordinates shown are just placeholders). If you want more lines or points, copy the sections between the <Placemark> tags and change the sub-elements as needed. A color chart for the <color> tag can be found at

<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="">
<name>my folder name</name>
<description>my folder description</description>
  <name>my starting point name</name>
  <description>my starting point description</description>
  <name>my line name</name>
  <description>my line description</description>
  <name>my point name</name>
  <description>my point description</description>

Creating a ground overlay

From the tutorial

<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="">
  <description>Overlay shows Mount Etna erupting on July 13th, 2001.</description>
  <name>Large-scale overlay on terrain</name>


Within the google earth viewer, we could do placemarks and readings for the station ids currently collected into the Seacoos database. What we should also collect in the Seacoos netcdf is a platform_url such as

platform_url: ""

so viewers can go directly to the platform link as well.

We could also do ground overlays for all our existing maps such as sst, quikscat and hf radar, for example, since we have the images and can project the lat/long boundaries. A screen shot of such a projection is here

WMS / WFS wrappers

A wrapper to convert wms calls to kml is listed at

The code listed below allows a dynamic reference to the bounding box to be passed, returning the appropriate kml code to the google earth app.

$wms = ",Countries,Topography,Hillshading,Builtup+areas,Coastlines,Waterbodies,Inundated,Rivers,Streams,Railroads,Highways,Roads,Trails,Borders,Cities,Settlements,Spot+elevations,Airports,Ocean+features&STYLES=,,,,,,,,,,,,,,,,,,,&BGCOLOR=0xCEFFFF&FORMAT=image/gif&TRANSPARENT=FALSE&WRAPDATELINE=TRUE";
$url = $wms."&".$bbox;

$kvp = explode("=",$bbox);
$coords = explode(",",$kvp[1]);

header('Content-Type: application/');
header('Content-Disposition: attachment; filename="kml.kml"');

$kml .= '<?xml version="1.0" encoding="UTF-8"?>'."\n";
$kml .= '<kml xmlns="">'."\n";

$kml = "<GroundOverlay>\n";
$kml .= "<Icon>\n";
$kml .= "<href><![CDATA[$url]]></href>\n";
$kml .= "</Icon>\n";
$kml .= "<LatLonBox>\n";
$kml .= "<north>$coords[3]</north>\n";
$kml .= "<south>$coords[1]</south>\n";
$kml .= "<east>$coords[2]</east>\n";
$kml .= "<west>$coords[0]</west>\n";
$kml .= "</LatLonBox>\n";
$kml .= "</GroundOverlay>\n";
echo $kml;

A literal WMS substitution for sst (hardcoded bounding box) is listed here for reference. This will produce the following type of screenshot.

<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="">
  <name>Seacoos OI_SST</name>

Similar wrappers could be built to convert WFS to landmark type data.

Additional links

Google Earth - NASA Worldwind comparison

Recent lawsuit

Why software patents are bad for innovation

July 12, 2005 OceanSITES netCDF convention development

Charlton clued me into this site

Sample CDL

The OceanSITES netcdf convention is fairly similar to the Seacoos convention.

  • The focus is on moorings with a format convention similar to the Seacoos 'moving-point-3d'.
  • Not sure if there is a fixed time reference similar to Seacoos
  • Probably could use a better approach with the platform_id, similar to the Seacoos institution+platform+sensor
  • QC is the same baby step as everyone else, flagged as good, bad, ugly, missing but not getting into the more contentious issues of ranges, continuities, climatologies and observation/sensor specific issues (yet).
  • both conventions need a platform_url attribute which could be incorporated into readouts and map references

Would be good to discuss netCDF issues with this group further so that both conventions can grow together and easily crosswalk.

July 27, 2005 General Info/Links Post

I'm listing below an excerpt from a recent email that I think condenses a lot of general information and links and provides a good starting point into some of the work that goes on.

One of the main Seacoos MapServer GIS portals is

The documentation we've developed so far on this product is on a wiki at

a powerpoint showing the features and explaining things in a more condensed form is

Earlier technical documentation on the general software we use is at

Most of the stuff we developed is oriented around open source software: Linux OS and a PostgreSQL relational database spatially enabled with PostGIS indexing functions. For in-situ data we have a generalized in-situ observation table model with separate columns for platform id, sensor id, time and space (x,y,z) (see and ). For raster type data (satellite images) the database only contains a timestamp of the data and a file reference to the corresponding image and projection metadata.

MapServer works great for generating map images or shapefiles using command line oriented tools (it can convert between shapefiles and database tables as well - see the 'shp2pgsql' and 'pgsql2shp' functions), so that allows us to break out of the GIS tool specific context and automatically pregenerate images or animations for website content (see Gifsicle, Anis, and GMT under and )

The OGC ( standards for WMS(map images) and WFS(data records) are what we lean on to share data across GIS providers to sites like

WMS/WFS feeds (note the sample URLS at the page bottoms listed as attachments)

Also I've had fun playing with google earth lately and posted some development info at the following link as well

Generally we are relational database oriented and try to take whatever we can down to the relational database level where the volume of data won't be a problem (see and ). For binary dense data we try to establish metadata documents or tables that let us handle the data without expanding it. The output formats that I'd like Seacoos data to focus support on are:

Output formats

In addition to the usual dynamic or pregenerated text readouts, dashboard displays, time series graphs and maps:

A wiki page on the current possible download methods is at

Other links of interest:

Community effort to define some basic XML/SOAP web services

In regards to data discovery and vocabulary harmonization efforts

In regards to data quality control efforts and description

A code library

see also

A listing of community forums

August 15, 2005 Community XMLs of interest

Cutting and pasting some xml info I received from Charles Seaton (CORIE/Nanoos).

Here are some potentially useful references I compiled concerning some of the relevant Marine XML efforts:

This is an article on PMEL's evaluation of AODC Marine XML for use with TAO data.

Their decision was to develop their own schema based on Marine XML, but modified to handle time differently than Marine XML does.
This group is also specifically working on XML/SOAP data transfers. Is anyone in our group familiar with them or their work on this?

The primary author is Nazila Merati, Research Scientist, NOAA/OAR/PMEL

The EU MarineXML project: (the ioc3 server seems to be down at the moment, but there appears to be extensive documentation of their development process over the past year)

The EU MarineXML project has a review of existing XMLs and data formats listed at the above link.

The ICES/IOC Study Group on the Development of Marine Data Exchange Systems using XML (SG-XML) has a final report available through the ioc3 site.

The Canadian DFO also has a MarineXML based on the concept of Keeley Bricks:

Unless the EU Marine XML has recently matured, the Australian AODC Marine XML seems to be the best developed of the marine XML versions:
(address not currently findable, but the Google cache from Jun 27, 2005 can be found here: )

There are also the OGC SensorML and O&M (Observations and Measurements) ML, which appear to have the advantage of being extremely flexible and extensible and the disadvantage of being pretty complex.

SensorML Powerpoint (data model/field explanations)


August 18, 2005 Perl XML packages of interest

Was looking around for some better perl XML packages for handling XML documents, and the candidate after XML::Simple that I liked (haven't tried it yet) was XML::EasyObj (EasyObject)


Also found an interesting article from 2000 about perl and xml

There's also XML::XPath for consideration as well

September 22, 2005 Perl XML::XPath package

UPDATE October 31, 2006: For simple small xml config files (KB size), XML::XPath is fast enough, but if you are working with larger XML documents (MB+ size), I would strongly recommend using the XML::LibXML package instead due to its use of an external C library, which speeds up XML processing by an order of magnitude. XML::LibXML also seems to handle XML attributes better and provides other useful functionality.

Been working with the perl XML::XPath package lately and wanted to post a few notes and links that I thought were useful. Figuring out the exact syntax from the few examples I could gather on the web was a chore. The following bit of code uses the xml document below to display whether the specified platform is online (yes) or not (no). This doesn't really need to be in a loop since I'm specifying the specific platform attribute id I'm looking for; I'm just showing the loop syntax. Found the following links helpful

XPath tutorial

Google group posts

Reading XML documents

Note that you can pass the returned node handle from 'findnodes' as a further subargument or context like below:

my ($platform) = $xp->findnodes('/system/platform');
my ($latitude_dd) = $platform->findnodes('latitude_dd');

#file obs_system.xml

<?xml version="1.0"?>
<system id="carocoops">
 <platform id="CAP2">
  <observation id="air_pressure">
 <platform id="SUN2">
  <observation id="air_pressure">


use strict;
use XML::XPath;

my $xp = XML::XPath->new(filename => 'obs_system.xml');

foreach my $element ($xp->findnodes('/system/platform[@id="SUN2"]/online')) {
        print $element->string_value()."\n";
}

exit 0;

More XML::XPath example syntax

I puzzled through the xpath syntax and I think I've got it figured out, although it's not intuitive and doesn't quite make sense.

1) You can't specify a substitute for the '*' argument like 'parent::biography'; it has to stay 'parent::*' (note: this may be untrue due to confusion about what level the element is returned at - see the next point)

2) findvalue returns one nested level above what I would expect

Other than that, see the specific syntax used in the file examples.


<group id="green">
<personal_information id="person">
<biography id="1">
<age id="younger">26</age>
<biography id="2">
<age id="older">29</age>

use strict;
use XML::XPath;

my $xp = XML::XPath->new(filename => 'info.xml');
my $biography_id = 1; # pick which biography element to query
my $age = $xp->findvalue('//biography[@id="'.$biography_id.'"]/child::age');
print "age:$age\n";
my $name;
$name = $xp->findvalue('//biography[@id="'.$biography_id.'"]/age["'.$age.'"]/preceding-sibling::*');
print "preceding-sibling name:$name\n";
#$xp = XML::XPath->new( context => $name );
my $bio;
#this returns all elements at the same level as age
$bio = $xp->findvalue('//*[age=("'.$age.'")]');
print "bio:$bio\n";
#this returns the attribute 'id' for the parent element 'biography'
$bio = $xp->findvalue('//*[age=("'.$age.'")]/@id');
print "bio:$bio\n";
#this returns the attribute 'id' for the parent of parent 'biography' which is 'personal_information'
$bio = $xp->findvalue('//*[age=("'.$age.'")]/parent::*/@id');
print "bio:$bio\n";
#this returns the attribute 'id' of the selected element 'age'
$bio = $xp->findvalue('//*[age=("'.$age.'")]/*/@id');
print "bio:$bio\n";

exit 0;

Note: XML::XPath does not seem to handle namespace prefixes on element names (like <gml:coordinates>), so I've had to pre-process files, stripping out namespace prefixes (leaving just <coordinates>) when necessary.

Note: You may need to explicitly recast datatypes from a node to the necessary comparison datatype (int, string, etc) when doing comparisons on the results of an xpath query. For example:
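
(The 'depth' element below is made up, just to show the recast.)

# force a numeric comparison - findnodes hands back node objects whose values are strings
my ($depth_node) = $xp->findnodes('/system/platform[@id="SUN2"]/depth');
if ($depth_node and $depth_node->string_value + 0 > 10) {
    print "platform deeper than 10\n";
}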

Writing xml file out

It is possible using XML::XPath to read in an xml document, alter elements (using the setNodeText or createNode methods) and write it back out to file using the script methods described here. Unfortunately, while XML::XPath contains a setAttribute method, creating/updating attributes with it seems to have problems, so I'd recommend using XML::LibXML for cases where manipulating attributes is needed.

PHP XPath class (April 21, 2006)

Used the following php class code to process xml using XPath.

php class code to include at top of php page
Test php page
Test xml file

Further documentation

December 9, 2005 FusionCharts, Instrumentation, Graphs

Reviewed some software products today for creating macromedia flash based charts, instrumentation and graphs at

see also

Not sure that this product requires ColdFusion to run even though 'Fusion' is part of the branding name. Mostly it seems to focus on xml-wrapping data resultsets and sending the necessary xml for the graph objects and data to the client for rendering on the client side. Prices for their products are inexpensive (around $100) and the control code is available for modification.

I like this product for the library of nice chart, instrumentation and graph interfaces available and the dazzle factor of the animated components.

The only downsides to the product I can currently see are the following:

  • data volume - the graph data is dynamically sent and rendered. For too much data (this is a bandwidth, client issue), you might wait forever to get your graph.
  • for sites that might want to borrow static jpg,png,etc images from another site, I'm not sure that this functionality is exported - can't find a way to borrow and re-embed within another site a chart or control for example. Also can't find a way to export flash components as static images(jpg,png,etc).

Linux Server OS Choice (April 17, 2006)

Currently we run around 4 servers, 3 of which are RedHat Academic Server based and 1 of which is Gentoo based. I'd like to move our development to a more package based distribution, possibly putting code on a Live CD like Knoppix so folks can get an idea of how things run and then move the Live CD implementation down to a hard install if they like what they see. Knoppix is a Debian based OS, so I did a quick search on 'Fedora vs Debian' and the below link provides some arguments in support of using a Debian based OS.

Starting Links (April 21, 2006)

Wanted to create a page documenting some initial links for folks that are interested in setting up similar technical infrastructure for their data management.

Ocean Data View (ODV) (April 22, 2006)

The following may be old news to many, but I downloaded and played around with the free desktop client 'Ocean Data View' and was really blown away by what this tool offers. If you haven't downloaded and looked through the capabilities of this tool for use with hydrographic data (CTD and Bottle collections) then I highly recommend doing so.

The ODV main website is

and browsing through the user guide gives a list of functionality (4.5 MB)

Another standard which we should follow for our own CTD and Bottle collections, and which this tool leverages off of, is the WOCE formats, with descriptions of the formats listed at

I think the easiest format to implement/emulate is the 'exchange' format, which has a comma separated value (CSV) column orientation.

Additional downloads of these types of well formatted and organized files are at

and additional WOCE collections (well formatted for ODV import) are available at

This institute also offers software called Ocean Sneaker's Tool (OTS) which looks interesting as well

Xenia Package (June 8, 2006)

UPDATE: March 27, 2008 Xenia documentation is being migrated to


latest version 2

I've been working to develop a standard relational table reference schema for aggregation and product development. I'd be glad to compare notes with others doing relational schema development as to what works. I'm trying to keep the schema modular so folks can add or take away components depending on the end products they want to support.

With this kind of development, the table schema represents the format and the data dictionaries are support/lookup tables.

If the database and derived web services/xml records/products become more packaged, then hopefully the job shifts more towards getting messy data into a 'standard' database that provides a minimum set of agreed community web services and records. Redundancy in data sources can provide a failover capacity. Data could be replicated via mirroring for copied implementations or web services where implementations are different.

ObsRSS (September 7, 2006)

Lately I've been wondering if the xml rss type feeds below might be better suited to simple sharing of latest observation data from platforms.

MapWindow, Shape2Earth (September 13, 2006)

Folks using GIS shapefiles might be interested in the following:

Freeware windows based shapefile viewer which supports user plug-ins

Also the following plugin for MapWindow which should let you convert a shapefile to kml (kml is the file format that Google Earth reads)

#shape2earth plugin for MapWindow


Shape2Earth requires Google Earth Version 3.0.0762 and MapWindow GIS Version 4.1.2342.

perl2exe, Win32::GUI package (September 13, 2006)

Recently did a short perl file conversion script for us on windows systems. This development was possible using a combination of:

ActivePerl - ActiveState perl distribution for Windows

ActivePerl includes the necessary dlls to run perl from a windows command prompt. It includes the perl package manager (ppm) for installing perl packages (type 'ppm', then type 'install ' from a command prompt). You'll need to install 'DateConvert' and 'Win32-GUI' for the included sample scripts.


perl2exe will convert a perl script into a windows executable. perl2exe uses the windows dlls used by ActivePerl in building its executable. Be sure that the ActivePerl and perl2exe perl version numbers (5.8 for example) match.

perl Win32::GUI package

(from website) Win32::GUI is a Win32-platform native graphical user interface toolkit for Perl. Basically, it's an XS implementation of most of the functions found in user32.dll and gdi32.dll, with an object oriented Perl interface and an event-based dialog model that mimics the functionality of visual basic.

The script takes an input csv or tab separated file and does some manipulations on the date, time and value fields in creating an output file. It uses a standard windows 'file browse' window (the Win32::GUI package) to select the initial file and folder; all files in that folder will be read and converted. If you're testing this, put the sample input files in a folder by themselves so only they will be read and converted.

This script could be modified as needed to perform other similar type file conversions, packaged as a distributable windows executable.

Perl script
Windows executable
Sample input file 1 and file 2

UnxUtils (April 17, 2007)

A good set of common command line unix utilities ported to Windows

This perl script shows how I'm using the above ls, grep and zip commands to move all csv files into a zip.

OOSTechKML (October 16, 2006)

Been developing and modifying a lot of google earth kml related products lately detailed at OOSTechKML

OGR techniques

see also OGR example

Didn't realize that the OGR driver supported the following two useful methods, which I'll be taking advantage of with regard to:

  • high-volume gridded data(model output, hf radar, quikscat, hourly maps)

The OGR shapefile driver allows you to SQL index/query a shapefile. Handle high-volume data as sets of shapefiles instead of housing the data internal to the database. This gives a readily accessible/usable file archive and avoids the data maintenance overhead associated with a relational database approach.

#FWTools package

#ogr shp

  • simpler data source approaches(CSV files, non-geospatially enabled simpler relational databases via ODBC)

#ODBC Virtual Spatial Data

Perl XML::LibXML package (October 31, 2006)

This is the primary package I'd recommend for any extensive processing of XML documents in perl. It's a perl wrapper around an external C library, so execution is very fast compared to packages written purely in perl.

If you were using XML::XPath like me earlier, most of the method calls should be the same for both packages, so conversion was just the following:

# before (XML::XPath)
use XML::XPath;
my $xp = XML::XPath->new(filename => './this_file');

# after (XML::LibXML)
use XML::LibXML;
my $xp = XML::LibXML->new->parse_file('./this_file');



Had problems getting XML::LibXML installed. Looked at the 'README' file in the download, and the following did the trick to get it to use my newly installed libxml2 xml2-config file (which reports back a valid version, etc) instead of the default old one.

If your libxml2 installation is not within your $PATH, you can set the environment variable XMLPREFIX=$YOURLIBXMLPREFIX to make XML::LibXML recognize the correct libxml2 version in use:
perl Makefile.PL XMLPREFIX=/usr/brand-new
will ask '/usr/brand-new/bin/xml2-config' about your real libxml2 configuration.

My new xml2-config was under /usr/local/bin instead of /usr/bin so the following did the trick (then the usual make, make test, make install)

perl Makefile.PL XMLPREFIX=/usr/local

findnodes ignores elements

I had an xml document which, using XML::XPath earlier, was identifying and looping over some repeated elements via findnodes, but when I switched to XML::LibXML, findnodes no longer found the nodes (it ignored them with no error).

It turns out that declaring your namespaces (xmlns attributes) anywhere other than the root element causes problems. Moving all xmlns namespace attributes that were not declared in the root element up to the root element allowed findnodes to work properly. Also, namespaces need to be declared like xmlns:blah - a bare default xmlns will also cause the find to fail.
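
An alternative I haven't really used here is XML::LibXML's XPathContext, which lets you register a prefix for a document's default namespace rather than stripping or moving declarations around (the filename and namespace URI below are just examples):

use strict;
use XML::LibXML;
use XML::LibXML::XPathContext;

my $doc = XML::LibXML->new->parse_file('some_file.xml');
my $xc  = XML::LibXML::XPathContext->new($doc);
# register an arbitrary prefix for the document's default namespace, then use it in queries
$xc->registerNs('d', 'http://www.example.com/default-namespace');
foreach my $node ($xc->findnodes('//d:platform')) {
    print $node->getAttribute('id')."\n";
}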

invalid predicate

I got a cryptic 'invalid predicate' error returned, and it turned out I had a poorly formed XPath expression (missing a closing bracket on an attribute). XML::XPath, used earlier, did not flag this as an error. Double check your XPath expressions for errors.

Perl script to generate apache access_log statistics (November 28, 2006)

I've been trying to get a better handle on our web server access log statistics using my own perl script shown at . The awstats program has been useful, but it doesn't provide quite the stats or cross lookups (who's looking at what) that I'm needing. The script should give you output files like those shown at for a 3 day access log, including a summary file like the one listed below. You should be able to generate similar output by changing the configuration search arrays and pointing it at a copy of your access_log (which is hopefully only a few days to a month in size and not too large).

Developed this using info on the access_log like found at
Part 1
Part 2

Hits: 354233 
Hits(Success): 289525 
Hits(Fail): 64708 
Hits Success: 81% 
IP ignore(server self references): 61% 178114 
page ignore: 6% 18851 
referer ignore(page load refs): 8% 23342 
bots ignore: 2% 7293 
subscriber: 18% 52333 
good: 1% 3416 
      carocoops_website/index.php 431 
      carocoops_website/buoy_detail.php 375 
      carocoops_website/buoy_graph.php 293 
      gearth 297 
   referred: 62% 2121 
      oifish 49 
      charlestondiving 300 
      catchsomeair 284 
      oceanislefishingcenter 18 
      palmettosurfers 6 1415 
      iopweather 46 3 
general: 2% 6176
####various elements which we are filtering as bad, ignored, good or subscriber - configure as needed
my @bots_ignore = qw(googlebot inktomi livebot crawl);
my @ip_ignore = qw();
my @page_ignore = qw(TWiki Sandbox robots.txt WebStatistics bin/oops bin/rdiff);
my @page_good = qw(carocoops_website/index.php carocoops_website/buoy_detail.php carocoops_website/buoy_graph.php folly/index.php springmaid/index.php carolinas/ gearth);
my @referer_ignore = qw( search*/imgres);
my @referer_good = qw(oifish magic oceanislebeachsurf charlestondiving catchsomeair oceanislefishingcenter palmettosurfers iopweather);
my @subscriber_good = qw();
@ARGV = glob 'log/*.log';
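
Roughly how those arrays get applied per access_log line (the full script linked above does more bookkeeping - this is just the shape of the classification):

my %count;
LINE: while (my $line = <>) {    # @ARGV was set to the log files above
    foreach my $skip (@ip_ignore, @page_ignore, @referer_ignore, @bots_ignore) {
        next LINE if index($line, $skip) >= 0;
    }
    if    (grep { index($line, $_) >= 0 } @subscriber_good) { $count{subscriber}++ }
    elsif (grep { index($line, $_) >= 0 } @page_good)       { $count{good}++ }
    elsif (grep { index($line, $_) >= 0 } @referer_good)    { $count{referred}++ }
    else                                                    { $count{general}++ }
}
print "$_: $count{$_}\n" foreach sort keys %count;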

REST vs SOAP , document vs protocol (December 15, 2006)

I'd have to say that my preference for the most part is on the REST, document side of things for most of what I work with. The only places where I really see protocols/web services/SOAP being more effective are in

  • discovery/search based query (like searching Amazon or Google for matching hits which are supported by both REST and SOAP methods)
  • data subsetting of extremely large datasets or other specialized analysis/visualization/reformatting not capable/cached locally
  • some messaging/encryption functionality specific to SOAP(which may pose its own security issues)

The protocol/web services based approach puts the load on the provider and is an admission somewhat that we don't really understand/anticipate our audience needs and the products which may be time-consumingly and redundantly created by them on a dynamic/query rather than pregenerated/report basis. Dynamic query services may be needed for truly original/unique queries, but for repeated queries of the same nature, pregenerated/cached document/report based solutions would be more efficient.

#REST versus SOAP article and additional links,289483,sid26_gci1227190,00.html

document vs protocol (part 2)

Emailed Eric on this same subject recently. While I appreciate the efforts towards OGC participation and standards, and Eric has made it very easy for me to remap our existing xml data feeds here into that via his perl cookbook scripts, I'm also interested in seeing what document/format/report approaches could do for IOOS in general, in addition to protocol/web service/query approaches. The OGC WMS/WFS services have been successful in part because of their simplicity - Google Earth has picked up WMS, which also helps reinforce that standard. But I've also seen postings like the following, which are the usual grumblings about the OGC.

I've been more interested in GeoRSS or KML as they might present a document based approach to data sharing. Unfortunately, while both GeoRSS and KML represent location, and KML additionally represents time, neither has an ioos catalog observation_id type lookup or schema. With KML, plenty of folks are working towards parsing and displaying national/NOAA feeds, but none seem encouraged to take the additional step of placing the observation data into any specific schema, leaving it as just html (no particular schema or controlled vocabulary) for display - similar to the issue of NWS providing station observation feeds as one long string via RSS. I've tried posting some latest obs document examples at , and , but can't seem to stir much interest. This basic observations schema is a similar concern in the oostethys sos_config.xml file and the ioos catalog xml record efforts.

document vs protocol (part 3 February 26, 2007)

NWS (National Weather Service) is publishing their latest observation data using WFS and GML, which could be converted into ObsKML. While I'm glad NWS has progressed beyond single string RSS description feeds, it would be interesting to see something simpler adopted more native to KML, like ObsKML.

Enjoyed the following article also which echoes my own sentiments towards the 'swamp' of web services discussion and development.

Ten predictions for XML in 2007

from the above article referencing web services:

Put a fork in it; they're done

WS-* (pronounced WS-splat) has peaked. Even a derailed train has a lot of momentum, so people will still be talking about Web services in 2007. However, nobody will be listening. Enterprises have absorbed as much Web services machinery as they're able to stomach. Web Services Description Language (WSDL) and SOAP 1.2 are the end of the line. Many enterprises won't even get that far. WS-Choreography, WS-Transport, WS-Reliability, WS-Security, WS-Resource, WS-ServiceGroup, WS-BaseFaults, WS-Messaging, WS-KitchenSink, and WS-AreYouEvenStillReadingThis won't leave the station. There is a limit to how much complexity any organization can manage, and WS-* has long since exceeded that threshold.

Instead, look for the emergence of Plain Old XML (POX). People will start (in many cases, continue) sending XML documents over HTTP. Authentication, caching, and reliable delivery will be managed using the built-in functionality of HTTP. Applications will be hooked together with a mixture of XSLT, XProc, duct tape, and spackle. Developers will give up on the idea of assembling services and systems without manual configuration (a dream most of them never bought into in the first place). Service-oriented architecture (SOA) might have a role to play, if any of the architecture astronauts pushing it come down to earth long enough to explain what they're doing.

SOA bloated

document vs protocol (part 4 April 27, 2007)

The main problems that I tend to see repeated in integration efforts are general technical approaches or middleware which are intended to be all-encompassing but which

  • fail to address specific audiences/problems/needs
  • fail to provide data/metadata formats/conventions/standards at a usable level of specific detail in an easily implementable way

The second point is the most common area of failure - there are lots of ways to transport data, but often the data sent or received can't be automatically parsed and processed because there is no convention/standard covering the final particulars of the format or protocol used (the expected grammar or syntax of how data is requested/formatted - what are the column names, datatypes and order? what do they represent?) and the final terms used (the controlled vocabulary or data dictionary).


High-volume data

For high-volume gridded model data or remotely sensed images (.png etc. files), WMS is well utilized as a simple HTTP addressable way of requesting the images for a given bounding box, time and data product layer.

Also important to this would be for modeling or image processing groups discussion/utilization of common metadata fields/terms regarding

  • product type
  • time/space coverage/resolutions/level of detail
  • qc
  • legend/symbology
  • credits/disclaimers
  • links to further data(shapefiles, raw data, etc)

WMS and associated metadata could be provided directly from the data provider or by a secondary data distributor.

Low-volume data

For low-volume in-situ platform data, utilize WFS with some associated convention for fields/terms/controlled vocabulary similar to say or use some community adopted XML and associated XML Schema(s) and controlled vocabularies similar to say the IOOS Catalog effort or ObsKML.

Existing conventions/standards

Utilize existing format/protocol conventions or standards which have existing momentum, popularity or utility. An example is where we have used the 'exchange' CSV format for describing CTD data at since there is an excellent free desktop visualization utility for water quality profile data in this format.

Important to all of these recommendations would be shared documentation regarding the tools/techniques used in the creation/support of these hopefully more common output formats/protocols and metadata and user/machine registries for discovery/utilization of these metadata/data.

Csv2Kml (December 19, 2006)

Created the following utility which can be used to convert a CSV (Comma Separated Value) file into a KML file for use with Google Earth. The input csv file can be used to create points, lines and polygons of various colors and scale with timestamps. Color and scale can be used as part of a visual legend to indicate georeferenced data qualities or quantities. With Google Earth version 4 and greater, timestamp information can be used to provide time animations of data using the time controls.

ObsKML (January 22, 2007)

KML currently handles spatial and temporal tags, but not data content. ObsKML is an attempt to devise a content schema and mapping which would allow not only the display of data content, but also some meaningful data sharing within the observations community, using KML/KMZ as a data transport mechanism. The kml tag used is the 'Metadata' tag, which carries the content in a more formalized xml schema.

Trac project management tool (February 26, 2007)

The open source/sourceforge solution that Seacoos is going with for now is the python based 'trac', which incorporates a ticket system, wiki and subversion.

Database replication (February 28, 2007)

This twiki page discusses the details of replicating a Xenia database for purposes of archiving/analysis of data between Xenia instances.

FOSS4G (October 15, 2007)

Here are some of the links to the topics discussed today regarding the FOSS4G(Free Open Source Software for Geospatial) meeting earlier in Victoria, Canada. A listing of the presentations that were given is at

The FOSS4G conference had more of a technical/tool focus than an application one. The main open source tools that seemed to be getting the most attention were Geoserver and OpenLayers.

Geoserver - the Java middleware package for producing various OGC standard and other related outputs/formats from user data

Chris Holmes - a developer with GeoServer - gives an hour-long presentation covering various topics

OpenLayers - a free open source javascript library for doing interesting things like 'slippy' maps and overlays at the browser level

Also thought the following slideshows were interesting (you'll need to have flash available in your browser to view these - click 'full' in the lower right corner of the window to maximize the slide display to full screen)

Beyond GPS (presented at the FOSS4G conference)

RESTful OGC Services(see slides 13 & 14 for general table comparison)

XeniaPackageSqlite (January 18, 2008)

UPDATE: March 27, 2008 Xenia documentation is being migrated to

This twiki page discusses the details of an easier to setup/maintain sqlite version of the 'Xenia' RDB schema

XeniaUpdates (January 18, 2009)

The development follow-on listing to this is continued at

Attachments

metadoor.jpg - 886.7 K - 01 Dec 2004 18:07 - JeremyCothran
SVGA_Regional_Limits.xls - 23.0 K - 01 Dec 2004 20:14 - JeremyCothran
SensorWebWhPpr030828.doc - 413.5 K - 03 Dec 2004 23:47 - JeremyCothran
MarineXML_D6_v5.pdf - 1607.7 K - 03 Dec 2004 23:52 - JeremyCothran
JCothran_draft_metadata_plan_sw.doc - 87.5 K - 09 Dec 2004 18:14 - JeremyCothran
CreatePublishMeta_August25.doc - 131.5 K - 18 Jan 2005 16:00 - JeremyCothran
GOS_CH_Steps_0904.ppt - 2180.0 K - 20 Jan 2005 17:23 - JeremyCothran
seacoos_flatfile_v10.xsd - 3.7 K - 03 Feb 2005 22:28 - JeremyCothran
(filename missing) - 11.5 K - 13 Apr 2005 18:48 - JeremyCothran
CAP2_header.txt - 4.0 K - 24 Mar 2005 19:41 - JeremyCothran
google_dennis_0700.jpg - 166.4 K - 13 Jul 2005 18:43 - JeremyCothran
oi_sst.jpg - 169.0 K - 12 Sep 2005 20:11 - JeremyCothran
(filename missing) - 1.5 K - 08 Mar 2006 16:21 - JeremyCothran
test1.txt - 0.1 K - 08 Mar 2006 16:22 - JeremyCothran
test1.kml - 0.7 K - 08 Mar 2006 16:22 - JeremyCothran
line_test.kml - 0.9 K - 07 Apr 2006 14:16 - JeremyCothran
XPath.class.php.txt - 270.3 K - 21 Apr 2006 14:54 - JeremyCothran
test_xml.php.txt - 0.2 K - 21 Apr 2006 14:55 - JeremyCothran
test.xml - 0.1 K - 21 Apr 2006 14:55 - JeremyCothran
(filename missing) - 2.0 K - 29 Oct 2006 16:11 - JeremyCothran
(filename missing) - 1.3 K - 17 Apr 2007 19:48 - JeremyCothran
