Main.CodeRepositoryWFSRequestStationData r1.4 - 21 Jan 2006 - JeremyCothran
Update (01/21/06): While the WFS method below is still supported, I'd recommend substituting the web services defined at http://twiki.sura.org/twiki/bin/edit/Main/SoapliteSeacoos since they should execute faster and are the direction this development is moving towards.

This webpage describes how the data returned from the WFS (Web Feature Service) request for latest station data is processed into an HTML webpage.

For the 'Carolinas Coast' demonstration project, we wanted to display the latest in situ station/observation data collected into the Seacoos database within a subregion-specific HTML webpage.

Charlton created an SQL query mapping that responds to a WFS query request with the parameter 'typename=latest_in_situ_obs' and a geographic bounding box argument (BBOX), returning the latest station information for all stations within the bounding box area. The example below, for instance, returns an XML document with the latest station info for the defined Carolinas area.

http://nautilus.baruch.sc.edu/wfs/seacoos_in_situ?SERVICE=WFS&VERSION=1.0.0&REQUEST=GETFEATURE&BBOX=-91.5%2C22%2C-71.5%2C36.5&typename=latest_in_situ_obs
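The request above is an ordinary HTTP GET with URL-encoded query parameters. A minimal Python sketch of building the same URL (the endpoint and parameter names are taken directly from the example; the helper function name is ours):

```python
from urllib.parse import urlencode

def build_wfs_url(base, bbox, typename):
    """Build a WFS 1.0.0 GetFeature URL with a bounding box filter."""
    params = {
        "SERVICE": "WFS",
        "VERSION": "1.0.0",
        "REQUEST": "GETFEATURE",
        # BBOX order: min lon, min lat, max lon, max lat
        "BBOX": ",".join(str(v) for v in bbox),
        "typename": typename,
    }
    return base + "?" + urlencode(params)

url = build_wfs_url(
    "http://nautilus.baruch.sc.edu/wfs/seacoos_in_situ",
    (-91.5, 22, -71.5, 36.5),
    "latest_in_situ_obs",
)
print(url)
```

urlencode percent-encodes the commas in the BBOX value, reproducing the %2C sequences seen in the example request.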

The other steps involved are creating a template HTML webpage to display the latest data and setting up the processing that pulls the WFS response data into that webpage on a regular basis.

Susanna King developed an HTML template page for the site, which can be seen at

http://nautilus.baruch.sc.edu/carolinas

This webpage is a template used for discussion purposes, and its only functional aspect currently is that the Caro-COOPS stations (FRP1, FRP2, CAP1, CAP2, CAP3, SUN1, SUN2) are refreshed with the latest data from the WFS query at 20 minutes past each hour. The other stations listed for the Carolinas will be added soon as well.
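The hourly refresh can be scheduled with a crontab entry along these lines (the script path shown is hypothetical; the actual entry is in the attached cron_carolinas.txt):

```shell
# Run the update script at 20 minutes past every hour
20 * * * * perl /path/to/lwpCarolinas.pl
```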

The Perl script developed (lwpCarolinas.pl) follows the general steps below, using a hash (keyed on the station_id returned by the WFS query) to loop through the station data:

  • 1. Initialize the hash with the static HTML DIV tag information used repeatedly for each station, such as the station description and URL, as well as default/missing values for the data elements. (carolinas_stations.txt)

  • 2. Issue the WFS request and parse the returned XML, updating each station's hash entry (keyed on station_id) with the latest observation values.

  • 3. Loop through the hash and print its contents into the corresponding HTML elements within the DIV tags, using an HTML template with the appropriate macros as placeholders for the substitutions. (carolinascoast_template.txt)

  • 4. Call the above process on a periodic basis. (cron_carolinas.txt)
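The steps above can be sketched as follows. The original script is Perl; this is a Python illustration, and the station IDs, XML element names, and the %%MACRO%% placeholder syntax are assumptions rather than the script's actual formats:

```python
import xml.etree.ElementTree as ET

# Step 1: initialize the hash with static info and default/missing values
# per station (in the real script this comes from carolinas_stations.txt).
stations = {
    "FRP1": {"water_temp": "missing", "url": "http://example.org/FRP1"},
    "CAP1": {"water_temp": "missing", "url": "http://example.org/CAP1"},
}

# Step 2: parse the WFS XML response and update the hash, keyed on station_id.
# The element names here are illustrative, not the actual WFS schema.
sample_xml = """<features>
  <station><station_id>FRP1</station_id><water_temp>21.4</water_temp></station>
</features>"""
for feat in ET.fromstring(sample_xml).findall("station"):
    sid = feat.findtext("station_id")
    if sid in stations:
        stations[sid]["water_temp"] = feat.findtext("water_temp")

# Step 3: substitute hash values into the template's macro placeholders.
template = "<div>FRP1 temp: %%FRP1_water_temp%%</div>"
page = template
for sid, fields in stations.items():
    for name, value in fields.items():
        page = page.replace(f"%%{sid}_{name}%%", value)

print(page)  # <div>FRP1 temp: 21.4</div>
```

Stations absent from the WFS response keep their default/missing values, so the generated page degrades gracefully when a station stops reporting.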

The hardest part of the above process was figuring out the exact Perl syntax for handling the hash references correctly. The following links were helpful:

  • Displaying XML files using XSLT, Perl, and Java: http://www.ku.edu/~grobe/intro-xml.html

  • Frequently Asked Questions about XML::Simple: http://www.die.net/doc/linux/man/man3/xml::simple::faq.3.html

-- JeremyCothran - 27 Apr 2005

Attachments:

  lwpCarolinas.pl.txt           9.9 K    27 Apr 2005 - 20:44    JeremyCothran
  carolinas_stations.txt        1.4 K    27 Apr 2005 - 20:45    JeremyCothran
  carolinascoast_template.txt   27.9 K   27 Apr 2005 - 20:45    JeremyCothran
  cron_carolinas.txt            0.2 K    27 Apr 2005 - 20:45    JeremyCothran

