DODSFilterColumn

General description and use

In response to the need to transform DODS netCDF ASCII data into a columnar format rather than single long rows, the following tool can be run from a browser.

For example, here's the browser output using the existing method, which returns DODS netCDF data in the long row format for temperature and salinity:

http://seacoos.skio.peachnet.edu/cgi-bin/nph-dods/proc_data/r2/near_bot/R2PkgB_1999_09.nc.ascii?temp,salin

Here's the same DODS netCDF reference 'wrapped' in a service I'm running that reformats the output into column form: http://carocoops.org/carocoops_website/dods_filter_column.php?dods_url=http://seacoos.skio.peachnet.edu/cgi-bin/nph-dods/proc_data/r2/near_bot/R2PkgB_1999_09.nc.ascii?temp,salin&time_units=days%20since%20-0001-1-2%2000:00:00

To download this data, add '&submit_type=download' at the end of the URL:

http://carocoops.org/carocoops_website/dods_filter_column.php?dods_url=http://seacoos.skio.peachnet.edu/cgi-bin/nph-dods/proc_data/r2/near_bot/R2PkgB_1999_09.nc.ascii?temp,salin&time_units=days%20since%20-0001-1-2%2000:00:00&submit_type=download

A general outline of the usage steps is:

* Review a netCDF file of interest with the existing DODS interface, for example:

http://seacoos.skio.peachnet.edu/cgi-bin/nph-dods/proc_data/r2/near_bot/R2PkgB_1999_09.nc.html

Decide which variable names to query, such as 'temp,salin', and note the 'units' description for the main time index (it should be in a format similar to 'secs since 1970-1-1 00:00:00').

* To transform the data, supply the 'dods_url' and 'time_units' parameters after the base URL http://carocoops.org/carocoops_website/dods_filter_column.php? similar to the example listed above:

http://carocoops.org/carocoops_website/dods_filter_column.php?dods_url=http://seacoos.skio.peachnet.edu/cgi-bin/nph-dods/proc_data/r2/near_bot/R2PkgB_1999_09.nc.ascii?temp,salin&time_units=days%20since%20-0001-1-2%2000:00:00

For the dods_url, be sure to end the URL with '.nc.ascii?' followed by the names of the variables to query, like '.nc.ascii?temp,salin'.

Note that due to a technical offset glitch with using the particular starting date of 'days since 0000-1-1 00:00:00', I've substituted 'time_units=days since -0001-1-2 00:00:00', which seems to provide the correct dates.

* To download the file as a CSV (comma-separated value) file, which is viewable in Excel, add '&submit_type=download' at the end of the URL. A short script sketch of assembling such a URL follows.
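For scripted rather than browser use, the same request URL can be assembled programmatically. The short Perl sketch below is only an illustration of the URL layout described above (it is not part of the attached scripts); it assumes the URI::Escape module is installed and reuses the SABSOON example values.

#!/usr/bin/perl
# Sketch only: assemble a dods_filter_column.php request URL from its parts.
# Values are reused from the SABSOON example on this page; URI::Escape is assumed installed.
use strict;
use warnings;
use URI::Escape qw(uri_escape);

my $service    = 'http://carocoops.org/carocoops_website/dods_filter_column.php';
my $dods_url   = 'http://seacoos.skio.peachnet.edu/cgi-bin/nph-dods/proc_data/r2/near_bot/R2PkgB_1999_09.nc.ascii?temp,salin';
my $time_units = 'days since -0001-1-2 00:00:00';

my $url = $service
        . '?dods_url='   . $dods_url                # passed as-is, as in the examples above
        . '&time_units=' . uri_escape($time_units)  # spaces and ':' are percent-encoded
        . '&submit_type=download';                  # omit this parameter to view in the browser instead

print "$url\n";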

Technical description

The following code is designed to be used stand-alone and/or as a web service to help transform the DODS client HTTP ASCII output (the 'Get ASCII' button) into something more column-oriented (CSV, comma-separated values). This makes the data easier to view and understand, as well as to feed into other analysis tools, web services and databases.

To run this service against other DODS servers which are time-index based, the URL parameters to include after http://carocoops.org/carocoops_website/dods_filter_column.php? are:

dods_url=
time_units=
submit_type= (optional): if left blank, the results are returned to the browser (the default behavior); if set to 'download', the user is prompted to download the results to a local CSV file.

As an example, I've listed the following links to SEACOOS netCDF DODS servers. When clicking on a link you'll either see the results in the browser or be prompted to download, depending on whether '&submit_type=download' is included in the URL. The data shown/downloaded is the queried fields transformed from the DODS HTTP response to a column/CSV format. Note that in the following URL addresses '%20' is automatically substituted for any space characters; spaces are acceptable in the initial URL query.

View Data

NC-COOS
http://carocoops.org/carocoops_website/dods_filter_column.php?dods_url=http://nccoos.unc.edu/cgi-bin/nph-dods/data/nws_metar/proc_data/latest_v2.0/nws-K11J-metar-latest.nc.ascii?wspd&time_units=seconds%20since%201970-1-1%2000:00:00

SABSOON
http://carocoops.org/carocoops_website/dods_filter_column.php?dods_url=http://seacoos.skio.peachnet.edu/cgi-bin/nph-dods/proc_data/r2/near_bot/R2PkgB_1999_09.nc.ascii?temp,salin&time_units=days%20since%20-0001-1-2%2000:00:00

* Note: in the above link the original time units were 'days since 0000-1-1 00:00:00', but I had to adjust this slightly to 'days since -0001-1-2 00:00:00' to get the dates to calculate correctly. Not sure what the issue is here.

COMPS
http://carocoops.org/carocoops_website/dods_filter_column.php?dods_url=http://seacoos.marine.usf.edu/cgi-bin/nph-dods/data/seacoos_rt_v2/ndbc-41001-met-latest.nc.ascii?wind_speed,wind_gust,sea_surface_temperature&time_units=seconds%20since%201970-1-1%2000:00:00-00

Explorer of the Seas
http://carocoops.org/carocoops_website/dods_filter_column.php?dods_url=http://oceanlab.rsmas.miami.edu/cgi-bin/nph-dods/explorer/RT/sk_KS003_skmodule_latest.nc.ascii?wind_from_direction,air_temperature,air_pressure&time_units=sec%20since%201995-1-1%2000:00:00%20-0:00

Download Data

NC-COOS
http://carocoops.org/carocoops_website/dods_filter_column.php?dods_url=http://nccoos.unc.edu/cgi-bin/nph-dods/data/nws_metar/proc_data/latest_v2.0/nws-K11J-metar-latest.nc.ascii?wspd&time_units=seconds%20since%201970-1-1%2000:00:00&submit_type=download

SABSOON
http://carocoops.org/carocoops_website/dods_filter_column.php?dods_url=http://seacoos.skio.peachnet.edu/cgi-bin/nph-dods/proc_data/r2/near_bot/R2PkgB_1999_09.nc.ascii?temp,salin&time_units=days%20since%20-0001-1-2%2000:00:00&submit_type=download

COMPS
http://carocoops.org/carocoops_website/dods_filter_column.php?dods_url=http://seacoos.marine.usf.edu/cgi-bin/nph-dods/data/seacoos_rt_v2/ndbc-41001-met-latest.nc.ascii?wind_speed,wind_gust,sea_surface_temperature&time_units=seconds%20since%201970-1-1%2000:00:00-00&submit_type=download

Explorer of the Seas
http://carocoops.org/carocoops_website/dods_filter_column.php?dods_url=http://oceanlab.rsmas.miami.edu/cgi-bin/nph-dods/explorer/RT/sk_KS003_skmodule_latest.nc.ascii?wind_from_direction,air_temperature,air_pressure&time_units=sec%20since%201995-1-1%2000:00:00%20-0:00&submit_type=download
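The same requests can also be issued from a script as part of a pipeline. The sketch below is only an illustration (not one of the attached files); it assumes the LWP::Simple module is installed, reuses the NC-COOS download link above, and writes the CSV output to a hypothetical local file name.

#!/usr/bin/perl
# Sketch only: fetch the column/CSV output from the service inside a script
# so it can feed a database loader or another pipelined process.
# Assumes LWP::Simple is installed; the request URL is the NC-COOS example above.
use strict;
use warnings;
use LWP::Simple;   # exports getstore() and HTTP::Status helpers such as is_success()

my $url = 'http://carocoops.org/carocoops_website/dods_filter_column.php'
        . '?dods_url=http://nccoos.unc.edu/cgi-bin/nph-dods/data/nws_metar/proc_data/latest_v2.0/nws-K11J-metar-latest.nc.ascii?wspd'
        . '&time_units=seconds%20since%201970-1-1%2000:00:00'
        . '&submit_type=download';

my $status = getstore($url, 'nws-K11J-wspd.csv');   # hypothetical local file name
die "fetch failed (HTTP $status)\n" unless is_success($status);
print "saved nws-K11J-wspd.csv\n";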

Documentation

Documentation is included in the dods_filter_column.pl file. The attached Perl files are the same except that one additionally supports HTML formatting and viewing from a web browser. It wouldn't be too difficult to add a further session variable to produce a quick time-series plot using gnuplot as well.

##################################################################################################

#usage: perl dods_filter_column.pl 'http://nccoos.unc.edu/cgi-bin/nph-dods/data/nws_metar/proc_data/latest_v2.0/nws-K11J-metar-latest.nc.ascii?wspd' 'seconds since 1970-1-1 00:00:00' 12345

#This Perl program can be run stand-alone with the correct parameters and/or in conjunction with a sample calling PHP web page, dods_filter_column.php (as a web service), which supplies the parameters from the URL session parameters.

##stand alone
#Run stand-alone, it can perform the row-to-column transformation for other programs which need the data in a CSV (comma-separated value) format.

#Run stand-alone, it can also serve as a base template for further modification to create relational database population files, tailoring the row output to the specific INSERT or COPY table load needed.

#Note that the NCO ( http://nco.sourceforge.net ) utility operator 'ncrcat' can also be used to concatenate similar files along the time axis to create a larger composite netCDF file which might be easier to query or transform.

##web service
#Run as a web service (see the sample calling PHP web page, dods_filter_column.php), it can provide the same transformation within browser clients, without the need to install software and possibly as part of other pipelined web service processes.

#If running locally change the $tmp_dir reference as needed

###################
#The code below uses the UDUNITS Perl package for transforming netCDF times to ISO-style datetimes. To install this package on a Red Hat Linux server I had to perform the steps below as root (see the INSTALL document, which covers other prerequisites such as Perl 5.x):

#download and unzip version 1.12.2 from Unidata (also available at http://carocoops.org/resources/netcdf/udunits-1.12.2.tar.Z )

#add the following 3 lines to the /src/configure file to recognize a Linux system
# CC=gcc
# CFLAGS=-Df2cFortran
# FC=g77

#run ./configure from /udunits-1.12.2/src
#run perl Makefile.PL from /udunits-1.12.2/src/perl
#run make test from /udunits-1.12.2/src/perl
###################

##netCDF file assumptions
#The code assumes that the time values line will be the third line listed and that there is a corresponding number of elements for each of the datatypes included in the query.
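To make the row-to-column step concrete, the fragment below is a minimal illustration of the idea only, not the attached script: it assumes the time values arrive as the third comma-separated row, that every variable row has the same number of elements, and that the time base is the simple 'seconds since 1970-1-1 00:00:00' case (so core Perl gmtime/strftime stands in for the UDUNITS conversion the real script performs).

#!/usr/bin/perl
# Minimal illustration of the row-to-column idea -- NOT the attached script.
# Assumptions: time values are on the third comma-separated row, every variable
# row has the same number of elements, and the time base is
# 'seconds since 1970-1-1 00:00:00' (the attached script uses UDUNITS instead
# and also handles the variable-name labels in real DODS ASCII responses).
use strict;
use warnings;
use POSIX qw(strftime);

my @rows;
while (my $line = <STDIN>) {
    chomp $line;
    next unless $line =~ /,/;                    # keep only comma-separated value rows
    push @rows, [ split /\s*,\s*/, $line ];
}

my $time_row = 2;                                # third row holds the time values
my @times    = @{ $rows[$time_row] };

for my $i (0 .. $#times) {
    my @record = ( strftime('%Y-%m-%d %H:%M:%S', gmtime($times[$i])) );
    for my $r (0 .. $#rows) {
        push @record, $rows[$r][$i] unless $r == $time_row;
    }
    print join(',', @record), "\n";              # one CSV line per time step
}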

Questions/Comments

Post questions/comments here

January 21, 2005

In response to a technical question:

The tools I'd recommend technical staff use to aggregate data from netCDF files into a relational database are:

Unidata has several language APIs (Java, Perl, Python, Matlab, Ruby) for interfacing with netCDF files, listed here (right-hand side): http://my.unidata.ucar.edu/content/software/netcdf/index.html . I didn't need to use a netCDF interface for the example Perl code listed at http://carocoops.org/twiki_dmcc/pub/Main/DODSFilterColumn/dods_filter_column.pl.txt since I was using the DODS server to access the data elements for me, but I did use the 'udunits' library http://my.unidata.ucar.edu/content/software/udunits/index.html to convert the netCDF-format time to something which I could more easily read or put into a relational database datetime field.

The NCO (netCDF Operators, http://nco.sourceforge.net ) utility 'ncrcat' (download and compile) can also be used to concatenate netCDF files of the same format (for example, single files for each month) along the time axis to create a larger composite netCDF file which might be easier to query or transform (12 monthly files into a yearly file); a brief sketch follows. I mention the use of ncrcat for appending new records to an existing netCDF file in the following post: http://nautilus.baruch.sc.edu/misc/phpBB2/viewtopic.php?t=40&highlight=ncrcat
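As an illustration of that ncrcat step (wrapped in Perl to stay consistent with the other sketches on this page), the hypothetical monthly file names below would be concatenated along the record (time) dimension into a single yearly composite file:

#!/usr/bin/perl
# Sketch only: concatenate hypothetical monthly netCDF files along the record
# (time) dimension with NCO's ncrcat, producing one yearly composite file.
# The file names are made up for illustration; ncrcat must be installed.
use strict;
use warnings;

my @monthly = map { sprintf('R2PkgB_1999_%02d.nc', $_) } 1 .. 12;  # hypothetical monthly files
my $yearly  = 'R2PkgB_1999.nc';                                    # composite output file

system('ncrcat', @monthly, $yearly) == 0
    or die "ncrcat failed: $?\n";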

Another approach, which might work without getting into all the above tools, is to create a script which systematically feeds the different DODS netCDF URLs, query variables and time bases to the dods_filter_column.php service, then takes the output and either populates a relational database or appends it to a running text file used as input to some other program (see the sketch below).
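A rough sketch of that scripted approach (again, not one of the attached files) might look like the following; it assumes LWP::Simple and URI::Escape are installed, the request pairs are simply example links from this page, and the output file name is made up.

#!/usr/bin/perl
# Sketch only: feed several DODS URL / time-base combinations to the
# dods_filter_column.php service and append each CSV result to one running
# text file for later loading into a database or another program.
use strict;
use warnings;
use LWP::Simple;                     # exports get()
use URI::Escape qw(uri_escape);

my $service = 'http://carocoops.org/carocoops_website/dods_filter_column.php';

# (dods_url, time_units) pairs taken from the example links above
my @requests = (
    [ 'http://nccoos.unc.edu/cgi-bin/nph-dods/data/nws_metar/proc_data/latest_v2.0/nws-K11J-metar-latest.nc.ascii?wspd',
      'seconds since 1970-1-1 00:00:00' ],
    [ 'http://seacoos.skio.peachnet.edu/cgi-bin/nph-dods/proc_data/r2/near_bot/R2PkgB_1999_09.nc.ascii?temp,salin',
      'days since -0001-1-2 00:00:00' ],
);

open my $out, '>>', 'running_output.csv' or die "cannot open output file: $!\n";
for my $req (@requests) {
    my ($dods_url, $time_units) = @$req;
    my $url = "$service?dods_url=$dods_url"
            . '&time_units=' . uri_escape($time_units)
            . '&submit_type=download';
    my $csv = get($url);
    unless (defined $csv) {
        warn "no response for $dods_url\n";
        next;
    }
    print {$out} $csv;
}
close $out;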

-- JeremyCothran - 12 Jan 2005

Attachments:
dods_filter_column.pl.txt - 4.9 K - 13 Jan 2005 01:00 - JeremyCothran
dods_filter_column.php.txt - 0.6 K - 13 Jan 2005 00:59 - JeremyCothran
dods_filter_column_no_html.pl.txt - 5.0 K - 13 Jan 2005 01:02 - JeremyCothran
dods_filter_column_with_html.pl.txt - 5.8 K - 13 Jan 2005 01:02 - JeremyCothran
