
Re: las xml specification question



Hi Dave,

The solution to this problem is to combine all of the single-timestep files into one netCDF file that contains every timestep. You can do this with Ferret -- the only catch is that netCDF has file-size limits (around 4 GB) that you might run into if the amount of data is large.
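Just to illustrate the idea (this is only a rough sketch, not something that will run against your files as-is), here is how the stacking could be scripted with the netCDF4 Python module. The file-naming pattern, the start date, the variable handling, and the time units below are all assumptions for the example:

    from datetime import datetime, timedelta
    from netCDF4 import Dataset, date2num

    FILES_PER_YEAR = 1460                 # 6-hourly steps over a 365-day year
    START = datetime(2000, 1, 1)          # assumed first timestep
    STEP = timedelta(hours=6)

    # Assumed naming convention -- one file per timestep, timestamp in the name.
    times = [START + i * STEP for i in range(FILES_PER_YEAR)]
    infiles = [t.strftime("model.%Y%m%d%H.nc") for t in times]

    with Dataset("combined.nc", "w") as out:
        first = Dataset(infiles[0])

        # Copy the spatial dimensions from the first input file.
        for name, dim in first.dimensions.items():
            out.createDimension(name, len(dim))

        # Unlimited record dimension so timesteps can keep being appended.
        out.createDimension("time", None)
        tvar = out.createVariable("time", "f8", ("time",))
        tvar.units = "hours since 2000-01-01 00:00:00"

        # Coordinate variables are copied once; data variables get a leading
        # time dimension.  (Variable attributes are omitted for brevity.)
        for name, var in first.variables.items():
            if name in first.dimensions:
                cv = out.createVariable(name, var.dtype, var.dimensions)
                cv[...] = var[...]
            else:
                out.createVariable(name, var.dtype, ("time",) + var.dimensions)
        first.close()

        # Write one record per input file.
        for i, (fname, t) in enumerate(zip(infiles, times)):
            tvar[i] = date2num(t, tvar.units)
            with Dataset(fname) as src:
                for name, var in src.variables.items():
                    if name not in src.dimensions:
                        out.variables[name][i, ...] = var[...]

If the combined data would be too big, you can do the same thing per year (or per month) and stay under the size limit while still having a real time axis in each file.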

I have a set of Perl and Ferret journal scripts that solve this problem for a very specific application. They probably wouldn't be difficult to adapt if you are familiar with Perl. Let me know if you'd like a copy.

Dave Brown wrote:

Hi,

I am trying to come up with a LAS XML specification for a dataset in
which each file contains several variables at a single timestep. The
timesteps are spaced 6 hours apart, resulting in 1460 files for a year.
The timestep is encoded in the file name, as well as in an "initial_time"
attribute associated with each variable, but obviously there is no
explicit time dimension, since each file just represents a single point
along the time dimension. From reading the available LAS docs and FAQS, it
seems that I could model this dataset in the xml specification using
composite variables, but as far as I can tell, for a year's worth of data
I would need to repeatedly specify all 1460 URLs for each
variable. This seems horribly cumbersome and inefficient, so my question is
"Is there a better way?" Can a series of similar URLs be specified
using some kind of regular expression, for instance?

Thanks in advance for any suggestions.
 -Dave Brown
 dbrown@ucar.edu
 

--
Joe Sirott
 