
Re: [ferret_users] slow work with netcdf4 files



Hi Martin,

You might want to play around with the chunking and deflation levels of
your files.

The utility nccopy that now comes with netCDF can be used to reformat
the files with different chunking and deflation.
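For example, something along these lines might be a starting point (the
dimension names, chunk sizes, and file names here are only placeholders;
adjust them to the actual layout of your files):

  # deflate at level 1 and rechunk so each chunk holds a full vertical column
  nccopy -d 1 -c time/1,depth/70,lat/200,lon/300 input.nc rechunked.nc

Chunks that span the whole vertical column may help operations like
temp[k=@din], since a depth integral then has to read and uncompress far
fewer chunks.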

Russ

On Thu, 2012-02-16 at 06:57 +1100, Martin Schmidt wrote:
> Hi all,
> 
> working with netCDF-4 files in compressed form helps to save storage 
> space - in some cases more than 70 %.
> However, in some cases working with these files turns out to be really slow. 
> I have concatenated about 120 files, each with
> 6 time steps, by a descriptor file. Each file contains 60 variables with 
> 300x200x70 grid points. Not
> a small project. The files are written in time slices with time as the 
> "unlimited" dimension.
> 
> Making a simple "shade" for a given time step turns out to be sufficiently 
> fast, as do z-t plots over all files:
> 
> shade/y=0/x=0/z=0:500/t=1-jan-2000:31-dec-2010 some_variable
> 
> returns a figure within a few minutes, although all files have to be 
> partially uncompressed. However, commands like
> 
> shade/t=1-jan-2005 temp[k=@din]
> 
> do not return a result even after 50 minutes.
> 
> As a way out I could use a single file. However, in the future the files will 
> be put onto
> a THREDDS or LAS server using THREDDS aggregation over time, so the user 
> will not be aware of the
> single files and would give up for sure.
> 
> Is there a trick to make such operations faster? I am thinking about ways to 
> confine decompression to those areas of
> the files that are really needed.
> 
> Best,
> Martin Schmidt



