[ferret_users] slow work with netcdf4 files
Hi all,
working with netCDF4 files in compressed form helps to save storage
space - in some cases more than 70 %.
However, in some cases working with these files turns out to be really slow.
I have concatenated about 120 files, each with
6 time steps, via a descriptor file. Each file contains 60 variables on
300x200x70 grid points. Not
a small project. The files are written in time slices, with time as
the "unlimited" dimension.
A simple "shade" for a given time step turns out to be sufficiently
fast. A z-t plot over all files, e.g.
shade/y=0/x=0/z=0:500/t=1-jan-2000:31-dec-2010 some_variable
also returns a figure within a few minutes, although all files have to be
partially decompressed. However, commands like
shade/t=1-jan-2005 temp[k=@din]
do not return a result even after 50 minutes.
As a way out I could use a single file. However, in the future the files
will be put onto
a THREDDS or LAS server using THREDDS aggregation over time, so the user
would not be aware of the
single files and would give up for sure.
Is there a trick to make such operations faster? I am thinking about
ways to confine decompression to those parts of
the files that are really needed.
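I wonder whether rechunking the files would help here. A sketch, assuming
the standard netCDF `nccopy` utility is available and that the dimensions
are named `time`, `depth`, `lat`, `lon` (the real names may differ; see
`ncdump -h`), and with chunk sizes that are purely illustrative, not tuned:

```shell
# Inspect the current chunk layout (the -s flag shows the hidden
# _ChunkSizes attributes on each compressed variable).
ncdump -hs input.nc | grep _ChunkSizes

# Rewrite with deflate level 1 and chunks spanning the full depth axis
# but only a small horizontal tile, so a vertical integral at a single
# time step decompresses only the chunks it actually touches.
nccopy -d 1 -c time/1,depth/70,lat/50,lon/50 input.nc rechunked.nc
```

If the current chunks cover whole horizontal slabs, every column
operation would force decompression of the entire slab, which could
explain the timings above.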
Best,
Martin Schmidt