[Thread Prev][Thread Next][Index]
large NetCDF files (>2.1 GB) in ferret
Hi -
I'm trying to get Ferret 5.41 working with large NetCDF files (> 2.1 GB).
Problem: it doesn't seem to work.
Symptoms:
A) When trying to read an existing large NetCDF file (set data/format=cdf
large_file.cdf), I get the error:
    ** unknown netCDF error: -31
    is this a CDF file?
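One quick thing worth checking (a sketch, not Ferret-specific; demo.nc below is a stand-in for your actual file): NetCDF-3 files begin with a four-byte magic number. The classic format starts with CDF\001, while the 64-bit-offset format that large-file support writes starts with CDF\002. A library built without large-file support will refuse the \002 variant as "not a CDF file", which could explain error A:

```shell
# Inspect the first four bytes of the file:
#   "C D F 001" -> classic format (32-bit offsets, ~2 GiB limit)
#   "C D F 002" -> 64-bit-offset format (needs large-file support)
# demo.nc is a fabricated stand-in -- point od at large_file.cdf instead.
printf 'CDF\002' > demo.nc      # fake 64-bit-offset header, for illustration
od -An -c -N 4 demo.nc
```

If the file shows CDF\002, a Ferret linked against an older netCDF library could fail to open it even though the ncdump from your newer library handles it fine.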
B) When trying to create a large NetCDF file (for example, by appending
repeatedly along the t axis), I get the error:
    ferret: lnetcdf/lposixio.c:238: px_rel: Assertion 'pxp->bf_offset <=
    offset && offset < pxp->bf_offset + (off_t) pxp->bf_extent' failed.
    core dumped.
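For what it's worth, the failure point lines up with the classic-format limit: classic NetCDF stores file offsets as signed 32-bit integers, so the largest addressable byte offset is 2^31 - 1, just over 2.1 GB. The arithmetic below is only a sanity check on that figure, not anything taken from the Ferret source:

```shell
# Largest offset a signed 32-bit integer can hold -- the point at which
# a classic-format NetCDF file can no longer grow:
limit=$(( (1 << 31) - 1 ))
echo "$limit bytes"                      # 2147483647
echo "$(( limit / 1000000 )) MB (~2.1 GB)"
```

An append along t that pushes the file past that offset in a classic-format file would be writing somewhere the format cannot represent, which is consistent with an assertion failure deep in the I/O layer.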
Background:
Our netCDF libraries are built with the large-file flag. I can ncgen
and ncdump large files as in the Unidata examples.
Any ideas?
We are setting about reinstalling Ferret from scratch and upgrading from
5.41 to 5.5 now, but I thought I'd check whether this rings any bells.
Thanks,
- Sim