Hi Ansley,
Yes, here's a quick test program that I whipped up the other day
that confirms what you've found (if anybody is interested).
Also note that if you take any file that has the fletcher32
checksum turned on and try to compress it (without shuffling) with
nccopy, then it also fails to be processed correctly.
Cheers,
Russ
On 09/02/16 10:32, Ansley C. Manke wrote:
Hi Russ,
This looks to be an issue with the "fletcher32 checksum" setting
in combination with the "shuffle" flag. We had implemented the
default settings for netCDF-4 files following an example in some
of the Unidata NetCDF pages.
That example also sets the "shuffle" flag to zero. In Ferret's
implementation the shuffle flag is a variable, set by the user,
but the fletcher32 checksum is always set.
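For context on what that setting does: fletcher32 asks HDF5 to store a
Fletcher-32 checksum with each data chunk so that corruption can be
detected on read. A minimal C sketch of the checksum algorithm itself
(illustrative only; HDF5's internal implementation may differ in details
such as byte order):

```c
#include <stddef.h>
#include <stdint.h>

/* Fletcher-32: two running 16-bit sums modulo 65535 over little-endian
 * 16-bit words, combined into one 32-bit value. */
uint32_t fletcher32(const unsigned char *data, size_t len)
{
    uint32_t sum1 = 0, sum2 = 0;
    size_t i;
    for (i = 0; i + 1 < len; i += 2) {
        uint32_t word = data[i] | ((uint32_t)data[i + 1] << 8);
        sum1 = (sum1 + word) % 65535;
        sum2 = (sum2 + sum1) % 65535;
    }
    if (len % 2) {  /* odd-length input: final byte padded with zero */
        sum1 = (sum1 + data[len - 1]) % 65535;
        sum2 = (sum2 + sum1) % 65535;
    }
    return (sum2 << 16) | sum1;
}
```

The filter stores this value alongside each chunk, which is why it shows
up as an extra entry in the chunk's filter pipeline.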
If I take out the call to nc_def_var_fletcher32 or if I
turn on the shuffle option, then I get files which
Panoply can read.
I can verify this behavior using the simple test programs that
come as part of the NetCDF distribution, so it is not a result of
how Ferret is linked with the netCDF libraries. I am making a
report to Unidata.
Meanwhile, when writing files using Ferret, I believe that using
/SHUFFLE=1 will work consistently to give you files that netCDF
java will read.
Until we hear back from Unidata, I will leave the settings as they
are in Ferret, except that if SHUFFLE is 0, then the call to set
the fletcher32 checksum will not be made.
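In C, that conditional would look something like this sketch against the
netCDF C API (ncid, varid, and the deflate level are placeholders, not
Ferret's actual code):

```c
#include <netcdf.h>

/* Sketch of the workaround described above: only request the
 * fletcher32 filter when shuffle is also on. */
void define_compression(int ncid, int varid, int shuffle, int deflate_level)
{
    /* shuffle and deflate are requested together */
    nc_def_var_deflate(ncid, varid, shuffle, /*deflate=*/1, deflate_level);

    /* skip the checksum when the data is not shuffled */
    if (shuffle)
        nc_def_var_fletcher32(ncid, varid, NC_FLETCHER32);
}
```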
-Ansley
On 2/2/2016 5:09 PM, Russ Fiedler wrote:
Hi Ansley,
I can confirm that turning on the shuffle option in ferret
allows the variables to be read both in Panoply and our in-house
software. Also, passing unshuffled compressed files written
by ferret through nccopy and explicitly specifying the shuffle
option works. However, not specifying -s fails, which is odd. Also
taking an uncompressed file and passing it through nccopy works
whether shuffling is specified or not. The command line version
of nccopy definitely does not turn on shuffling by default. That
just seems to be the case for the java version.
http://www.unidata.ucar.edu/software/netcdf/docs/netcdf_utilities_guide.html
Summarizing:
save/file=uncmp.nc/ncformat=netcdf4 var ! works
save/file=cmp.nc/def/ncformat=netcdf4 var ! fails, no shuffling
save/file=cmp_shuff.nc/def/ncformat=netcdf4/shuff var ! works
Passing through nccopy:
nccopy -k4 -d1 cmp.nc cmp_nccopy_noshuff.nc ! still fails, no shuffling
nccopy -k4 -d1 -s cmp.nc cmp_nccopy_shuff.nc ! now works, with shuffling
nccopy -k4 -d1 uncmp.nc uncmp_nccopy_noshuff.nc ! now compressed, works, but there's no shuffling!
nccopy -k4 -d1 -s uncmp.nc uncmp_nccopy_shuff.nc ! now compressed, works and is shuffled
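As an aside on what /SHUFF actually does: it enables the HDF5 shuffle
filter, which reorders the bytes of each chunk so that byte 0 of every
element comes first, then byte 1, and so on; for smoothly varying floats
this groups near-constant bytes into long runs that deflate compresses
much better. A standalone C sketch of the reordering (not the HDF5
implementation):

```c
#include <stddef.h>

/* Byte-shuffle as the HDF5 shuffle filter does: gather byte k of every
 * element together so deflate sees long runs of similar bytes.
 * in has nelems elements of elem_size bytes; out must be the same size. */
void shuffle_bytes(const unsigned char *in, unsigned char *out,
                   size_t nelems, size_t elem_size)
{
    size_t i, k;
    for (k = 0; k < elem_size; k++)
        for (i = 0; i < nelems; i++)
            out[k * nelems + i] = in[i * elem_size + k];
}
```

Shuffle should only affect the compression ratio, which is why it is so
odd that turning it on also changes whether netCDF-Java can read the
file at all.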
It definitely looks like something is being put in the file by
Ferret that is causing grief. I looked at the Ferret source code
and it all looks fine, and I mimicked the calls in a simple
program, which worked correctly. My guess is that it may be
something to do with the static compilation and other compilation
options, but I didn't get that far in my tests.
We'll turn on the shuffle option now.
Cheers,
Russ
On 03/02/16 10:32, Ansley C. Manke wrote:
Hi,
I experimented and found that if the file is written in Ferret
using the qualifier /SHUFFLE=1 and deflation, then it seems to
work correctly, at least in Panoply.
In looking at some of the netCDF documentation, I find this page
that discusses the netcdf-java tools.
http://www.unidata.ucar.edu/software/thredds/current/netcdf-java/reference/manPages.html
It says that shuffle=true is the default for nccopy. So that
seems to be why running the file through nccopy makes it
work in the java-netCDF apps, and this may be what's meant by
the "filter type =0" setting in the error message. Does this
seem like a good solution to you, Russ?
Ansley
On 1/28/2016 10:52 PM, Russ Fiedler wrote:
Hi,
This is possibly a problem with netcdf java rather than
ferret but anyway...
If we make a variable and save it in compressed netcdf4
format, it appears that java-based netcdf software can't
handle it.
e.g.
yes? let v=i[i=1:20]+j[j=1:20]
yes? save/file=cmp.nc/ncformat=netcdf4/def v ! compressed (actually it's larger, but just an example)
yes? save/file=uncmp.nc/ncformat=netcdf4 v ! uncompressed
Now loading up in Panoply (and another of our in-house apps,
which is where we spotted the problem). When we try to operate
on v, we get a message about an unknown filter type being zero
for the compressed version:
java.lang.RuntimeException: Unknown filter type=0
    at ucar.nc2.iosp.hdf5.H5tiledLayoutBB$DataChunk.getByteBuffer(H5tiledLayoutBB.java:198)
etc
The uncompressed version is fine since there are no filters
being applied.
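For reference, HDF5 identifies each filter in a chunk's pipeline by a
numeric id, and id 0 is reserved, so "Unknown filter type=0" suggests
the reader is misparsing the pipeline metadata rather than hitting a
real but unsupported filter. A small lookup of the standard ids (my
summary of the H5Z_FILTER_* registry, assuming netCDF-Java follows it):

```c
#include <stddef.h>

/* Standard HDF5 filter ids (the H5Z_FILTER_* constants); id 0 is
 * reserved and never appears in a valid chunk filter pipeline. */
const char *hdf5_filter_name(int id)
{
    switch (id) {
    case 1: return "deflate";
    case 2: return "shuffle";
    case 3: return "fletcher32";
    case 4: return "szip";
    case 5: return "nbit";
    case 6: return "scaleoffset";
    default: return NULL;   /* 0 and unknown ids: not a valid filter */
    }
}
```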
Now, here's where it gets weird. If nccopy is used to
convert the two files to compressed netcdf4 style, the
originally compressed file is still giving Panoply grief,
but the newly compressed file is handled just fine!
xwing-hf% nccopy -k 3 -d 1 cmp.nc cmp_ncopy.nc ! still no good
xwing-hf% nccopy -k 3 -d 1 uncmp.nc uncmp_ncopy.nc ! compressed file is fine!!!
Any clues? A known issue?
Occurs for V6.95, V6.96 and earlier by the looks.
Cheers,
Russ