
Re: [ferret_users] problems with short data type



Hi Martin,
That is not our intended behavior when writing data. Ferret should keep the original attributes and data types when writing out a file variable, for instance when using Ferret to make a subset. This is in fact what the most recent version of Ferret does.

Here is the netCDF header for an example file:
> ncdump -h example_scale.nc

        short Elev(Ti, Yc, Xc) ;
                Elev:long_name = "Surface elevation" ;
                Elev:units = "meter" ;
                Elev:valid_range = -9., 9. ;
                Elev:_FillValue = -32768s ;
                Elev:missing_value = -32767s ;
                Elev:scale_factor = 0.000274674967954587 ;
                Elev:add_offset = 0. ;
If I write a subset of this variable with Ferret:
        NOAA/PMEL TMAP
        FERRET v6.85 
        Linux 2.6.32-358.23.2.el6.x86_64 64-bit - 11/12/13
        18-Feb-14 12:40    

yes? use example_scale.nc
yes? sh  att elev
     attributes for dataset: ./example_scale.nc
 Elev.long_name = Surface elevation
 Elev.units = meter
 Elev.valid_range = -9, 9
 Elev._FillValue = -32768
 Elev.missing_value = -32767
 Elev.scale_factor = 0.000274675
 Elev.add_offset = 0
yes? save/y=59/clobber/file=subset.nc elev
 LISTing to file subset.nc

Now here are the attributes in the subset file:
> ncdump -h subset.nc
netcdf subset {
dimensions:
        XC = 35 ;
        YC59_59 = 1 ;
        bnds = 2 ;
        TI = UNLIMITED ; // (1 currently)
variables:
...
        short ELEV(TI, YC59_59, XC) ;
                ELEV:missing_value = -32767s ;
                ELEV:_FillValue = -32768s ;
                ELEV:long_name = "Surface elevation" ;
                ELEV:units = "meter" ;
                ELEV:scale_factor = 0.000274674967954587 ;
                ELEV:add_offset = 0. ;
                ELEV:history = "From example_scale" ;

// global attributes:
                :history = "FERRET V6.85   18-Feb-14" ;
                :Conventions = "CF-1.0" ;
}

In Ferret v6.842, one can ask for this behavior with the setting:
set att/outtype=all elev
Unfortunately, we implemented the feature that writes data in their original data type without also writing these essential attributes. My apologies for the time you had to spend on this.
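For an append script like yours, that might look something like the following sketch (the file names are just placeholders for your daily ASCAT files, and I have repeated the SET ATT line for each data set to be safe):

use ascat_day1.nc
set att/outtype=all wind_speed           ! ask Ferret to write the variable's attributes as in the input
save/clobber/file=wind_all.nc wind_speed

use ascat_day2.nc
set att/outtype=all wind_speed
save/append/file=wind_all.nc wind_speed
! ... and so on for the remaining daily files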

Ansley

On 2/18/2014 11:16 AM, Martin Schmidt wrote:
Hi,

while processing ASCAT netCDF data I stumbled onto a problem related to data accuracy and data types.

Some ASCAT data come in netCDF format as daily files with a time stamp, and I want to combine them into a single file.
I use a script that reads the files with Ferret and writes the data with "append". This worked well with older Ferret versions
like 6.72 or earlier, but does not work with recent versions (6.842).
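Stripped down, the script does essentially this (the file names here are shortened placeholders):

use ascat_20110101.nc
save/clobber/file=wind_2011.nc wind_speed

use ascat_20110102.nc
save/append/file=wind_2011.nc wind_speed
! ... and so on for each daily file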

The problem seems to be the following:

In the original files data are stored with type "short" and a scale factor:

short wind_speed(time, depth, latitude, longitude) ;
                wind_speed:long_name = "wind speed" ;
                wind_speed:units = "m/s" ;
                wind_speed:scale_factor = 0.01 ;
                wind_speed:add_offset = 0. ;
                wind_speed:valid_min = 0. ;
                wind_speed:valid_max = 6000. ;
                wind_speed:_FillValue = -32768s ;
                wind_speed:standard_name = "wind_speed" ;

A typical wind speed value is 1010, corresponding to 10.1 m/s. ncdump shows integers like 1010 in the original files.
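For reference, the value is unpacked with the usual netCDF convention, value = short * scale_factor + add_offset, which is easy to check in Ferret:

yes? let packed = 1010
yes? list packed * 0.01 + 0.    ! packed * scale_factor + add_offset = 10.1 m/s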

Old Ferret reads these data correctly and writes reals (floats):

float WIND_SPEED(TIME, DEPTH, LATITUDE81_400, LONGITUDE481_840) ;
                WIND_SPEED:_FillValue = -327.68f ;
                WIND_SPEED:long_name = "wind speed" ;
                WIND_SPEED:units = "m/s" ;
                WIND_SPEED:history = "From dummy_2011" ;

Checking with ncdump I see values like 10.1 in the output file - everything seems to be correct.

But recent Ferret writes shorts, now without a scale factor:

short WIND_SPEED(TIME, DEPTH, LATITUDE81_400, LONGITUDE481_840) ;
         WIND_SPEED:_FillValue = -327s ;
         WIND_SPEED:long_name = "wind speed" ;
         WIND_SPEED:units = "m/s" ;
         WIND_SPEED:history = "From dummy_2011" ;

Checking with ncdump, I find integers like 10 in the output file, which means reduced accuracy. Also, the
fill value is no longer correct.
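My guess is that the already unpacked values are simply truncated back to shorts when they are written without the scale factor; at least the arithmetic matches what I see in the file:

yes? list int(10.1)               ! a 10.1 m/s value stored as short becomes 10
yes? list -32768 * 0.01           ! the unpacked fill value: -327.68
yes? list int(-32768 * 0.01)      ! stored as short: -327, as in the new output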

My questions:
- this happens without any warning, and it took me several hours to figure it out. Is this a desired feature? I could understand it
if the input data type were used for output, but what about the reduced accuracy? At least a warning would be nice.

- how can I find such dangerous traps, and how can I avoid them? Ferret is great at just opening files and delivering data in their context
of dimensions and coordinates. One of its great advantages has been that no detailed file analysis is needed.

So this leaves me a little bit helpless.

Best,
Martin


