
RE: Descriptor hard limitations



Well... if you want one that's giving us fits at NAVOCEANO, it's
Ferret's handling of fill values for packed netCDF data.

For a netCDF variable of type "float", Ferret will convert values of
NF_FILL_FLOAT to Ferret's "undefined", which is great. However,
because of the size of our datasets (see the NAVOCEANO DODS server),
we have to pack many variables, using the attributes "add_offset" and
"scale_factor", into a shorter form -- usually from type "float"
(4-byte float) to type "short" (2-byte integer). Ferret will
correctly unpack (Hooray!) almost everything. What it won't unpack
properly is the fill values of the "short" type: it will not convert
the "short" netCDF fill value into the "float" netCDF fill value (or
into Ferret's "undefined"). We cannot get around this by using
"short"s that unpack to NF_FILL_FLOAT, because then we cannot get a
usable range for the data (the fills are very extreme values). We are
now defining the netCDF attribute "_FillValue" to a value that is not
within the valid data range, but is still packable. The application
program (written in Ferret) then has to set instances of this value
to Ferret's "undefined" for proper handling by things like
contouring. To do this, the application program needs to know the
value of the "_FillValue" attribute, but Ferret won't let you read it
from the dataset: you have to look at a dataset dump to find it, then
rewrite the application program for that particular dataset.
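
To make the arithmetic concrete, here is a minimal sketch in plain
Python/NumPy (not Ferret; the scale/offset values are made up) of why
the packed short fill comes back as an ordinary-looking number, and
why no usable packing can reach NF_FILL_FLOAT:

    import numpy as np

    # netCDF default fill values (NF_FILL_SHORT / NF_FILL_FLOAT in the
    # Fortran interface, NC_FILL_SHORT / NC_FILL_FLOAT in C).
    FILL_SHORT = np.int16(-32767)
    FILL_FLOAT = np.float32(9.9692099683868690e+36)

    # Hypothetical packing for ocean temperatures around -2..38 degC.
    scale_factor = np.float32(0.001)
    add_offset   = np.float32(18.0)

    packed = np.array([-20000, 0, 20000, FILL_SHORT], dtype=np.int16)

    # Standard unpacking rule: unpacked = packed*scale_factor + add_offset
    unpacked = packed.astype(np.float32) * scale_factor + add_offset
    print(unpacked)
    # [-2.0, 18.0, 38.0, -14.767] -- the fill unpacks to -14.767, a
    # plausible-looking temperature, not anything flagged as missing.

    # For some short to unpack to FILL_FLOAT you would need
    # scale_factor ~ FILL_FLOAT/32767 ~ 3e32, which collapses the real
    # data range into a single packed step -- no usable precision.
    print(FILL_FLOAT / 32767)  # ~3.0e+32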

So we need:
1. When Ferret unpacks a packed netCDF dataset, it should convert the
netCDF fill value in the packed type to an "undefined" value in
Ferret.
2. If "_FillValue" or "missing_value" attributes are present, it
should convert data (packed or unpacked) with those values to
"undefined". (A sketch of the desired behavior follows this list.)

Down the road I would like:
3. Support for CF-convention compression (presumably via a function).
4. The ability for Ferret to read arbitrary attributes (both global
and per-variable) of a netCDF dataset and assign their values to
Ferret variables and strings. We really need this, mainly for plot
labeling. (Both are sketched below.)
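
For the record, here is roughly what both would look like, using the
netCDF4 Python module purely as an illustration (the file, variable,
and dimension names are hypothetical):

    import numpy as np
    from netCDF4 import Dataset

    ds  = Dataset("ocean.nc")          # hypothetical file
    idx = ds.variables["oceanpoint"]   # hypothetical gathered-index var

    # Item 4: read an arbitrary attribute by name instead of
    # eyeballing a dataset dump.
    compress = idx.getncattr("compress")       # e.g. "lat lon"
    dims  = compress.split()
    shape = tuple(len(ds.dimensions[d]) for d in dims)

    # Item 3: CF "compression by gathering" -- the index variable
    # holds positions into the flattened lat*lon grid (0-based,
    # row-major per the CF document); scatter the gathered values
    # back out, leaving everything else undefined (NaN).
    gathered = ds.variables["salt"][:]         # values at kept points
    full = np.full(np.prod(shape), np.nan, dtype=np.float32)
    full[np.asarray(idx[:], dtype=np.intp)] = gathered
    full = full.reshape(shape)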

Paul Farrar

-----Original Message-----
From: Ansley Manke
To: Richard D. Slater
Cc: Patrick Brockmann; ferret_users@ferret.pmel.noaa.gov;
ansley@pmel.noaa.gov
Sent: 9/18/02 2:04 PM
Subject: Re: Descriptor hard limitations

Hi Rick, and everyone,
Thanks for letting us know about your frustrations.  We can
raise all of these limits, and will do so in the next release after
version 5.41.

In general, please do tell us when you're running into any kind
of roadblock with Ferret.  Ferret users are stretching Ferret
to its limits in all kinds of ways, and we don't always hear about
it.  Most of these hard limits are because much of the core source
code in Ferret is f77 code.  Sometimes array dimensions can
simply be increased, or other minor changes can be made to
lift restrictions without too much trouble.

We're also working on a document that will list the values of
limits that are built in to Ferret.  I'll post a note about that when
it's available.

Ansley Manke

"Richard D. Slater" wrote:

> > Hi all,
> >
> > I frequently use descriptor files to work with collections of
> > monthly netCDF model output.
> >
> > Unfortunately, it seems that there is a hard-coded limitation on
> > the size of filenames. I have tested that 60 characters is the
> > maximum filename size. It is definitely not enough!
> >
> > I have lived with this problem by using symbolic links, but even
> > with this it is sometimes difficult to deal with limited-size
> > filenames. Could you take this into consideration for a future
> > release?
>
> I believe that this limit was extended in 5.33? I know that I also
> had to circumvent this problem, but I don't believe that it is an
> issue any longer.
>
> However, I keep bumping into many other hard limits in Ferret.
> These are:
>
> 1) 30 open files. Managing results from 6 models, 2 versions per
> model, often with multiple variables in different files, a limit of
> 30 can be exhausted quite quickly.
>
> 2) Something like 50,000 points? I have run into this when using
> Ferret to perform global, possibly masked, integrations over the
> above files.
>
> 3) 500 variables. I ran into this when trying to work around (1) by
> combining variables into a smaller number of files.
>
> 4) Some limit on the number of axes. I run into this when iterating
> over a page trying to get things just right.
>
> Because of (1), I've gotten to the point where I will close all open
> files after each plot (not page, mind you), and sometimes for each
> line on a plot. This might be causing me to hit some of the other
> limits, and could be less efficient, but it is often the only way
> that I can make some of the plots.
>
> Limits (2) and (4) cause Ferret to crash. I'm not sure about (3),
> but I don't think so.
>
> A couple of final comments/requests. First, I had asked some time
> ago whether these hard limits (and any others) could be documented
> somewhere, but a brief perusal of the documentation didn't show
> anything. Could someone point me to such documentation if it exists,
> or produce it if it doesn't?
>
> Second, would it be possible to have a version of Ferret which does
> not have these limits, but is limited only by available memory? Or
> maybe these limits could depend on the amount of memory that you
> allocate for Ferret?
>
> It is getting very tiring having to deal with these limits all the
> time -- enough so that even though I like Ferret very much, I am
> looking for alternatives.
>
> I am currently running version 5.40 under Linux.
>
> Rick Slater

--
Ansley Manke  Pacific Marine Environmental Laboratory  Seattle WA
(206)526-6246



