
Re: Descriptor hard limitations

Hi Rick, and everyone,
Thanks for letting us know about your frustrations.  We can
raise all of these limits, and will do so in the next release after
version 5.41.

In general, please do tell us when you're running into any kind
of roadblock with Ferret.  Ferret users are stretching Ferret
to its limits in all kinds of ways, and we don't always hear about
it.  Most of these hard limits are because much of the core source
code in Ferret is f77 code.  Sometimes array dimensions can
simply be increased, or other minor changes can be made to
lift restrictions without too much trouble.

We're also working on a document that will list the values of
limits that are built in to Ferret.  I'll post a note about that when
it's available.

Ansley Manke

"Richard D. Slater" wrote:

> > Hi all,
> >
> > I frequently use descriptor files to work with collections of
> > monthly netCDF model output.
> >
> > Unfortunately, it seems that there is a hard-coded limit on the length of filenames.
> > I have tested that 60 characters is the maximum filename length.
> > That is definitely not enough!
> >
> > I have lived with this problem by using symbolic links, but even so
> > it is sometimes difficult to deal with the limited filename length.
> > Could you take this into consideration for a future release?
> I believe that this limit was extended in 5.33? I know that I also had to circumvent this problem, but I don't believe it is an issue any longer.
> However, I keep bumping into many other hard limits in ferret. These are:
> 1) 30 open files. Managing results from 6 models, with 2 versions per model and often multiple variables in different files, a limit of 30 can be exhausted quite quickly.
> 2) Something like 50,000 points? I have run into this when using ferret to perform global, possibly masked, integrations over the above files.
> 3) 500 variables; I ran into this when trying to work around (1) by combining variables into a smaller number of files.
> 4) Some limit on the number of axes. I run into this when iterating over a page trying to get things just right.
> I've gotten to the point, because of (1), where I will close all open files after each plot (not page, mind you) and sometimes for each line on a plot. This might be causing me to hit some of the other limits, and could be less efficient, but it is often the only way that I can make some of the plots.
> Limits (2) and (4) cause ferret to crash. Not sure about (3), but don't think so.
> A couple of final comments/requests. First, I had asked some time ago whether these hard limits (and any others) could be documented somewhere, but a brief perusal of the documentation didn't show anything. Could someone point me to such documentation if it exists, or produce it if it doesn't?
> Second, would it be possible to have a version of ferret which does not have these limits, but is limited only by available memory? Or maybe these limits could depend on the amount of memory that you allocate for ferret?
> It is getting very tiring having to deal with these limits all of the time. Enough so that even though I like ferret very much I am looking for alternatives.
> I am currently running version 5.40 under linux.
> Rick Slater
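For anyone else running into the 60-character filename limit in descriptor files, the symbolic-link workaround Rick mentions can be sketched in shell. The directory and file names below are hypothetical, just for illustration: you point short links at the long-pathed netCDF files and list the links in the descriptor instead.

```shell
# Give long-named netCDF files short aliases that fit within the
# descriptor's 60-character filename limit.
# (All paths here are made up for the example.)
mkdir -p /tmp/ferret_demo/links
touch /tmp/ferret_demo/ocean_model_monthly_mean_output_run01_1990_jan.nc

i=1
for f in /tmp/ferret_demo/*.nc; do
    # Create (or refresh) a short symlink, e.g. links/m1.nc
    ln -sf "$f" "/tmp/ferret_demo/links/m$i.nc"
    i=$((i + 1))
done
```

The descriptor file then only ever sees names like m1.nc, m2.nc, and so on, regardless of how deeply nested or verbosely named the real output files are.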

Ansley Manke  Pacific Marine Environmental Laboratory  Seattle WA  (206)526-6246


Dept of Commerce / NOAA / OAR / PMEL / TMAP
