Thanks for the report and the example script. What you're seeing is
entirely due to how numbers are evaluated with the grave-accent syntax.
Grave accents are evaluated as the command is read in, at the
command-parsing step. So, for instance, in the LIST command below the
/PREC=7 qualifier has no effect. When running interactively, you'll see
the command echoed back with the grave-accent expressions already
evaluated, and that is exactly the command Ferret runs. So `xa` is
always evaluated as -5.0008, and so on. There's a setting that has these
numbers evaluated to only 5 significant digits, less than full single
precision.
What you would want is this (shortened listing):
yes? ! def_0: define axes with predefined constants
yes? let nx=577; let ny=257;
yes? let xa=-5.00081; let xe=31.2892
yes? let ya=51.8717; let ye=68.0006
yes? def axis/x=`xa`:`xe`/npoints=`nx`/units="degrees_east" xax0
!-> def axis/x=-5.0008:31.289/npoints=577/units="degrees_east"
yes? def axis/y=`ya`:`ye`/npoints=`ny`/units="degrees_north" yax0
!-> def axis/y=51.872:68.001/npoints=257/units="degrees_north"
yes? list/prec=7 `xa`,`xe`,`ya`,`ye`
!-> list/prec=7 -5.0008,31.289,51.872,68.001
Column 1: cnst is constant
Column 2: cnst is constant
Column 3: cnst is constant
Column 4: cnst is constant
cnst cnst cnst cnst
I / *: -5.000800 31.28900 51.87200 68.00100
yes? list/nohead/prec=7 `xa`,`xe`,`ya`,`ye`
!-> list/nohead/prec=7 -5.00081,31.2892,51.8717,68.00060
I / *: -5.000810 31.28920 51.87170 68.00060
Yours seems to me to be a reasonable request - one should be able to
get back full single precision when evaluating a single-precision
number. I've put this on our list to change in the next release.
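The 5-digit-versus-7-digit behavior above can be reproduced outside
Ferret. A minimal sketch in Python (the `to_real4` helper is just an
illustration, emulating real*4 with the standard `struct` module):

```python
import struct

def to_real4(x):
    """Round-trip a Python float through single precision (real*4)."""
    return struct.unpack('f', struct.pack('f', x))[0]

xa = to_real4(-5.00081)
# 5 significant digits -- what the grave-accent echo currently shows:
print(f"{xa:.5g}")   # -5.0008
# 7 significant digits -- the full single-precision value survives:
print(f"{xa:.7g}")   # -5.00081
```

So the constant itself is stored accurately in real*4; only the 5-digit
formatting of the echoed command loses the last digits.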
Torsten Seifert wrote:
please try the attached script, which demonstrates that the conversion
of float constants may give different results (here using explicit
constants and/or the `xa` syntax in the axis definitions).
The deviations are small, but definitely larger than is to be expected
for real*4 (with a 24-bit mantissa the 7th digit may change, but not
the 6th digit).
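The 24-bit-mantissa argument can be checked directly. A quick sketch in
Python (using the standard `struct` module to emulate real*4; not part
of the original script):

```python
import struct

def to_real4(x):
    """Round a double to the nearest single-precision (real*4) value."""
    return struct.unpack('f', struct.pack('f', x))[0]

# Round-to-nearest in real*4 has relative error at most 2**-24 (~6e-8),
# so only the 7th significant digit can move; the first 6 must survive.
for v in (-5.00081, 31.2892, 51.8717, 68.0006):
    r = to_real4(v)
    assert abs(r - v) / abs(v) <= 2**-24
    assert f"{r:.6g}" == f"{v:.6g}"   # 6th significant digit unchanged
print("real*4 preserves 6 significant digits of all four constants")
```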
I am wondering how that could happen. It is clear that a constant
assigned to a variable by LET is converted to real*4 accuracy. I
would assume that the same conversion is done if the constants are
specified explicitly in the axis definition, because they are assigned
to internal buffers.
Listing the same constants with Fortran shows that def_1 is right,
since no changes appear up to the 7th digit. Thus the conversions by
`xa` etc. seem to give erroneous results. Could that be fixed, please?