Hello ferret-ers
Does anyone know how the standard deviation is computed in Ferret?
When I compute it on my own, it does not coincide with the one given by Ferret.
The maximum, minimum, and mean values do, but not the standard deviation...
The variable is called tmaskI, and it is two-dimensional:
yes? stat tmaskI
Total # of data points: 11400 (100*114*1*1)
# flagged as bad data: 11328
Minimum value: -1.3267
Maximum value: 0.23136
Mean value: -0.48291 (unweighted average)
Standard deviation: 0.39024
yes? list tmaskI[x=@max,y=@max]
0.2314
yes? list tmaskI[x=@min,y=@min]
-1.327
yes? let N=72
yes? list tmaskI[x=@sum,y=@sum]/N
-0.4829
Up to here everything coincides, but when I calculate the sd on my own:
yes? let tmaskI2=tmaskI^2
yes? let sd=(tmaskI2[x=@sum,y=@sum]/N)^0.5
yes? list sd
0.6192
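As a point of comparison (toy numbers below, not the tmaskI values): `(tmaskI2[x=@sum,y=@sum]/N)^0.5` is the root mean square of the data, whereas the textbook standard deviation subtracts the mean before squaring, and some packages additionally divide by N-1 (the sample form) rather than N. A minimal Python sketch of the three quantities:

```python
import math
import statistics

x = [1.0, 2.0, 3.0, 4.0]   # toy data standing in for the 72 valid points
n = len(x)
mean = sum(x) / n

# Root mean square: sqrt(sum(x^2)/N) -- what the Ferret commands above compute
rms = math.sqrt(sum(v * v for v in x) / n)

# Population standard deviation: subtract the mean before squaring, divide by N
sd_pop = math.sqrt(sum((v - mean) ** 2 for v in x) / n)

# Sample standard deviation: same, but divide by N-1
sd_samp = math.sqrt(sum((v - mean) ** 2 for v in x) / (n - 1))

print(rms, sd_pop, sd_samp)   # three different numbers when the mean is nonzero

# Python's statistics.stdev uses the sample (N-1) form
assert abs(sd_samp - statistics.stdev(x)) < 1e-12
```

The three values differ whenever the mean is nonzero, which is consistent with the RMS above (0.6192) exceeding Ferret's reported 0.39024; whether Ferret divides by N or N-1 is a detail worth checking in its documentation.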
I'm using the definition of the standard deviation. What does Ferret do
differently?
Thanks!!