Hi,
Ferret's @var calculates the variance as follows:
var =
((data_point_1-mean_of_data)^2+(data_point_2-mean_of_data)^2+...) /
num_of_data_points
The standard deviation is:
stdev = (var)^0.5
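To make the formulas above concrete, here is a small NumPy sketch of the same divide-by-N (population) variance and standard deviation. The sample values are made up for illustration only; they are not the actual tmaskI field.

```python
import numpy as np

# Made-up sample data standing in for the unmasked points of a field.
data = np.array([-1.3267, 0.23136, -0.48291, -0.9, -0.1])

# Variance as in the formula above: mean squared deviation from the mean,
# divided by the number of data points (no N-1 correction).
mean = data.mean()
var = np.sum((data - mean) ** 2) / data.size
stdev = var ** 0.5

# NumPy's built-ins give the same divide-by-N result with ddof=0.
assert np.isclose(var, data.var(ddof=0))
assert np.isclose(stdev, data.std(ddof=0))
```

Note that this is the population form (divide by N); a sample standard deviation (divide by N-1) would give a slightly larger value.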
Also be careful with @sum: in the 2-dimensional case it may do a
weighted sum. Try a 1-dimensional calculation first to make sure the
result is not weighted.
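There is a second issue in the commands you quoted: sd was computed as sqrt of the sum of tmaskI^2 over N, without subtracting the mean first. That is the root-mean-square of the raw values, not the standard deviation, and the two only agree when the mean is zero. A sketch with made-up numbers (not the real tmaskI data) shows the difference:

```python
import numpy as np

# Illustrative data only; the mean is deliberately nonzero.
data = np.array([-1.3267, 0.23136, -0.48291, -0.9, -0.1])
n = data.size

# What the quoted commands compute: sqrt(sum(x^2)/N) = the RMS of raw values.
rms = np.sqrt(np.sum(data ** 2) / n)

# The standard deviation subtracts the mean before squaring.
mean = data.mean()
sd = np.sqrt(np.sum((data - mean) ** 2) / n)

# The two differ whenever the mean is nonzero...
assert not np.isclose(rms, sd)
# ...and are related by rms^2 = sd^2 + mean^2.
assert np.isclose(rms ** 2, sd ** 2 + mean ** 2)
```

Since tmaskI has a mean of about -0.48, this fully accounts for the gap between 0.6192 and Ferret's 0.39024 (0.6192^2 is close to 0.39024^2 + 0.48291^2).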
Fabian
Igaratza Fraile-Ugalde wrote:
Hello ferret-ers
Does anyone know how the standard deviation is computed in Ferret?
When I compute it on my own it does not coincide with the one Ferret
gives. The maximum, minimum and mean values do coincide, but not the
standard deviation...
The variable is called tmaskI and is two-dimensional:
yes? stat tmaskI
Total # of data points: 11400 (100*114*1*1)
# flagged as bad data: 11328
Minimum value: -1.3267
Maximum value: 0.23136
Mean value: -0.48291 (unweighted average)
Standard deviation: 0.39024
yes? list tmaskI[x=@max,y=@max]
0.2314
yes? list tmaskI[x=@min,y=@min]
-1.327
yes? let N=72
yes? list tmaskI[x=@sum,y=@sum]/N
-0.4829
Up to here everything coincides, but when I calculate the SD on my own:
yes? let tmaskI2=tmaskI^2
yes? let sd=(tmaskI2[x=@sum,y=@sum]/N)^0.5
yes? list sd
0.6192
I'm using the definition of standard deviation. What does Ferret do
differently?
Thanks!!