
Time Averaging with Masked Data



Hello again,

I have a rather simple, perhaps 'duh'-like question to ask concerning
the way Ferret treats masked data points.  I have a variable called
Divergence/Convergence (calculated from scatterometer data).  The
Div/Con values are masked at data points over land and along adjacent
coastlines, and the same mask has been applied throughout the
dataset.  This means New Guinea, Celebes, some of the smaller islands,
etc. are assigned missing values.  Now, if I use the following subset
of commands (with the data already opened, yada, yada, yada):

define symbol hl = hlimits=120:270:10
define symbol vl = vlimits=-60:20:10
define symbol dl = level=(-0.000016,0.000016,0.0000004)

fill/set/($hl)/nolab/($dl) Div[y=5n:5s@ave,l=379:743@shn:3]  ! average over the 5S-5N latitude band
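
In case it's relevant: to see how many valid points actually feed each
band average, I believe something like the following would work (a quick
sketch using Ferret's @NGD "number of good points" transformation;
untested):

let ngood = Div[y=5n:5s@ngd]   ! count of unmasked points in the band at each (x,t)
list/x=120:125/l=379 ngood     ! spot-check near Celebes at one time step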

I expected to find blank regions on the Hovmoller diagram at all times
near Celebes Island (120 to 123 or so E longitude) and New Guinea (135
to ~145E).  Instead, it looks as though Ferret averaged across the
band, skipping the data voids (masked points) and forming an average
from whatever data was available.  So my questions are:

1) Exactly how does Ferret handle missing data when averaging over
latitude bands and computing running means?

2) How can I get Ferret to refrain from averaging when there are large
data voids over a given region (or latitude band)?  A rough sketch of
what I mean follows below.
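
Something along these lines is what I'm imagining, using @NGD/@NBD to
count good and bad points and an IF/THEN definition to blank out the
average wherever too much of the band is masked (the 80% threshold is
arbitrary, and I haven't tested this):

let ngood   = Div[y=5n:5s@ngd]                      ! unmasked points in the band
let ntotal  = Div[y=5n:5s@ngd] + Div[y=5n:5s@nbd]   ! unmasked + masked points
let div_ave = IF ngood GE 0.8*ntotal THEN Div[y=5n:5s@ave]  ! missing where voids are large
fill/set/($hl)/nolab/($dl) div_ave[l=379:743@shn:3]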

Thanks in advance.

Scocks
Atmospheric Sciences
Texas A&M


