3 argument shade, sigma coordinates, layer depths



Dear Ferret users,

I've spent some time trying to make plots of sigma coordinate data as
shown in the example in the FAQ, using the 3 argument shade command. I
want the layer interfaces to be plotted at the right depth, and the
layers should have solid colors. I have it working almost correctly,
except for the depth of the layer interfaces. The problem is that my
depth variable is defined as layer thickness, not absolute depth. I
create x_page and y_page variables as follows:

! copy the x coordinates (longitude) from the temp variable,
! spread over all z levels
let xfield = x[g=temp_total] + 0*z[g=temp_total]

! create y (depth) as the running sum of the layer thicknesses dp1;
! this gives the depth of each layer's bottom interface
let yfield = dp1[g=temp_total,k=@rsum]

x (longitude) is copied from my temp_total variable. To get y (depth)
I use a running sum of the layer thicknesses. So far so good, but when I
make a plot with

shade /vlimits=250:0:50 temp_anom, xfield, yfield

the layer depths aren't interpreted correctly:
1) the first layer is not extrapolated up to the surface (depth 0);
2) the shaded "boxes" are centered on the layer depths instead of being
   drawn between two successive interfaces (see the sketch below for
   what I mean).
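
For concreteness, here is a minimal sketch of the centring idea (ymid
is a name I have made up here, and I am assuming dp1 holds the
per-layer thickness in metres): if shade centres each box on the
coordinate it is given, then passing the layer midpoints instead of
the bottom interfaces should put the implied box edges near the true
interfaces, at least where the thickness varies slowly.

! depth of each layer's bottom interface (running sum of thicknesses)
let ybottom = dp1[g=temp_total,k=@rsum]
! depth of each layer's midpoint: half a thickness above its bottom
let ymid = ybottom - 0.5*dp1[g=temp_total]
shade /vlimits=250:0:50 temp_anom, xfield, ymid

This still leaves problem 1, though: the top layer's midpoint sits at
half its thickness below the surface, so nothing is drawn between
there and depth 0.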

I would use bounds or edges, but I don't know how to combine this with
the 3 argument shade command, as the axis data is coming from a
variable instead of from a real axis.
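
One alternative I can think of, sketched here with made-up axis values
(and assuming temp_anom and the depth variable share the same layer
axis), is to sidestep the 3 argument shade entirely: regrid onto a
real depth axis with ZAXREPLACE, since a real axis can carry explicit
cell boundaries via DEFINE AXIS/EDGES.

! a real depth axis; with /edges the listed values are cell boundaries
define axis/z/depth/units=meters/edges zdepth = {0, 10, 25, 50, 100, 250}
! map from layer index to true depth, using the midpoint depths above
let temp_on_z = zaxreplace(temp_anom, ymid, z[gz=zdepth])
shade /vlimits=250:0:50 temp_on_z

The catch is that ZAXREPLACE interpolates, so the layers would no
longer be solid blocks of colour (which is exactly what I want to
keep), but it does handle the surface cleanly with an edge at 0.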

Any hints on how to do this properly?

    Hein Zelle

>-----------------------------------------------------------------<
    Hein Zelle
    Dept. of Oceanographic Research
    KNMI, The Netherlands
    work:        zelle@knmi.nl     http://www.knmi.nl/~zelle
    private:     hein@icce.rug.nl  http://www.icce.rug.nl/~hein
    Phone:       +31 (0)30 2206704
>-----------------------------------------------------------------<