
Re: Bug in isCoord? and prob with addXml.pl



Hi Petra,

The following patch to xml/perl/LASNetCDF.pm will fix the problem:

diff -r1.29 LASNetCDF.pm
25d24
<     $self->{var} = $self->{parent}->getVariable($name);
41c40
<     return $self->{var};
---
>     return $self->{parent}->getVariable($self->{name});
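
For anyone reading the diff out of context: the deleted line stored the
coordinate variable in $self->{var} up front, and the accessor in the
second hunk returned that cached value; with the patch the variable is
looked up from the parent file on every call instead. Here is a hedged
sketch of the patched accessor -- only the return statement comes from
the diff, and the surrounding method and field names are assumptions,
not the actual LASNetCDF.pm source:

# Sketch only: the accessor name is a guess; the body is the ">" line
# from the diff. Nothing is cached at construction time any more,
# presumably so that a coordinate variable that is not yet known when
# this object is built can still be found later.
sub getVariable {
    my ($self) = @_;
    return $self->{parent}->getVariable($self->{name});
}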


Petra Udelhofen wrote:

> Hi Joe,
> could you drop me a line on how you fixed the second bug in addXml.pl? I
> have a similar problem:
> Dimension lon doesn't have a coordinate variable at ./addXml.pl line 76
>
> Thank you,
> Petra
> p.s. Here is a sample ncdump from a netcdf file:
> netcdf mm_ucla {
> dimensions:
>         lon = 72 ;
>         lat = 44 ;
>         level = 10 ;
>         time = 12 ;
> variables:
>         float T(time, level, lat, lon) ;
>                 T:missing_value = 9.9e+30f ;
>                 T:long_name = "temperature" ;
>                 T:units = "K" ;
>         float Z(time, level, lat, lon) ;
>                 Z:missing_value = 9.9e+30f ;
>                 Z:long_name = "geopotential height" ;
>                 Z:units = "m" ;
>         float lon(lon) ;
>                 lon:long_name = "longitude" ;
>                 lon:units = "degrees_east" ;
>         float lat(lat) ;
>                 lat:long_name = "latitude" ;
>                 lat:units = "degrees_north" ;
>         float level(level) ;
>                 level:long_name = "pressure" ;
>                 level:units = "millibars" ;
>         int time(time) ;
>                 time:long_name = "time" ;
>                 time:units = "months since 1900-01-15" ;
>
> // global attributes:
>                 :title = "UCLA Model data" ;
>                 :Conventions = "COARDS" ;
>                 :history = "Converted from GrADS mm_ucla.ctl on Tue Feb 29
>                     20:27:03 2000 GMT -05:00 with conv_ctl_netcdf
>                     (SUNY-SB/ITPA/SPARC-DC/PMU)" ;
> data:
>
>  lon = -180, -175, -170, -165, -160, -155, -150, -145, -140, -135, -130,
>     -125, -120, -115, -110, -105, -100, -95, -90, -85, -80, -75, -70, -65,
>     -60, -55, -50, -45, -40, -35, -30, -25, -20, -15, -10, -5, 0, 5, 10, 15,
>     20, 25, 30, 35, 40, 45, 50, 55, 60, 65, 70, 75, 80, 85, 90, 95, 100, 105,
>     110, 115, 120, 125, 130, 135, 140, 145, 150, 155, 160, 165, 170, 175 ;
>
>  lat = -86, -82, -78, -74, -70, -66, -62, -58, -54, -50, -46, -42, -38, -34,
>     -30, -26, -22, -18, -14, -10, -6, -2, 2, 6, 10, 14, 18, 22, 26, 30, 34,
>     38, 42, 46, 50, 54, 58, 62, 66, 70, 74, 78, 82, 86 ;
>
>  level = 850, 700, 500, 300, 200, 100, 70, 10, 5, 1 ;
>
>  time = 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11 ;
> }
> p.p.s. I will post further info about our LAS site to the list once I
> have data online. Thanks.
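
A quick, independent way to check a file like the sample above against
the rule addXml.pl appears to be enforcing (every dimension has a
one-dimensional variable of the same name, i.e. a COARDS coordinate
variable) is to scan the "ncdump -h" header. The script below is only
a rough standalone sketch -- it is not part of LAS or addXml.pl, and
its regexes only cope with simple headers like the sample:

#!/usr/bin/perl
# Rough standalone sanity check, independent of LAS/addXml.pl: report
# any netCDF dimension that lacks a one-dimensional variable of the
# same name (a "coordinate variable"). Parses "ncdump -h" output, so
# it only handles simple headers like the sample above.
use strict;
use warnings;

my $file = shift or die "usage: $0 file.nc\n";
open my $fh, '-|', 'ncdump', '-h', $file or die "can't run ncdump: $!\n";

my $section = '';
my (%dims, %vardims);
while (<$fh>) {
    $section = $1 if /^(dimensions|variables):/;
    if ($section eq 'dimensions' && /^\s+(\w+)\s*=/) {
        $dims{$1} = 1;                      # e.g. "lon = 72 ;"
    } elsif ($section eq 'variables' && /^\s+\w+\s+(\w+)\(([^)]*)\)/) {
        $vardims{$1} = $2;                  # e.g. "float lon(lon) ;"
    }
}
close $fh;

for my $d (sort keys %dims) {
    my $ok = exists $vardims{$d} && $vardims{$d} eq $d;
    print "$d: ", $ok ? "has a coordinate variable\n"
                      : "has NO coordinate variable\n";
}

For the sample file it should report a coordinate variable for lon,
lat, level, and time, which is why the error from addXml.pl looked
spurious in the first place.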
> -------------------
> >Hi Andrew,
> >
> >Thanks for the bug fix. The second bug had already been fixed; both bug
> >fixes will be in the next release of LAS.
> >
> >I created an XML file from your DODS URL using the latest version of
> >addXml, plugged it into LAS, and it works -- kind of. Ferret runs out
> >of memory if you attempt to visualize the full geographical range;
> >this is easily fixed by using a LAS <init_script> (see the
> >documentation). I've attached the XML file for you to peruse.
> >
> >Cheers.
>
>
> >Andrew Woolf wrote:
> >
> >   Two things:
> >
> >   (1) netCDF allows scalar variables (having ndims==0). These, of course,
> >   are not indexed by any coordinate variable. In LASNetCDF.pm, however,
> >   the function isCoord() tries to get the name of the first dimension,
> >   unless the variable has more than one dimension. This seems to be a
> >   bug. The fix is to replace:
> >    return 0 if (scalar @dims > 1);
> >   by:
> >    return 0 if (scalar @dims != 1);
>
> >   (2) I have what I believe to be a COARDS-compliant netCDF but
> >   addXml.pl has a problem with it. During the parsing, it reports:
> >   Dimension LONGITUDE_U doesn't have a coordinate variable at
> >   LASNetCDF.pm line 669
>
> ....
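
For reference, here is a hedged sketch of an isCoord() test with the
"!= 1" fix from Andrew's item (1) applied. The accessor names and the
final name comparison are placeholders for illustration, not the
actual LASNetCDF.pm internals:

# Sketch only: getName()/getDimensionNames() are placeholder accessors,
# and the final comparison is a guess at what isCoord() does with the
# name of the first dimension.
sub isCoord {
    my ($self) = @_;
    my @dims = $self->getDimensionNames();
    # A coordinate variable has exactly one dimension. The old test,
    # "scalar @dims > 1", let scalar variables (ndims == 0) fall
    # through to the $dims[0] lookup below.
    return 0 if (scalar @dims != 1);
    # That single dimension must share the variable's name.
    return $dims[0] eq $self->getName();
}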

--
Joe Sirott




