
[las_users] Re: [thredds] Not a valid CDM File question.



This is an LAS and F-TDS question. I'll leave the THREDDS list in the conversation for now.


On Fri, Apr 18, 2014 at 1:32 PM, Sumagaysay, Rosanna M (3980-Affiliate) <Rosanna.M.Sumagaysay@xxxxxxxxxxxx> wrote:
Hello,

I'm sending this to both the THREDDS and LAS mailing lists and hope to get advice on a solution.


I created a THREDDS catalog that dynamically aggregates the NetCDF files into a virtual NetCDF dataset.
The configuration works great when I access it via the THREDDS URL.

I then used the THREDDS/OPeNDAP link of the virtual NetCDF file to configure the dataset in LAS.
When I access the dataset in the LAS interface, no messages appear in the LAS log or in the Tomcat logs.
I did notice that when I select the (?) from the LAS dataset selection and follow the .jnl link, it returns:
Error {
    code = 500;
    message = "java.io.IOException: Cant read /usr/local/tomcat/content/las/conf/server/data/SSHA_TEST/data_podaac-ecco.jpl.nasa.gov_thredds_dodsC_kf080_kf080_have_timeAgg.nc.jnl: not a valid CDM file.";
};

Every LAS data set has two representations. One is the original data set (in your case, an OPeNDAP aggregation created by the TDS); the other is the F-TDS URL. If you followed the LAS installation instructions, you should have configured the TDS in threddsConfig.xml to use the Ferret I/O service provider and data source class, and you should have configured the las/conf/server/data directory so that *.jnl files are recognized as CDM files.
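For reference, the threddsConfig.xml piece of that setup looks roughly like the sketch below. The <nj22Config>/<ioServiceProvider> and <datasetSource> elements are standard TDS configuration hooks; the two Ferret class names are quoted from memory of the LAS installation guide, so check them against the documentation linked below rather than copying this blindly.

<!-- Sketch only: register the Ferret I/O service provider and data source
     class so the TDS can open the *.jnl scripts that LAS writes for F-TDS.
     Class names quoted from memory; verify against the LAS documentation. -->
<nj22Config>
  <ioServiceProvider class="gov.noaa.pmel.tmap.iosp.FerretIOServiceProvider"/>
</nj22Config>
<datasetSource>gov.noaa.pmel.tmap.iosp.FerretDataSource</datasetSource>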
 

QUESTION: How can I make the virtual NetCDF file a valid CDM file?

The full details are here: http://ferret.pmel.noaa.gov/LAS/documentation/installer-documentation/installation/installing-and-integrating-tds-with-las/

Looking at your LAS installation, I can see that something about that process has not been completed correctly.

The instructions above are the first place to start for fixing your LAS installation.
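In particular, the TDS needs a catalog entry that exposes the las/conf/server/data directory so the .jnl file named in your error message can be served. A minimal sketch of such an entry follows; the name and path attributes here are illustrative placeholders, not necessarily what the LAS installer writes.

<!-- Sketch only: serve the LAS-generated *.jnl files through the TDS so
     F-TDS can open them with the Ferret IOSP registered above. -->
<datasetScan name="LAS F-TDS data" path="lasdata"
             location="/usr/local/tomcat/content/las/conf/server/data/">
  <filter>
    <include wildcard="*.jnl"/>
  </filter>
</datasetScan>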

Roland
 
Note that I used joinExisting because each granule contains more than one time value (from 3 to 10), while joinNew requires exactly one time entry per granule.
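For context, the difference shows up directly in the NcML: joinNew assigns one new coordinate value per file, while joinExisting concatenates along the time dimension that already exists in each granule. The file names and the variable name in this sketch are made up for illustration; the time values are taken from the first granule listed further below.

<!-- joinNew: each granule contributes exactly one time step -->
<aggregation dimName="time" type="joinNew">
  <variableAgg name="SSHA"/>
  <netcdf location="granule_day_01.cdf" coordValue="201744.0"/>
  <netcdf location="granule_day_02.cdf" coordValue="201984.0"/>
</aggregation>

<!-- joinExisting: a granule may contribute several time steps along the
     existing time dimension (ncoords is an optional hint) -->
<aggregation dimName="time" type="joinExisting">
  <netcdf location="granule_days_01_09.cdf" ncoords="9"/>
  <netcdf location="granule_days_10_18.cdf" ncoords="9"/>
</aggregation>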

THREDDS catalog of the dataset:

THREDDS/OPeNDAP link to aggregated dataset:

Main THREDDS Catalog:


Here's a snippet of the configuration for the aggregated dataset. You can also find the contents of the scan location directory here:


(catalog.xml) - a snippet of the dataset.
..
..
      <dataset name="Sea Surface Height (10 day) Time Aggregation (kf080)" ID="kf080_have_timeAgg" urlPath="kf080/kf080_have_timeAgg.nc">
            <aggregation dimName="time" type="joinExisting" recheckEvery="1 day">
               <scan location="/usr/www/ecco/data4/kf080/" regExp="Have_08_08.*\.cdf" />
            </aggregation>
            <attribute name="Convention" value="CF-1.0" />
            <variable name="lat">
                  <attribute name="standard_name" type="String" value="latitude"/>
                  <attribute name="axis" type="string" value="Y"/>
                  <attribute name="units" type="string" value="degrees_north" />
            </variable>
            <variable name="lon">
                  <attribute name="standard_name" type="string" value="longitude"/>
                  <attribute name="axis" type="string" value="X"/>
                  <attribute name="units" type="string" value="degrees_east" />
            </variable>
            <variable name="time">
                  <attribute name="standard_name" type="string" value="time"/>
                  <attribute name="axis" type="string" value="T"/>
                  <attribute name="units" type="string" value="hours since 1970-01-01 00:00:00"/>
            </variable>
         </netcdf>
..
..


Here's the first part of the aggregation cache file THREDDS generated and placed in ./cache/agg/kf080-kf080_have_timeAgg.nc:

<?xml version='1.0' encoding='UTF-8'?>
<aggregation xmlns='http://www.unidata.ucar.edu/namespaces/netcdf/ncml-2.2' version='3' type='joinExisting' dimName='time' >
  <netcdf id='/usr/www/ecco/data4/kf080/kf080_1993/n10day_01_09/Have_08_08.00001_02160_240.cdf' ncoords='9' >
    <cache varName='time' >201744.0 201984.0 202224.0 202464.0 202704.0 202944.0 203184.0 203424.0 203664.0 </cache>
  </netcdf>
  <netcdf id='/usr/www/ecco/data4/kf080/kf080_1993/n10day_01_09/TRASH/Have_08_08.00001_02160_240.cdf' ncoords='9' >
    <cache varName='time' >201744.0 201984.0 202224.0 202464.0 202704.0 202944.0 203184.0 203424.0 203664.0 </cache>
  </netcdf>
  <netcdf id='/usr/www/ecco/data4/kf080/kf080_1993/n10day_10_18/Have_08_08.02160_04320_240.cdf' ncoords='9' >
    <cache varName='time' >203904.0 204144.0 204384.0 204624.0 204864.0 205104.0 205344.0 205584.0 205824.0 </cache>
  </netcdf>
  <netcdf id='/usr/www/ecco/data4/kf080/kf080_1993/n10day_19_27/Have_08_08.04320_06480_240.cdf' ncoords='9' >
    <cache varName='time' >206064.0 206304.0 206544.0 206784.0 207024.0 207264.0 207504.0 207744.0 207984.0 </cache>
  </netcdf>
  <netcdf id='/usr/www/ecco/data4/kf080/kf080_1993/n10day_28_37/Have_08_08.06480_08880_240.cdf' ncoords='10' >
    <cache varName='time' >208224.0 208464.0 208704.0 208944.0 209184.0 209424.0 209664.0 209904.0 210144.0 210384.0 </cache>
  </netcdf>
..
..



Here's how I access it in the LAS interface:
Select (kf080_have) -> Sea Surface Height (it just hangs). Select the (?) next to the dataset name and click on the .jnl link; you'll find the error message shown above.


Please advise.
Thanks so much.
Rosanna.

------------------------------------------------------------
Rosanna Sumagaysay-Aouda
Physical Oceanography DAAC
http://podaac.jpl.nasa.gov/
Mail Stop:  Raytheon-299
Rosanna.M.Sumagaysay@xxxxxxxxxxxx
------------------------------------------------------------
"You can't depend on your eyes when your imagination is out of focus." - Mark Twain


