[Thread Prev][Thread Next][Index]

[ferret_users] How to convert a CSV file to netcdf with Ferret

Hi Keith,

(Are you on a Linux box or a PC?  Ferret is much stronger on Linux; our PC support has been only low-level for several years.)

For the basics of reading a delimited file see the Ferret Users Guide at http://ferret.pmel.noaa.gov/Ferret/documentation/users-guide/data-set-basics
Click on the section Reading "DELIMITED" data files.

Note that in what follows, the alias COLUMNS stands for "SET DATA/FORMAT=DELIMITED".

The Users Guide text jumps straight into complicated examples, in which the data columns have mixed data types.  If all of the columns contain pure numeric data -- such as
column_1, column_2, column_3

then the simplest way to read the file and save it as netCDF is these two commands:
columns/var=v1,v2,v3/skip=1 test.dat
save/file=test.nc v1,v2,v3
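
A quick way to check the read before saving is Ferret's SHOW DATA command, which lists the variables and grid that were just read; LIST prints a variable's values so you can confirm them against the file:

show data
list v1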

The /SKIP=1 skips over the column headings.  There are several ways to capture the column headings as variable names (btw: to be "nice" names in Ferret or in netCDF, they should not contain blanks or embedded syntactic characters like "+").  In the following example I use the Unix command "head -1" to capture the first line.  If the delimiter were not a comma (coincidentally also the correct delimiter for Ferret syntax) I could not have used this simple approach.
let my_columns = {spawn:"head -1 test.dat"}
columns/var="`my_columns`"/skip=1 test.dat
save/file=test.nc/clobber `my_columns`

Another way to capture the column headings (your variable names) is to read the file's header line with the COLUMNS command as three separate Ferret text variables, one for each column header.  Finally, here is a more complete conversion to netCDF, in which we regard the first column as the defining axis of the dataset: an ordered list of depths.
! define the grid for your data
columns/var=v1,v2,v3/skip=1 test.dat
define axis/depth/units=meters my_z = v1
define grid/z=my_z my_grid

! now read and write the variables imposing your grid on the data
let my_columns = {spawn:"head -1 test.dat"}
columns/var="`my_columns`"/skip=1/grid=my_grid test.dat
save/file=test.nc/clobber `my_columns`
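
The alternative mentioned above -- reading the file's header line itself with the COLUMNS command as text variables -- might be sketched like this (assuming /TYPE=text applies a text type to the columns; h1, h2, h3 are illustrative names):

columns/var=h1,h2,h3/type=text test.dat
list h1[i=1]   ! the first element of h1 is the first column's heading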

You can put titles, units, standard_name, and other attributes onto the variables using SET VARIABLE/TITLE="ttt"/UNITS="uuu" column_1.  Note that if there are more than 20480 rows in your file, you will have the additional step of defining and using a larger grid for the data.
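
For example (a sketch only; the titles and units here are purely illustrative, and COLUMN_1 etc. match Ferret's default names):

set variable/title="Depth"/units="meters" column_1
set variable/title="Temperature"/units="degC" column_2
save/file=test.nc/clobber column_1, column_2, column_3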

    have fun - Steve

P.S.  Here is the resulting netCDF file:
> ncdump test.nc
netcdf test {
dimensions:
        MY_Z = 3 ;
variables:
        double MY_Z(MY_Z) ;
                MY_Z:units = "METERS" ;
                MY_Z:point_spacing = "even" ;
                MY_Z:axis = "Z" ;
                MY_Z:positive = "down" ;
        float COLUMN_1(MY_Z) ;
                COLUMN_1:missing_value = -1.e+34f ;
                COLUMN_1:_FillValue = -1.e+34f ;
                COLUMN_1:long_name = "COLUMN_1" ;
                COLUMN_1:history = "From test.dat" ;
        float COLUMN_2(MY_Z) ;
                COLUMN_2:missing_value = -1.e+34f ;
                COLUMN_2:_FillValue = -1.e+34f ;
                COLUMN_2:long_name = "COLUMN_2" ;
                COLUMN_2:history = "From test.dat" ;
        float COLUMN_3(MY_Z) ;
                COLUMN_3:missing_value = -1.e+34f ;
                COLUMN_3:_FillValue = -1.e+34f ;
                COLUMN_3:long_name = "COLUMN_3" ;

// global attributes:
                :history = "FERRET V6.193    3-Mar-09" ;
                :Conventions = "CF-1.0" ;

data:

 MY_Z = 10, 11, 12 ;

 COLUMN_1 = 10, 11, 12 ;

 COLUMN_2 = 20, 21, 22 ;

 COLUMN_3 = 30, 31, 32 ;
}


Gordon Keith wrote:
It sounds like it might be what I am looking for.

Could you please list the commands that would be required to read a csv file, 
use the column headings as variable names and save the file as a netCDF file?

I've installed Ferret, but am having some difficulty finding the commands I need.


On Tue, 3 Mar 2009 03:33:05 am you wrote:
The Ferret program (http://www.ferret.noaa.gov/) can do this.  It's not
a "basic utility" -- rather a full analysis and visualization
application -- but with two commands it can do the conversion.  With
another couple of commands it can capture and use the column headings as
the variable names.  You can also take the further step of creating a proper
CF-conformant netCDF file by capturing one of the variables (typically
time or depth) as the netCDF "coordinate" variable of the file.

    - Steve


Gordon Keith wrote:
Is there a program available to convert CSV files to netcdf format?

We are finding netcdf a useful format for accessing data and it would be
good to be able to convert data that arrives in CSV (Comma Separated
Values) format to basic netCDF files.

I'm thinking of a basic utility that reads a CSV file, uses the first row as
variable names, uses the second row to determine data types, and puts the
data from the file into a netCDF file. It occurs to me that such a utility
shouldn't be too hard to write, so someone has probably done it.



Gordon Keith
Programmer/Data Analyst
Marine Acoustics
CSIRO Marine and Atmospheric Research

    Machines running Windows are by far the most popular,
    with about 90% machines in use worldwide. Linux users, on
    the other hand, may note that cockroaches are far more
    numerous than humans, and that numbers alone do not
    denote a higher life form.


Steve Hankin, NOAA/PMEL -- Steven.C.Hankin@xxxxxxxx
7600 Sand Point Way NE, Seattle, WA 98115-0070
ph. (206) 526-6080, FAX (206) 526-6744

"The only thing necessary for the triumph of evil is for good men
to do nothing." -- Edmund Burke

