Instead, netCDF is one of four (and a half) distinct lower layers on which the data and metadata structures for I/O API files are currently available; additional lower layers have been incorporated at times in the past, and may very well be incorporated at various times in the future. Attempts to treat the I/O API as "just a data format" have consistently failed, because the persons attempting to do so have not fully understood the I/O API data structures embedded in these files. (And generally they have not attempted to contact the I/O API author about these data structures, either.)
- The Models-3 I/O API is a programming interface, not a data format!!
- I/O API files are not synonymous with "netCDF files"!!
(Does anyone want to fund development of an MPI 2 lower layer? -- Contact the author!)
The I/O API follows a "punctuated rolling release" model: great effort has been expended to provide and maintain upwards compatibility, so that any program developed to use the I/O API will work correctly with any subsequent release of the I/O API. Newer releases will contain (a) bug-fixes, and (b) new features and capabilities. There are only the following upward-compatibility issues:
- I/O API-2.1 to I/O API-3.0: 2002
- Requires re-linking due to netCDF API changes; fully source- and object-file compatible, but uses netCDF-3.x, which was not fully compatible with netCDF-2.x.
- Targets Fortran-90 compilers only, and starts to use Fortran-90 features such as MODULEs.
- New features for 3.0 include MODULE M3UTILIO, containing the INCLUDE-files and other declarations.
- Deprecates direct use of the INCLUDE-files, in favor of USE M3UTILIO.
- Deprecates use of the M3ERR() routine, in favor of the more general M3EXIT().
- Deprecates m3tools/utmtool, in favor of its replacement m3tools/projtool.
- I/O API-3.0 to I/O API-3.1: 2013
- Requires re-compile: fully source compatible but not object compatible, because the fundamental PARAMETER MXVARS3 changed from 120 to 2048.
- MODULE MATXATTS supports full input-grid and output-grid descriptions for matrix files (which may be used, for example, for grid-to-grid transforms by such programs as m3tools/mtxcple).
- I/O API-3.1 to I/O API-3.2:
- Minor source code changes required, responding to a failure in netCDF-4 API backwards compatibility: netCDF-2 CALL NC*() statements throughout were replaced by the corresponding netCDF-3 IERR = NF_*() calls, to accommodate the fact that recent versions of netCDF-Fortran silently did away with the former. I/O API-3.2 works with both netCDF-3 and netCDF-4 (but netCDF-4 breaks I/O API-3.1).
More m3tools programs: findwndw, gridprobe, insertgrid, m3probe, mpasdiff, mpasstat, mpastom3, datshift, greg2jul, jul2greg, juldiff, julshift, timeshift, wrfgriddesc, wrftom3.
MODULE MODATTS3, with additional support for various extra metadata: CF, CMAQ, and SMOKE (the last a place-holder for future specification).
MODULE MODGCTP for geospatial transform and interpolation operations; it now contains some INTERFACEs formerly in MODULE M3UTILIO.
MODULE MODNCFIO safely encapsulates the netCDF and PnetCDF declarations, replacing direct use of the netCDF INCLUDE-file.
PnetCDF/MPI distributed I/O support for CMAQ: the new MODULE MODPDATA encapsulates data structures and routines for distributed CMAQ I/O.
MODULE MODWRFIO for WRF-netCDF-format I/O, and related m3tools programs wrfgriddesc and wrftom3.
MODULE MODMPASFIO for MPAS-format-netCDF I/O and MPAS (unstructured Voronoi-complex) grid-geometry utilities, and related m3tools programs mpasdiff, mpasstat, and mpastom3.
Remove previously-deprecated m3tools program utmtool (superseded by program projtool).
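The netCDF-2 to netCDF-3 call-style migration described above for I/O API-3.2 follows this pattern (a sketch only; FID here is a hypothetical netCDF file ID, and the error handling is merely illustrative):

```fortran
!  netCDF-2 style, silently removed from recent netCDF-Fortran:

      CALL NCCLOS( FID, IERR )

!  netCDF-3 replacement style, as now used throughout the I/O API:

      IERR = NF_CLOSE( FID )
      IF ( IERR .NE. NF_NOERR ) THEN
          CALL M3EXIT( 'myprogram', 0, 0, 'Error closing file', 2 )
      END IF
```

Note that the netCDF-3 style returns the error status as a function value instead of an output argument, which is why the change could not be made source-compatible.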
The Models-3/EDSS Input/Output Applications Programming Interface (I/O API) provides the environmental model developer with an easy-to-learn, easy-to-use programming library for data storage and access, available from both Fortran and C. The same routines can be used for both file storage (using netCDF files) and model coupling (using PVM mailboxes). It is the standard data access library for both the NCSC/CMAS's EDSS project and EPA's Models-3, CMAQ, and SMOKE, as well as various other atmospheric and hydrological modeling systems. There was an external-package wrapper for the I/O API in the Weather Research and Forecasting Model [WRF], which provided the option of direct I/O API output, enabling (among other things) the use of I/O API coupling mode to couple WRF-Chem with SMOKE -- an effort unfortunately dropped by NCAR.
Since it is built around an object-based conception of environmental modeling, the I/O API provides a variety of data structure types for organizing the data, and a set of access routines which offer selective direct access to the data in terms meaningful to the modeler. For example, the following is a direct English translation of a typical I/O API READ3() call:

Read layer 1 of variable 'OZONE' from 'CONCFILE' for 5:00 PM GMT on July 19, 1988, and put the result into array A.

"Selective direct access" means that this READ3() call retrieves exactly this ozone data immediately. It does not have to read through previous hours of data, nor through whatever other variables (such as NOX or PAN) happen to be in the file, in whatever order. Data can be read or written in any order (or not at all). This characteristic provides the following advantages:
- performance: visualization and analysis programs looking at selected parts of the data don't need to read unrequested data from the files.
- modularity: Data can be read or written in any order (or not at all). The same input files serve both CMAQ and MAQSIP engineering models and full-chemistry models -- the former reading just a few of the variables from the files, the latter reading most of them. The modular-model structure used for CMAQ, MAQSIP, and Models-3 depends upon this.
- simplicity: data is presented in real-world/modeling terms; no calculation of record numbers, etc., is needed (nor is the data accessed in run-specific terms). Moreover, time-stepped files may extend for many decades, without any associated performance penalties.
- robustness: data are "tagged" by name, date, and time; miscounted record-numbers don't scramble temperatures with pressures, for example.
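The English-language request quoted above corresponds, under the usual I/O API conventions (dates coded YYYYDDD, times coded HHMMSS), to Fortran along the following lines. This is a sketch only: the caller name 'myprogram' and the error handling are illustrative, and the grid dimensions NCOLS3D and NROWS3D come from the file description filled in by DESC3().

```fortran
      USE M3UTILIO                  !  I/O API declarations and INTERFACEs

      REAL, ALLOCATABLE :: A( :, : )
      INTEGER           :: JDATE, JTIME

      !  Open the file read-only, get its description, allocate A:

      IF ( .NOT. OPEN3( 'CONCFILE', FSREAD3, 'myprogram' ) )   &
          CALL M3EXIT( 'myprogram', 0, 0, 'Could not open CONCFILE', 2 )
      IF ( .NOT. DESC3( 'CONCFILE' ) )                         &
          CALL M3EXIT( 'myprogram', 0, 0, 'Could not get description', 2 )
      ALLOCATE( A( NCOLS3D, NROWS3D ) )

      JDATE = 1988201               !  July 19, 1988  (YYYYDDD)
      JTIME =  170000               !  5:00 PM GMT    (HHMMSS)

      !  "Read layer 1 of variable 'OZONE' ... into array A":

      IF ( .NOT. READ3( 'CONCFILE', 'OZONE', 1, JDATE, JTIME, A ) ) &
          CALL M3EXIT( 'myprogram', JDATE, JTIME, 'Could not read OZONE', 2 )
```

The date, time, layer, and variable name select the data directly; no positioning or record-counting logic appears anywhere in the calling program.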
I/O API files also have the following characteristics:
- They are machine-independent and network-transparent, except for the NCEP native-binary mode. Files created on a Cray can be read on a desktop workstation (or vice versa), either via NFS mounting or FTP, with no further translation necessary.
- They are self-describing -- that is, they contain headers which provide a complete set of information necessary to use and interpret the data they contain.
- The API provides automated built-in mechanisms to support production application requirements for histories and audit trails:
- ID of the program execution which produced the file
- Description of the study scenario in which the file was generated
- They support a variety of data types used in the environmental sciences (each of which can be vertically layered or not, as needed), among them
- gridded (e.g., concentration fields or meteorology-model temperatures)
- grid-boundary (for air quality model boundary conditions)
- scattered (e.g., meteorology observations or source level emissions)
- sparse matrix (as used in the SMOKE emissions model)
- user-customized / vector, to cover other potential applications.
- They support three different time step structures:
- time-stepped (with regular time steps), like hourly model-output concentration fields or twice-daily upper-air meteorology observation profiles;
- time-independent, like terrain height; or
- restart, which maintains "even" and "odd" time step records, so that restart data does not consume an inordinate amount of disk space.
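In the file description data structures, these three time step structures are distinguished by the header's time step attribute. A sketch, assuming the usual M3UTILIO/FDESC3 conventions, with time steps coded HHMMSS:

```fortran
!  TSTEP3D in the file description selects the time-step structure:

      TSTEP3D =  10000     !  regular time-stepped file, hourly records
      TSTEP3D =      0     !  time-independent file (e.g., terrain height)
      TSTEP3D = -10000     !  hourly "restart" file:  keeps only the latest
                           !  "even" and "odd" records, overwriting older ones
```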
The I/O API also contains an extensive set of utility routines for manipulating dates and times, performing coordinate conversions, storing and recalling grid definitions, sparse matrix arithmetic, etc., as well as a set of geospatial-transform, data-manipulation and statistical analysis programs, and programs for interfacing with a number of external modeling systems, such as CAMX/UAM, WRF, and MPAS. It has an extensive documentation set, including
- a requirements document;
- indexes and user manuals for data access routines, utility routines, date/time manipulation routines, and analysis programs;
- documentation for the data-manipulation, geospatial-transform, and statistical-analysis programs; and
- several sample programs.
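As an example of the date-and-time conventions used by the utility routines mentioned above (dates coded YYYYDDD, times and time steps coded HHMMSS), the routine NEXTIME() normalizes date/time arithmetic across day and year boundaries. A sketch:

```fortran
      INTEGER :: JDATE, JTIME

      JDATE = 1988365                       !  Dec. 30, 1988  (YYYYDDD)
      JTIME =  230000                       !  11:00 PM GMT   (HHMMSS)
      CALL NEXTIME( JDATE, JTIME, 20000 )   !  advance by 2 hours
      !  Result:  JDATE = 1988366  (Dec. 31; 1988 is a leap year)
      !           JTIME =  010000  ( 1:00 AM GMT)
```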
Various extensions to the I/O API have been developed or are under development. Completed developments include the use of the I/O API with PVM for model coupling, and operations that read or write entire time series (with multiple time steps) as single operations; research projects include data-parallel I/O and a very powerful "geospatial cell complex" data type with polygonal-cell decompositions that may be either time-independent (as for finite element modeling) or time-dependent (as for moving-mesh plume-in-grid modeling).
These web pages are copyright © 1992-2002 MCNC and Carlie J. Coats, Jr., © 2003-2005 by Baron Advanced Meteorological Systems (hereinafter, BAMS), © 2005-2013,2017- Carlie J. Coats, Jr., and © 2014- UNC Institute for the Environment. The I/O API source code and documentation are copyright © 1992-2002 MCNC and Carlie J. Coats, Jr., © 2003-2013 BAMS, © 2005- Carlie J. Coats, Jr., and © 2014- UNC Institute for the Environment. The library source code is distributed under the GNU Lesser (library) Public License (LGPL) Version 2.1, and the tools-program source code is distributed under the GNU General Public License, Version 2, subject to copyright-statement retention constraints. See the Notices: Copyright, and Acknowledgements page.
$Id: index.html 165 2020-05-27 16:13:18Z coats $