Contents:

    Introduction                            intro.txt
    Summary of discussion                   discuss.txt
    Recommendations                         recom.txt
    References                              refs.txt
    Appendices:
        list of participants and addresses  (Tony P. can add this?)
        agenda                              (Tony P. can add this?)
        detailed recommendations for
          ADCP data submission              detail.txt

The sections have been kept in separate files for convenience in editing. Corrections, additions, objections, etc. are welcome. I hope then to produce the final draft.

Eric

-----------------------------------------------------------------------------
>From efiring Fri Sep 25 16:46:36 1992
To: mkosro@oce.orst.edu, dluther@ucsd.edu, tchereskin@ucsd.edu, julio@solarmax.whoi.edu, wilson@3338.span.nasa.gov, r.fauquet@omnet, i.perlroth@omnet, ramon@sam.ucsd.edu, mjugan@navo.navy.mil, cokelet@NOAAPMEL.gov, scri@mbari.org
Subject: workshop report: intro.txt

Introduction
============

The US National Oceanographic Data Center (NODC) sponsored a Shipboard Acoustic Doppler Current Profiler (ADCP) workshop at the NODC offices in Washington, DC, on May 14-15, 1992. The workshop was convened by Eric Firing. The non-NODC participants primarily represented ADCP data producers and users from the US academic community, NOAA, and the Navy (Appendix 1). The motivation for the workshop was the idea that it is time for NODC to systematically archive and distribute shipboard ADCP data; the primary purpose of the workshop was to discuss how to do this most effectively.

As background for the NODC workshop, participants were referred to the report from the 1988 WOCE/NOAA workshop on ADCP measurements (Firing, 1988). That workshop was much broader in scope, covering all applications of ADCPs to oceanography. It touched on some issues of data management and archiving, but was primarily directed to technical issues of data quality and new applications. Much progress has now been made on these technical issues, and some of the new applications are becoming routine--ADCP technology has matured.
Shipboard ADCPs are available on most research ships, and are used frequently either as the primary instrument of a cruise or as an important supplement to instruments such as the CTD. The quantity of ADCP data being collected has increased to the point where the issue of systematic data archiving can no longer be deferred. This was the focus of the NODC workshop. This report summarizes the discussions and conclusions of the workshop; technical details are assembled in appendices.

Discussions
===========

NODC Director Bruce Douglas opened the workshop, noting from his own experience as a researcher working with satellite data that the volume of data required some concern with data management. That is, data management is not to be pursued as an end in itself, but as a necessary means to the goal of extracting information from measurements. The purpose of the workshop is to begin figuring out how to get shipboard ADCP data not just into NODC, but also back out to researchers.

For background and motivation, Eric Firing defined categories of shipboard data, listed some applications, and noted the potential benefits of routine data collection.

Categories of shipboard ADCP data source:

Characteristics of the data to be analyzed and archived will vary. For example, some data will be fully processed and analyzed by the originating institution, some will not; some will be supplied regularly from a given track, some will not. The degree of care taken in data collection and processing is probably the most important characteristic relevant to our overall goal of maximizing scientific benefit from the shipboard ADCP resource.
This suggests the following classification system for cruises:

1) ADCP-primary: the shipboard ADCP is a major part of the scientific program. Care is taken in Data Acquisition System (DAS) setup, and the data are fully processed following the cruise. This category includes both conventional physical oceanography research cruises and Volunteer Observing Ship (VOS) programs that include a shipboard ADCP.

2) ADCP-secondary: the shipboard ADCP is at most a minor part of the scientific program, but it is present, and data are collected. The data are not routinely processed.

3) ADCP-ignored: shipboard ADCP data could be collected, but are not.

Archiving problems related to the ADCP-primary category include inducing the PI to supply the data, and dealing with the various processing systems and data formats used by different groups. Problems related to the ADCP-secondary category include obtaining all the necessary data (ADCP and navigation, for example), dealing with a dataset that may have problems due to lack of interest and attention, and the necessity for the data to be processed before they can be used scientifically. The problem with the ADCP-ignored category is obvious: how to induce ship operators, technical groups, and PIs to take the relatively simple and cheap step of routinely collecting and passing on ADCP data.

Likely uses of shipboard ADCP data:

Shipboard ADCP data are already widely used, but their potential has not yet been fully exploited. The following uses may be expected:

1) Mapping synoptic currents. This has been a common use on cruises for which the ADCP is a primary instrument. So long as the currents to be mapped are reasonably strong, this is probably the application least demanding of high accuracy.

2) Transport measurements. Transport calculations are often made along with current maps. Another example is the calculation of Ekman transport by subtracting the geostrophic transport from the ADCP transport.
Transport estimates are highly sensitive to any bias in the estimate of transducer heading.

3) Calculation of terms in dynamic balance equations. This application can be carried farthest when there is a time series, like the NORPAX Tahiti Shuttle. The longer the series, and the better the spatial coverage, the more can be done.

4) Climatology: calculation of mean circulation and/or eddy statistics for large areas. This will be increasingly valuable as the total amount of available data grows.

5) Climatology of small regions and small-scale phenomena such as fronts.

6) Internal wave studies: extension of climatological calculations to the statistics of small vertical and/or horizontal scales.

7) Model verification: comparison of ADCP-measured currents to simulated currents. This is proving to be a useful method of improving the understanding of both models and observations.

8) Data assimilation into models. There does not seem to be much interest in this yet, but it may develop as shipboard ADCP data become more voluminous and readily available.

9) Biomass measurements. The potential of ADCP measurements for widespread monitoring of biomass in the upper 300 m is clear, but so far it has been exploited only occasionally.

10) Combination with satellite altimetry to improve the accuracy of the geoid, thereby permitting the altimetric measurements to yield accurate estimates of absolute geostrophic currents.

Benefits of shipboard ADCP data:

An exhaustive list was not attempted, but the following points were made with examples:

1) The horizontal resolution of shears by the ADCP is much better than is usually available from geostrophic calculations. On the WOCE Hydrographic Program cruise P17, an example with closely spaced high-quality CTD stations, the dominant eddy activity in mid-latitudes is at a horizontal wavenumber not far from the Nyquist; it is just barely resolved by the geostrophic sections, but is well resolved by the shipboard ADCP measurements.
Hence, geostrophic sections will tend to underestimate eddy energy.

2) Routine ADCP measurements on a repeated cruise track can reveal significant mean currents that are obscured by variability in any individual section. As an example, the mean of 45 sections between Oahu and a repeated CTD station 100 km north of the island clearly shows the North Hawaiian Ridge Current running west-northwest along the windward (north) side of the Hawaiian Ridge. In the individual sections, this current is often obscured by eddies, inertial oscillations, and internal tides.

3) Routine ADCP measurements on miscellaneous cruises can yield very interesting data at low marginal cost. As an example, a sea-floor mapping cruise of the Moana Wave, going from Samoa to New Zealand, crossed the track of Typhoon Ofa just a few days after it had passed. The ADCP data show inertial oscillations of nearly 1 m/s amplitude, with downward propagation of slightly superinertial components, as expected from theory.

Julio Candela discussed the use of the shipboard ADCP in nearshore studies, showing for example his analysis of tides on the continental shelf near the Amazon delta (Candela et al., 1992). The raw current measurements look very messy, but extraction of the tides by a least-squares model fit yields estimated mean currents in the alongshore direction that constitute a significant fraction of the net transport of the North Brazil Current. Another calculation from this dataset yields an estimate of the tidal energy dissipation in the Amazon--this would have been hard to do using moored current meters.

Some conclusions relevant to the workshop:

1) Coastal ADCP data can be highly valuable, so their collection and archiving should be encouraged.

2) Use of bottom tracking for velocity referencing, possible only in coastal regions, has the advantage of reducing the requirement for accurate position and heading measurements.
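The least-squares separation of tides from the mean flow described above can be illustrated with a short sketch. This is only a schematic of the approach, not the actual Candela et al. (1992) method; the single-constituent model and all names below are our own simplifications:

```python
import numpy as np

def fit_mean_and_tide(t_hours, u, period_hours=12.42):
    """Least-squares fit of u(t) ~ m + a*cos(w t) + b*sin(w t), with w the
    M2 tidal frequency.  Returns the mean current and the tidal amplitude."""
    w = 2.0 * np.pi / period_hours
    G = np.column_stack([np.ones_like(t_hours),   # mean flow
                         np.cos(w * t_hours),     # tidal cosine component
                         np.sin(w * t_hours)])    # tidal sine component
    (m, a, b), *_ = np.linalg.lstsq(G, u, rcond=None)
    return m, np.hypot(a, b)
```

With a noisy synthetic record dominated by a strong tidal signal, a fit like this recovers a mean current that is invisible in any short segment of the raw measurements, which is the essence of the calculation described above.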
Teri Chereskin discussed the use of shipboard ADCP measurements for studying the surface Ekman layer, giving for example her results from the Atlantic at 11N (Chereskin and Roemmich, 1991). Horizontally averaged Ekman flow as a function of depth was calculated as the difference between the ADCP-measured current and the geostrophic current, both referenced to 250 m. This type of calculation does not depend on some of the more troublesome aspects of shipboard ADCP measurements: because it involves looking at velocity relative to a reference level, it is insensitive to errors in transducer orientation and navigation. Comparison of the results at 11N with the Ekman transport predicted from the wind stress showed good agreement for total transport. A significant part of the transport, however, occurred below the mixed layer. This departure from slab mixed-layer theory implies a smaller northward heat transport than if the entire transport were in the mixed layer.

John Easton and Dennis Krinan discussed the history and extent of the NAVOCEANO shipboard ADCP program. Their first measurements were made in 1985 with an Ametek Straza instrument. RDI instruments were installed during 1987-1991, and there are now 8 vessel-mounted systems. Four of these are a unique 115 kHz model running a special version (2.54) of the RDI DAS (on the Hess, Wyman, Maury, and Tanner). The other 4 are standard VM-150 models running DAS 2.48 (on the Kane, Wilkes, Bent, and **????**). Data collected before 1989, with the 115-kHz systems, were logged with an HP-45 acquisition computer. Navigation, from P-code GPS receivers, is stored separately and is classified. The more recent data collected with the VM-150 systems include navigation logging by the RDI NAVSOFT enhancement to the DAS, and are not classified. Data collection has been primarily in the Norwegian Sea, the North Atlantic, and the Barents Sea. Uses of the data include support of modeling, and engineering studies such as effluent dispersion.
NAVOCEANO is using a variety of programs for looking at ADCP data. PC software from the University of Alaska is used for a quick look at the relative velocities. PV-Wave is used on Suns for graphical editing, mainly for moored ADCP data so far. A database and data-viewing system called IDEAS is being developed for oceanographic data in general, and it is intended to include ADCP data. It expands on the NEONS database system (built on the Empress relational database program) and uses Uniras for plotting and Motif as the GUI. All data are stored in packed binary formats. Data can be located by query based on position, time, and two data-specific variables, and there is a capable data browser. ADCP data processing has lagged the development of this database and display system; the UH CODAS software is now being studied for possible adoption as a processing system.

Doug Luther discussed his experience and viewpoint as a user rather than a producer of shipboard ADCP data. From the Tahiti Shuttle dataset, including CTD as well as ADCP measurements on repeated sections, Luther and Johnson (1990) were able to calculate mean currents and estimate both mean and eddy terms in the mean momentum balance. The potential for more of this sort of work with ADCP data is great, but systematic accumulation and processing of data from many sources will be required to give broad spatial and temporal coverage. A problem encountered by data users is the difficulty of evaluating the characteristics and limitations of each individual ADCP dataset. It is therefore recommended that data producers routinely supply as much information as possible about data collection and processing methods for each dataset that they supply to an archive or user.

A rough survey was then made of shipboard ADCP data sources: which ships have instruments normally installed, and how often data are collected. It was suggested that the RDI customer list be consulted for a more complete list of installed instruments.
Data collection policies vary from institution to institution. For example, ADCP data are recorded on all NOAA cruises on ADCP-equipped ships, and on most Moana Wave cruises. On Scripps and WHOI ships, the ADCP is used only when requested by the science party. A careful survey of the data collection policies of all institutions with ADCPs was beyond the scope of this workshop.

Bruce Douglas presented his plan for the evolution of NODC. NODC is moving away from a "tape management" system and toward providing more direct access to archived data. With present hardware (cheap SCSI disks) and software (relational systems), tens of gigabytes can be handled without difficulty. The strategy will be to place all data into a relational database. No information will be lost, although data may be reformatted and rearranged. The name of the investigator who collected each dataset, together with references and other documentation about the dataset, will be included in the database for easy retrieval with the data itself. Maintaining this link between the originator and the data will ensure that the originator gets credit, and will make it easier for the data user to evaluate and understand the data. JGOFS data are now being loaded directly into such a relational system.

To improve data access and general communications, NODC will soon be solidly connected to the Internet. An interactive data access system, called Pegasus, is being developed on the PC. This was demonstrated for the workshop participants by its programmers at NODC, ***xxx*** and ***yyy***. The program gives a remote user with a PC flexible access to data in a relational database at NODC. Pegasus has a graphical interface and uses SQL to communicate with the database. Data can be viewed on screen and/or requested in various formats. The NODC Nansen bottle dataset is presently being loaded into the system as a prototype.
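As a concrete, purely illustrative example of the kind of access a relational archive provides, the sketch below builds a toy table of velocity ensembles and retrieves them with a position query. The schema, cruise names, and values are invented for illustration and have no connection to the actual Pegasus or NODC design:

```python
import sqlite3

# Toy relational archive of ADCP ensembles (hypothetical schema).
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE ensembles (
                   cruise TEXT, time TEXT,
                   lat REAL, lon REAL,
                   depth REAL, u REAL, v REAL)""")
rows = [
    ("moana_wave_90_1", "1990-03-01T12:00", 21.5, -158.0, 50.0, 0.12, -0.30),
    ("moana_wave_90_1", "1990-03-01T13:00", 21.7, -158.0, 50.0, 0.15, -0.28),
    ("p17",             "1991-06-10T02:00", -10.0, -135.0, 50.0, -0.05, 0.02),
]
con.executemany("INSERT INTO ensembles VALUES (?,?,?,?,?,?,?)", rows)

# Retrieve all velocity ensembles inside a geographic box north of Oahu:
hits = con.execute("""SELECT cruise, time, u, v FROM ensembles
                      WHERE lat BETWEEN 21.0 AND 23.0
                        AND lon BETWEEN -159.0 AND -157.0""").fetchall()
```

Queries by time, depth, or investigator follow the same pattern, which is what makes the relational approach attractive for locating data within a large archive.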
Another method used increasingly by NODC and others to improve the accessibility of datasets is publication on CD-ROM. There was general agreement that CD-ROM would likely be a useful medium for publication of shipboard ADCP data.

A second area of change at NODC is the in-house science program, which will be increased to 20% of the budget. The emphasis will be on retrospective analysis of NODC datasets; active scientific research within NODC, using NODC products, will help maintain the quality of NODC service. NODC will also be expanding its interest in biological oceanographic data.

There was considerable discussion of various strategies for data submission, storage, and dissemination. NODC participants noted that NODC could be flexible, receiving data in several formats, but that binary formats could be hard to handle. They noted that the most important thing is not format but content, including the associated documentation--both the structure of the format and the content must be well defined and understood. Non-NODC participants expressed reservations about the general strategy of dissecting all datasets into their smallest components and loading them into a monolithic relational system, suggesting instead that a hybrid system might work better: a relational system could be used to manage metadata, subsamples of the actual data, and files containing the full data in suitable binary formats. Both NetCDF (network Common Data Form) and CODAS might be useful for the latter.

CODAS and NetCDF, developed independently and roughly simultaneously, have many similarities but also some important differences. Both aim for machine-independent data access, both are designed for flexibility, and both use binary storage for speed and efficiency.
In CODAS, machine independence is achieved by a utility that automatically converts files from one machine number format to another; in NetCDF the data are stored in a single format, and conversion is done as the data are written or accessed. CODAS is a higher-level system than NetCDF, and has a complete shipboard ADCP processing and analysis system built around it. If it appears that NetCDF will gain currency as a standard, it may be possible to modify CODAS to use NetCDF as its lowest level, thereby gaining the advantages of both systems.

There was complete agreement that, however the technical details might be worked out, it was time for NODC to become much more active in working with shipboard ADCP data. Given the complexities of this dataset, a strong partnership with the academic community would be needed. This could take several forms, from simple advice on what variables and metadata should be sought, to the posting of NODC employees with ADCP responsibilities at academic institutions, to partial support of an ADCP data center. Discussion of such matters led to the recommendations given in the next section.

Recommendations
===============

1) Institutions with shipboard ADCPs should get maximum benefit from them: run them routinely, archive all data, and submit the data to NODC. As much as possible, data should be processed and inspected by interested personnel at the originating institution.

2) NODC should develop its ability to handle ADCP data, and should encourage scientists to submit such data.
3) Submissions should be accepted in several forms, from raw data files in the standard manufacturer's format to fully processed files in suitable binary or ASCII formats.

4) The use of the CODAS software system from the University of Hawaii should be encouraged; unlimited proliferation of different processing systems and data formats will needlessly complicate the handling of ADCP data by NODC and by data users.

5) An ADCP Center should be established, somewhat along the lines of JEDA and JASL, with the following functions:

   - Disseminate standards and recommendations regarding the collection and submission of secondary ADCP data.
   - Track potential and actual data suppliers, and the characteristics of their ADCP installations.
   - Process and archive secondary data.
   - Check and archive primary data, with additional processing if necessary.
   - Develop, maintain, and distribute software and documents for data acquisition.
   - Actively participate in analyses of shipboard ADCP data.
   - Respond to data requests, either directly or via NODC.
   - Organize datasets and produce CD-ROMs.
   - Train NODC personnel, so as to transfer many of the Data Center functions to NODC.
   - Train people from other institutions, so as to transfer as much as possible of the data processing and quality control to the originating institutions.

References
==========

Candela, J., R. C. Beardsley, and R. Limeburner, 1992. Separation of tidal and subtidal currents in ship-mounted acoustic Doppler current profiler observations. J. Geophys. Res., 97, 769-788.

Chereskin, T. K. and D. Roemmich, 1991. A comparison of measured and wind-driven Ekman transport at 11N in the Atlantic Ocean. J. Phys. Oceanogr., 21, 869-878.
Firing, E., 1988. Report from the WOCE/NOAA workshop on ADCP measurements, held in Austin, Texas, March 1-2, 1988. WOCE Planning Report No. 13, 97 pp., U.S. Planning Office for WOCE, College Station, TX.

Firing, E., 1991. Acoustic Doppler current profiling measurements and methods. In WOCE Operations Manual, Volume 3, Section 1, Part 3: WHP Operations and Methods, WHP Office Report WHPO 91-1, WOCE Report No. 68/91, WHOI, Woods Hole, MA 02543.

Luther, D. S. and E. S. Johnson, 1990. Eddy energetics in the upper equatorial Pacific during the Hawaii-to-Tahiti Shuttle Experiment. J. Phys. Oceanogr., 20, 913-944.

Detailed recommendations for shipboard ADCP data acquisition and submission
===========================================================================

An extensive discussion of shipboard ADCP data collection and processing is given by Firing (1991); it may provide useful background for the following specific recommendations.

Data acquisition:

For RDI profilers, the following variables should be recorded routinely: horizontal and vertical velocity components; error velocity (the difference between the estimates of vertical velocity from the two pairs of beams); signal strength (or automatic gain control level); percent good pings per depth bin; percentage of 3-beam solutions used; and beam statistics, the percentage of good depth bins in each beam. Spectral width has not proven very useful to date, but one might as well record it also. Ancillary variables, such as transducer temperature, must also be recorded. All of these variables can be selected for recording by the RDI DAS, so they present little difficulty.
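The error velocity mentioned above can be illustrated schematically. For a four-beam Janus transducer, each opposing beam pair yields an independent estimate of vertical velocity; their difference should be near zero when the flow is uniform across the beams. The beam angle and sign conventions below are illustrative assumptions, not the manufacturer's exact definition:

```python
import math

def error_velocity(b1, b2, b3, b4, theta_deg=30.0):
    """Difference between the two beam-pair estimates of vertical velocity.
    b1..b4 are along-beam velocities for a Janus configuration with beams
    tilted theta_deg from the vertical (illustrative geometry only)."""
    c = math.cos(math.radians(theta_deg))
    w12 = (b1 + b2) / (2.0 * c)   # vertical velocity from beam pair 1-2
    w34 = (b3 + b4) / (2.0 * c)   # vertical velocity from beam pair 3-4
    return w12 - w34              # near zero for horizontally uniform flow
```

A large error velocity in an ensemble flags inhomogeneous flow across the beams or a failing beam, which is why recording it routinely is worthwhile.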
An averaging interval of no less than 1 minute is recommended. Five-minute intervals are commonly used and are probably adequate for most situations. Beyond these general guidelines, we cannot make a definitive statement about averaging intervals; little work has been done so far to study the limits of useful horizontal resolution with shipboard ADCP data. This may also change in the future, as profiler accuracy and attitude compensation improve.

With the present RDI DAS running on a 286-based AT-class PC, excessively short averaging intervals may result in significant aliasing due to the pause in pinging while the ensemble is being processed, recorded, and plotted on the screen. This problem is worse if the data are written to floppy disks rather than to a hard disk. Very short averaging intervals also have the disadvantage of greatly increasing the data processing and storage burden. Excessively long intervals, on the other hand, will surely cause the loss of potentially interesting detail. They can also cause a general degradation of data quality by masking momentarily erroneous data that could be edited out if shorter ensembles were used.

Ideally, a DAS should record its own version, the firmware version, a complete account of the state of the profiler, and the values of any parameters used by the DAS. Unfortunately, the RDI DAS provides only part of this information. It remains to be investigated whether there are reasonably simple ways to record the full information set routinely. As a minimum, it would be good to keep a copy of the default configuration file with the raw data, although this would not give any information about changes made to the instrument configuration via the DAS menus.
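The aliasing concern with short averaging intervals noted above comes down to simple arithmetic: a fixed dead time per ensemble wastes a larger fraction of each interval as the interval shrinks. The dead time used below is an invented number for illustration only, not a measured DAS characteristic:

```python
def ping_fraction(interval_s, dead_time_s=5.0):
    """Fraction of the averaging interval actually spent pinging, assuming a
    fixed pause (dead_time_s, hypothetical) at the end of each ensemble while
    the DAS processes, records, and plots the data."""
    return max(0.0, (interval_s - dead_time_s) / interval_s)
```

With a 5 s pause, for example, a 1-minute ensemble loses over 8% of its potential pings, while a 5-minute ensemble loses under 2%; slower recording media lengthen the pause and worsen the short-interval case.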
For additional notes on instrument setup and operation, see Firing (1991).

Data submission and archiving:

Three versions of shipboard ADCP data should normally be archived: the raw data files as they come from the DAS; a fully processed version of the dataset with maximum horizontal and vertical resolution; and an averaged dataset, derived from the fully processed dataset, suitable for browsing. For uniformity, we suggest the following standard for the averaged dataset: average all data over 1-hour intervals, and interpolate to depths at integral multiples of 10 m.

We would expect that if the data have been processed reasonably well, the raw data files would be accessed only rarely. They should still be kept for several years at least, in case a problem is discovered with the processing. The originating institution would probably be the most logical place for these files to be kept.

The processed dataset is the primary resource for analysis. Typically it would be in the form of CODAS block files. It should be accompanied by substantial information about the dataset and its processing, as discussed below.

The averaged dataset may in fact satisfy the needs of many data users. It should be submitted to NODC in a very simple ASCII or perhaps NetCDF format. The submission format could also be a useful distribution format. Specification of one or a few preferred formats was not attempted at the workshop, but should not be difficult as a follow-up.

The descriptive information accompanying an ADCP data submission should ideally include the following:

1) Setup: software and firmware versions and all parameters.

2) Installation: hardware model and serial numbers. How is the transducer mounted in the hull? Is there an acoustic window? Is the mount rigid, or is it subject to shifts in position? Is it repeatable, so that the transducer will have the same orientation after removal and reinstallation? Where is it positioned on the hull? When was it last removed and reinstalled?
3) Heading data: What compass and/or other heading input (e.g., GPS attitude sensor) was used? How is the compass compensated?

4) Calibration: What calibration data were available and used? What sort of calibration was applied: constant over the cruise, time-varying, velocity-dependent, etc.? Was any additional correction made for sound speed, either in the velocity or in the depth bins? What sound speeds were used?

5) Navigation data: What were the navigation inputs? At what times relative to the ensemble start and end times were fixes obtained? What editing or smoothing, if any, was applied to the fixes? Where was the GPS antenna relative to the ADCP transducer?

6) Time: How was the PC clock error calculated and removed?

7) Navigation calculation: What were the form and width of the filter used to smooth the reference layer velocity? What depth range was used for the reference layer?

8) Editing: What procedures and criteria were used?

9) General: Were there any unusual circumstances, such as failure of the temperature sensor or of a transducer beam? Was there particularly notable degradation of performance due to bad weather?

Much of this information might be in the form of references to documents describing the installation of a particular profiler, standard processing methods, etc. Even with such references, it might be good to have a tabular summary for each cruise to accompany the dataset for that cruise. The design of a standard form for such a table remains to be done.

The above list indicates an ideal toward which we can work, not something that can be achieved immediately and uniformly. Some of the information we would like to have is simply not available for existing datasets, and some of it will undoubtedly be missing from many future datasets. To maximize the availability of this information in the future, we must try to find mechanisms for collecting and tracking it that are as simple and automated as possible.
In particular, requests for people to fill out repetitive forms should be kept to a minimum.
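The browse-product standard suggested in the data-submission discussion (1-hour averages, depths at integral multiples of 10 m) can be sketched as follows; the function and argument names are our own, not from CODAS or any existing package:

```python
import numpy as np

def browse_average(t_hours, z, u, dz=10.0):
    """Sketch of the suggested browse product: average ensembles into 1-hour
    bins and interpolate each profile to integral multiples of dz meters.
    t_hours: ensemble times; z: bin depths (m, increasing);
    u: (ntime, ndepth) velocity component."""
    hours = np.floor(t_hours).astype(int)     # assign each ensemble to an hour
    t_out = np.unique(hours)
    z_out = np.arange(dz * np.ceil(z.min() / dz),
                      dz * np.floor(z.max() / dz) + dz, dz)
    u_out = np.empty((t_out.size, z_out.size))
    for i, h in enumerate(t_out):
        prof = u[hours == h].mean(axis=0)      # 1-hour time average
        u_out[i] = np.interp(z_out, z, prof)   # interpolate to 10-m depths
    return t_out, z_out, u_out
```

A simple, fixed grid like this is what makes the averaged dataset easy to submit in plain ASCII or NetCDF and easy for casual users to browse without ADCP-specific software.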