OceanObs’09 Community White Paper Proposal

Quality Assurance of Real-Time Ocean Data:

Evolving Infrastructure and Increasing Data Management to Monitor the World’s Environment

Lead author:

William Burnett

National Oceanic and Atmospheric Administration

National Weather Service

National Data Buoy Center

1007 Balch Blvd.

Stennis Space Center, MS 39529-5001 USA

Contributing authors:

Richard Crout

National Oceanic and Atmospheric Administration

National Weather Service

National Data Buoy Center

1007 Balch Blvd.

Stennis Space Center, MS 39529-5001 USA

Mark Bushnell

National Oceanic and Atmospheric Administration

National Ocean Service

Ocean Systems Test & Evaluation Program

Center for Operational Oceanographic Products & Services

672 Independence Parkway

Chesapeake, VA 23320

Julie Thomas

Scripps Institution of Oceanography

Coastal Data Information Program

9500 Gilman Drive, 0214

La Jolla, CA 92093-0124

Janet Fredericks

Woods Hole Oceanographic Institution

MS #9

Woods Hole, MA 02543

Julie Bosch

National Oceanic and Atmospheric Administration

National Environmental Satellite, Data and Information Service

National Coastal Data Development Center

Bldg. 1100, Suite 101

Stennis Space Center, MS 39529

ABSTRACT

At the OceanObs’09 Conference, numerous papers and discussions will describe the intense effort by the international community to observe the world’s oceans completely. New technology, new techniques, better ocean vessels, improved sensors and faster data collection will all be used to observe and understand the ocean more thoroughly than at any time in our history.

Yet, with all the observations being collected and all the new technology being used, who will ensure these data are properly quality controlled, maintained, disseminated and archived, and more importantly, how? Quality control and quality assurance of ocean observations seem to be an afterthought for data collectors and observation providers. This White Paper therefore describes how data managers can properly prepare for and manage the tidal wave of ocean observations that will arrive in the next few years.

The Quality Assurance of Real-Time Oceanographic Data (QARTOD) working group began with a small group of data managers and data providers in the U.S. in the winter of 2003. The group understood that more and more real-time ocean observations were being collected and disseminated to the public, decision makers and marine forecast offices, so the goal of the first meeting was to discuss how observing instruments could be properly calibrated, how ocean observations could be properly quality controlled, and how standards could be properly maintained. Given the enthusiasm of the first meeting, three further meetings were held around the U.S., each with growing attendance. At the fourth QARTOD meeting, the first “official” quality control standards were finalized for instruments that measure ocean currents, waves, temperature, and salinity, and those standards were sent to the U.S. IOOS Program for approval.
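To make concrete the kind of test such standards describe, the sketch below applies a simple gross range check and spike check to a real-time temperature series. It is a minimal illustration in Python only; the thresholds, flag values and function names are assumptions chosen for demonstration, not the official QARTOD specification.

    # Illustrative only: minimal gross range and spike checks for a
    # real-time sea temperature series. Thresholds and flag values are
    # assumptions for demonstration, not official QARTOD limits.
    GOOD, SUSPECT, BAD = 1, 3, 4  # example flag values (higher = worse)

    def gross_range_flag(value, valid_min=-2.5, valid_max=40.0):
        # Flag observations outside a physically plausible range.
        return GOOD if valid_min <= value <= valid_max else BAD

    def spike_flag(prev_value, value, max_jump=2.0):
        # Flag observations that change too quickly between reports.
        if prev_value is None:
            return GOOD
        return SUSPECT if abs(value - prev_value) > max_jump else GOOD

    def qc_series(values):
        # Return one flag per value, keeping the worse of the two tests.
        flags, prev = [], None
        for v in values:
            flags.append(max(gross_range_flag(v), spike_flag(prev, v)))
            prev = v
        return flags

    # Example: a spike and an out-of-range report are flagged.
    print(qc_series([18.1, 18.2, 25.0, 18.3, 99.9]))  # [1, 1, 3, 3, 4]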

It is time to expand the QARTOD philosophy to meet the needs of the ocean observation community in the next decade. Instrument developers, data providers and data managers will need to meet international standards to ensure real-time observations are properly maintained and disseminated. The grassroots effort of the U.S. QARTOD can and will expand into an international effort to ensure appropriate quality controls are in place for the rapidly expanding ocean observation effort. By 2019, QARTOD will have evolved from a grassroots effort into a recognized international body that oversees, manages and approves all oceanographic data disseminated in real time. This White Paper will provide the steps, and describe the issues, that will ensure the oceanographic data management community achieves this goal.