
PERSPECTIVE article

Front. Mar. Sci., 03 April 2024
Sec. Ocean Observation
Volume 11 - 2024 | https://doi.org/10.3389/fmars.2024.1358591

A model for community-driven development of best practices: the Ocean Observatories Initiative Biogeochemical Sensor Data Best Practices and User Guide

Hilary I. Palevsky1*† Sophie Clayton2,3*† Heather Benway4 Mairead Maheigan4 Dariia Atamanchuk5‡ Roman Battisti6‡ Jennifer Batryn7‡ Annie Bourbonnais8‡ Ellen M. Briggs9‡ Filipa Carvalho3‡ Alison P. Chase10‡ Rachel Eveleth11‡ Rob Fatland12‡ Kristen E. Fogaren1‡ Jonathan Peter Fram13‡ Susan E. Hartman3‡ Isabela Le Bras14‡ Cara C. M. Manning15‡ Joseph A. Needoba16‡ Merrie Beth Neely17‡ Hilde Oliver7‡ Andrew C. Reed7‡ Jennie E. Rheuban4‡ Christina Schallenberg18‡ Ian Walsh19‡ Christopher Wingard13‡ Kohen Bauer20‡ Baoshan Chen21‡ Jose Cuevas1‡ Susana Flecha22‡ Micah Horwith23‡ Melissa Melendez9‡ Tyler Menz21‡ Sara Rivero-Calle24‡ Nicholas P. Roden25‡ Tobias Steinhoff26,27‡ Pablo Nicolás Trucco-Pignata3‡ Michael F. Vardaro28‡ Meg Yoder1‡
  • 1Boston College, Department of Earth and Environmental Sciences, Chestnut Hill, MA, United States
  • 2Old Dominion University, Department of Ocean and Earth Sciences, Norfolk, VA, United States
  • 3National Oceanography Centre, Southampton, United Kingdom
  • 4Department of Marine Chemistry and Geochemistry, Woods Hole Oceanographic Institution, Woods Hole, MA, United States
  • 5Department of Oceanography, Dalhousie University, Halifax, NS, Canada
  • 6Cooperative Institute for Climate, Ocean and Ecosystem Studies/Pacific Marine Environmental Laboratory, Seattle, WA, United States
  • 7Department of Applied Ocean Physics and Engineering, Woods Hole Oceanographic Institution, Woods Hole, MA, United States
  • 8University of South Carolina, School of the Earth, Ocean and Environment, Columbia, SC, United States
  • 9School of Ocean and Earth Science and Technology, University of Hawaii at Manoa, Honolulu, HI, United States
  • 10Applied Physics Laboratory, University of Washington, Seattle, WA, United States
  • 11Oberlin College, Department of Geosciences, Oberlin, OH, United States
  • 12University of Washington, eScience Institute, Seattle, WA, United States
  • 13College of Earth, Ocean and Atmospheric Sciences, Oregon State University, Corvallis, OR, United States
  • 14Department of Physical Oceanography, Woods Hole Oceanographic Institution, Woods Hole, MA, United States
  • 15University of Connecticut, Department of Marine Sciences, Groton, CT, United States
  • 16Oregon Health & Science University, OHSU-PSU School of Public Health, Portland, OR, United States
  • 17Global Science and Technology, Inc., Greenbelt, MD, United States
  • 18Commonwealth Scientific and Industrial Research Organisation, Environment, Hobart, TAS, Australia
  • 19Independent Researcher, Corvallis, OR, United States
  • 20Ocean Networks Canada, University of Victoria, Victoria, BC, Canada
  • 21School of Marine and Atmospheric Sciences, Stony Brook University, Stony Brook, NY, United States
  • 22Instituto Mediterráneo de Estudios Avanzados (IMEDEA-UIB-CSIC), Esporles, Spain
  • 23Washington State Department of Ecology, Olympia, WA, United States
  • 24University of Georgia, Skidaway Institute of Oceanography, Savannah, GA, United States
  • 25Norwegian Institute for Water Research (NIVA), Oslo, Norway
  • 26NORCE Norwegian Research Centre, Bergen, Norway
  • 27GEOMAR Helmholtz Center for Ocean Research Kiel, Helmholtz Association of German Research Centres (HZ), Kiel, Schleswig-Holstein, Germany
  • 28University of Washington, School of Oceanography, Seattle, WA, United States

The field of oceanography is transitioning from data-poor to data-rich, thanks in part to increased deployment of in-situ platforms and sensors, such as those that instrument the US-funded Ocean Observatories Initiative (OOI). However, generating science-ready data products from these sensors, particularly those making biogeochemical measurements, often requires extensive end-user calibration and validation procedures, which can present a significant barrier. Openly available community-developed and -vetted Best Practices contribute to overcoming such barriers, but collaboratively developing user-friendly Best Practices can be challenging. Here we describe the process undertaken by the NSF-funded OOI Biogeochemical Sensor Data Working Group to develop Best Practices for creating science-ready biogeochemical data products from OOI data, culminating in the publication of the GOOS-endorsed OOI Biogeochemical Sensor Data Best Practices and User Guide. For Best Practices related to ocean observatories, engaging observatory staff is crucial, but having a “user-defined” process ensures the final product addresses user needs. Our process prioritized bringing together a diverse team and creating an inclusive environment where all participants could effectively contribute. Incorporating the perspectives of a wide range of experts and prospective end users through an iterative review process that included “Beta Testers” enabled us to produce a final product that combines technical information with a user-friendly structure that illustrates data analysis pipelines via flowcharts and worked examples accompanied by pseudo-code. Our process and its impact on improving the accessibility and utility of the end product provide a roadmap for other groups undertaking similar community-driven activities to develop and disseminate new Ocean Best Practices.

1 Introduction and motivation

In recent years, the volume of oceanographic data has increased dramatically, prompting a greater awareness of and engagement with Open Science practices (Fecher and Friesike, 2014), which aim to accelerate discovery, promote greater inclusivity and participation, improve transparency, and support collaborations. The US National Science Foundation (NSF)-funded Ocean Observatories Initiative (OOI), the outcome of a decades-long process of envisioning and implementing a new “observatory science” model of oceanography, has embraced this transition to Open Science (Smith et al., 2018; Steinhardt, 2018; Ocean Observatories Initiative Facility Board, 2021). All data collected by OOI are made freely available in near-real time, providing novel opportunities as well as challenges for those in the oceanographic community who seek to use these data.

The OOI arrays incorporate sensors measuring a wide range of Essential Ocean Variables (EOVs; Lindstrom et al., 2012) on moored and mobile autonomous platforms deployed across a variety of coastal and open ocean environments (Trowbridge et al., 2019). These sensors, which include instruments that characterize the physical environment (e.g. CTDs measuring conductivity, temperature, and depth; acoustic Doppler current profilers, or ADCPs) as well as biogeochemical sensors that measure biological and chemical EOVs (e.g. chlorophyll fluorescence, dissolved oxygen), provide great potential to study a wide range of important and interdisciplinary oceanographic questions. Despite this potential, OOI has found that the biogeochemical1 parameter data are underutilized. Though biogeochemical sensors represent over a third of the OOI sensors (333 of 932 making up the arrays), the associated data have been used in only ~10% of tracked publications2 (10 of 104 peer-reviewed papers using OOI data published through the end of 2022: de Jong and De Steur, 2016; Lozier et al., 2017; Barth et al., 2018; Henderikx Freitas et al., 2018; Palevsky and Nicholson, 2018; Zhang and Partida, 2018; Greengrove et al., 2020; Levine et al., 2020; Reimers and Fogaren, 2021; Oliver et al., 2022), and in most cases interpreted qualitatively rather than quantitatively.

This underutilization is partly because generating science-ready data products from biogeochemical sensors requires human-in-the-loop (HITL) calibration and validation procedures, such as application of gain or drift corrections, that go beyond those currently included in OOI’s internal data processing scope. Many of the key research questions that oceanographers seek to address using biogeochemical data involve rate calculations (e.g., air-sea carbon dioxide and oxygen fluxes and biological carbon export from the surface ocean) and differentiation between long-term changes and natural variability (e.g., ocean deoxygenation and acidification), which require carefully calibrated and quality-controlled data. For example, Emerson et al. (2008) found that uncertainty of ±0.5% in moored dissolved oxygen measurements yielded uncertainty of ±50% in their calculated biological oxygen production rate. Despite community recognition that the development and application of robust procedures for automated and HITL post-deployment data processing are necessary to produce science-ready data from bio-optical and chemical sensors (Boss et al., 2012), the funded scope of the OOI program leaves this step to the end-user.
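To make the gain-correction step concrete, a minimal Python sketch is given below. This is an illustrative example only, not the User Guide's exact procedure: the function name, its inputs (discrete validation samples such as Winkler titrations paired with coincident sensor readings), and the use of a single mean gain are all simplifying assumptions for the sketch.

```python
import numpy as np

def gain_correct_oxygen(raw_o2, discrete_o2, sensor_o2_at_sampling):
    """Apply a multiplicative gain correction to a sensor oxygen time series.

    Illustrative sketch: the gain is estimated as the mean ratio of discrete
    validation samples (e.g., Winkler titrations) to the sensor readings
    coincident with those samples, then applied to the full record.
    """
    gain = np.mean(np.asarray(discrete_o2, dtype=float) /
                   np.asarray(sensor_o2_at_sampling, dtype=float))
    return gain * np.asarray(raw_o2, dtype=float)
```

In practice, an end user would also examine the scatter in the individual sample/sensor ratios to judge whether a single gain is appropriate or whether a time-dependent (drift) correction is needed.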

To broaden the use of OOI biogeochemical sensor data and increase community capacity to produce science-ready data products, the OOI Biogeochemical Sensor Data (OOI BGC) Working Group (hereafter referred to as the Working Group)3 was formed in 2021, bringing together participants with expertise in biogeochemical sensor calibration and analysis from within and beyond the existing OOI data user community. The Working Group convened a virtual kickoff workshop in 2021, followed by virtual bimonthly working meetings, and a three-day in-person workshop in 2022. The work culminated in the publication of the Global Ocean Observing System (GOOS)-endorsed OOI Biogeochemical Sensor Data Best Practices and User Guide (Palevsky et al., 2023). With the aim of providing a broadly applicable blueprint for sustained and inclusive collaborative efforts, this paper presents an overview of the process we used to develop this User Guide, which was informed by existing best practices and the extensive experience of the US Ocean Carbon & Biogeochemistry (OCB) Project Office in facilitating community- and consensus-building activities. We then describe the impact of our process on the utility and accessibility of the resultant final User Guide, and conclude with lessons to support future efforts to develop new Best Practices that serve the needs of the scientific community.

2 Community-driven process for best practices development

Our process prioritized bringing together participants across diverse backgrounds and skill sets and creating an inclusive and supportive environment where all participants could effectively contribute their insights and ideas. The Working Group leaders used guiding principles and community- and consensus-building tools drawn from inclusive pedagogical and facilitation practices (e.g., Stanfield, 2000; Cohen and Lotan, 2014; Jack-Scott et al., 2023) to foster effective collaboration and co-develop Working Group products.

2.1 Building the team: openness, transparency, and intention

In recruiting members of the 25-person Working Group and 14 “Beta Testers” who reviewed the initial draft document, we started with an open application, shared widely across US and international oceanographic networks. Casting a wide net is important for capturing a breadth of knowledge and diversity of viewpoints, as is clearly stating the Working Group’s goals at the application stage to ensure that applicants are vested in the process and outcomes. Application questions were designed to query expertise and experience with biogeochemical sensors, familiarity and experience with OOI, capacity to commit to the stated Working Group activities, and experience with other ocean observing networks that would lend broader insights. At the time of application, we described the evaluation process, criteria, and anticipated timeline. We also shared explicit expectations of the time commitment and workload, anticipated frequency and modes of participation (synchronous and asynchronous), project timelines, and anticipated outcomes. In addition to scientific, technical, and sensor expertise, organizers sought to achieve demographic balance (gender, race, ethnicity, career stage, geographic) and representation of groups traditionally marginalized in science. The selected Working Group members4 and Beta Testers (who together comprise the authors of this paper) came from 23 institutions across 7 countries, were more than half women, and included graduate students, postdoctoral researchers, early career faculty, technical staff scientists, and senior scientists.

2.2 Setting the tone

Listen with the possibility of learning. Speak with the knowledge that you will be heard5.

To establish a positive and constructive tone, we began both the virtual and in-person workshops by sharing a code of conduct (following OCB’s Code of Conduct6), with an emphasis on fostering a culture of mutual respect among members for the expertise and viewpoints each person brought to the group, and toward creating a safe and inclusive collaboration space (see Slide Decks in the Supplementary Materials). Working Group members conducted the majority of their work virtually during a global pandemic. Working Group leaders repeatedly acknowledged the challenges of working remotely while dealing with pandemic illness, trauma, and workplace changes, and ensured that the timelines for the Working Group activities were realistic and generous. While the benefits of oceanographic best practices are widespread, there is typically no salary support to cover the time and effort required to generate materials related to best practices. The Working Group leaders continually fostered norms of respect for each other’s intellectual capital and intellectual property, grace and flexibility for each other’s competing obligations, and acknowledgement and appreciation of each other as people beyond their scientific identities and Working Group tasks.

2.3 Promoting progress with effective tools and interactions

Working Group members paid careful attention to the collaboration process, conducting both synchronous and asynchronous activities and using collaborative organization (Google Docs) and communication (Slack, email) tools. During synchronous time together, the Working Group organizers aimed to maximize interaction and opportunities for meaningful collaboration among the participants, rather than devoting the majority of meeting time to more passive activities (e.g. seminar-style lectures and talks). The need for this approach was especially apparent amidst the realities of Zoom fatigue and coordination across multiple time zones while meeting virtually due to the COVID-19 pandemic, as well as gathering in June 2022 for what was many participants’ first in-person professional meeting in more than two years, but offers more generally applicable lessons as well.

Similar to the “flipped classroom” model of replacing in-person or synchronous lectures with recorded lectures or readings completed prior to class time (DeLozier and Rhodes, 2017), the organizers shared material that might otherwise have been communicated through presentations in advance of both the July 2021 virtual workshop and the June 2022 in-person workshop and asked participants to complete short activities engaging with these materials (see Supplementary Materials). The agendas for these workshops prioritized small-group conversations, each structured around a specific topic and facilitated by prepared discussion prompts and worksheets (agendas, worksheets, and slides with discussion prompts are provided in the Supplementary Materials). Prior to the initial virtual workshop, Working Group members were divided into four smaller sub-groups, each focused on a biogeochemical sensor type, with sub-group composition balanced to ensure that all groups had experts in both the OOI program and other ocean observing programs employing BGC sensors, as well as expertise in the sensors themselves. For each topic, Working Group members first discussed ideas among their own sub-group. To enable idea sharing across the different sub-groups, each topic was subsequently discussed by “mixing groups,” each of which was composed of members of each of the four sensor-specific sub-groups, an approach inspired by the “jigsaw” teaching technique (Aronson, 1978). The composition of these “mixing groups” was rotated throughout the 3-day virtual workshop such that all participants had a chance to engage in a small group conversation with each of the other members of the Working Group at least once during the meeting. 
Different from a traditional jigsaw activity, Working Group members concluded discussion of each topic by returning to their assigned sub-group after the “mixing group” conversations, enabling them to share lessons learned and discuss applications of those lessons to their own sub-group’s sensor type. At the end of each day of the virtual workshop, organizers collected feedback from all participants through Google Form surveys, which — together with the notes from the small groups’ worksheets and conversations — enabled them to ensure all Working Group members’ perspectives were incorporated in shaping group decisions on the scope and structure of the final product.

The in-person workshop built upon this structure, integrating Beta Testers into the Working Group sub-groups for discussions focused on reviewing the sensor-specific chapters of the draft User Guide, while also intentionally curating small groups that mixed Beta Testers and Working Group members across different sub-groups. The agenda (see Supplementary Materials) prioritized opportunities for all participants to get to know each other and cross-pollinate ideas during structured introductory activities, informal break times, and two “gallery walks” where participants discussed and then shared ideas, with the results of this collective brainstorming recorded on colored sticky notes posted around the meeting space. The first two days of the workshop featured Beta Tester feedback on the draft User Guide and discussions of the scientific potential offered by science-ready OOI BGC datasets, offering meaningful opportunities for Beta Tester engagement with the Working Group prior to working on revisions to the Guide in mixed Beta Tester and Working Group member sensor-specific sub-groups on the final day of the workshop.

2.4 Iterative review process

Creating a Best Practices and User Guide that reflects consensus across the scientific community and is clear and accessible to a broad range of users requires broad community input. Our process incorporated three distinct stages of review (Figure 1). At each stage, the Working Group members edited and revised the Best Practices and User Guide based on reviewer feedback. Stage 1, the internal Working Group review, consisted of multiple rounds of internal peer review by the Working Group members. Once a full draft of the User Guide had been completed and internally reviewed, Stage 2 of the review process consisted of review by “Beta Testers” external to the Working Group. Finally, Stage 3 of the review process was an open review by the community. The recruitment of Beta Testers to review the guide represented an addition to the traditional peer review process, and is a practice that was adopted from the technology industry with the aim of verifying the usability of a new product before public release.


Figure 1 Process of Best Practice development. Diagrammatic representation of the process undertaken by the Working Group (WG) to support the project’s overall goals to capture consensus across the scientific community on the content (highlighted in blue), as well as to produce a document that would be clear and accessible to a broad range of users (highlighted in red). The review process was made up of three distinct stages. At each stage, the Working Group members edited and revised the Best Practices and User Guide based on reviewer feedback. This diagram summarizes the process and highlights how particular stages and/or activities undertaken by the Working Group enhanced the end product and supported the stated goals.

Once work began on the initial draft text of the User Guide, all Working Group members participated in two rounds of internal peer review. To reduce the workload, each Working Group member focused their reviews on one chapter other than their own, with review assignments distributed so that every chapter was reviewed by the lead authors of each of the other chapters in the document.

Once a full draft had been completed and internally revised by the Working Group, Beta Testers were recruited and tasked with accessing OOI data and applying quality assurance/quality control methods and data corrections to prepare science-ready data based on the instructions in the draft (see Supplementary Materials for full instructions provided to Beta Testers). Beta Testers then gathered together with Working Group members at an in-person workshop, where they provided crucial feedback on their experience using the draft Best Practices document, including identifying steps that were confusing or required further explanation. Following the Beta Tester review and the discussions from the in-person workshop, the Working Group members made significant revisions to the draft guide (see further detail in Section 3 below). A complete revised draft (Version 1.0.0) was broadly disseminated and made available for open community review, following which a final version incorporating community feedback (Palevsky et al., 2023) was reviewed and endorsed by the GOOS Biogeochemistry panel and archived in the Ocean Best Practices Repository (Pearlman et al., 2019).

3 Impact of our community-driven process on the final product

The goal of the Working Group was to produce a set of Best Practices that would not only instruct users on the steps needed to prepare science-ready BGC data products from OOI data streams, but would also achieve this in a way that was broadly accessible and user-friendly, especially for those new to OOI and/or these types of sensor data. The form of the final product, the OOI Biogeochemical Sensor Data Best Practices and User Guide (Palevsky et al., 2023), was intricately and inextricably linked to the process undertaken by the Working Group to develop and test it. The involvement of Beta Testers, who included a number of participants previously unfamiliar with OOI data and representing a range of career stages from graduate students to senior scientists, provided an opportunity to rigorously test the usability of our product.

The final document includes a ‘Quick Start Guide’ with the basic information needed to work with the guide, a more complete Introduction with detailed information on the OOI program and data access, and four chapters specific to each BGC sensor type. Each of the four chapters on specific sensor types follows a parallel structure, and includes recommendations for the end-user processing steps needed to prepare science-ready data products from the OOI data. These steps are fully described in the text, and summarized in one or more sensor-specific data processing flowcharts in each chapter (example in Figure 2). Worked examples are also included in each chapter to illustrate the application of the recommended end-user data processing steps to an example OOI dataset. Each worked example includes pseudo-code to support users in developing their own data analysis pipeline suited to their specific application in the programming language of their choice.


Figure 2 End-user biogeochemical data processing flowchart from the OOI Biogeochemical Sensor Data Best Practices and User Guide (Palevsky et al., 2023, licensed CC-BY-NC-ND-4.0). This flowchart, included in the introduction, provides a summary of recommended end-user quality control (e.g. QARTOD tests, Toll, 2012) and data processing steps common to all OOI BGC sensors. Each of the subsequent chapters includes a flowchart following these same overall steps that illustrates the sensor-specific application of this processing. The idea to develop and incorporate these flowcharts emerged from conversations among Beta Testers and Working Group members during the June 2022 workshop.
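As a concrete illustration of the kind of automated quality control referenced in the flowchart, the sketch below implements a QARTOD-style gross range test using the standard QARTOD flag convention (1 = pass, 3 = suspect, 4 = fail, 9 = missing data). The function name and the idea of user-supplied thresholds are assumptions for this sketch; the actual thresholds for each OOI sensor come from the guide and the QARTOD manuals, not from this example.

```python
import numpy as np

def gross_range_test(values, fail_min, fail_max, suspect_min, suspect_max):
    """QARTOD-style gross range test (illustrative sketch).

    Returns integer flags using the standard QARTOD convention:
    1 = pass, 3 = suspect, 4 = fail, 9 = missing data.
    """
    v = np.asarray(values, dtype=float)
    flags = np.ones(v.shape, dtype=int)                 # assume pass
    flags[(v < suspect_min) | (v > suspect_max)] = 3    # outside operator range
    flags[(v < fail_min) | (v > fail_max)] = 4          # outside sensor range
    flags[np.isnan(v)] = 9                              # missing observations
    return flags
```

Suspect and fail thresholds are applied in that order so that a value outside the sensor's measurable range always receives the more severe flag.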

Two of the elements that we believe are most useful to current and prospective OOI BGC data users — the end-user data processing flowcharts (Figure 2) and the worked examples with pseudo-code — emerged from the feedback Beta Testers shared with the Working Group members. The Beta Testers reported that they found it difficult to follow and implement the instructions in the draft version of the User Guide that they had been provided to review. In many cases, the Beta Testers encountered difficulties because the original worked examples were provided in a programming language that they were unfamiliar with. During the 2022 workshop, discussion of Beta Tester feedback in both small breakout groups and a full-group plenary session led us to collectively generate the idea to include end-user flowcharts that would lay out all of the key ‘ingredients’ and steps for working with each type of BGC data, and to restructure the Worked Examples as illustrations of how to implement the steps shown in the flowcharts. Initial versions of the end-user flowcharts were developed during subsequent workshop breakout sessions that mixed together Working Group members and Beta Testers. In the final version of the User Guide, worked examples were updated to explain the data processing steps in pseudo-code, agnostic to the coding language end users will ultimately choose for their own implementation, with parallel structure of the flowcharts and steps across all chapters to support users working with multiple sensor types.
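In the spirit of those language-agnostic worked examples, the overall flow of discarding flagged data and then applying corrections can be sketched schematically in Python. The function, its arguments, and the simple linear drift model below are illustrative assumptions, not the User Guide's actual recipe for any particular sensor.

```python
import numpy as np

def prepare_science_ready(days, raw, flags, gain=1.0, drift_per_day=0.0):
    """Schematic end-user pipeline: drop QC failures, then correct the data.

    `days` is elapsed time since deployment; `drift_per_day` is a signed
    linear drift correction applied relative to the deployment start.
    QC flags follow the QARTOD convention, where 4 = fail.
    """
    t = np.asarray(days, dtype=float)
    v = np.asarray(raw, dtype=float)
    keep = np.asarray(flags) != 4            # discard QARTOD 'fail' points
    corrected = gain * v[keep] + drift_per_day * t[keep]
    return t[keep], corrected
```

Each chapter's flowchart fills in the sensor-specific details of these generic steps: which automated tests to run, how to derive the gain from validation data, and whether a drift correction is warranted.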

Conversations among the Beta Testers and Working Group members were also essential in clarifying and communicating the scope of the User Guide. We had initially envisioned the document as a “cookbook” for end users seeking to work with OOI BGC sensor data, but it became clear that the processing required to meet the specific needs of all end users across a wide range of potential scientific applications and combinations of OOI BGC data from different sensors and platforms couldn’t be synthesized into a single “recipe”. We therefore opted to provide the background information and principles needed for the end user to successfully identify and understand all the available “ingredients” (data), the types of “cooking” (end-user processing) that are recommended to prepare them, including how to identify and correct common data issues, and a few sample “recipes” (worked examples) to support end users in developing their own “recipes” for science-ready data.

4 Conclusions and outlook

In the decade since the OOI began collecting data, the transition to Open Science has further accelerated, both in the ocean sciences and in other disciplines, as evidenced by NASA’s Transform to OPen Science (TOPS) mission7 and the designation of the year 2023 by the White House Office of Science and Technology Policy as the Year of Open Science. However, there remain challenges to fulfilling the promise offered by Open Science and publicly-available data, highlighted by the challenges in community utilization of OOI BGC data that this Working Group aimed to address. Even when data are freely available, researchers interested in using those data often face other challenges, such as limited access to training, resources, networking opportunities, and freedom to pursue risky or unfunded projects. This acts as a barrier to entry, especially for early career scientists and those from less resource-rich institutions and nations. Such barriers can perpetuate existing systemic inequities that Open Science is intended to counteract. Recognizing this challenge, we endeavored to follow an inclusive process and involve a diverse cross-section of the scientific community to ensure that the Best Practices we produced would not reinforce existing silos and barriers to entry that arise when the needs of data users external to or unaffiliated with an observing program are not considered in the development of training materials aimed at data users.

The Working Group leaders proposed and planned this activity in early 2020, and had to adjust as the COVID-19 pandemic progressed. Although the intention to make this activity as inclusive and supportive as possible was woven into the planning and execution of the activity from the very beginning, significant changes to the medium and sequence of meetings were made to respond to the unfolding global situation, as well as lessons learned along the way. Here we highlight some of the lessons learned through this process and how they impacted the Working Group and the final product, with the goal of supporting organizers of future similar activities:

● Bracketing the Working Group activity with an initial kick-off workshop and a follow-up workshop after a draft document was completed aided in keeping Working Group members engaged and on task, and allowed time for ideas to develop and be tested. This also helped to provide accountability along with clearly set intermediate goals and deadlines. The originally proposed timeline only included a single in-person workshop at the beginning of the Working Group activity, with the addition of a second workshop driven by the need to meet online rather than in person in summer 2021. The additional workshop ended up offering major benefits to the Working Group experience and quality of the final product.

● Encouraging broad discussion of the scope of the Working Group activities is crucial, but leaders must also guard against scope-creep to protect the limited resources available to complete the work (particularly the time of Working Group members). Many exciting ideas were floated during discussions (e.g., developing a GitHub repository for OOI BGC data analysis pipelines) but would have detracted from the core goal of developing user-friendly and accessible Best Practices. However, allowing the time to discuss these ideas was important in enabling Working Group members to share their perspectives on community needs, which ultimately strengthened the final product.

● A “user-driven” process along with close coordination with OOI staff was important to ensure that the final product would be both accurate and useful for the community. OOI staff who served as Working Group members provided invaluable support and expertise related to OOI resources and workflows, while non-OOI Working Group members provided the OOI staff with a much broader view of the needs of existing and potential users.

● Beta Tester input improved the final product. Our experience shows that intentionally incorporating a layer of review that directly addresses the clarity and usability of a Best Practice greatly improves the final product. The input and contributions of the Beta Testers based on their experience of trying to use the draft User Guide for data analysis uncovered many gaps, deficiencies, and inconsistencies across chapters. Including Beta Testers as well as Working Group members in the June 2022 in-person workshop was key in enabling the Beta Testers’ feedback to meaningfully shape the final product.

A key role of a scientific project office (OCB, US CLIVAR, SCOR, and others) is to provide professional facilitation and staff support of community activities to ensure their success. This Working Group’s activities were co-coordinated with the OCB Project Office. OCB is a network of scientists working across disciplines to understand the ocean’s role in the global carbon cycle and the response of marine ecosystems and biogeochemical cycles to environmental change. The OCB network represents a large prospective OOI BGC data user community, and the OCB Project Office has ample experience facilitating small- and large-group activities to build community, capacity, and consensus. This OOI BGC Working Group, in particular, provided a new model for consensus-based group productivity. The leadership and members of this Working Group paid particular attention to implementing a process that resulted in a new level of collaboration and inclusivity, reflected in the quality of the resulting Best Practices and User Guide. This process can serve as a model for other group activities in ocean science and other disciplines. Notably, the OCB Project Office has incorporated elements of this process into OCB’s Guidelines for Workshops & Activities, including explicitly stating goals and expectations at the recruitment stage, establishing clear intentions for collaboration early in the process, and ensuring an open and inclusive process with diverse viewpoints throughout.

Community Best Practices are not static documents. As science and technology advance, so must the standards and Best Practices that enable and support scientific breakthroughs. The work of developing standards and Best Practices has long been perceived as a “service” activity rather than a substantive contribution to scholarship. Given the rigorous community review typically required for a Best Practices document, and its far-reaching impact in reducing or removing barriers across the community, it is imperative to support, acknowledge, and incentivize these critical contributions to the field and to build capacity to carry out these activities, particularly among early career scientists.

Data availability statement

The original contributions presented in the study are included in the article/Supplementary Material. Further inquiries can be directed to the corresponding authors.

Author contributions

HP: Conceptualization, Funding acquisition, Project administration, Writing – original draft, Writing – review & editing. SC: Conceptualization, Funding acquisition, Project administration, Writing – original draft, Writing – review & editing. HB: Conceptualization, Funding acquisition, Project administration, Writing – original draft, Writing – review & editing. MaM: Project administration, Writing – original draft, Writing – review & editing. DA: Writing – review & editing. RB: Writing – review & editing. JB: Writing – review & editing. AB: Writing – review & editing. EB: Writing – review & editing. FC: Writing – review & editing. AC: Writing – review & editing. RE: Writing – review & editing. RF: Writing – review & editing. KF: Writing – review & editing. JF: Writing – review & editing. SH: Writing – review & editing. IL: Writing – review & editing. CM: Writing – review & editing. JN: Writing – review & editing. MN: Writing – review & editing. HO: Writing – review & editing. AR: Writing – review & editing. JR: Writing – review & editing. CS: Writing – review & editing. IW: Writing – review & editing. CW: Writing – review & editing. KB: Writing – review & editing. BC: Writing – review & editing. JC: Writing – review & editing. SF: Writing – review & editing. MH: Writing – review & editing. MM: Writing – review & editing. TM: Writing – review & editing. SR-C: Writing – review & editing. NR: Writing – review & editing. TS: Writing – review & editing. PT-P: Writing – review & editing. MV: Writing – review & editing. MY: Writing – review & editing.

Funding

The author(s) declare financial support was received for the research, authorship, and/or publication of this article. This work was funded by the National Science Foundation (NSF OCE-2033919 to SC, HIP, and HB and NSF OCE-1946072 to HIP). Logistics and salary support (MaM, HB) for OOI BGC Working Group activities was provided by the Ocean Carbon & Biogeochemistry (OCB, us-ocb.org) Project Office, which receives support from the National Science Foundation (NSF OCE-1850983) and the National Aeronautics & Space Administration (NASA 80NSSC21K0413). We acknowledge additional workshop participant support from the OOI Facilities Board and the OOI program, and publication support from the OOI program (NSF OCE-1743430).

Acknowledgments

All Working Group members (DA, RB, JB, AB, EB, FC, AC, RE, RF, KF, JF, SH, IL, CM, JN, MN, HO, AR, JR, CS, IW, CW) contributed equally to the process of developing best practices for OOI biogeochemical sensors. Additional authors (KB, BC, JC, SF, MH, MM, TM, SR-C, NR, TS, PTP, MV, MY) contributed to the detailed review of the best practices document as Beta Testers and participated in a workshop focused on improving the final product and exploring broader applications of OOI biogeochemical sensor data.

Conflict of interest

Author MN is employed by Global Science and Technology Inc.

The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher’s note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

Supplementary material

The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/fmars.2024.1358591/full#supplementary-material

Footnotes

  1. ^ For the purposes of this paper, and the work described herein, we use “biogeochemical sensors” as a catch-all term for sensors that measure dissolved oxygen, nitrate, bio-optical properties, and carbonate system chemistry components.
  2. ^ https://ooipublications.whoi.edu/biblio.
  3. ^ https://www.us-ocb.org/ooi-dataset-community/.
  4. ^ OOI BGC working group members: https://www.us-ocb.org/ooi-dataset-community/.
  5. ^ Based on the facilitation practices of the Science Museum of Minnesota’s Inclusion, Diversity, Equity, Access, Leadership (IDEAL) Center.
  6. ^ https://www.us-ocb.org/about/ocb-program-code-of-conduct/.
  7. ^ https://nasa.github.io/Transform-to-Open-Science/.

References

Aronson E. (1978). The Jigsaw Classroom. (Thousand Oaks, CA, United States: SAGE Publications, Inc.).

Barth J., Fram J., Dever E., Risien C., Wingard C., Collier R., et al. (2018). Warm Blobs, low-oxygen events, and an eclipse: the ocean observatories initiative endurance array captures them all. Oceanography 31, 90–97. doi: 10.5670/oceanog.2018.114

Boss E., Neely M. B., Werdell J. (2012). Report from the COL-NASA Data QA/QC Workshop. 6-8 June 2012 (Orono, ME, United States: University of Maine).

Cohen E. G., Lotan R. A. (2014). Designing groupwork: strategies for the heterogeneous classroom third edition (New York, NY, United States: Teachers College Press).

de Jong M. F., De Steur L. (2016). Strong winter cooling over the Irminger Sea in winter 2014-2015, exceptional deep convection, and the emergence of anomalously low SST: IRMINGER SEA COOLING AND CONVECTION. Geophys. Res. Lett. 43, 7106–7113. doi: 10.1002/2016GL069596

DeLozier S. J., Rhodes M. G. (2017). Flipped classrooms: A review of key ideas and recommendations for practice. Educ. Psychol. Rev. 29, 141–151. doi: 10.1007/s10648-015-9356-9

Emerson S., Stump C., Nicholson D. (2008). Net biological oxygen production in the ocean: Remote in situ measurements of O2 and N2 in surface waters. Global Biogeochem. Cycles. 22. doi: 10.1029/2007GB003095

Fecher B., Friesike S. (2014). “Open Science: One Term, Five Schools of Thought,” in Opening Science. Eds. Bartling S., Friesike S. (Springer International Publishing, Cham), 17–47. doi: 10.1007/978-3-319-00026-8_2

Greengrove C., Lichtenwalner S., Palevsky H., Pfeiffer-Herbert A., Severmann S., Soule D., et al. (2020). Using authentic data from NSF’s ocean observatories initiative in undergraduate teaching: an invitation. Oceanography 33 (1), 62–73. doi: 10.5670/oceanog.2020.103

Henderikx Freitas F., Saldías G., Goñi M., Shearman R. K., White A. (2018). Temporal and spatial dynamics of physical and biological properties along the endurance array of the California current ecosystem. Oceanography 31, 80–89. doi: 10.5670/oceanog.2018.113

Jack-Scott E., Aponte K. L., Bhatia R., Behl M., Burke J., Burt M., et al. (2023). Inclusive Scientific Meetings: Where to Begin. 500 Women Scientists. Available at: https://500womenscientists.org/inclusive-scientific-meetings.

Levine R. M., Fogaren K. E., Rudzin J. E., Russoniello C. J., Soule D. C., Whitaker J. M. (2020). Open data, collaborative working platforms, and interdisciplinary collaboration: building an early career scientist community of practice to leverage ocean observatories initiative data to address critical questions in marine science. Front. Mar. Sci. 7. doi: 10.3389/fmars.2020.593512

Lindstrom E., Gunn J., Fischer A., McCurdy A., Glover L. K. (2012). “A Framework for Ocean Observing,” in By the Task Team for an Integrated Framework for Sustained Ocean Observing, (Paris, France: UNESCO). doi: 10.5270/OceanObs09-FOO

Lozier S. M., Bacon S., Bower A. S., Cunningham S. A., Femke De Jong M., De Steur L., et al. (2017). Overturning in the Subpolar North Atlantic program: A new international ocean observing system. Bull. Am. Meteorol. Soc 98, 737–752. doi: 10.1175/BAMS-D-16-0057.1

Ocean Observatories Initiative Facility Board. (2021). Ocean Observatories Initiative (OOI) Science Plan: Exciting Science Opportunities using OOI Data. Available online at: https://ooifb.org/reports/ooi-science-plan.

Oliver H., Zhang W. G., Archibald K. M., Hirzel A. J., Smith W. O., Sosik H. M., et al. (2022). Ephemeral surface chlorophyll enhancement at the new england shelf break driven by ekman restratification. J. Geophys. Res. Oceans. 127 (1), e2021JC017715. doi: 10.1029/2021JC017715

Palevsky H., Clayton S., Atamanchuk D., Battisti R., Batryn J., Bourbonnais A., et al. (2023). “OOI Biogeochemical Sensor Data Best Practices and User Guide,” in Version 1.1.1. Ocean Observatories Initiative, Biogeochemical Sensor Data Working Group, 135 pp. doi: 10.25607/OBP-1865.2

Palevsky H., Nicholson D. (2018). The North Atlantic biological pump: insights from the ocean observatories initiative irminger sea array. Oceanography 31, 42–49. doi: 10.5670/oceanog.2018.108

Pearlman J., Bushnell M., Coppola L., Karstensen J., Buttigieg P. L., Pearlman F., et al. (2019). Evolving and sustaining ocean best practices and standards for the next decade. Front. Mar. Sci. 6, 277. doi: 10.3389/fmars.2019.00277

Reimers C. E., Fogaren K. E. (2021). Bottom boundary layer oxygen fluxes during winter on the oregon shelf. J. Geophys. Res. Oceans. 126 (3), e2020JC016828. doi: 10.1029/2020JC016828

Smith L. M., Barth J. A., Kelley D. S., Plueddemann A., Rodero I., Ulses G. A., et al. (2018). The ocean observatories initiative. Oceanography 31, 16–35. doi: 10.5670/oceanog.2018.105

Stanfield R. B. (Ed.) (2000). The art of focused conversation: 100 ways to access group wisdom in the workplace (Gabriola Island, BC, Canada: New Society Publishers).

Steinhardt S. B. (2018). The Instrumented Ocean: How Sensors, Satellites, and Seafloor-Walking Robots Changed What It Means to Study the Sea (Ithaca, NY, United States: Cornell University).

Toll R. (Ed.) (2012). U.S. IOOS QARTOD Project Plan (Silver Spring, MD, United States: IOOS), 8 pp. doi: 10.25607/OBP-533

Trowbridge J., Weller R., Kelley D., Dever E., Plueddemann A., Barth J. A., et al. (2019). The ocean observatories initiative. Front. Mar. Sci. 6. doi: 10.3389/fmars.2019.00074

Zhang W. G., Partida J. (2018). Frontal subduction of the mid-atlantic bight shelf water at the onshore edge of a warm-core ring. J. Geophys. Res. Oceans. 123, 7795–7818. doi: 10.1029/2018JC013794

Keywords: ocean best practices, biogeochemical sensors, ocean observatories initiative, working group, beta testers

Citation: Palevsky HI, Clayton S, Benway H, Maheigan M, Atamanchuk D, Battisti R, Batryn J, Bourbonnais A, Briggs EM, Carvalho F, Chase AP, Eveleth R, Fatland R, Fogaren KE, Fram JP, Hartman SE, Le Bras I, Manning CCM, Needoba JA, Neely MB, Oliver H, Reed AC, Rheuban JE, Schallenberg C, Walsh I, Wingard C, Bauer K, Chen B, Cuevas J, Flecha S, Horwith M, Melendez M, Menz T, Rivero-Calle S, Roden NP, Steinhoff T, Trucco-Pignata PN, Vardaro MF and Yoder M (2024) A model for community-driven development of best practices: the Ocean Observatories Initiative Biogeochemical Sensor Data Best Practices and User Guide. Front. Mar. Sci. 11:1358591. doi: 10.3389/fmars.2024.1358591

Received: 20 December 2023; Accepted: 23 February 2024;
Published: 03 April 2024.

Edited by:

Jay S. Pearlman, Institute of Electrical and Electronics Engineers (France), France

Reviewed by:

Sutara H. Suanda, University of North Carolina Wilmington, United States

Copyright © 2024 Palevsky, Clayton, Benway, Maheigan, Atamanchuk, Battisti, Batryn, Bourbonnais, Briggs, Carvalho, Chase, Eveleth, Fatland, Fogaren, Fram, Hartman, Le Bras, Manning, Needoba, Neely, Oliver, Reed, Rheuban, Schallenberg, Walsh, Wingard, Bauer, Chen, Cuevas, Flecha, Horwith, Melendez, Menz, Rivero-Calle, Roden, Steinhoff, Trucco-Pignata, Vardaro and Yoder. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Hilary I. Palevsky, palevsky@bc.edu; Sophie Clayton, sophie.clayton@noc.ac.uk

†These authors share first authorship

‡These authors have contributed equally to this work