Improving the Statistical Soundness of Monitoring Programs
Municipalities rely heavily on water quality monitoring programs to identify stressors, trace those stressors back to root causes, and assess whether management actions are effective in mitigating them. But do today's monitoring programs actually satisfy modern expectations around these needs?
Monitoring programs sample “important” parameters at “important” locations during “important” events. Traditionally, “important” has been defined by a hydrological understanding of the watershed, coupled with the ability to satisfy simple quantitative metrics such as benchmarks, limits and indices. However, as expectations from the State and the public rise, and the world becomes increasingly digital and computationally driven, this is no longer enough. Today, the goal should be a data-rich monitoring program that provides a complete quantitative picture of the watershed.
The USGS categorizes monitoring program objectives into three categories: status, trend, and compliance. This session will discuss how to optimize measured parameters, temporal frequency, and spatial coverage for the objectives and specifics of each program type. For example, it will demonstrate how to reduce the number of parameters and stations considered in trend-oriented programs, and how to improve spatial coverage for better root cause identification in compliance-oriented programs. It will additionally discuss how statistical modeling should be regarded not just as a design tool but also as a consumer of monitoring data. Through iterative improvement, the models presented in this session enable a hydrology-based monitoring program to additionally achieve statistical soundness, both in design and for consumption.
This session will begin by discussing numerical challenges with traditional monitoring program design. It will demonstrate issues of size and complexity that challenge stream order hierarchical approaches, problems that arise from mapping qualitative observations to quantitative measures in risk-based approaches (RBAs), and the limits of confidence intervals, trend analysis and other simple numeric measures common to many Data Quality Objectives (DQOs). It will then cover several statistical concepts as they relate to monitoring program design, specifically the tradeoffs between confidence, precision and frequency, a topic relevant both to traditional approaches like DQOs and to more statistically rigorous ones. Next, it will present several statistical methods that can inform program design, including multiple correlation and regression analysis (MCRA), principal component analysis (PCA), and hierarchical cluster analysis (HCA), used to optimize parameter selection, spatial coverage and temporal frequency.
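As a minimal sketch of how such methods can flag redundancy, the following example applies PCA and hierarchical clustering to a synthetic monitoring matrix. The data, the number of parameters, and the "redundant runoff-driven parameters" scenario are all hypothetical illustrations, not results from any actual program:

```python
# Illustrative sketch: PCA to gauge how much of the variance a few
# components capture, and hierarchical clustering to group parameters
# that carry largely redundant information. All data are synthetic.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(42)

# Hypothetical monitoring matrix: 60 sampling events x 6 parameters.
# Parameters 0-2 are driven by one shared latent signal (e.g., runoff
# volume), so they are strongly correlated with one another.
shared = rng.normal(size=(60, 1))
redundant = shared + 0.1 * rng.normal(size=(60, 3))
independent = rng.normal(size=(60, 3))
X = np.hstack([redundant, independent])

# PCA on standardized data: the cumulative explained-variance ratio
# shows how few components capture most of the signal, suggesting some
# parameters could be dropped or sampled less frequently.
pca = PCA().fit(StandardScaler().fit_transform(X))
cumvar = np.cumsum(pca.explained_variance_ratio_)
n_components_90 = int(np.searchsorted(cumvar, 0.90) + 1)
print(f"Components for 90% of variance: {n_components_90} of {X.shape[1]}")

# Hierarchical clustering of parameters by correlation distance groups
# the redundant parameters together, flagging consolidation candidates.
corr_dist = 1 - np.abs(np.corrcoef(X.T))
condensed = corr_dist[np.triu_indices(6, k=1)]
labels = fcluster(linkage(condensed, method="average"),
                  t=2, criterion="maxclust")
print("Parameter cluster labels:", labels)
```

In this toy setup the three runoff-driven parameters fall into one cluster, which is the kind of signal that would prompt a designer to ask whether all three need the same sampling effort.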
This session will then discuss how statistics should be considered not just as a design tool but also as a consumer of program data. This is the issue that originally brought the presenters to this topic, and they will provide several examples of how statistical models provide greater insight and understanding. For example, they will show how limits on spatial coverage and temporal frequency complicate even simple quantifications, such as the accumulation factors used to differentiate natural and anthropogenic contributions; how multivariate models can simplify exceedance investigations; and how exploratory factor analysis (EFA) can detect latent constructs not evident through simple analysis.
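To illustrate the latent-construct idea in miniature, the sketch below fits a one-factor EFA model to synthetic data in which a hypothetical latent driver (labeled here "runoff intensity" purely for illustration) influences several observed parameters while another parameter is unrelated:

```python
# Illustrative sketch: exploratory factor analysis (EFA) detecting a
# latent construct behind several observed parameters. Synthetic data;
# the parameter names are hypothetical stand-ins.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
n = 200

# A latent driver (e.g., "runoff intensity") that simultaneously
# influences several measured parameters.
latent = rng.normal(size=n)
turbidity = 0.9 * latent + 0.3 * rng.normal(size=n)
tss       = 0.8 * latent + 0.4 * rng.normal(size=n)
zinc      = 0.7 * latent + 0.5 * rng.normal(size=n)
ph        = rng.normal(size=n)  # unrelated parameter

X = StandardScaler().fit_transform(
    np.column_stack([turbidity, tss, zinc, ph]))

# Fit a single-factor model; the loadings show which parameters move
# together with the inferred latent factor.
fa = FactorAnalysis(n_components=1, random_state=0).fit(X)
loadings = fa.components_[0]
print("Factor loadings:", np.round(loadings, 2))
```

The runoff-driven parameters load strongly on the single factor while pH does not, which is the kind of pattern that flags a shared underlying cause not apparent from inspecting parameters one at a time.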
Rather than digging into the specifics of the statistical methods, this session will focus on providing attendees with a conceptual understanding of them. Through real-world examples, motivators and analogies, this presentation is intended to excite and embolden attendees to return home asking how they can iteratively improve the statistical soundness of their monitoring programs in the modern data-heavy, computationally driven world in which we now live.
Eric Bollens is the Chief Technology Officer of CloudCompli, Inc. Blending a decade of IT executive experience with a passion for the environment, he leads a team developing stormwater protection and compliance software for the modern data-driven “smart” city. CloudCompli’s platforms serve construction sites, industrial facilities and MS4s nationwide with mobile data collection, automated change response, actionable analytics, IoT integrations and more. Last year, Eric’s work, in collaboration with Orange County, was recognized as the first-place winner of the California Water Board’s Data Innovation Challenge.