Why standardization keeps failing at plant level
Across the water and wastewater sector, standardization has long been the default strategy. Instruments are specified centrally, control philosophies are copied from one site to the next, and digital tools are often deployed with the expectation that one configuration will fit many plants. This approach works reasonably well for hardware. It breaks down quickly when applied to process behavior and operational decision-making.
Every treatment plant is shaped by a unique combination of influent composition, hydraulic design, biology, industrial contributions, operating history, and local regulatory constraints. Two facilities a few kilometers apart can treat nominally similar wastewater streams and still behave very differently. Operators know this intuitively. Many digital solutions do not.
The result is a persistent gap between generic digital tools and the plant-specific insights operators actually need.
Traditional digital approaches and their limits
Most digital monitoring and optimization tools in water treatment still follow one of two models.
The first relies on standardized correlations. A sensor signal is mapped to a regulatory or process parameter using a fixed model, often derived from laboratory calibration. This can work under stable conditions but degrades as influent composition, seasonality, or process configuration changes. Maintaining accuracy requires frequent recalibration and operator effort.
The second model is plant-specific engineering. Custom models are built manually for each site, often by consultants or system integrators. These can deliver excellent results, but they do not scale. Each new plant requires significant engineering time, deep site knowledge, and ongoing maintenance. For most operators, this approach is too slow and too expensive to deploy broadly.
The industry has largely treated this as a binary choice: standardize and accept reduced relevance, or customize and accept limited scalability.
Scalable customization as a third path
In a recent discussion, Michael Kuhns, CEO of Liquisens, described a different approach: scalable customization. The concept starts from a simple premise. Decision support in water treatment must be customized to each plant, but the way that customization is built does not have to be reinvented every time.
Instead of standardizing outputs, this approach standardizes the internal data engine. Incoming data from existing instrumentation is structured into reusable building blocks. These building blocks are then combined and weighted differently for each plant, producing a model that is specific to that site and that process.
The internal logic is standardized. The resulting model is not.
This distinction matters. It allows hundreds of plants to be supported using the same underlying architecture while still respecting the fact that a predictive model for one facility will not work if simply transplanted to another. As Kuhns noted, even two plants receiving similar wastewater can produce different outcomes due to biology, hydraulics, or operational history. Treating them as interchangeable is a technical error, not a simplification.
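The split between a shared engine and plant-specific weights can be pictured with a minimal sketch. Everything here is invented for illustration: the building blocks, signal names, and synthetic data are assumptions, not Liquisens's actual architecture.

```python
import numpy as np

# A standardized library of feature "building blocks" shared across
# plants (hypothetical choices for illustration).
def rolling_mean(x, win=3):
    """Smooth a raw signal; one reusable building block."""
    return np.convolve(x, np.ones(win) / win, mode="same")

def rate_of_change(x):
    """Local slope of a signal; another reusable block."""
    return np.gradient(x)

BLOCKS = [lambda x: x, rolling_mean, rate_of_change]  # the shared engine

def build_features(signal):
    """Apply every standardized block to one raw signal."""
    return np.column_stack([b(signal) for b in BLOCKS])

def fit_plant_model(signal, target):
    """Plant-specific step: weight the shared blocks for this site."""
    X = build_features(signal)
    weights, *_ = np.linalg.lstsq(X, target, rcond=None)
    return weights

# Two plants receiving a similar influent signal but responding
# differently: one level-driven, one rate-driven (synthetic data).
t = np.linspace(0, 10, 200)
influent = np.sin(t) + 0.1 * np.random.default_rng(0).normal(size=t.size)
effluent_a = 0.8 * influent                # plant A: level-driven
effluent_b = 0.3 * np.gradient(influent)   # plant B: rate-driven

w_a = fit_plant_model(influent, effluent_a)
w_b = fit_plant_model(influent, effluent_b)
```

The same three blocks and the same fitting code serve both sites, yet the fitted weights come out different for each plant, which is the whole point: transplanting `w_a` onto plant B would fail.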
How data availability shapes the model
Scalable customization does not assume a perfect data environment. Most industrial and municipal plants already have extensive data sets, often hundreds of measurement points spanning flows, dosing rates, pH, conductivity, solids, and supervisory control signals. The challenge is not data scarcity, but relevance.

In cases where data coverage is limited, model precision is reduced. This is not hidden. Instead, uncertainty is explicitly acknowledged, and additional instrumentation may be recommended where it materially improves insight. In practice, Kuhns notes that the vast majority of sites already have sufficient data to support meaningful plant-specific models.
The key is correlation across parameters rather than dependence on a single measurement. By combining multiple process and quality signals, the model reflects how the plant actually behaves, not how it was designed to behave on paper.
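A toy comparison illustrates why combining correlated signals beats calibrating against a single measurement. The signal names, coefficients, and data below are synthetic assumptions, not the actual model or real plant data.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500

# Three hypothetical process signals (units in comments are assumed).
flow = rng.normal(100, 10, n)            # m3/h
conductivity = rng.normal(1.2, 0.1, n)   # mS/cm
dosing = rng.normal(5, 0.5, n)           # L/h

# Synthetic "true" COD that depends on all three signals plus noise.
cod = 0.5 * flow + 40 * conductivity - 3 * dosing + rng.normal(0, 2, n)

def r2(y, y_hat):
    """Coefficient of determination."""
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1 - ss_res / ss_tot

# Single-signal calibration: COD estimated from flow alone.
X1 = np.column_stack([flow, np.ones(n)])
b1, *_ = np.linalg.lstsq(X1, cod, rcond=None)
r2_single = r2(cod, X1 @ b1)

# Multi-signal model: the same data, all parameters combined.
Xm = np.column_stack([flow, conductivity, dosing, np.ones(n)])
bm, *_ = np.linalg.lstsq(Xm, cod, rcond=None)
r2_multi = r2(cod, Xm @ bm)
```

With the assumed coefficients, the single-signal fit explains roughly half the variance while the multi-signal fit explains most of it; the residual each single sensor cannot see is exactly what correlation across parameters recovers.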
Operational implications for operators and utilities
For operators, scalable customization changes the role of digital tools. Instead of delivering generic dashboards, the system provides predictions and alerts that are specific to their plant and to the parameters they care about. This can include regulatory indicators such as COD or nitrogen, as well as operational variables that drive chemical dosing, energy use, or process stability.
One municipal example discussed in the interview involved methanol dosing for denitrification. Without real-time insight, setpoints are typically conservative, leading to systematic overdosing. With plant-specific predictive insight, even a modest reduction in dosing can translate into meaningful cost savings without increasing compliance risk. The technical value lies not in the percentage reduction itself, but in the confidence that the reduction is appropriate for that specific plant.
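The economics of that dosing example can be put in back-of-the-envelope form. Every figure below (throughput, setpoint, methanol price, reduction) is an illustrative assumption, not a number from the interview.

```python
# Hypothetical annual cost of a conservative methanol setpoint,
# and the saving from a modest model-backed reduction.

flow_m3_per_day = 20_000      # assumed plant throughput, m3/day
dose_mg_per_l = 30.0          # assumed conservative methanol setpoint
methanol_eur_per_kg = 0.60    # assumed bulk methanol price

# mg/L times m3/day divided by 1000 gives kg/day.
daily_kg = flow_m3_per_day * dose_mg_per_l / 1000
annual_cost = daily_kg * methanol_eur_per_kg * 365

reduction = 0.10              # a "modest" 10 % dosing reduction
annual_savings = annual_cost * reduction
```

Under these assumptions the plant doses 600 kg/day, so even a 10 % reduction is worth on the order of 13,000 EUR per year; the size of the number matters less than knowing the reduction is safe for that specific plant.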
Just as importantly, this approach improves early warning. Laboratory results often arrive days after an exceedance has already occurred. Plant-specific predictive models can flag emerging risks hours in advance, when corrective action is still possible.
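One naive way to picture such an early warning is a trend extrapolation against a permit limit. This is a deliberately simple sketch with synthetic readings, not the predictive model discussed in the interview.

```python
import numpy as np

def hours_to_limit(recent_values, limit, dt_hours=1.0):
    """Fit a line to recent readings and return the projected hours
    until the permit limit is crossed, or None if the trend is
    flat or falling."""
    t = np.arange(len(recent_values)) * dt_hours
    slope, intercept = np.polyfit(t, recent_values, 1)
    if slope <= 0:
        return None
    current_fit = intercept + slope * t[-1]
    return (limit - current_fit) / slope

# Hourly effluent nitrogen readings creeping upward (synthetic, mg/L).
readings = [8.0, 8.4, 8.9, 9.3, 9.8]
eta = hours_to_limit(readings, limit=12.0)

# Raise an alert if the limit is projected to be crossed within 12 h.
alert = eta is not None and eta < 12
```

Here the trend projects a crossing in roughly five hours, which is the window in which corrective action is still possible; a laboratory result for the same exceedance would arrive days later.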
A shift in how digital tools are evaluated
Scalable customization challenges a long-standing assumption in the water sector: that standardization is always the safest path. For decision support, the opposite is often true. Over-standardization obscures the very differences that matter most at plant level.
The emerging alternative is not bespoke engineering for every site, but standardized engines that produce customized insight. This is a subtle but important shift. It aligns with how experienced operators already think about their plants, while offering a way to deploy advanced analytics at scale.
As regulatory pressure increases and operational margins tighten, tools that respect site-specific reality without becoming unmanageable will define the next phase of digital water management. Scalable customization is not a marketing term. It is a practical response to the complexity the industry has always known was there.