Predict Tomorrow’s Influent Flow With Today’s Data

Wastewater plant operators make important operational decisions based on the influent flow rate, yet despite the ample availability of sensors, there is no accurate industry-standard method for predicting the flow arriving at the plant.

Knowing the performance of a collection system is difficult because there are few industry-recognized benchmarks for what “performance” is and how it should be determined. Assessments of sewer collection system performance are often simply educated guesses. Quantifying the areas of highest inflow and infiltration (I&I) can be difficult due to large networks of pipes, the expense of flow monitoring, and varying weather conditions impacting soil saturation.

Municipal sanitary sewer collection and conveyance systems are an extensive, valuable, and complex part of the nation’s infrastructure. Collection systems consist of pipelines, conduits, pumping stations, force mains, and any other facility that collects wastewater and conveys it to facilities providing treatment prior to discharge to the environment.

Plant operators are responsible for ensuring there is enough treated water available for pumping into the distribution or discharge system, as well as enough water to maintain ongoing operations. When determining the influent rate, many operators overlook production water in addition to effluent pumping rates; accounting for both is what keeps treatment consistent.
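As a hedged illustration of that balance (the figures below are hypothetical), the influent a plant must sustain is roughly the effluent it pumps out plus the water the process itself consumes:

```python
# Hypothetical water balance: influent must cover effluent pumping plus
# in-plant production water (filter backwash, seal water, dilution, etc.).
effluent_pumped_mgd = 8.0    # pumped to distribution or discharge
production_water_mgd = 0.5   # water consumed by the treatment process itself

influent_needed_mgd = effluent_pumped_mgd + production_water_mgd
print(f"Minimum influent to sustain operations: {influent_needed_mgd} MGD")
```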

Influent flow rates are usually estimated by operators based on experience and local weather forecasts. These back-of-the-napkin calculations are necessary for master planning of the facility’s future. Future capacity needs and sizing, the plant’s ability to meet regulations going forward, and the expected timing of upgrades or new construction are all impacted by the irregular and unpredictable amount of influent entering a system.

The EPA estimates that the more than 19,000 collection systems across the country would have a replacement cost of $1-2 trillion. The collection system of a single large municipality can represent an investment worth billions of dollars. Yet the asset value of the collection system is usually not fully recognized, and collection system operation and maintenance programs are given low priority compared with wastewater treatment needs and other municipal responsibilities.

Typically, small amounts of infiltration and inflow are anticipated and tolerated. Yet unpredictable weather can increase this load and cause overflows. Managing these events is costly in terms of unplanned labor, repair of damaged equipment, and health and environmental impacts, sometimes incurring monetary fines and coverage on the evening news.

Sanitary sewer overflows (SSOs) are among the most serious and environmentally threatening problems: they are a frequent cause of water quality violations and a threat to public health and the environment. Beach closings, flooded basements, and overloaded treatment plants are some symptoms of collection systems with inadequate capacity or improper management, operation, and maintenance. The poor performance of many sanitary sewer systems, and the resulting health and environmental risks, highlights the need to optimize operation and maintenance of these systems.

Wastewater collection systems suffer from inadequate investment in maintenance and repair, due in large part to their “out-of-sight, out-of-mind” nature. The lack of proper maintenance has resulted in deteriorated sewers with subsequent basement backups, overflows, cave-ins, hydraulic overloads at treatment plants, and other safety, health, and environmental problems.

Managing these complex water systems has long relied on heavy physical infrastructure and reactive governance. This is changing with the development of cyber-physical systems, real-time monitoring, big data analysis, and machine learning, connected through advanced control systems and the Internet of Things (IoT). These “smarter” systems, in which technology, components, and devices talk to each other and share information in more sophisticated ways, bring about a more optimized, efficient process.

Data provided by weather radar are important in weather forecasting. Rainfall data are typically introduced to provide stormwater information at different locations in the vicinity of the wastewater treatment plant. Several consecutive days of rainfall appear to correlate with increased WWTP flows, a trend historically attributed to interflow.
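As a rough sketch of how to check this on your own records (the file and column names here are hypothetical, and your data layout will differ), one can correlate influent flow against multi-day rainfall accumulations at a range of lags:

```python
import pandas as pd

# Hypothetical daily records with 'date', 'rainfall_mm', 'influent_mgd' columns.
df = pd.read_csv("plant_history.csv", parse_dates=["date"], index_col="date")

# Multi-day accumulations often track interflow better than single-day totals.
df["rain_3day"] = df["rainfall_mm"].rolling(window=3).sum()

# Correlate today's flow with rainfall accumulated 0 to 7 days earlier.
for lag in range(8):
    corr = df["influent_mgd"].corr(df["rain_3day"].shift(lag))
    print(f"lag {lag} days: correlation = {corr:.2f}")
```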

Goals of prediction to prevent overflows:
  • Reduce ratepayer costs by implementing all cost-effective I&I reduction projects
  • Minimize liability from water pollution and public health risks by eliminating storm-related SSOs
  • Proactively reduce overall I&I to avoid capital costs of capacity expansion in anticipation of future population growth
  • Eliminate enough I&I to offset the environmental and regulatory impact of sewer system expansion and increased water demand

Though sensors helped combat overflows in South Bend, Indiana for a while, during a recent storm they could only report that they were being overwhelmed. Had the data from those sensors flowed into a system powered by artificial intelligence, operators could have had a forecast of that storm and might have been able to divert flow proactively in preparation.

Predictive influent flow rate information helps determine the most cost-efficient schedule for operating wastewater pumps. Pluto AI has developed a state-of-the-art prediction system that delivers a high-accuracy influent flow forecast based on weather forecasts, recent influent flow trends, and the hydraulics of the plant and sewer system.
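Pluto AI’s system is proprietary, but a minimal sketch of the general idea, using a generic regressor on lagged flows plus a rainfall forecast (all file and column names hypothetical), might look like this:

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

df = pd.read_csv("plant_history.csv", parse_dates=["date"], index_col="date")

# Lagged flows capture recent trends; the forecast column captures weather.
for lag in (1, 2, 3):
    df[f"flow_lag{lag}"] = df["influent_mgd"].shift(lag)
df["target"] = df["influent_mgd"].shift(-1)  # tomorrow's flow
df = df.dropna()

features = ["flow_lag1", "flow_lag2", "flow_lag3", "rainfall_forecast_mm"]
train, test = df.iloc[:-90], df.iloc[-90:]  # hold out the last 90 days

model = GradientBoostingRegressor().fit(train[features], train["target"])
print("R^2 on held-out days:", model.score(test[features], test["target"]))
```

A production system would also fold in sewer hydraulics and multiple rain gauges, but even a toy setup like this shows why recent trends plus weather beat a static estimate.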

To assess extraneous water entering your system, at least a year of influent flow data to the treatment facility should be examined; Pluto recommends two years (a rough screen of this kind is sketched below). Contact us to learn more about integrating predictive forecasting for overflow prevention into your system.
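As a hedged sketch of what such an assessment involves (column names hypothetical), one can establish a dry-weather baseline and treat flow above it as extraneous water:

```python
import pandas as pd

df = pd.read_csv("plant_history.csv", parse_dates=["date"], index_col="date")

# Call a day "dry" if no rain fell that day or in the two days prior.
dry = df["rainfall_mm"].rolling(window=3, min_periods=1).sum() == 0
baseline = df.loc[dry, "influent_mgd"].median()

# Flow above the dry-weather baseline approximates inflow and infiltration.
excess = (df["influent_mgd"] - baseline).clip(lower=0)
print(f"Dry-weather baseline: {baseline:.2f} MGD")
print(f"Estimated I&I volume over the record: {excess.sum():.0f} MG")
```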

Sources:
https://www.southbendtribune.com/news/local/south-bend-s-smart-sewers-overwhelmed-by-floodwaters/article_cb75b63c-aaa9-5b39-9c9c-df4fcd2b62b3.html
https://www.mass.gov/eea/docs/dep/water/laws/i-thru-z/omrguide.pdf
https://www.globalw.com/support/inflow.html
https://www.ce.utexas.edu/prof/maidment/giswr2012/TermPaper/Boersma.pdf
https://www.mountainview.gov/civicax/filebank/blobdload.aspx?blobid=6979

3 Reasons Why We Need Deep Learning For Water Analytics

Over the past few years, the business world has entered a frenzy around buzzwords like “analytics,” “big data,” and “artificial intelligence.” There are two key elements to this phenomenon. First, the amount of data being generated has exploded. Second, effective marketing campaigns have fueled the “analytics” frenzy. In many cases, businesses and utilities don’t even know why they need or want hardware (sensors, meters) that collects data every 15 seconds. Even when they collect that data, they are not sure why they need an analytical software component to study the abundance of it. Business and utility managers simply want to increase revenue and decrease costs, and don’t care about the specifics.

Unfortunately, all this frenzy opens the door to charlatans who just want to create noise, and it prevents end users from reaching their full business potential. Why? Because unsuspecting customers may end up purchasing poor analytics solutions, conclude that analytics just doesn’t work, and revert to their old, inefficient ways.

Aren’t all analytics solutions equivalent?

Not at all! This is true for a variety of reasons, but let’s go through some of the key attributes of the most popular analytics solutions provided to end users today. We promise to not go too far down the technical rabbit hole.

The most common technique you’ll come across is conditional monitoring: simply watching the values coming from sensors and taking action based on simple thresholding. As you can imagine, this is not sufficient at all. Setting thresholds manually and hoping that nothing goes wrong is like walking blindfolded in the middle of a busy freeway.
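In code, conditional monitoring amounts to a few lines; the alarm value below is a hypothetical, hand-picked number, which is exactly the problem:

```python
FLOW_ALARM_MGD = 25.0  # static threshold, chosen manually by an operator

def check_reading(flow_mgd: float) -> None:
    """Alert when a sensor value crosses the fixed threshold."""
    if flow_mgd > FLOW_ALARM_MGD:
        print(f"ALERT: flow {flow_mgd} MGD exceeds {FLOW_ALARM_MGD} MGD")

check_reading(27.3)  # fires, but offers no context, lead time, or adaptation
```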

How about extracting rolling stats?

Extracting rolling stats refers to calculating metrics in real time over a sliding time window. You can compute things like the rolling mean, variance, or autocorrelation. But these metrics are simplistic and don’t tell you much about the data. You will not be able to infer anything about cause and effect, which is where all the money is.
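A few lines of pandas, run against a hypothetical flow series, cover essentially everything this technique offers:

```python
import pandas as pd

flow = pd.read_csv("plant_history.csv", parse_dates=["date"],
                   index_col="date")["influent_mgd"]

window = 7  # e.g., one week of daily readings
print(flow.rolling(window).mean().tail())  # rolling mean
print(flow.rolling(window).var().tail())   # rolling variance
print(flow.autocorr(lag=1))                # lag-1 autocorrelation
```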

You can get a bit more sophisticated and build autoregressive models that analyze timestamped data. The problem with autoregressive models is that they assume the current output is a direct result of the previous ‘n’ values, and neither the value of ‘n’ nor the relationship with those values is allowed to evolve over time. Other classical machine learning techniques impose similar restrictions. It’s like being forced to stick with the shoe size you had when you were 12 years old: it’s not going to fit for the rest of your life!
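For concreteness, here is a minimal AR(n) fit with statsmodels on the same hypothetical series; note that both ‘n’ and the coefficients are frozen once fitted:

```python
import pandas as pd
from statsmodels.tsa.ar_model import AutoReg

flow = pd.read_csv("plant_history.csv", parse_dates=["date"],
                   index_col="date")["influent_mgd"]

model = AutoReg(flow.to_numpy(), lags=7).fit()  # n = 7: one week, fixed forever
print(model.params)                             # coefficients never evolve

# Forecast the next 7 days as a fixed linear function of the last 7 values.
print(model.predict(start=len(flow), end=len(flow) + 6))
```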

One technique to rule them all

This is where Deep Learning becomes really relevant. If we were to summarize the shortcomings of all those techniques:

  • The time difference between cause and effect has to be small (and not variable)
  • The relationship of the current output (effect) with the previous input measurements (cause) is not allowed to evolve with time
  • The current output (effect) is not dependent on previous outputs (effects)

Deep Learning is really good at solving these problems. Due to the inherent nature of deep neural networks, there’s very little manual intervention. This allows the engine to train itself very efficiently and solve problems with high accuracy.
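As a minimal sketch (not Pluto AI’s actual architecture, and with a hypothetical input file), a small recurrent network learns its own memory of the series instead of being handed a fixed ‘n’:

```python
import numpy as np
from tensorflow import keras

def make_windows(series: np.ndarray, n_steps: int = 30):
    """Slice a 1-D series into (window, next-value) training pairs."""
    X = np.array([series[i:i + n_steps] for i in range(len(series) - n_steps)])
    return X[..., np.newaxis], series[n_steps:]  # (samples, timesteps, features)

series = np.loadtxt("influent_flow.csv")  # hypothetical 1-D flow history
X, y = make_windows(series)

model = keras.Sequential([
    keras.Input(shape=(X.shape[1], 1)),
    keras.layers.LSTM(32),   # learned, evolving memory of the sequence
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=10, batch_size=32)
print(model.predict(X[-1:]))  # next-step forecast from the latest window
```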

Moving forward

Now, when businesses and utilities need to solve difficult business intelligence problems, they can make an informed judgment about what analytics solutions offer. As stated above, the various solutions have pros and cons, but the technique that stands out in efficacy, speed, and quality is Deep Learning. The good thing is that customers don’t need to know anything about Deep Learning in order to use it. All they need to know is that it’s like Winston Wolf from Pulp Fiction … It solves problems!

Deep Learning and the Water Industry

For years, the water industry has been thought of as a slow-moving sector that’s resistant to change. Water utilities are accumulating vast amounts of new data that could be used to create unforeseen jumps in operational efficiency and margins, but it’s difficult for startups to build creative solutions, test them, and iterate on them quickly because the industry doesn’t want to change the status quo. This creates an unfortunate barrier for modern technologies entering the water market. Why is this relevant now? Why should we care?

Winter is coming

After years of prolonging and promoting the status quo, time and change seem to be catching up with the industry. A change appears to be on the horizon, not only technological but also psychological. Two key elements have sparked this potential inflection point: 1) the rapid decay of our nation’s water infrastructure, and 2) the proliferation of low-cost, internet-connected devices.

Pipes seem to work just fine. What’s the big deal?

A large portion of our nation’s water infrastructure is approaching or has passed the end of its useful life. One might say: so what? Well, this decaying infrastructure wastes water resources through leakage and pipe bursts, and it contributes to the introduction of harmful elements into the nation’s drinking water; look no further than the lead crisis in Flint, Michigan. Not only is it irresponsible to waste our most precious resource, it’s dangerous too.

Where’s the data?

In addition to replacing physical infrastructure elements like pipes, one might also wonder about the IT infrastructure. Luckily, thanks to Moore’s Law, we have seen an amazing increase in processing power coupled with an equally amazing decrease in prices, especially for hardware devices. Judging by the spread of sensors, smart meters, and the like, the age of internet-connected devices is upon us. This ecosystem of connected devices is collectively referred to as the Internet of Things (IoT), and it allows the industry to collect, analyze, and act upon real-time data flowing into its IT systems.

How do we analyze that data?

Internet-connected devices generate a lot of data continuously. One might wonder: why do we even need fancy techniques to analyze it? Why can’t we just use thresholding and call it a day? Well, the good ol’ way of using manual thresholds to make huge business decisions is no longer sufficient. The complexity of modern data far exceeds what such simplistic techniques can handle. We need a machine that can analyze sequential data and extract relevant insights from it: one capable of adapting to shifting baselines, handling prolonged delays between cause and effect, learning to detect new anomalies, and so on. A human looking at spreadsheets and manual processes is not going to manage your modern infrastructure. This is where Deep Learning becomes extremely relevant. People tend to think of it as dark magic, but it is actually a remarkably effective tool that understands sequential data from sensors like no other technique before it. It’s beautiful in so many ways!
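One hedged sketch of what “adaptive” looks like in practice: score each reading by how much it surprises a sequence model (such as the LSTM sketched earlier), using robust statistics so the outliers don’t mask themselves. The numbers below are made up for illustration:

```python
import numpy as np

def anomaly_scores(actual: np.ndarray, predicted: np.ndarray) -> np.ndarray:
    """Robust residual scores: deviation from the median residual, in MADs."""
    resid = actual - predicted
    dev = np.abs(resid - np.median(resid))
    return dev / np.median(dev)

# 'predicted' would come from a model retrained on recent data, so the
# baseline it expects shifts along with the system itself.
actual = np.array([10.1, 10.3, 10.2, 18.9, 10.4])
predicted = np.array([10.0, 10.2, 10.3, 10.3, 10.2])
print(anomaly_scores(actual, predicted) > 3.5)  # flags only the 18.9 reading
```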

Moving forward

As of right now, the world is only in the fourth inning of the IoT revolution, and the US water industry may be even further behind. With that said, the future looks bright when one considers the power and responsiveness of the real-time monitoring that IoT devices offer. Additionally, as the water industry’s analytical sophistication and mindset mature, utilities will be able to turn these data streams into predictive insights in addition to reactive monitoring. Some areas of opportunity include predictive asset management, anomaly detection, demand forecasting, and operational efficiency.