Influent Flow Forecasting Made Easy

Like the wastewater industry, most food and beverage manufacturing facilities are equipped with massive data systems to monitor and optimize a wide range of operations. These similarly regulated industries are increasingly incorporating Artificial Intelligence (A.I.) into their processes to better manage systems and procedures.

Though many water industry professionals recognize the potential of A.I., the public health implications of treating wastewater to the highest standards, combined with aging production infrastructure, mean that municipal operators and engineers have not yet enjoyed the same benefits from these technologies.

Several large corporations have invested heavily in broad “solutions” to the challenges of the water production industries. Yet these systems have been hit or miss because of the wide range of data streams and the particularities of individual plants across the water industries.

For decades, water treatment process decisions have been made by plant operators based on information spread across a wide range of systems. Calculations are often made by hand, and cautious decisions are chosen to avoid a vast array of potential risks, often without regard to cost or process efficiency. Recognizing patterns in system behavior is nearly impossible when a rotating cast of staff administers multiple machines on an irregular basis.

What if there were a way to recognize those risks and achieve optimal efficiency, addressing the specific challenges faced by an individual plant, without additional infrastructure investment?

One of the many benefits of the marriage of machine learning and Artificial Intelligence, as utilized by Pluto AI, is the ability to recognize the differences in individual system behavior and processes, enabling more informed decisions that improve plant efficiency while controlling for potential risks.

Utilizing the existing data from each individual plant, the EZ Influent Flow Predictor forecasts influent flow and detects anomalies to help operators predict future plant behavior and upcoming challenges. The machine learning component of our proprietary algorithms continuously learns from the existing data that impacts incoming flow, while the Artificial Intelligence component maps that data into actionable insights, helping operators determine the best course of action given the risk factors present.
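To make the idea concrete, here is a minimal sketch of the general technique: forecast the next reading from recent history, then flag readings that deviate sharply from the forecast. This is an illustration only, not Pluto AI's proprietary algorithm; the synthetic data, 24-hour lag window, and four-sigma threshold are all assumptions made for the example.

```python
# Illustrative sketch only -- not Pluto AI's proprietary algorithm.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def make_lagged_features(flow, n_lags=24):
    """Build (X, y) pairs where each row of X holds the previous n_lags readings."""
    X = np.array([flow[i:i + n_lags] for i in range(len(flow) - n_lags)])
    y = flow[n_lags:]
    return X, y

rng = np.random.default_rng(0)
hours = np.arange(24 * 60)
# Synthetic stand-in for plant SCADA data: a daily cycle plus noise.
flow = 10 + 3 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 0.3, hours.size)

X, y = make_lagged_features(flow)
model = GradientBoostingRegressor().fit(X[:-100], y[:-100])  # train on history

preds = model.predict(X[-100:])     # forecast the held-out window
residuals = y[-100:] - preds
threshold = 4 * residuals.std()     # flag four-sigma deviations
anomalies = np.where(np.abs(residuals) > threshold)[0]
print(f"Flagged {anomalies.size} anomalous readings")
```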

Our unique system of dashboard insights and alerts has helped customers achieve compliance and save thousands in operational costs. A pilot version of the EZ Influent Flow Predictor is available for free to a limited number of treatment plants; learn more about how to enroll.

3 Reasons Why We Need Deep Learning For Water Analytics

Over the past few years, the business world seems to have entered a frenzy around buzzwords like “analytics,” “big data,” and “artificial intelligence.” There are two key elements to this phenomenon. First, the amount of data generated has exploded. Second, effective marketing schemes have fueled the “analytics” frenzy. In many cases, businesses and utilities don’t even know why they need or want hardware (sensors, meters) that will allow them to collect data every 15 seconds. Even when they do collect it, they are not sure why they need an analytical software component to study the abundance of data. Business and utility managers simply want to increase revenue and decrease costs; they don’t care about the specifics.

Unfortunately, all this frenzy allows for the entry of charlatans who just want to create noise. This, in turn, prevents end users from reaching their full business potential. Why? Because unsuspecting customers may end up purchasing poor analytics solutions, conclude that analytics just doesn’t work, and revert to their old, inefficient ways.

Aren’t all analytics solutions equivalent?

Not at all! There are a variety of reasons why, but let’s just go through some of the key attributes of the most popular analytics solutions provided to end users today. We promise not to go too far down the technical rabbit hole.

The most common technique that you’ll come across is conditional monitoring. This is just monitoring the values coming from sensors and taking action based on some simple thresholding. As you can imagine, this is not sufficient at all. Setting thresholds manually and hoping that nothing goes wrong is like walking blindfolded in the middle of a busy freeway.
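For illustration, here is roughly what conditional monitoring boils down to in code; the flow units and alarm limits below are hypothetical, chosen by hand exactly as the technique requires.

```python
# Conditional monitoring in its entirety: fixed, hand-picked limits.
HIGH_LIMIT = 15.0  # MGD -- hypothetical high alarm limit
LOW_LIMIT = 2.0    # MGD -- hypothetical low alarm limit

def check_reading(flow_mgd: float) -> str:
    """Compare one sensor reading against static thresholds."""
    if flow_mgd > HIGH_LIMIT:
        return "ALARM: flow above high limit"
    if flow_mgd < LOW_LIMIT:
        return "ALARM: flow below low limit"
    return "OK"

print(check_reading(18.2))  # -> ALARM: flow above high limit
```

Everything hinges on two numbers someone picked by hand; if conditions drift, the limits quietly stop being meaningful.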

How about extracting rolling stats?

Extracting rolling stats refers to calculating metrics in real time over a sliding time window. You can also extract things like variance or autocorrelation. But these metrics are simplistic and don’t tell you much about the data. You will not be able to infer anything about cause and effect, which is where all the money is.
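As a quick sketch of what this looks like in practice (synthetic data, an arbitrary 24-sample window), pandas makes each rolling statistic a one-liner:

```python
# Rolling mean, variance, and lag-1 autocorrelation over a 24-sample window.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
flow = pd.Series(10 + rng.normal(0, 1, 500))  # synthetic sensor series

rolling_mean = flow.rolling(24).mean()
rolling_var = flow.rolling(24).var()
rolling_autocorr = flow.rolling(24).apply(lambda w: w.autocorr(lag=1))

print(rolling_mean.iloc[-1], rolling_var.iloc[-1], rolling_autocorr.iloc[-1])
```

Easy to compute, but none of these numbers says anything about what caused a change.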

You can get a bit more sophisticated and build autoregressive models that can analyze timestamped data. The problem with autoregressive models is that they assume the current output is a direct result of the previous ‘n’ values, and neither the value of ‘n’ nor the relationship with those values is allowed to evolve over time. Other machine learning techniques impose similar restrictions. It’s like forcing you to stick with the shoe size you wore when you were 12 years old. It’s not going to fit for the rest of your life!
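A minimal sketch of that restriction, using the AutoReg class from statsmodels on synthetic data: the lag order ‘n’ is fixed up front, and the fitted coefficients never change afterwards.

```python
# AR(n): the current value is a fixed linear function of the previous n values.
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(2)
t = np.arange(1000)
series = 10 + 3 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 0.3, t.size)

model = AutoReg(series, lags=24).fit()  # 'n' = 24, chosen once, frozen forever
forecast = model.predict(start=len(series), end=len(series) + 11)
print(forecast)  # 12-step-ahead forecast from the frozen coefficients
```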

One technique to rule them all

This is where Deep Learning becomes really relevant. If we were to summarize the shortcomings of all those techniques:

  • The time difference between cause and effect has to be small (and not variable)
  • The relationship of the current output (effect) with the previous input measurements (cause) is not allowed to evolve with time
  • The current output (effect) is not allowed to depend on previous outputs (effects)

Deep Learning is really good at solving these problems. Due to the inherent nature of deep neural networks, very little manual intervention is required: the network learns which past inputs matter, how far back to look, and how those relationships change over time. This allows the engine to train itself very efficiently and solve problems with high accuracy.
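As a hedged illustration (a toy sketch, not a production model), a recurrent network such as an LSTM learns the lag structure from data instead of having it fixed up front; the window length, layer size, and training settings below are arbitrary choices for the example.

```python
# Toy LSTM forecaster: the network learns which past inputs matter.
import numpy as np
from tensorflow import keras

rng = np.random.default_rng(3)
t = np.arange(5000)
series = 10 + 3 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 0.3, t.size)

window = 48  # give the network more history than it strictly needs
X = np.array([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]
X = X[..., np.newaxis]  # shape: (samples, timesteps, features)

model = keras.Sequential([
    keras.layers.Input(shape=(window, 1)),
    keras.layers.LSTM(32),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=2, batch_size=64, verbose=0)

print(model.predict(X[-1:], verbose=0))  # next-step forecast
```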

Moving forward

Now, when businesses and utilities need to solve difficult business intelligence problems, they will have a clearer understanding of what analytics solutions can offer. As stated above, there are pros and cons to the various solutions, but the technique that stands out in efficacy, speed, and quality is Deep Learning. The good thing is that customers don’t need to know anything about Deep Learning in order to use it. All they need to know is that it’s like Winston Wolf from Pulp Fiction … it solves problems!

Deep Learning and the Water Industry

For years, the water industry has been thought of as a slow-moving sector that’s resistant to change. Water utilities are accumulating vast amounts of new data that could be used to create unforeseen jumps in operational efficiencies and margins, but it’s difficult for startups to build creative solutions and iterate on them quickly when the industry doesn’t want to change its status quo. This creates an unfortunate barrier for modern technologies entering the water market. Why is this relevant now? Why do we need to care?

Winter is coming

After years of prolonging and promoting the status quo, time and change seem to be catching up with the industry. A change appears to be on the horizon, not only technological but also psychological. Two key elements have sparked this potential inflection point within the industry: 1) the rapid decay of our nation’s water infrastructure, and 2) the proliferation of low-cost internet-connected devices.

Pipes seem to work just fine. What’s the big deal?

A large portion of our nation’s water infrastructure is approaching or has already passed the end of its useful life. One might say: so what? Well, this decaying infrastructure wastes water through leakage and pipe bursts, and it contributes to the introduction of harmful elements into the nation’s drinking water; look no further than the lead crisis in Flint, Michigan. Not only is wasting our most precious resource irresponsible, it’s dangerous too.

Where’s the data?

In addition to replacing physical infrastructure elements like pipes, one might also wonder about the IT infrastructure. Luckily, thanks to Moore’s Law, we have seen an amazing increase in processing power coupled with an equally amazing decrease in prices, especially for hardware devices. The age of internet-connected devices (sensors, smart meters, and so on) is upon us. This ecosystem of internet-connected devices is collectively referred to as the Internet of Things (IoT), and it allows the industry to collect, analyze, and act upon real-time data flowing into its IT systems.

How do we analyze that data?

These internet-connected devices generate data continuously. One might wonder: why do we even need fancy techniques to analyze it? Why can’t we just use thresholding and call it a day? Well, the good ol’ ways of using manual thresholds to make huge business decisions are no longer sufficient. The complexities of modern data far exceed those simplistic techniques. We need a machine that can analyze sequential data and extract relevant insights from it: one capable of adapting to shifting baselines, handling prolonged delays between cause and effect, learning to detect new anomalies, and so on. A human looking at spreadsheets and manual processes is not going to help you manage modern infrastructure. This is where Deep Learning becomes extremely relevant. People tend to think of it as some dark magic, but it is actually a remarkably effective tool that understands sequential data from sensors like no other technique before it. It’s beautiful in so many ways!
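To make “adapting to shifting baselines” concrete, here is a deliberately simple sketch; it is not deep learning, just the smallest possible illustration of the behavior a good sequential model must exhibit. The smoothing factor and sigma multiplier are arbitrary choices.

```python
# Adaptive alarms: track an exponentially weighted baseline and spread,
# and flag readings that stray several "sigmas" from the current baseline.
import numpy as np

def adaptive_alarms(readings, alpha=0.05, n_sigma=4.0):
    mean, var = readings[0], 1.0  # arbitrary initial spread
    alarms = []
    for i, x in enumerate(readings[1:], start=1):
        if abs(x - mean) > n_sigma * np.sqrt(var):
            alarms.append(i)
        # Update the baseline either way, so it tracks gradual shifts.
        mean = (1 - alpha) * mean + alpha * x
        var = (1 - alpha) * var + alpha * (x - mean) ** 2
    return alarms

rng = np.random.default_rng(4)
data = np.concatenate([rng.normal(10, 0.5, 300),
                       rng.normal(14, 0.5, 300)])  # the baseline shifts
data[450] = 30.0  # an injected spike
print(adaptive_alarms(data))  # alarms around the shift, then at the spike
```

A fixed threshold set for the old baseline would either miss the spike or alarm forever after the shift; an adaptive monitor recalibrates itself.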

Moving forward

As of right now, the world is only in the 4th inning of the IoT revolution, and the US water industry might be even further behind. With that said, the future looks bright when one considers the power and responsiveness of the real-time monitoring capabilities that IoT devices offer. Additionally, as the water industry’s analytical sophistication and mindset mature, utilities will be able to leverage these data streams for predictive insights in addition to reactive monitoring. Some areas of opportunity include predictive asset management, anomaly detection, demand forecasting, and operational efficiency.