100th Episode Of The Dan Smolen Podcast

Prateek Joshi, Founder and CEO of Plutoshift, discusses how A.I. makes the world a better place on the 100th episode of The Dan Smolen Podcast, a leading podcast covering future-of-work and meaningful-work topics and trends.

In this episode, Prateek:

  • Describes Plutoshift and his role in the company. Starts at 3:03
  • Defines A.I. and contrasts it with Machine Learning. Starts at 3:51
  • Addresses workforce concerns that A.I. takes jobs away from people. Starts at 8:52
  • Illustrates how Plutoshift helps clients involved with providing clean and potable water. Starts at 13:03
  • Identifies the training and advanced skill that he seeks in hired talent. Starts at 20:25
  • Tells us how, beyond his work, he adds fun and enjoyable activity to each day. Starts at 27:59


Listen to the A.I. and the Future of Work podcast.

5 Things To Consider When Implementing Advanced Analytics For Industrial Processes

In the previous blog post, we talked about how to measure the success of an asset performance monitoring solution. With all the buzz around AI and machine learning, we at Plutoshift hear questions about what machine learning analytics can actually do. The quick answer is: a lot. The longer and more important answers are the subject of blog post #2 of this series. What factors should you consider when you’re implementing advanced analytics for industrial processes?

When thinking about introducing these new technologies to your company, here are the 5 considerations that will help:

1. What are the specific business goals that AI can solve?

This may sound obvious, but not identifying a key business pain point to solve is frequently the reason pilots do not progress. Even when they appear successful, they will stall at some point. Exploring new technologies and how to improve your business is the sign of a vibrant company.

However, when a pilot flies under the radar of executive awareness, taking it to the next level becomes difficult. A business objective that’s stated from the outset will greatly improve your odds. This quote from a savvy Utilities Manager is spot on:

Well, I guess it’s good to know if I needed to know it.

Some examples of business pain points that can get the right attention from the outset are:

  • Reduce unplanned downtime: You can forecast performance metrics and schedule maintenance to reduce downtime
  • Reduce energy costs: You can take advantage of off-peak energy prices
  • Reduce production material cost: You can lower chemical dosing amounts

2. What improvement in process will be attained?

When pilots succeed but don’t progress, it’s usually because the results were not very exciting. This doesn’t mean that the results must be a slam dunk. In fact, some of the most valuable outcomes occur when performance improvements aren’t obtained but a clear reason is determined for why they didn’t happen. Identifying where to invest, with reasonable certainty of improved results, is an outstanding thing to learn.

Typically, new technology investigations have a champion at the company. Since you’re reading this article, perhaps that’s you! Your vision is vital to a successful enterprise.

The challenge is to find a project with which everyone is comfortable. The idea of getting some kind of pilot just to get an evaluation started seems reasonable. Yet, in these situations, buy-in is hard to come by. Pilots take up people’s time and goodwill runs short. You as the champion get tired of carrying the project alone. When a pilot is complete most of us are happy to be done with it. We are not all that excited to dive back in unless there is something to really entice us.

This is where concrete meaningful goals become important. Without the expectation of a real payoff, it’s hard to progress. This is certainly true with AI solutions but generally true with any project. Your vendor should be leading this improvement charge. If they can’t, consider this before making a commitment. As one old pool player, who also happens to be a Director of Plant Operations, said to me:

Call your shots! If you don’t, it really doesn’t matter whether you make it or not.

3. What access to data do you have to support the considered project?

This is specifically an AI project concern. As far as data is concerned, there are three key aspects that form the backbone of an AI project — quantity, quality, and access. AI projects use historical data to train algorithms that can predict future outcomes.

More data is almost always better. It may not all be used, but data scientists will want to tease out any correlations and look for causal effects. A lack of data certainly makes the project challenging, but it does not mean that the project goals cannot be met.

Gaps in data can be overcome, and so can the lack of one or more sensor inputs. This is the type of initial investigation a data science team can do for you. More on this in blog #3 of this series.
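
As a rough illustration of this kind of initial data audit, the sketch below profiles a historical sensor export for quantity, quality, and gaps. The file name and column layout are hypothetical, and this is only a starting point, not a full data assessment.

    import pandas as pd

    # Load historical sensor data (hypothetical file and column names).
    df = pd.read_csv("sensor_history.csv", parse_dates=["timestamp"])
    df = df.set_index("timestamp").sort_index()

    # Quantity: how much history is actually available?
    print("Date range:", df.index.min(), "to", df.index.max())
    print("Total rows:", len(df))

    # Quality: what fraction of readings is missing per sensor column?
    print(df.isna().mean().rename("fraction_missing"))

    # Gaps: intervals where no data arrived for more than an hour.
    deltas = df.index.to_series().diff()
    gaps = deltas[deltas > pd.Timedelta(hours=1)]
    print(f"{len(gaps)} gaps longer than 1 hour; largest: {deltas.max()}")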

4. Do you have a combination of data scientists and subject matter experts for the project proposed?

I spoke above to the role of data scientists in this process. Equally important is strong collaboration between the data scientists and the subject matter expert (SME) who understands the process to be optimized. Without this, the project will likely not be successful.

This combination is also important because it is rare. Several solution providers offer good AI expertise, and others offer subject matter expertise. These types of projects, at least for the next couple of years, will require both. Both parties should be held equally responsible for a successful outcome.

5. How to assess a potential solution provider?

After you’ve checked all the points above, there’s still the need to evaluate the plan and execute the project. Should you find a pure analytics company when you have your own subject matter expertise? Rely on a consulting engineering firm to organize the project? Get a one-stop vendor to do the whole thing? All of these are viable options. The key is to know that the analysis can be done. This is not guaranteed, because historical data is crucial.

This means that the data analysis should at least be completed and vetted initially. Can your team or your provider tell you within certain limits that this analysis will yield prescriptive recommendations that will meet the goals of the project?

However you combine the resources to execute this project, the initial analysis should come at little to no cost. You can call it Phase Zero of data analysis. If a sizable payment must be made before any data analysis occurs, it means you’re funding the learning curve of whoever required the purchase order.

3 Questions to Ask Yourself for Improved Membrane Performance

Operating any membrane system at its points of highest efficiency and lowest cost requires a delicate balance: cleaning membranes too frequently reduces system lifespan, while cleaning too infrequently reduces product quality and increases energy costs. Proper maintenance of membranes is not a one-size-fits-all approach, as each system and each train is unique based on its purpose, age, and placement within a treatment system.

What if it were possible to detect early warning signs of fouling to minimize the amount of time troubleshooting the system?

Maintaining membrane systems on their unique cycles and fouling rates, rather than a manufacturer’s specified time-frame, allows for maximization of operating conditions and total profits.

The challenge of membrane longevity and integrity is that each system design is unique to its plant location and objective. These factors also depend on the feed water source and the target product water quality. Plant managers and service engineers are required to maintain, and when possible reduce, total O&M and energy costs in order to achieve product margins.

How can I predict the best cleaning schedule for my membrane systems?

The challenges of each specific train depend on regional water quality, past performance, and energy use, and analyzing them involves complicated equations with a wide variety of factors. Applying data science to a plant’s existing data streams can provide insights that predict the ideal time to clean and service a membrane, improving and extending the performance and life cycle of membrane systems and helping manage these costs.
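
As a toy illustration of that idea (not Pluto’s actual model), one can fit a linear fouling trend to normalized permeability readings and project when the trend will cross a cleaning threshold; the numbers below are synthetic.

    import numpy as np

    # Synthetic daily normalized permeability readings for one RO train,
    # declining as the membrane fouls.
    rng = np.random.default_rng(0)
    days = np.arange(30)
    permeability = 100 - 0.8 * days + rng.normal(0, 1.5, size=30)

    # Fit a linear fouling trend.
    slope, intercept = np.polyfit(days, permeability, deg=1)

    # Project the day the trend crosses a cleaning threshold, e.g. 70% of
    # baseline, a plausible trigger for a clean-in-place cycle.
    threshold = 70.0
    days_to_clean = (threshold - intercept) / slope
    print(f"Fouling rate: {slope:.2f} units/day")
    print(f"Projected cleaning on day {days_to_clean:.0f}")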

The unexpected shutdown of a membrane system can be a catastrophe for any processing plant. This can be due to the product water quality deteriorating or having to discharge the system to identify a membrane problem.

What if you could have peace-of-mind that each system was being maintained regularly and have remote monitoring to oversee the entire operation?

Remote monitoring centers now have the opportunity to use Big Data and informed decision-making to collaborate with service engineers in the field and add to the value delivered. Pluto’s predictive analytics dashboard provides data analytics and actionable insights that help these organizations optimize how they maintain a global fleet of membrane systems 365 days a year.

Influent Flow Forecasting Made Easy

Like the wastewater industry, most food and beverage manufacturing facilities are equipped with massive data systems to monitor and optimize the wide range of operations. These similarly regulated industries are increasingly adopting Artificial Intelligence (A.I.) into their processes to better manage systems and procedures.

Though many water industry professionals recognize the potential of A.I., because of the public health implications of delivering top-quality treated wastewater and the age of production infrastructure, municipal operators and engineers have not yet enjoyed the same benefits from these technologies.

Several large corporations have invested heavily to develop broad “solutions” to address the challenges of water production industries. Yet, these systems have been hit or miss due to the wide range of data streams and particularities within plants across the water industries.

For decades, water treatment process decisions have been made by plant operators based on information spread across a wide range of systems. Calculations are often made by hand, and cautious options are chosen to avoid the vast array of potential risks, often without regard to cost or process efficiency. Recognizing patterns in system behavior is nearly impossible when a rotating cast of staff is tasked with administering multiple machines on an irregular basis.

What if there was a way to recognize the risks and achieve optimal efficiencies that could address the specific challenges faced by an individual plant, without additional infrastructure investment?

One of the many benefits of combining machine learning and Artificial Intelligence, as Pluto AI does, is the ability to recognize differences in individual system behavior and processes, enabling more informed decisions that improve plant efficiency while controlling for potential risks.

Utilizing the existing data from each individual plant, the EZ Influent Flow Predictor forecasts influent flow and detects anomalies to help operators predict future plant behavior and upcoming challenges. The machine learning aspect of our proprietary algorithms analyzes and continuously learns from the existing data that impacts incoming flow, while the Artificial Intelligence maps out the data to provide actionable insights, helping operators determine the best course of action based on the range of potential risk factors present.
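
To give a flavor of the anomaly-detection half of this (the production algorithms are proprietary), a rolling z-score over recent history can flag influent readings that deviate sharply from the learned baseline; the file and column names here are hypothetical.

    import pandas as pd

    # Hourly influent flow readings (hypothetical file and columns).
    flow = pd.read_csv("influent_flow.csv", parse_dates=["timestamp"],
                       index_col="timestamp")["flow_mgd"]

    # Rolling baseline over the previous 7 days of hourly data.
    window = 24 * 7
    mean = flow.rolling(window).mean()
    std = flow.rolling(window).std()

    # Flag readings more than 3 standard deviations from the baseline.
    z = (flow - mean) / std
    anomalies = flow[z.abs() > 3]
    print(anomalies.head())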

Our unique system of dashboard insights and alerts has helped customers achieve compliance and save thousands in operational costs. A pilot version of the EZ Influent Flow Predictor is available for free to a limited number of treatment plants; learn more about how to enroll.

Predict Tomorrow’s Influent Flow With Today’s Data

Wastewater plant operators make important operational decisions based on the influent flow rate to the plant, yet despite the ample availability of sensors, there is no accurate industry standard for predicting that flow rate.

Knowing the performance of a collection system is difficult because there are few industry-recognized benchmarks for what “performance” is and how it should be determined. Assessments of sewer collection system performance are often simply educated guesses. Quantifying the areas of highest inflow and infiltration can be difficult due to large networks of pipes, the expense of water monitoring, and varying weather conditions impacting soil saturation.

Municipal sanitary sewer collection and conveyance systems are an extensive, valuable, and complex part of the nation’s infrastructure. Collection systems consist of pipelines, conduits, pumping stations, force mains, and any other facility collecting wastewater and conveying it to facilities that provide treatment prior to discharge to the environment.

Plant operators are responsible for ensuring there is enough treated water available for pumping into the distribution or discharge system, as well as enough water to maintain ongoing operations. Many operators overlook production water, in addition to effluent pumping rates, when determining influent rate, even though accounting for it helps keep treatment consistent.

Influent flow rates are usually estimated by operators based on experience and local weather forecasts. These back-of-the-napkin calculations are necessary for master planning the future of the facility. Determining future capacity based on needs and sizing, the plant’s ability to meet future regulations, and the expected timing to upgrade or build new facilities are all impacted by the irregular and unpredictable amount of influent entering a system.

EPA estimates that the more than 19,000 collection systems across the country would have a replacement cost of $1-2 trillion. The collection system of a single large municipality can represent an investment worth billions of dollars. Usually, the asset value of the collection system is not fully recognized, and collection system operation and maintenance programs are given low priority compared with wastewater treatment needs and other municipal responsibilities.

Typically, small amounts of infiltration and inflow are anticipated and tolerated. Yet unpredictable weather can increase this load and cause overflows. Managing these events is costly in terms of unplanned labor expenditures, repair of damaged equipment, and health and environmental impacts, sometimes incurring monetary fines and coverage on the evening news.

As one of the most serious and environmentally threatening problems, sanitary sewer overflows are a frequent cause of water quality violations and are a threat to public health and the environment. Beach closings, flooded basements and overloaded treatment plants are some symptoms of collection systems with inadequate capacity and improper management, operation, and maintenance. The poor performance of many sanitary sewer systems and resulting potential health and environmental risks highlight the need to optimize operation and maintenance of these systems.

Wastewater collection systems suffer from inadequate investment in maintenance and repair often due in large part to the “out-of-sight, out-of-mind” nature of the wastewater collection system. The lack of proper maintenance has resulted in deteriorated sewers with subsequent basement backups, overflows, cave-ins, hydraulic overloads at treatment plants, and other safety, health, and environmental problems.

Managing these complex water systems has long relied on heavy physical infrastructure and reactive governing attitudes. This is changing with the development of cyber-physical systems, active performance monitoring, big data analysis, and machine learning with advanced control systems through the Internet of Things (IoT). These “smarter” systems, in which technology, components, and devices talk to each other and feed information to each other in more sophisticated ways, bring about a more optimized, efficient process.

Data provided by weather radar are important in weather forecasting. Rainfall data are typically introduced to provide stormwater information at different locations in the vicinity of the wastewater treatment plant. Several consecutive days of rainfall appear to correlate with increased WWTP flows, indicating a trend that is historically related to interflow.

Goals of prediction to prevent overflows:
  • Reduce ratepayer costs by implementing all cost-effective I&I reduction projects
  • Minimize liability from water pollution and public health risks by eliminating storm-related SSOs
  • Proactively reduce overall I&I to avoid capital costs of capacity expansion in anticipation of future population growth
  • Eliminate enough I&I to offset the environmental and regulatory impact of sewer system expansion and increased water demand

Though sensors helped combat the overflows in South Bend, Indiana for a while, during a recent storm they could only report that they were being overwhelmed. Yet if the data from those sensors flowed into a system powered by Artificial Intelligence, operators could have had a forecast predicting that storm and may have been able to proactively divert flow in preparation.

Predictive influent flow rate information helps determine the most cost-efficient schedule for operating wastewater pumps. Pluto AI has developed a state-of-the-art prediction system that delivers a high-accuracy influent flow forecast based on weather forecasts, recent influent flow trends, and the hydraulics of the plant and sewer system.
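
As a sketch of how such a forecast can be assembled (a simplified stand-in for Pluto’s system, with hypothetical file and column names), lagged flows and recent rainfall become features for a regression model:

    import pandas as pd
    from sklearn.ensemble import GradientBoostingRegressor

    # Hypothetical hourly history with influent flow and rainfall columns.
    df = pd.read_csv("plant_history.csv", parse_dates=["timestamp"],
                     index_col="timestamp")

    # Features: recent flow trend plus rainfall over the past two days.
    for lag in (1, 2, 3, 24):
        df[f"flow_lag_{lag}h"] = df["flow_mgd"].shift(lag)
    df["rain_48h"] = df["rain_in"].rolling(48).sum()

    # Target: influent flow 24 hours ahead.
    df["flow_next_24h"] = df["flow_mgd"].shift(-24)
    df = df.dropna()

    features = [c for c in df.columns if c.startswith(("flow_lag", "rain"))]
    model = GradientBoostingRegressor().fit(df[features], df["flow_next_24h"])
    print("Forecast:", model.predict(df[features].tail(1)))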

To assess extraneous water entering your system, at least a year of influent flow data to the treatment facility should be examined; Pluto recommends two. Contact us to learn more about integrating predictive forecasting for overflow prevention into your system.

Sources:
https://www.southbendtribune.com/news/local/south-bend-s-smart-sewers-overwhelmed-by-floodwaters/article_cb75b63c-aaa9-5b39-9c9c-df4fcd2b62b3.html
https://www.mass.gov/eea/docs/dep/water/laws/i-thru-z/omrguide.pdf
https://www.globalw.com/support/inflow.html
https://www.ce.utexas.edu/prof/maidment/giswr2012/TermPaper/Boersma.pdf
https://www.mountainview.gov/civicax/filebank/blobdload.aspx?blobid=6979

Highlights from the 2018 Membrane Technology Conference

Back in March, I attended the opening day of the AWWA & AMTA Membrane Technology Conference in West Palm Beach, Florida to meet Pluto customers. I wanted to learn more about the challenges facing them and explore the new processes and solutions being employed to meet those challenges.

The conference opened with an inspiring keynote address given by Water for People CEO, Eleanor Allen. Her speech offered a glimpse into the progress made through collaborative partnerships of social entrepreneurs around the world to provide potable water to the millions in need. Distinct from the technologically-focused presentations given throughout the day, this talk was an uplifting reminder of the life-sustaining impact of the advancements and efforts of the water industry’s products, services, and people.

After the lunch hour, Val Frenkel, Ph.D., PE, D.WRE, of Greeley and Hansen delivered a thought-provoking presentation entitled “What We Don’t Know About RO.” Dr. Frenkel provided a comprehensive review of the history of RO systems and their introduction to the commercial market dating back to the 1970s. He discussed how specific system configurations enable different types of RO systems to achieve individual product quality targets or meet specific operating procedures for different applications.

Dr. Frenkel went on to describe pretreatment of membranes as a cost-effective way to ensure integrity. Now that the performance of RO systems is no longer a question of achievability, the longevity and integrity of the RO membrane is the new focus for furthering system performance.

Another talk that stood out was a presentation by Pierre Kwan of HDR regarding the Basin Creeks membrane operation, “All-Gravity Membrane Filtration: Design and Operational Considerations.” Kwan described an almost certainly unique circumstance: a water reservoir sitting at enough altitude above the plant not only to eliminate the expensive pumping usually required, but also to create the complication of managing high pressure instead.

Building a sustainable operation under these conditions had several interesting ramifications. Along with the gravity challenge came a high water quality requirement, and the two-stage membrane process implemented to meet both was impressive. The net result of this unique system design was that the facility consumed only 5% of the energy typically expected of a membrane plant. Kwan painted a vivid description of how thoughtful, custom design can overcome geographical and infrastructure challenges; the result was a compelling talk about achieving energy efficiency in the face of adversity.

Overall, the advancements in membrane integrity analysis and the appetite for increasing efficiency make this a rich area for predictive technologies. Pluto’s predictive analytics dashboard has helped several utilities and companies determine convenient cleaning schedules and discover optimal points for normalization of RO membrane trains, typically with a 3-5x ROI.

Pluto AI Raises $2.1M for Smart Water Management

On World Water Day, I’m excited to announce that we have raised $2.1M in funding from some of the top Silicon Valley VC firms including Refactor Capital (cofounded by David Lee of SV Angel and Zal Bilimoria of Andreessen Horowitz), Fall Line Capital (cofounded by Eric O’Brien of Lightspeed), 500 Startups, Unshackled Ventures, Jacob Gibson (cofounder of NerdWallet), and a few other amazing investors. With such an awesome team around us, 2017 is going to be fantastic.

Pluto is an analytics platform for smart water management. We enable water facilities like treatment plants or beverage processing plants to prevent water wastage, predict asset health, and minimize operating costs. We use cutting edge Artificial Intelligence (AI) to achieve this. Our pilot customers include some of the largest water and beverage companies in the world.

Around 2.1 trillion gallons of clean water is lost in the US every year. With more than 150,000 water facilities in the country, this continues to be a massive problem.

Our goal is to address this issue by maximizing water resource efficiency. Growing up in Gulbarga (a town in southern India), I experienced the effects of water shortage firsthand. I feel that AI has been limited to first-world problems so far, which is why Pluto plans to use it to solve a meaningful problem like the ongoing water crisis. After having published 7 books on AI, I’ve become good friends with it.

Aging assets contribute heavily to water loss, with the average age of U.S. water pipes at 47 years. Replacing them is very expensive! We leverage large amounts of existing data to extract actionable insights that enable water facilities to manage their assets better. Pluto extracts wisdom from unstructured data in real time so that operators and plant managers can take proactive action.

We are aiming to disrupt a massive industry that is in dire need of a solution. Andrew Ng very nicely posited that AI is the new electricity. It is transforming many industries and water is no different. Pluto is at the forefront of a huge wave of change in the world of water.

Contrary to what people might believe about AI, we are actually using it to create more jobs. We need more water operators to use the insights provided by us to take action. In order to scale it up and have a meaningful impact, we need your support. We are actively hiring right now. If you want to collaborate with us in any capacity, feel free to ping us at hello@plutoai.com.

According to Leonardo da Vinci, water is the driver of nature. Pluto provides the Iron Man suit to that driver.

Four-Pronged Strategy For Asset Management In Water

When you look at water treatment facilities, assets are critical to their operations. These assets can be pumps, pipes, evaporators, chlorinators, and so on. Most inefficiencies, like water leakage, monetary losses, or compliance-related fines, can be directly attributed to asset performance. So why don’t water facilities just replace assets when their efficiency drops? One of the biggest problems is that assets are very expensive. Replacing one is not an option until it completely fails. Given this situation, what can water facilities do to solve their problems?

What are the main problems?

Water and wastewater treatment facilities face enormous challenges when it comes to managing their operations, and these challenges represent significant expenses to operators. Some of the highest-ranking problems include asset health prediction, anomaly detection, performance forecasting, and combined sewer overflow avoidance. Understanding asset health and learning how to predict it can open a lot of doors, especially when assets can’t be replaced frequently.

Understanding the definition

Before we dig into asset health prediction, we need to understand asset management. What exactly is asset management anyway? Sounds like it’s just managing the assets, right? Well, there’s a lot more to it than that. When it comes to wastewater asset management, we need to be aware of all the variables that impact a particular asset’s health. It includes the operation, maintenance, and replacement of assets on the critical path. For example, the critical path for a water utility will be retrieving, purifying, and dispersing clean water. This path will include water pumps, water transportation pipes, stormwater sites, and many other components.

What exactly is the problem?

One of the first and foremost questions that comes to mind is — What’s the big deal here? Why can’t we just use simple thresholds to get alerted about assets? The problem is that the data is very unstructured. This data is usually a combination of numbers, free form text, SCADA, ERP, event logs, and more. It’s usually referred to as a “data lake”. Extracting meaningful insights from this data lake takes several areas of expertise like:

  • Automatic data processing engine to parse the data
  • Natural Language Processing to understand text
  • Time-series modeling to analyze sensor data
  • Predictive analytics for event prediction
In reference to the title of the post, these are the four prongs we need to build anything meaningful. Modern water facilities are gathering data from many different sources, so we need to make sure we use all that data to drive efficiency upwards. A small sketch of how the first three prongs might fit together follows.
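
To make this concrete, here is a deliberately minimal Python sketch. The file layout, column names, and keyword list are invented for illustration; this is not Plutoshift’s actual pipeline.

    import pandas as pd

    # Prong 1: automatic data processing -- parse a mixed SCADA/event export
    # (hypothetical layout: timestamp, numeric reading, free-form note).
    df = pd.read_csv("data_lake_export.csv", parse_dates=["timestamp"])

    # Prong 2: lightweight text handling -- flag event-log notes that
    # mention failure-related terms (a stand-in for real NLP).
    df["failure_note"] = df["note"].str.contains(
        "leak|fault|trip|alarm", case=False, na=False)

    # Prong 3: time-series modeling -- a rolling summary of the sensor signal.
    df = df.set_index("timestamp")
    df["pressure_7d_mean"] = df["pressure_psi"].rolling("7D").mean()

    # Prong 4: predictive analytics would train a model on these features
    # to predict upcoming asset events (omitted here for brevity).
    print(df[["pressure_psi", "pressure_7d_mean", "failure_note"]].tail())
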
Okay I understand the problem, but what’s the solution here?

We need a solution that can extract wisdom from this data lake consisting of large amounts of unstructured data. More importantly, we need wisdom that’s specific to water. We don’t need some generic “Artificial Intelligence platform” that uses the same model for many verticals like healthcare, energy, mining, and so on. Artificial Intelligence is an amazing tool that can solve really difficult problems, but only if we use it in the right way. Water is a very unique vertical that has a lot of nuances associated with it. An Artificial Intelligence solution that takes this into account when extracting wisdom will totally outperform a generic Artificial Intelligence platform. Artificial Intelligence deserves to be used in the right (and slightly constrained) way so that it can have a meaningful impact.

3 Reasons Why We Need Deep Learning For Water Analytics

Over the past few years, the business world has seemed to enter a frenzy around buzzwords like “analytics,” “big data,” and “artificial intelligence.” There are two key elements to this phenomenon. First, the amount of data generated has exploded recently. Second, effective marketing schemes have created an “analytics” frenzy. In many cases, businesses and utilities don’t even know why they need or want hardware (sensors, meters) that will allow them to collect data every 15 seconds. Even when they do, they are not sure why they need an analytical software component to study the abundance of data. Business and utility managers simply want to increase revenue and decrease costs, and don’t care about the specifics.

Unfortunately, all this frenzy allows for the entry of charlatans that just want to create noise. Another problem is that this prevents end users from reaching their full business potential. Now why is that? Because unsuspecting customers may end up purchasing poor analytics solutions. This forces them to conclude that analytics just doesn’t work and they revert back to their old inefficient ways.

Aren’t all analytics solutions equivalent?

Not at all! This is true for a variety of reasons, but let’s go through some of the key attributes of the most popular analytics solutions provided to end users today. We promise to not go too far down the technical rabbit hole.

The most common technique that you’ll come across is conditional monitoring. This is just monitoring the values coming from sensors and taking action based on some simple thresholding. As you can imagine, this is not sufficient at all. Setting thresholds manually and hoping that nothing goes wrong is like walking blindfolded in the middle of a busy freeway.

How about extracting rolling stats?

Extracting rolling stats refers to calculating metrics in real time based on a time window. You can also extract things like variance or autocorrelation. But these metrics are very simplistic and don’t tell you much about the data. You will not be able to infer anything about cause and effect, which is where all the money is.

You can get a bit more sophisticated and build autoregressive models that can analyze timestamped data. The problem with autoregressive models is that they assume that the current output is a direct result of the previous ‘n’ values. And neither the value of ‘n’ nor the relationship with those values is allowed to evolve over time. Other machine learning techniques impose similar restrictions. It’s like forcing you to stick with the shoe size based on what you bought when you were 12 years old. It’s not going to fit all your life!
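
To make those limitations concrete, the sketch below computes rolling statistics and fits a fixed-lag autoregressive model to a synthetic sensor series. Note that both the lag order and the fitted coefficients stay frozen once chosen.

    import numpy as np
    import pandas as pd
    from statsmodels.tsa.ar_model import AutoReg

    # Synthetic hourly sensor series with a slow drift.
    rng = np.random.default_rng(0)
    series = pd.Series(np.cumsum(rng.normal(0, 1, 500)) + 50)

    # Rolling stats: easy to compute, but purely descriptive.
    rolling_mean = series.rolling(24).mean()
    rolling_var = series.rolling(24).var()

    # Autoregressive model: the current value as a fixed function of the
    # previous n values. Neither n nor the coefficients can evolve after
    # fitting, which is exactly the shoe-size problem described above.
    model = AutoReg(series, lags=24).fit()
    print(model.forecast(steps=12))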

One technique to rule them all

This is where Deep Learning becomes really relevant. If we were to summarize the shortcomings of all those techniques:

  • The time difference between cause and effect has to be small (and not variable)
  • The relationship of the current output (effect) with the previous input measurements (cause) is not allowed to evolve with time
  • The current output (effect) is not dependent on previous outputs (effect)

Deep Learning is really good at solving these problems. Due to the inherent nature of deep neural networks, there’s very little manual intervention. This allows the engine to train itself very efficiently and solve problems with high accuracy.
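
For flavor, here is a bare-bones recurrent network in PyTorch that learns one-step-ahead prediction on a synthetic sensor sequence. It is a minimal sketch, not a production water-analytics model, but it shows a network learning the input-output relationship instead of having it fixed by hand.

    import torch
    import torch.nn as nn

    # Toy sequence model: predict the next sensor reading from the
    # previous 48 readings. Unlike a fixed autoregressive model, the
    # network can learn nonlinear, longer-range relationships.
    class FlowLSTM(nn.Module):
        def __init__(self, hidden=32):
            super().__init__()
            self.lstm = nn.LSTM(input_size=1, hidden_size=hidden,
                                batch_first=True)
            self.head = nn.Linear(hidden, 1)

        def forward(self, x):              # x: (batch, seq_len, 1)
            out, _ = self.lstm(x)
            return self.head(out[:, -1])   # prediction from final time step

    # Synthetic training data: sliding windows over a noisy sine wave.
    t = torch.linspace(0, 100, 2000)
    signal = torch.sin(t) + 0.1 * torch.randn_like(t)
    windows = signal.unfold(0, 49, 1)               # (n, 49)
    x, y = windows[:, :48].unsqueeze(-1), windows[:, 48:]

    model = FlowLSTM()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()
    for epoch in range(5):
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
        print(f"epoch {epoch}: loss {loss.item():.4f}")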

Moving forward

Now, when businesses and utilities need to solve difficult business intelligence problems, they can approach the market with a clearer understanding of what analytics solutions offer. As stated above, there are pros and cons to the various solutions, but the technique that stands out as superior in efficacy, speed, and quality is Deep Learning. The good thing is that customers don’t need to know anything about Deep Learning in order to use it. All they need to know is that it’s like Winston Wolf from Pulp Fiction … It solves problems!

Deep Learning and the Water Industry

For years, the water industry has been thought of as a slow-moving sector that’s resistant to change. This makes it difficult for startups to come up with creative solutions and iterate on them quickly. Water utilities are accumulating vast amounts of new data that could be used to create unforeseen jumps in operational efficiency and margins. But it’s difficult for startups to build and test solutions because the water industry doesn’t want to change its status quo. This creates an unfortunate barrier for modern technologies entering the water market. Why is this relevant now? Why do we need to care about it?

Winter is coming

After years of prolonging and promoting the status quo, time and change seem to be catching up with the industry. A change appears to be on the horizon, not only technological but also psychological. Two key elements have sparked this potential inflection point within the industry: 1) the rapid decay of our nation’s water infrastructure, and 2) the proliferation of low-cost internet-connected devices.

Pipes seem to work just fine. What’s the big deal?

A large portion of our nation’s water infrastructure is either approaching or has passed its useful life. One might say: so what? Well, this decaying infrastructure promotes the waste of water resources via leakage and pipe bursts. It also contributes to the introduction of harmful elements into the nation’s drinking water; look no further than the lead crisis in Flint, Michigan. Not only is it irresponsible to waste our most precious resource, it’s dangerous too.

Where’s the data?

In addition to replacing physical infrastructure elements like pipes, one might also wonder about the IT infrastructure. Luckily, given Moore’s Law, we have seen an amazing increase in processing power coupled with an equally amazing decrease in prices, especially for hardware devices. The age of internet-connected devices is upon us when you look at sensors, smart meters, and so on. This ecosystem of internet-connected devices is collectively referred to as the Internet of Things (IoT). It allows the industry to collect, analyze, and act upon streaming data coming into its IT systems.

How do we analyze that data?

The internet connected devices generate a lot of data continuously. One might wonder — Why do we even need fancy techniques to analyze the data? Why can’t we just use thresholding and call it a day? Well, the good ol’ ways of using manual thresholds to make huge business decisions are not sufficient anymore. The complexities of modern data far exceed the simplistic techniques that people use. We need a machine that can analyze sequential data and extract relevant insights from it. This machine should be capable of adapting to shifting baselines, prolonged delays between cause and effect, learning to detect new anomalies, and so on. A human looking at spreadsheets and manual processes is not going to help you manage your modern infrastructure. This is where Deep Learning becomes extremely relevant. People tend to think of it as some dark magic. It is actually a really effective tool that understands sequential data from sensors like no other technique ever has. It’s beautiful in so many ways!
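
A tiny illustration of the difference between a frozen threshold and an adaptive baseline (a toy with synthetic data, not a Deep Learning system, but it shows the adaptivity that manual thresholds lack):

    import numpy as np
    import pandas as pd

    # Synthetic sensor series whose baseline shifts halfway through.
    rng = np.random.default_rng(1)
    values = np.concatenate([rng.normal(10, 1, 500), rng.normal(14, 1, 500)])
    series = pd.Series(values)

    # Static threshold: calibrated on early data, it fires constantly
    # after the baseline shifts.
    static_alerts = (series > 13).sum()

    # Adaptive baseline: an exponentially weighted mean and std track the
    # shift, so only genuine deviations stand out.
    ewm_mean = series.ewm(span=100).mean()
    ewm_std = series.ewm(span=100).std()
    adaptive_alerts = (series > ewm_mean + 3 * ewm_std).sum()

    print(f"static alerts: {static_alerts}, adaptive alerts: {adaptive_alerts}")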

Moving forward

As of right now, the world is only in the 4th inning of the IoT revolution, and the US water industry might be even further behind than that. With that said, the future looks bright when one considers the power and responsiveness of the active performance monitoring capabilities that IoT devices offer. Additionally, as the water industry’s analytical sophistication and mindset mature, it will be able to leverage these data streams into predictive insights in addition to reactive monitoring. Some areas of opportunity include predictive asset management, anomaly detection, demand forecasting, and operational efficiency.