5 Digital Transformation Lessons from Dune

Last week, sci-fi fans finally got to see the latest film adaptation of Dune. When published in 1965, Frank Herbert’s novel was a groundbreaking, eco-conscious sci-fi epic. Set 20,000 years in the future with intergalactic dynasties and secret orders battling for control of the scarcest resource in the universe, Dune feels at once completely alien and very familiar.

Much has been written about Herbert’s inspiration for Dune. But while the author had plenty of history and his own era to draw from, the story is even more relevant today, given how dire some of the same issues have become. So if Dune does such a great job of reflecting our current situation, what insights can it offer into how to address our challenges?

Here are 5 lessons from Dune on digital transformation:

1. Bring back the thinking machines

In the Dune universe, a war against machines results in a prohibition against AI or “machines in the likeness of a human mind.” Subsequently, over thousands of years, humanity has filled the role of advanced computers with Mentats. After undergoing conditioning at specialized schools, these ‘human computers’ are able to process large amounts of data, identify patterns, apply logic, and then deduce probable future outcomes. The prescience and strategic abilities of Mentats make them valued advisors, with the great houses of the universe vying for their service.

Atreides Mentat Thufir Hawat

Sound familiar? With organizations across all industries racing to capitalize on AI, there’s been growing demand for data science-related roles. Organizations have to compete with big tech for talent, and there is simply not enough supply to meet the demand.

The solution? Automation. “Many machines on Ix. New machines,” notes a guild navigator (another class of humans who replaced the work formerly handled by computers). Organizations can automate much of their data science work by partnering with vendors that have already made significant investments in R&D and data science talent. Leveraging outside expertise to focus on improving specific workflows is more cost-effective, provides flexibility, and can accelerate digital transformation efforts.

It’s time to bring back the thinking machines (spoiler alert: the humans and AI eventually make peace in the Dune series).

2. Every drop counts

In contrast to the Harkonnen who seem to indulge in daily steam showers, the Fremen natives of Dune are relentless in their conservation of water. Donning water-preserving suits, the Fremen even reclaim water from corpses and avoid crying. Of course, personal survival demands it, but their hyper-vigilant water preservation also serves their long term vision – terraforming their desert planet into a green oasis. The Fremen use wind traps to collect moisture from the air and slowly amass giant caches of water across thousands of sites.

Fremen water catch basin

Organizations rightly prioritize opportunities that promise to have the biggest impact. But they also shouldn’t overlook less obvious opportunities to innovate (for instance, optimizing the various points at which water is used within food manufacturing processes). By applying the same rigor across other processes, the many small gains in aggregate can have an enormous impact on the efficiency and sustainability of the entire business.

3. “The slow blade penetrates the shield”

Combat in Dune highlights the value of adaptation and an incremental approach. With personal shielding technology having rendered conventional projectile weapons largely ineffective, military forces in Dune revive the use of hand-to-hand combat and traditional weapons. To win in battle, soldiers have to think steps ahead and employ techniques that allow them to overcome the shields, which only yield to slow attacks.

Likewise, with the conventional, top-down approach to digital transformation often failing to deliver, organizations must adopt more effective strategies. A survey of industrial professionals indicated that while 94% have taken an organization-wide approach to digital transformation, only 29% claimed success. Stymied by unanticipated complexity and plagued with delays and cost overruns, many organizations are turning to an operation-specific approach to digital transformation. By applying digitization and automation techniques to specific workflows first, organizations are able to secure incremental success and then scale their efforts to the rest of the org.

4. Enlist the frontline

Another benefit of the ops-specific approach is that it more effectively involves and considers those closest to the processes being targeted. In Dune, as the management of Arrakis and spice mining changes hands from the Harkonnen to the Atreides, there’s a clear distinction in the management style of the Atreides. The Harkonnen impose their rule and maximize spice production with violent oppression. By contrast, the Atreides begin their management by sending envoys to engage the locals. They rescue spice harvester workers at the expense of spice production, and then Paul embeds himself with the Fremen and gains their desert knowledge. The approach pays off, as Paul is able to mobilize the locals to overwhelming success.

Similarly, it behooves organizations looking to transform their operations to enlist stakeholders at all levels, especially those that can assess the situation on the ground and identify all opportunities to innovate. Getting their buy-in, tapping their experience and expertise, and ensuring the project delivers on their goals will increase chances for success.

Dune spice miner

5. Fear is the mindkiller

“Moods are a thing for cattle and love play” declares 1984 Dune’s Gurney Halleck while chiding Paul Atreides for not being more vigilant in preparing for their hostile destination. Once on Arrakis, Paul finds himself stripped of his resources and stranded in the desert. He’s forced to quickly hone his skills and adapt to the conditions of his new environment. 

The pace of innovation across all industries is increasing. To maintain their competitive advantage, organizations must create an environment that supports innovation from within. They can’t afford to wait for years-long, enterprise-wide digital transformation projects to deliver uncertain results. Budgetary limits, legacy systems, lack of expertise, and other challenges can be overcome with the right approach. The operation-specific approach can help organizations adapt faster, empower professionals across the organization, and realize ROI sooner.

The sleeper must awaken!

Zillow & 2 Attributes of A Successful Data Culture

In our recent e-book, 3 Hacks for Onboarding AI Platforms, we outline a few key steps to building the right team and culture to support an AI deployment. And we did so for good reason. There is broad consensus that the success of digital transformation efforts hinges on having a data-driven culture behind them. A 2019 Deloitte study found that companies with strong data-driven cultures were twice as likely to exceed business goals. Another study by New Vantage Partners found that 95% of the challenge to adoption of big data and AI initiatives was cultural, organizational, or process-driven rather than technological.

Given this, many companies have prioritized fostering data-driven cultures. Whether it’s hiring a digital-focused executive, establishing centers of excellence, or instituting organization-wide mandates, the focus is on moving away from decisions based on gut feeling to those based on data-derived facts.

Organizations Must Look Beyond the Numbers

Sounds great, but an effective data-driven organization must often look beyond the numbers and can face major consequences when it fails to do so. Take, for example, Zillow, a company that has not only used data to build more accurate real estate models but has also turned that data into a powerful competitive advantage.

Zillow’s automated home-buying business recently made headlines for its decision to halt home purchases. The company, which has access to more than 17 years’ worth of data, has faced backlash since the announcement. Some are calling into question the company’s ability to properly plan and take into account logistical constraints. Others wonder whether its brand has been irreparably damaged. How could these things happen in a data-centric company?

Attributes of a Data-Informed Culture: Intuition & Ownership

In our experience, organizations have proven tremendously successful when they connect big data analytics to the business strategy. This data-informed approach means they acknowledge the data-derived insights but are also aware of and account for the implications of other non-data factors that may impact the direction of the overall strategy.

It also means that when building this data-informed culture, in addition to data literacy, organizations must look for and encourage two key attributes: 1) intuition and 2) ownership.

Intuition is defined as the natural ability to know something without any proof or evidence. But it’s also another data point, based on unconscious knowledge, expertise, and experience, to be combined with other data in decision making. Ownership is the state of being responsible and accountable. It’s critical that these two components are embedded into the company’s values so that data may be used in a way that properly guides and informs decisions. Otherwise, you may be sitting on actionable insights that no one has evaluated properly or acted on because it’s “not my place.” Someone must answer to the choices being made and how those decisions align to and support broader goals.

It’s easy to wonder whether the culture at Zillow empowered decision makers to use their intuition in the process, or whether they had grown accustomed to letting the data be their one and only guide.

The episode also highlights a gap between the company’s actions and real-world issues such as on-the-ground labor and supply constraints. This could be the result of a lack of ownership over the decisions being made.

Being data-informed in addition to data-driven means using both intuition and ownership to constantly check your assumptions, methods and outcomes. The qualitative complements the quantitative, just as the human element complements the data analysis. 

If you want to take your data insights to the next level and avoid the unintended consequences associated with mismanaging the intangible side of your business, look for people that demonstrate high intuition and ownership traits. Your culture will thank you for it.

100th Episode Of The Dan Smolen Podcast

Prateek Joshi, Founder and CEO of Plutoshift, discusses how A.I. makes the world a better place on the 100th episode of The Dan Smolen Podcast, which covers future-of-work and meaningful-work topics and trends.

In this episode, Prateek:

  • Describes Plutoshift and his role in the company. Starts at 3:03
  • Defines A.I. and contrasts it with Machine Learning. Starts at 3:51
  • Addresses workforce concerns that A.I. takes jobs away from people. Starts at 8:52
  • Illustrates how Plutoshift helps clients involved with providing clean and potable water. Starts at 13:03
  • Identifies the training and advanced skill that he seeks in hired talent. Starts at 20:25
  • Tells us how, beyond his work, he adds fun and enjoyable activity to each day. Starts at 27:59


Listen to the A.I. and the Future of Work podcast.

Databases, Infrastructure, and Query Runtime

Recently, my team was tasked with switching from a combined MySQL and Cassandra infrastructure to one in which all of the data is stored entirely on a PostgreSQL server. The change was driven partly by the need to give our customers more flexibility, and partly by the fact that Cassandra was simply not necessary for this particular application, even with the high volumes of data we were receiving. On its face, the mere need for such a change almost looks backwards, given how much the tech industry has moved away from SQL databases and towards NoSQL. But, in fact, NoSQL — or even hybrid systems — are not always best.

Performance Gain Considerations

In certain applications, the performance gains one hopes to reap from NoSQL’s optimizations may not translate to production without some forethought. I would personally argue that SQL databases are often preferable (over something like Cassandra) in non-trivial applications, most of all when JOIN operations are required. Generally speaking, NoSQL databases — certainly Cassandra, among others — do not support JOIN. I will add that the vast majority of ORMs (for those unfamiliar with the term, these are effectively systems for abstracting database relations into typically “object-oriented” style objects within one’s backend code) are built around SQL. The flexibility and readability afforded by these ORMs — at least when operating a database of non-trivial objects — can be a lifesaver for development time, database management, and data integrity. Indeed, I would argue that, for most web applications, this often outweighs the marginal or even negligible performance increases a NoSQL database may provide (of course, this depends entirely on the nature and scale of the data, but that is perhaps a topic for another time).
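To make the ORM point concrete, here is a minimal sketch using SQLAlchemy. The two-table schema (Plant, Reading), column names, and the in-memory SQLite engine are illustrative assumptions, not details from the project described in this post; in practice the engine URL would point at Postgres.

```python
# Minimal ORM sketch: the JOIN is expressed through a mapped relationship
# rather than hand-written SQL. Schema and names are hypothetical.
from sqlalchemy import Column, Float, ForeignKey, Integer, String, create_engine, select
from sqlalchemy.orm import Session, declarative_base, relationship

Base = declarative_base()

class Plant(Base):
    __tablename__ = "plants"
    id = Column(Integer, primary_key=True)
    name = Column(String, nullable=False)
    readings = relationship("Reading", back_populates="plant")

class Reading(Base):
    __tablename__ = "readings"
    id = Column(Integer, primary_key=True)
    plant_id = Column(Integer, ForeignKey("plants.id"), nullable=False)
    flow_mgd = Column(Float)
    plant = relationship("Plant", back_populates="readings")

engine = create_engine("sqlite:///:memory:")  # stand-in for a Postgres URL
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add(Plant(name="Plant A", readings=[Reading(flow_mgd=12.4)]))
    session.commit()

    # Equivalent to a SELECT ... JOIN, but expressed via the relationship.
    stmt = select(Plant.name, Reading.flow_mgd).join(Plant.readings)
    for name, flow in session.execute(stmt):
        print(name, flow)
```

The readability benefit described above is mostly in the last few lines: the join condition lives in the model definitions, not scattered through query strings.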

Cloud Infrastructure

However, none of this matters if the engineer is not paying close attention to their cloud infrastructure and the way that they are actually using their queries in production. In evaluating one engineer’s project, I found they were doing all of their insertion operations individually rather than attempting to batch or bulk insert them (when this was well within the scope of this particular application). It appeared they had been developing with a local setup and then deploying their project to the cloud where their database was running on a separate machine from their server. The end result in this case was rather comical, as once insertions were batched, even in Postgres, they were orders of magnitude faster than the piecemeal NoSQL insertions. They had not considered the simple fact of latency.

How did this original engineer miss this? I do not know, as this particular piece of software was inherited with little background knowledge. But, given that they were testing locally, I can assume that they elected for individual insertions. Making queries in this way can sometimes be less tricky than bulk insertions (which often have all sorts of constraints around them and require a bit more forethought, especially when it comes to Cassandra), and tested locally, the performance was beyond satisfactory. What they did not consider, however, is the latency between the backend server and a Cassandra (or SQL) server hosted in any sort of distributed system (i.e., production). This meant that it didn’t really matter how fast the queries themselves were; the latency between the backend and the database was so much greater than the query runtime that it hardly mattered which database was used. It followed that real-world performance was significantly improved simply by batching insertions in Postgres (though of course, batching is supported in Cassandra — but the change was necessary nonetheless).
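Here is a rough sketch of the batching point, using psycopg2 against a hypothetical "readings" table. The DSN, table schema, and row counts are placeholders; the point is the number of network round-trips, which dominates runtime once the database sits on a different machine.

```python
# Piecemeal vs. batched inserts in Postgres. Both paths insert the same
# sample rows purely for comparison; connection details are assumptions.
import psycopg2
from psycopg2.extras import execute_values

rows = [(i, 42.0 + i) for i in range(10_000)]

conn = psycopg2.connect("dbname=example")  # placeholder DSN
with conn, conn.cursor() as cur:
    cur.execute(
        "CREATE TABLE IF NOT EXISTS readings (sensor_id int, value float8)"
    )

    # Piecemeal: one round-trip per row. With even a few milliseconds of
    # network latency, that latency swamps the actual query runtime.
    for sensor_id, value in rows:
        cur.execute(
            "INSERT INTO readings (sensor_id, value) VALUES (%s, %s)",
            (sensor_id, value),
        )

    # Batched: execute_values expands many rows into a single statement,
    # collapsing thousands of round-trips into a handful.
    execute_values(
        cur,
        "INSERT INTO readings (sensor_id, value) VALUES %s",
        rows,
        page_size=1000,
    )
```

Run against a local database the two approaches look similar; run across a network, the batched path wins by orders of magnitude, which is exactly the behavior described above.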

The Moral of the Story

In any case, the moral of the story, in my opinion, is that understanding your own cloud infrastructure is crucial to writing genuinely performant programs in the real world. And no matter how much better one database is purported to perform than another under certain circumstances, without a solid understanding of the environment in which the application will be deployed, one cannot hope to see any appreciable performance gain.

Influent Flow Forecasting Made Easy

Like the wastewater industry, most food and beverage manufacturing facilities are equipped with massive data systems to monitor and optimize the wide range of operations. These similarly regulated industries are increasingly adopting Artificial Intelligence (A.I.) into their processes to better manage systems and procedures.

Though many water industry professionals recognize the potential of A.I., given the public health implications of delivering top-quality wastewater treatment and the age of production infrastructure, municipal operators and engineers have not yet enjoyed the same benefits from these technologies.

Several large corporations have invested heavily to develop broad “solutions” to address the challenges of water production industries. Yet, these systems have been hit or miss due to the wide range of data streams and particularities within plants across the water industries.

For decades, water treatment process decisions have been made by plant operators based on information spread across a wide range of systems. Calculations are often made by hand, and cautious decisions are made to avoid the vast array of potential risks – often without regard to cost or process efficiencies. Recognizing patterns of system behavior is nearly impossible when a variety of staff are tasked with administering multiple machines on an irregular basis.

What if there was a way to recognize the risks and achieve optimal efficiencies that could address the specific challenges faced by an individual plant, without additional infrastructure investment?

One of the many benefits of the marriage between machine learning and Artificial Intelligence, as utilized by Pluto AI, is the ability to recognize the differences in individual system behavior and processes to make more informed decisions to improve plant efficiencies while controlling for potential risks.

Utilizing the existing data from each individual plant, the EZ Influent Flow Predictor will forecast influent flow and detect anomalies to help operators predict future plant behavior and upcoming challenges. The machine learning aspect of our proprietary algorithms analyzes and continuously learns from the existing data that impacts incoming flow, and Artificial Intelligence maps out the data to provide actionable insights that help operators determine the best course of action based on the range of potential risk factors present.
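For readers who want a feel for what this kind of forecasting-plus-anomaly-detection looks like in practice, here is a minimal generic sketch — not Plutoshift’s proprietary algorithm — that fits a model on lagged flow and rainfall history and flags days whose observed flow deviates sharply from the forecast. The CSV path and column names are assumptions.

```python
# Generic sketch: forecast next-day influent flow from lagged history,
# then flag large forecast residuals as candidate anomalies.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

df = pd.read_csv("influent_history.csv", parse_dates=["timestamp"])  # assumed file
df = df.set_index("timestamp").sort_index()

# Lagged flow and rainfall readings as features for next-day flow.
for lag in (1, 2, 3, 7):
    df[f"flow_lag_{lag}"] = df["influent_mgd"].shift(lag)
    df[f"rain_lag_{lag}"] = df["rainfall_in"].shift(lag)
df["target"] = df["influent_mgd"].shift(-1)
df = df.dropna()

features = [c for c in df.columns if "lag" in c]
train, test = df.iloc[:-90], df.iloc[-90:]  # hold out the last ~3 months

model = GradientBoostingRegressor().fit(train[features], train["target"])
pred = model.predict(test[features])

# Days where observed flow deviates far from the forecast are candidate anomalies.
residual = test["target"] - pred
threshold = 3 * residual.std()
anomalies = test.index[residual.abs().to_numpy() > threshold]
print(f"Flagged {len(anomalies)} anomalous days out of {len(test)}")
```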

Our unique system of dashboard insights and alerts has helped customers achieve compliance and save thousands in operational costs. A pilot version of the EZ Influent Flow Predictor is available for free to a limited number of treatment plants; learn more about how to enroll.

Predict Tomorrow’s Influent Flow With Today’s Data

Wastewater plant operators make important operational decisions based on the influent flow rate to the plant, yet despite the ample availability of sensors, there is no accurate industry standard for predicting that rate.

Knowing the performance of a collection system is difficult because there are few industry-recognized benchmarks on what “performance” is and how it should be determined. Assessments of sewer collection system performance are often simply educated guesses. Quantifying the areas of highest inflow and infiltration can be difficult due to large networks of pipes, the expense of water monitoring, and varying weather conditions impacting soil saturation.

Municipal sanitary sewer collection and conveyance systems are an extensive, valuable, and complex part of the nation’s infrastructure. Collection systems consist of pipelines, conduits, pumping stations, force mains, and any other facility collecting wastewater and conveying it to facilities that provide treatment prior to discharge to the environment.

Plant operators are responsible for ensuring there is enough treated water available for pumping into the distribution or discharge system as well as enough water to maintain ongoing operations. Many operators overlook production water, in addition to effluent pumping rates, when determining influent rate; accounting for this factor helps ensure treatment is consistent.

Influent flow rates are usually estimated by operators based on experience and local weather forecasts. These back-of-the-napkin calculations are necessary to engage in master planning for the future of the facility. Future capacity needs and sizing, the plant’s ability to meet regulations going forward, and the expected timing to upgrade or build new facilities are all impacted by the irregular and unpredictable amount of influent entering the system.

EPA estimates that the more than 19,000 collection systems across the country would have a replacement cost of $1–2 trillion. The collection system of a single large municipality can represent an investment worth billions of dollars. Usually, the asset value of the collection system is not fully recognized, and collection system operation and maintenance programs are given low priority compared with wastewater treatment needs and other municipal responsibilities.

Typically, small amounts of infiltration and inflow are anticipated and tolerated. Yet unpredictable weather can increase this load and cause overflows. Managing these events is costly in terms of unplanned labor expenditures, repair of damaged equipment, and health and environmental impacts, sometimes incurring monetary fines and coverage on the evening news.

As one of the most serious and environmentally threatening problems, sanitary sewer overflows are a frequent cause of water quality violations and are a threat to public health and the environment. Beach closings, flooded basements and overloaded treatment plants are some symptoms of collection systems with inadequate capacity and improper management, operation, and maintenance. The poor performance of many sanitary sewer systems and resulting potential health and environmental risks highlight the need to optimize operation and maintenance of these systems.

Wastewater collection systems suffer from inadequate investment in maintenance and repair often due in large part to the “out-of-sight, out-of-mind” nature of the wastewater collection system. The lack of proper maintenance has resulted in deteriorated sewers with subsequent basement backups, overflows, cave-ins, hydraulic overloads at treatment plants, and other safety, health, and environmental problems.

Managing these complex water systems relies on heavy physical infrastructure and reactive governing attitudes. This is changing with the development of cyber-physical systems, active performance monitoring, big data analysis, and machine learning with advanced control systems through the Internet of Things (IoT). These “smarter” systems, in which technology, components, and devices talk to each other and feed information to one another in a more sophisticated way, bring about a more optimized, efficient process.

Data provided by weather radar are important in weather forecasting. Rainfall data are typically introduced to provide stormwater information at different locations in the vicinity of the wastewater treatment plant. Several consecutive days of rainfall appear to correlate with increased WWTP flows, indicating a trend that is historically related to interflow.

Goals of prediction to prevent overflows:
  • Reduce ratepayer costs by implementing all cost-effective I&I reduction projects
  • Minimize liability from water pollution and public health risks by eliminating storm-related SSOs
  • Proactively reduce overall I&I to avoid capital costs of capacity expansion in anticipation of future population growth
  • Eliminate enough I&I to offset the environmental and regulatory impact of sewer system expansion and increased water demand

Though sensors helped to combat overflows in South Bend, Indiana for a while, in a recent storm they could only report that they were being overwhelmed. Yet if the data from those sensors flowed into a system powered by Artificial Intelligence, operators could have had a forecast of that storm’s impact and may have been able to proactively divert flow in preparation.

Predictive influent flow rate information helps determine the most cost-efficient schedule for operating wastewater pumps. Pluto AI has developed a state-of-the-art prediction system that delivers a high-accuracy influent flow forecast based on weather forecasts, recent influent flow trends, and the hydraulics of the plant and sewer system.
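As a simplified illustration of the idea — not Pluto’s production model — the sketch below predicts tomorrow’s influent flow from the recent flow trend plus the rainfall expected on the target day, using ordinary least squares. The data file, column names, and the rainfall forecast value are assumptions.

```python
# Toy next-day influent flow forecast: recent flow trend + forecast rainfall.
import pandas as pd
from sklearn.linear_model import LinearRegression

hist = pd.read_csv("plant_history.csv", parse_dates=["date"]).set_index("date")  # assumed file

# Features: 3-day rolling mean of flow, and rainfall on the target day
# (known historically for training; taken from the weather forecast in operation).
hist["flow_3d_mean"] = hist["influent_mgd"].rolling(3).mean()
hist["rain_next_day"] = hist["rainfall_in"].shift(-1)
hist["flow_tomorrow"] = hist["influent_mgd"].shift(-1)
hist = hist.dropna()

X = hist[["flow_3d_mean", "rain_next_day"]]
model = LinearRegression().fit(X, hist["flow_tomorrow"])

# Tomorrow's prediction from today's conditions and the weather forecast.
forecast_rain_in = 0.8  # placeholder value from the local weather forecast
today = pd.DataFrame(
    {"flow_3d_mean": [hist["flow_3d_mean"].iloc[-1]], "rain_next_day": [forecast_rain_in]}
)
print("Predicted influent flow (MGD):", model.predict(today)[0])
```

A real system would fold in sewer hydraulics, soil saturation, and longer rainfall histories, but even this simple form shows how a forecast could feed pump scheduling ahead of a storm.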

To assess extraneous water entering your system, at least a year of influent flow data to the treatment facility should be examined; Pluto recommends two. Contact us to learn more about integrating predictive forecasting for overflow prevention into your system.

Sources:
https://www.southbendtribune.com/news/local/south-bend-s-smart-sewers-overwhelmed-by-floodwaters/article_cb75b63c-aaa9-5b39-9c9c-df4fcd2b62b3.html
https://www.mass.gov/eea/docs/dep/water/laws/i-thru-z/omrguide.pdf
https://www.globalw.com/support/inflow.html
https://www.ce.utexas.edu/prof/maidment/giswr2012/TermPaper/Boersma.pdf
https://www.mountainview.gov/civicax/filebank/blobdload.aspx?blobid=6979

Highlights from the 2018 Membrane Technology Conference

Back in March, I attended the opening day of the AWWA & AMTA Membrane Technology Conference in West Palm Beach, Florida to meet Pluto customers. I wanted to learn more about the challenges facing them and explore the new processes and solutions being employed to meet those challenges.

The conference opened with an inspiring keynote address given by Water for People CEO, Eleanor Allen. Her speech offered a glimpse into the progress made through collaborative partnerships of social entrepreneurs around the world to provide potable water to the millions in need. Distinct from the technologically-focused presentations given throughout the day, this talk was an uplifting reminder of the life-sustaining impact of the advancements and efforts of the water industry’s products, services, and people.

After the lunch hour, Val Frenkel, Ph.D., PE, D.WRE, of Greeley and Hansen delivered a thought-provoking presentation entitled “What We Don’t Know About RO.” Dr. Frenkel provided a comprehensive review of the history of RO systems and their introduction to the commercial market dating back to the 1970s. He discussed how specific system configurations enable different types of RO systems to achieve individual product quality targets or meet specific operating procedures for different applications.

Dr. Frenkel went on to describe pretreatment of membranes as a cost-effective way to ensure integrity. Now that the performance of RO systems is no longer a question of achievability, the longevity and integrity of the RO membrane is the new focus for furthering system performance.

Another talk that stood out was a presentation by Pierre Kwan of HDR regarding the Basin Creek membrane operation, “All-Gravity Membrane Filtration: Design and Operational Considerations.” Kwan described an almost certainly unique circumstance: a water reservoir with enough altitude above the plant not only to eliminate the expensive pumping usually required, but also to create the complication of managing high pressure instead.

Building a sustainable operation under these conditions had several interesting ramifications. Along with the gravity challenge came a high water quality requirement, and the two-stage membrane process implemented to meet it was impressive. The net result of this unique system design was that the facility consumed only 5% of the energy typically expected of a membrane plant. Kwan painted a vivid picture of how thoughtful, custom design can overcome geographical and infrastructure challenges; the result was a compelling speech about achieving energy efficiency in the face of adversity.

Overall, the advancements in membrane integrity analysis and the appetite for increased efficiency make this a rich area for predictive technologies. Pluto’s predictive analytics dashboard has helped several utilities and companies determine convenient cleaning schedules and discover optimal points for normalization of RO membrane trains, typically with a 3-5x ROI. Click here for more information.

Pluto AI Raises $2.1M for Smart Water Management

On World Water Day, I’m excited to announce that we have raised $2.1M in funding from some of the top Silicon Valley VC firms including Refactor Capital (cofounded by David Lee of SV Angel and Zal Bilimoria of Andreessen Horowitz), Fall Line Capital (cofounded by Eric O’Brien of Lightspeed), 500 Startups, Unshackled Ventures, Jacob Gibson (cofounder of NerdWallet), and a few other amazing investors. With such an awesome team around us, 2017 is going to be fantastic.

Pluto is an analytics platform for smart water management. We enable water facilities like treatment plants or beverage processing plants to prevent water wastage, predict asset health, and minimize operating costs. We use cutting edge Artificial Intelligence (AI) to achieve this. Our pilot customers include some of the largest water and beverage companies in the world.

Around 2.1 trillion gallons of clean water is lost in the US every year. With more than 150,000 water facilities in the country, this continues to be a massive problem.

Our goal is to address this issue by maximizing water resource efficiency. Growing up in Gulbarga (a town in southern India), I experienced the effects of water shortage firsthand. I feel that AI has been limited to first-world problems so far, which is why Pluto plans to use it to solve a meaningful problem like the ongoing water crisis. Having published 7 books on AI, I’ve become good friends with it.

Aging assets contribute heavily to water loss, with the average age of U.S. water pipes at 47 years. Replacing them is very expensive! We leverage large amounts of existing data to extract actionable insights that enable water facilities to manage their assets better. Pluto extracts wisdom from unstructured data in real time to enable operators and plant managers to take proactive action.

We are aiming to disrupt a massive industry that is in dire need of a solution. Andrew Ng very nicely posited that AI is the new electricity. It is transforming many industries and water is no different. Pluto is at the forefront of a huge wave of change in the world of water.

Contrary to what people might believe about AI, we are actually using it to create more jobs. We need more water operators to use the insights provided by us to take action. In order to scale it up and have a meaningful impact, we need your support. We are actively hiring right now. If you want to collaborate with us in any capacity, feel free to ping us at hello@plutoai.com.

According to Leonardo da Vinci, water is the driver of nature. Pluto provides the Iron Man suit to that driver.

Deep Learning and the Water Industry

For years, the water industry has been thought of as a slow-moving sector that’s resistant to change. This makes it difficult for startups to come up with creative solutions and iterate on them quickly. Water utilities are filling up with vast amounts of new data that can be utilized to create unforeseen jumps in operational efficiencies and margins. But it’s difficult for startups to build and test solutions because the water industry doesn’t want to change its status quo. This creates an unfortunate barrier for modern technologies to enter the water market. Why is it relevant now? Why do we need to care about it?

Winter is coming

After years of prolonging and promoting the status quo, time and change seem to be catching up with the industry. A change appears to be on the horizon, not only technological but also psychological. Two key elements have sparked this potential inflection point within the industry: 1) the rapid decay of our nation’s water infrastructure and 2) the proliferation of low-cost internet-connected devices.

Pipes seem to work just fine. What’s the big deal?

A large portion of our nation’s water infrastructure is either approaching or has passed the end of its useful life. One might say — so what? Well, this decaying infrastructure wastes water resources via leakage and pipe bursts. It also contributes to the introduction of harmful elements into the nation’s drinking water — look no further than the lead crisis in Flint, Michigan. Not only is it irresponsible to waste our most precious resource, it’s dangerous too.

Where’s the data?

In addition to replacing physical infrastructure elements like pipes, one might also wonder about the IT infrastructure. Luckily, given Moore’s Law, we have seen an amazing increase in processing power coupled with an equally amazing decrease in prices, especially for hardware devices. The age of internet-connected devices — sensors, smart meters, and so on — is upon us. This ecosystem of internet-connected devices is collectively referred to as the Internet of Things (IoT). It allows the industry to collect, analyze, and act upon streaming data coming into its IT systems.

How do we analyze that data?

These internet-connected devices generate a lot of data continuously. One might wonder — why do we even need fancy techniques to analyze the data? Why can’t we just use thresholding and call it a day? Well, the good ol’ ways of using manual thresholds to make huge business decisions are not sufficient anymore. The complexities of modern data far exceed the simplistic techniques that people use. We need a machine that can analyze sequential data and extract relevant insights from it. This machine should be capable of adapting to shifting baselines, handling prolonged delays between cause and effect, learning to detect new anomalies, and so on. A human looking at spreadsheets and manual processes is not going to help you manage your modern infrastructure. This is where Deep Learning becomes extremely relevant. People tend to think of it as some dark magic. It is actually a really effective tool that understands sequential data from sensors like no other technique ever has. It’s beautiful in so many ways!
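To make that less abstract, here is a toy sketch of the idea: a small LSTM that learns to predict the next sensor reading from a sliding window, so that large prediction errors can flag anomalies even as the baseline drifts. The synthetic signal, window size, and thresholds are placeholders, and this is a minimal example rather than a production model.

```python
# Toy LSTM on a drifting sensor signal: predict the next reading from a
# sliding window, then treat large prediction errors as candidate anomalies.
import numpy as np
import tensorflow as tf

# Pretend sensor stream: a slowly shifting baseline plus noise.
t = np.arange(10_000, dtype="float32")
signal = 10 + 0.001 * t + np.random.normal(0, 0.2, size=t.shape).astype("float32")

window = 48
X = np.stack([signal[i : i + window] for i in range(len(signal) - window)])
y = signal[window:]
X = X[..., None]  # shape: (samples, timesteps, features)

model = tf.keras.Sequential(
    [
        tf.keras.layers.LSTM(32, input_shape=(window, 1)),
        tf.keras.layers.Dense(1),
    ]
)
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=2, batch_size=128, verbose=0)

# Readings whose prediction error is far outside the norm are candidate anomalies.
errors = np.abs(model.predict(X, verbose=0).ravel() - y)
anomalies = np.where(errors > errors.mean() + 4 * errors.std())[0]
print(f"{len(anomalies)} candidate anomalies out of {len(y)} readings")
```

Because the model conditions on the recent window rather than a fixed threshold, a slow drift in the baseline is simply learned, while sudden departures from the learned pattern stand out — which is the behavior manual thresholds struggle to provide.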

Moving forward

As of right now, the world is only in the 4th inning of the IoT revolution, and the US water industry might be even further behind than that. With that said, the future looks potentially bright when one considers the power and responsiveness of the active performance monitoring capabilities that IoT devices offer. Additionally, as the water industry’s analytical sophistication and mindset mature, it will have the ability to leverage these data streams into predictive insights, in addition to reactive monitoring. Some areas of opportunity include predictive asset management, anomaly detection, demand forecasting, and operational efficiency.