Manufacturing’s End Game in the Artificial Intelligence Journey

The other day someone asked me, “When it comes to Artificial Intelligence and the Industrial Internet of Things (IIoT), when will enough be enough?”

A great deal of hype accompanies emerging technologies, particularly when they hold such promise. That’s why researchers at Gartner created the hype cycle: a representation of the real risks and opportunities at each phase of a technology’s journey, and a tool businesses can use to make more objective, better-informed decisions.

Yes, there is a lot of grandiose talk around Artificial Intelligence. I tend to understand things better through examples, so consider this: if you asked a C-level executive what their facility’s power consumption was over the past 14 days, they likely wouldn’t know, despite it being one of their larger costs. Understandably, answering would take a few emails and a few days. In the meantime, if there’s an inefficient asset, the losses continue to mount.

Getting that insight immediately would lead to better decisions that enhance efficiency and performance. Instant analysis could detect anomalies and trends, even anticipate future issues, leading to preventative measures and perhaps an automated solution.
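To make that concrete, here is a minimal sketch of what such instant analysis could look like: flagging anomalous power readings against a rolling baseline. The file name, the column names, and the 3-sigma cutoff are illustrative assumptions, not a reference to any particular product.

```python
import pandas as pd

# Hypothetical input: timestamped power readings for the past 14 days.
# The file name and the "timestamp"/"kwh" column names are assumptions.
readings = pd.read_csv("power_consumption.csv", parse_dates=["timestamp"])
readings = readings.set_index("timestamp").sort_index()

# A rolling 24-hour baseline captures the facility's normal daily rhythm.
baseline = readings["kwh"].rolling("24h").mean()
spread = readings["kwh"].rolling("24h").std()

# Flag readings more than 3 standard deviations from the recent baseline:
# a crude but immediate anomaly signal.
anomalies = readings[(readings["kwh"] - baseline).abs() > 3 * spread]
print(anomalies.tail())
```

Even a heuristic this simple turns a month-end surprise into a same-day alert; more sophisticated models simply sharpen the signal.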

That’s the end game for Manufacturing in the AI journey.

As widespread as Word

With razor-thin profit margins in manufacturing and an increasing need for companies to be agile, decision-makers must be able to perform analytics fast, ultimately in real time. Artificial Intelligence will have fulfilled its goal when that day comes: applying analytics will be as ubiquitous as using Microsoft Word to write documents.

One challenge is for people to “unlearn” some of the hype, the residue of the Peak of Inflated Expectations that Gartner’s hype cycle warns us about. It requires taking a step back and focusing on fundamentals. We recommend developing a roadmap that identifies the problem and a path to the desired results.

You want assets to generate more revenue without further investment or infrastructure upgrades. You don’t want to wait until the end of the month to realize you’ve had issues that drove energy costs sky-high.

You don’t want lagging indicators; you need leading indicators.

Follow the money

It’s all about following how assets impact the bottom line. Artificial Intelligence can map the problem, and with an asset performance management (APM) solution automatically connecting data with financial metrics, you can easily monitor performance, achieve business outcomes, and increase profit margins. Add in Machine Learning and it goes to a whole new level.
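As a back-of-the-envelope sketch of what “connecting data with financial metrics” can mean, the snippet below translates each asset’s energy overconsumption directly into dollars. The tariff, the expected baselines, and the asset names are all hypothetical, not data from any real APM deployment.

```python
# All figures here are hypothetical placeholders: the tariff, the expected
# baselines, and the asset names are not from any real APM deployment.
TARIFF_PER_KWH = 0.12  # assumed electricity price in USD

assets = {
    # asset id: (actual kWh this week, expected kWh this week)
    "pump_07": (4150, 3600),
    "chiller_2": (9800, 9750),
}

for name, (actual, expected) in assets.items():
    excess = max(0.0, actual - expected)
    cost = excess * TARIFF_PER_KWH
    print(f"{name}: {excess:.0f} kWh over baseline, ${cost:,.2f} lost")
```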

With software intelligently assessing the conditions that affect manufacturing processes, it can learn and provide humans with the right information at the right time to make decisions.

A pump gets too hot; sensors detect it and communicate with the monitoring software, which predicts what operations need to be shut down before worse damage occurs. The next step would be for the software to dispatch a technician with the details they need to get the pump up and running fast. This minimizes production downtime and keeps operations running efficiently.
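That scenario is essentially an event-driven rule. Here is a minimal sketch of the loop, with hypothetical thresholds, asset names, and a stand-in work-order type; a real system would derive the shutdown decision from a learned model rather than a fixed constant.

```python
from dataclasses import dataclass

# All names and numbers below are hypothetical. A real system would derive
# the shutdown decision from a learned model, not a fixed constant.
PREDICTED_FAILURE_TEMP_C = 85.0

@dataclass
class WorkOrder:
    asset: str
    detail: str

def on_temperature_reading(asset: str, temp_c: float, downstream: list[str]) -> WorkOrder | None:
    if temp_c < PREDICTED_FAILURE_TEMP_C:
        return None
    # Pause only the operations the overheating pump actually feeds,
    # rather than halting the whole line.
    for op in downstream:
        print(f"pausing {op} (protecting {asset})")
    # Dispatch a technician with the context they need to act quickly.
    return WorkOrder(asset=asset, detail=f"Overheat at {temp_c:.1f} C; inspect before restart.")

order = on_temperature_reading("pump_07", 91.2, downstream=["mixer_3", "filler_1"])
print(order)
```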

Less loss and greater efficiency equals more revenue.

A standard journey

Artificial Intelligence and its industrial applications are still relatively young. Exciting things will happen before the technology reaches its final destination, which for me will be when it becomes standard.

That doesn’t mean removing humans from the process. They’ll be making better decisions based on the best information, from whatever device, no matter where they’re located. Gartner’s hype cycle ends with the Plateau of Productivity: the point at which a technology is widely implemented, its place in the market is understood, and its benefits are realized.

For me, that’s when enough will be enough. Want to learn more about Artificial Intelligence applications? Download Plutoshift’s Strategic Application of AI whitepaper.

Influent Flow Forecasting Made Easy

Like the wastewater industry, most food and beverage manufacturing facilities are equipped with massive data systems to monitor and optimize a wide range of operations. These similarly regulated industries are increasingly adopting Artificial Intelligence (A.I.) to better manage their systems and procedures.

Though many water industry professionals recognize the potential of A.I., the public health stakes of delivering top-quality treated water, combined with aging production infrastructure, mean that municipal operators and engineers have not yet enjoyed the same benefits from these technologies.

Several large corporations have invested heavily in developing broad “solutions” to the challenges of the water industries. Yet these systems have been hit or miss, owing to the wide range of data streams and the particularities of individual plants.

For decades, water treatment process decisions have been made by plant operators using information spread across a wide range of systems. Calculations are often done by hand, and operators err on the side of caution to avoid a vast array of potential risks, often without regard to cost or process efficiency. Recognizing patterns in system behavior is nearly impossible when a rotating cast of staff administers multiple machines on an irregular basis.

What if there was a way to recognize the risks and achieve optimal efficiencies that could address the specific challenges faced by an individual plant, without additional infrastructure investment?

One of the many benefits of pairing machine learning with Artificial Intelligence, as Pluto AI does, is the ability to recognize differences in individual system behavior and processes, supporting more informed decisions that improve plant efficiency while controlling for potential risks.

Using the existing data from each individual plant, the EZ Influent Flow Predictor forecasts influent flow and detects anomalies to help operators anticipate future plant behavior and upcoming challenges. The machine learning core of our proprietary algorithms continuously learns from the existing data that affects incoming flow, while the Artificial Intelligence layer maps that data into actionable insights, helping operators determine the best course of action given the risk factors present.
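The algorithms behind the EZ Influent Flow Predictor are proprietary, but the general shape of the task can be sketched with a simple autoregressive model on synthetic data: learn the recent flow pattern, forecast the next readings, and flag large deviations as anomalies.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic hourly influent flow with a daily cycle plus noise. Real plant
# data, and the product's actual models, are far richer than this sketch.
rng = np.random.default_rng(0)
hours = np.arange(24 * 60)  # 60 days of hourly readings
flow = 10 + 3 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 0.4, hours.size)

# Autoregressive framing: predict the next reading from the past 24 hours.
LAGS = 24
X = np.array([flow[i : i + LAGS] for i in range(len(flow) - LAGS)])
y = flow[LAGS:]

# Train on all but the last 100 hours, then forecast the held-out window.
model = LinearRegression().fit(X[:-100], y[:-100])
forecast = model.predict(X[-100:])

# Large gaps between forecast and observation are the anomaly signal.
residuals = y[-100:] - forecast
anomalies = np.flatnonzero(np.abs(residuals) > 3 * residuals.std())
print(f"flagged {anomalies.size} anomalous hours out of 100")
```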

Our unique system of dashboard insights and alerts has helped customers achieve compliance and save thousands in operational costs. A pilot version of the EZ Influent Flow Predictor is available for free to a limited number of treatment plants; learn more about how to enroll.

Highlights from the 2018 Membrane Technology Conference

Back in March, I attended the opening day of the AWWA & AMTA Membrane Technology Conference in West Palm Beach, Florida to meet Pluto customers. I wanted to learn more about the challenges facing them and explore the new processes and solutions being employed to meet those challenges.

The conference opened with an inspiring keynote address given by Water for People CEO Eleanor Allen. Her speech offered a glimpse into the progress made through collaborative partnerships of social entrepreneurs around the world to provide potable water to the millions in need. Distinct from the technologically focused presentations given throughout the day, this talk was an uplifting reminder of the life-sustaining impact of the water industry’s products, services, and people.

After the lunch hour, Val Frenkel, Ph.D., PE, D.WRE, of Greeley and Hansen gave a thought-provoking presentation entitled “What We Don’t Know About RO.” Dr. Frenkel provided a comprehensive review of the history of RO systems and their introduction to the commercial market dating back to the 1970s. He discussed how specific system configurations enable different types of RO systems to achieve individual product-quality targets or meet the operating requirements of different applications.

Dr. Frenkel went on to describe membrane pretreatment as a cost-effective way to ensure integrity. Now that the performance of RO systems is no longer a question of achievability, the longevity and integrity of the RO membrane are the new focus for furthering system performance.

Another talk that stood out was Pierre Kwan of HDR’s presentation on the Basin Creeks membrane operation, “All-Gravity Membrane Filtration: Design and Operational Considerations.” Kwan described the almost certainly unique circumstance of a water reservoir sitting high enough above the plant to eliminate the expensive pumping usually required, but which created the complication of managing high pressure instead.

Building a sustainable operation under these conditions had several interesting ramifications. Alongside the gravity challenge came high water-quality requirements, and the two-stage membrane process the team implemented to meet both was impressive. The net result of this unique system design was a facility that consumes only 5% of the energy typically expected of a membrane plant. Kwan painted a vivid picture of how thoughtful, custom design can overcome geographical and infrastructure challenges; the result was a compelling talk about achieving energy efficiency in the face of adversity.

Overall, the advancements in membrane integrity analysis and the appetite for increased efficiency make this a rich area for predictive technologies. Pluto’s predictive analytics dashboard has helped several utilities and companies set convenient cleaning schedules and discover optimal points for normalization of RO membrane trains, typically with a 3-5x ROI. Click here for more information.
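For readers unfamiliar with membrane normalization, here is a toy sketch of the underlying idea: correct measured permeate flow for temperature so that a train can be compared against its own baseline over time, and trigger a cleaning when normalized flow declines past a rule-of-thumb cutoff. The correction constant and the 15% trigger are assumptions; real practice follows ASTM D4516 and membrane-vendor formulas, and this code does not describe Pluto’s dashboard.

```python
import math

# Toy sketch only: the correction constant and the 15% trigger are
# assumptions; real normalization follows ASTM D4516 and vendor formulas.

def temperature_correction_factor(temp_c: float, k: float = 2640.0) -> float:
    # Corrects measured permeate flow to a 25 C reference temperature.
    return math.exp(k * (1.0 / (273.15 + temp_c) - 1.0 / 298.15))

def normalized_flow(measured_gpm: float, temp_c: float) -> float:
    return measured_gpm * temperature_correction_factor(temp_c)

baseline = normalized_flow(measured_gpm=102.0, temp_c=25.0)  # post-startup reference
today = normalized_flow(measured_gpm=68.0, temp_c=18.0)

decline = 1.0 - today / baseline
if decline > 0.15:  # a common rule-of-thumb cleaning trigger
    print(f"normalized flow down {decline:.0%}: schedule a clean-in-place")
```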

Deep Learning and the Water Industry

For years, the water industry has been seen as a slow-moving sector that’s resistant to change, which makes it difficult for startups to propose creative solutions and iterate on them quickly. Water utilities are accumulating vast amounts of new data that could be used to create unforeseen jumps in operational efficiency and margins, but because the industry is reluctant to change its status quo, it’s hard for startups to build and test solutions. This creates an unfortunate barrier to modern technologies entering the water market. Why is this relevant now? Why should we care?

Winter is coming

After years of prolonging and promoting the status quo, time and change seem to be catching up with the industry. A change appears to be on the horizon, not only technological but also psychological. Two key elements have sparked this potential inflection point: 1) the rapid decay of our nation’s water infrastructure, and 2) the proliferation of low-cost internet-connected devices.

Pipes seem to work just fine. What’s the big deal?

A large portion of our nation’s water infrastructure is approaching, or has already passed, the end of its useful life. One might say: so what? Well, decaying infrastructure wastes water through leakage and pipe bursts, and it introduces harmful elements into the nation’s drinking water; look no further than the lead crisis in Flint, Michigan. Not only is it irresponsible to waste our most precious resource, it’s dangerous too.

Where’s the data?

In addition to replacing physical infrastructure elements like pipes, one might also wonder about the IT infrastructure. Luckily, thanks to Moore’s Law, we have seen an amazing increase in processing power coupled with an equally amazing decrease in prices, especially for hardware. The age of internet-connected devices is upon us: sensors, smart meters, and so on. This ecosystem of internet-connected devices is collectively referred to as the Internet of Things (IoT), and it allows the industry to collect, analyze, and act upon real-time data flowing into its IT systems.

How do we analyze that data?

Internet-connected devices generate a lot of data, continuously. One might wonder: why do we even need fancy techniques to analyze it? Why can’t we just use thresholding and call it a day? Well, the good ol’ way of using manual thresholds to make big business decisions is no longer sufficient; the complexity of modern data far exceeds what such simplistic techniques can handle. We need a machine that can analyze sequential data and extract relevant insights from it, one capable of adapting to shifting baselines, handling prolonged delays between cause and effect, learning to detect new anomalies, and so on. A human looking at spreadsheets and running manual processes is not going to help you manage modern infrastructure.

This is where Deep Learning becomes extremely relevant. People tend to think of it as dark magic, but it is actually a remarkably effective tool that understands sequential sensor data like no other technique before it. It’s beautiful in so many ways!
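To ground that claim, here is a minimal PyTorch sketch (illustrative, not production code) of the pattern deep learning enables: instead of a fixed threshold, an LSTM predicts the next sensor reading from recent context, and the anomaly signal is the gap between prediction and observation, which naturally tracks a drifting baseline. The data, window size, and model size are all arbitrary choices for the demonstration.

```python
import torch
import torch.nn as nn

# Illustrative sketch, not production code: an LSTM learns to predict the
# next sensor reading from recent context, so the anomaly signal (the gap
# between prediction and observation) adapts to a drifting baseline that
# would defeat any fixed threshold.
torch.manual_seed(0)

class SensorLSTM(nn.Module):
    def __init__(self, hidden: int = 32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                # x: (batch, time, 1)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])     # predict the next reading

# Synthetic series: a slow upward drift plus a daily cycle plus noise.
t = torch.arange(2000, dtype=torch.float32)
series = 0.002 * t + torch.sin(t / 24) + 0.1 * torch.randn(2000)

WINDOW = 48
X = torch.stack([series[i : i + WINDOW] for i in range(len(series) - WINDOW)]).unsqueeze(-1)
y = series[WINDOW:].unsqueeze(-1)

model = SensorLSTM()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
for _ in range(5):                       # a few full-batch steps, for illustration
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()

# The "threshold" is now relative: how far does reality deviate from what
# the model, having seen the recent sequence, expected?
with torch.no_grad():
    residual = (model(X[-1:]) - y[-1:]).abs().item()
print(f"latest residual: {residual:.3f}")
```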

Moving forward

As of right now, the world is only in the fourth inning of the IoT revolution, and the US water industry may be even further behind. That said, the future looks bright when one considers the power and responsiveness of the real-time monitoring that IoT devices offer. As the water industry’s analytical sophistication and mindset mature, utilities will be able to turn these data streams into predictive insights, not just reactive monitoring. Areas of opportunity include predictive asset management, anomaly detection, demand forecasting, and operational efficiency.