NEW REPORT: Creating Actionable Decisions in the Era of Digital Data

Data is crucial to the success of any business. To use data for actionable decision making, data analysis must deliver information that is timely, accurate and trusted. Many manual processes lead to inconsistent data and errors, causing cost overruns and imperiling regulatory compliance. To save time and improve efficiency, industries are leaning on digital transformation technologies as companies move to real-time data collection.

However, industrial manufacturing companies are still slow to adopt advanced technologies like AI and IoT. While companies have adopted these technologies in some departments, most have not implemented them across the enterprise, leaving gaps in productivity, efficiency and cost control.

Our new report, The Challenge of Turning Data into Action, found that nearly half (48%) of manufacturing companies use spreadsheets or other manual data entry documents. As a result, only 12% are able to take action on their data insights automatically.

The IIoT

A big player in the digital transformation of industrial manufacturing is Industrial IoT (IIoT), accounting for more than $178 billion in 2016 alone and proving critical to providing companies with a competitive edge, according to Forbes.

We found that 44% of manufacturing professionals stated that less than half of their company’s manufacturing process is outfitted with industrial IoT technology. But IoT can be a game changer for many businesses.

What’s At Stake?

Despite the importance of this data, older processes like spreadsheets or standalone solutions remain in many manufacturing organizations. Data collection for manufacturers focuses on quality assurance (67%), operational efficiency (64%), labor time (63%) and cost of materials (63%) – all vital aspects of maintaining efficiency in the manufacturing industry. But more than one-third of these manufacturing companies can’t make informed decisions on such matters because they don’t trust their data.

Turning Data into Action

Over three-quarters (76%) of respondents said that, in order to take immediate action based on collected data, they need software solutions that analyze data in real time. Manufacturers crave simple, reliable solutions to keep data timely and accurate. They noted that such a system could offer more efficient ways to communicate updates to people on the line, as well as put all data into a single platform so information can be viewed quickly and accurately.

PlutoShift offers an asset performance management platform for a variety of process industries, including food, beverage, water and chemicals. We bring together data on one easy-to-use platform, contextualizing the information and measuring the bottom-line financial impact.

Download the full report here to read more about the challenges facing the manufacturing industry and how PlutoShift services can streamline data collection.

To stay connected, follow us on LinkedIn and Twitter for more, and be sure to visit plutoshift.test.

Influent Flow Forecasting Made Easy

Like the wastewater industry, most food and beverage manufacturing facilities are equipped with massive data systems to monitor and optimize the wide range of operations. These similarly regulated industries are increasingly adopting Artificial Intelligence (A.I.) into their processes to better manage systems and procedures.

Though many water industry professionals recognize the potential of A.I., the public health implications of delivering top-quality treated wastewater, combined with aging production infrastructure, mean that municipal operators and engineers have not yet enjoyed the same benefits from these technologies.

Several large corporations have invested heavily to develop broad “solutions” to address the challenges of water production industries. Yet, these systems have been hit or miss due to the wide range of data streams and particularities within plants across the water industries.

For decades, water treatment process decisions have been made by plant operators based on information spread across a wide range of systems. Calculations are often made by hand, and cautious decisions are made to avoid the vast array of potential risks – often without regard to cost or process efficiency. Recognizing patterns in system behavior is nearly impossible when a variety of staff are tasked with administering multiple machines on an irregular basis.

What if there was a way to recognize the risks and achieve optimal efficiencies that could address the specific challenges faced by an individual plant, without additional infrastructure investment?

One of the many benefits of the marriage between machine learning and Artificial Intelligence, as utilized by Pluto AI, is the ability to recognize differences in individual system behavior and processes, enabling more informed decisions that improve plant efficiency while controlling for potential risks.

Utilizing the existing data from each individual plant, the EZ Influent Flow Predictor forecasts influent flow and detects anomalies to help operators predict future plant behavior and upcoming challenges. The machine learning component of our proprietary algorithms continuously analyzes and learns from the existing data that affects incoming flow, while the Artificial Intelligence component maps that data into actionable insights so operators can determine the best course of action given the risk factors present.
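To make the general idea concrete, here is a minimal sketch of forecasting plus anomaly flagging on historical flow data. It is not Pluto’s proprietary algorithm; the CSV file, column names and model choice are illustrative assumptions.

```python
# A generic sketch of flow forecasting plus anomaly flagging -- not
# PlutoShift's proprietary algorithm. The CSV file, column names and
# model choice are illustrative assumptions.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

df = pd.read_csv("influent_flow.csv", parse_dates=["timestamp"], index_col="timestamp")

# Lagged flow readings become the inputs for the next-step forecast.
for lag in (1, 2, 3, 24):
    df[f"flow_lag_{lag}"] = df["flow_mgd"].shift(lag)
df = df.dropna()

X, y = df.filter(like="flow_lag_"), df["flow_mgd"]
model = GradientBoostingRegressor().fit(X, y)

# Flag readings that sit far outside the model's expectation.
residuals = y - model.predict(X)
df["anomaly"] = residuals.abs() > 3 * residuals.std()
print(df.loc[df["anomaly"], ["flow_mgd"]].head())
```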

Our unique system of dashboard insights and alerts has helped customers achieve compliance and save thousands in operational costs. A pilot version of the EZ Influent Flow Predictor is available for free to a limited number of treatment plants; learn more about how to enroll.

Highlights from the 2018 Membrane Technology Conference

Back in March, I attended the opening day of the AWWA & AMTA Membrane Technology Conference in West Palm Beach, Florida to meet Pluto customers. I wanted to learn more about the challenges facing them and explore the new processes and solutions being employed to meet those challenges.

The conference opened with an inspiring keynote address given by Water for People CEO, Eleanor Allen. Her speech offered a glimpse into the progress made through collaborative partnerships of social entrepreneurs around the world to provide potable water to the millions in need. Distinct from the technologically-focused presentations given throughout the day, this talk was an uplifting reminder of the life-sustaining impact of the advancements and efforts of the water industry’s products, services, and people.

After the lunch hour, Val Frenkel Ph.D., PE, D.WRE, of Greeley and Hansen gave a thought-provoking presentation entitled “What We Don’t Know About RO.” Dr. Frenkel provided a comprehensive review of the history of RO systems and their introduction to the commercial market, dating back to the 1970s. He discussed how specific system configurations enable different types of RO systems to achieve individual targets of product quality or meet specific operating procedures for different applications.

Dr. Frenkel went on to describe pretreatment of membranes as a cost-effective way to ensure integrity. Now that the performance of RO systems is no longer a question of achievability, the longevity and integrity of the RO membrane is the new focus for furthering system performance.

Another talk that stood out was a presentation by Pierre Kwan of HDR on the Basin Creeks membrane operation, “All-Gravity Membrane Filtration: Design and Operational Considerations.” Kwan described an almost certainly unique circumstance: a water reservoir sits high enough above the plant to eliminate the expensive pumping usually required, but that elevation creates the complication of managing high pressure instead.

Building a sustainable operation under these conditions had several interesting ramifications. Along with the gravity challenge came a high water-quality requirement, and the two-stage membrane process implemented to meet it was impressive. The net result of this unique system design was that the facility consumed only 5% of the energy typically expected of a membrane plant. Kwan painted a vivid picture of how thoughtful, custom design can overcome geographical and infrastructure challenges; the result was a compelling speech about achieving energy efficiency in the face of adversity.

Overall, the advancements in membrane integrity analysis and the appetite for increasing efficiency make this a rich area for predictive technologies. Pluto’s predictive analytics dashboard has helped several utilities and companies determine convenient cleaning schedules and discover optimal points for normalization of RO membrane trains, typically with a 3-5x ROI. Click here for more information.

3 Reasons Why We Need Deep Learning For Water Analytics

Over the past few years, the business world seems to have entered a frenzy around buzzwords like “analytics,” “big data,” and “artificial intelligence.” There are two key elements to this phenomenon. First, the amount of data generated has exploded recently. Second, effective marketing schemes have created an “analytics” frenzy. In many cases, businesses and utilities don’t even know why they need or want hardware (sensors, meters) that will allow them to collect data every 15 seconds. Even when they do, they are not sure why they need an analytical software component to study the abundance of data. Business and utility managers simply want to increase revenue and decrease costs, but don’t care about the specifics.

Unfortunately, all this frenzy allows for the entry of charlatans who just want to create noise. Another problem is that this prevents end users from reaching their full business potential. Why is that? Because unsuspecting customers may end up purchasing poor analytics solutions. This forces them to conclude that analytics just doesn’t work, and they revert to their old, inefficient ways.

Aren’t all analytics solutions equivalent?

Not at all! This is true for a variety of reasons, but let’s go through some of the key attributes of the most popular analytics solutions provided to end users today. We promise not to go too far down the technical rabbit hole.

The most common technique that you’ll come across is conditional monitoring. This is just monitoring the values coming from sensors and taking action based on some simple thresholding. As you can imagine, this is not sufficient at all. Setting thresholds manually and hoping that nothing goes wrong is like walking blindfolded in the middle of a busy freeway.
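As a point of reference, a threshold-based monitor amounts to only a few lines of code; the sensor names and limits below are made-up examples, not values from any real plant.

```python
# Bare-bones threshold (conditional) monitoring. Sensor names and
# limits are made up for illustration.
LIMITS = {"turbidity_ntu": 1.0, "pressure_psi": 120.0}

def check_reading(sensor: str, value: float) -> None:
    limit = LIMITS.get(sensor)
    if limit is not None and value > limit:
        print(f"ALERT: {sensor} = {value} exceeds fixed limit {limit}")

check_reading("turbidity_ntu", 1.4)   # fires an alert
check_reading("pressure_psi", 95.0)   # stays silent
```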

How about extracting rolling stats?

Extracting rolling stats refers to calculating metrics in real time over a sliding time window. You can also extract things like variance or autocorrelation. But these metrics are very simplistic and don’t tell you much about the data. You will not be able to infer anything about cause and effect, which is where all the money is.
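For example, with a time-indexed pandas Series (the readings and window size below are illustrative), rolling statistics look like this:

```python
# Rolling statistics over a one-hour window on a time-indexed pandas
# Series. The readings and window size are illustrative.
import pandas as pd

flow = pd.Series(
    [10.2, 10.4, 10.1, 12.9, 10.3, 10.2],
    index=pd.date_range("2018-01-01", periods=6, freq="15min"),
)

rolling = flow.rolling("60min")
print(rolling.mean().iloc[-1])   # rolling average
print(rolling.var().iloc[-1])    # rolling variance
print(flow.autocorr(lag=1))      # lag-1 autocorrelation over the whole series
```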

You can get a bit more sophisticated and build autoregressive models that can analyze timestamped data. The problem with autoregressive models is that they assume the current output is a direct result of the previous ‘n’ values, and neither the value of ‘n’ nor the relationship with those values is allowed to evolve over time. Other machine learning techniques impose similar restrictions. It’s like forcing you to stick with the shoe size you wore when you were 12 years old. It’s not going to fit for the rest of your life!
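Here is a small sketch of that fixed-lag assumption using statsmodels’ AutoReg on synthetic data; the lag order is chosen once up front and never adapts.

```python
# Fitting a fixed-order autoregressive model with statsmodels. The lag
# order n is chosen up front and never adapts -- the limitation
# described above. The data here is synthetic.
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(0)
series = np.cumsum(rng.normal(size=200))   # stand-in for a sensor signal

n = 12                                     # the model only ever looks at the last 12 values
result = AutoReg(series, lags=n).fit()
forecast = result.predict(start=len(series), end=len(series) + 5)
print(forecast)
```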

One technique to rule them all

This is where Deep Learning becomes really relevant. If we were to summarize the shortcomings of all those techniques:

  • The time difference between cause and effect has to be small (and not variable)
  • The relationship of the current output (effect) with the previous input measurements (cause) is not allowed to evolve with time
  • The current output (effect) is not dependent on previous outputs (effect)

Deep Learning is really good at solving these problems. Due to the inherent nature of deep neural networks, very little manual intervention is required. This allows the engine to train itself very efficiently and solve these problems with high accuracy.
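As a rough illustration (not our production architecture), a recurrent network in Keras learns which past readings matter, and how, directly from the data instead of relying on a hand-picked lag order; the shapes, layer sizes and random training data below are placeholders.

```python
# A minimal recurrent-network sketch in Keras: the network learns which
# past readings matter and how, rather than relying on a hand-picked lag
# order. Shapes, layer sizes and the random training data are placeholders.
import numpy as np
from tensorflow import keras

# 1,000 training windows of 48 past readings, each predicting the next value.
X = np.random.rand(1000, 48, 1).astype("float32")
y = np.random.rand(1000, 1).astype("float32")

model = keras.Sequential([
    keras.layers.Input(shape=(48, 1)),
    keras.layers.LSTM(32),    # learns temporal structure from the data itself
    keras.layers.Dense(1),    # next-step forecast
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)

print(model.predict(X[:1]))   # forecast for one window, shape (1, 1)
```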

Moving forward

Now, when businesses and utilities need to solve difficult business intelligence problems, they will have an intelligent understanding of what analytics solutions can offer. As stated above, there are pros and cons to various solutions, but the technique which stands out as superior in efficacy, speed, and quality is Deep Learning. The good thing is that customers don’t need to know anything about Deep Learning in order to use it. All they need to know is that it’s like Winston Wolf from Pulp Fiction … It solves problems!