NEW REPORT: Creating Actionable Decisions in the Era of Digital Data

Data is crucial to the success of any business. To support actionable decision making, data analysis must deliver information that is timely, accurate and trusted. Manual processes lead to inconsistent data and errors, causing cost overruns and imperiling regulatory compliance. To save time and improve efficiency, industries are leaning on digital transformation technologies as companies move to real-time data collection.

However, industrial manufacturing companies are still slow to adopt advanced technologies like AI and IoT. While companies have adopted these technologies in some departments, most have not implemented them across the enterprise, leaving gaps in productivity, efficiency and cost control.

Our new report, The Challenge of Turning Data into Action, found that nearly half (48%) of manufacturing companies use spreadsheets or other manual data entry documents. As a result, only 12% are able to take action on their data insights automatically.

The IIoT

A big player in the digital transformation of industrial manufacturing is Industrial IoT (IIoT), accounting for more than $178 billion in 2016 alone and proving critical to providing companies with a competitive edge, according to Forbes.

We found that 44% of manufacturing professionals stated that less than half of their company’s manufacturing process is outfitted with industrial IoT technology. Yet IoT can be a game changer for many businesses.

What’s At Stake?

Despite the importance of this data, older processes like spreadsheets or standalone solutions remain in many manufacturing organizations. Data collection for manufacturers focuses on quality assurance (67%), operational efficiency (64%), labor time (63%) and cost of materials (63%), all vital aspects of maintaining efficiency in the manufacturing industry. Yet more than one-third of these manufacturing companies can’t make informed decisions on such matters due to a lack of trust in their data.

Turning Data into Action

Over three quarters (76%) of respondents said that, in order to take immediate action based on collected data, they need software solutions that analyze data in real time. Manufacturers crave simple, reliable solutions to keep data timely and accurate. They noted that such a system could offer more efficient ways to communicate updates to people on the line, as well as put all data into a single platform so information can be viewed quickly and accurately.

PlutoShift offers an asset performance management platform for a variety of process industries, including food, beverage, water and chemicals. We bring together data on one easy-to-use platform, contextualizing the information and measuring the bottom-line financial impact.

Download the full report here to read more about the challenges facing the manufacturing industry and how PlutoShift services can streamline data collection.

To stay connected, follow us on LinkedIn and Twitter for more, and be sure to visit plutoshift.test.

3 ways AI will amalgamate humans and machines to close the talent gap

The process manufacturing industry is experiencing a talent shortage that could potentially worsen with time if not proactively addressed. Faced with an aging population of workers, the manufacturing industry skills gap will only continue to widen if the industry as a whole does not do more to attract millennials and Generation Z.

When good operators retire, they take their operational knowledge with them because it was never digitized. With careers spanning 30 to 40 years, these operators are walking encyclopedias, and the industry has not done a great job of backfilling their positions or documenting this trove of expertise. An inexperienced operator manages assets and processes poorly, which leads to inefficient production in terms of energy, chemicals, and labor costs.

This is where Artificial Intelligence (AI) and emerging technologies can help bridge the skills gap. They allow process manufacturers to thrive and attract new and emerging talent.

Are you feeling the strain of talent management within your manufacturing operation? If so, consider the following three areas.

Fixing the “old school” mentality to address the skills gap

According to Deloitte, the manufacturing skills gap will widen so much by 2025 that it will create 3.4 million openings for skilled workers and 2 million of those roles will go unfilled. Driving this will be the 2.7 million workers that will retire or leave the industry, combined with the roughly 700,000 jobs that will be created by growth within the industry.

A perfect storm of factors and misperceptions is exacerbating the industry’s lack of appeal to the emerging workforce, which will ultimately hinder talent acquisition and retention. These factors include a perceived lack of innovation and the use of antiquated technology systems that are siloed from each other.

For example, a single process analyst at a bottling company can be responsible for maintaining assets at 12 plants. At each facility, there might be five different types of software systems that house data within complex and aging on-premise environments. As a result, industrial plant operators have had to make decisions based on years of on-the-job experience and cumbersome tools for monitoring asset performance. Today, however, with strong performance monitoring technology, that process analyst can extract data from those systems, determine whether it can solve the problem at hand, and then apply AI to analyze it and provide a greater level of data intelligence.

How can AI help bridge the legacy technology gap?

One of the biggest areas where AI can help immediately is by providing data that is mobile and on-demand. Our performance monitoring solution leverages AI to collect data from disparate, legacy systems, many of which emerging operators have no interest in learning because they are hard to use and not intuitively designed. For example, it is not uncommon for a process manufacturer to have sensor data located on one legacy platform, maintenance data in another, and financial data in a third system. This makes it difficult and time intensive to extract the data. Plutoshift connects all the data sources, extracts the relationships, and converts that data directly into actionable intelligence by surfacing relevant information at the right time.

AI can add years (of experience) to a person’s life

AI can help a less experienced engineer perform at a higher level. By collecting data across the organization, identifying trends, and discovering correlations, AI can live up to the second part of its name: Intelligence. After performing advanced analytics on the right kind of data, Plutoshift’s performance monitoring solution presents information that allows an engineer to make decisions intelligently. Forty years of on-the-job expertise are no longer required, because AI can fill that void. A common fear is that this means the person is no longer needed, but that person is absolutely still required to do the job; no piece of analytics software, no matter how insightful, can replace them. In fact, PwC reports that robotics and AI will create a net gain of 200,000 jobs in the U.K. alone by 2037.

The skills gap is a very real concern. The challenge in attracting the younger generation to the process industry is shedding the outdated notion that they’ll be working the factory floor like their grandparents did. The industry is evolving just like others. It is no longer about being covered in grease and carrying a big wrench to adjust machines. Today, it’s about leveraging advanced and emerging technologies capable of tasks that older generations couldn’t even imagine. It’s the promise of working with these innovations that will attract the next generation of the process industry workforce.

To learn more about how AI can help your organization, please read: 5 Things to Consider When Implementing Advanced Analytics for Industrial Processes

Influent Flow Forecasting Made Easy

Like the wastewater industry, most food and beverage manufacturing facilities are equipped with massive data systems to monitor and optimize the wide range of operations. These similarly regulated industries are increasingly adopting Artificial Intelligence (A.I.) into their processes to better manage systems and procedures.

Though many water industry professionals recognize the potential of A.I., municipal operators and engineers have not yet enjoyed the benefits of these technologies, owing to the public health implications of delivering top-quality treated water and to aging production infrastructure.

Several large corporations have invested heavily to develop broad “solutions” to address the challenges of water production industries. Yet, these systems have been hit or miss due to the wide range of data streams and particularities within plants across the water industries.

For decades, water treatment process decisions have been made by plant operators based on information spread across a wide range of systems. Calculations are often made by hand and cautious decisions are chosen to avoid the vast array of potential risks – often without regard to cost or process efficiencies. Recognition of patterns of system behavior is nearly impossible as a variety of staff are tasked with administration of multiple machines on an irregular basis.

What if there was a way to recognize the risks and achieve optimal efficiencies that could address the specific challenges faced by an individual plant, without additional infrastructure investment?

One of the many benefits of the marriage between machine learning and Artificial Intelligence, as utilized by Pluto AI, is the ability to recognize differences in individual system behavior and processes, enabling more informed decisions that improve plant efficiency while controlling for potential risks.

Utilizing the existing data from each individual plant, the EZ Influent Flow Predictor forecasts influent flow and detects anomalies to help operators predict future plant behavior and upcoming challenges. The machine learning aspect of our proprietary algorithms continuously analyzes and learns from the existing data that impacts incoming flow, while Artificial Intelligence maps out the data to provide actionable insights, helping operators determine the best course of action based on the potential risk factors present.
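
Plutoshift’s algorithms are proprietary, but the core idea of forecasting a flow series and flagging anomalies can be sketched with a simple rolling baseline. Everything below (the function name, window size, z-score threshold, and sample readings) is an illustrative assumption, not the product’s actual method:

```python
from statistics import mean, stdev

def forecast_and_flag(flows, window=4, z_threshold=2.0):
    """Naive rolling-baseline forecast with z-score anomaly flags.

    flows: list of influent flow readings (e.g. MGD), oldest first.
    Returns (forecast, is_anomaly) for each reading after the
    initial window.
    """
    results = []
    for i in range(window, len(flows)):
        history = flows[i - window:i]
        forecast = mean(history)   # next-step forecast = rolling mean
        spread = stdev(history)
        # Flag the observed value if it sits far outside recent behavior.
        is_anomaly = spread > 0 and abs(flows[i] - forecast) > z_threshold * spread
        results.append((round(forecast, 2), is_anomaly))
    return results

readings = [10.1, 10.3, 9.9, 10.2, 10.0, 18.5, 10.1]
print(forecast_and_flag(readings))  # the 18.5 spike is flagged
```

A real forecaster would learn seasonal patterns (diurnal flow cycles, wet-weather events) rather than use a flat rolling mean, but the flag-what-deviates structure is the same.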

Our unique system of dashboard insights and alerts has helped customers achieve compliance and save thousands in operational costs. A pilot version of the EZ Influent Flow Predictor is available for free to a limited number of treatment plants; learn more about how to enroll.

Four-Pronged Strategy For Asset Management In Water

At water treatment facilities, assets are critical to operations. These assets can be pumps, pipes, evaporators, chlorinators, and so on. Most inefficiencies, such as water leakage, monetary losses, or compliance-related fines, can be directly attributed to asset performance. So why don’t water facilities just replace assets when their efficiency drops? One of the biggest problems is that assets are very expensive; replacement is not an option until an asset completely fails. Given this situation, what can water facilities do to solve their problems?

What are the main problems?

Water and wastewater treatment facilities face enormous challenges when it comes to managing their operations, and these challenges represent significant expenses to operators. Some of the highest-ranking problems include asset health prediction, anomaly detection, performance forecasting, combined sewer overflow avoidance, and many more. Understanding asset health and learning how to predict it can open up a lot of doors, especially when assets can’t be replaced frequently.

Understanding the definition

Before we dig into asset health prediction, we need to understand asset management. What exactly is asset management anyway? Sounds like it’s just managing the assets, right? Well, there’s a lot more to it than that. When it comes to wastewater asset management, we need to be aware of all the variables that impact a particular asset’s health. Asset management includes the operation, maintenance, and replacement of assets on the critical path. For example, the critical path for a water utility will be retrieving, purifying, and distributing clean water. This path will include water pumps, water transportation pipes, stormwater sites, and many other components.

What exactly is the problem?

One of the first and foremost questions that comes to mind is: why can’t we just use simple thresholds to get alerted about assets? The problem is that the data is very unstructured. This data is usually a combination of numbers, free-form text, SCADA, ERP, event logs, and more. It’s usually referred to as a “data lake”. Extracting meaningful insights from this data lake takes several areas of expertise:

  • Automatic data processing engine to parse the data
  • Natural Language Processing to understand text
  • Time-series modeling to analyze sensor data
  • Predictive analytics for event prediction

In reference to the title of the post, these are the four prongs we need to build anything meaningful. Modern water facilities are gathering data using many different sources, so we need to make sure we use all that data to drive efficiency upwards.
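
As a toy illustration of the first prong, an automatic processing step might classify raw records from such a data lake before handing them to the NLP and time-series stages. The record formats and the `parse_record` helper here are hypothetical:

```python
import re

def parse_record(raw):
    """Classify a raw data-lake record (hypothetical formats).

    Numeric sensor readings go to time-series modeling; free-form
    text goes to NLP; everything else is left for manual review.
    """
    raw = raw.strip()
    # e.g. "pump_3 flow=12.7" -> structured sensor reading
    m = re.match(r"(\w+)\s+(\w+)=([-+]?\d+(?:\.\d+)?)$", raw)
    if m:
        asset, metric, value = m.groups()
        return {"route": "time_series", "asset": asset,
                "metric": metric, "value": float(value)}
    # Anything with multiple whitespace-separated words -> free-form text
    if len(raw.split()) > 1:
        return {"route": "nlp", "text": raw}
    return {"route": "review", "raw": raw}

records = ["pump_3 flow=12.7",
           "operator noted unusual vibration on clarifier 2",
           "ERR_0x41"]
for r in records:
    print(parse_record(r))
```

Real SCADA and ERP exports are far messier, but the principle is the same: route each record to the prong equipped to extract meaning from it.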
Okay, I understand the problem, but what’s the solution here?

We need a solution that can extract wisdom from this data lake consisting of large amounts of unstructured data. More importantly, we need wisdom that’s specific to water. We don’t need some generic “Artificial Intelligence platform” that uses the same model for many verticals like healthcare, energy, mining, and so on. Artificial Intelligence is an amazing tool that can solve really difficult problems, but only if we use it in the right way. Water is a very unique vertical that has a lot of nuances associated with it. An Artificial Intelligence solution that takes this into account when extracting wisdom will totally outperform a generic Artificial Intelligence platform. Artificial Intelligence deserves to be used in the right (and slightly constrained) way so that it can have a meaningful impact.

3 Reasons Why We Need Deep Learning For Water Analytics

Over the past few years, the business world has entered a frenzy around buzzwords like “analytics,” “big data,” and “artificial intelligence.” There are two key elements to this phenomenon. First, the amount of data generated has exploded. Second, effective marketing schemes have fueled the “analytics” frenzy. In many cases, businesses and utilities don’t even know why they need or want hardware (sensors, meters) that allows them to collect data every 15 seconds. Even when they do, they are not sure why they need an analytical software component to study the abundance of data. Business and utility managers simply want to increase revenue and decrease costs, and don’t care about the specifics.

Unfortunately, all this frenzy allows for the entry of charlatans who just want to create noise. It also prevents end users from reaching their full business potential. Why? Because unsuspecting customers may end up purchasing poor analytics solutions, conclude that analytics just doesn’t work, and revert to their old, inefficient ways.

Aren’t all analytics solutions equivalent?

Not at all! This is true for a variety of reasons, but let’s go through some of the key attributes of the most popular analytics solutions provided to end users today. We promise not to go too far down the technical rabbit hole.

The most common technique that you’ll come across is conditional monitoring. This is just monitoring the values coming from sensors and taking action based on some simple thresholding. As you can imagine, this is not sufficient at all. Setting thresholds manually and hoping that nothing goes wrong is like walking blindfolded in the middle of a busy freeway.
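
A minimal sketch of that thresholding approach (the bounds and readings below are made up) shows how little context it uses:

```python
def threshold_alerts(readings, low, high):
    """Conditional monitoring: flag any reading outside fixed bounds."""
    return [(t, value) for t, value in readings
            if value < low or value > high]

# Hypothetical pressure readings: (timestamp, psi)
readings = [(1, 52.0), (2, 54.5), (3, 71.2), (4, 53.8)]
print(threshold_alerts(readings, low=40.0, high=65.0))  # -> [(3, 71.2)]
```

The 71.2 reading is flagged, but a value drifting steadily from 52 toward 64.9 would sail through unnoticed, which is exactly the blindfold problem described above.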

How about extracting rolling stats?

Extracting rolling stats refers to calculating metrics in real time over a sliding time window. You can also extract things like variance or autocorrelation. But these metrics are simplistic and don’t tell you much about the data. You will not be able to infer anything about cause and effect, which is where all the money is.
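
A sketch of rolling stats over a sliding window (the sample values are hypothetical):

```python
from statistics import mean, variance

def rolling_stats(series, window):
    """Rolling mean and (sample) variance over a sliding time window."""
    out = []
    for i in range(window, len(series) + 1):
        chunk = series[i - window:i]
        out.append((mean(chunk), variance(chunk)))
    return out

flows = [10, 12, 11, 13, 40]
print(rolling_stats(flows, window=3))
```

The final window’s variance jumps because of the outlier 40, but nothing here says why it jumped: rolling stats describe the data without explaining it.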

You can get a bit more sophisticated and build autoregressive models that analyze timestamped data. The problem with autoregressive models is that they assume the current output is a direct result of the previous ‘n’ values, and neither the value of ‘n’ nor the relationship with those values is allowed to evolve over time. Other machine learning techniques impose similar restrictions. It’s like being forced to stick with the shoe size you wore when you were 12 years old: it’s not going to fit all your life!
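
A least-squares AR(1) fit, the simplest autoregressive model, makes that fixed-structure assumption concrete (the data and function name are illustrative):

```python
def fit_ar1(series):
    """Fit x[t] ~ a * x[t-1] by least squares (the simplest AR model).

    The lag structure is fixed up front: today's value is assumed to
    depend only on yesterday's, and the coefficient never evolves.
    """
    num = sum(series[t - 1] * series[t] for t in range(1, len(series)))
    den = sum(x * x for x in series[:-1])
    return num / den

series = [2.0, 4.0, 8.0, 16.0]   # doubles every step
a = fit_ar1(series)
print(a)                          # -> 2.0
print(a * series[-1])             # one-step forecast -> 32.0
```

The single coefficient `a` is frozen after fitting; if the underlying process changes, the model keeps predicting with the old relationship, which is the 12-year-old shoe size in code.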

One technique to rule them all

This is where Deep Learning becomes really relevant. If we were to summarize the shortcomings of all those techniques:

  • The time difference between cause and effect has to be small (and not variable)
  • The relationship of the current output (effect) with the previous input measurements (cause) is not allowed to evolve with time
  • The current output (effect) is not dependent on previous outputs (effect)

Deep Learning is really good at solving these problems. Due to the inherent nature of deep neural networks, there’s very little manual intervention. This allows the engine to train itself very efficiently and solve problems with high accuracy.
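
As a from-scratch miniature of that idea (one hidden layer rather than a deep stack, toy data, and all names and hyperparameters assumed), the sketch below trains a tiny neural network by gradient descent. The weights adjust themselves from the data, with no hand-set thresholds or fixed lag structure:

```python
import math, random

random.seed(0)

# Toy dataset: learn the nonlinear mapping y = x^2 from a few points.
data = [(x / 4.0, (x / 4.0) ** 2) for x in range(-4, 5)]

H = 3                                   # hidden units
w1 = [random.uniform(-0.5, 0.5) for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-0.5, 0.5) for _ in range(H)]
b2 = 0.0
lr = 0.05

def forward(x):
    h = [math.tanh(w1[j] * x + b1[j]) for j in range(H)]
    return h, sum(w2[j] * h[j] for j in range(H)) + b2

losses = []
for epoch in range(500):
    gw1 = [0.0] * H; gb1 = [0.0] * H
    gw2 = [0.0] * H; gb2 = 0.0
    loss = 0.0
    for x, y in data:
        h, y_hat = forward(x)
        err = y_hat - y
        loss += err * err / len(data)   # mean squared error
        d = 2.0 * err / len(data)       # dLoss/dy_hat
        for j in range(H):
            gw2[j] += d * h[j]
            dpre = d * w2[j] * (1.0 - h[j] ** 2)   # tanh derivative
            gw1[j] += dpre * x
            gb1[j] += dpre
        gb2 += d
    # Gradient-descent update: the network trains itself from data.
    for j in range(H):
        w1[j] -= lr * gw1[j]; b1[j] -= lr * gb1[j]
        w2[j] -= lr * gw2[j]
    b2 -= lr * gb2
    losses.append(loss)

print(losses[0], losses[-1])            # training error shrinks
```

Production systems use deep, multi-layer networks and far more data, but the mechanism is the same: the model learns its own representation of the input-output relationship instead of having one imposed on it.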

Moving forward

Now, when businesses and utilities need to solve difficult business intelligence problems, they will have a clearer understanding of what analytics solutions can offer. As stated above, there are pros and cons to the various solutions, but the technique that stands out as superior in efficacy, speed, and quality is Deep Learning. The good thing is that customers don’t need to know anything about Deep Learning in order to use it. All they need to know is that it’s like Winston Wolf from Pulp Fiction … It solves problems!