5 ways for business leaders to package AI into bite-sized pieces

Large companies have been tackling the issue of AI for the last few years. Business leaders are often faced with the problem of figuring out how to use this technology in a practical way. Any new technology needs to be packaged into bite-sized pieces to show that it works. These “success templates” can then be used to drive enterprise-wide adoption. But should companies do it all at once? How do you ensure that you’re not boiling the ocean? How can a company package AI into bite-sized pieces so that its teams can consume it? From what we’ve worked on with our customers and seen in the market, there are 5 steps to do it:

1. Start with the use case
It always starts with a use case. Before launching any AI initiative, the question you should ask is whether or not there’s a burning need today. A need qualifies as “burning” if it has a large impact on your business. If solved, it can directly increase revenue and/or margins for the company. We need to describe this burning need in the form of a use case. These use cases are actually very simple to describe as shown below:
– “We’re using too much electricity to make our beverage product”
– “We’re taking too long to fix our pumps when someone files a support ticket”
– “We’re spending a large amount of money on chemicals to clean our water”

2. Pick a data workflow that’s specific to an operation
Once you figure out the use case, the next step is to figure out the data workflow. A data workflow is a series of steps that a human would take to transform raw data into useful information. Instead of figuring out a way to automate all the workflows across the entire company, you should pick a workflow that’s very specific to an operation. This allows you to understand what it takes to get something working. We conducted a survey of 500 professionals to get their take on this and we found 78% felt supported by their team leaders when they embarked on this approach. Here’s the full report: Instruments of Change: Professionals Achieving Success Through Operation-Specific Digital Transformation
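
To make this concrete, here is a minimal sketch of what an operation-specific workflow might look like in code, mirroring the “too long to fix our pumps” use case above. The file name, columns, and site are purely illustrative assumptions.

```python
import pandas as pd

# Hypothetical example: raw maintenance tickets for the pumps at one plant.
tickets = pd.read_csv("pump_tickets.csv", parse_dates=["opened_at", "closed_at"])

# Step 1: narrow to the one operation this workflow serves (a single site and asset type).
tickets = tickets[(tickets["site"] == "plant_a") & (tickets["asset_type"] == "pump")]

# Step 2: derive the metric a human would otherwise compute by hand.
tickets["hours_to_fix"] = (tickets["closed_at"] - tickets["opened_at"]).dt.total_seconds() / 3600

# Step 3: turn raw data into information someone can act on (weekly average repair time).
weekly = tickets.set_index("opened_at")["hours_to_fix"].resample("W").mean()
print(weekly.tail())
```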

3. Be selective with data
Once you pick a workflow, you need to understand what specific data is going to support this particular workflow. If you try to digest all available data, it leads to chaos and suboptimal outcomes. If you’re disciplined about what data you need, it will drive focus on the outcomes and ensure that the project is manageable.

4. Create a benefits scorecard collaboratively
The main reason you’re deploying AI is to drive a specific outcome. This outcome should be measurable and should have a direct impact on the business. You should include all stakeholders in creating a benefits scorecard. The people implementing the AI solution should hold themselves accountable to this benefits scorecard. The time to realize those benefits should be short, e.g. 90 days.
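
As an illustration only, a benefits scorecard can be as simple as a shared data structure that names each metric, its baseline, its target, its owner, and the deadline. The metrics and numbers below are made up.

```python
# Illustrative benefits scorecard; every stakeholder can see the target, owner, and deadline.
benefits_scorecard = [
    {"metric": "energy cost per unit produced ($)", "baseline": 1.00, "target": 0.92,
     "owner": "plant operations", "due_in_days": 90},
    {"metric": "mean time to repair (hours)", "baseline": 36, "target": 24,
     "owner": "maintenance", "due_in_days": 90},
]

for item in benefits_scorecard:
    improvement = (item["baseline"] - item["target"]) / item["baseline"]
    print(f'{item["metric"]}: {improvement:.0%} improvement target, owned by {item["owner"]}')
```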

5. Have the nuts-and-bolts in place that enable you to scale
Let’s say you successfully execute on this PoC. What’s next? You should be able to replicate it with more use cases across the company. There’s no point in doing this if the approach is not scalable. Make sure you have a data platform that supports deploying a wide range of use cases. The nuts-and-bolts of the platform should enable you to compose many workflows with ease. What does “nuts-and-bolts” include? It includes automating all the work related to data — checking data quality, processing data, transforming data, storing data, retrieving data, visualizing data, keeping it API-ready, and validating data integrity.
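
As a rough sketch of what “composing workflows with ease” can mean in practice, the snippet below chains small, reusable data steps into a pipeline. The function names and column names are assumptions, not a description of any particular platform.

```python
import pandas as pd

def check_quality(df: pd.DataFrame) -> pd.DataFrame:
    """Drop rows with missing readings and discard obviously bad values."""
    df = df.dropna()
    return df[df["reading"] >= 0]

def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Resample raw sensor readings to hourly averages."""
    return df.set_index("timestamp").resample("h").mean(numeric_only=True)

def compose(*steps):
    """Chain steps so a new use case can reuse the same building blocks."""
    def pipeline(df: pd.DataFrame) -> pd.DataFrame:
        for step in steps:
            df = step(df)
        return df
    return pipeline

# A new workflow is just a new composition of existing pieces.
energy_workflow = compose(check_quality, transform)
```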

Building vs Buying AI Platforms: 6 Tradeoffs To Consider

Companies that produce physical goods and services have been around for a long time. These companies drive our economy. If you’re one such company, your core goal is to get your customers to love your primary offering. Your aim is to produce these goods/services at a low cost, sell them at a higher price, and keep increasing your market share.  

A company needs many tools to manage its business. Even though the core focus of a company should be its primary offering, people within the company often feel the need to embark on efforts to build those tools internally. One such example is building AI platforms from scratch. If a company needs a tool on a daily basis, they’re tempted to build it in-house and own it. 

They can certainly build the tool, but is it a good use of their time? Another group within the company feels that the build-it-yourself path is too long and that they should just buy existing software products from the market to run their business. How do you evaluate if you want to build it yourself or buy an existing software product? How do you ensure that you’re positioning the company for success? From our experience, we’ve seen that there are 6 tradeoffs to consider before making that decision:

#1. Upfront investment
Creating enterprise-grade AI software is a huge undertaking. This software needs to work across thousands of employees and a variety of use cases. Building it in-house would be a multi-million dollar project. In addition to that, there will be ongoing maintenance costs. On the other hand, buying that software from the market would mean the upfront cost would be low and the ongoing maintenance costs would be minimal.

#2. Time sink
Companies that can afford to consider building AI software in-house are usually large. There are many groups and budgets to consider. From what we’ve seen, it takes 3 years to go from conceptual idea to production-ready software. It means that this asset won’t generate any return-on-investment in the first 3 years. In the meantime, your competitors would have already introduced a solution in the market by integrating an existing AI tool into their offering.

#3. Talent churn
A company can attract top talent for areas that drive its core business, but it will face difficulties in attracting top talent for AI software. Even if it hires software talent, the churn will be high. Due to this churn, the software that is built in-house will become clunky over time because nobody has a complete understanding of what’s happening. This will render the asset useless because people internally can’t (or won’t) use it.

#4. Being the integrator vs being the creator
Over the last 10 years, I’ve seen that successful companies are integrators of software tools. They bring the right software pieces into their architecture to drive their business forward. This is in contrast with being the creator of all the different pieces. For a company whose primary product is not cloud-based software, you’ll position yourself for success if you invest your efforts in understanding how to choose the right software as opposed to figuring out how to build everything from scratch.

#5. Core focus vs everything else
Successful companies have a fanatical focus on their core product to the exclusion of everything else. Their expertise in this area enables them to generate high ROI. For everything else, they get other firms to do the work. If the company does the work in these areas, their ROI would be very low. For example, an eCommerce company shouldn’t invest time in figuring out how to build their own water treatment plant just because their thousands of employees drink water every day. Not a good use of their time!

#6. Competitive advantage
AI software shouldn’t be looked upon as an asset that sits outside the business and generates returns independent of your core business. This is especially relevant to services companies. AI software gives you a competitive advantage that has a direct impact on your core business.

Having built AI systems over the years, I’ve learned that architecting is the hard part when it comes to data and cloud software. Anticipating how the data behaves today as well as in the future is a key component of architecting a solution that can accommodate everything. A simple mistake today will compound over time and will render this asset useless in the face of change. Companies should invest in learning how to identify good architects. This will enable them to identify good partners and get them to do the work across these areas.

NEW REPORT: Creating Actionable Decisions in the Era of Digital Data

Data is crucial for the success of any business. In order to utilize the data for actionable decision making, data analysis must deliver information that is timely, accurate and trusted. Many manual processes lead to inconsistent data and errors, causing cost overruns and imperiling regulatory compliance. To save time and aid in efficiency, industries are leaning on digital transformation technologies as companies make the move to real-time data collection.

However, industrial manufacturing companies are still slow to adopt advanced technologies like AI and IoT. While companies have adopted these technologies for some departments, most have not implemented them across the enterprise, leaving gaps of productivity, efficiency and cost control.

Our new report, The Challenge of Turning Data into Action, found that nearly half (48%) of manufacturing companies use spreadsheets or other manual data entry documents. As a result, only 12% are able to take action on their data insights automatically.

The IIoT

A big player in the digital transformation of industrial manufacturing is Industrial IoT (IIoT), accounting for more than $178 billion in 2016 alone and proving critical to providing companies with a competitive edge, according to Forbes.

We found that 44% of manufacturing professionals stated that less than half of their company’s manufacturing process is outfitted with industrial IoT technology. Yet IIoT can be a game changer for many businesses.

What’s At Stake?

Despite the importance of this data, older processes like spreadsheets or standalone solutions remain in many manufacturing organizations. Data collection for manufacturers focuses on quality assurance (67%), operational efficiency (64%), labor time (63%) and cost of materials (63%) — all vital aspects of maintaining efficiency in the manufacturing industry. Yet more than one-third of these manufacturing companies can’t make informed decisions on such matters due to a lack of trust in the data.

Turning Data into Action

Over three quarters (76%) of respondents said in order to take immediate action based on collected data, they need software solutions that analyze data in real-time. Manufacturers crave simple, reliable solutions to keep data timely and accurate. They noted that such a system could offer more efficient ways to communicate updates to people on the line as well as put all data into a single platform so information can be viewed quickly and accurately.

Plutoshift offers an asset performance management platform for a variety of process industries, including food, beverage, water and chemicals. We bring together data on one easy-to-use platform, contextualizing the information and measuring the bottom-line financial impact.

Download the full report here to read more about the challenges facing the manufacturing industry and how Plutoshift’s services can streamline data collection.

To stay connected, follow us on LinkedIn and Twitter for more, and be sure to visit plutoshift.test.

3 ways AI will amalgamate humans and machines to close the talent gap

The process manufacturing industry is experiencing a talent shortage that could potentially worsen with time if not proactively addressed. Faced with an aging workforce, the manufacturing industry will only see its skills gap continue to widen if the industry as a whole does not do more to attract millennials and Generation Z.

When good operators retire, they take all of their operational knowledge with them due to a lack of digitized documentation. With careers spanning 30 to 40 years, these operators are walking encyclopedias of knowledge, and companies lose those knowledge bases when they retire. The industry has not done a great job of backfilling these positions and documenting this trove of expertise. An inexperienced operator manages assets and processes poorly, which leads to inefficient production in terms of energy, chemicals, and labor costs.

This is where Artificial Intelligence (AI) and emerging technologies can help bridge the skills gap. They allow process manufacturers to thrive and attract new and emerging talent.

Are you feeling the strain of talent management within your manufacturing operation? If so, consider the following three areas.

Fixing the “old school” mentality to address the skills gap

According to Deloitte, the manufacturing skills gap will widen so much by 2025 that it will create 3.4 million openings for skilled workers and 2 million of those roles will go unfilled. Driving this will be the 2.7 million workers that will retire or leave the industry, combined with the roughly 700,000 jobs that will be created by growth within the industry.

A perfect storm of factors and misperceptions is potentially exacerbating the industry’s lack of appeal to the emerging workforce. This will ultimately hinder talent acquisition and retention. Reasons include the perceived lack of innovation and the use of antiquated technology systems that are siloed from each other.

For example, a single process analyst at a bottling company can be responsible for maintaining assets at 12 plants. At each facility, there might be five different types of software systems that house data within complex and aging on-premise environments. As a result, industrial plant operators have had to make decisions based on years of on-the-job experience and cumbersome tools for monitoring the performance of assets. Today, however, with strong performance monitoring technology, that process analyst can extract data from these systems, determine whether the data can solve their particular problems, and then apply AI to analyze it and provide a greater level of data intelligence.

How can AI help bridge the legacy technology gap?

One of the biggest areas where AI can help immediately is by providing data that is mobile and on-demand. Our performance monitoring solution leverages AI to collect data from disparate, legacy systems, many of which emerging operators have no interest in learning because they are hard to use and not intuitively designed. For example, it is not uncommon for a process manufacturer to have sensor data located on one legacy platform, maintenance data in another, and financial data in a third system. This makes it difficult and time intensive to extract the data. Plutoshift connects all the data sources, extracts the relationships, and converts that data directly into actionable intelligence by surfacing relevant information at the right time.
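
To illustrate the kind of stitching described above (not Plutoshift’s actual implementation), the sketch below joins sensor, maintenance, and financial exports into a single view. File names and columns are assumptions.

```python
import pandas as pd

sensors = pd.read_csv("sensor_history.csv", parse_dates=["timestamp"])         # legacy historian export
maintenance = pd.read_csv("maintenance_log.csv", parse_dates=["serviced_at"])  # CMMS export
costs = pd.read_csv("asset_costs.csv")                                          # financial system export

# Align each sensor reading with the most recent maintenance event for that asset.
merged = pd.merge_asof(
    sensors.sort_values("timestamp"),
    maintenance.sort_values("serviced_at"),
    left_on="timestamp", right_on="serviced_at",
    by="asset_id", direction="backward",
)

# Attach cost context so performance and money can be viewed side by side.
merged = merged.merge(costs, on="asset_id", how="left")
```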

AI can add years (of experience) to a person’s life

AI can help a less experienced engineer perform at a higher level. By collecting data across the organization, identifying trends, and discovering correlations, AI can then focus on living up to the second part of its name: intelligence. After performing advanced analytics on the right kind of data, Plutoshift’s performance monitoring solution presents information to an engineer in a way that allows them to make decisions intelligently. No longer are 40 years of on-the-job expertise required, because AI’s capabilities can fill that void. A common fear is that this makes the person redundant; it doesn’t. That person is absolutely still required to do the job, and no piece of analytics software, no matter how insightful, can replace them. In fact, PwC reports that robotics and AI will create a net gain of 200,000 jobs in the U.K. alone by 2037.

The skills gap is a very real concern. The challenge in attracting the younger generation to work in the process industry is shedding the outdated notion that they’ll be working the factory floor like their grandparents did. The industry is evolving just like others. It is no longer about being covered in grease and carrying a big wrench to adjust machines. Today, it’s about leveraging advanced and emerging technologies capable of tasks that older generations couldn’t even imagine. It’s the promise of working with these innovations that will attract the next generation of the process industry workforce.

To learn more about how AI can help your organization, please read: 5 Things to Consider When Implementing Advanced Analytics for Industrial Processes

Influent Flow Forecasting Made Easy

Like the wastewater industry, most food and beverage manufacturing facilities are equipped with massive data systems to monitor and optimize a wide range of operations. These similarly regulated industries are increasingly adopting Artificial Intelligence (A.I.) into their processes to better manage systems and procedures.

Though many water industry professionals recognize the potential of A.I., the public health stakes of wastewater treatment, combined with aging production infrastructure, mean that municipal operators and engineers have not yet enjoyed the same benefits from these technologies.

Several large corporations have invested heavily to develop broad “solutions” to address the challenges of water production industries. Yet, these systems have been hit or miss due to the wide range of data streams and particularities within plants across the water industries.

For decades, water treatment process decisions have been made by plant operators based on information spread across a wide range of systems. Calculations are often made by hand and cautious decisions are chosen to avoid the vast array of potential risks – often without regard to cost or process efficiencies. Recognition of patterns of system behavior is nearly impossible as a variety of staff are tasked with administration of multiple machines on an irregular basis.

What if there was a way to recognize the risks and achieve optimal efficiencies that could address the specific challenges faced by an individual plant, without additional infrastructure investment?

One of the many benefits of the marriage between machine learning and Artificial Intelligence, as utilized by Pluto AI, is the ability to recognize differences in individual system behavior and processes, make more informed decisions, and improve plant efficiencies while controlling for potential risks.

Utilizing the existing data from each individual plant, the EZ Influent Flow Predictor will forecast influent flow and detect anomalies to help operators predict future plant behavior and upcoming challenges. The machine learning component of our proprietary algorithms continuously learns from the existing data that impacts incoming flow, and the Artificial Intelligence component maps out that data to provide actionable insights, helping operators determine the best course of action based on the potential risk factors present.
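
The actual algorithms behind the EZ Influent Flow Predictor are proprietary, but a deliberately simple sketch of the two ideas (forecasting influent flow from historical data and flagging anomalies) might look like this. The file, columns, and thresholds are illustrative.

```python
import pandas as pd

flow = (pd.read_csv("influent_flow.csv", parse_dates=["timestamp"])
          .set_index("timestamp")["flow_mgd"])
hourly = flow.resample("h").mean()

# Naive seasonal forecast: tomorrow's flow at each hour ~ average of that hour over the last 14 days.
recent = hourly[hourly.index >= hourly.index.max() - pd.Timedelta(days=14)]
forecast_by_hour = recent.groupby(recent.index.hour).mean()

# Anomaly flag: readings far outside the recent rolling band.
rolling_mean = hourly.rolling("24h").mean()
rolling_std = hourly.rolling("24h").std()
anomalies = hourly[(hourly - rolling_mean).abs() > 3 * rolling_std]
```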

Our unique system of dashboard insights and alerts has helped customers achieve compliance and save thousands in operational costs. A pilot version of the EZ Influent Flow Predictor is available for free to a limited number of treatment plants; learn more about how to enroll.

Four-Pronged Strategy For Asset Management In Water

At water treatment facilities, assets are critical to operations. These assets can be pumps, pipes, evaporators, chlorinators, and so on. Most inefficiencies, such as water leakage, monetary losses, or compliance-related fines, can be directly attributed to asset performance. So why don’t water facilities just replace assets when their efficiency drops? One of the biggest problems here is that assets are very expensive. Replacing one is not an option until it completely fails. Given this situation, what can water facilities do to solve their problems?

What are the main problems?

Water and wastewater treatment facilities face enormous challenges when it comes to managing their operations. These challenges represent significant expenses to operators. Some of the highest-ranking problems include asset health prediction, anomaly detection, performance forecasting, and combined sewer overflow avoidance, among many others. Understanding asset health and learning how to predict it can open up a lot of doors, especially when assets can’t be replaced frequently.

Understanding the definition

Before we dig into asset health prediction, we need to understand asset management. What exactly is asset management anyway? Sounds like it’s just managing the assets, right? Well, there’s a lot more to it than that. When it comes to wastewater asset management, we need to be aware of all the variables that impact a particular asset’s health. It includes the operation, maintenance, and replacement of assets on the critical path. For example, the critical path for a water utility will be retrieving, purifying, and dispersing clean water. This path will include water pumps, water transportation pipes, stormwater sites, and many other components.

What exactly is the problem?

One of the first and foremost questions that comes to mind is — What’s the big deal here? Why can’t we just use simple thresholds to get alerted about assets? The problem is that the data is very unstructured. This data is usually a combination of numbers, free form text, SCADA, ERP, event logs, and more. It’s usually referred to as a “data lake”. Extracting meaningful insights from this data lake takes several areas of expertise like:

  • Automatic data processing engine to parse the data
  • Natural Language Processing to understand text
  • Time-series modeling to analyze sensor data
  • Predictive analytics for event prediction

In reference to the title of the post, these are the four prongs we need to build anything meaningful. Modern water facilities are gathering data from many different sources, so we need to make sure we use all that data to drive efficiency upwards. A minimal skeleton of how these prongs fit together is sketched below.
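
Each function below is a placeholder, shown only to suggest how the four prongs would fit together; none of it is a working implementation.

```python
def parse_raw_records(raw_logs):
    """Prong 1 (automatic data processing): split mixed SCADA/ERP/event-log records."""
    return [line.split(",") for line in raw_logs]

def extract_text_signals(work_order_notes):
    """Prong 2 (natural language processing): flag failure-related free-form text."""
    keywords = ("leak", "vibration", "overheat")
    return [any(k in note.lower() for k in keywords) for note in work_order_notes]

def model_time_series(sensor_values, window=24):
    """Prong 3 (time-series modeling): a rolling average stands in for a real model."""
    return [sum(sensor_values[max(0, i - window):i + 1]) / (min(i, window) + 1)
            for i in range(len(sensor_values))]

def predict_failure(features):
    """Prong 4 (predictive analytics): a classifier trained on labeled failure history goes here."""
    raise NotImplementedError("train a model on historical asset-failure data")
```
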
Okay I understand the problem, but what’s the solution here?

We need a solution that can extract wisdom from this data lake consisting of large amounts of unstructured data. More importantly, we need wisdom that’s specific to water. We don’t need some generic “Artificial Intelligence platform” that uses the same model for many verticals like healthcare, energy, mining, and so on. Artificial Intelligence is an amazing tool that can solve really difficult problems, but only if we use it in the right way. Water is a very unique vertical that has a lot of nuances associated with it. An Artificial Intelligence solution that takes this into account when extracting wisdom will totally outperform a generic Artificial Intelligence platform. Artificial Intelligence deserves to be used in the right (and slightly constrained) way so that it can have a meaningful impact.

3 Reasons Why We Need Deep Learning For Water Analytics

Over the past few years, the business world has seemed to enter a frenzy around buzzwords like “analytics,” “big data,” and “artificial intelligence.” There are two key elements to this phenomenon. First, the amount of data generated has exploded recently. Second, effective marketing schemes have created an “analytics” frenzy. In many cases, businesses and utilities don’t even know why they need or want hardware (sensors, meters) that will allow them to collect data every 15 seconds. Even when they do, they are not sure why they need an analytical software component to study the abundance of data. Business and utility managers simply want to increase revenue and decrease costs, and don’t care about the specifics.

Unfortunately, all this frenzy allows for the entry of charlatans who just want to create noise. Another problem is that it prevents end users from reaching their full business potential. Why is that? Because unsuspecting customers may end up purchasing poor analytics solutions. This forces them to conclude that analytics just doesn’t work, and they revert to their old, inefficient ways.

Aren’t all analytics solutions equivalent?

Not at all! This is true for a variety of reasons, but let’s go through some of the key attributes of the most popular analytics solutions provided to end users today. We promise to not go too far down the technical rabbit hole.

The most common technique that you’ll come across is conditional monitoring. This is just monitoring the values coming from sensors and taking action based on some simple thresholding. As you can imagine, this is not sufficient at all. Setting thresholds manually and hoping that nothing goes wrong is like walking blindfolded in the middle of a busy freeway.
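
In code, conditional monitoring really is this simple; the limit below is a made-up number that someone has to pick by hand, which is exactly the weakness.

```python
# Conditional monitoring in its simplest form: a fixed, hand-picked threshold.
PRESSURE_LIMIT_PSI = 85.0  # illustrative value

def check_reading(pressure_psi: float) -> str:
    return "ALARM" if pressure_psi > PRESSURE_LIMIT_PSI else "OK"

print(check_reading(92.3))  # "ALARM", but only because someone guessed the limit well
```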

How about extracting rolling stats?

Extracting rolling stats refers to calculating metrics in real time over a sliding time window. You can also extract things like variance or autocorrelation. But these metrics are very simplistic and don’t tell you much about the data. You will not be able to infer anything about cause and effect, which is where all the money is.
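
For reference, rolling statistics over a time window take only a few lines with a library like pandas; the file and column names here are assumptions.

```python
import pandas as pd

readings = (pd.read_csv("sensor_readings.csv", parse_dates=["timestamp"])
              .set_index("timestamp"))

rolling_mean = readings["flow"].rolling("1h").mean()
rolling_var = readings["flow"].rolling("1h").var()
# Lag-1 autocorrelation over the most recent 60 samples.
lag1_autocorr = readings["flow"].rolling(60).apply(lambda x: x.autocorr(lag=1), raw=False)
```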

You can get a bit more sophisticated and build autoregressive models that can analyze timestamped data. The problem with autoregressive models is that they assume the current output is a direct result of the previous ‘n’ values. And neither the value of ‘n’ nor the relationship with those values is allowed to evolve over time. Other machine learning techniques impose similar restrictions. It’s like being forced to stick with the shoe size you wore when you were 12 years old. It’s not going to fit all your life!
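
A sketch of the autoregressive approach, using statsmodels, makes the rigidity visible: the lag order ‘n’ is fixed once, up front, and never changes. The data source and the value of n are illustrative.

```python
import pandas as pd
from statsmodels.tsa.ar_model import AutoReg

flow = (pd.read_csv("influent_flow.csv", parse_dates=["timestamp"])
          .set_index("timestamp")["flow_mgd"])

n = 24  # chosen once, up front; the model cannot revise it as conditions change
model = AutoReg(flow, lags=n).fit()
next_day = model.predict(start=len(flow), end=len(flow) + 23)  # forecast the next 24 values
```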

One technique to rule them all

This is where Deep Learning becomes really relevant. If we were to summarize the shortcomings of all those techniques:

  • The time difference between cause and effect has to be small (and not variable)
  • The relationship of the current output (effect) with the previous input measurements (cause) is not allowed to evolve with time
  • The current output (effect) is not dependent on previous outputs (effect)

Deep Learning is really good at solving these problems. Due to the inherent nature of deep neural networks, there’s very little manual intervention. This allows the engine to train itself very efficiently and solve problems with high accuracy.
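
As a minimal illustration (not a production recipe), a small recurrent network in Keras learns the relationship between a window of past readings and the next value from the data itself, rather than from hand-set thresholds or a fixed lag structure. Shapes and hyperparameters below are arbitrary placeholders, with random data standing in for real sensor history.

```python
import numpy as np
from tensorflow import keras

# Placeholder data: 1000 training windows of 48 past readings each, predicting the next value.
X = np.random.rand(1000, 48, 1)
y = np.random.rand(1000, 1)

model = keras.Sequential([
    keras.layers.Input(shape=(48, 1)),
    keras.layers.LSTM(32),   # learns which parts of the window matter, and how that evolves
    keras.layers.Dense(1),   # predicted next value
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
```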

Moving forward

Now, when businesses and utilities need to solve difficult business intelligence problems, they will have an intelligent understanding of what analytics solutions can offer. As stated above, there are pros and cons to various solutions, but the technique which stands out as superior in efficacy, speed, and quality is Deep Learning. The good thing is that customers don’t need to know anything about Deep Learning in order to use it. All they need to know is that it’s like Winston Wolf from Pulp Fiction … It solves problems!