A Day in the Life: Where an Industrial Operator’s Time Goes

Steve is a manager at an industrial beverage plant that produces bottled soft drinks. Accessing, analyzing, and sharing data about the plant's daily performance is an integral part of his job, and one that can often be tedious and time-consuming.

Resources like energy, chemicals, and water all play a role in the quality of the end product the plant produces as well as the profit margins Steve and his team can achieve. Manual and legacy data management processes can eat up a serious portion of an operator’s day. 

The following illustrates the challenges that workers like Steve experience throughout the day in an attempt to manage and make sense of their data. 

Monday: 9 a.m.

Steve gets to his desk and opens an email from his colleague about the performance of a new piece of equipment the plant installed last week. He doesn’t quite remember what the data in the spreadsheet is measuring, but it doesn’t look good. He searches back through last week’s emails to jog his memory.

He clicks download on the Excel spreadsheet attachment in his colleague’s email, only to get a pop-up window that says he needs to update his Microsoft Office Suite in order to open the document. He asks himself, “Where is that activation code, again?”

He opens the spreadsheet and has to correct some of the formulas that didn’t import correctly. As he starts reading through the 13 tabs in the document, the numbers don’t look right for some reason. He swears the equipment was performing perfectly in the initial read-outs from his technician last Friday. Steve rifles through the thick portfolio on his desk for the printout the technician gave him last week. He can’t find it. “I’ll have to give that tech a call,” he says.

Steve then gets a voicemail message saying that particular technician is out sick today. The report will have to wait. 

1 p.m.

After back-to-back meetings, Steve gets called down to the factory floor to inspect a piece of equipment that has automatically shut off due to a malfunction. Production is at a standstill as he and his team try to figure out what went wrong with the machine.

After sifting through dozens of printouts and warning screens on the equipment itself, he and the team discover the machine was overheating. Things get frantic as the plant sits idle, so Steve makes an executive decision to adjust the cooling system on the equipment to a temperature his gut tells him will work (he has over 25 years of experience, so his intuition is spot on, right?).

3:30 p.m.

Steve gets back to his desk and opens the spreadsheet from the morning. He realizes the report from the email was showing the coolant malfunction in the machine he just had to deal with on the factory floor. He has access to all this data, but it’s spread out across so many different sources that he can’t make the appropriate decisions that will lead to meaningful actions. He combs through the spreadsheet to see if the gut-based temperature adjustment he made earlier was the right one.

He’s way off…

Like thousands of other industrial operators, Steve can’t make real-time adjustments to his plant’s processes when his data is locked in legacy and manual systems. He would benefit from a centralized platform that can offer him real-time updates on his plant’s processes and assets, as well as automated recommendations and solutions on how to fix problems when they arise. 

After that nightmare of a day, Steve has to spend the next morning looking for ways to deal with equipment downtime and the issues that spreadsheets and other legacy methods have been causing him.

Increasing their industrial intelligence

By installing advanced automated sensors powered by an AI system, Steve and his team can monitor critical assets and conditions around the clock in a clear, simple readout that is always up to date. And when emergencies arise, the right AI system can automatically make adjustments and recommendations before a time-wasting issue halts production.

Does any of Steve’s day sound familiar to you or your team? Unplanned downtime can cost manufacturers an estimated $50 billion annually. It may be time to reevaluate your relationship with your data.

Plutoshift APM Brings Direct Financial Impact To The Process Industry Using AI

Companies in the process industry today are expected to generate more revenue using fewer resources and without buying new assets. There’s greater pressure than ever to be efficient and nimble, a pressure worsened by the potential of a global trade war. Manufacturers have questioned whether artificial intelligence (AI) can be cost-effectively harnessed to transform the industry as it promises to do for so many others.

AI is only as good as the outcomes it supports, so our team’s biggest priority is to make it as easy as possible for our customers to leverage AI to generate the ROI that matters most to them.

Today we announced our cloud-based, AI-driven asset performance management (APM) platform designed specifically for the process industry. Using AI, Plutoshift automatically and continuously connects asset data with financial metrics, letting you easily measure performance, achieve your business outcomes, and increase profit margins.

We worked closely with our Fortune 500 customers in verticals that would push the limits, including food, beverage, and chemical, to solve their critical pain points. The vast amount of industrial sensor and IIoT data manufacturers rely on to overcome challenges is often trapped in legacy systems. These aging, on-premise systems can’t correlate the impact of asset performance on future revenue metrics. Plus, these tools haven’t kept pace with the mobility and ease-of-use demands of today’s savvy end users.

That was just the beginning.

We created a platform that lets plant managers discover process inefficiencies and new opportunities to increase throughput, speed up ticket resolution, reduce resource consumption, and eliminate waste. Plutoshift’s proprietary algorithms leverage both existing historical and real-time data, extracting actionable insights between asset behavior and revenue. We work with all of your existing data systems seamlessly, providing you with immediate ROI.

Plutoshift can be accessed safely from anywhere, empowering front-line end users who expect an on-demand experience. While new, it’s proven: it is the only solution vetted by global forums of leading industrial technology evaluation committees, and it comes with mature features, including:

  • Deep analysis and intelligence: Easily connects to data streams including SCADA, ERP and CMMS to produce actionable insights on the costs, risks and efficiencies of plant operations.
  • Agile integration: Integrates with every process historian on the market today.
  • On-demand insights: Interactive, easy-to-use dashboards and alerts enable operators to work effectively from anywhere.
  • Pre-built asset templates: A growing library includes membranes, cooling towers, CIP systems, clarifiers, dryers, and more.

We built Plutoshift APM to help companies bridge the relationship between the data and financial performance of their assets. Contact us today for a free demo.

How to measure the success of an APM deployment

The field of Asset Performance Management (APM) has taken off like a rocket ship in the last three years. It’s propelled by the fact that industrial companies want their assets to generate more revenue, but without additional expenditure on buying new assets or upgrading existing infrastructure. This is where APM comes into the picture: APM software lets them pursue this goal effectively. How does it do that? And where does Artificial Intelligence fit into this whole thing?

Why do I need Artificial Intelligence?

APM makes this possible by letting companies leverage the large amounts of data generated by the industrial sensors monitoring their critical assets. A good APM solution leverages Artificial Intelligence algorithms to achieve those business outcomes. If you are considering, or have heard, that Artificial Intelligence may be a way to optimize your processes, then you’ve probably stumbled upon a plethora of marketing material telling you all about the spectacular benefits of such solutions. It might also have used phrases like Machine Learning, Deep Learning, Advanced Analytics, Predictive Analytics, and so on.

Every AI initiative is won or lost before it is ever deployed

We love Sun Tzu here at Plutoshift. Deploying an APM solution can be quite confusing. In this series of 5 blog posts, we will talk about what we’ve learned about the success and failure mechanisms of these deployments, the things you should know, the benefits you can expect, and the preparation you’ll need to get the most out of your investment.

If leveraging Artificial Intelligence were easy and success were guaranteed, everybody would do it all the time. Today, it isn’t easy, and success isn’t guaranteed; it is a rapidly growing field. But the benefits are very compelling when implemented correctly: APM can provide information and recommendations that give you a significant competitive advantage.

How does it relate to asset performance?

When operating assets such as membranes, clarifiers, condensers, cooling systems, or clean-in-place systems, there are typically several standard practices. They are like rules of thumb! These static rules are used to maintain production at a reasonable level and to ensure adequate performance and quality. They are not perfect, but in general the system works. Still, if operators had a better understanding of a specific process and its unique response to future conditions, they would agree that performance could be improved.

The trouble is that the number of varying conditions, and the volume of data to sift through with standard analytics, is too vast to be useful, not to mention time-consuming. Continuously detecting and measuring these changing relationships is difficult to do manually. Without continuing the work, and without some luck in identifying correlations, any improvements that were made would fade away over time, becoming no better, and probably worse, than the rules of thumb they replaced.

How does Artificial Intelligence solve this?

Artificial Intelligence allows us to discern correlations, find the cause of a specific process behavior, and predict its future impact by using algorithms to analyze large volumes of data. A good APM solution uses these Artificial Intelligence algorithms to predict future business outcomes. It also continues to analyze data on an ongoing basis, optimizing setting recommendations for likely future conditions. The result is the actual best settings to lower costs, improve quality, and mitigate unplanned downtime.
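
As a minimal sketch of the first step in that idea (not Plutoshift's actual algorithms), consider quantifying how strongly one operating variable tracks an outcome across historical readings. The variable names and numbers below are hypothetical:

```python
from math import sqrt

# Hypothetical historical readings: coolant temperature vs. unplanned downtime
coolant_temp_c = [18.0, 18.5, 19.2, 21.0, 22.4, 23.1, 24.0, 25.5]
downtime_min = [3.0, 2.0, 4.0, 9.0, 14.0, 18.0, 22.0, 30.0]

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / sqrt(var_x * var_y)

# Values near +1 or -1 flag a relationship worth modeling further
r = pearson(coolant_temp_c, downtime_min)
print(f"correlation(coolant temp, downtime) = {r:.2f}")
```

A production system would re-fit relationships like this continuously as conditions drift, which is exactly the ongoing analysis described above; a one-time check degrades back into a rule of thumb.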

But what if it’s wrong?

Artificial Intelligence sounds like a great way to get things done. When implemented properly, instead of static or semi-static conservative settings being used, operators would receive the best settings for a specific duration. But what about the cases when the predictions are off? After all, some of these processes may affect the health of a community! It certainly will affect the health of your company if the information provided by Artificial Intelligence is wildly incorrect. This is where asset performance monitoring comes in.

In a good APM solution, advanced analytics or predictions are an important but small part of the information delivered. The rest of the information consists of useful metrics and key indicators that, quite frankly, are there to provide evidence of current conditions and support the recommendations derived by Artificial Intelligence. On a daily basis, the value of these indicators is usually greater than that of the advanced analytics or predictions.

For an APM solution to be effective, it should provide a way to continuously track the impact of asset performance on future revenue metrics. This doesn’t necessarily mean predictions; it means surfacing hidden patterns that are not visible to the naked eye. An APM solution centered on business processes, as opposed to the machines themselves, is far more likely to succeed.

In the next blog post, we will discuss the things you need to consider before implementing a Machine Learning project. We will talk about the process of figuring out when it makes sense to go with a vendor versus doing the work yourself, the factors you need to consider before choosing a vendor, and the role of subject matter expertise in the world of APM.

Predict Tomorrow’s Influent Flow With Today’s Data

Wastewater plant operators make important operational decisions based on the influent flow rate to the plant, yet despite the ample availability of sensors, there is no accurate industry standard for predicting that flow rate.

Knowing the performance of a collection system is difficult because there are few industry-recognized benchmarks for what “performance” is and how it should be determined. Performance assessments of sewer collection systems are often simply educated guesses. Quantifying the areas of highest inflow and infiltration can be difficult due to large networks of pipes, the expense of water monitoring, and varying weather conditions impacting soil saturation.

Municipal sanitary sewer collection and conveyance systems are an extensive, valuable, and complex part of the nation’s infrastructure. Collection systems consist of pipelines, conduits, pumping stations, force mains, and any other facility collecting wastewater and conveying it to facilities that provide treatment prior to discharge to the environment.

Plant operators are responsible for ensuring there is enough treated water available for pumping into the distribution or discharge system, as well as enough water to maintain ongoing operations. Many operators overlook production water, in addition to effluent pumping rates, when determining influent rate, yet accounting for it helps keep treatment consistent.
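
As a back-of-the-envelope illustration of why production water matters (all figures are hypothetical, and losses such as evaporation and water leaving with sludge are ignored), an operator estimating influent from plant outputs has to close a simple water balance:

```python
# Hypothetical daily volumes in million gallons (MG)
effluent_pumped = 9.5   # discharged to the receiving water
production_water = 0.6  # treated water reused in-plant (backwash, chemical feed, etc.)
storage_change = 0.2    # net rise in basin/tank levels over the day

# Influent must supply everything that left the plant plus what stayed behind;
# skipping production_water would understate influent by that amount.
influent_estimate = effluent_pumped + production_water + storage_change
print(f"estimated influent: {influent_estimate:.1f} MG")
```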

Influent flow rates are usually estimated by operators based on experience and local weather forecasts. These back-of-the-napkin calculations are necessary for master planning of the facility’s future. Future capacity needs and sizing, the plant’s ability to meet regulations going forward, and the expected timing of upgrades or new construction are all impacted by the irregular and unpredictable amount of influent entering a system.

The EPA estimates that the more than 19,000 collection systems across the country would have a replacement cost of $1-2 trillion. The collection system of a single large municipality can represent an investment worth billions of dollars. Usually, the asset value of the collection system is not fully recognized, and collection system operation and maintenance programs are given low priority compared with wastewater treatment needs and other municipal responsibilities.

Typically, small amounts of infiltration and inflow are anticipated and tolerated. Yet unpredictable weather can increase this load and cause overflows. Managing these events is costly in terms of unplanned labor, repair of damaged equipment, and health and environmental impacts, sometimes incurring monetary fines and coverage on the evening news.

As one of the most serious and environmentally threatening problems, sanitary sewer overflows are a frequent cause of water quality violations and are a threat to public health and the environment. Beach closings, flooded basements and overloaded treatment plants are some symptoms of collection systems with inadequate capacity and improper management, operation, and maintenance. The poor performance of many sanitary sewer systems and resulting potential health and environmental risks highlight the need to optimize operation and maintenance of these systems.

Wastewater collection systems suffer from inadequate investment in maintenance and repair often due in large part to the “out-of-sight, out-of-mind” nature of the wastewater collection system. The lack of proper maintenance has resulted in deteriorated sewers with subsequent basement backups, overflows, cave-ins, hydraulic overloads at treatment plants, and other safety, health, and environmental problems.

Managing these complex water systems has relied on heavy physical infrastructure and reactive governing attitudes. This is changing with the development of cyber-physical systems, real-time monitoring, big data analysis, and machine learning with advanced control systems through the Internet of Things (IoT). These “smarter” systems, in which technology, components, and devices talk to each other and feed information to one another in more sophisticated ways, bring about a more optimized, efficient process.

Data provided by weather radar are important in weather forecasting. Rainfall data are typically introduced to provide stormwater information at different locations in the vicinity of the wastewater treatment plant. Several consecutive days of rainfall appear to correlate with increased WWTP flows, indicating a trend that is historically related to interflow.
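
That "several consecutive days" effect can be checked directly by accumulating rainfall over a trailing window and comparing it with plant flow. A hedged sketch with made-up numbers (a real analysis would use radar-derived rainfall and plant SCADA records):

```python
# Hypothetical daily rainfall (mm) and WWTP influent flow (MGD) for ten days
rain = [0, 0, 5, 12, 20, 18, 2, 0, 0, 0]
flow = [10, 10, 10, 12, 16, 20, 22, 19, 14, 11]

# Trailing 3-day rainfall accumulation captures consecutive days of rain
accum = [sum(rain[max(0, i - 2): i + 1]) for i in range(len(rain))]

# Flow peaking after the rainfall accumulation peaks is the signature of
# interflow working its way through the soil into the collection system
lag_days = flow.index(max(flow)) - accum.index(max(accum))
print(f"flow peaks {lag_days} day(s) after the 3-day rainfall peak")
```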

Goals of prediction to prevent overflows:
  • Reduce ratepayer costs by implementing all cost-effective I&I reduction projects
  • Minimize liability from water pollution and public health risks by eliminating storm-related SSOs
  • Proactively reduce overall I&I to avoid capital costs of capacity expansion in anticipation of future population growth
  • Eliminate enough I&I to offset the environmental and regulatory impact of sewer system expansion and increased water demand

Though sensors helped combat overflows in South Bend, Indiana, for a while, in a recent storm they could only report that they were being overwhelmed. Had the data from those sensors flowed into a system powered by Artificial Intelligence, operators could have had a forecast predicting the storm and might have been able to divert flow proactively in preparation.

Predictive influent flow rate information helps determine the most cost-efficient schedule for operating wastewater pumps. Pluto AI has developed a state-of-the-art prediction system that delivers a high-accuracy influent flow forecast based on weather forecasts, recent influent flow trends, and the hydraulics of the plant and sewer system.
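
As a toy illustration of the idea (this is not Pluto's actual model, and all numbers are synthetic), tomorrow's flow can be regressed on today's flow and tomorrow's forecast rainfall using ordinary least squares in closed form:

```python
# Synthetic training data: next-day flow is generated as f + 0.8*r for illustration
f = [10, 12, 15, 11, 10, 18, 14]             # today's influent flow (MGD)
r = [2, 8, 0, 0, 20, 1, 5]                   # tomorrow's forecast rainfall (mm)
y = [fi + 0.8 * ri for fi, ri in zip(f, r)]  # observed next-day flow (MGD)

# Normal equations for the model y = a*f + b*r, solved with Cramer's rule
Sff = sum(fi * fi for fi in f)
Srr = sum(ri * ri for ri in r)
Sfr = sum(fi * ri for fi, ri in zip(f, r))
Sfy = sum(fi * yi for fi, yi in zip(f, y))
Sry = sum(ri * yi for ri, yi in zip(r, y))
det = Sff * Srr - Sfr * Sfr
a = (Sfy * Srr - Sfr * Sry) / det
b = (Sff * Sry - Sfr * Sfy) / det

# Forecast: today's flow is 13 MGD and 10 mm of rain is forecast for tomorrow
tomorrow = a * 13 + b * 10
print(f"predicted influent flow: {tomorrow:.1f} MGD")
```

On this noise-free synthetic data the fit recovers the generating coefficients exactly; a real deployment would add more lagged features, an intercept, and the sewer-system hydraulics the text describes.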

To assess extraneous water entering your system, at least a year of influent flow data to the treatment facility should be examined; Pluto recommends two years. Contact us to learn more about integrating predictive forecasting for overflow prevention into your system.

Sources:
https://www.southbendtribune.com/news/local/south-bend-s-smart-sewers-overwhelmed-by-floodwaters/article_cb75b63c-aaa9-5b39-9c9c-df4fcd2b62b3.html
https://www.mass.gov/eea/docs/dep/water/laws/i-thru-z/omrguide.pdf
https://www.globalw.com/support/inflow.html
https://www.ce.utexas.edu/prof/maidment/giswr2012/TermPaper/Boersma.pdf
https://www.mountainview.gov/civicax/filebank/blobdload.aspx?blobid=6979

Highlights from the 2018 Membrane Technology Conference

Back in March, I attended the opening day of the AWWA & AMTA Membrane Technology Conference in West Palm Beach, Florida to meet Pluto customers. I wanted to learn more about the challenges facing them and explore the new processes and solutions being employed to meet those challenges.

The conference opened with an inspiring keynote address given by Water for People CEO, Eleanor Allen. Her speech offered a glimpse into the progress made through collaborative partnerships of social entrepreneurs around the world to provide potable water to the millions in need. Distinct from the technologically-focused presentations given throughout the day, this talk was an uplifting reminder of the life-sustaining impact of the advancements and efforts of the water industry’s products, services, and people.

After the lunch hour, Val Frenkel, Ph.D., PE, D.WRE, of Greeley and Hansen gave a thought-provoking presentation entitled “What We Don’t Know About RO.” Dr. Frenkel provided a comprehensive review of the history of RO systems and their introduction to the commercial market dating back to the 1970s. He discussed how specific system configurations enable different types of RO systems to achieve individual product-quality targets or meet specific operating procedures for different applications.

Dr. Frenkel went on to describe pretreatment of membranes as a cost-effective way to ensure integrity. Now that the performance of RO systems is no longer a question of achievability, the longevity and integrity of the RO membrane are the new focus for furthering system performance.

Another talk that stood out was a presentation by Pierre Kwan of HDR regarding the Basin Creeks membrane operation, “All-Gravity Membrane Filtration: Design and Operational Considerations.” Kwan described the almost certainly unique circumstance of a water reservoir sitting at enough altitude above the plant not only to eliminate the expensive pumping usually required, but also to create the complication of managing high pressure instead.

Building a sustainable operation under these conditions had several interesting ramifications. Along with the gravity challenge came a high water-quality requirement, and the two-stage membrane process implemented to meet it was impressive. The net result of this unique system design was that the facility consumed only 5% of the energy typically expected of a membrane plant. Kwan painted a vivid picture of how thoughtful, custom design can overcome geographical and infrastructure challenges; the result was a compelling speech about achieving energy efficiency in the face of adversity.

Overall, the advancements in membrane integrity analysis and the appetite for increasing efficiencies make this a rich area for predictive technologies. Pluto’s predictive analytics dashboard has helped several utilities and companies determine convenient cleaning schedules and discover optimal points for normalization of RO membrane trains, typically with a 3-5x ROI. Contact us for a demo to learn more.