Plutoshift Selected to Join Exclusive Sustainability Innovation Program

Plutoshift was born out of the desire to unlock the potential of AI to address energy and water challenges. Since our founding in 2017, sustainability and resource conservation have been a driving force behind our work.

So we were thrilled to be selected recently for the 100+ Accelerator, a program with the express goal of accelerating sustainability innovation. The program, started in 2018 by AB InBev and backed by global leaders Coca-Cola, Colgate-Palmolive, and Unilever, strives to identify solutions for some of the most pressing sustainability challenges of our time. To that end, every year the 100+ Accelerator team identifies a handful of qualified startups. Those chosen are organizations with innovative solutions and teams best positioned to succeed: they can scale quickly, they can make a significant impact on one or more challenge areas, and many, like Plutoshift, have a demonstrated track record.

This year, Plutoshift was among 36 startups selected out of over 1,300 applicants. It is of course exciting for us to be part of such an exclusive group. But more importantly, it validates the approach Plutoshift has taken to innovation. Every day, we focus on delivering AI and machine-learning solutions where they can make the greatest impact and generate tangible results. We’ve worked with global organizations across industries to get actionable intelligence into the hands of those on the front line of sustainability, and as a result, we have quickly helped them reduce resource consumption and their impact on the environment.

What’s most exciting about this partnership is twofold: 1) it demonstrates that intelligent resource management and mitigating the impact of industry on the world are increasingly just part of good business, and 2) organizations are more seriously committed to making real progress toward those goals. The challenges outlined by the 100+ Accelerator and the path forward are clearly defined, with a promise “to do business the right way, not the easy way.” Previous cohorts have already shown tremendous impact with solutions such as upcycling grain waste into nutrient-rich food and recycling electric vehicle batteries in China.

As proclaimed on the program site, “No one company can solve today’s sustainability challenges alone.” We wholeheartedly agree and look forward to continuing our work as part of the solution.

To learn more about the 100+ Accelerator program, visit www.100accelerator.com.

To discover how Plutoshift can help your business realize your sustainability goals, request a demo.

6 key ingredients of successful Machine Learning deployments

Machine Learning (ML) is a vehicle to achieve Artificial Intelligence (AI).

ML provides a framework for creating intelligent systems that can process new data and produce output that’s useful to humans. Automation technologies are the fastest-growing type of AI. Why? Because they are fast to implement, easy to deploy, and deliver high ROI. Still, leaders at an organization are often faced with the problem of figuring out how to make these technologies work within their business.

Before any new technology is adopted, it needs to prove that it works. Business leaders need to create success templates to show how to make it work within their organizations. These success templates can then be used to drive enterprise-wide adoption.

How do we make machine learning deployments successful?

From our experience, there are 6 key ingredients to achieve this:

1. Identify a work process with repetitive steps

You should start by identifying the right work process. A good target is a process where someone has to go through the same steps over and over again to get to a piece of information. Before deploying ML, ask whether such a process exists. If it does, will people benefit from automating it? If so, solving it can directly increase productivity and revenue for the company. These work processes are actually very simple to describe, as shown below (and made concrete in the code sketch that follows the list):
– “How much electricity did the membranes consume 3 days ago?”
– “How long do we take on average to fix our pumps when someone files a support ticket?”
– “How much money did we spend last month on chemical dosing?”
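
To make this concrete, here’s a minimal sketch of how the first question above might be answered in code, assuming the sensor readings live in a CSV file. The file name and the columns (timestamp, asset, kwh) are purely illustrative assumptions, not a prescribed schema:

```python
# Minimal sketch: answering "How much electricity did the membranes
# consume 3 days ago?" from a CSV of sensor readings. The file name
# and the columns (timestamp, asset, kwh) are illustrative assumptions.
from datetime import date, timedelta

import pandas as pd

readings = pd.read_csv("power_readings.csv", parse_dates=["timestamp"])

target_day = date.today() - timedelta(days=3)
on_day = readings["timestamp"].dt.date == target_day
is_membrane = readings["asset"] == "membrane"

total_kwh = readings.loc[on_day & is_membrane, "kwh"].sum()
print(f"Membrane electricity use on {target_day}: {total_kwh:.1f} kWh")
```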

2. Gather data specific to that work process

Once you identify a work process, you need to gather data for it. You should be selective: you need to understand what specific data will support this particular operation. If you try to digest all available data, it leads to chaos and suboptimal outcomes. If you’re disciplined about what data you need, it will drive focus on the outcomes and ensure that the ML deployment is manageable. We conducted a survey of 500 professionals to get their take on operation-specific digital transformation, and we found that 78% felt supported by their team leaders when they embarked on this approach. Here’s the full report: Instruments of Change: Professionals Achieving Success Through Operation-Specific Digital Transformation

3. Create a blueprint for the data workflow

Once you have a clear understanding of the data, the next step is to create a blueprint for the data workflow. A data workflow is a series of steps that a human would take to transform raw data into useful information. Instead of figuring out a way to work with all the available data across the entire company, you should pick a workflow that’s very specific to an operation and create a blueprint of how the data should be transformed. This allows you to understand what it takes to get something working. The output of this data workflow is the information that can be consumed by the operations team on a daily basis.
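
As an illustration, a blueprint for one such workflow might look like the sketch below: a short, ordered list of named steps, each taking and returning a DataFrame. The step names, columns, and thresholds are hypothetical assumptions, not a prescribed standard:

```python
# Illustrative blueprint for one operation-specific data workflow.
# All column names and thresholds are hypothetical.
import pandas as pd

def load_raw(path: str) -> pd.DataFrame:
    return pd.read_csv(path, parse_dates=["timestamp"])

def clean(df: pd.DataFrame) -> pd.DataFrame:
    # Drop missing readings and clip obvious outliers.
    df = df.dropna(subset=["kwh"])
    return df[df["kwh"].between(0, df["kwh"].quantile(0.999))]

def summarize(df: pd.DataFrame) -> pd.DataFrame:
    # Daily consumption per asset: the "useful information" an
    # operations team would otherwise assemble by hand.
    df = df.assign(day=df["timestamp"].dt.date)
    return df.groupby(["day", "asset"], as_index=False)["kwh"].sum()

def run_workflow(path: str) -> pd.DataFrame:
    result = load_raw(path)
    for step in (clean, summarize):
        result = step(result)
    return result
```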

4. Automate the data workflow

Once you have the blueprint for the data workflow, you should automate it. An automated data workflow connects to the data sources, continuously pulls the data, and transforms it. The operations teams will be able to access the latest information at all times. New data that gets generated goes through this workflow as well.
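
A hedged sketch of what that automation might look like is below: a loop that pulls new data on a fixed interval, runs the transformation steps, and publishes the result. In practice this would typically run under a scheduler (cron, Airflow, and the like) rather than a bare loop, and the fetch/transform/publish functions here are stand-ins:

```python
# Sketch of an automated data workflow. The functions are stand-ins
# for real connectors and for the blueprint steps described above.
import time

import pandas as pd

POLL_SECONDS = 15 * 60  # refresh cadence; tune to the operation

def fetch_latest() -> pd.DataFrame:
    """Stand-in: connect to the source system and pull new records."""
    return pd.DataFrame(columns=["timestamp", "asset", "kwh"])

def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Stand-in for the blueprint's clean/summarize steps."""
    return df

def publish(df: pd.DataFrame) -> None:
    """Stand-in: write to wherever the operations team reads from."""
    df.to_csv("latest_summary.csv", index=False)

while True:
    publish(transform(fetch_latest()))
    time.sleep(POLL_SECONDS)
```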

5. Create and track the benefits scorecard

The main reason you’re creating the automated data workflow is to drive a specific outcome. This outcome should be measurable and should have a direct impact on the business. You should involve all the stakeholders in creating and tracking this benefits scorecard. The people implementing and using the ML system should hold themselves accountable with respect to this benefits scorecard. The time to realize those benefits should be 90 days or less.
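
One way to make the scorecard concrete is a small structure pairing each agreed outcome with a baseline, a target, and the measured value. This is a minimal sketch; the metrics shown and the lower-is-better convention are illustrative assumptions:

```python
# Minimal benefits-scorecard sketch. Metrics are illustrative, and
# met() assumes lower values are better for every metric.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ScorecardItem:
    metric: str
    baseline: float
    target: float
    actual: Optional[float] = None  # filled in as results come in

    def met(self) -> bool:
        return self.actual is not None and self.actual <= self.target

scorecard = [
    ScorecardItem("Energy per unit produced (kWh)", baseline=12.0, target=10.5),
    ScorecardItem("Mean time to repair pumps (hours)", baseline=48.0, target=36.0),
]
```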

6. Build the data infrastructure to scale

Once you successfully execute on this workflow, what do you do next? You should be able to replicate it with more workflows across the company. A proof of concept (PoC) is not useful if it can’t scale across the entire organization. Make sure you have the data infrastructure to support deploying a wide range of workflows. A good platform has the necessary data infrastructure built in, enabling you to create many workflows easily on top of it. The capabilities of the platform include automating all the work related to data — checking data quality, processing data, transforming data, storing data, retrieving data, visualizing data, keeping it API-ready, and validating data integrity. This will allow you to use the platform to drive real business value at scale.
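
To illustrate the idea that shared data plumbing belongs in the platform rather than in each workflow, here is a small sketch in which every registered workflow automatically gets the same input quality check. All names, and the 20% missing-data threshold, are hypothetical:

```python
# Sketch: a tiny "platform" where workflows register themselves and
# shared checks (here, data quality) apply to all of them for free.
from typing import Callable, Dict

import pandas as pd

WORKFLOWS: Dict[str, Callable[[pd.DataFrame], pd.DataFrame]] = {}

def register(name: str):
    def wrap(fn: Callable[[pd.DataFrame], pd.DataFrame]):
        WORKFLOWS[name] = fn
        return fn
    return wrap

def check_quality(df: pd.DataFrame) -> pd.DataFrame:
    # Platform-level check applied to every workflow's input.
    if df.empty or df.isna().mean().max() > 0.2:
        raise ValueError("input is empty or >20% missing in some column")
    return df

def run(name: str, df: pd.DataFrame) -> pd.DataFrame:
    return WORKFLOWS[name](check_quality(df))

@register("daily_energy_summary")
def daily_energy(df: pd.DataFrame) -> pd.DataFrame:
    return df.groupby("asset", as_index=False)["kwh"].sum()
```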

The Water Values Podcast: Digital Transformation with Prateek Joshi

CEO Prateek Joshi talks about digital transformation in the water sector. Prateek hits on a number of important and practical points in a wide-ranging discussion on data, AI, and machine learning in the water sector.

In this session, you’ll learn about: 

  • Prateek’s background & how it influenced his arc into the water sector
  • Water-intensive industries and using water data in those industries
  • Prateek’s view on digital transformation
  • How COVID influenced the digital transformation
  • The limitations of human-based decision-making
  • Common challenges for data-centric organizations
  • How to drive organizational behavior change with respect to data usage
  • The difference between AI and machine learning
  • Data quality and verification issues
  • The factors companies look for when selecting an AI system

Click the link below to listen:

https://episodes.castos.com/watervalues/TWV-192-Digital-Transformation-with-Prateek-Joshi.mp3

100th Episode Of The Dan Smolen Podcast

Prateek Joshi, Founder and CEO of Plutoshift, discusses how A.I. makes the world a better place on the 100th episode of The Dan Smolen Podcast, which offers the best coverage of future-of-work and meaningful-work topics and trends.

In this episode, Prateek:

  • Describes Plutoshift and his role in the company. Starts at 3:03
  • Defines A.I. and contrasts it with Machine Learning. Starts at 3:51
  • Addresses workforce concerns that A.I. takes jobs away from people. Starts at 8:52
  • Illustrates how Plutoshift helps clients involved with providing clean and potable water. Starts at 13:03
  • Identifies the training and advanced skill that he seeks in hired talent. Starts at 20:25
  • Tells us how, beyond his work, he adds fun and enjoyable activity to each day. Starts at 27:59


Listen to this episode of The Dan Smolen Podcast.

Databases, Infrastructure, and Query Runtime

Recently, my team was tasked with switching from a combined MySQL and Cassandra infrastructure to one in which all of this data is stored entirely on a PostgreSQL server. This change was driven partly by the need to give our customers crucial flexibility, and partly by the fact that Cassandra was simply not necessary for this particular application, even with the high quantities of data we were receiving. On its face, the mere need for such a change almost looks backwards, given how much the tech industry has moved away from SQL databases and towards NoSQL databases. But, in fact, NoSQL — or even a hybrid system — is not always best.

Performance Gain Considerations

In certain applications, the performance gains one hopes to reap from NoSQL’s optimizations may not translate to production without some forethought. I would personally argue that SQL databases are often preferable to something like Cassandra in non-trivial applications, above all when JOIN operations are required. Generally speaking, NoSQL databases — certainly Cassandra, among others — do not support JOINs. I will add that the vast majority of ORMs (for those unfamiliar with the term, these are systems that abstract database relations into, typically, object-oriented objects in one’s backend code) are built around SQL. The flexibility and readability afforded by an ORM, at least when operating a database of non-trivial objects, can be a lifesaver for development time, database management, and data integrity. For most web applications, I would argue these benefits often outweigh the marginal or even negligible performance gains that a NoSQL database may provide (this is, of course, completely dependent on the nature and scale of the data, but that is perhaps a topic for another time).
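
As a hedged illustration of the kind of query that is natural with a SQL ORM but unsupported in Cassandra, here is a minimal SQLAlchemy (2.0-style) sketch joining a hypothetical pumps table to its support tickets. The schema and connection string are made up for the example:

```python
# Hypothetical two-table schema; the JOIN below has no direct
# equivalent in Cassandra, which does not support JOINs.
from sqlalchemy import ForeignKey, create_engine, select
from sqlalchemy.orm import DeclarativeBase, Mapped, Session, mapped_column

class Base(DeclarativeBase):
    pass

class Pump(Base):
    __tablename__ = "pumps"
    id: Mapped[int] = mapped_column(primary_key=True)
    name: Mapped[str]

class Ticket(Base):
    __tablename__ = "tickets"
    id: Mapped[int] = mapped_column(primary_key=True)
    pump_id: Mapped[int] = mapped_column(ForeignKey("pumps.id"))
    status: Mapped[str]

engine = create_engine("postgresql://user:pass@host/db")  # placeholder DSN

with Session(engine) as session:
    # Every pump name paired with its open tickets, in one query.
    open_tickets = session.execute(
        select(Pump.name, Ticket.id)
        .join(Ticket, Ticket.pump_id == Pump.id)
        .where(Ticket.status == "open")
    ).all()
```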

Cloud Infrastructure

However, none of this matters if the engineer is not paying close attention to their cloud infrastructure and the way their queries actually run in production. In evaluating one engineer’s project, I found they were doing all of their insertion operations individually rather than batching or bulk inserting them (when this was well within the scope of the application). It appeared they had been developing against a local setup and then deploying the project to the cloud, where the database ran on a separate machine from the server. The end result was rather comical: once insertions were batched, even in Postgres, they were orders of magnitude faster than the piecemeal NoSQL insertions. They had not considered the simple fact of latency.

How did the original engineer miss this? I do not know, as this particular piece of software was inherited with little background knowledge. But given that they were testing locally, I can guess why they elected for individual insertions: making queries this way can be less tricky than bulk insertions (which often come with all sorts of constraints and require a bit more forethought, especially with Cassandra), and tested locally, the performance seemed beyond satisfactory. What they did not consider, however, was the latency between the backend server and a Cassandra (or SQL) server hosted in any sort of distributed system (i.e., production). It didn’t really matter how fast the individual queries were; the network latency so dwarfed the query runtime that the choice of database hardly mattered at all. So it followed that real-world performance was significantly improved simply by batching insertions in Postgres (batching is of course supported in Cassandra too, but the switch was warranted nonetheless).
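
For illustration, here is roughly what that fix looks like with psycopg2 against Postgres; the table, columns, and connection string are invented for the example. The commented-out loop is the slow row-at-a-time pattern, where every row pays a full network round trip; execute_values sends the whole batch in one trip:

```python
# Batched vs. row-at-a-time inserts. Table/columns/DSN are made up.
from datetime import datetime, timedelta

import psycopg2
from psycopg2.extras import execute_values

conn = psycopg2.connect("dbname=metrics user=app")  # placeholder DSN

rows = [(datetime.now() - timedelta(minutes=i), 1, 0.5 * i)
        for i in range(10_000)]

with conn, conn.cursor() as cur:
    # Slow in production: one network round trip per row, so total
    # time is dominated by latency, not query runtime.
    # for row in rows:
    #     cur.execute(
    #         "INSERT INTO readings (ts, sensor_id, value) "
    #         "VALUES (%s, %s, %s)", row)

    # Fast: the whole batch in (approximately) one round trip.
    execute_values(
        cur,
        "INSERT INTO readings (ts, sensor_id, value) VALUES %s",
        rows,
    )
```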

The Moral of the Story

In any case, the moral of the story, in my opinion, is that understanding your own cloud infrastructure is crucial to writing truly performant programs in the real world. Just because one database is purported to perform better than another under certain circumstances does not mean you can hope to see any appreciable performance gain without a solid understanding of the environment in which the application will be deployed.

5 ways for business leaders to package AI into bite-sized pieces

Large companies have been tackling the issue of AI for the last few years. Business leaders are often faced with the problem of figuring out how to use this technology in a practical way. Any new technology needs to be packaged into bite-sized pieces to show that it works. These “success templates” can then be used to drive enterprise-wide adoption. But should companies do it all at once? How do you ensure that you’re not boiling the ocean? How can a company package AI into bite-sized pieces that its teams can consume? From what we’ve worked on with our customers and seen in the market, there are 5 steps:

1. Start with the use case
It always starts with a use case. Before launching any AI initiative, the question you should ask is whether or not there’s a burning need today. A need qualifies as “burning” if it has a large impact on your business. If solved, it can directly increase revenue and/or margins for the company. We need to describe this burning need in the form of a use case. These use cases are actually very simple to describe as shown below:
– “We’re using too much electricity to make our beverage product”
– “We’re taking too long to fix our pumps when someone files a support ticket”
– “We’re spending a large amount of money on chemicals to clean our water”

2. Pick a data workflow that’s specific to an operation
Once you figure out the use case, the next step is to figure out the data workflow. A data workflow is a series of steps that a human would take to transform raw data into useful information. Instead of figuring out a way to automate all the workflows across the entire company, you should pick a workflow that’s very specific to an operation. This allows you to understand what it takes to get something working. We conducted a survey of 500 professionals to get their take on this and we found 78% felt supported by their team leaders when they embarked on this approach. Here’s the full report: Instruments of Change: Professionals Achieving Success Through Operation-Specific Digital Transformation

3. Be selective with data
Once you pick a workflow, you need to understand what specific data is going to support this particular workflow. If you try to digest all available data, it leads to chaos and suboptimal outcomes. If you’re disciplined around what data you need, it will drive focus on the outcomes and ensure that the project is manageable.

4. Create a benefits scorecard collaboratively
The main reason you’re deploying AI is to drive a specific outcome. This outcome should be measurable and should have a direct impact on the business. You should include all stakeholders in creating a benefits scorecard. The people implementing the AI solution should hold themselves accountable with respect to this benefits scorecard. The time to realize those benefits should be short, e.g., 90 days.

5. Have the nuts-and-bolts in place that enable you to scale
Let’s say you successfully execute on this PoC. What’s next? You should be able to replicate it with more use cases across the company. There’s no point in doing this if the approach is not scalable. Make sure you have a data platform that supports deploying a wide range of use cases. The nuts-and-bolts of the platform should enable you to compose many workflows with ease. What does “nuts-and-bolts” include? It includes automating all the work related to data — checking data quality, processing data, transforming data, storing data, retrieving data, visualizing data, keeping it API-ready, and validating data integrity.

The Stream Podcast

Prateek Joshi, Founder and CEO of Plutoshift, joins Will Sarni and Tom Freyberg on The Stream to talk all things Artificial Intelligence (AI).

Why Manufacturing Companies Continue to Struggle with AI Implementation Projects

Manufacturing companies know that Artificial Intelligence (AI) can deliver multiple business advantages, both for the frontline team and for the overall bottom line. But despite companies’ best intentions and AI’s clear potential, many struggle to fully utilize AI. This is leading them to reevaluate their strategy on how to leverage AI for their business, according to our new report.

The usage of AI at the enterprise level continues to grow, but the projects are often loosely defined and can take longer than anticipated to show returns. This has the potential to limit the progress that AI can deliver. To further understand the hurdles standing in the way and the opportunities AI can bring to manufacturing, we surveyed 250 manufacturing professionals who have visibility into their company’s AI strategy.

Continuing The March Toward AI Implementation

Companies have good intentions when they begin the AI implementation process, but the complexity of AI can pose problems and cause companies to reevaluate their strategy. 

  • 61% said their company has good intentions but needs to reevaluate the way it implements AI projects
  • Only 17% of respondents said their company was in full implementation stage of its AI projects
  • 34% said their company has struggled to keep its AI project(s) in scope because there was a lack of expert guidance at the planning phase of the project

What is Really Holding Companies Back

Having a mature data collection and storage system is essential for AI implementation projects. Without the ability to collect and store data in a timely manner, manufacturing companies can’t get far with AI implementation. Manufacturers are realizing it takes more time than anticipated to get their data systems up and running. 72% of manufacturing companies said it took more time than anticipated for their company to implement the technical/data collection infrastructure needed to take advantage of the benefits of AI.

Internal buy-in is important for any company project, but it’s especially important for something as complex as AI. Just like marketing, business development, or any other business function, commitment from key stakeholders is essential for success.

Internal buy-in from employees needs to be taken into account when implementing AI projects. 62% said their company took more time than anticipated to acquire internal buy-in and commitment for implementing AI. This lack of buy-in has also led 34% of respondents to say there is a lack of engagement with these projects.

Important Factors For Successful AI Projects

We found that issues with internal buy-in, decisions regarding who should use the data, and a lack of budget consensus related to AI projects can slow down implementation in the manufacturing industry.

It’s important to have a mature data collection and analysis infrastructure and to agree on specific business outcomes before incorporating AI into an industrial workflow. Companies cite multiple reasons for deciding to implement AI: 54% said cost savings were the top business problem they were trying to solve, followed by 49% who said automating tasks was the top reason.

These kinds of projects can lose focus within the company and encounter multiple problems. Less than half (47%) said their company has kept AI projects in scope and focused on deliverables. A focused approach that can manage the complexity of AI can help eliminate these issues and keep companies oriented toward the long-term goal.

The use of AI technology in the manufacturing industry gives companies the opportunity to empower the frontline team with automated performance monitoring for any industrial workflow. AI can help businesses drive ROI by reducing resource consumption and operating costs, and by reliably predicting the current state of their business whenever they need it.

How do your company’s experiences with AI compare with those of the respondents? Read more about the report and the methodology here.

Industrial Automation: 4 Industrial AI Predictions for 2020

2019 was a banner year for AI! Almost every business magazine and tech publication published a deluge of articles about AI and its impact on the industrial world and industrial automation. While some of the speculation about AI was overblown, the majority of the attention analysts and the media dedicated to the state of AI is well warranted.

AI’s impact on industrial automation has already dramatically changed the way companies think about their operations and how they collect and use their data. 2020 will undoubtedly see some of the recent trends in AI continue to intensify, but what else should industrial operators and C-level executives expect for the start of the new decade? Here are four predictions for the world of industrial AI in 2020:

  1. Business Intelligence Will Become More Accessible

Access to data is critical for any industrial operator, but many companies are hindered by siloed data sources that cannot be easily accessed by the entire company. Business functions such as budgeting, sales, and performance monitoring all rely on data that’s often needed quickly.

Data analysts in large organizations are under increased pressure to make actionable recommendations out of unstructured and siloed data. There are now vertical-specific products available that can centralize data and allow everyone to access it across the organization. 2020 will be the year that companies commit to making their data more accessible to their workers as a business imperative.

  2. AI Will Continue to Spread to New Industries

Manufacturers are continuing to invest in AI solutions that address issues like preventative maintenance and the automation of certain tasks. Based on this progress, the less obvious industries will invest in this technology as well.

Verticals closely related to manufacturing, such as supply chain management and logistics, are benefiting from AI applications, with logistics managers using AI to forecast demand trends in key markets and optimize their supply chains.

Companies like Amazon are also using AI-powered machines to help automate manual tasks in their warehouses. While these machines can’t complete complicated tasks, they are proving to be useful aids to workers who can pass off some repetitive and labor-intensive jobs.

On a broader scale, CISOs and security professionals are facing more complicated threats every year. With talent shortages and an overall increase in the number of IoT devices connected to any given server, companies can no longer rely on legacy or manual processes to respond to security events.

AI applications are helping to automate threat detection and can provide around-the-clock monitoring services for enterprises that are always at risk.

  3. AI Will Change The Way Work Gets Done

Much like the combustion engine or the widespread adoption of electricity, AI will have an impact on the way people do their jobs. More specifically, it will change the way work gets done. AI applications will continue to enhance workers’ roles through automation and problem solving, helping to manage the burden of the millions of micro-decisions workers make in their jobs.

While the best industrial AI applications are easy and intuitive to use, the rapid adoption of AI will create specialized job opportunities. Recent reports have estimated that the adoption of AI will create 58 million more jobs than it displaces in the next few years.

There are instances where workers need to adapt to new technology but lack proper access to training. This is where education leaders, labor organizations, local governments, and tech companies have stepped in, committing to reskilling initiatives that address the needs of workers.

  4. Companies Will Demand Accountable AI Projects

As the field of AI moves from the theoretical to the applicable, companies will demand projects that deliver targeted results and stay on budget. According to our new report, 33% of manufacturing professionals say their company struggles with budget scope, either going over budget or struggling to find the resources to implement AI.

While the benefits of using AI applications at scale are clear, reaching that ROI can be a challenge for some companies. Read more about this and some of the other challenges companies are facing when implementing AI in our new report.

AI for Coffee Manufacturing: 3 Ways AI is Energizing The Coffee Industry

From bean to barista, the global coffee industry is valued at over $100 billion. For a producer, distributor or manufacturer in this massive industry, the use of AI for manufacturing can play a vital role in optimizing critical processes. 

Specifically, in retail, agricultural and manufacturing operations, the coffee industry is discovering ways AI applications can benefit everyone from small-scale farmers to large industrial plants. Below are three examples where AI is benefiting the coffee industry.

  1. AI is helping farmers protect their crops.

The increased use of AI applications in agriculture has the potential to help farmers across the world protect their beans from disease and optimize growing conditions by monitoring factors like soil and moisture levels. AI has proven to be especially beneficial to farmers in developing nations in Latin America and Africa, where they can utilize advanced warnings about pests that threaten their crops, as well as receive data-driven insights that can help them adapt to the effects of climate change.

With the price of coffee beans at some of the lowest in a decade, AI and machine learning can provide actionable intelligence and decrease the negative impacts of the massive, yet volatile market. 

  2. AI is helping coffee manufacturing at the industrial level.

The benefits of IoT-enabled AI are not restricted to the farm; industrial operations perhaps have the most to gain from the technology. And with some of the largest coffee companies in the world, like Starbucks, seriously investing in AI solutions for their industrial processes, the rest of the industry is undoubtedly paying close attention. For example, the word ‘digital’ was used over 40 times on a Starbucks investor call in Q4 2019, which suggests that finding and investing in technology solutions is top-of-mind in the boardroom.

These digital initiatives are improving industrial processes at coffee manufacturing plants in a number of ways, from decreasing equipment downtime at bottling facilities to monitoring the performance of key resources such as water, chemicals, and labor at large plants.

  3. AI is providing advanced insights into transportation and logistics.

While IoT and AI applications are still in their relatively early days of use, the technologies are quickly gaining steam and don’t appear to be a passing trend. Leading authorities have predicted that half of all manufacturing supply chains will be using some form of AI by 2021.

The use of AI by large coffee producers like Starbucks shows that AI is continuing to deliver numerous benefits to coffee growers and producers. AI is providing benefits to the supply chain such as identifying what time of the year is most advantageous to carry specific varieties in stores. 

Further up the supply chain, AI applications can predict order patterns to reduce or eliminate penalties caused by missed OTIF (On Time in Full) deliveries, providing benefits to every stakeholder in the process.

As outlined above, there is scarcely a corner of the coffee industry that wouldn’t benefit from the use of AI. On the production and retail side, companies can automate inventory orders and predict equipment maintenance and staffing needs. For farmers, increased AI usage can lead to improved water quality and greater efficiency in coffee processing and packaging, as well as a positive impact on the overall bottom line. On the manufacturing side, issues like unplanned equipment downtime and asset quality can be mitigated with the right AI system.

Curious about how to get started with AI in your company? Be on the lookout for our new report in January of 2020 that explores some of the challenges manufacturing companies face in getting their AI projects off the ground, as well as how to overcome them.