5 Digital Transformation Lessons from Dune

Last week, sci-fi fans finally got to see the latest film adaptation of Dune. When it was published in 1965, Frank Herbert’s novel was a groundbreaking, eco-conscious sci-fi epic. Set 20,000 years in the future, with intergalactic dynasties and secret orders battling for control of the scarcest resource in the universe, Dune seems at once completely alien and strangely familiar.

Much has been written about Herbert’s inspiration for Dune. But while the author had plenty of history and his own time to draw from, the story is even more relevant today, given how dire some of those same issues have become. So if Dune does such a great job of reflecting our current situation, what insights can it offer into how to address our challenges?

Here are 5 lessons from Dune on digital transformation:

1. Bring back the thinking machines

In the Dune universe, a war against machines results in a prohibition against AI or “machines in the likeness of a human mind.” Subsequently, over thousands of years, humanity has filled the role of advanced computers with Mentats. After undergoing conditioning at specialized schools, these ‘human computers’ are able to process large amounts of data, identify patterns, apply logic, and then deduce probable future outcomes. The prescience and strategic abilities of Mentats make them valued advisors, with the great houses of the universe vying for their service.

Atreides Mentat Thufir Hawat

Sound familiar? With organizations across all industries racing to capitalize on AI, there’s been growing demand for data science roles. Companies have to compete with big tech for that talent, and there is simply not enough supply to meet the demand.

The solution? Automation. “Many machines on Ix. New machines,” notes a Guild Navigator (another order of humans who take on work once handled by computers). Organizations can automate much of their data science work by partnering with vendors that have already made significant investments in R&D and data science talent. Leveraging outside expertise to improve specific workflows is more cost-effective, provides flexibility, and can accelerate digital transformation efforts. 

It’s time to bring back the thinking machines (spoiler alert: the humans and AI eventually make peace in the Dune series).

2. Every drop counts

In contrast to the Harkonnens, who seem to indulge in daily steam showers, the Fremen natives of Dune are relentless in their conservation of water. Donning water-preserving stillsuits, the Fremen even reclaim water from corpses and avoid crying. Of course, personal survival demands it, but their hyper-vigilant water preservation also serves their long-term vision: terraforming their desert planet into a green oasis. The Fremen use wind traps to collect moisture from the air and slowly amass giant caches of water across thousands of sites.

Fremen water catch basin

Organizations rightly prioritize opportunities that promise to have the biggest impact. But they also shouldn’t overlook less obvious opportunities to innovate (for instance, optimizing the various points at which water is used within food manufacturing processes). By applying the same rigor across other processes, the many small gains in aggregate can have an enormous impact on the efficiency and sustainability of the entire business.

3. “The slow blade penetrates the shield”

Combat in Dune highlights the value of adaptation and an incremental approach. With personal shielding technology having rendered conventional projectile weapons largely ineffective, military forces in Dune revive hand-to-hand combat and traditional weapons. To win in battle, soldiers have to think several steps ahead and employ techniques that can overcome the shields, which only yield to slow attacks.

Likewise, with the conventional, top-down approach to digital transformation often failing to deliver, organizations must adopt more effective strategies. A survey of industrial professionals indicated that while 94% have taken an organization-wide approach to digital transformation, only 29% claimed success. Stymied by unanticipated complexity and plagued by delays and cost overruns, many organizations are turning to an operation-specific approach to digital transformation. By applying digitization and automation to specific workflows first, organizations can secure incremental wins and then scale their efforts to the rest of the organization.

4. Enlist the frontline

Another benefit of the operation-specific approach is that it more effectively involves and considers those closest to the processes being targeted. In Dune, as the management of Arrakis and spice mining changes hands from the Harkonnens to the Atreides, the difference in management styles is stark. The Harkonnens impose their rule and maximize spice production through violent oppression. By contrast, the Atreides begin their tenure by sending envoys to engage the locals. They rescue the crew of a spice harvester at the expense of spice production, and Paul later embeds himself with the Fremen and gains their desert knowledge. The approach pays off, as Paul is able to mobilize the locals to overwhelming success.

Similarly, it behooves organizations looking to transform their operations to enlist stakeholders at all levels, especially those who can assess the situation on the ground and identify opportunities to innovate. Getting their buy-in, tapping their experience and expertise, and ensuring the project delivers on their goals will increase the chances of success.

Dune spice miner

5. Fear is the mind-killer

“Moods are a thing for cattle and love play,” declares Gurney Halleck in the 1984 adaptation of Dune, chiding Paul Atreides for not being more vigilant in preparing for their hostile destination. Once on Arrakis, Paul finds himself stripped of his resources and stranded in the desert. He’s forced to quickly hone his skills and adapt to the conditions of his new environment. 

The pace of innovation across all industries is increasing. To maintain their competitive advantage, organizations must create an environment that supports innovation from within. They can’t afford to wait for years-long, enterprise-wide digital transformation projects to deliver uncertain results. Budgetary limits, legacy systems, lack of expertise, and other challenges can be overcome with the right approach. The operation-specific approach can help organizations adapt faster, empower professionals across the organization, and realize ROI sooner. 

The sleeper must awaken!

Zillow & 2 Attributes of a Successful Data Culture

In our recent e-book, 3 Hacks for Onboarding AI Platforms, we outline a few key steps to building the right team and culture to support an AI deployment. And we did so for good reason. There is broad consensus that the success of digital transformation efforts hinges on having a data-driven culture behind them. A 2019 Deloitte study found that companies with strong data-driven cultures were twice as likely to exceed business goals. Another study, by New Vantage Partners, found that 95% of the challenges to adopting big data and AI initiatives were cultural, organizational, or process-driven rather than technological.

Given this, organizations have prioritized fostering data-driven cultures. Whether it’s hiring a digital-focused executive, establishing centers of excellence, or instituting organization-wide mandates, the focus is on moving away from decisions based on gut feeling and toward decisions based on data-derived facts. 

Organizations Must Look Beyond the Numbers

It sounds great, but an effective data-driven organization must often look beyond the numbers, and it can face major consequences when it fails to do so. Take, for example, Zillow, a company that has used data to build more accurate real estate models and has leveraged that data into a powerful competitive advantage.

Zillow’s automated home-buying business recently made headlines for its decision to halt home purchases. The company, which has access to more than 17 years’ worth of data, is facing backlash after the announcement. Some are calling into question the company’s ability to properly plan and account for logistical constraints. Others are wondering if its brand has been irreparably damaged. How could these things happen in a data-centric company?

Attributes of a Data-Informed Culture: Intuition & Ownership

In our experience, organizations have proven tremendously successful when they connect big data analytics to the business strategy. This data-informed approach means they acknowledge the data-derived insights but are also aware of and account for the implications of other non-data factors that may impact the direction of the overall strategy.

It also means that when building this data-informed culture, in addition to data literacy, organizations must look for and encourage two key attributes: 1) Intuition and 2) Ownership.

Intuition is defined as the natural ability to know something without any proof or evidence. But it’s also another data point, grounded in unconscious knowledge, expertise, and experience, to be combined with other data in decision making. Ownership is the state of being responsible and accountable. It’s critical that these two components are embedded into the company’s values so that data may be used in a way that properly guides and informs decisions. Otherwise, you may be sitting on actionable insights that no one has properly evaluated or acted on because it’s “not my place.” Someone must answer for the choices being made and for how those decisions align with and support broader goals.

It’s easy to wonder whether the culture at Zillow empowered decision makers to use their intuition in the process, or whether they had grown accustomed to letting the data be their one and only guide. 

The episode also highlights a gap between the company’s actions and the real-world issues of on-the-ground labor and supply constraints. This could be the result of a lack of ownership over the decisions being made.

Being data-informed in addition to data-driven means using both intuition and ownership to constantly check your assumptions, methods and outcomes. The qualitative complements the quantitative, just as the human element complements the data analysis. 

If you want to take your data insights to the next level and avoid the unintended consequences of mismanaging the intangible side of your business, look for people who demonstrate strong intuition and ownership. Your culture will thank you for it.

The Water Values Podcast: Digital Transformation with Prateek Joshi

CEO Prateek Joshi talks about digital transformation in the water sector, hitting on a number of important and practical points in a wide-ranging discussion on data, AI, and machine learning.

In this session, you’ll learn about: 

  • Prateek’s background & how it influenced his arc into the water sector
  • Water-intensive industries and using water data in those industries
  • Prateek’s view on digital transformation
  • How COVID influenced the digital transformation
  • The limitations of human-based decision-making
  • Common challenges for data-centric organizations
  • How to drive organizational behavior change with respect to data usage
  • The difference between AI and machine learning
  • Data quality and verification issues
  • The factors companies look for when selecting an AI system

Click the link below to listen:

https://episodes.castos.com/watervalues/TWV-192-Digital-Transformation-with-Prateek-Joshi.mp3

8 Dimensions of Data Quality

Large companies have enormous physical infrastructure. This infrastructure is well-instrumented and data is collected continuously. The Plutoshift platform uses this data to help them monitor their physical infrastructure. When we look at the data flowing in, we need to standardize and centralize the data in an automated way. One of the first steps in monitoring physical infrastructure is to check data quality. How do we do that? What framework should we use to validate data quality?

The topic of data quality is vast. There are many ways to check and validate data quality.

To automate the work of monitoring physical infrastructure, we employ a variety of machine learning tools. The goal is to automate the work of spotting anomalous performance metrics and surfacing them. Deploying machine learning in these situations is basically a data infrastructure problem: if you have good data infrastructure, your machine learning tools will do a good job. Needless to say, those same tools will look bad if the data infrastructure is not robust.
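As a rough illustration of what that automation can look like (not Plutoshift’s actual pipeline), here is a minimal sketch that flags anomalous readings with a rolling z-score in pandas. The column name, window size, and threshold are assumptions made for the example.

import pandas as pd

def flag_anomalies(df: pd.DataFrame, column: str = "flow_rate",
                   window: int = 96, threshold: float = 3.0) -> pd.DataFrame:
    # Compare each reading against the trailing window's mean and standard
    # deviation; readings more than `threshold` deviations away get flagged.
    rolling = df[column].rolling(window, min_periods=window // 2)
    zscore = (df[column] - rolling.mean()) / rolling.std()
    out = df.copy()
    out["is_anomaly"] = zscore.abs() > threshold
    return out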

In our experience, there are 8 criteria we can use to ensure data quality in the world of physical infrastructure:

1. Consistency

There shouldn’t be any contradictions within the data. If you do a sweep of the entire data store, the observations should be consistent with each other. For example, let’s say that there’s a sensor monitoring the temperature of a system. The dataset shouldn’t contain the same timestamp with two different temperature values.
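As a sketch, this is what such a check might look like in pandas; the sensor_id, timestamp, and temperature column names are assumed for illustration.

import pandas as pd

def find_inconsistent_readings(df: pd.DataFrame) -> pd.DataFrame:
    # Group by sensor and timestamp, then flag any group that reports more
    # than one distinct temperature for the same instant.
    counts = df.groupby(["sensor_id", "timestamp"])["temperature"].nunique()
    return counts[counts > 1].reset_index(name="distinct_values")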

2. Accuracy

The data should accurately reflect reality. You should be able to trust your instrumentation. In general, this is a consideration for the data collection systems. For example, let’s say that you’re looking at the data store for flow rates within a pipe. The data should accurately reflect what’s actually happening in the pipe. The machine learning model assumes the data is true when making a prediction. If the data itself is inaccurate, the machine learning model can’t do much.
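Accuracy is ultimately a property of the collection system, but one common proxy check is to flag readings that fall outside physically plausible bounds. A minimal sketch, with hypothetical limits for a pipe flow sensor:

import pandas as pd

# Hypothetical physical bounds; real limits would come from the instrument
# spec or the process engineers.
FLOW_MIN, FLOW_MAX = 0.0, 500.0  # e.g. gallons per minute

def flag_implausible(df: pd.DataFrame, column: str = "flow_rate") -> pd.Series:
    # True where a reading falls outside the physically possible range,
    # which usually points to a sensor or collection problem.
    return ~df[column].between(FLOW_MIN, FLOW_MAX)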

3. Relevancy

The data should be relevant to the use case. You need data that enables you to achieve a specific goal. For example, let’s say we’re trying to reduce energy consumption. To do that, you need data on the levers that actually drive energy consumption. Machine learning can’t do much with high-quality data if it’s not relevant.

4. Auditability

We should be able to trace the changes made to the data. You can make sure that nothing gets overwritten permanently. By understanding the changes made to the data over time, you can detect useful patterns. For example, let’s say that you’re looking at a response tracker filled with user-entered values. Tracing the changes made to the data lets us follow the evolution of the dataset.
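One simple way to preserve that traceability is an append-only change log, where corrections are recorded as new entries rather than overwrites. A minimal sketch; the field names are illustrative assumptions.

from datetime import datetime, timezone

audit_log = []  # in practice this would live in an append-only table

def record_change(record_id: str, field: str, old_value, new_value, user: str) -> None:
    # Corrections are appended as new entries, never overwritten, so the
    # full history of any value can be reconstructed later.
    audit_log.append({
        "record_id": record_id,
        "field": field,
        "old_value": old_value,
        "new_value": new_value,
        "changed_by": user,
        "changed_at": datetime.now(timezone.utc).isoformat(),
    })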

5. Completeness

Completeness means that all elements of the data should be in our database. Fragmented data is one of the most common causes of subpar performance. In order to drive a use case, you need all elements of the data. Complete data allows machine learning models to perform better. For example, let’s say you are monitoring membranes within the physical infrastructure at a beverage company. The aim is to predict cleaning dates, and there are 5 key factors that affect those dates. If the dataset only has 3 of them, then the machine learning model can’t achieve the desired level of accuracy.
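A completeness check can be as simple as confirming that every required factor is present and measuring how much of each is missing. A sketch in pandas, with a hypothetical list of membrane-monitoring factors:

import pandas as pd

# Hypothetical factors for predicting cleaning dates; the real list would
# come from the use case itself.
REQUIRED_COLUMNS = ["differential_pressure", "flux", "feed_temperature",
                    "conductivity", "runtime_hours"]

def completeness_report(df: pd.DataFrame) -> dict:
    # Report which required columns are absent and how sparse the rest are.
    missing = [c for c in REQUIRED_COLUMNS if c not in df.columns]
    null_fraction = {c: float(df[c].isna().mean())
                     for c in REQUIRED_COLUMNS if c in df.columns}
    return {"missing_columns": missing, "null_fraction": null_fraction}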

6. Timeliness

We should get data with minimal latency. Data tells us something about the real world, and the sooner we know it, the sooner we can dissect it and take action. If something is happening in the real world, the data collection system should get that data into the hands of the end user with minimal latency. For example, let’s say we’re looking at pump monitoring. In an emergency, the aim is to take action within the hour to minimize damage. If the data collection system delivers the data with a gap of 3 hours, then you’ll be too late.
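A basic timeliness check compares when a reading was measured against when it landed in the data store. A sketch in pandas, assuming measured_at and ingested_at datetime columns and a one-hour tolerance:

import pandas as pd

def late_arrivals(df: pd.DataFrame,
                  max_lag: pd.Timedelta = pd.Timedelta(hours=1)) -> pd.DataFrame:
    # Rows that arrived later than max_lag are too stale to act on.
    lag = df["ingested_at"] - df["measured_at"]
    return df[lag > max_lag]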

7. Orderliness

The data should have a fixed structure and format. Data format plays an important role in building scalable products. For software to work at a large scale, the data needs to be in an agreed-upon shape. This allows machine learning systems to work at scale, which is really powerful given the amount of data they can handle. For example, let’s say we’re monitoring cooling systems across 400 sites. A machine learning model is effective if the data from all those sites is in a standardized format. If all 400 sites have different data formats, then you’ll have to build separate workflows for each one, and the ability to scale diminishes.
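In practice this usually means validating every incoming dataset against an agreed-upon schema before it enters the pipeline. A minimal sketch, with a hypothetical schema for cooling-system data:

import pandas as pd

# Hypothetical agreed-upon schema that every site must conform to.
EXPECTED_SCHEMA = {
    "site_id": "object",
    "timestamp": "datetime64[ns]",
    "supply_temp_c": "float64",
    "return_temp_c": "float64",
}

def schema_violations(df: pd.DataFrame) -> dict:
    # Flag columns that are missing and columns with an unexpected dtype.
    missing = [c for c in EXPECTED_SCHEMA if c not in df.columns]
    wrong_dtype = {c: str(df[c].dtype) for c in EXPECTED_SCHEMA
                   if c in df.columns and str(df[c].dtype) != EXPECTED_SCHEMA[c]}
    return {"missing_columns": missing, "wrong_dtypes": wrong_dtype}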

8. Uniqueness

Data shouldn’t be duplicated. This one seems obvious, but data duplication is a very real issue. Within a given database, the same data shouldn’t be stored more than once; duplicates occupy extra space and serve no purpose. For example, let’s say we’re looking at pressure values within a steam system. For a given timestamp and location, we only need the value to appear once. If values are duplicated, we need to deduplicate them before processing further.
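A sketch of the corresponding check in pandas, assuming a timestamp and a location column together identify a unique reading:

import pandas as pd

def deduplicate(df: pd.DataFrame) -> pd.DataFrame:
    # Keep one row per timestamp/location pair; identical repeats add
    # storage cost without adding information.
    return df.drop_duplicates(subset=["timestamp", "location"], keep="first")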