In the rapidly evolving landscape of business and technology, artificial intelligence (AI) is a transformative force with potential to redefine how organizations operate and deliver value.
Among other impacts, AI boosts employee productivity and drives greater customer engagement.
However, the next evolution of AI's impact in organizations lies in how it powers operational excellence.
This is where efficiency and resource optimization across all aspects of how an organization operates are paramount to
- achieving bottom-line efficiency,
- maximizing revenue generation, and
- delivering a differentiated customer experience.
The current landscape of AI applications spans a wide range of uses, from customer chatbots to product personalization, decision-making, and much more.
Yet, the next frontier for AI is to move beyond the periphery of processes and into the core of organizational operations.
This requires operationalizing AI at scale.
In turn, this requires deep and wide AI integration into the organization’s products, services, and business processes.
Going this deep surfaces the waste and inefficiencies that hinder performance and drives organizational agility.
This is critical for capitalizing on opportunities for competitive differentiation.
So, what’s holding organizations back from operationalizing AI at scale?
The primary challenge lies in ensuring the quality of operational data — a critical component for effective AI utilization.
Achieving this is no easy feat, especially considering the exponential growth of data.
For instance, since 2020 there has been an almost 5,000% surge in data created, captured, copied, and consumed, and this growth is not slowing down.
Organizations manage a vast amount of operational data in disparate repositories.
These include transaction processing systems, customer repositories, risk management systems, issue management systems, and more.
The challenge is that these repositories are siloed, each with its own distinct data structure.
As a result, change professionals struggle to draw associations across them and gain the insights needed for operational improvement.
Renowned computer scientist Andrew Ng advocates for a data-centric AI approach.
This emphasizes that the limiting factor in realizing AI’s value lies in the quality of data.
High-quality data is characterized by consistent labeling.
Data labeling involves adding informative labels to provide context for machine learning models.
These labels must propagate across the various repositories of operational information, fueling the next level of AI value.
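To make this concrete, here is a minimal sketch, in Python, of what consistent labeling across two siloed repositories might look like. The label values, record fields, and repository names are all hypothetical; in practice the labels would come from a governed taxonomy and be built into each repository's own data model.

```python
# A minimal sketch of consistent data labeling across repositories.
# All names (label values, records, fields) are hypothetical and
# purely illustrative.

# A shared, controlled vocabulary of labels (the "consistent labeling").
PROCESS_LABELS = {"ONBOARD_CUSTOMER", "PROCESS_PAYMENT", "RESOLVE_COMPLAINT"}

def tag(record: dict, label: str) -> dict:
    """Attach a label from the shared vocabulary to a record."""
    if label not in PROCESS_LABELS:
        raise ValueError(f"Unknown label: {label}")
    return {**record, "process_label": label}

# Records from two hypothetical, siloed repositories.
transactions = [
    tag({"txn_id": "T-1001", "amount": 250.0}, "PROCESS_PAYMENT"),
    tag({"txn_id": "T-1002", "amount": 75.5}, "PROCESS_PAYMENT"),
]
issues = [
    tag({"issue_id": "I-3001", "severity": "high"}, "RESOLVE_COMPLAINT"),
    tag({"issue_id": "I-3002", "severity": "low"}, "PROCESS_PAYMENT"),
]

# Because both repositories share the same labels, records can be
# associated by process, which is what downstream models need.
by_label: dict[str, list[dict]] = {}
for record in transactions + issues:
    by_label.setdefault(record["process_label"], []).append(record)

for label, records in by_label.items():
    print(label, len(records), "records")
```

The point is not the specific labels but the shared vocabulary: once every repository speaks it, associations that were previously manual become straightforward for machine learning to exploit.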
While not necessarily a new concept, Digital Twins have gained prominence, especially in industries like manufacturing.
They are virtual representations of physical objects, systems, or processes capable of capturing real-time data from sources such as IoT sensors, radio frequency identification (RFID), and computer vision.
The critical aspect is maintaining consistent labeling of data from these sources.
This enables AI to process patterns and provide invaluable insights into process performance.
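As a rough illustration, the sketch below (in Python, with hypothetical device names, fields, and label values) shows how readings from different physical sources might be normalized into one consistently labeled event stream that a twin's analytics can consume.

```python
# A minimal sketch of how a digital twin might normalize readings from
# different physical data sources (IoT sensors, computer vision) into
# one consistently labeled event stream. All device names, fields, and
# label values are hypothetical.

from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class TwinEvent:
    asset_id: str      # consistent asset label across all sources
    metric: str        # consistent metric label, e.g. "temperature_c"
    value: float
    source: str        # "iot", "rfid", "vision"
    timestamp: datetime

def from_iot(raw: dict) -> TwinEvent:
    # Map a raw IoT payload onto the twin's labeling scheme.
    return TwinEvent(
        asset_id=raw["device"],
        metric="temperature_c",
        value=float(raw["temp"]),
        source="iot",
        timestamp=datetime.now(timezone.utc),
    )

def from_vision(raw: dict) -> TwinEvent:
    # Map a computer-vision count onto the same scheme.
    return TwinEvent(
        asset_id=raw["camera_zone"],
        metric="units_per_minute",
        value=float(raw["count"]),
        source="vision",
        timestamp=datetime.now(timezone.utc),
    )

events = [
    from_iot({"device": "mixer-07", "temp": 81.4}),
    from_vision({"camera_zone": "line-2", "count": 42}),
]

# With shared labels, pattern detection can run over one stream
# regardless of where each reading originated.
for e in events:
    print(e.asset_id, e.metric, e.value, e.source)
```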
A noteworthy example is Unilever, which has created a digital replica of its factories to gain unprecedented visibility into its supply chain.
First, the digital model captures data from sensor-equipped machines and processes, simulating all aspects of the plant from individual machines to entire processes.
Then, it applies advanced analytics and machine learning algorithms to the collected data, empowering Unilever to anticipate and respond to various challenges readily.
This integrated approach not only ensures optimal efficiency, resiliency, and flexibility, but also facilitates cost reduction and enhanced decision-making.
As a result, this approach has positioned Unilever as an industry trailblazer at the forefront of digital innovation.
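To illustrate the analytics step in simplified form, the sketch below flags sensor readings that drift from a rolling baseline. This is not Unilever's implementation, and the readings are made up; it only shows the general pattern of applying a simple model to consistently labeled digital-twin data.

```python
# A simplified, hypothetical illustration of analytics on labeled
# digital-twin data: flag readings that deviate sharply from a
# rolling baseline. Values are made up for demonstration.

from statistics import mean, stdev

def flag_anomalies(readings: list[float], window: int = 5, z: float = 2.0) -> list[int]:
    """Return indexes of readings more than z standard deviations
    from the mean of the preceding window."""
    flagged = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma and abs(readings[i] - mu) > z * sigma:
            flagged.append(i)
    return flagged

# Hypothetical temperature readings from one labeled asset ("mixer-07").
temps = [80.1, 80.4, 79.9, 80.2, 80.3, 80.0, 80.2, 91.7, 80.1, 80.3]
print("anomalous reading indexes:", flag_anomalies(temps))
```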
However, the challenge lies in adopting Digital Twins in organizations and industries where the environment is more virtual than physical.
There, the creation of a data labeling scheme becomes less obvious.
Even so, the promise of leveraging AI for operational excellence should not be underestimated.
It can power improvements in any operational metric an organization measures itself by across all operating dimensions.
AI can essentially become a virtual Lean Six Sigma engine, enabling humans not only to optimize individual processes but also to streamline the entire operating model and supply chain.
To drive organizations towards this greater level of operational excellence, we need a comprehensive data labeling schema.
Operating an organization is, at its core, a collection of actions needed to run the business.
This means the key to a good schema lies in understanding the business process landscape.
It means creating a comprehensive and accurate inventory of processes that provides the “ground truth” for the organization’s operational landscape, and then leveraging that inventory to align operational and resource information.
The process names in this inventory become the data labels, as they represent all the actions that AI can evaluate for performance.
These labels need to be distributed across operational repositories and built into the data models those repositories use to store and manage their data.
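As a minimal sketch of how such a schema might work, the example below (with hypothetical process names and record fields) treats the process inventory as the label vocabulary and rolls operational records up by process.

```python
# A minimal sketch of a process inventory serving as the data labeling
# schema. Process names and record fields are hypothetical; in practice
# the inventory would be governed by a Process Center of Excellence and
# the labels built into each repository's data model.

# The process inventory: the "ground truth" list of actions the
# organization performs. Each entry becomes a data label.
PROCESS_INVENTORY = {
    "P-010": "Onboard customer",
    "P-020": "Fulfill order",
    "P-030": "Resolve customer issue",
}

# Operational records from different repositories, each carrying a
# process label from the shared inventory.
records = [
    {"repo": "transactions", "id": "T-9001", "process_id": "P-020", "cycle_time_h": 6.5},
    {"repo": "issues",       "id": "I-5002", "process_id": "P-030", "cycle_time_h": 30.0},
    {"repo": "transactions", "id": "T-9002", "process_id": "P-020", "cycle_time_h": 9.0},
]

# Because every repository uses the same labels, performance can be
# rolled up by process, which is what an AI layer would evaluate.
totals: dict[str, list[float]] = {}
for r in records:
    totals.setdefault(r["process_id"], []).append(r["cycle_time_h"])

for pid, times in totals.items():
    name = PROCESS_INVENTORY[pid]
    print(f"{name}: avg cycle time {sum(times) / len(times):.1f} h over {len(times)} records")
```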
Building a process capability through a Process Center of Excellence (COE) is crucial for creating and maintaining this comprehensive information repository.
This requires an organizational investment and commitment.
However, the benefits in terms of improved operating metrics and the seamless adoption of digital technologies to power business performance far outweigh the initial investment.
Integrating AI for operational excellence isn’t just a technological upgrade.
It’s a strategic imperative for organizations seeking to thrive in the digital era.