Process Intelligence for Operational Excellence: 2024 Industry Report

How to optimize complex, end-to-end business processes with process intelligence



Process intelligence refers to the use of advanced analytics to gain insights into business processes and workflows. It involves collecting, analyzing and interpreting data from multiple applications across various functions to understand how processes are executed, identify bottlenecks, inefficiencies and areas for improvement. These insights are then used by organizations to re-engineer processes and identify opportunities for automation.
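The core idea — deriving bottlenecks from timestamped event data — can be illustrated with a minimal sketch. The case IDs, activities and timings below are invented for the example; real tools ingest event logs exported from ERPs, CRMs and other operational systems.

```python
# Minimal sketch of bottleneck identification from an event log.
# All data here is illustrative.
from collections import defaultdict
from datetime import datetime

# Event log: (case_id, activity, timestamp) as exported from source systems.
events = [
    ("c1", "Order received", "2024-01-02 09:00"),
    ("c1", "Credit check",   "2024-01-02 09:30"),
    ("c1", "Order shipped",  "2024-01-05 16:00"),
    ("c2", "Order received", "2024-01-03 10:00"),
    ("c2", "Credit check",   "2024-01-03 10:20"),
    ("c2", "Order shipped",  "2024-01-07 11:00"),
]

def parse(ts):
    return datetime.strptime(ts, "%Y-%m-%d %H:%M")

# Group events by case and sort by time to reconstruct each process trace.
traces = defaultdict(list)
for case, activity, ts in events:
    traces[case].append((parse(ts), activity))
for trace in traces.values():
    trace.sort()

# Average the waiting time on each activity-to-activity transition;
# the slowest transition is the candidate bottleneck.
waits = defaultdict(list)
for trace in traces.values():
    for (t0, a0), (t1, a1) in zip(trace, trace[1:]):
        waits[(a0, a1)].append((t1 - t0).total_seconds() / 3600)

for (a0, a1), hours in sorted(waits.items(), key=lambda kv: -sum(kv[1]) / len(kv[1])):
    print(f"{a0} -> {a1}: {sum(hours) / len(hours):.1f} h avg wait")
```

In this toy log, the wait between credit check and shipping dwarfs every other transition, which is exactly the kind of signal that flags a step for re-engineering or automation.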

“To me, process intelligence is really about using data and process touchpoints to map out the entire value chain from a customer and business standpoint,” Vineet Mehra, general manager and senior platform product owner at Maersk, explains. “By identifying those cross-functional touchpoints, we can better understand end-to-end processes and optimize performance.”

Process intelligence, however, should not be thought of as a distinct solution or one-time effort, but rather an ongoing practice. “As initial inefficiencies are addressed, it’s essential to monitor processes continuously for future deviations, using predictive analysis based on SLAs and employing alerting mechanisms to detect and address issues before they occur,” Catherine Stewart, president and GM of Novelis Americas, told us. “Change management and communication become critical practices at this point, as process improvements will likely change how processes are executed.”

Case Study: Building a foundation for digital transformation and business process automation

The Conseil d’État, the highest court in France for cases involving public administration, recently started leveraging process intelligence to help modernize its litigation portals. “Before structuring and launching our larger digital transformation initiatives, we wanted to understand how our legacy systems worked in detail,” Coralie Ducloy, the Conseil d’État’s head of department of projects and applications maintenance, told us.

As their legacy systems were large in volume, decentralized and complex, traditional functional mapping conducted via professional workshops would have required significant human resources and time. “We did not want to add to the already heavy workloads of magistrates and administrators working in the courts, so we decided to partner with Novelis who helped us leverage Blue Prism’s Process Intelligence tool instead,” Ducloy explains. “To start, we used it to map out and analyze how disputes are managed. We can now use these insights to prioritize the functionalities we need to develop in our new information systems.”

To elaborate, Ducloy explains, “With our technical experts we select datasets representative of our activity in our legacy systems, then with Novelis we carry out extractions and transformations to load the data into Blue Prism Process Intelligence. After configuring the tool, we then organize feedback sessions with representatives to identify which parts of the process are critical, which functionalities are used, which screens are necessary and so on.”
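The extract-and-transform step Ducloy describes can be sketched as normalizing a raw legacy export into the case-activity-timestamp shape that process-intelligence tools ingest. The field names, date format and activity codes below are illustrative assumptions, not the Conseil d’État’s actual schema, and loading into Blue Prism Process Intelligence itself goes through the tool’s own import interface.

```python
# Sketch: transform a raw legacy export into a normalized event log.
# Schema, codes and dates are invented for illustration.
import csv
import io
from datetime import datetime

raw_export = """dossier;etape;horodatage
D-1041;ENREG;02/01/2024 09:12
D-1041;INSTR;15/01/2024 14:03
D-1041;JUGEMENT;02/03/2024 10:45
"""

# Map legacy step codes to readable activity names.
ACTIVITY_NAMES = {"ENREG": "Case registered", "INSTR": "Investigation", "JUGEMENT": "Judgment"}

def transform(text):
    rows = []
    for rec in csv.DictReader(io.StringIO(text), delimiter=";"):
        ts = datetime.strptime(rec["horodatage"], "%d/%m/%Y %H:%M")
        rows.append({
            "case_id": rec["dossier"],
            "activity": ACTIVITY_NAMES[rec["etape"]],
            "timestamp": ts.isoformat(),  # normalize to ISO 8601
        })
    return rows

event_log = transform(raw_export)
print(event_log[0])
```

The point of the transformation is consistency: once every source system maps to the same three columns, cross-system process analysis becomes possible.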

In addition to achieving their initial objectives of mapping cross-functional processes and prioritizing digital transformation projects, the analysis of the processes made it possible to identify and re-engineer two major process bottlenecks. They were also able to automate a large number of repetitive, administrative tasks.

Democratizing data to enhance decision-making

At Maersk, they are building a data intelligence tool to track the entire lifecycle of an order from a profitability standpoint. “We are aggregating, consolidating and analyzing data from multiple finance, sales and operations systems to understand which orders are profitable and which are not,” Mehra tells us. The idea is to give product, business service delivery, sales and pricing teams visibility into the various factors that negatively impact revenue and help them mitigate those issues.

These insights are shared via a data visualization tool that allows the user to track KPIs and measure progress over a period of time. Mehra goes on to explain, “For us it’s a beautiful use case where everything ends in P&L.”

In a similar vein, EDP Renewables leveraged process intelligence to more effectively map out, understand and optimize its procure-to-pay (P2P) process from end to end. “To ensure ROI, we wanted to focus on a cross-functional, multi-system and high-volume process that had a significant impact on the customer,” Stephan Blasilli, director of lean, business process excellence, sustainability and innovation at EDP Renewables, told us.

To start, they identified KPIs across the entire end-to-end process, such as supplier qualification cycle time, PR creation cycle time, PO creation cycle time, payment cycle time and vendor inquiry volume. “We began by identifying performance issues at a high level and then used more granular techniques, such as process and task mining, to uncover the root cause of bottlenecks,” Blasilli said.
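One of the KPIs named above, PO creation cycle time, can be sketched as a per-case measurement checked against a service-level target. The timestamps and the 48-hour SLA below are invented for the example, not EDP Renewables’ actual figures.

```python
# Sketch: PO creation cycle time per purchase case vs. an illustrative SLA.
from datetime import datetime, timedelta

SLA = timedelta(hours=48)  # hypothetical target for PR -> PO

# Per-case timestamps: when the purchase requisition was created
# and when the corresponding purchase order was created.
cases = {
    "PR-1001": {"pr_created": "2024-05-01 08:00", "po_created": "2024-05-02 17:00"},
    "PR-1002": {"pr_created": "2024-05-01 09:00", "po_created": "2024-05-06 09:00"},
}

def elapsed(ts_start, ts_end):
    fmt = "%Y-%m-%d %H:%M"
    return datetime.strptime(ts_end, fmt) - datetime.strptime(ts_start, fmt)

cycle_times = {c: elapsed(v["pr_created"], v["po_created"]) for c, v in cases.items()}
within_sla = sum(1 for d in cycle_times.values() if d <= SLA)
print(f"{within_sla}/{len(cycle_times)} POs created within the 48 h SLA")
```

Tracking the same measure over time — rather than a one-off snapshot — is what turns the KPI into the continuous-monitoring practice the interviewees describe.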

As P2P is a complex, human-in-the-loop process, Blasilli emphasizes that “we had to think about process intelligence slightly differently and develop new ways of measuring process performance that make sense for a more unstructured work pattern. It was not as simple as looking at cycle time, which is often used for more repetitive work patterns where it’s all about efficiency and automation.”

AI and the Future of Process Intelligence

With the emergence of AI and machine learning, organizations are starting to use process intelligence to not only understand processes, but also generate predictive insights.

At Johnson & Johnson, for example, “we have evolved our capabilities from being able to show what happened to what could happen,” Marvin Johnson, vice president of service excellence, told us. “We can now predict how changes (or no changes) could impact the customer experience. In addition, to help turn insight into action, the system also provides contextual guidance on what to do next.”

At Maersk, they are also applying predictive process intelligence to cash flows, FP&A planning and accounts receivable. Mehra cautions, though: “We are very picky about the use cases we choose because transactional data can be inconsistent and contain variations, especially when you are trying to combine internal data with external data.”

Over the next five years, as AI and machine learning mature, they will not only enable process intelligence to be generated faster and more accurately, but also make it much easier to set up these initiatives. “One of the challenges of process and task mining is the identification of data points,” Blasilli tells us. “In the future, AI will tell us what data points we should be using.”


Process intelligence is mostly a data exercise

According to Novelis’ Catherine Stewart, president and GM, Americas, “The ability to analyze processes and identify inefficiencies is highly dependent on gathering the right data. It is important to recognize that any process intelligence project requires resources with the skillset to not only know where to find the data, but also how to manipulate it for successful data ingestion.”

The Conseil d’État’s process intelligence initiative, for example, involved sourcing and sorting data from 75 courts of justice. “To prove value quickly, we have gradually increased the scope of analysis. The project is separated into several phases: the analysis of one jurisdiction, then of six jurisdictions and finally of the full scope,” Ducloy explained.

Mehra concurs that data management is often the most difficult aspect of process intelligence. “Democratizing data through self-service is a major objective for us. However, pulling data from more than 100 systems, some of which are over 20 years old, is a very complex undertaking,” Mehra tells us. At Maersk, they consolidate data from operational systems such as ERPs and CRMs into a data lake. Users can then access and explore the data using a data visualization platform.

Getting process intelligence data into the hands of decision-makers, however, is only half the battle. Beyond putting the right data in front of the right people, it is essential to ensure people know how to use that data.

At Maersk, this means promoting data literacy and helping people understand what they can accomplish with the process intelligence tool. “We also try to give people the space to experiment with the tool and develop their own use cases,” Mehra tells us.

To maximize ROI, understand that automation is only one option

While digital technologies such as intelligent automation, process mining and RPA are incredible enablers of operational excellence, automation is not the solution to every problem.

Along with automation, simple process reengineering, training, custom development or redesigning a process with an AI solution should also be considered, Stewart tells us.

“Before you automate or use sophisticated tools such as process or task mining, you want to make sure you conduct a cost-benefit analysis,” Blasilli concurs. “Automation works best on high-volume, repeatable processes, so you won’t see the same ROI with processes that are low volume or unstructured.” For dynamic and/or low-volume processes, he recommends sticking to manual approaches such as process mapping.

Engage the right stakeholders

Engaging stakeholders from the very beginning is critical for success, Blasilli tells us. “If you don’t get that buy-in early on, it will be very challenging to implement changes down the line.”

At EDP Renewables, they adopted what Blasilli describes as a “bottom-up” approach to process intelligence: “To kick off the project we worked with frontline employees and subject matter experts to understand their pain points and challenges. To ensure the insights we share continue to be accurate and relevant, we are constantly checking in with our users and asking for feedback.”

Stewart agrees, stating: “Projects should start by aligning with the overarching business goals and a clear understanding of how ‘improvement’ will be defined, whether it’s cost reduction, time savings or staff optimization. IT and application owners play a critical role in providing the necessary data for analysis.”
