By BTOES Insights Official
August 16, 2021

Process Mining Live - SPEAKER SPOTLIGHT: Process and Task Mining: Hybridizing Solutions to Harmonize Insights

Courtesy of Software AG's J-M Erlendson, below is a transcript of his speaking session on 'Process and Task Mining: Hybridizing Solutions to Harmonize Insights' to Build a Thriving Enterprise that took place at the Process Mining Live Virtual Conference.


Session Information:

Process and Task Mining: Hybridizing Solutions to Harmonize Insights

Many organizations struggle to realize the benefits of Process Mining, despite having good transactional data and powerful toolsets. What if they're not seeing the whole picture? What if key improvement opportunities are buried somewhere between the system and the human?

In this session, we’ll explore the practice of combining insights from traditional process mining and task visibility from Robotic Process Mining. We’ll discover the benefits of combining these techniques and their data sets to make more evidence-based decisions for transformation. Lastly, we’ll explore a real-world example and see how a major manufacturing organization uncovered end-to-end process weaknesses and adopted these practices to tremendously increase efficiencies.

Session Transcript:

Our next speaker is J-M Erlendson, who is a Senior Systems Engineer for Software AG. He's a business process architect, methodology specialist, and transformation engineering lead. With over 15 years of experience, J-M has consulted for clients across industries, including the public sector, retail, utilities, manufacturing, lottery and gaming, telecom, and oil and gas.

He helps his clients develop and implement business process frameworks, hone process-centric strategies, and execute process improvement and architecture modernization projects. J-M, it's an honor and pleasure to have you here with us. You always bring tremendous insights to our global community, and on top of that, you are an all-around good guy, just a pleasure to be with, with a great smile and a great positive attitude in everything that you do. It's a real gift for all of us to have you presenting today. Well, thank you so much, Josie. And I feel like saying at the beginning: yes, J-M Erlendson, from Toronto, Canada. In the chat, we've got so many guests from around the world, and we've shown great industry and thought leaders in process mining coming from Canada today.

You know, and we have Belong, who is an AI expert and data science leader.

Also based in Toronto. Clearly, Toronto has this great cluster of great minds; we like to think of ourselves as Silicon Valley North, and maybe we've got a couple of folks here to back that up. Bill and I will have to connect later about gravel bikes, and Toronto has wonderful restaurants. Well, folks, I want to get started here. Joseph, thank you so much for the introduction. I want to get started by talking about something that I care quite a lot about. Hopefully, after this presentation,

you will too. The session today is called 'Process and Task Mining: Hybridizing Solutions to Harmonize Insights' for a very important reason. I see these two things as separate but important and interlocked components of understanding and getting insight into your business: how it works and, ultimately, how it's performing.

And being able to understand where to go next, and who to look to, to transform and improve your operations.

I understand that there's a transformational journey that a lot of people are on.

And prior to coming to these sessions and thinking about process mining, you might have been stuck in the top area of this slide, the gray design-and-execute loop.

This is where a lot of organizations kind of get stuck because they've got people trying to do their work, documenting processes, creating deliverables, but they're not really creating a lot of value for the organization. Kind of satisfying requirements at very minimum, or keeping the lights on.

And when you start to take a look at the right side, that's where you get this execution data collected.

I think what you're doing is starting to take lessons learned back from the way your business actually operates. Highlighted in green is where process and task mining start to come in.

They start to provide actual data on how things are operating, and give you insights on how to improve them. They're both part of the puzzle, and we're going to get into that as we talk about what they are, with a real practical example of what we do with them.

From those process mining and task mining insights, what do you do next?

Well, we'll take a look at an example of what we did. But at a high level, you want to understand where things are going wrong, be able to simulate the impact of changes should you address those issues, and be able to approve, govern, and check back on how task and process mining insights are being implemented, whether in an automation, in a business process transformation, or ultimately just in better learning and development for the people operating and executing your processes.

We understand that process and task mining are a pretty key part of a lot of your worlds, which is why you're here today. Because you want to optimize your process. You want to achieve process excellence.

You also want to understand whether or not people are doing what you said they should do.

Whether different business units are behaving differently and actually performing differently, and get conformance checks on those.

And then you want to digitize your business, to understand where you can automate and how you can use technology to improve your operations. Using things like RPA, a further level of automation, or different types of business rules, you can achieve high-value transformations simply by using data as your feeder.

Now, what is process discovery when we talk about process mining? I'm going to talk about both of these things in tandem, and it's important to recognize each is separate, but they dovetail together. At the bottom, you're going to see that for process mining we hit source systems.

We pull a bunch of sets of activities. Think of these as the data points that you're going to mine out: activities that occurred within those transactional systems you're pulling information from, which we're going to parse and align to the context in which they happened. So these are steps that happened as one person was processing an order, or steps that are part of the hire-to-retire process for a single employee.

And once we've aligned them to what happened, we can timecode them and sort them into the order in which they occurred, allowing us to understand the flow of information.
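As a rough illustration (this is not Software AG's tooling; the case IDs, activity names, and field names are all invented), the align-and-sort step described here can be sketched in a few lines of Python:

```python
from collections import defaultdict

# Toy event log as it might come out of a transactional system:
# each record carries a case ID, an activity name, and a timestamp.
events = [
    {"case": "PO-1001", "activity": "Approve Order", "ts": "2021-03-02T10:15"},
    {"case": "PO-1001", "activity": "Create Order",  "ts": "2021-03-01T09:00"},
    {"case": "PO-1002", "activity": "Create Order",  "ts": "2021-03-01T11:30"},
    {"case": "PO-1001", "activity": "Ship Goods",    "ts": "2021-03-04T16:45"},
    {"case": "PO-1002", "activity": "Approve Order", "ts": "2021-03-03T08:20"},
]

# Align each event to the case (process instance) it belongs to...
cases = defaultdict(list)
for e in events:
    cases[e["case"]].append(e)

# ...then sort by timestamp to recover the order in which steps occurred.
# (ISO-8601 timestamps sort correctly as plain strings.)
traces = {
    case: [e["activity"] for e in sorted(evs, key=lambda e: e["ts"])]
    for case, evs in cases.items()
}

print(traces["PO-1001"])  # ['Create Order', 'Approve Order', 'Ship Goods']
```

Real process mining tools do considerably more (noise filtering, variant grouping, visualization), but grouping by case and sorting by time is the core of turning raw transactions into a flow.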

Now, if this is new to you, welcome to Process Mining Live. If this is not new to you, you're obviously an experienced practitioner, but I want you to note very carefully where all our data came from and what kind of source systems we had.

We're looking at transactional data pulled from data spaces, data lakes, available tables, and connections into databases. We're pulling elements measured by system transactions. More on that in a second.

Now, what do you do with task mining? What is task mining?

Conversely, here you've got a bunch of different machines of individual executors in your organization.

And each of them is executing processes.

Those processes are on the desktop, using systems that wouldn't normally capture automated transactions: Excel, Word, Outlook, all those different manual tools. Call them manual, I suppose; they're programs that don't have a transactional bent to them. And what we're going to do is build a visualization of those individual user actions, based on screen scraping and click-and-key logging.

What that allows us to do is understand what person A did on day B to execute process C: where did they click, what did they type, what did they do to achieve those business goals?

We're seeing and collecting this data into a log.

Unlike the transactional log we saw before, which comes through a system and is controlled by the parameters of the automation, here we are controlled by the parameters of the parser.

So the task miner helps to understand and consolidate these actions into a recognizable process flow. It's also entirely based on individual user action, which gives us a different level of insight into what's happening on the desktop.
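A minimal sketch of that consolidation step, under the assumption (mine, not the speaker's) that consecutive clicks and keystrokes in the same application can be rolled up into one task step:

```python
from itertools import groupby

# Raw click/keystroke capture from one user's desktop session (illustrative).
ui_events = [
    {"app": "Outlook", "action": "open e-mail"},
    {"app": "Outlook", "action": "copy order number"},
    {"app": "Excel",   "action": "paste into tracker"},
    {"app": "Excel",   "action": "update status cell"},
    {"app": "SAP GUI", "action": "enter order number"},
]

# Consolidate consecutive events in the same application into one task step,
# which is roughly what a task miner does before drawing a flow.
steps = [
    {"app": app, "actions": [e["action"] for e in grp]}
    for app, grp in groupby(ui_events, key=lambda e: e["app"])
]

for s in steps:
    print(s["app"], "->", len(s["actions"]), "action(s)")
```

Production task miners also apply data masking and much smarter step detection, but the shape of the problem, turning thousands of low-level UI events into a handful of recognizable steps, is the same.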

But it doesn't give us a view into anything else. It just tells us: where did you click? Did you go do your taxes?

Of course, we put data masking on all this stuff. But comparing these two things together: process mining allows us to capture those transactional automated steps and display key attributes captured by the system. For instance, there's a lot of information the systems themselves capture, things like financial data, order quantities, or volumes.

We call those measures and dimensions: dimensions being things we can segment our process by, like business unit, type of order, or priority of the order; and measures being quantities and values that we can calculate on. However, process mining only ever sees our transactions; it only sees what our system sees. Process mining obviously isn't very new, but this is the weakness of the whole philosophy: you can only ever see what your system sees.
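The measures-and-dimensions idea can be made concrete with a small sketch (the business units and cycle times here are invented, and this is a generic illustration rather than any vendor's API):

```python
from collections import defaultdict

# Each mined case carries dimensions (how we segment) and measures (what we calculate).
cases = [
    {"business_unit": "EMEA", "order_type": "standard", "cycle_days": 4.0},
    {"business_unit": "EMEA", "order_type": "rush",     "cycle_days": 9.0},
    {"business_unit": "APAC", "order_type": "standard", "cycle_days": 3.0},
    {"business_unit": "APAC", "order_type": "standard", "cycle_days": 5.0},
]

def avg_measure_by(dimension, measure, rows):
    """Average one measure, segmented by one dimension."""
    buckets = defaultdict(list)
    for r in rows:
        buckets[r[dimension]].append(r[measure])
    return {k: sum(v) / len(v) for k, v in buckets.items()}

print(avg_measure_by("business_unit", "cycle_days", cases))
# {'EMEA': 6.5, 'APAC': 4.0}
```

Swap in "order_type" for "business_unit" and the same one-liner answers a different segmentation question, which is exactly why dimensions are so useful in these tools.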

And in some ways, that's great, because your system sees a lot of things, particularly if you're a very heavily automated shop or you're on an ERP system that is built to control and manage everything you do.

That's great.

But it doesn't explain what I call suspicious activities very well.

Task mining, on the other hand, has evolved to capture activities on the desktop, aggregated over a user, understanding the variances in how a particular process is executed by a user.

However, this only ever sees a user action.

So if a user doesn't click or type somewhere on the screen, during the period of capture of your robotic process discovery bot, it doesn't exist in the system.

That seems to miss a few of the things we're talking about.

As you might imagine, there's a lot of the business that is not done at the desktop level. There are transactions and automations, processes that run in the background, on the back end, behind the scenes. Work also moves from user to user dynamically as you go through end-to-end handoffs. When you only see user actions, you have to start doing things like stitching them together by user.

It becomes a limiting factor of task mining.

What I argue for today, and what you should be taking a look at and thinking about, is the hybridizing of these two solutions to harmonize the insights that they provide.

Process mining provides you an understanding of the automations, how they are run, and the steps being done underneath them.

Task mining helps you to understand the individual user actions taken to achieve those steps. It helps you focus in on those suspicious automated steps that really aren't so automated, and to see exactly where you can focus your efforts to improve: not just by re-automating systems, but also through learning and development, individual user support, and performance enhancement. Things like robotic process automation can help at the desktop level to solve some of the suspicious problems we see at the transactional level. It's not all about doing a system re-implementation.

Because that's expensive, time-consuming, and often not even the right thing to do.

We see this all come together in a technical life cycle that we've really honed, developed, and implemented here at Software AG for a bunch of our clients. I'm going to go through an example today as well. On the left side, you're seeing the discover-analyze duality.

We discover the processes that are executed in systems; we analyze the user actions happening at the desktop level; and we bring those together into a design environment. Now, I see these as two levels of hierarchy.

The discovered process is the higher-level model, the process model: what is your automation doing?

The analyze level is our sub-processes: what is an individual user doing underneath each one of those little boxes in your model? And when I say your model, I do mean your model. I would always recommend that you take any insights you've gathered from process mining and task mining and put them into a modeling blackboard via a modeling platform. Why? Are you just trying to gather insight?

Well, the answer is you're not; you're actually trying to effect change on the business. If all you have is a system throwing up a red flag and saying something's broken, all you're going to get is people complaining at you. What you want is to be able to do something about it. You want to remodel; you want to design and develop the next way of executing your processes, either at the system level through re-automation, or at the user level through learning and development, training, and RPA, and to develop solutions at each of these levels and put them together to solve the problem you have found.

And then when you deploy and operate, you also always want to check back. I talked about this at the beginning, but it's a really important part of the puzzle. A lot of organizations tend to invest money, and 'invest' is a strong word: they tend to throw money at a technology in order to solve a problem. What they really want to do is throw money at a problem and find a technology to enable the solution.

And then when that ultimately does something, and it's a nebulous outcome, they go, 'Ah, well, it was a great try.'

Well, what if you could check back and see iteratively how you're improving your processes? You could make a much stronger business case for the next project; you could even justify and track benefits and show the value of the work you've already done.

So make sure you've got that piece of the puzzle in there as well. You want to capture; you want to model.

You want to deploy; you want to operate and capture. Those are important parts at all levels of your process.

So what is the value of this hybridized solution?

First and foremost, we want to uncover those process weaknesses in ERP configurations and other heavy automation systems, where we can do re-engineering.

We also want to evaluate the people who are executing the process.

We can see that in our transactions through what we call interaction analysis, how people are touching the process; we can also see it at the desktop level, where people are executing processes at a very low level.

We want to provide a roadmap to understand what process transformation is going to look like.

And when I say process transformation, I don't just mean what you're going to deploy in your ERP. I mean: how are you going to improve your business operation for the people doing it, as well as for the systems?

Understand your SLAs: how are they being measured against from an external supplier perspective, and how are we helping our internal teams deliver on their promises to other teams?

I've worked with tons of organizations where the biggest problem isn't the solutions getting delivered from their clients or vendors; it's the people doing internal shared services who are not meeting requirements and standards, causing slowdowns. If you simply had better alignment of resources, more understandable expectations, clear goals, defined steps of operation, and ultimately a better relationship between the business and automation, of course you would be able to deliver and achieve your goals.

And lastly, we want to understand our low-value activities. This is something that process mining is sometimes good at, but task mining is particularly good at. Process mining says: this step is taking a long time; I'm suspicious of this step. There's some automation to it, but I don't understand why this transaction isn't resolving more quickly.

Well, there's a lot of low-value activity happening underneath it. We want to reveal those activities, merge them up, contextualize them under the mined process, and ultimately be able to reduce or remove them so that you can get a better result.

Faster processes, more efficient processes, your time focused on what matters.

So, a lot of people are going to be theorycrafting in this conference, and I love it. They are thought leaders.

I think they're fantastic at what they do, and I have been learning from their understanding and insights from the marketplace. Today, in this session,

I'm going to spend a lot of my time talking to you about what actually happens, and what specifically actually happened.

I'm going to bring in an example of where I've done this specific work, why it was valuable, and what we were able to get as an outcome. So today, we're going to dive into a real example, a real execution of process and task mining put together.

So, what are some key deliverables and tasks for a hybridized approach? Number one, we want to understand what information you have.

Existing documentation will help to set a baseline for your expectations of process execution. That could be system documentation, system specs, or code; or, you know, if you have a big SI who has implemented something with you, they probably produced a lot of paperwork. Be able to understand what you've got, and what they said they were doing when they automated your processes.

You're also bringing in your business processes, the strategy, the execution that was expected from your users, your people, at a lower level, something you can compare to, and filling these libraries up with re-usable assets.

Next, as part of this, we want to automatically create process models, to start the conversation and give you a place within which to have that conversation.

We want to bring the data from process mining and the data from task mining together and stack it top to bottom, so we can understand: here's our process, and here are our tasks, how we do it.

Once we have this stack, and we have our existing information,

We now combine the two.

What are we doing versus what do we think we should have done?

Interesting: where are our variances, where are our non-compliances, and how can we learn from some of the wisdom of the crowd? I worked for a fairly major oil and gas company in the south, where they learned from the wisdom of the crowd that they shouldn't be doing a lot of the process steps they had mandated, because the ones who weren't doing them were having much faster throughput times with the same outcomes. Well, let's take that lesson learned and use it to drive process improvement.
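That comparison of "what we said we do" against "what we actually do" is conformance checking, and a toy version of it is easy to sketch (the step names and traces here are invented; real tools use far richer algorithms than simple skip/repeat flags):

```python
# Documented "happy path" vs. traces mined from the system (names illustrative).
expected = ["Create Order", "Approve Order", "Ship Goods", "Invoice"]

mined_traces = {
    "PO-1001": ["Create Order", "Approve Order", "Ship Goods", "Invoice"],
    "PO-1002": ["Create Order", "Ship Goods", "Invoice"],  # skipped approval
    "PO-1003": ["Create Order", "Approve Order", "Approve Order",
                "Ship Goods", "Invoice"],                  # rework loop-back
}

def check_conformance(trace, expected):
    """Flag skipped steps and repeated (looped) steps against the documented flow."""
    skipped = [s for s in expected if s not in trace]
    repeats = [s for s in set(trace) if trace.count(s) > 1]
    return {"conformant": not skipped and not repeats,
            "skipped": skipped, "repeated": repeats}

for case, trace in mined_traces.items():
    print(case, check_conformance(trace, expected))
```

Note the "wisdom of the crowd" twist from the oil and gas story: a non-conformant trace like PO-1002 isn't automatically bad; if the skippers are faster with the same outcomes, the documented process is what should change.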

We also want to produce dashboards and visualizations to highlight KPIs, show where we're strong or weak and where we're suspicious, and in particular run health checks on our suspicious processes and suspicious business units.

For instance, if you've got a business unit consistently spending more time on a particular process, why is that? Let's be able to go and investigate it. It could be a result of staffing issues or communications issues within that particular business unit; or we could have a demographic of orders or customers that are particularly difficult, always ending up going to that business unit. As I found with one of my clients, it turned out they weren't just slow; they just had all the worst customers. So let's recommend something to make them better; let's figure out how we can do things better with that.

So, what does a real hybrid deployment look like? Here's the one I'm talking about; let's get some lessons and insight from it.

Let's go straight into how we did it.

We pursued this when one of our clients said: I would like to do the following things. I want to understand where I am manually intervening in my automated process. There are two levels here: there's the automation, which process mining covers, and then there's the manual work, which task mining covers. So where do I have to intervene, and what am I doing when I intervene?

I want to reduce those. They said: we want to take away 20% of our process cost. And we said, let's go for it; we're going to make that happen.

They also wanted shorter wait times, because people were waiting a long time for their orders to come through internally, and that was a serious issue; it meant that customers ended up being impacted.

We wanted increased customer satisfaction, to increase that top line, and also to decrease customer complaints by 60%: complaints from those extra wait times, or from incorrect orders and rework that had to happen as a result of process weaknesses. And we wanted to figure out: are our partners delivering on what they said they would do? There were some serious issues they thought they had with vendors, but they couldn't identify them; they couldn't pick them out.

So with process mining, we're going to pull data in order to understand the satisfaction of our customers. We wanted to get those insights and bring them in, and that's when we start to talk about an ecosystem of data that comes in to inform where you should go. Where does your NPS, your Net Promoter Score, factor into this? Well, now I'm gathering additional data on satisfaction, and we can identify which processes from the process and task mining need to be focused on.

Which steps in our process mining, and which parts of our task mining, and get that street-level view. They also wanted to create, essentially, a repository of documentation to help understand what they're doing, figure out how the users transact, and work out how to get improved work efficiencies.

Through retraining, reprioritization, and ultimately the implementation of a new way of working with greater transparency. This is a real client example. So how did we do it?

First and foremost, you focus on which process you're going to start with. What's our most important thing? We picked the order materials and services process.

It's a common process that exists across a lot of organizations: 10,000-plus executions a month, and somewhere between 18 and 20 steps in the transactional systems to execute that process. Now, they thought there was significant variance. They thought it was probably about human touches, because its cycle times were very long, but they couldn't identify exactly where, and they couldn't identify why.

What did we do? We imported their content first: what do you have?

We populated the repository with any existing information, leveraging the BPMN (Business Process Model and Notation) format to document and lay out the movement of information, particularly between systems, because that's where we're going to catch those breakouts, where manual rework and manual steps needed to intervene, as well as building those contextual organization and system libraries.

Then we ingested the data from their central processing system; they had an ERP that was controlling a lot of the movement of information. We rendered that data and mapped and matched it through keying, helping to understand which step happened first, where it happened, and which process instance it fit into.

And then we evaluated all those things based on KPIs and flow conformance: how are you performing compared to the documentation you provided, and how are you performing compared to overall expectations, in terms of the delivery of these processes?

And so we got down into the weeds there and made it clear, OK, here is what you're doing. And here's what's going on.

And this is what we ended up getting. Once again, this is just to give you a practical example, a practical approach to making it happen. So, first and foremost, here's our process.

This is what the process model looked like.

Here are the steps we ran into, and the order in which they occurred, from the documentation they were able to provide. This isn't mined yet; it's just the documentation. So this is where we start: this is what we're trying to improve; this is our nirvana. We're trying to achieve an optimized version of this order materials and services process.

Next, let's mine this: how is this process actually running? As you might imagine, it didn't quite work the way they expected. On the left side of the screen, you're seeing a bunch of loop-backs and variances, and a whole whack of different things that people were doing to execute this. There was a lot of rework, a lot of loop-backs, a lot of variant paths. And we were able to filter down those dimensions by things that were important to them. Are people running this differently? Are suppliers running this differently? How does it vary as a result of characteristics we know we can control? And ultimately, what are the suspicious steps?

Let's get into things that are problematic: what steps are sticking out? And I say 'suspicious' because we don't know for sure why these steps are the longest-running. We don't have all that information. What we do know is that these steps have manual touches and that they are taking a long time.

And this is where we start to take a look at the lower level.

We start to take a look at the dimensions that we're analyzing.

And we start to look at the task mining underneath these steps, which we'll see in a second.

So, where do we go next, and how do we look at this to get value from it? We want to understand our key candidates for change, from an individual-step execution perspective. We're not necessarily talking about whole-system re-engineering. There's definitely some variant engineering we want to get into and pull back, so that you can have a more standard, informed process.

But let's see: what can we use task mining for here?

We also wanted to understand: are people and suppliers delivering relative to what we wanted? Are we hitting our KPIs? Are some of our suppliers problematic? We can pull that information out and help them understand why and where things are starting to break down.

What's the source of our long-running processes that rely on external partners?

And then, where do things go wrong? Here's where we really dig in. Where are the individual steps that, when executed, took an unacceptable amount of time?

Things that were way above our KPI thresholds, and things that were causing long running process instances as a result of execution.
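A toy version of that "suspicious step" filter, long-running *and* manually touched, might look like this (the step names, durations, and the eight-hour threshold are all invented for illustration):

```python
# Average duration per step (hours) mined from the transaction log, plus a
# flag for whether the step involves manual touches; all values are invented.
step_stats = {
    "Create Order":  {"avg_hours": 0.2,  "manual_touch": False},
    "Clarify Specs": {"avg_hours": 26.0, "manual_touch": True},
    "Approve Order": {"avg_hours": 3.0,  "manual_touch": True},
    "Ship Goods":    {"avg_hours": 12.0, "manual_touch": False},
}

KPI_THRESHOLD_HOURS = 8.0  # illustrative SLA-style threshold

# "Suspicious" = long-running AND manually touched: the steps worth drilling
# into with task mining before blaming the automation.
suspicious = [
    name for name, s in step_stats.items()
    if s["avg_hours"] > KPI_THRESHOLD_HOURS and s["manual_touch"]
]

print(suspicious)  # ['Clarify Specs']
```

Note that "Ship Goods" is slow but fully automated, so it's a re-engineering question, not a task mining one; the hybrid approach is exactly about routing each slow step to the right kind of investigation.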

Now, that's really important to know, because once again, we are saying these are suspicious.

And when you're suspicious, it's time for you to take a look much more closely, figure out, hey, can we make this better?

And how can we make this better.

Now, I group processes into four different quadrants when I take a look at these sorts of analyses. And it's really important to go over this; I think it's a great structure for understanding where things can be improved and how they can be improved.

I group this on a chart with two axes. The left axis, the Y axis, is our frequency.

So, how often does this process happen?

And the X axis is the amount of time it takes.

So we can take a look at each process: how long does it take, and how often does it happen?
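As a rough sketch, classifying mined steps into these quadrants from frequency and average duration might look like this; the cutoff values and quadrant labels are my illustrative assumptions, not prescribed numbers:

```python
def quadrant(frequency_per_month, avg_minutes,
             freq_cutoff=1000, time_cutoff=30):
    """Place a process step on the frequency/duration chart.
    Cutoffs are illustrative, not prescribed values."""
    frequent = frequency_per_month >= freq_cutoff
    long_running = avg_minutes >= time_cutoff
    if frequent and long_running:
        return "automate"      # top right: big automation payoff
    if frequent:
        return "optimize"      # quick but constant: eke out small gains
    if long_running:
        return "redesign"      # rare but slow: rethink / train / RPA
    return "low value"         # bottom left: leave it alone

print(quadrant(5000, 45))   # automate
print(quadrant(5000, 5))    # optimize
print(quadrant(20, 120))    # redesign
print(quadrant(20, 2))      # low value
```

The point of the chart, as described below, is triage: you spend your transformation budget where frequency times duration, and therefore potential savings, is highest.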

Obviously, at the bottom left, we have low value for improvement.

These are not really things where improvement is going to gain you very much. On the top right, we see automation as a big piece of the puzzle. If you've got long-running processes that don't happen that often, we want changes: we want to understand and build a better way of doing things, and that could involve things like RPA or learning and development. And then there are processes that are very quick-running.

Processes that have a lot of transactions occurring but don't take very long: there, you're going to want to do things like automation improvement, trying to eke out small benefits. It's really important to understand where you can do the most good. So where I'm finding things that take a long time, whether or not they happen often, that's when we start to take a look at even more suspicious process steps, things we can get into at the task mining layer. So what did we learn from our process mining?

We learned whether standards were being followed. We learned where the rework and loop-backs were. We learned where our most suspicious steps were, who was in violation of the SLAs and KPI thresholds we had set, and what our users were doing at a high level, whether they were performing better or worse.

And now we have the perfect case to head one layer deeper.

Now let's go down the rabbit hole into task mining.

What manual sub-processes are we actually executing? Which ones are suspicious against business KPIs? How do we improve individual execution? How do we capitalize on those learnings to support the implementation of new systems and technologies, and ultimately the re-engineering of our process and the way we do things?

So, one step here, as you can see on the left side, breaks into an entire sub-process.

And you can see on the sub-process, it's a little bit small, but you can read that we're actually looking at the individual actions happening in Excel and Outlook in particular, as people navigate their e-mails in order to fill out the automatic transactions.

These were transactions we thought were so darn automated, but they're not: there's a ton of manual work going into this, there are some loop-backs, and some very different fail conditions. There's a lot we should be concerned about in this individual executor's operation.

Interesting. So how do we improve that? Where do we go with that?

Well, let's take a look at what the actual flow of steps is for those people, and which manual steps within this process are taking the very longest times. It's a minute or two of time per step that we have to look at.

If we could do each of those in a matter of seconds instead, we could save a huge amount, because of the time we're taking for each of these parts of the process step. You can imagine the transaction itself is not what's taking 30 or 40 minutes, an hour, two hours. No: the transaction is fast; the manual work is slow. And we can actually dig into this and say, here is where the problem is, at the task level.

Reveal the flow of manual tasks. Reveal what people need to change. It's not a process re-engineering or re-automation thing; it was, all along, a task-level problem: people doing things differently.

Let's use that data to drive the decision on how to improve. Don't just throw a bunch of money at the wall and hope that automation, or re-automation in your big systems, is going to solve the problem. Understand that the automation in those big systems is being supplemented by work at the manual level, where we can make a really big change for the better.

Then, how do we use this? Not only are we doing process re-engineering, because we found where the problem was. We're also going to use this to create digital SOPs, because one of the problems was conformance to process. How do we make sure that people do it better the next time? We need to bring this together with the documentation we've already spoken about, and with the previous automations we've already spoken about. Now we're creating digital SOPs to support learning and development, and coordination with things like RPA. We want to make sure that the people are taken care of at the very lowest level, so that data helps the users get better through job aids and through support for their ongoing operations.

In fact, a lot of the organizations we work with use models, and a portal that comes along with them, as a way of making people more informed, better prepared, and ultimately better able to execute on their processes.

And then, how does this data help our developers? We want to design, develop, and deploy.

You design your automation landscape using simulation from the statistics captured through process and task mining, that hybrid approach with multiple levels.

We want to simulate what's happening at each of these steps.

And where is it failing at the lower level? What if I were to make a certain change? What if I were to remove these steps, or set a new standard? What if I were to add an automation here that would reduce the time required? I can get that statistic and information in from my automation partner, put that into the mix, and understand what the business would gain from that.
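That what-if exercise can be sketched as a small Monte Carlo simulation. The step names, means, and standard deviations below are assumptions for illustration, not statistics from any real mining run:

```python
# Compare the as-is process against a scenario where one manual step is
# automated, using per-step duration statistics (assumed here; in
# practice they would come from process and task mining).
import random

random.seed(42)

# (mean, stddev) of step durations in minutes, illustrative values.
as_is = {"receive": (2, 0.5), "manual_rekey": (12, 4), "approve": (3, 1)}
to_be = dict(as_is, manual_rekey=(0.5, 0.1))  # step automated via RPA

def simulate(steps, runs=10_000):
    """Average end-to-end duration over many simulated cases."""
    total = 0.0
    for _ in range(runs):
        total += sum(max(0.0, random.gauss(mu, sd)) for mu, sd in steps.values())
    return total / runs

baseline = simulate(as_is)
improved = simulate(to_be)
print(f"as-is:  {baseline:.1f} min/case")
print(f"to-be:  {improved:.1f} min/case")
print(f"saving: {100 * (1 - improved / baseline):.0f}%")
```

Feeding the automation partner's estimated step time into `to_be` is what turns their quote into a quantified business benefit.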

And then we use that to send out requirements documents. Now you have: here's what I need from you, automation partner; here's what I need from you, people who are executing this process. And I'm developing both a technical solution and an organizational change solution, which I will deploy.

Deploying means both technically implementing and going live with your solution, and also communicating and monitoring the actual activities of your users. So: design, develop, deploy, based on the insights you've gathered at both of those levels.

And what did we learn through that task mining? Ultimately, what did we learn?

We found the significant manual rework our client was doing to support this "automation," in quotes, because it really wasn't fully automated in any fashion or form. We were also great for helping to start the conversation about application rationalization, because this automation had multiple different applications working on things. There were a lot of suspicious steps happening, and a ton of manual rework to fix automation problems in those larger systems. Why don't we rationalize that into something that is working? Let's look at the statistics on which processes in which systems are working better.

Let's harmonize, and go in on those in a big way.

Next, we helped to build the right way of re-engineering and automation: through ERP configuration, through manual rework or manual support through RPA and learning and development, through standardizing processes, passing out training information, and being able to socialize that change as part of the change management team.

And we realized that inconsistent results and throughput times could be defeated.

We could standardize, we could support, we could build. And we could ensure that the next way of doing things in this organization was better at both the task mining layer and at the process mining layer.

In fact, they were kind of one and the same.

They are part of one big model that we could put forward as a standard for how the organization was going to execute things moving forward, and ultimately be able to socialize it, get acceptance for it, deploy it, and check back to make sure it was done. So, what did we do?

We actually got there. In a very short amount of time, in the pilot stage, we were able to reduce the cost per execution in terms of human time, which was the biggest issue for them. They were also able to rationalize a couple of the systems they were working with that were definitely not value-added, and get to that 20% number.

They were starting to see substantially shorter wait times, even in the very first iteration of improvement.

That came through a very quick implementation of RPA to take some of those very suspicious steps and make them not quite so long-running and bottleneck-creating. And then, early on, people started to provide feedback and say: hey, this is fantastic. We love it.

With that customer acceptance of the new process and a better way of doing things, we were able to go back up to the SLA partners and say: hey, listen, I've found some issues that are happening as a result of the way in which you do business.

The way you are doing business is making my doing business harder.

And that's not OK. Let's talk about how we can work together in the future.

We were able to display and give them insight, all the way up to the executive level, into where things needed to get better and why they needed to get better.

That's an important thing for business case development and project justification.

And then we give control tower officers, and folks who are in the process ownership group, street-level visibility into what the users are actually doing. Before, people didn't even know who was doing what. Now, they have this visibility. They have this documentation. They're ready.

Not just for this implementation, not just for this improvement, but for the future.

They're ready for their future improvements at any level of implementation, and they can continuously monitor and use those lessons learned: where should I look?

What should I do about it? That makes their next way of doing things even stronger, even better, and builds organizational capital around process and task mining, hybridized to produce harmonized benefits for the whole organization.

And with that, I want to stop for questions and answers. I've saved my time for the conversation, as my speaking time has elapsed, and I want to invite my friend Josie back on the line.

As he and I are going to have a great chat about what we've talked about today and also any other insights and thoughts you might have.

Wow, what a great masterclass on the blending of process mining and task mining: a full picture of what's needed, with great examples. Lots of positive feedback here on your presentation and the practical insights you have provided. There are all sorts of different types of questions, actually a very interesting variety of questions coming up. And I am going to start here with one related to privacy. Of all things, you know, task mining is looking at the user tasks at the desktop level. We have people here from all over the globe, and they all have different perspectives on that.

How is privacy affected, and are these user actions protected by legislation such as GDPR and others? Specifically, how do you do task mining without infringing on the level of privacy that people want to have?

Yeah, that's a really good question. I would say every client has asked me that exact question, because it's scary. You've got Big Brother on your computer. What are they doing? What are they looking at? Are they getting the password for your banking information? Oh no, that's terrible.

Well, the truth of the matter is, task mining doesn't care about your banking information. It doesn't care about anything except the performance of your work in the platforms that are supposed to be used for work. So, task mining does two things. The first is a lot of data masking, which is really good for protecting people's privacy when they enter things like passwords that could be stolen and used with ill intent. The second thing is data filtering.

And segmentation, so you never even capture, in the task mining system, anything about any system that is not on your very specific pick list of things you are going to look at, because you have an understanding of how people are doing things. And you can identify if you're missing something, because there's literally a block missing in a process: hey, what happened during this huge empty space, this black box? You can go and fix that. But normally we start with the very smallest set of applications you're going to look at, and that will usually give you a very good idea without infringing on anyone's privacy, because they're working in Word, Excel, Outlook, PowerPoint. These are business applications for the purpose of completing business tasks.

And that's where we're going to find the information we actually care about, and no other information will even be captured by the system.
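The two protections described, an application pick list and data masking, can be sketched in a few lines. The application and field names here are illustrative assumptions, not the behavior of any specific task mining product:

```python
# Sketch: events outside the allow-list are never captured; sensitive
# field values inside it are masked before anything is stored.
ALLOWED_APPS = {"EXCEL.EXE", "OUTLOOK.EXE", "WINWORD.EXE", "POWERPNT.EXE"}
SENSITIVE_FIELDS = {"password", "pin", "card_number"}

def capture(raw_events):
    kept = []
    for event in raw_events:
        if event["app"] not in ALLOWED_APPS:
            continue  # never recorded: outside the pick list
        if event.get("field", "").lower() in SENSITIVE_FIELDS:
            event = {**event, "value": "***MASKED***"}  # data masking
        kept.append(event)
    return kept

events = [
    {"app": "CHROME.EXE", "field": "password", "value": "hunter2"},
    {"app": "EXCEL.EXE", "field": "order_id", "value": "PO-1042"},
    {"app": "OUTLOOK.EXE", "field": "password", "value": "secret"},
]
print(capture(events))  # browser event dropped, Outlook password masked
```

The browser event is filtered out entirely, which is the "never even captured" guarantee; the masked field shows the second layer of protection.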

That's good to know. Is that trigger at the application level? For example, when using Excel, is there a trigger there that starts recording those actions? How is that triggered? Yeah, it's a little desktop robot that sits in your taskbar, and it is triggered by the execution, the call-up, of these particular applications. So when you switch to the tab or switch to that program, it starts the recording process; when you open the program, it starts the recording process. So yes, it's at the desktop level that that's done. You can then do a secondary level of filtering and anonymization as the data makes its way into the server that serves it up. So no human interacts with it before everything is filtered and controlled to make sure that nothing comes through. There are actually two levels of protection you build in. That's a good question.

Yeah, that's a great explanation, so thank you so much for that. The next question here has a little bit to do with governance: how do you identify and prioritize the right opportunities, knowing very well that it's not just about the business problem? It is about the business problem and having the right group of people to work on that business problem.

So, if you can talk a little bit about that: when you come into an organization, people say, oh, we have all sorts of problems, but we don't know which one would be good for process mining. What type of factors are you looking at

to help your clients identify and prioritize potential applications that can create the most value and be successful in implementation? What combination of processes and people are you looking to bring together to tackle the issue?

Yeah, that's a good question. I'll talk about processes first, then I'll talk about people. Processes depend on a conversation with the people who handle strategic decision-making for the organization. What are our core processes?

What are we focusing on, as part of the operations of our business, that we think provides the highest value to us and to our customers? What's the visibility of this process that I'm looking to target?

And then, what am I getting as feedback?

Is this a process that is problematic? For example, I worked for a very large bank up in Canada, and they had a loan origination process that was really problematic. They had a really low NPS, Net Promoter Score, which is the feedback score from their customers, and it looked terrible.

And they knew there was a problem with the process, because their customers were telling them there was a problem with the process. So you can take a lot of these strategic business insights and understand where you should look.

What are our priorities, and what is failing? OK, cool. The third lever is a little bit more nebulous: what can we affect?

And that's when we start to take a look at people. There are some business units that are very structured and rigid, and not going to change the way they do things; there are lots of regulations that make you have to do things in a certain way. So you are adding to the bucket of things we care about and that matter, and you are taking from this bucket the things we can't do anything about, because there are factors that prevent us from doing so.

And together, whatever you have left in that little bucket is going to be the set of things you should start targeting first. And you should start small and grow from there. Don't try to evaluate your whole end-to-end business across 30 processes.

Don't do that today.

That's a recipe for failure; start small and grow from there. An isolated process is usually good. Something that has manual touches is very good, because you want to bring both those levels together, and you want something that has visibility and measurability. So, for instance, a finance process, or something like a supply chain process that passes through a lot of different areas but is very measurable.

And it's contained within itself. Those are good places to start to look.
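The bucket logic above, score what matters, remove what you can't affect, target what's left, can be sketched as a simple shortlist. The candidate names and scores are invented for illustration:

```python
# Score candidate processes on strategic value, customer pain, and
# measurability, then filter out the ones we cannot change.
candidates = [
    {"name": "Loan origination", "value": 5, "pain": 5, "measurable": 4, "changeable": True},
    {"name": "Regulatory reporting", "value": 4, "pain": 3, "measurable": 5, "changeable": False},
    {"name": "Invoice approval", "value": 3, "pain": 4, "measurable": 5, "changeable": True},
]

shortlist = sorted(
    (c for c in candidates if c["changeable"]),  # drop what we can't affect
    key=lambda c: -(c["value"] + c["pain"] + c["measurable"]),
)
for c in shortlist:
    print(c["name"], c["value"] + c["pain"] + c["measurable"])
```

A regulated, rigid process scores well on value but drops out on changeability, which matches the "taking from the bucket" idea in the talk.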

For people, on the other hand, you want to make sure you bring the right team together. You want data scientists to understand your data. You want process engineers and business analysts to understand the context and how it fits into the way you flow things.

And you want to have feedback from your executor community, once again. I talked about the oil and gas opportunity: it is impossible to fully understand why a thing is happening if you don't talk to the people who are doing that thing.

So, having a digital collaboration platform, having feedback, having user interviews, having executor interviews, and being able to see all of that together will give you more insight and context behind the work you're doing. And then, once you've brought the team together to understand it and you've got those opinions, you want to make sure you've got the executor buy-in. So when you choose to implement a solution, you can push it back out to them and give them the tools they need. As I talked about on one of the slides, I'm giving them the tools they need, those job aids, to be able to execute as per the new design, so they can see the value of the feedback they've given and be involved in the process of improvement.

It makes sense.

Yeah, it does. Very good answer, and such great insights. It's also a good time to remind the audience that Software AG has made some handouts available. You can check those out in the webinar

window here on your right-hand side, under Handouts; there are a number of different handouts available there that you can access and download right now.

J-M, I have a super interesting question here: a very challenging, kind of unique question.

This is coming from someone who works with CMBS, commercial mortgage-backed securities, in financial institutions.

And one of the things they have is a lot of unstructured data. You would think that these mortgage-backed securities all follow a standardized contract. Well, they don't; every one of them has a different type of contract. Yet some of the work and tasks are pretty common, and they seem like they would lend themselves to automation. But there is a real struggle to automate, because the contracts are not exactly the same.

So, I think the overall question here is: are there any tips on how you deal with data that's a bit unstructured, or where there's a lot of variation in how, say, contracts come in? Is there an application where process mining and task mining can help, or maybe that's not the right technology for this?

Any suggestions on how to get better automation where there's a lot of variation in contracts and perhaps unstructured data?

Yes, I'm going to talk about this. There are two sorts of challenges here: one is variation, and the second is the data itself.

So, variation. I get that there is a lot of variation based on characteristics of the client, the product, and the parameters of that particular security. I've worked with a lot of financial institutions, and they bring up the exact same thing: too much variation, how do I deal with this? Well, the answer is that it's a set of process flows, with characteristics determining how each one flows. You're using flags and different types to identify which process flow it should go under. That's already a process, a thing you're documenting; you're just doing it in business rules inside your systems. That's OK.

You can capture that as decision tables, and you can capture that as variant process flows. Those are all fine. If you'd like, you can have a process with a hundred or a thousand different variants; it doesn't matter, that is actually correct. And then honing in on each variant one at a time lets you get granular analysis. So don't worry about the variation in the flow itself.
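Capturing that variation as a decision table can be sketched in a few lines. The flags, thresholds, and variant names below are hypothetical, purely to illustrate flag-driven variant routing:

```python
# A decision table: ordered rules whose flags on the case select which
# process variant it follows. First matching rule wins.
DECISION_TABLE = [
    (lambda c: c["type"] == "CMBS" and c["amount"] > 10_000_000, "large_cmbs_flow"),
    (lambda c: c["type"] == "CMBS", "standard_cmbs_flow"),
    (lambda c: True, "default_flow"),  # catch-all rule
]

def route(case):
    for predicate, variant in DECISION_TABLE:
        if predicate(case):
            return variant

print(route({"type": "CMBS", "amount": 25_000_000}))  # large_cmbs_flow
print(route({"type": "RMBS", "amount": 1_000_000}))   # default_flow
```

Each named variant can then be mined and analyzed one at a time, which is the granular analysis described above.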

Unstructured data is a different question, but the flow side can very much be automated, based on the business rules and the decision tables you build. On the unstructured data, it's a different question. Machine vision, which is what I'm talking about when we talk about task mining, is very good at understanding what a click is and what it represents. The key here is that you need to understand all the different types of clicks that are actually the same click: that this field here is the same as that field on this document, which is the same as this field on that document. All we're doing is keying in commonalities.

And, you know, who knows that? Your people, your business process executors, already know that.

Because they're making those decisions in their minds every time they look at a new contract or a new security document. They're saying: oh, this field, I recognize this field.

It was that field on this other document.

And so, when you key in all the different possibilities and you use robotic process automation, you're saying: hey, RPA bot, go look on the screen and see if you can find one of these 35 different variations of the same field. Oh, it found it? Put that data in the field. Let's automate this process.
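Keying in those commonalities amounts to mapping the many label variants a bot might see on screen to one canonical field. The variant lists here are invented examples standing in for the 35-odd spellings a real contract set would have:

```python
# Map on-screen label variants to canonical fields, so the same data
# point is recognized across differently worded contracts.
CANONICAL_FIELDS = {
    "borrower_name": {"borrower", "obligor", "mortgagor", "borrower name"},
    "loan_amount": {"loan amount", "principal", "original balance"},
}

def identify(label_on_screen):
    normalized = label_on_screen.strip().lower()
    for canonical, variants in CANONICAL_FIELDS.items():
        if normalized in variants:
            return canonical
    return None  # unknown label: flag for a human to key in

print(identify("Obligor"))           # borrower_name
print(identify("Original Balance"))  # loan_amount
```

Unknown labels returning `None` is the human-in-the-loop moment: an executor keys in the new variant, and the table grows, just as people grow that mapping in their heads today.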

Oh, OK, there are other fields it's found over here.

And so, as people click in different places, you can actually detect where those common fields are. You use robotic process discovery, this task mining, to understand the performance and execution, which tells you what you should be automating with robotic process automation.

So you use the technology to solve your problem on the discovery side, which allows you to use the technology to solve your problem on the execution side. Does that make sense?

It makes total sense, J-M. That's why you are one of my favorite presenters across all of the sessions: because you get into the details, into the how. What can be done? What's really challenging? Real, practical insights. We've got to bring you back and have another session; we'll just have a 45-minute Q&A with you on all of these challenges. A shout-out to the audience: we're having so many awesome questions provided by you, the practitioners who are watching this. So I promise you, we're going to bring J-M back in the future to do another one of these, maybe an even longer Q&A fireside with Josie and J-M.

Because that's J and J here, and it's not an infringement on Johnson & Johnson, because I know there are a lot of Johnson & Johnson friends on this call right now. J-M, what a pleasure to have you with us. Thank you so much for sharing your expertise and your wisdom. It's obvious that our global audience appreciates the strategy and the tactics that you share with the community, so thank you so much for being here with us.

Pleasure, my friend, I'll see you next time.

Thank you.

Ladies and gentlemen, a real industry leader and practitioner right there: J-M Erlendson with us, always providing such great insights and practical applications of process mining. We're going to wrap up this session now and give you a break, and we're going to restart at the top of the hour. When we restart, we're going to bring a treat for you: a great leader from Twitter, who's going to talk about process mining as a digital transformation accelerator. That's Andrew Kirklin, who is going to be joining us at the top of the hour. You do not want to miss Andrew's session, and I look forward to seeing you back here in the next 10 minutes or so. Thank you.


About the Author

J-M Erlendson,
Senior Systems Engineer, Intelligent Business Solutions,
Software AG.

J-M Erlendson is a Business Process Architect, Methodology Specialist, Conference Speaker, and Transformation Engineering Lead with over 15 years of experience in Business Process Management (BPM), Enterprise Architecture, Supply Chain Management and Project Management, helping clients develop and implement business process frameworks, hone process-centric strategies, and execute process improvement and architecture modernization projects.

He is a leader in business, founding and running multiple highly successful independent arts companies and charities. He is a process and EA tools expert and high-level trainer, with years of experience in documentation, simulation, analysis, and improvement, and in using these as a vehicle for change management. He combines this technical skill with strong leadership and relationship management skills, having run multiple BPM- and EA-focused projects with cross-functional supporting teams.

He also has a strong track record in business development, creating new relationships both internally and externally to drive BPM adoption and sales opportunities. He has consulted for a variety of clients in Canada and the United States, particularly in public sector, retail, utilities, manufacturing, lottery and gaming, telecom and oil and gas.


