Courtesy of ABBYY's Richard Rabin, below is a transcript of his speaking session, 'Gain a new perspective on your processes to build a thriving enterprise,' which took place at the Customer Experience Excellence Virtual Conference.
Gain a new perspective on your processes
For most businesses, Process Mining is done for the sake of incrementally improving business processes. And while process efficiency is important, the data you mine can be put to greater use. When applied correctly, process intelligence makes a big impact on your customer experience.
Process Intelligence provides visibility and control of your processes. This extends even if processes are distributed across tools or platforms. Bringing your data together removes friction so you can provide a smoother experience for everyone involved.
Process Intelligence can greatly improve the customer journey. Watch this session to see how the generated process model can improve your business.
Joining us from Philadelphia: Richard, welcome.
Thank you very much.
Richard is the product marketing manager for Process Intelligence, and he's going to present today's session, 'Gain a new perspective on your processes.'
Richard Rabin is a Product Marketing Manager for Process Excellence at ABBYY, the Digital Intelligence company.
He works closely with global enterprises to help them better understand and optimize their business process workflows and bottlenecks, select the initiatives that will yield the most business value with intelligent automation, and see how they will impact overall operational excellence.
Richard has more than 35 years of software engineering expertise.
And more than 10 years of working in the Process Intelligence world. Richard, off to you.
Thank you very much, everyone. Appreciate it.
Thank you very much.
So, as Jim mentioned, I'm going to be talking about process intelligence. Normally, when we talk of process intelligence, we're talking about how it can help a business gain visibility into their business processes, gain control of those processes, find where the inefficiencies are, and optimize their processes.
But that's not going to be our topic for today.
Today, we're going to be talking about how process intelligence can also help as far as improving customer experience.
And, in fact, I believe it can be a major asset as far as changing the customer experience.
So there's nothing new about organizations wanting to measure, and understand how their processes are performing.
This goes back a long way, and it might have included people with clipboards and stopwatches, looking over others' shoulders, trying to record what was going on as a process was being performed.
The problem is that that approach is very time-consuming, very error-prone, and it doesn't give you the level of data that's available in today's modern tools. What I'm talking about is process intelligence, which is really an offshoot of process mining. We will differentiate between process mining and process intelligence, and see how process intelligence specifically can make such a difference with customer experience.
When we talk about process intelligence, it's really made up of many different pieces. It certainly starts with process mining, but it adds task mining, and we'll talk about the benefits of having both of those.
But beyond that, there's also a set of analysis tools that are built in. So once you've collected the data on all your processes, we then give you various tools to understand and visualize the process flow.
And in the area of process intelligence, there are additional tools to do more numerical analysis of the details of your processes. In addition to analyzing your processes, we also have the ability to monitor your ongoing processes, as well as to make predictions about what will happen based on the experience we've seen so far. So we'll go through those, and specifically how that can affect the customer experience.
So let's start with basic process mining.
So with process mining, the general idea here is that any organization has a variety of systems of record, and processes running within these.
If it was as simple as a process runs in a CRM system, or a process runs in one of these other systems of record, then we could rely on the tools that are provided by that system, to be able to visualize and understand what's going on.
Different companies have different levels of such tools, and they would often provide something. But the issue that we have found, over and over again at many different companies, is that most business processes that people look at actually span more than one system of record. The significance of this is that, regardless of what tools that system provides to help you visualize what's going on, it can't show you the entire end-to-end process if that process runs across multiple systems of record. So with process mining, we're gathering the data from all of these various systems and bringing it into one place. Now, all the different process mining approaches need pretty much the same data.
We're looking for the breadcrumbs, the data artifacts that tell us what steps happened in a business process. We're always going to need to know when a step happened, what specific step it was, and some unique identifier, something that will allow us to group together all the steps that relate to one case, or one process instance. That might be a document ID, a claim ID, a patient ID. Those are the basic pieces of data that all process mining tools will use, and then, of course, any other data that relates to that case or that patient can also be included.
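To make that minimal data requirement concrete, here is a small sketch of an event log and how cases are reconstructed from it. The field names and sample values are purely illustrative assumptions, not any particular vendor's schema:

```python
from datetime import datetime

# A minimal process-mining event log: every row needs a case identifier,
# an activity (step) name, and a timestamp. Any other case attributes
# (amounts, operators, regions) can ride along as extra keys.
events = [
    {"case_id": "CLAIM-001", "activity": "Claim Received", "timestamp": datetime(2023, 1, 2, 9, 0)},
    {"case_id": "CLAIM-001", "activity": "Claim Reviewed", "timestamp": datetime(2023, 1, 3, 14, 30)},
    {"case_id": "CLAIM-001", "activity": "Claim Approved", "timestamp": datetime(2023, 1, 5, 11, 15)},
    {"case_id": "CLAIM-002", "activity": "Claim Received", "timestamp": datetime(2023, 1, 2, 10, 0)},
]

# Group steps by case identifier, in time order, to reconstruct each
# process instance regardless of which system of record produced the event.
cases = {}
for e in sorted(events, key=lambda e: e["timestamp"]):
    cases.setdefault(e["case_id"], []).append(e["activity"])

print(cases["CLAIM-001"])  # ['Claim Received', 'Claim Reviewed', 'Claim Approved']
```

Because every source system contributes rows of the same three-field shape, events from a CRM, an ERP, and a log file can all be merged into one timeline per case.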
So the first thing is, we need to bring all that data into one place to be able to work with it, into something such as a process mining or process intelligence tool. And you might be thinking: well, you've got a data warehouse. Couldn't you just use the fact that the data warehouse is bringing together all the data from all these different sources, and get what you need out of that? Or you have BI tools sitting on top of the data that can answer additional questions.
But the reason those tools do not suffice, and that something like process intelligence is needed, is that whether it's a database or a business intelligence tool, they don't know about processes. They can gather all the data, but they don't organize it as process steps. They're not structured to visualize a process, to visualize different aspects of a process, or to reason over that process, to do things like define process rules that span all these different systems and enforce those rules against your processes as they're running. So when we talk about total process visibility, it's really a combination of getting the data from all the sources and feeding it into, in our case, a process intelligence tool, which then organizes that data as a process flow, with a structure able to answer questions about that process. So, rather than relying on how people think their processes are performing, you get the actual facts for each process instance on what's going on.
Now, when you look at the data from process mining, you will sometimes find areas where it looks like nothing's going on for a long period of time. And sometimes the reason is that there is work going on in your process, but it's going on outside of these systems of record. As mentioned, I've been doing this for over 10 years, and in the early days, when I was working with process intelligence, people would ask me: what if I've got some steps that are not done in any of these systems of record? Someone's going out to the web to get information; they're pasting it into an Excel spreadsheet.
How can I include that information in my process diagram? And the basic answer at that point was: well, if it's not recorded in a system of record, you can't.
Well, of course, that's changed. What's changed is that there are new tools to handle what's referred to as task mining. With process mining, we're pulling data from systems of record, databases, log files, various systems that are recording what's happening in my automated process steps, or any process steps. Whereas with task mining, we're pulling data that's not in any of those systems. So we would deploy recorders out to our users' desktops (not the customers', but our users') to record what steps are being done. Now, of course, there are many challenges with this.
The first challenge is that if you're going to be doing this, you need to be wary of privacy concerns. If you're going to be recording your employees, there are many things you have to do to comply with regulations. They need to know they're being recorded. They need to have some control of it. You might want to control what things are being recorded and obfuscate or hide confidential data within these recordings. All of that's handled in modern task mining. But the real point is, by combining the information recorded from users doing these manual steps with the information we're getting from systems of record, we have the most complete view of what's actually going on in our business processes. And we'll talk in just a few moments about how this relates to the customer experience.
So, first off, I would point out that what we've described so far is all various ways to gather the data to help understand your process. But, of course, a major part of this is: what do I do with that data once I've gathered it?
I talked about the difference between process mining and process intelligence. Process mining typically focuses on this first display here, this schema display.
So what we do is take all the information from all the cases, all the instances of some process, and synthesize that into one diagram that shows the information flow, the flow between different actions within that process. Where I see a branching point, that might reflect the fact that some cases go from one step to this other one, whereas other cases go from this first step to a different second step. So there's a branch point, and different things go on. You get an idea of what's actually happening, what the flow is within your processes. And these schema diagrams can be very informative.
They can be annotated to include information about how long various steps take, or what percentage of the cases go one path versus another. They give you an idea of what's really happening, and by turning on an animation feature, you can get an idea of what the flow looks like. So there's a lot of value in that.
The fundamental problem is that it relies on processes having a fairly simple and regular pattern to them, so that you can draw them out like this.
But imagine if your process had three times, five times, ten times as many significant steps as we're seeing here, particularly in a workflow or case management situation.
You might have a condition where there are so many different branches going in so many different ways that it looks like a spaghetti diagram, and each case is unique, so that you can't easily summarize them in this kind of simple diagram.
Most process mining tools deal with this situation by starting to filter out the less often seen cases. Then you get back to some diagram that looks closer to this and is easier to make sense of.
Of course, the problem is that as you start filtering out the exception cases, you may be losing a lot of the data that really is significant to what you're trying to find. If you're looking for things such as opportunities for optimization, where time is being spent, then when you start filtering out the problematic cases, you're going to miss a lot of that.
So that's a limitation of this schema-based approach to process mining. When we talk about process intelligence, we start adding additional tools, which might include a display of all of the different paths that are taken by the various cases.
Typically, this might be structured so that the path shown here on the far left is our most common path. You can't see the metrics above each path, but there's information on what percentage of cases follow one path versus another, or the average time it takes when you follow one versus another. Or you can reconfigure it to show the cost when you follow one path versus another. These are different types of information that are available beyond the basic schema display.
But it still has a fundamental limitation: if all of my cases are very complex and involve a lot of human decision making, such that they're likely to take different paths from one phase to another, then this kind of diagram is still very limited.
So we get into other analysis types that are provided by process intelligence. For instance, you might have particular steps of interest to you within the process.
You want to select a start step and an end step, and then generate a diagram that shows you how many cases are very quick, versus a very small number of cases that take a much longer amount of time.
The significance of a display like this is that it no longer relies upon trying to visualize the entire process. With process intelligence, we're less reliant on that schema diagram, meaning that if you've got a set of problems that generate very complex processes, they may not work well with basic process mining.
With process intelligence, we have a number of types of information displays that will work regardless of how complex your process is.
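The start-step-to-end-step analysis just described can be sketched in a few lines. This is an illustrative assumption of how such a computation might look, with made-up step names; it works directly on per-case timestamps rather than on any overall process schema:

```python
from datetime import datetime

# Per-case timestamps for two steps of interest, picked by the analyst.
# Only these two steps matter; the rest of the (possibly very messy)
# process flow never has to be visualized.
cases = {
    "A": {"Order Placed": datetime(2023, 1, 1), "Order Shipped": datetime(2023, 1, 2)},
    "B": {"Order Placed": datetime(2023, 1, 1), "Order Shipped": datetime(2023, 1, 3)},
    "C": {"Order Placed": datetime(2023, 1, 1), "Order Shipped": datetime(2023, 1, 15)},
}

start, end = "Order Placed", "Order Shipped"
durations = {cid: (steps[end] - steps[start]).days for cid, steps in cases.items()}

# Most cases are quick; the long tail (case C here) is where analysis begins.
quick = [cid for cid, d in durations.items() if d <= 3]
slow = [cid for cid, d in durations.items() if d > 3]
print(quick, slow)  # ['A', 'B'] ['C']
```

The same per-case durations could then be grouped by any attribute (operator, region, product type) for the dimensional analysis described next.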
And it's not just seeing that the vast majority of my cases happen to handle this transition very quickly; you can then do things like dimensional analysis. Here, we might break this down further by operator, and I understand you can't see the details of these charts.
But you can see that one differs from another, and from yet another one of these. The point is that you start to get to the root cause of what is taking time within your process.
Because you can see, broken down by any dimension you care about, what the actual process flow looks like. And we can take it a step further within process intelligence: not just allowing you to pick, 'I want to see how long it takes from this step to that step,' but letting you ask the system to analyze all of the process instances and find where the bottlenecks are for you, showing you the transitions that are taking the longest.
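A bottleneck search of that kind can be sketched as follows: accumulate the elapsed time of every observed transition between consecutive steps, then rank transitions by average duration. The step names and data are hypothetical:

```python
from collections import defaultdict
from datetime import datetime

# One time-ordered event stream per case: (activity, timestamp) pairs.
cases = {
    "A": [("Received", datetime(2023, 1, 1)), ("Reviewed", datetime(2023, 1, 2)), ("Closed", datetime(2023, 1, 9))],
    "B": [("Received", datetime(2023, 1, 1)), ("Reviewed", datetime(2023, 1, 4)), ("Closed", datetime(2023, 1, 5))],
}

# Collect elapsed days for every transition between consecutive steps.
durations = defaultdict(list)
for steps in cases.values():
    for (a, t1), (b, t2) in zip(steps, steps[1:]):
        durations[(a, b)].append((t2 - t1).days)

# Rank transitions by average duration: the top entries are the bottlenecks.
bottlenecks = sorted(durations.items(), key=lambda kv: -sum(kv[1]) / len(kv[1]))
for (a, b), ds in bottlenecks:
    print(f"{a} -> {b}: avg {sum(ds) / len(ds):.1f} days over {len(ds)} cases")
```

Note that nothing here depends on the process having a simple, regular shape: every transition that actually occurred is measured, outliers included.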
And there are many other mathematical and analytical techniques that allow us to explore the data in your processes, regardless of how complex they are. You might want to look at things such as significant metrics, and how those metrics vary based on whether you do the process one way versus another.
Or you might want to set up a definition of a process behavior, and use that to filter and look for just the cases you care about. We'll talk more about what that can get you in just a moment.
Now, another one of the displays that's available in process intelligence lets you go down and look at an individual case. In our case, we'd refer to this as a timeline, where you can see all the steps that are done.
And once again, you might be thinking: well, in my CRM system, I can see a history of the case. But we come back to the fact that, in most of the situations we see, you actually have processes you care about that span multiple different systems, so no one system is going to be able to give you this kind of display. You need something that sits outside of your traditional systems of record and provides what might be thought of as a control plane: something that can capture data from all the systems, can visualize that data, and can even allow you to control or take action over these processes.
So, in this case, if we're looking at all of the steps that happened in a given process, we can even take those various steps and use pattern analysis to look for the different sub-processes, so you can see at a glance what's going on with this specific customer case. And it's not just seeing what steps happened and what sub-process they were part of; we also have information on the attributes and the attribute values, so that you can see, at any given time, what changes are being made to the case attributes.
And you can see that across the complete case history, again, even when that case spans multiple systems of record. So, there are a wide variety of tools that can be used to analyze the data.
The next piece is this: we've got all the data that's been pulled from these various systems and put into one place in my process intelligence tool.
This model that we're building of the process flow would sometimes be referred to as a Digital Twin or a process digital twin.
We've got this analytic model with all the information about what steps have happened in my process. One of the benefits of having such a model, kept continuously up to date as each step happens, is that I can use it for operational and compliance monitoring.
So I can define behaviors, process rules, and scan every instance of a process. Whenever one of those rules is triggered, I can either send out a notification (it might be an email or text message to a person or a group), or I can invoke a web service. That might be a digital worker, a bot, or any other web service, so that I can respond instantly when certain conditions are met, when a certain behavior is observed.
This is significant because we see various cases where organizations have a large number of people trying to monitor what's going on, but they can only monitor a small number of the customer cases. With this kind of automated monitoring, we're able to monitor 100% of the cases, and then get a human involved to review a case only when something significant has occurred that requires their attention.
I mentioned behaviors, and it's important to understand what we mean when we talk about a process behavior. Another reason this is significant: I mentioned before that even if you have all the process data in a data warehouse, the data warehouse isn't structured to visualize the process or to answer questions about the process.
Within this process model, within process intelligence, we not only have the definitions of the process, but we can graphically define a process behavior. Sure, you could write SQL that would essentially do the same thing with data in a database, but it would become very complex SQL, very hard to maintain, so it would not be something that individuals could easily build themselves unless they have a fair skill level.
Whereas in this case, when I'm defining a process behavior, I'm doing it all graphically by making certain selections. I'm looking for any process where no issue was reported before a case was opened; after the case was opened, an 'issue found' event occurred; then there were two 'work completed' events; and, by the way, it took more than 10 days from the time the issue was found to the time the work was completed. There are other behaviors that can be defined.
But in general, these behaviors revolve around what steps were included, what steps were skipped, what steps were repeated, the sequence of steps, and the timing between steps. You can also take any attribute value that's available to you and put in conditions relating to that value. So a behavior might apply only to a certain type of case, where you want to look for a certain condition.
So this is the idea of a behavior definition. Going back to monitoring: when we talk about being able to watch over the processes to see if they trigger certain behaviors, this is typically what we're talking about, a behavior in terms of the factors I just mentioned.
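As a sketch, the example behavior described above (issue found after a case is opened, two work-completed events, more than 10 days from issue to completion) could be checked against a case's event stream like this. The event names and function are hypothetical illustrations, not a real product API:

```python
from datetime import datetime, timedelta


def matches_behavior(events):
    """Return True if a case matches the example behavior: no 'Issue Reported'
    before 'Case Opened', an 'Issue Found' after opening, exactly two
    'Work Completed' events, and more than 10 days from issue found to
    the last work completed."""
    times = {}       # first occurrence time of each non-repeating event
    completed = []   # timestamps of every 'Work Completed' event
    for name, ts in events:
        if name == "Work Completed":
            completed.append(ts)
        else:
            times.setdefault(name, ts)
    opened = times.get("Case Opened", datetime.max)
    if "Issue Reported" in times and times["Issue Reported"] < opened:
        return False
    if "Issue Found" not in times or times["Issue Found"] < opened:
        return False
    if len(completed) != 2:
        return False
    return max(completed) - times["Issue Found"] > timedelta(days=10)


case = [
    ("Case Opened", datetime(2023, 3, 1)),
    ("Issue Found", datetime(2023, 3, 2)),
    ("Work Completed", datetime(2023, 3, 5)),
    ("Work Completed", datetime(2023, 3, 20)),
]
print(matches_behavior(case))  # True
```

In a monitoring setting, a check like this would be re-run on a case each time a new event arrives, firing a notification or web-service call whenever it first returns True.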
And the fact that we have a continuous (either real-time or near-real-time, depending on your needs) update of my digital twin, my process model, means that as soon as a new piece of information becomes available, I can re-evaluate these conditions and notify or automatically take action immediately. Well, we can take it a step further than that.
Take that same notion of having a process digital twin, a model of my process, continuously updated, but add to it a neural network, using machine learning, such that we train the network to recognize the behaviors of this process. Then we can use it to predict outcomes.
Those outcomes might be in terms of what step is going to happen next: is it going to be one step of interest or another? Or are we going to meet an SLA or not?
And we can have a probability defined that's continuously updated as each new piece of information comes in.
Is a patient leaving an Emergency Department likely to be dismissed to go home, or to be admitted to the hospital? Wouldn't it be nice if we could know that, or at least know it's likely, before it actually happens? If we could, based on our previous experience and the training of the model, understand whether an SLA is likely to be missed? And once again, we don't want to wait until it's missed.
We want to know as soon as we believe it's likely to be missed, so that we can notify someone who might deal with it.
Or are we going to automatically kick off an action to deal with it? By the way, at this point, when you've got a model, a digital twin of what's going on, you've got the ability to monitor it, to predict outcomes, and to act on those outcomes. That's what, in some cases, is referred to as a control plane.
It's something that sits outside your normal systems of record, watches over them, has some process understanding, and has the ability to take action and initiate notifications based on what's going on.
So, those are a few of the things that you might see available in this notion of process intelligence. How does it relate back to the customer experience?
And, again, I think this can have a major impact on customer experience in a way that you couldn't easily get with any other approach.
So once you've brought all the data together from all aspects of your process, all your systems of record, or even from recording what your users are doing, we're keeping this up to date. That might be real-time, although I will say that most of the clients I work with find near-real-time is good enough. If it's updated every few minutes, that's really all they're looking for; they don't need it updated every second or every millisecond, depending on your definition of real time.
And this gives you the ability to visualize what's happening in the process flow and any steps of interest. That's what allows companies to better understand their business processes. But when you take into account the fact that with process intelligence we're keeping data on every instance, that means I can also search for an instance of interest. And wouldn't it change the customer experience if your organization knew where the bottlenecks were and could improve the process? That's the basic process intelligence. But beyond that, you also have the record of each case, not waiting until it's done, but as the case is developing. As it's putting data into various systems, we're able to accumulate and visualize all of that. So if you have a customer service agent who's dealing with a client, they have a real-time view of everything that's gone on in your systems, regardless of what kind of reporting or notification is provided by any underlying system. This control plane, sitting on top of all of these, will give your staff a complete view of every customer journey as it's occurring.
Wouldn't it also change your customers' experience if you could add this automatic monitoring?
As we've discussed, you're able to look at not just some percentage of the cases, but to have automatic behavior checks, such that you're watching as each piece of information comes in: have I violated any rules? Now, in one sense, this could be done in the standard way that process intelligence is used, where you might be looking for concerns or compliance violations and dealing with those as soon as they're seen.
But you also might use this from the standpoint of trying to improve a customer's journey.
I'm currently working with an organization in the area of AI chatbots; they maintain a communication stream with various parts of the client's staff, and we are able to trigger based on what's happening in each case, as it happens. By working together, we can have an intelligent conversation with the customer as their process is going forward. And it doesn't require human intervention to set this up, so we can do this across every case, 100% of all of our various instances. The next one, as I mentioned, is prediction.
There are really a couple of different areas where I've been involved in using prediction with customer data and process intelligence. As previously mentioned, we've got the model of all of the interactions. We now also add a neural network trained to recognize the performance of this specific process type, and then it's able to predict various things. In this case, we're talking about an SLA.
So when we're talking about predicting an SLA, you would specify a threshold, say 75%. We're not saying 'when you've used up 75% of the time in your SLA, notify someone'; we're saying, based on experience, based on the training of your neural network, when the network thinks it's 75% likely that you're going to miss an SLA. At that point, you can send a notification, or you can initiate a service to deal with it, so that we're responding to this likely SLA breach before it actually happens. In many cases, we can prevent it.
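The threshold logic around such a prediction is simple to sketch. Here the breach probabilities are stand-ins for a trained model's output, and the function name is a hypothetical illustration:

```python
def check_sla_risk(case_id, breach_probability, threshold=0.75):
    """Called each time a new event updates the model's estimate for a case.
    If the predicted probability of missing the SLA crosses the threshold,
    return an alert message to be sent (or used to invoke a remediation
    service) before the SLA is actually missed."""
    if breach_probability >= threshold:
        return f"ALERT: case {case_id} is {breach_probability:.0%} likely to miss its SLA"
    return None


# As new events stream in, the (hypothetical) model revises its estimate;
# the alert fires only once the estimate crosses 75%.
for p in (0.40, 0.62, 0.81):
    alert = check_sla_risk("CASE-42", p)
    if alert:
        print(alert)
```

The point of the threshold is that action is taken on likelihood, not on elapsed time: the same 75% trigger might fire early for one case and never for another.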
So, once again, when you're thinking of customer experience, having knowledge of the likelihood of a problem before the problem occurs allows you to improve both the flow of your process and, of course, your customer's experience of that process when working with your organization.
And finally, that was predicting an SLA violation.
But another example would be predicting an outcome. So, for instance, in this case, I mentioned an emergency department.
So, when someone leaves this emergency department, they might leave it because they're going home, or they might leave it because they're going to be admitted to the hospital.
And with monitoring, you could certainly wait until the decision's been made to admit someone to the hospital; you send out a notice as soon as that decision has been made, and you're able to prepare for it. But wouldn't it improve your customer's, or in this case the patient's, experience if, as soon as it seemed likely that they were going to be admitted, we could send the appropriate notification, or call the appropriate service, to start preparing for that outcome, even before it's been declared official?
So, the bottom line is that when we're talking about process intelligence, there are many ways that not only can you improve the back-end performance of an organization's processes, making them more efficient and avoiding bottlenecks, but it can have very significant implications for customer experience. Your organization will now have complete visibility even while a process is going on, and while that process is spanning many different systems, you'll always have complete visibility into each customer journey and be able to use that when interacting with the customer.
You'll be able to keep track of and monitor every single customer instance, every customer journey, to understand what's happening and, potentially, notify and take action on what's going on. And, as just described, that also gives you the ability to have a neural network learn the behaviors of your process and use that to predict outcomes and SLA violations before they occur. All taken together, that gives you an ability to enhance customer experience in a way you simply couldn't do with other tools.
So, that's a fairly simple message, but it's what I wanted to present today. And with that, I would ask if anyone has any questions for me.
Thank you, Richard. Fantastic presentation.
Very, very good.
You ended on the half-hour. You're looking for lots of questions, aren't you?
All right, so I have a couple of questions here.
One question I have is around intelligent process automation, and now they talk about hyperautomation, predictive analytics, prescriptive analytics, the true digital twin. Can you provide an overview of all those different themes?
Because a lot of folks are probably a little bit confused. So, I mean, there are many aspects to that. From my standpoint, when I'm focused on process intelligence, one of the main things we're doing is giving visibility into your business processes. I didn't really focus on that as much in this presentation, because we're talking about the customer experience.
But when you have the ability to better understand the process flow, you've got tools that allow you to directly see where the bottlenecks are, or to define a process behavior. And whereas I talked about process behaviors from the standpoint of using them for monitoring, you can also use those same process behaviors as an analytic dimension.
So you can take a metric you care about, whether it's customer satisfaction or profitability, and with this kind of visibility, and this ability to define a behavior, you can ask: what is the customer satisfaction when we do the process one way versus a different way? All of these things come together: if you've got visibility into your process, if you can see where the inefficiencies are and get an idea of how to improve them, you can improve the overall process, and therefore the customer journey through that process.
And when you're talking about hyperautomation, or any form of automation, that might be RPA or other types of automation.
The real issue is, you need to measure and understand the process before you put the automation in place. The risk you run if you don't is that you find one particular inefficient portion of a process, you automate it, and that portion runs much faster than it used to, but maybe it just shifts the backlog somewhere downstream. So if you're focused on a particular chunk of automation, you run the risk of not really optimizing the overall process to the extent you think you are. Whereas if you use tools such as process mining and process intelligence to gain visibility into the end-to-end process, then you have a better understanding of how one piece of automation affects the overall process. It's one of the reasons that tools like this are so popular with the RPA vendors. Not only can we look at the manual steps we talked about with task mining, seeing which tasks are taking more time and which would be easier or harder to automate, so that we can help select the automation candidates; we can also use process mining and process intelligence to get a view over the overall process. You get not just localized improvements, but overall process understanding and optimization, by working in this kind of environment that gives you visibility into your process.
Yes. Yeah, I'm an old Lean Six Sigma guy.
What I typically see is folks going in and doing RPA, and they get overanxious and they want to automate a broken process.
So all the bad things are still happening.
They don't try to clean up the process first, and then those problems become nested.
And you can only do that for so long until you paint yourself into a corner, and then you end up having to undo it.
And then nobody wants to raise their hand for that conversation.
And then the whole program, kind of implodes after a year, a year and a half.
And because you're going after the low-hanging fruit, you're not dealing with any of the outliers.
So if you always focus on the average of an average of an average, you're going to get subpar performance in the long run.
And that also relates to the difference between using process mining and process intelligence for this purpose, because with process mining you're typically filtering down to the most average cases. So you really are looking at a description of your process behavior and performance based on the more average cases, whereas what you really need to do is consider that in the context of the outliers as well.
People talk about trying to get a faster time to automation, a faster time to value.
Part of that is not just building automation more quickly; it's building it more correctly, so that you avoid false starts, and you do that by understanding your process before you start trying to improve and automate it.
And those outliers that you mentioned before, right.
Those are customer experiences. Of course, yes. Those are your customers, and they might be pretty important customers.
And more than that, if they're outliers, they may be the customers that have issues. If you really want to improve the customer experience, you don't want to just focus on the customers that are following the so-called happy path. You want to be able to look at the wide variety of exceptions. And that's where traditional process mining has more of a problem, and where process intelligence, with less reliance on the schema diagram, which has a hard time representing complex sets of different cases, really comes into play.
All right. So, I have another question here.
If I have all my data from all my systems in a data warehouse, can I get the same visibility and control that I have with process intelligence?
So the basic point there is that if you've got all the data in one place, you certainly have an asset. But as I mentioned before, a database, a data warehouse, or even a BI system doesn't understand process flow. So if what you want to do is write a specific query that says, when I do step A, how long does it take until I get to step D? Sure, you can do that.
But if you really want to put together a realistic behavior description, and you want to be able to modify it over time and try different things, the SQL approach is going to be extremely difficult to build, extremely difficult to maintain, and it will rule out most people who might have an interest in this. Compare that with the type of behavior I showed you, where you're just graphically defining which steps are included, skipped, or repeated, the sequence of steps, the timing between steps, et cetera.
It's very easy to specify that graphically, which means that any business user... Excuse me.
Yeah, you want to turn off your camera, because you're breaking up.
We didn't get an answer for the last question.
I've turned off my camera. So, maybe this will work better.
Yeah, let's try again.
So, you might have an answer on.
I don't know, how much here, or what are you looking for?
Enter your question.
Well, that's kinda funny, because I'm not hearing you now.
Um, one more time.
Can you hear me?
I heard, Can you hear me? But I didn't hear anything else.
I'm sorry, Richard, can you hear me now? I hear you just fine, thank you.
OK, I'm sorry. Can you repeat what portion of the question you wanted me to readdress?
Oh, I'm going to turn.
We kinda missed the whole thing.
I guess the question was: if we have all the data from all the different systems in a data warehouse, can I get the same visibility and control that I have with process intelligence?
And the answer is no.
Process Intelligence is designed to organize that data as a process, to visualize the process, and to be able to answer questions about the process, such as: is this instance following some rule, or violating some rule? You're not going to have that with a data warehouse, and you're not going to have that with a business intelligence tool. You need something that understands process, or you're going to be stuck trying to analyze it by writing SQL queries. And writing SQL queries is complicated, maintaining them as things change gets difficult, and it rules out the ability of citizen developers, or citizen users of this technology, to define the process behaviors they care about. If you want to define a business rule, a process behavior that looks for certain conditions, it becomes very easy to do that within a process intelligence environment, and it can be very difficult to do that with a data warehouse or a business intelligence tool. And that's just the visibility part.
When we talk about things such as automatically scanning for bottlenecks, well, there's nothing in a data warehouse or in a BI tool to do that.
If we talk about predictive analytics, you don't get any of those capabilities built into a data warehouse, but you do get them built in and easy to use within a process intelligence framework.
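To make the SQL point concrete, here is a rough sketch, with an invented event-log schema, of the kind of one-off query Richard says is easy. The difficulty he describes comes when every new behavior, such as skipped steps, repeats, or timing windows, needs another hand-written, hand-maintained query like this:

```python
import sqlite3

# Invented event log: one row per step, keyed by case id, with a timestamp
# in minutes. A one-off "step A to step D" question is easy in plain SQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (case_id TEXT, step TEXT, ts REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [("c1", "A", 0.0), ("c1", "D", 90.0),
     ("c2", "A", 10.0), ("c2", "D", 250.0)],
)

rows = conn.execute("""
    SELECT a.case_id, d.ts - a.ts AS elapsed
    FROM events a
    JOIN events d ON d.case_id = a.case_id AND d.step = 'D'
    WHERE a.step = 'A'
    ORDER BY a.case_id
""").fetchall()
print(rows)  # [('c1', 90.0), ('c2', 240.0)]
```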
Excellent, Thank you so much.
I was also curious: as you move into digital twins, how do you define a digital twin from a data warehouse?
OK, so we have to be a little careful when we talk about digital twins, because there are so many aspects to digital twins.
A lot of people will think of it in terms of a manufacturing digital twin: if I'm going to build something, I want to build an electronic model of it so that I can manipulate it virtually before I actually build the thing.
What we're talking about I would refer to as a process digital twin. The way you would build that: there are various algorithms for analyzing process data, so you might use one that boils it all down to a schema, or you might do more of what my product does, which is, yes, we develop the schema from all the low-level data.
But we also have a lot of other algorithms for organizing the data and for extracting information from that data.
But for all of these, when we're talking about working with a process digital twin for this kind of purpose, it's always the same data you need. You need information on each step: you need to know when it happened, what the step was, and what the unique identifier is that's going to allow you to join together all of the steps that make up one instance. And then, of course, if you have any other data that relates to that, you can bring it in. So, for instance, with process intelligence you might often want to look at the average time it takes to go through a process.
But if, after the fact, I do a survey and ask people what their customer satisfaction was, or if it's healthcare, I might look at patient outcomes. Well, that can also be linked back to a customer ID or a patient ID. And then, when we're looking at the information in our process digital twin, it's not limited to just what happened, when it happened, and how long it took.
But any information that relates to that process instance, where the shared key might be the process ID, a document number, or a customer ID, means that data can also be part of our process digital twin. So these things I described, where you can have a user define a process behavior and then use that process behavior as an analytic dimension:
all of those are things that can be done because you built this model, which includes the steps that were done and any information that relates to that process instance, and you can do a wide variety of BI-type analysis as well as traditional process mining and process intelligence analysis.
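A minimal sketch of the process digital twin described above, using invented data: each event carries the three required fields (an instance ID as the join key, a step name, and a timestamp), and post-hoc data such as satisfaction scores joins in on the same ID:

```python
from collections import defaultdict

# Invented event log: (instance id, step name, timestamp in minutes).
events = [
    ("case-1", "submitted", 0), ("case-1", "reviewed", 30), ("case-1", "closed", 45),
    ("case-2", "submitted", 5), ("case-2", "closed", 200),
]
# Post-hoc data linked by the same id becomes part of the twin too.
satisfaction = {"case-1": 9, "case-2": 3}

instances = defaultdict(list)
for case_id, step, ts in events:
    instances[case_id].append((ts, step))

twin = {}
for case_id, steps in instances.items():
    steps.sort()                                   # order steps by time
    twin[case_id] = {
        "path": [name for _, name in steps],       # what happened, in order
        "duration": steps[-1][0] - steps[0][0],    # end-to-end time
        "satisfaction": satisfaction.get(case_id), # joined-in attribute
    }
print(twin["case-2"])
```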
I hope that answered your question.
That's fine. Thank you so much.
But as we're bringing new types of information into the data warehouse, we talked about screen-scraping-type information recording, as well as the rest of the data stack, right, in SAP or whatever the ERP environment is. Right.
Yeah, and there's phone calls, interactions.
Have you looked at bringing in phone calls, and using analytics on the conversations in those calls, to get closer to the true underlying customer experience?
So, we can certainly bring in information about a phone call as to what number and what customer was involved, how long the call took.
We have not; I have not personally been involved in analyzing the content of a phone call.
Where we normally get involved in my organization, with analyzing the content of customer interactions, is documents coming from the customer.
So we have expertise in analyzing documents and extracting unstructured data from them, and that becomes very important as one way to augment your knowledge of the customer journey. When we talk about the blind spots of process mining, one of those blind spots is information that's unstructured, and in my case I'm referring to information coming in on documents, whether they're electronic or paper; it doesn't matter. It's the ability to understand the document, understand the various fields in the document, and even take some freeform text and extract meaning from it.
That can be used to augment the attributes in this process digital twin, this model, and that allows for a deeper level of contextual information that becomes part of the process queries that you make or part of the behaviors that you define.
Fantastic. Wow, the technology has exploded in the last few years, hasn't it, Richard?
It certainly has, yes, and there's much more to come.
But it's really interesting and informative to hear what the latest and greatest is. I love your approach.
And I think you gave an excellent talk, with insight into the process intelligence, RPA, and process mining world, and what you really need to do to truly understand the ways to improve your processes.
So thank you, Richard, appreciate it.
Any parting words before we sign off?
Parting words: what you said is correct as far as the ability to understand the process. But my real goal was to make it clear that it's not just for a business's understanding of its own process. It can also make a vast difference in your customers' experience when the business has this level of visibility into its processes.
Thanks, Richard. Very well done. Thank you so much.
We will be back at the top of the hour with our next presentation, from Jane Shinar from Flourish, who will talk about how customers are the real stakeholders for customer excellence. So we'll see you at the top of the hour.
Thank you so much.
Product Marketing Manager, Process Intelligence,
Richard Rabin is the Product Marketing Manager for Process Intelligence at ABBYY, a Digital Intelligence company. He works closely with global enterprises to help them better understand and optimize their business process workflows, bottlenecks, and how to select the initiatives that will yield the most business value with intelligent automation, and how they will impact overall operational excellence. Richard has a remarkable academic background in Computer and Information Science and AI and has more than 35 years of software engineering expertise. He previously worked as a Senior Solutions Consultant at Appian, where he led sales of Appian’s digital transformation platform primarily in the pharma and financial services industries. Before that, he led his own consultant business, where he provided services for Kofax Insight in the areas of business intelligence, process intelligence, and behavior-based analytics.
November 9, 2021
11:00 AM - 12:00 PM ET