BTOES Insights Official
December 05, 2022

Process Mining Europe Live - SPEAKER SPOTLIGHT: Process Intelligence and its Synergies with Robotic Process Automation.

Courtesy of ABBYY's Richard Rabin, below is a transcript of his speaking session on 'Process Intelligence and its Synergies with Robotic Process Automation to Build a Thriving Enterprise' that took place at the Process Mining Europe Live - A Virtual Conference.



Session Information:

Process Intelligence and its Synergies with Robotic Process Automation

In today’s ever-changing business environment, continuous process improvement is a must for every organization. To remain profitable and competitive, there is no room for inefficiencies. This is why so many companies are turning to process and task mining to make real-time process improvement decisions based on facts. In this session, ABBYY’s Richard Rabin explains how Process Intelligence tools support existing Robotic Process Automation (RPA) strategies to visualize your processes, measure process performance, ensure compliance and build a future-proof digital transformation.

  • How Process Intelligence compares to process and task mining
  • Enhancing process transparency and analysis
  • Supporting your RPA with process discovery, automation design, and more
  • How to move from diagnostics to continuous process improvement

Session Transcript:

Hello, my name is Richard Rabin, the product marketing manager for ABBYY Timeline, and the topic for today's talk is Process Intelligence and its synergies with Robotic Process Automation.

So first, let's talk about process intelligence, and particularly how that compares to, for instance, the process mining that many people are familiar with.

So when we talk about process intelligence, we're talking about something that starts with process mining, but adds more capabilities on top of that.

So it would include process mining, as well as task mining. And we'll talk about all of these in more detail in just a moment.

But in addition to process and task mining, there are also added analysis and optimization options, on top of what you would typically expect in a pure process mining solution. And this becomes important for the work that we want to do with RPA. Additionally, once we've got process and task mining, we've got a series of analysis options on top of that. With process intelligence, we also have the ability to do operational monitoring and compliance monitoring. So we're going to be watching each event as it happens, for each instance of a process, and comparing that against the process rules, so that we can see if there are any compliance rules or other operational rules that are violated. And finally, with process intelligence, we also have the ability to do predictive analytics. I'll talk about what that is, and why that would be important, also, for working with robotic process automation.

So, let's just start with process mining, with the core of process mining. The fundamental idea of process mining is that, in today's modern business systems, virtually any system you're working with is going to be keeping a record of the transactions within the system.

So, that means transactions will be available for later analysis. And these records of transactions will almost always include the same fields, which are used by all process mining tools: we need to know what event occurred and when it occurred.

And each event has to have some kind of a unique process identifier, something that allows us to group all of the events that go with a given process instance. So if you're processing documents, it might be a document ID. If you're dealing with insurance claims, it could be a claim ID, or in a hospital, it might be a patient encounter number.
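The three fields just described, a case identifier, an activity, and a timestamp, are enough to reconstruct each process instance. Here is a minimal Python sketch of that grouping step (the claim IDs, step names, and timestamps are invented for illustration):

```python
from collections import defaultdict
from datetime import datetime

# Minimal event log: every record carries the three fields process
# mining relies on -- what happened, when, and a case identifier
# (here an invented claim ID) that groups events into one instance.
events = [
    {"case_id": "CLM-002", "activity": "File claim",    "ts": "2022-01-03 09:15"},
    {"case_id": "CLM-001", "activity": "File claim",    "ts": "2022-01-03 08:00"},
    {"case_id": "CLM-001", "activity": "Review claim",  "ts": "2022-01-04 10:30"},
    {"case_id": "CLM-002", "activity": "Review claim",  "ts": "2022-01-05 11:00"},
    {"case_id": "CLM-001", "activity": "Approve claim", "ts": "2022-01-05 16:45"},
]

def traces(log):
    """Group events by case ID and order each case by timestamp."""
    by_case = defaultdict(list)
    for e in log:
        by_case[e["case_id"]].append(e)
    return {
        cid: [e["activity"] for e in sorted(
            case, key=lambda e: datetime.strptime(e["ts"], "%Y-%m-%d %H:%M"))]
        for cid, case in by_case.items()
    }

print(traces(events))
```

Real tools do this over millions of events pulled from several systems of record, but the shape of the data is the same.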


But one of the key facts here is that typically you have multiple systems of record that you're going to be working with.

And these different systems of record may or may not have some ability to visualize what's happening in terms of the various process instances that are executing.

But what we have found is that in almost all cases, most of the processes of interest to our customers end up spanning multiple systems of record. So, even if your CRM system, or any of your other major systems, has tools for allowing you to visualize and analyze what's going on in that system, they won't have tools that allow you to look at the entire business process as it is distributed across, and spans, multiple systems of record. So, with process mining and process intelligence, the idea is that we're going to pull these data artifacts, these indicators of what steps happened and when, from all of the systems of record and databases. And that allows us to then recreate what happened in your distributed process within one model, sometimes referred to as a process digital twin.

Now, some people in the past, when I've talked about this topic, have asked: well, what if I've already got all of the data that you're talking about in my databases, and I've already gathered all the database information into a repository?

And I might've already set up a business intelligence platform over that repository, or data warehouse. If I've already got all my data in one place, and I've got tools that are structured to analyze data, why do I need process mining or process intelligence? And the reason is really very simple. The reason is that databases and business intelligence tools aren't set up to be able to reason over processes. They don't have tools for visualizing processes.

And they don't have tools for reasoning over processes from the standpoint of: if you're looking at a process instance, did some behavior occur in that instance or not? Those are the things you might care about, for instance, for monitoring or for looking for exception cases.

Sure, you could create SQL queries to look for some of this information.

But by the time you look at what events are included, skipped, and repeated, the sequence of events, the attribute values of events, all the things that would go into a process behavior, it turns out that trying to do that in a query is hard. Writing the query in text becomes very difficult, and maintaining a complex set of multiple queries becomes very, very difficult.
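To make that contrast concrete, here is a rough Python sketch of one such behavior check over an ordered case: a skipped step, a repeated step, and a timing gap, the kinds of conditions that get awkward to express as SQL over a flat event table. The step names and thresholds are invented:

```python
from datetime import datetime, timedelta

def parse(ts):
    return datetime.strptime(ts, "%Y-%m-%d %H:%M")

# One process instance: an ordered list of (activity, timestamp) pairs.
case = [
    ("File claim",    "2022-01-03 08:00"),
    ("Review claim",  "2022-01-03 08:20"),
    ("Review claim",  "2022-01-04 10:30"),
    ("Approve claim", "2022-01-05 16:45"),
]

def violates(case, required="Coverage check",
             repeat_limit=("Review claim", 1),
             max_gap=timedelta(days=1)):
    """Flag behaviors that are painful in SQL: a skipped step,
    a repeated step, and too long a gap between consecutive events."""
    acts = [a for a, _ in case]
    reasons = []
    if required not in acts:                              # skipped event
        reasons.append(f"missing step: {required}")
    step, limit = repeat_limit
    if acts.count(step) > limit:                          # repeated event
        reasons.append(f"{step} repeated {acts.count(step)} times")
    times = [parse(t) for _, t in case]
    for a, b in zip(times, times[1:]):                    # event timing
        if b - a > max_gap:
            reasons.append(f"gap of {b - a} between events")
    return reasons

print(violates(case))
```

Each rule here is a couple of lines over the ordered trace; the equivalent SQL needs self-joins and window functions, and a growing library of such queries quickly becomes unmaintainable, which is the point being made above.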

So with process intelligence, you have the ability to easily graphically visualize your process, and also to reason about the process in multiple ways. And we'll talk about that in just a moment.

So, the next piece here would be: when we talk about process intelligence, these days it often brings together process and task mining. And why is that?

Well, I've been working with process mining, and really process intelligence, for over 10 years now.

In the early days, someone would point out that some of the steps they cared about didn't leave any records in any of their log files. They might be something such as: a person used the web to go out and look for information, then copied it and pasted it into a spreadsheet, which then might have gone into an e-mail that included that spreadsheet.

Well, those steps don't work well with process mining, because they don't leave behind the kind of information that process mining relies on. But they do work very well for task mining. And that's the difference today versus when I was asked this question a decade ago: back then, the answer was simply that we didn't have the ability to record these things, to analyze these things, if there were no log files created.

Well, now, with task mining, we do very specifically have tools for recording what's done in these manual tasks that someone might do on a desktop, tasks that aren't leaving these kinds of digital breadcrumbs.

So, for instance, when you're looking at process mining working together with task mining: on the left, we're representing process mining, with your various systems of record that you might be pulling the data artifacts from, telling you what automated, recorded steps happened and when. Whereas on the right, we're referring to the task mining piece of this.

So where there's a gap in the process mining information, because no data was recorded on certain steps, we can now specifically record those steps as they're manually done. Now, you'll notice in this picture on the right, which refers to task mining, that you see multiple different desktops. And it's worth noting that when you're going to gather this information from task mining, it's both to analyze overall what's going on in these manual steps

And, as we'll see, to use with RPA. One approach is to install a single recorder on a single desktop, put it in the hands of a user who knows the process very well, and record what we sometimes call the happy path: exactly what you would expect to happen in a well-formed instance, or case.

The problem with that is that, if you really want to understand what's happening in your organization, what you're likely to find is that different people, with different levels of training, are going to do the task differently.

You've got different types of instances of a task that require different forms.

So one way to deal with that would be to install not one recorder on one user's desktop, but a set of recorders across multiple users' desktops, so that I can get a sample of how this kind of work is done across different types of users.

And another point worth making is that this diagram shows you another server between the recorders and the task mining environment. The reason for that is, if you're going to be recording users doing these tasks, you don't want to put a lot of computational effort on their systems, slow down what they're doing, and mean they can't do their job normally. So, we typically would offload tasks such as obfuscating data to a recording server, so that the recording server does the heavy lifting and just passes on the log files that include records of what manual steps have been done. Those get passed into the task mining environment, from which we can then pull out the significant tasks and use that for analyzing what's been going on. And, as you'll see, also use it to help design automation for that in your RPA environment.

Now, also, when you're looking at task mining versus process mining, it's worth noting that one of the key differences is that there are different types of analysis.

Typically, this is what you see when you're talking about process mining and process intelligence, and it has a bearing on what you need to be able to design your RPA automation.

So, for instance, in process mining, we typically rely very heavily on this type of schema diagram, which takes all of the process instances, each case, and maps them into one particular diagram that can show you the flow of steps in your process. Now, this has been very, very useful in many situations for helping people understand what's happening, and there's a lot more than you can see in this diagram.

It's typically annotated with how long steps took or how many times you followed one path versus another. It includes forms of animation to let you see what the flow looks like.

But the fundamental limitation of this approach is that when a process is as simple as the one visualized here, it looks great on such a diagram. But as the process starts getting more complicated, with more nodes or with more options, so that there are more lines interconnecting the nodes, it starts to look more like the proverbial spaghetti diagram from old coding days.

Then, what has to happen to make the schema diagram more usable is that all the tools I've seen start giving you options to filter out the less frequently used cases, which may be fine if you want to see what happens in general and the flow of your information. But if you're really looking to try and automate steps, then consider the exception cases that you would have to filter out to make this kind of diagram understandable.

Well, they end up being important, and you wouldn't want to filter them out. So, when you look at process intelligence as compared to process mining, one of the big differences with process intelligence is that there's a lot more focus on various types of numerical analysis outside of this kind of schema diagram, as well as a focus on each individual case and reasoning about each individual case. So, just very briefly, you might have things like this path diagram, where each column represents a path through your process, so that, for instance, the most common path would typically be on the left, and you can see the details of each.

But you might see, at the top there, metrics so that you can see how often each of these different paths is followed, how much time it takes when you do the task one way versus another, or even what the cost is.

So there are various ways to help you rank-order and understand the different options you have for how a given task is done.

And then, with process intelligence, you have many more displays, whether it's a graph showing you how frequently you do a task very quickly versus how often it's done very slowly. In this case, almost all the time the task is done very quickly, and those instances that take longer are very, very rare.

And you also have the ability to do additional analysis, such as, if this is showing me how long it takes from step A to, let's say, step D (because they don't have to be right next to each other), a histogram essentially showing me the distribution of values for how long that takes.
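A sketch of that kind of analysis, assuming the same simple event-log shape as before, with invented case data: per-case duration from step A to step D (not necessarily adjacent in the trace), then bucketed into a crude distribution:

```python
from collections import Counter
from datetime import datetime

FMT = "%Y-%m-%d %H:%M"

# Invented cases: ordered (activity, timestamp) pairs per case ID.
cases = {
    "c1": [("A", "2022-01-01 08:00"), ("B", "2022-01-01 09:00"), ("D", "2022-01-01 10:00")],
    "c2": [("A", "2022-01-01 08:00"), ("D", "2022-01-01 11:00"), ("C", "2022-01-01 12:00")],
    "c3": [("A", "2022-01-01 08:00"), ("B", "2022-01-02 08:00"), ("D", "2022-01-02 20:00")],
}

def transition_hours(cases, start="A", end="D"):
    """Hours from the first `start` event to the first `end` event per case;
    the two steps need not be adjacent."""
    out = {}
    for cid, trace in cases.items():
        first = {}
        for act, t in trace:
            first.setdefault(act, datetime.strptime(t, FMT))
        if start in first and end in first:
            out[cid] = (first[end] - first[start]).total_seconds() / 3600
    return out

# Bucket the durations into a crude histogram: the distribution view
# described above, where most cases are fast and long ones are rare.
hours = transition_hours(cases)
histogram = Counter("<4h" if h < 4 else ">=4h" for h in hours.values())
print(hours, histogram)
```

The per-operator (or per-shift, per-location) breakdown mentioned next is the same computation grouped by an extra attribute on each case.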

I can also do dimensional analysis. You likely can't see much of the charts at the bottom, but you probably can see that they differ: some have spikes where others don't. So what we're really seeing here is that when you take the time for a particular transition of interest and break it down by operator, you can get some root cause information. You can start to see who is taking longer, and what the patterns are. And here I say "who", based on operator, but the dimension might be something completely different from operator.

It might be shift, it might be location; there are various ways you could look at it. And there are many other types of analyses included with process intelligence, and we'll talk more about the relevant ones in just a moment.

Another difference between process mining and process intelligence is that process intelligence typically has more of a focus on each individual case. In process mining, you're largely taking a large dataset, mapping it to a diagram like that schema diagram, and generating various metrics, but you're not necessarily following all the data from each case, whereas in process intelligence, yes, you are.

We are keeping all the data from every case, which allows us to define various process behaviors. We can then scan for where those behaviors occur, which might represent inefficiencies, or might even represent cases where we've violated various compliance rules that we're responsible for. And with process intelligence, we can then send out a notification when it's detected that one of these rules is triggered. Or we can even call a bot, a digital worker, or really any other web service. But we can call a bot to take action. So when a given situation is found, you can utilize your electronic workforce, your digital workforce, by automatically kicking off a bot to remediate a given problem that we might have seen.
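The notification or bot hand-off can be sketched as a small dispatch hook. In production the dispatch target would be an RPA vendor's API or a webhook URL, which is not shown here; the `dispatch` callable and the payload shape below are illustrative assumptions, not a real vendor interface:

```python
import json

def on_violation(case_id, reasons, dispatch):
    """When a behavior rule fires, hand off to a remediation hook.
    In production `dispatch` could POST to a web service that starts
    a bot; here it's whatever callable you wire in."""
    payload = {"case_id": case_id, "reasons": reasons}
    dispatch(json.dumps(payload))
    return payload

# Stand-in for a real webhook: just collect the outgoing messages.
sent = []
on_violation("CLM-001", ["missing step: Coverage check"], sent.append)
print(sent)
```

The design point is the decoupling: the monitoring side only knows it found a behavior; what remediates it, an e-mail, a ticket, or a digital worker, is whatever the hook calls.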

One other thing I'll point out that you have as an opportunity with process intelligence, is you start out with the fact that we've got this Digital Twin, this model, of your process, based on the actual data from multiple distributed systems.

Supporting the monitoring that we just talked about is the idea that this information is coming in, in real time or sometimes near real time. The information is coming in continuously and keeping this model up to date.

Well, if we've got a model of your process, and we've got up-to-date information in it, another opportunity is to use predictive analytics. We can have a neural network that's trained to recognize the common process behaviors, so that as new process instances are appearing, it can look for patterns that imply the likelihood of certain behaviors.

In the tool that I'm used to, this would be things such as being able to predict whether a given service level agreement, a given deadline, will be made or missed. Now, of course, with monitoring, you can always wait until the deadline is missed, observe that it was missed, and report on that or act on that.

But with predictive analytics, you can predict that it's likely a deadline will be missed, based on historical cases, and act on that. And in our case, it's not just whether a deadline or an SLA will be missed.

It would also be able to predict things such as, if a process has multiple outcomes, then at any time while the process is executing, based on a neural network trained against other executions of this process: what is the likelihood, given what we've seen so far in a given process instance, that we will have one outcome versus another? So that's the ability to use predictive analytics as a part of process intelligence.
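The talk describes a trained neural network; as a much simpler stand-in for the same idea, this sketch estimates outcome likelihood from how often each outcome followed the same trace prefix in historical cases. All case data is invented, and a real system would use richer features than the bare prefix:

```python
from collections import Counter

# Historical completed cases: (trace, outcome). A frequency lookup over
# prefixes is only a toy substitute for the neural network described
# in the talk, but it shows the shape of the prediction problem.
history = [
    (("File", "Review", "Approve"), "approved"),
    (("File", "Review", "Approve"), "approved"),
    (("File", "Escalate", "Deny"), "denied"),
    (("File", "Escalate", "Approve"), "approved"),
]

def outcome_likelihood(prefix, history):
    """P(outcome) among historical cases whose trace starts with `prefix`."""
    counts = Counter(out for trace, out in history
                     if trace[:len(prefix)] == tuple(prefix))
    total = sum(counts.values())
    return {out: n / total for out, n in counts.items()} if total else {}

# Mid-flight prediction: the running instance has only reached "Escalate".
print(outcome_likelihood(("File", "Escalate"), history))
```

The value, as described above, is acting on the prediction while the instance is still running, rather than reporting on the outcome after the fact.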

So now, how does this all relate back to supporting RPA?

We're going to look at this in terms of process discovery, automation design, process monitoring, and benchmarking.

In the area of process discovery, one of the things that's critical is that you don't want to automate a process until you fully understand it. I've heard from many different clients that there's a problem when they try to automate a process that's not well understood: you can take what's essentially a broken process and just make it run faster.

You really need to get a view of what is happening within your process, either by looking, start to end, at how long it takes, or at any significant transition within your process, to understand where time is being spent.

And once you understand where time is being spent, then the ability to automate a given manual task becomes much more useful, because you may otherwise automate a manual task only to find that it makes no real difference in the overall speed of your process. Because maybe it's just moving where the bottleneck is from one place to another.

So by understanding your process and where time is spent, you can get a better understanding of where automation is going to help you.

And beyond that, process intelligence, with its task mining capabilities, will also help you find automation candidates. So after you have run those recorders that were previously referenced, you'll have a log of the steps that were done by the individuals doing these manual tasks. The system can then, using both automated tools and manual intervention, find the tasks within the log files, breaking the overall files that show what steps people are doing into the different tasks being done by these individuals. Once we have a list of what the different manual tasks are, and we know some information about how each of those tasks is being done, we are then able to rank the best automation candidates.

This becomes critical for RPA, because many of the easy instances, the low-hanging fruit of what would be good to automate, have already been done.

So this allows you to look at other tasks being done by various people in your organization and measure, how often are they done? How long is it taking?

And also a variety of attributes that will help you determine how difficult these would be to automate. And once you've done this discovery, a byproduct of discovering these tasks is that I can use the task outlines that have been recorded to help generate an implementation skeleton, to make it easier to build the actual bot.

Now, when we talk about ordering and ranking these automation candidates, here's one example of how that might be done.

So suppose I have a log file that I've recorded from many users throughout the day of what activities they're working on. For each user's log file, I can identify tasks by a repeatable set of actions, which might represent an expense report, filling out a registration form, or various other tasks.

So we've gone from a set of users who are working at their systems, to a log file of activities, to a list of the tasks that are found in those activities. And each task has various analytics done against it. So it might be: how many times did I generate this expense report, or what percentage of the log of recorded steps were involved with generating expense reports? On average, how many events, how many actions, were involved in one of those tasks?

Or what is the overall complexity?

Now, when we talk about complexity, there are various ways to measure it. What we're really looking for is: how difficult would this task be to automate? And complexity might include how many variations there are of this task. If the task is always done identically, it would be relatively easy to automate; but if it turns out there are many different options, based on the different options for an expense report, say, then it might be that the task is more complex.

And we're also measuring the number of applications that are used within each task. Some people might even have a ranking of which applications are easier or more difficult to automate steps in. So, in other words, we're able to gather data for each of these manual tasks on both how complex it will be to automate and what the potential gain is. If I know how often it's done, and I know how many events it has, I can look at both how difficult it will be to automate and what the likely gain will be, in time saved, by automating it.

So, if I then take the various tasks that are discovered and put them on a graph that shows both which ones offer the most potential gain and which ones are likely to be the hardest to automate, I can then assist in picking the best automation candidates by looking for those that have the highest gain with the lowest complexity.

So, rather than just relying on someone's guess as to what they think would be the right task to automate, you actually have hard data that allows you to make such choices.
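That gain-versus-complexity ranking might be sketched as follows. The task names, frequencies, and the particular score (weekly minutes saved divided by a 1-10 complexity rating) are invented assumptions for illustration, not ABBYY's actual formula:

```python
tasks = [
    # (name, runs per week, minutes per run, complexity rating 1-10)
    ("Expense report",    120, 6,  3),
    ("Vendor onboarding",  10, 45, 9),
    ("Registration form",  80, 4,  2),
]

def rank_candidates(tasks):
    """Score each task as weekly minutes saved divided by complexity:
    this favors the high-gain, low-complexity corner of the chart."""
    scored = [(name, freq * mins / cx) for name, freq, mins, cx in tasks]
    return sorted(scored, key=lambda t: t[1], reverse=True)

print(rank_candidates(tasks))
```

Whatever scoring function you choose, the point made above holds: the ranking comes from measured frequency, duration, and variation data, not from someone's guess.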

So the next would be automation design.

So once I have recorded these manual tasks, and done some analysis to try to understand which ones offer the most promise as automation candidates, I then have to design the underlying bot itself.

With process intelligence, and with its use of task mining, I can actually see the different ways a given task is performed, pick the ones that are of interest to me, and build those into a bot.

And I can then take that definition of what the steps are and directly export it to an RPA tool that can generate the skeleton of my bot from it. We're not to the point yet where you can just pick the task you want to automate and have the bot generated for you automatically.

But here's what we can do now. This is actually a path diagram from process mining, but we also have these path diagrams from task mining, where each step would be one low-level action or one form interacted with. And you can see, for a given task, what the steps are, how much variation there is from one type to another, how long it took when you did it each way, and how frequently you did one of these variants versus another.

And, at least in the tool that I'm used to, after analyzing the different ways that a task might be done, you can click to select which of these are important to you, and then export that into an RPA tool. In our case, we're working very closely with Blue Prism.

Such that Blue Prism will get the definition of whatever paths I have said I'm interested in, and can use that definition to generate the skeleton of the bot that will then have to automate these multiple different options for the way this type of task is done. But rather than starting from scratch, you can take a look at a recording of the different ways that a task is done, select the ones of interest, and use that as the core of the bot being implemented.

There's also an important aspect of monitoring, when you're looking at process intelligence, working with RPA.

And it's important to note here that when we talk about monitoring, we're not just talking about monitoring bot performance. Certainly, RPA tools are likely to give you information on how many times a bot has run, what the status was, or how long it took.

But if you really care about understanding the behavior of your system, you don't want to look just at a set of bots by themselves. You want to look at the entire system performance, so that you can understand, if you automate a particular manual process, how that affects the flow of the overall larger system.

So, it's important to be able to look at the distributed end to end process.

And once again, what we see is that these processes, part of it might take place under the control of a bot that was created. Part of it might take place within other systems of record. So that if you want to monitor the performance of the larger system, you can't do it from just the RPA tool.

You can't do it from just some other system of record.

You need something analysts sometimes refer to as a control plane: something that sits outside of the normal systems that are executing this process, but can monitor all of these systems. And that's what you can do with process intelligence. Now, a part of that, of course, is to validate that the automations you're building are giving you the benefits you expect.

So when you plug in a new RPA bot, you want to understand not just that the bot is saving 95% of the time it took to do that manual task, but whether it makes a difference to the overall larger process that the bot is a part of. Sure, it might save all the time that is saved by automating that step.

But if there are bottlenecks downstream, it might not make the difference that you're hoping for. So being able to monitor the entire distributed system allows you to validate that any automations you're creating are giving the benefits to the overall process that you would expect.

And when you're doing this kind of work, it allows you to monitor, not just the traditional system metrics, things like, how many times did I run a particular process instance, or how long did it take?

But any metric that relates to that process instance is also available to use as part of this monitoring, and I'll talk about that in just a moment.

And then there's also compliance: as was discussed, the ability to monitor how many times these alerts or rules were triggered.

Now, when I talk about behaviors, and about monitoring behaviors to see how often I might have triggered some compliance check:

Well, there are many ways in process intelligence that you can define behaviors.

I'll just show you a few examples here, but what it boils down to is: what events are included? What events are skipped?

What events might be repeated? The sequencing of events, the timing between events: all of these things are important to understanding process behavior, but aren't necessarily easily handled by a database or a BI framework.

So, in this case, we might look for any patients that arrived on Sunday, where we've got a record of a lab result (matter of fact, we've got two of them), but we have no record of labs being ordered. And then, after the lab result, we have a record of the patient being admitted, and it took more than 30 minutes from the time we got a result to the time the patient was admitted.

So that would be a sample behavior. Another might be: anytime a performance review was done, and the next performance review took more than six months, we might want to note that as something that shouldn't have happened.

Or another simple example is: anytime we did any of these insurance steps, claim closed, denied, file a claim, process a claim, without having a coverage confirmation before those steps were done. So those are just a few examples of what a process behavior might look like.
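That last rule, a guarded step executed before any coverage confirmation, is straightforward to express over an ordered trace. A sketch, with placeholder step names (the exact step list is an assumption, not the speaker's):

```python
# Steps that must not happen before a coverage confirmation; these
# names are illustrative placeholders, not a real claims system's.
GUARDED = frozenset({"Close claim", "Deny claim", "File claim", "Process claim"})

def missing_prior_confirmation(trace, guarded=GUARDED,
                               confirm="Coverage confirmation"):
    """True if any guarded step occurs before the confirmation step."""
    confirmed = False
    for act in trace:
        if act == confirm:
            confirmed = True
        elif act in guarded and not confirmed:
            return True
    return False

print(missing_prior_confirmation(
    ["File claim", "Coverage confirmation", "Process claim"]))  # True: filed first
print(missing_prior_confirmation(
    ["Coverage confirmation", "File claim", "Process claim"]))  # False: compliant
```

Running a check like this over every instance is exactly the per-case scanning that process intelligence adds on top of the aggregate diagrams.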

And now, let's talk about benchmarking using process intelligence and RPA.

Well, that's when these behaviors become interesting, because clearly you can do benchmarking of the system that you're applying some automation to, to compare, say, week over week or month over month.

Or, after you've implemented a bot, you might want to compare before and after to better understand the true benefit from that automation. But you can also break it down by process behavior: with these process behaviors, as we described a moment ago, you're able to use them as an analytic dimension.

Show me what my results look like, when I do the process one way, versus I do it another way.

And as far as what metrics we're talking about: well, if you're looking at something like credit card onboarding, you might have the number of times that a new card has been activated.

Another count metric would be the number of credit card applications.

But by creating a derived metric, dividing these two, I end up with the percent activated.

This becomes important because, when you're looking at the overall process behavior, or looking at the process metrics, you might want to ask: based on how we did the process, based on what behavior was executed, does that make a difference in the percent of cards that were activated?
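The derived metric, and the idea of slicing it by which behavior was followed, can be sketched like this. The counts are invented, chosen only so the two guideline cohorts differ by roughly ten percentage points, as in the example coming up:

```python
def percent_activated(applications, activations):
    """Derived metric: two raw counts combined into a rate."""
    return 100.0 * activations / applications if applications else 0.0

# Slice the same derived metric by which behavior (guideline) the
# instance followed; the counts here are invented for illustration.
by_behavior = {
    "standard guidelines": (400, 220),   # (applications, activations)
    "new guidelines":      (380, 247),
}
for behavior, (apps, acts) in by_behavior.items():
    print(behavior, round(percent_activated(apps, acts), 1))
```

The raw counts come straight from the event data; the derived rate is what actually answers the business question.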

Or if you're talking about data from a hospital, for instance, you might have the basic data as to what steps were done, such as patient arrived in the emergency department.

But you can use database joins to bring in studies that were done after the fact but still have a reference to the patient ID number.

You might, say, have done a survey on patient outcomes.

You might have created a score that rates patient outcomes, or you could have done it in terms of patient satisfaction; in retail, it might be customer satisfaction.

But the basic point is that if you have access to data like a customer satisfaction value or an outcome value, you can generate an average metric. And now I've got average patient outcome.

What does that buy you?

Well, keep in mind, here we're talking about using this with a process in which we're using RPA, though this would actually work with any process, whether you're using RPA or not. What it allows me to do is to define a behavior, such as a standard protocol, which in our case might determine how long is allowed between certain steps and what steps must be done. I can then have the system go back and characterize: take all of the process instances that followed the standard protocol, and show me the metrics, such as the average number of events, the average time, or the average patient outcome.

And I can see when I followed the standard protocol, I can see what the timings were, and I can see what the outcome is.

But then I can also define an expedited protocol.

It might involve shorter timings, or more checks that need to be made. Again, it would be a different set of process behavior rules. And I don't have to set up all these rules in advance and tell people to go collect data.

Instead, I have the process intelligence framework look at the system for which I'm collecting data and isolate all those instances that followed the standard protocol.

And also isolate all instances that followed the expedited protocol, even if that's not what's required these days; some of the people working here do follow those protocols.

And that way, without doing multiple tests, I'm able to slice and dice my results based on what behaviors were followed.

And I can see that with the standard protocol versus the expedited protocol, well, actually, the expedited protocol took more steps and took more time, but the patient outcome was substantially better when following the new protocol.

This means that any user can come up with ideas on different behaviors they think should be followed and then, even after the data has all been collected, define these new behaviors or protocols,

and then be able to compare what your results looked like when you did it the old way versus when you did it the new way.
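Conceptually, the slice-and-dice comparison works like the following sketch, where the protocol tags, durations, and outcome scores are all invented for illustration. Each historical case is tagged with the behavior it happened to follow, and the metrics are then averaged per behavior:

```python
# Hypothetical sketch: after retroactively tagging each historical case with
# the protocol it followed, compare average duration and outcome per protocol.
# No new data collection or controlled test is needed; all numbers are invented.
cases = [
    {"protocol": "standard",  "minutes": 50, "outcome": 6.0},
    {"protocol": "standard",  "minutes": 55, "outcome": 6.5},
    {"protocol": "expedited", "minutes": 70, "outcome": 8.5},
    {"protocol": "expedited", "minutes": 65, "outcome": 9.0},
]

def averages(protocol):
    # Slice the cases by behavior, then average the metrics for that slice
    group = [c for c in cases if c["protocol"] == protocol]
    n = len(group)
    return (sum(c["minutes"] for c in group) / n,
            sum(c["outcome"] for c in group) / n)

std_minutes, std_outcome = averages("standard")
exp_minutes, exp_outcome = averages("expedited")
# Here the expedited protocol took longer but produced a better outcome
print(std_minutes, std_outcome)  # 52.5 6.25
print(exp_minutes, exp_outcome)  # 67.5 8.75
```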

It's the same thing if you're looking at something like a credit card onboarding example: I've got my current guidelines, and I can see what the time, the counts, and the percent activated (the metric we saw a little while ago) are. And I can see here that when we follow the new guidelines, my percent activated is about 10% higher.

So in other words, when you're using process intelligence, you can define behaviors. You can use those behaviors for monitoring and alerting. You can use those behaviors for various types of reporting. You can also use those behaviors as analytic dimensions to show, when I do my process one way, what are my results, and when I do my process a different way, what are the results?

But taken altogether, what you'll see is that when you're looking at process intelligence working with RPA, it allows you to understand your process by visualizing it and annotating it with various metrics. It allows you to understand how your process is performing: not just what the steps are, but what the variations are, and what the difference in cost or time is when you do it in different ways. You can then define behaviors that can be used for monitoring and alerting, and even used as dimensions to slice and dice the data based on how a process was done. And finally, you can add things like predictive intelligence, such as predicting whether you will or will not make a particular deadline, so that you can act on these things and prevent a problem before it actually becomes a problem. So I want to thank you for your time today. Overall, what we've tried to cover is what process intelligence is.
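As a rough illustration of that predictive piece, here is a toy sketch (not ABBYY Timeline's actual method; all numbers are invented) that estimates the chance a still-running case will meet a deadline, based on how completed historical cases of at least the same age turned out:

```python
# Hypothetical deadline-risk sketch: among past cases that ran at least as
# long as the current case has so far, what share still finished on time?
historical_minutes = [40, 45, 50, 55, 60, 90, 95, 100]  # completed durations
deadline = 80

def on_time_probability(elapsed):
    # Only past cases that survived at least `elapsed` minutes are comparable
    comparable = [d for d in historical_minutes if d >= elapsed]
    if not comparable:
        return 0.0
    return sum(d <= deadline for d in comparable) / len(comparable)

print(on_time_probability(30))  # 0.625 -- early on, most cases finish on time
print(on_time_probability(70))  # 0.0   -- late cases here all missed the deadline
```

An alert on a low probability like this is what lets you act before the deadline is actually missed.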

And how can the capabilities of process intelligence make a difference in your RPA environment? Both by allowing you to set up automation faster, by understanding what you're doing and building the skeleton of the automation in a semi-automated fashion, and by being able to then analyze the results when you make these changes through automation, to see how they affect the overall process outcomes. Thank you very much. And if anyone would like to contact me to discuss this further, I'd be more than happy to speak to you. Thank you.


Thanks, Richard.

Hey, thanks, Richard. Sorry about that.

Finally, the joys of camera. Some great insights there.

There was so much in there. Some of the things Richard covered really built on what I was talking about earlier, reminding us that those processes span multiple systems.

Reminding us that BI and process intelligence are related, but not the same.

I thought there was some really nice stuff there in terms of how we connect process mining with task mining. So once again, thanks, Richard. Great job, and hopefully everyone is going to get something out of it.

So I'm going to say thank you to Richard once again.

We're going to be rejoining you at the top of the hour. Slight change to the program: we're going to have Blueprint Systems coming up next. So, time for us to refresh those coffee cups, and we'll be back in just a few minutes.

Right, Thanks, everyone.


About the Author

Richard Rabin,
Product Marketing Manager, Process Intelligence, ABBYY

Richard Rabin is the Product Marketing Manager for Process Intelligence at ABBYY, a Digital Intelligence company. He works closely with global enterprises to help them better understand and optimize their business process workflows and bottlenecks, select the initiatives that will yield the most business value with intelligent automation, and understand how those initiatives will impact overall operational excellence.

Richard has a remarkable academic background in Computer and Information Science and AI and has more than 35 years of software engineering expertise. He previously worked as a Senior Solutions Consultant at Appian, where he led sales of Appian’s digital transformation platform primarily in the pharma and financial services industries. Before that, he led his own consulting business, where he provided services for Kofax Insight in the areas of business intelligence, process intelligence, and behavior-based analytics.   

