By BTOES Insights Official
September 07, 2020

Process Mining Live - SPEAKER SPOTLIGHT: Process Intelligence – Understand Today, Plan for Tomorrow

 

Courtesy of FortressIQ's Jon Knisley, below is a transcript of his speaking session on 'Process Intelligence – Understand Today, Plan for Tomorrow' to Build a Thriving Enterprise, which took place at the BTOES Process Mining Live Virtual Conference.


Session Information:

Process Intelligence – Understand Today, Plan for Tomorrow

Abstract:

Programs often struggle because the biggest obstacle to any complex, large-scale change is the lack of detailed knowledge on current state activities. Before starting a major initiative, a company must map its processes, its systems and its experiences.

Today, the necessary level of operational detail does not exist in most companies, and it is very difficult to obtain. Process Intelligence offers a solution to this challenge and can help jumpstart your automation and transformation programs.

Takeaways: 

  • The critical role Process Intelligence plays in automation and transformation success
  • How an industry leader is using Process Intelligence to fuel their CoE
  • Where Process Intelligence can drive value in your organization

Session Transcript:

We have Jon Knisley here with us today. Jon is a Principal at FortressIQ, where he helps companies leverage process intelligence to accelerate their automation and transformation programs. Prior to his current role, Jon served as the Chief Architect for Intelligent Business Automation at the Defense Department's Joint AI Center. So, Jon, I'm very pleased to have you with us; tremendous experience and cross-industry leadership. Very much looking forward to your presentation.

Jon, your audio is not coming through right now.

Hey, Jon, I'm not getting your audio on our side right now. So if you can do a double check there, just make sure your audio is OK.

Do you have it now? Can you hear me now? Yeah. Maybe it cut out during the intro.

Yeah, I can hear you now if you can do your intro again because we missed the first slide with your intro.

Let me, let me go back.

There, you get me now?

Yeah, you're good now.

OK, well, thank you, thank you very much for the introduction. Hey, we're excited to present today. I caught some of the earlier sessions from Jen Pack and Malek, and they've been great. So let's just jump into this and we'll get started.

Peter Drucker, the business visionary credited with really inventing modern management, taught us back in the 1950s that what gets measured gets improved.

And I would argue, you know, this can be considered the first steps in the ongoing journey that we're on towards creating a data driven business culture, you know, one that relies on data to guide your decision making rather than intuition or personal experience.

And in homage to Drucker, and given the availability of data today and the focus on it compared to previous generations, you know, we've extended his famous mantra.

What gets discovered gets improved.

So from what gets measured gets improved. What gets discovered gets improved.

And with the advent of big data, the emergence of analytics, and the adoption of artificial intelligence, the appetite for data-driven decision making has been satisfied. Drucker won; we want to measure everything today and have a lot of great tools to do it.

So the real issue and the question is, you know, what new data can we discover? And ultimately leverage to make improvements, you know?

What are those areas that are not yet discoverable, or, you know, that we don't have a reliable data source for?

And one of the biggest potential sources, you know, are the day-to-day tasks and activities of a company that occur below the surface; you know, outside of the SOPs and the manuals. You know, the ones that have evolved organically over time and the ones that really dictate how work really gets done in an organization.

And until today, there's been no real way to capture that, you know, and decode that work to explore it, and really make sense of it, and use that data to improve the organization.

So, you know, let's unpack this challenge and opportunity a bit more. The potential impact of gaining a better and more complete understanding of your current state is fairly logical.

If you step back and think about it, Google Maps can easily get me from Chicago to Dallas.

It's about a 950-mile drive, and there are a couple of different routes I can take.

You know, it also tells me I can fly in two hours and 10 minutes or so.

Moreover, there's a train available that takes 22 hours, or even options for walking and biking if I'm so inclined.

It's great, detailed information to help me make a good decision.

But what if Google does not know where I am currently?

It's basically useless.

It can't tell me anything about getting to Dallas.

And that's the moral. You can't get from A to B if you don't know where A is.

If you don't know where you're starting from, and we know this intuitively, you know, what's the first thing you do when you go on a diet? You step on the scale and figure out your starting point. No one can start a diet by saying they want to get down to 180 pounds when they don't know what they weigh currently.

And, and business is no different. You know, it's next to impossible to get to that magical future state.

If you don't know what your current state is. And most companies, I'd argue the vast majority of companies, don't truly understand how they operate today, especially at the granular user-activity level. They claim they do, but it's at too high a level to really make a difference.

And as a result, companies generally fall miserably short of their transformation objectives. McKinsey has pegged the success rate at just 30%.

Other reports that I've seen place it as low as 17%, and this translates to close to $1 trillion annually that's wasted on failed transformation programs.

Without knowing where your starting point is, you can't really plan for tomorrow.

Programs have trouble getting out of the gate, because the biggest obstacle to any complex, large-scale change is the lack of detailed knowledge on current-state activities.

And most companies don't understand how they operate on a daily basis. They have limited process understanding.

They don't really understand what their customer...

Hey, Jon, your audio is cutting out again. I'm not sure what happened there, but it started cutting out.

Yeah.

How much, how much did you miss there? It cut out pretty quickly, about one minute at most.

Let me switch over to computer audio and see if that works a little better.

Let's try that.

Hello.

Yeah.

Can you hear me now?

Yeah, I can hear you actually more clearly now than before.

OK, let's try this; hopefully this one sticks around a little better for us. Actually, it's much better now. Thank you for that. Yeah, now, I was on the phone because of this problem.

So, no, no worries at all. Apologies for the audio problems we were having there.

Let's get back to it.

So, Forrester identified the process gap in a report a few years ago, that, you know, vendors and, ultimately, customers faced, and they noted that everyone was on a mission to close it.

The process insights landscape has been very dynamic, to say the least. You know, we could probably debate the distinctions between process and task mapping versus mining versus discovery versus intelligence, you know, all week long, and still not reach a consensus.

You know, they offer varying degrees of completeness and coverage, accuracy, time, and cost.

But, suffice it to say, each approach takes a slightly different model to collecting data and documenting the processes, but all of them, at some level, produce workflow reporting that enables an organization to gain an understanding of their operations.

And the right one for your organization, you know, as always, ultimately depends on the requirements and, you know, what you're looking to get out of the technology in the near and long term.

Looking first at the traditional manual approach, using consultants and business analysts: this is often classified as process mapping.

It allows you to go end to end through a process and across multiple applications, but it does not provide coverage across the organization, and the sample size tends to be limited, which can create problems. Due to resource requirements and costs, the ability to scale is limited.

And while we can capture variations and exceptions, there's also the potential to introduce human bias into the outputs as well as errors and omissions that can occur given its manual nature.

On the other side of the spectrum are the more pure technology approaches; they generally accelerate time to insight and provide more scale, but they often have trouble delivering full coverage.

The academic mining solutions, which we've heard about today, can give very detailed insight into the workflow around certain applications, but they have trouble going end to end and require access to log files, which limits coverage and can also impact deployment times.

The discovery approaches that many of the RPA vendors are pushing are really good at task automation, if you know what you're looking for and want to record an assumed happy path without any variances. But, again, they have trouble scaling.

You know, finally, the FortressIQ approach to process intelligence is in the middle of the diagram.

We truly feel it offers you the best of both worlds. It gives you the depth and breadth to tackle complex use cases across the entire organization.

I'll highlight, you know, in a minute, how we do it. But in terms of benefits, the solution speeds the capture of processes by up to 90%.

You know, it generates documentation to create your transformation roadmap, captures application coverage across the entire enterprise, and really offers frictionless deployment in minutes, you know, with no integrations or log file requirements.

Let me provide a quick overview of the FortressIQ application and how it works, to give you some context into how we are able to deliver those outcomes across multiple use cases.

The platform delivers detailed insights on user activity to help companies make more data driven decisions around key business initiatives.

You know, automation, operations, customer and employee experience, analytics, even compliance.

And we fundamentally do three things: data collection, mining, and reporting on current-state operations. And it starts with capturing continual screenshots via lightweight software agents installed on target desktops.

The agents automatically capture all the inputs on the screen and don't require any software integrations, and that's really key. There's no API or integration or log files required, so projects can be rapidly deployed. You can be capturing data in minutes.

These images are then interpreted by computer vision technology and natural language processing and converted into structured data.

Following collection, this massive new dataset of user activity, that's really never been available before, is mined using machine and deep learning algorithms to categorize, segment, and sequence the patterns and the information.

And this information is then ultimately used to discover and map all the relevant processes within the organization across any and all applications that the employee uses.

The final step is the reporting and the visualization of this data. The platform will tell you, you know, what applications are being used, when they are being used, and who is using them.

What is the process frequency and duration? How are users interacting with the application? What are the process steps and all the permutations of them?

All sorts of insights that can be explored to help accelerate automation, enhance workflows, improve customer service, increase compliance, whatever the use case may be.
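
To make the capture, mine, and report flow described above a bit more concrete, here is a minimal, illustrative Python sketch of the general idea: structured UI events are sequenced into a simple transition map and summarized into a usage report. This is not FortressIQ's actual code; the event fields, function names, and sample data are hypothetical.

```python
# Hypothetical sketch of the capture -> mine -> report idea described above.
from collections import Counter, defaultdict
from dataclasses import dataclass

@dataclass
class UIEvent:
    user: str         # who performed the action
    app: str          # application in focus
    action: str       # e.g. "open_form", "enter_field", "submit"
    timestamp: float  # seconds since the start of observation

def mine_transitions(events):
    """Sequence each user's events and count action-to-action transitions."""
    by_user = defaultdict(list)
    for e in sorted(events, key=lambda e: e.timestamp):
        by_user[e.user].append(e)
    transitions = Counter()
    for trace in by_user.values():
        for prev, cur in zip(trace, trace[1:]):
            transitions[(prev.action, cur.action)] += 1
    return transitions

def usage_report(events):
    """Summarize which applications are used, by whom, and how often."""
    usage = defaultdict(Counter)
    for e in events:
        usage[e.app][e.user] += 1
    return {app: dict(counts) for app, counts in usage.items()}

events = [
    UIEvent("alice", "CRM", "open_form", 0.0),
    UIEvent("alice", "CRM", "enter_field", 4.2),
    UIEvent("alice", "CRM", "submit", 9.8),
    UIEvent("bob", "CRM", "open_form", 1.1),
    UIEvent("bob", "Excel", "enter_field", 6.5),
]
print(mine_transitions(events))   # e.g. {('open_form', 'enter_field'): 2, ('enter_field', 'submit'): 1}
print(usage_report(events))       # e.g. {'CRM': {'alice': 3, 'bob': 1}, 'Excel': {'bob': 1}}
```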

And, again, the platform allows you to understand the current state, because without that it is very challenging to get to the desired future state.

So, you know, very cool technology that can drive value very quickly, typically in 2 to 4 weeks, compared to traditional process mapping with consultants, where you may be looking at 4 to 6 months, millions of dollars, lots of human resources, and lots of errors and omissions as well. And the biggest issue with that traditional model: it just does not scale.

Let me briefly touch on security, because it comes up in just about every early conversation.

And if you think about it logically, we've got sensors that are installed on computers that capture user activity and screenshots.

I mean, I totally get it. You know, people are rightfully concerned about security. And it's why it comes up so often.

So, we've, we've addressed the security and data protection issue at multiple levels.

The first thing is that we allow you to establish an allow and/or deny list of domains and applications that you want to receive data from.

Pretty straightforward: observe these applications, don't capture data from those applications. So that's kind of the first layer of security, but obviously it's not enough, because some critical target applications may have sensitive data in them.

So, for the next layer, you know, even before the data gets to the FortressIQ system, we've got a masking appliance, which we affectionately call PEG because it's the Privacy Enhanced Gateway.

And PEG is critical because it enables AI cloud benefits with on-premise deployment security.

The appliance allows you to select which data fields even get to us.

So if you've got Social Security numbers, or PII information, you can have those fields not passed to us.

And no business process work that I've ever seen requires individual Social Security numbers.

They may be part of a process, but we don't need any actual records to deliver the process intelligence.

On top of those first two layers, you know, everything that's sent to us is encrypted with TLS 1.2. All data at rest is encrypted as well.
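
As a rough illustration of those first two layers, here is a minimal Python sketch of how an allow/deny list and field-level masking could work before any data leaves the desktop. The application names, field names, and redaction rule are hypothetical assumptions, not FortressIQ's actual configuration.

```python
# Hypothetical sketch of the layered data-protection idea described above.
import re

ALLOWED_APPS = {"CRM", "BillingPortal"}   # observe only these applications
DENIED_APPS = {"PersonalEmail"}           # never observe these
MASKED_FIELDS = {"ssn", "credit_card"}    # fields that never leave the desktop
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def should_capture(app_name: str) -> bool:
    """First layer: only capture from explicitly allowed, non-denied applications."""
    return app_name in ALLOWED_APPS and app_name not in DENIED_APPS

def mask_record(record: dict) -> dict:
    """Second layer: drop sensitive fields and redact inline SSNs before transmission."""
    cleaned = {}
    for field, value in record.items():
        if field.lower() in MASKED_FIELDS:
            continue  # the field is never transmitted
        if isinstance(value, str):
            value = SSN_PATTERN.sub("***-**-****", value)
        cleaned[field] = value
    return cleaned

record = {"app": "CRM", "customer": "J. Smith", "ssn": "123-45-6789",
          "note": "Verified SSN 123-45-6789 by phone"}
if should_capture(record["app"]):
    print(mask_record(record))
    # {'app': 'CRM', 'customer': 'J. Smith', 'note': 'Verified SSN ***-**-**** by phone'}
```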

We've also got a multitude of security certifications and regulatory standards that we comply with, you know, SOC 2, HIPAA, GDPR, and ISO 27001. And we're continually adding and working on more of them.

And, finally, I'd note that, you know, one of our more recent leadership hires was a risk and security director. So it's safe to say we take the security and privacy issue very seriously.

I've alluded to this a few times already and there are a variety of business areas across the enterprise that customers use the platform to support.

It can truly help, you know, support your organization's key strategic initiatives.

Automation is probably the most obvious one, as companies struggle with scaling RPA programs and developing the documentation for development, which we generate directly out of the system.

As you are aware, with RPA, you know, development always outruns the analysis and the roadblock to scaling is the delay on priorities or documentation.

But, because the technology can uncover and document the unknown at scale, we can play in a bunch of different areas as well: from operational excellence and customer experience, where we can find, you know, more discrete bottlenecks and sources of friction that need to be optimized, over to compliance and analytics, where we're giving you access to data to really understand what is going on in your organization.

Around continuous improvement, companies find that we are identifying areas not typically uncovered by traditional methods.

We're able to surface these incremental improvements by analyzing user activity at such a granular level.

It may be the, the efficiency someone gains after using an application for a long time. You know, those kind of hidden shortcuts that aren't in the manual can be promoted to newer users.

Or a field in a form that we see gets re-keyed with the same information dozens of times a day by, you know, 20 or more employees.

Turning that into a simple drop-down box is really a no-cost fix, essentially, that can save hundreds of hours over the course of a year and reduce data entry errors.

And the really cool thing that we're starting to see with clients is that they're starting to think up new use cases that we have not considered.

We had one recently use the system to gather current system requirements for a major platform modernization effort.

And they estimate they could shave a full year off the overall project.

That's kind of a good segue into exploring outcomes in a bit more detail. Again, as a reminder, you know, FortressIQ does not deliver the desired future state, but we enable it by providing that current-state insight, so the project can be delivered faster and cheaper and better.

You know this quote: 'After two weeks, you guys know our business better than we do.'

That's from an insurance company CEO who we helped with a massive and rapid enrollment fix that delivered millions in current-year revenue to the company.

They could not enroll companies fast enough, and were leaving money on the table.

They had predicted that the process assessment would take up to six months, and automating it would take another six months. Using FortressIQ, they were able to assess the process, and all the variations by the 30 administrators who did the work, in less than four weeks.

And then they were able to use the documentation generated by the system to develop a number of RPA bots in three months and eliminate that enrollment bottleneck and the whole backlog.

And so that anticipated 12 month project got delivered in four.

And they were able to recoup over $2 million in the current year.

Let me take a moment to review a recent case study that was published. And, if you're interested in more details, we can certainly get you the original article as well.

It highlights our work with Dentsu, one of the largest global marketing companies.

They operate in over 145 countries, with 65,000 employees.

And we work with their Automation Center of Excellence, which is run by Max Cheprasov, the firm's Chief Automation Officer.

And, like many companies in the industry, much of Dentsu's expansion in recent years has been fueled by numerous acquisitions. The last count I saw was north of 150.

And, obviously, this creates an ecosystem of vastly disparate systems and processes and creates an environment that is very ripe for automation and system re-engineering.

And, you know, FortressIQ is a key component, along with Catalytic and UiPath, in their technology stack.

The CoE works with over 400 people in the organization who are identified as automation champions and experts, and it functions to optimize the working lives of the employees.

And the goal is ultimately to, you know, augment workloads and make the teams as efficient, productive, and happy as possible.

And as we all know, by automating boring and repetitive tasks, the CoE can help mitigate or eliminate the laborious strain of routine tasks, approval turnarounds, and bottlenecks.

So with a mission and a team, and a scope, you know, so large and so critical to, to successful transformation.

You know, obviously, there's a huge need to discover and document and prioritize processes for potential automation and re-engineering.

And as you see on the screen there, Dentsu was able to automatically mine, model, and document 2,200 processes in just five months, with just two people operating the FortressIQ platform.

Equally impressive, later in the article it is noted that it would have required more than 30 business analysts to gather the same level of detail and insight if the work had to be done manually.

So, a very interesting and compelling story: two people and FortressIQ doing the equivalent work of over 30 BAs.

I encourage you to read the original article. As I said, we're happy to get you a copy of it. It's a great example of understanding today to improve tomorrow.

And in an especially, you know, agile and creative industry, where you may not always expect a significant automation and transformation program.

Building on the case study, let me explore the core of where FortressIQ really drives business value for organizations.

Whether it is scaling automation, optimizing processes, or, again, improving customer experience, you know, the foundation for the value delivered really stems from accelerating delivery.

I'm sorry, Accelerating discovery.

You know, understanding that current state is really the linchpin to any successful change program.

So, looking at discovery from a manual process compared to FortressIQ's approach: you know, a manual process typically begins with an artifact review. You may also run a workshop or do some Kaizen sessions.

You'll then document the process and do some analysis, and every time there's going to be rework involved.

And, at the end of the day, typically, what we find, and what we're hearing from clients, is that it usually takes, you know, two people four weeks, often more, to document a single process. With FortressIQ, we have, you know, a few hours at most to install the sensor and start capturing data.

If we're looking at new systems or custom applications, it may take a day or two to classify those frames.

But then, almost immediately, you know, we're starting to get early insights, before we've even mined any data.

This early information may include, you know, machine events, and application usage, and duration, and timelines, control types.

And you have access to that almost immediately.

Then, if we go into further analysis, it may take, you know, one to two weeks to analyze that data and really document the processes.

And then, you know, we uninstall the sensor, which, you know, takes minutes, essentially. Ultimately, you know, this adds up to discovering 2 to 3 processes, again, because we're observing all the work and activity that the target users are doing.

So we can discover more processes over a shorter period of time with fewer resources. We're discovering 2 to 3 processes in every cycle with one FTE, over a week, week and a half.

And this delta between manual and automated discovery drives the efficiency of your BAs and consultants, you know, enables your automation team to work more efficiently, uncovers previously hidden processes for automation and for optimization, and really enables transformation discovery.

We've developed a full business value model that goes into much more detail, which I'd be happy to share with anyone that's interested in exploring it; just let us know after the presentation.

One of the keys to our client success, and the platform's ability to quickly drive business value, is our strong network and growing partner group.

For companies that want to jumpstart their adoption of process intelligence or don't have the internal resources to fully support a program, partners are a great avenue.

As you can see, we're aligned with many of the most respected companies in their field.

The advisory firms at the top can really help with strategy and implementation, best practices.

And even production, in some cases. In that middle layer of complementary technologies, you may already have some of them in your automation tech stack or be considering them, but we're able to integrate with each of those solutions to varying degrees, and we're adding to it all the time.

Let me also mention that many clients go it alone and rely on in-house resources to build out their process intelligence capabilities; that can work as well.

We do provide a dedicated customer success team that's provided at no cost. And this group really works hand in hand with your team to get you started on your process intelligence journey.

Ultimately, the goal is to, you know, enable you to fully leverage the platform independently.

But we recognize that it will take a few cycles to get there, and we work with you to become self-sufficient. It's the classic, you know, crawl, walk, run type model. And in full transparency, I would say the companies driving the most value from the platform are the ones who are running it themselves. They've got the resources. They've got the understanding to really drive incredible value, like I shared about Dentsu, running the tool totally independently.

In closing, and to wrap up a bit, you know, let me summarize the key value drivers of the platform, you know, in one place. And we've really touched on all of them throughout the session. So, no reason to dwell on them, but it can be helpful to see them altogether.

First off, you know, we've created a unique platform to decode your operations and really capture for the first time ever, you know your user activity in an efficient way and transform it into an asset to support strategic business initiatives.

And we do that by, you know, delivering 100% recall to really ensure accuracy and greatly reduce the project rework. You know, ongoing or recurring discovery is possible, and that really enables you to assess project impact over time.

I've talked a few times already about just the massive time-to-value improvement.

You know, 80 to 90%, literally going from 6 to 8 months to 2 to 4 weeks.

Any process, any application, across the entire enterprise; nobody else in the market can deliver at that scale.

And, you know, we believe it's a more complete and less biased model than the traditional consulting or BA approach. You know, we're generating those level-five PDDs with, you know, step-by-step instructions to really scale your workflow programs.

One final advantage that's not noted is the platform versatility, and I've touched on it a couple of times: you know, obviously, it can go very deep into a process, but, again, it can also go very wide across the organization.

As we get ready to shift into Q&A, let me take a moment to just thank you for your interest. Apologies for some of the audio issues that we had earlier in the show, but overall, I think this has been great.

Hopefully, it's been informative for you, and you're able to get a sense of how FortressIQ enables you to explore use cases across the entire enterprise to really execute more consistently and reliably than the competition.

As I've said a couple of times, you know, with any large, complex project, to be successful, it's critical to understand today, so you can improve tomorrow.

I've got a couple of next steps on the screen, If you're interested, definitely check out our website to download our solution brief, to get more detail on the platform.

You can also request a demo that's really tailored to your organization, and also feel free to connect with me anytime. Happy to strategize and advise you on how best to incorporate process intelligence into your automation and business transformation journey.

So thanks again. If you're ready to jump into Q&A, I'm ready when you are. Let's do it, Jon. Excellent. Yes, and everything was perfect after you made that adjustment. It's funny how it goes: you use the phone because you think it's going to be more reliable, and the good, old-fashioned DSL line that I've got running in my house out here in the country can work flawlessly and deliver the video.

Exactly, exactly right. And it never fails, that's for sure.

Hey, we've got a questions and comments area that has filled up throughout the presentation, so that's good. So I'm going to start with one of the things that has emerged here, which has to do with the technology itself: kind of this use of RPA with a new dimension, with sensors, and, as you mentioned, you know, machine vision and natural language processing built on top of that. Tell us a little bit about the technology here. A lot of people are asking: is this a proprietary technology that has been created by the company, or is this a third-party technology that you're using? Just the technology itself. It feels like RPA in 3D.

I mean, it is definitely, you know, part of your intelligent automation tech stack and, again, really focused on that initial sort of process discovery, process intelligence, process analytics space.

So, yes, it's proprietary technology that's been developed by our team over the past two years. It really combines computer vision, natural language processing, and deep learning algorithms, and brings it all together to deliver that detailed insight into your day-to-day user activity.

Yeah, I think the more marketing-minded on the call are saying that, you know, there may be a name for this thing that you're creating here, because it's quite unique. Well done. So, one of the questions, let me blow up the comments section here so I can see better, William Fuller was asking

about opportunities to obtain detailed knowledge of current operations in large enterprises. And so, um, that is a question in itself. I'll add to that by asking where you're seeing the best use cases of this:

RPA with kind of a full immersion with natural language processing and machine vision, like you just described. So, what do some of the best use cases look like? And William's direct question is: how are larger enterprises leveraging this type of technology and applications?

Yeah, a couple of questions; let me try to unpack them. I think I can get to all of them. In terms of the use cases, traditionally, you know, and again, we're only talking two years, but, you know, in the old days, you know, more than six months ago, the primary focus was around trying to accelerate RPA. And anybody who's been involved in RPA knows that it's not the development that slows you down. It's finding the right opportunities, documenting them, prioritizing them, and then getting that documentation over to your development team. Because typically, what I've always seen is, you know, you spend months trying to document and assess the process, and then the minute you hand it over to an RPA developer, the first thing they do is go back to the subject matter expert and say, OK, what's your process? And, you know, essentially, that whole time has been lost.

Because we're giving that, you know, very granular look into what the activity is, the RPA developer can really start development almost instantly, you know, and does not have to go back, because the documentation is much more accurate and detailed than they've been accustomed to.

So that's what a lot of our early use cases are. More recently, it's been great to see organizations start using sort of more of the general data and analytics and insights provided by the tool outside of that standard RPA. So, for instance, we had one organization that came to us, and they were doing an HR system migration and upgrading their systems. But they had 106 plants all over the world, and they really needed to find out what was happening in those plants so they could help build the requirements for the new platform. And so we did that assessment with them. And, interestingly, originally, the plan was to just survey the people, you know, directly with a hand-written survey, and they wanted to use us to validate the information that was coming back in. So, they didn't necessarily believe the information that was going to come back in from the various plants.

So, they wanted to have sort of perfect recall about what was actually going on and what tools people were actually using. We've also had use cases where an organization was looking to shift resources from one geography to another.

They needed to very quickly develop the standard processes and procedures that the team was doing, so then they could transfer that work to another part of the organization. And, again, I've sort of termed those discovery transformation programs: sort of, you know, big, complex programs, but, again, things where you need that detailed insight on what you're currently doing to really jumpstart the improvement that you're trying to get.

Again, even compliance, you know, we're starting to get organizations looking at it for compliance. The first step in almost every audit compliance procedure is: document your current processes.

And with this technology, we can very easily deploy that sensor onto the target desktop, record them for two weeks, and then get a, you know, very detailed insight into what exactly that user or that group of users is doing.

I definitely can see many use cases in compliance, safety-related alone, in the energy industry. We had a lot of needs for, you know, making sure that someone is taking some sort of training, for example; that you know that person is taking that training and going through the training correctly. It may not be the original aim of your application, but certainly there's a lot more use around it. And also touch on the other piece of the question, in terms of deployment to a large enterprise.

You know, we generally work with the Global 2000; we've got a bunch of folks in the Fortune 10 as well. So, you know, we're very comfortable working with large enterprises. And the way deployment works, we don't have to capture activity on everybody's desktop. So, if there's a group of, you know, 50 or 60 that is doing roughly the same type of work, we may put sensors on 10 of those people. We may sort of target, um, you know, a couple of people who have been there for a long time and a couple of new users, so you get diverse sources of data. But we don't need to capture information on everybody who's doing the task, just that sort of subset.

And, really, what we're doing is turning it into a big data problem, because we see somebody who's doing this process, you know, dozens of times a day, or once a day, or once a week, whatever it is, over a set amount of time. And so, you know, if they end up going to lunch in the middle of the process one day, you know, that doesn't really throw off our dataset, because we're looking at this larger spectrum of data and larger dataset, so we can toss those out as outliers and then really, sort of, you know, turn this into a big data problem, mine it, and really identify the core processes, sub-processes, and tasks that go on with that group of people.
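
As a small, hypothetical illustration of that idea, the sketch below takes many observed durations of the same task, drops outliers (such as a run interrupted by a lunch break) using a simple 1.5 x IQR rule, and reports a typical duration. The thresholding rule and the numbers are assumptions for illustration only, not the platform's actual method.

```python
# Hypothetical sketch: summarize many observed task durations after dropping outliers.
import statistics

def typical_duration(durations_sec):
    """Drop IQR-based outliers, then return the median of what remains."""
    q1, _, q3 = statistics.quantiles(durations_sec, n=4)
    iqr = q3 - q1
    low, high = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    kept = [d for d in durations_sec if low <= d <= high]
    return statistics.median(kept), len(durations_sec) - len(kept)

# 40 observed runs of the same task; one run (3600s) was interrupted by lunch.
observed = [310, 295, 330, 305, 322, 298, 315, 3600] + [300 + i for i in range(32)]
median, dropped = typical_duration(observed)
print(f"typical duration ~{median:.0f}s, {dropped} outlier(s) dropped")
```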

That's very good, because that comment right there addresses a number of the questions that came up, which is: I have 200 users; do I need to equip, you know, 200 laptops and desktops with sensors? And you addressed that; you have kind of a selective way of getting a representative sample, if you will.

Now, Jon, on the professionals and the pieces of equipment where you deploy the sensors, tell us a little bit about the sensors themselves. Installation: is it just something that you ship to someone, or can they install it themselves, or does it require a technician to be there? I mean, what does this look like? Installation is very simple. As I mentioned, you know, it literally can be deployed in minutes. It's a standard, you know, MSI file, so it's a point-and-click install. It can be pushed out by IT.

Or it can simply be installed by end users; they just download the file and double-click on it. We've got one client who actually installed it into their app store, so the person just goes to the internal, kind of corporate, app store and downloads it from there.

As I mentioned, it's very lightweight; it's less than a two-megabyte file and uses very limited system resources, I think like less than 3% CPU and maybe around 8 to 10 kilobytes per second of bandwidth. So it uses almost no system resources and almost no bandwidth as well.

Is there any hardware to the sensors? No hardware; it's purely just a software agent that sits on the computer. It can be easily installed and uninstalled when the observation is finished.

Perfect. That's good to know, because there were some questions regarding whether there is a hardware component to this, so you have addressed that. Another question here comes from William Fuller. He asks: how many process levels are recommended for data collection, all of them or some fixed process level? Does that ring a bell to you?

I will say, we would argue that we do level-five process documentation, and at that level it is the screen-by-screen shots of the activity and what's going on on that desktop. You know, a level three might be sort of that traditional swim lane diagram or SIPOC that we've heard about today.

But we go all the way down to that very detailed level, and that's why, when we pass it over to development or, you know, operational excellence or continuous improvement, whatever the team may be, they get very detailed, you know, documentation about what that person is doing on, you know, essentially a click-by-click basis through their day. And really, that information allows, you know, this sort of accelerated discovery. And again, I think it's important to note something I didn't really touch on during the talk.

This technology doesn't get rid of your business analysts. What we're really aiming to do is make them more efficient and, just like RPA, try to shift them from lower-value to higher-value work.

So we're trying to get them to analysis, you know, essentially immediately, and allow them to skip the whole data capture and discovery process and just let them start analyzing, you know, almost instantly. So then they can, you know, accelerate their output, because they don't have to go through that whole documentation and preparation, artifact review, and all those pieces; we enable them to get to analysis, you know, instantly. But, you know, we can't tell you what the ideal future state is. We're telling you what you're currently doing and how you're currently operating.

And, you know, from there, we need that, you know, human intelligence to say, OK, yes, there's the bottleneck; how do I fix it? You know, we can't tell you how to fix the bottleneck or do the implementation of it. That's, you know, going to rely on the people. We're just speeding up that time to analysis, more than anything else.

And those are very good insights. You're very honest about what can be done by the software and what needs to be done by the business. Very good.

So, Jon, you talked about some of the use cases. But if I want to get started and play with this... actually, before that question, I have a different question that popped up here that I think is important.

As you mentioned, it's interesting, because you'll come in to maybe look at one process, one aspect of someone's job, for example. And then, through this technology and the monitoring that's taking place, this person does many other types of work throughout the week, and you can capture all of that.

And then you have this richness of information now that you can analyze. I'm curious, just maybe a bit anecdotally here, how much of that becomes part of the effort? I wonder, because a lot of times, you know, you do some process work, and it turns out to be like 10% of what someone does during the day, and then you discover there's this other 90%. I'm curious about the evolution of this. Is that a natural progression, that people naturally start looking at all the things? Or do clients stay very well focused on only the 10% that they're after?

Yeah. Yeah. Interesting. Let me, let me answer a couple different ways, you know.

Number one, you know, when we go in and sort of observe, again, we're observing generally everything, except, you know, what's on the allow or deny list on the security side.

But it's funny, you know, twice in the past, you know, 60 days, we've had some outputs where we've gone into the client and said, OK, we observed these 10 people for two weeks; you said they all do essentially the same job.

Only three of them are in the core application that you said these people would be in every day.

You know, these three people are doing 90% of the work of the entire group, you know, in terms of time spent in a specific application. And, you know, immediately heads sort of perk up, like, hang on, how do you figure that out? You know, we thought they all did the exact same thing. So that is very much, you know, an eye opener for people. A lot of times, again, it's giving you this level of detail that you have traditionally not had access to.

You know, so that's kind of the biggest surprise people get, and again, you know, we're sort of observing everything.

When you go in and do it manually, typically you're going in and saying, OK, I want to see how you do, you know, unmatched transactions, you know, how you solve those. We're seeing that much larger piece, and that's why, typically, from an observation period, we're getting, you know, 2, 3, 4 PDDs or opportunities for re-engineering, as opposed to the human analyst who's just going in with sort of that laser focus on the one opportunity that they're trying to get. So it's just, again, that natural extension of the platform.

Very good, Jon. Our time is up, but it has been an incredibly interesting journey understanding this application that FortressIQ is doing: their unique, very interesting approach to understanding processes and understanding opportunities for improvement and innovation. Thank you so much for taking the time to share your insights and your expertise with the global audience today.

Thank you very much, and, again, thank you for helping me out with the audio issues. I think we got those solved, and apologies that we had some hiccups there in the beginning. But thank you very much, have a great rest of the conference, and I look forward to tuning into some more of it.

Thank you, Jon.

Have a good one.

Ladies and gentlemen, this wraps up day one of Process Mining Live. Let's take a look at what we're going to cover tomorrow. In the first session tomorrow, we'll have Sandy ... talking to us about process intelligence and the silver bullet for analyzing disruption. That will be followed by Norman. Norman is a technology leader, and he is going to be talking about the internet of things and data mining for results, and he's going to show very practical applications that they are having for the technology right now in their organization.

Following Norman, we have J M ..., and J M is going to talk about bimodal process mining for operational agility, helping us understand what kind of mining and analysis will give us the insights we need to improve our business operations, and how to understand and act on those insights from process mining and analysis activities. And then tomorrow we wrap up the day with a presentation on leveraging process mining to design for success in strategic facility and capacity planning. This is going to be done by the director of process engineering, Christiana Pumphrey, who is going to be here with us tomorrow and will wrap up the day for us. So, thank you so much for an incredible engagement and the awesome questions throughout.

For those of you who want to follow up on additional items, go on LinkedIn, make your commentary, ask questions, and to the extent that we can, we're going to answer those. We appreciate your feedback on all of the sessions, and we'll carry on there until tomorrow, when we're going to be live again at this same spot, and I hope to see you back. So, thank you for now. Have a great rest of your day, and I will see you tomorrow.


About the Author

Jon Knisley,
Principal, Automation and Process Excellence,
FortressIQ

Jon Knisley is a Principal at FortressIQ where he helps companies leverage process intelligence to accelerate their automation and transformation programs.

Prior to his current role, Jon served as the Chief Architect for Intelligent Business Automation at the Defense Department’s Joint AI Center.

 
