
The Two Data Deadly Sins of Process Execution


Are you ready to execute your processes? BPM architect Paul Grobler has two words of warning before you hit that ‘run’ button:

You’ve spent a lot of time defining your processes. You’ve interviewed your stakeholders and run plenty of workshops. You know how to translate the process model into something that will actually execute in your process engine.

But there’s one thing you’ve forgotten. And that one thing could turn out to be the proverbial fly in the ointment, spoiling all your hard work and stealing away your team’s hard-earned credibility.

That one thing is data quality.

These two ‘data deadly sins’ are the key reasons why so many processes fail to deliver the expected benefits:


Sin 1: Bad quality data means bad process

Any conditions you build into your process model need data to determine the direction of the process flow. If the data is bad, your process will flow in the wrong direction and produce unexpected results. “But I’ve tested my process thoroughly!” I hear you say.
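The kind of condition described above can be sketched in a few lines. This is a hypothetical illustration (the routing rule, field names, and outcomes are invented, not from the article): a gateway that branches on a customer's credit limit, and what happens when that data arrives missing or malformed.

```python
# Hypothetical sketch: a routing condition that branches on captured data.
def route_order(order: dict) -> str:
    """Route an order based on its value and the customer's credit limit."""
    credit_limit = order.get("credit_limit")  # may be missing or malformed
    # Without this guard, bad data silently takes the wrong branch:
    # a missing credit_limit would make the comparison raise, and a
    # limit stored as the string "10000" is not comparable to a number.
    if isinstance(credit_limit, bool) or not isinstance(credit_limit, (int, float)):
        return "manual_review"  # fail safe, not silent
    if order["amount"] <= credit_limit:
        return "auto_approve"
    return "manual_review"

assert route_order({"amount": 500, "credit_limit": 10000}) == "auto_approve"
assert route_order({"amount": 500, "credit_limit": "10000"}) == "manual_review"
assert route_order({"amount": 500}) == "manual_review"
```

The point of the guard is that the process fails safely to a human review path rather than letting an anomaly pick a branch for you.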

It’s likely that Go Live will reveal many data anomalies that you have missed; you couldn’t have catered for every permutation in your testing cycle. How does this come to your attention? An angry end-user phone call, perhaps? Or worse, a customer experiences problems before they are caught internally.

The result? One of the hardest things to recover from: a bad customer experience. Not to mention the embarrassment faced by the organisation, or the blame pointed your way.


Sin 2: Bad process means bad quality data

If your data is not being treated as an asset, you are committing another deadly sin. There is only one way data gets into your information systems, and that is via a process. Whether it runs in a process engine or is sketched out on a whiteboard, the process that captures the data is just as important as the optimisation you are trying to achieve. Bad quality process = bad quality data.
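One way to avoid this sin is to validate data at the capture step itself, so the process never lets bad records in. A minimal sketch, assuming an invented customer-capture step and invented field rules (none of this comes from the article):

```python
# Hypothetical sketch: a capture step that validates and normalises
# master data at the point of entry instead of persisting raw input.
import re

def capture_customer(raw: dict) -> dict:
    """Validate and normalise customer master data before it enters the system."""
    errors = []
    name = (raw.get("name") or "").strip()
    if not name:
        errors.append("name is required")
    country = (raw.get("country") or "").strip().upper()
    if not re.fullmatch(r"[A-Z]{2}", country):
        errors.append("country must be an ISO 3166-1 alpha-2 code")
    if errors:
        # Reject at the boundary: the capture process, not a downstream
        # report, is the right place to catch bad master data.
        return {"status": "rejected", "errors": errors}
    return {"status": "accepted", "record": {"name": name, "country": country}}

assert capture_customer({"name": " Acme ", "country": "za"})["status"] == "accepted"
assert capture_customer({"name": "", "country": "South Africa"})["status"] == "rejected"
```

The cheap check at capture time is what keeps the downstream reports honest.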

How does this manifest itself? Data captured by your process, especially Master or Reference data, always has a downstream use in operational reporting, and it’s your process’s credibility that’s at stake when that information feeds a dashboard ogled by management.


Where does this leave you?

Take care of your data, and your data will take care of your process. Ignore data quality, and you will lose face and be accused of bad BPM practice. Even if you don’t have a formal convergence of BPM and Master Data Management in your environment, separating your data layer from your process layer will not only make your processes more agile and more efficient; it will also increase your options for reuse, from both your process and your data perspectives.
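The layer separation argued for above can be sketched briefly. This is one possible shape, not a prescribed design; the interface and class names are invented: the process task depends only on a data-access contract, so the same task can be reused over any backing store.

```python
# Hypothetical sketch: the process layer depends on a data-layer
# interface rather than wiring data access into each task.
from typing import Protocol

class CustomerData(Protocol):
    def credit_limit(self, customer_id: str) -> float: ...

class InMemoryCustomerData:
    """Stand-in data layer; a real one might front an MDM hub or database."""
    def __init__(self, limits: dict):
        self._limits = limits
    def credit_limit(self, customer_id: str) -> float:
        return self._limits[customer_id]

def approval_task(data: CustomerData, customer_id: str, amount: float) -> str:
    # The process step sees only the contract, so it is reusable
    # against any data source that honours it.
    return "approve" if amount <= data.credit_limit(customer_id) else "review"

store = InMemoryCustomerData({"c1": 1000.0})
assert approval_task(store, "c1", 500.0) == "approve"
assert approval_task(store, "c1", 5000.0) == "review"
```

Swapping the in-memory store for a governed master-data source changes nothing in the process logic, which is the reuse the paragraph above is pointing at.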


Paul Grobler heads up the MDM Practice for Zetta Business Solutions in Cape Town, South Africa. He consults on solving business problems using MDM, BPM, SOA and EA across multiple industries including finance, telecommunications, medical and government.