Dovetail process flow and master data management


In my last post, I discussed the issue of temporal inconsistency for master data, in which the records in the master repository fall out of sync with the source systems because synchronization happens only at intervals. Periodic master data updates that pull data from systems without considering alignment with in-process business activities create the potential for this inconsistency. The way to address this is straightforward: don’t do your master data consolidation as a periodic process. Instead, push your identity resolution and master data management (MDM) into your business processes.

That is, of course, easier said than done. It typically requires two key activities that are not insignificant:

  • Align all enterprise process flows and all related master data touch points with the master record life cycle.
  • Renovate existing systems to use defined master data services as a way to manage entity data.

Align with the master record life cycle

One way to understand alignment with the master record life cycle is to consider the entity’s life cycle in relation to the way the business operates. Then you can identify where in each part of that life cycle the master data is created, read, updated or decommissioned.

Let’s use customer data as an example. There is a life cycle associated with the customer’s relationship with the business. It starts with the initial contact and continues through qualification as a prospect, becoming a customer, executing transactions, interacting with customer support and service, and doing product upgrades. The cycle can start over again when the customer makes another purchase.

Within the processes for each of these interactions, there are places where the customer is asked for data – and there are places where an agent (such as a customer support representative) reads the customer’s data, or where the data is updated. Examine each of these touch points and determine how that phase of the interaction affects the “composite” view of the customer’s record. Also consider what type of master data service could be invoked to ensure that the master record remains consistent with the simultaneous business processes in flight.
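As a concrete illustration, the touch-point inventory described above can be captured as a simple mapping from life-cycle phase to the master-data operations performed there. A minimal Python sketch, in which the phase names and operation sets are illustrative assumptions rather than a prescribed model:

```python
# Hypothetical sketch: map each phase of the customer life cycle to the
# master-data operations (create/read/update) its touch points perform.
# Phase names and operation sets are illustrative, not a prescribed model.

LIFE_CYCLE_TOUCH_POINTS = {
    "initial_contact": {"create"},          # prospect record is first captured
    "qualification":   {"read", "update"},  # enrich and score the record
    "conversion":      {"update"},          # prospect becomes a customer
    "transactions":    {"read", "update"},  # orders read and amend the record
    "support":         {"read", "update"},  # agents view and correct data
    "upgrade":         {"read", "update"},  # a renewed purchase restarts the cycle
}

def operations_for(phase: str) -> set:
    """Return the master-data operations expected at a given life-cycle phase."""
    return LIFE_CYCLE_TOUCH_POINTS.get(phase, set())
```

Walking each phase of such a table against the “composite” customer record is one way to surface where a master data service needs to be invoked.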

Renovate existing systems

The second activity needed to keep master data synchronized is more complex. It requires renovating your existing systems to use those defined master data services for managing entity data instead of maintaining a local copy within a dedicated data store. That means giving each business process the ability to search the master repository, read the master record and update the master record when a change is made that might impact other business activities. I refer to this as “dovetailing” the business process with master data management. And that means instituting policies and rules for ensuring that the master record remains consistent for all master data consumers.
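To make the idea concrete, here is a minimal Python sketch of such a shared master data service. The class and method names (`MasterDataService`, `search`, `read`, `update`) are illustrative assumptions, not a standard API; the version check in `update` is one simple policy for keeping the master record consistent across concurrent consumers:

```python
# Minimal sketch of a shared master data service that renovated systems would
# call instead of maintaining a local copy. All names here are illustrative
# assumptions, not a standard API.
import itertools

class StaleRecordError(Exception):
    """Raised when an update is based on an out-of-date master record."""

class MasterDataService:
    def __init__(self):
        self._records = {}            # master_id -> (version, attributes)
        self._ids = itertools.count(1)

    def search(self, **criteria):
        """Identity-resolution stand-in: match records on exact attributes."""
        return [mid for mid, (_, attrs) in self._records.items()
                if all(attrs.get(k) == v for k, v in criteria.items())]

    def create(self, attrs):
        mid = next(self._ids)
        self._records[mid] = (1, dict(attrs))
        return mid

    def read(self, mid):
        version, attrs = self._records[mid]
        return version, dict(attrs)

    def update(self, mid, base_version, changes):
        """Optimistic check: reject updates built on a stale read, so that
        concurrent business processes cannot silently overwrite each other."""
        version, attrs = self._records[mid]
        if base_version != version:
            raise StaleRecordError(f"record {mid} changed since version {base_version}")
        attrs.update(changes)
        self._records[mid] = (version + 1, attrs)
        return version + 1
```

The version check means a business process that read version 1 of a record cannot overwrite changes made by another process in the meantime; it must re-read and reconcile, which is precisely the kind of rule that keeps the master record consistent for all master data consumers.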

While the “transaction hub” MDM operating model is engineered to support this approach, its description focuses on the technical aspects of the MDM system and less so on business process alignment. I have rarely seen an organization that has adopted the transaction hub model, and when I have, it is usually for cataloguing, such as product data management. But I do believe there is room for evolving the transaction hub model by layering on the business process modeling and data governance techniques that can operationalize observance of implicit temporal dependencies.
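One way such a temporal dependency could be operationalized is as a governance rule on the hub’s update path: a change whose source timestamp predates the master record’s last update is held back rather than applied. A hedged sketch, in which the function and field names are hypothetical:

```python
# Sketch of a governance rule that operationalizes an implicit temporal
# dependency: an inbound change must not be older than the master record it
# would overwrite. Function and field names are illustrative assumptions.
from datetime import datetime

def temporally_consistent(master_last_updated: datetime,
                          change_source_time: datetime) -> bool:
    """Accept a change only if its source timestamp is at least as recent as
    the master record's last update."""
    return change_source_time >= master_last_updated

def apply_if_consistent(master: dict, change: dict) -> bool:
    """Gate an update through the temporal rule before it reaches the hub."""
    if not temporally_consistent(master["last_updated"], change["source_time"]):
        return False   # stale change: route to data stewardship review instead
    master.update(change["attributes"])
    master["last_updated"] = change["source_time"]
    return True
```

Rather than silently applying whatever arrives, the rule turns the temporal dependency into an explicit, auditable governance decision.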


Get the TDWI checklist: Seven Tips for Unified Master Data Management


About Author

David Loshin

President, Knowledge Integrity, Inc.

David Loshin, president of Knowledge Integrity, Inc., is a recognized thought leader and expert consultant in the areas of data quality, master data management and business intelligence. David is a prolific author on data management best practices, via the expert channel at b-eye-network.com and numerous books, white papers, and web seminars. His book, Business Intelligence: The Savvy Manager’s Guide (June 2003) has been hailed as a resource allowing readers to “gain an understanding of business intelligence, business management disciplines, data warehousing and how all of the pieces work together.” His book, Master Data Management, has been endorsed by data management industry leaders, and his valuable MDM insights can be reviewed at mdmbook.com. David is also the author of The Practitioner’s Guide to Data Quality Improvement. He can be reached at loshin@knowledge-integrity.com.
