Foundational to P2P success: Master Data Management

Yatin Anand
Director, KPMG, a Coupa partner

Yatin is an experienced leader with a proven track record of driving supply chain and procurement strategy and transformation initiatives across several industries. He specializes in leading global teams on multi-country, multi-cultural programs to deliver strong, measurable results.

Read time: 9 mins

Master data management is critical to the success of any purchase-to-pay (P2P) or spend analytics initiative: better master data enables better decision-making, stronger sourcing, improved compliance, and progress toward Environmental, Social, and Governance (ESG) goals. It is rarely seen as a fun, rewarding project the way a P2P implementation is, and so it is often an overlooked piece of the puzzle; yet many of the benefits ascribed to P2P and spend analytics are, in fact, the result of rigorous and proactive master data management. The quality of the input determines the quality of the results. It’s the classic “garbage in, garbage out” scenario.

There are typically two key master data components that serve as input into the P2P processes: vendor master data and item master data.

For vendor master data, “garbage in” means missing or incorrect data elements, such as payment terms, contact information, vendor tax identification numbers, and PO transmission email addresses. The result is multiple records with mismatched information, as well as outdated vendor details.
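To make this concrete, here is a minimal sketch in Python of the kind of required-field check that can catch these gaps before vendor records are loaded; the field names (payment_terms, tax_id, po_email, and so on) and sample records are illustrative assumptions, not fields from any particular P2P system.

```python
# Minimal sketch: flag vendor master records with missing or malformed
# key fields before they are loaded into a P2P system.
# Field names and records are illustrative assumptions.
import re

REQUIRED_FIELDS = ["vendor_name", "payment_terms", "tax_id", "po_email"]

def validate_vendor(record: dict) -> list[str]:
    """Return a list of data quality issues found on a single vendor record."""
    issues = [f"missing {field}" for field in REQUIRED_FIELDS if not record.get(field)]
    po_email = record.get("po_email", "")
    if po_email and not re.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", po_email):
        issues.append("invalid po_email")
    return issues

vendors = [
    {"vendor_name": "Acme Corp", "payment_terms": "NET30",
     "tax_id": "12-3456789", "po_email": "orders@acme.example"},
    {"vendor_name": "Acme Corporation", "payment_terms": "",
     "tax_id": "12-3456789", "po_email": "orders@acme"},
]

for vendor in vendors:
    problems = validate_vendor(vendor)
    if problems:
        print(vendor["vendor_name"], "->", ", ".join(problems))
```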

For item master data, “garbage in” is duplication and inconsistency across data elements. This usually stems from the lack of a well-defined taxonomy and is further exacerbated by poor governance, yielding multiple versions of the same item, many of which carry differing or incomplete information.
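As an illustration, a simple first pass at surfacing these duplicates is to normalize item descriptions and group records that collapse to the same key; the item numbers and descriptions below are invented for the example.

```python
# Minimal sketch: group item master records whose normalized descriptions
# collapse to the same key, a common first pass at spotting duplicates.
# Item numbers and descriptions are invented for illustration.
from collections import defaultdict
import re

items = [
    {"item_no": "100234", "description": "Gasket, Rubber 2 in"},
    {"item_no": "200781", "description": "RUBBER GASKET 2 IN"},
    {"item_no": "100562", "description": "Bearing, Ball 6204"},
]

def normalize(text: str) -> str:
    """Lowercase, strip punctuation, and sort tokens so word order doesn't matter."""
    tokens = re.findall(r"[a-z0-9]+", text.lower())
    return " ".join(sorted(tokens))

groups = defaultdict(list)
for item in items:
    groups[normalize(item["description"])].append(item["item_no"])

for key, item_nos in groups.items():
    if len(item_nos) > 1:
        print("Possible duplicates:", item_nos, "->", key)
```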

The taxonomy chicken and egg

My colleague, Dipan Karumsi, has written in this space before about the importance of a well-defined taxonomy. It’s a bit of a chicken-and-egg argument: the lack of a defined taxonomy creates a lot of dirty data, but if you’ve had a poorly defined taxonomy all along, you cannot simply take that dirty data and pour it into your new and improved taxonomy. You have to cleanse the data first, and then categorize it into the new taxonomy.

One of the biggest challenges we see in our clients’ organizations is decentralized data management with no enterprise-wide governance. There may be multiple owners across business units or regions, and data may reside in numerous different systems. Additionally, conflicting policies lead to inconsistent processes for creating and updating master data elements, complicating maintenance and increasing total cost of ownership.

Loading low quality vendor master data into your P2P system can result in a number of challenges, ranging from users selecting the wrong vendor when creating a requisition, to purchase orders being transmitted to an incorrect vendor email address, to payment delays to the vendor. Similarly, data inconsistencies in the item master make it difficult to roll out catalogs and aggregate enterprise-wide demand.

All of this is compounded in a global organization. Even something as basic as the definition of master data may vary from geography to geography. Because each geography may treat master data differently, the processes for setting up and maintaining that data can vary significantly.

First step: An as-is assessment

One of the first things we do with clients is conduct an as-is assessment to gain an understanding of their current state practices and unique requirements related to setting up, updating, and de-activating master data records.

A data quality assessment is conducted in parallel to gain insights into the consistency, completeness, and accuracy of the data. This information allows us to analyze the client organization’s maturity and decide whether a tool needs to be deployed to facilitate a data cleanse and enable master data governance. The choice is usually directly related to the size of the company and scale of the project.
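As a rough illustration of what such an assessment measures, the short sketch below profiles a vendor extract for completeness by field and a crude duplicate rate; the field names and records are assumptions made purely for the example, not output from any assessment tool.

```python
# Minimal sketch: profile a vendor master extract for per-field completeness
# and a rough duplicate rate on normalized vendor names.
# Field names and records are illustrative assumptions.
records = [
    {"vendor_name": "Acme Corp",  "tax_id": "12-3456789", "payment_terms": "NET30"},
    {"vendor_name": "ACME CORP.", "tax_id": "12-3456789", "payment_terms": ""},
    {"vendor_name": "Globex Inc", "tax_id": "",           "payment_terms": "NET45"},
]
fields = ["vendor_name", "tax_id", "payment_terms"]

# Completeness: share of records where each field is populated.
for field in fields:
    filled = sum(1 for r in records if r.get(field, "").strip())
    print(f"{field}: {filled / len(records):.0%} complete")

# Rough duplicate rate: records sharing the same normalized vendor name.
names = [r["vendor_name"].lower().replace(".", "").strip() for r in records]
duplicates = len(names) - len(set(names))
print(f"duplicate rate: {duplicates / len(records):.0%}")
```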

If we are looking at a global implementation with a large number of records across different plants, geographies, and business units, we probably need an enabling tool. A tool-based approach also works best when there are multiple back-end systems. The tool then sits on top of all of them to standardize and control the flow of data into each one. For small to mid-tier companies where all of the data is housed in a single ERP, manual cleansing is a possibility.

Defining a target state

Regardless of whether a tool-enabled approach is selected, it is critical to define and implement a target state operating model that can maximize and maintain the quality of master data on an ongoing basis after the initial cleanse is completed. This entails enterprise-wide standardization across the dimensions of process, governance, and organization, with variations only where needed to accommodate specific language, legal, tax, and regulatory requirements.

Standardized data setup and maintenance processes across the enterprise help drive consistency in master data. Up-front duplicate checking drives the creation and sustainment of a single golden record based on the established taxonomy. Streamlined approval flows increase the likelihood of timely sign-off from the appropriate parties before a record is set up or updated.
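For instance, an up-front duplicate check can be as simple as fuzzy-matching a proposed vendor name against existing records before the new record is allowed through. The sketch below uses Python’s standard-library difflib; the vendor names and the 0.7 similarity threshold are assumptions for illustration only.

```python
# Minimal sketch: before creating a new vendor record, fuzzy-match the
# proposed name against existing golden records and flag likely duplicates.
# Vendor names and the 0.7 similarity threshold are illustrative assumptions.
from difflib import SequenceMatcher

existing_vendors = ["Acme Corporation", "Globex Incorporated", "Initech LLC"]

def likely_duplicates(proposed: str, threshold: float = 0.7) -> list[str]:
    """Return existing vendor names that closely resemble the proposed name."""
    return [
        name for name in existing_vendors
        if SequenceMatcher(None, proposed.lower(), name.lower()).ratio() >= threshold
    ]

print(likely_duplicates("Acme Corp"))  # flags "Acme Corporation" as a likely duplicate
```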

Clearly defined policies and data standards, along with metrics and reports that measure process performance, data quality, and data usage, allow for enterprise-wide governance of master data activities.

Setting up an organizational framework that includes a centralized governing body to provide strategic vision and direction, along with localized groups to support day-to-day master data quality needs, helps establish clear ownership and stronger controls.

Taking the time to focus on master data management is critical to driving an optimal outcome from your P2P initiative. Greater accuracy and completeness of master data play a significant role in obtaining P2P business case benefits resulting from increased spend visibility and enhanced compliance.

Yatin Anand is Director, Operations Advisory Services, at KPMG in Chicago. He focuses on business transformation across the supply chain within various industry sectors.