Engaging And Enabling Your Organisation With High-quality Supplier Data

Introduction: engagement is the measure of success for shared services functions

Regardless of industry or company size, shared services providers share the same core objectives: to improve efficiency, save time and save money.

As digital transformation continues to impact and differentiate organisations, the ability to execute this vision relies ever more heavily on very high-quality data.

One of the key measures of success for shared services functions is engagement. But not just engagement of a few data specialists – we mean engagement of your entire company and all of your internal partners.

To be successful, you must make it clearly understood that you’re looking to centralise processes, with the caveat that a high degree of localised flexibility will remain to cater to different needs.

Enabling the organisation with supplier data management

You aren’t taking control away from local stakeholders (unless that’s one of your objectives). It’s not about policing but rather about facilitating and helping. This centralised function is important because it’s a big efficiency driver.

It’s not practical to have ten people approve the same change to a supplier’s name, which should be the same for everyone; a single person with the right authority can do this.

To establish centralised processes across a diverse stakeholder group of internal customers, specialised tools are required. The purpose of centralised supplier information management is to enable, not replace, existing systems. There will always be specialist tools; to think a single tool will replace them all is wrong and will result in an uphill battle with existing stakeholder groups.

The only thing you’re removing is the ability to make changes to master data, which isn’t normally what’s important in these tools. For example, an accounting system needs a supplier record to pay an invoice but its goal isn’t to facilitate ongoing maintenance and governance of the supplier data record; its interest is in creating a transaction linked to a supplier master record.

Key functionalities of your tool

Below are the six data differentiators which you should use as a litmus test for your tech evaluation:

  1. Data Model
  2. Data Consolidation
  3. Data Flow
  4. Data Governance Workflow
  5. Data Staging
  6. Data Integration

Data model

The data model of any supplier master data management solution must be highly flexible to provide an efficient way for internal business users and suppliers to work within it. Most data models in ERP and P2P solutions are rigid or limited.

Variations between systems can be complicated to reconcile. The key requirement is that your solution lets you flexibly define your data model and how its entities interlink.
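
To make this concrete, here is a minimal sketch in Python of what a flexible data model could look like; the class and field names are illustrative, not any particular vendor’s schema. The point is a core record plus an open-ended attribute set and typed relationships, rather than a fixed column list.

```python
# A minimal sketch (not any vendor's actual schema) of a flexible supplier
# data model: a core record, an open-ended attribute set, and typed links
# between entities, rather than a fixed column list.
from dataclasses import dataclass, field


@dataclass
class Supplier:
    supplier_id: str
    legal_name: str
    tax_id: str
    # Open-ended attributes let each system or region add fields
    # without a schema change.
    attributes: dict[str, str] = field(default_factory=dict)
    # Typed relationships describe how entities interlink, e.g. a parent
    # company, a remit-to location or an ERP-specific record.
    relationships: list["Relationship"] = field(default_factory=list)


@dataclass
class Relationship:
    relation_type: str   # e.g. "ERP_RECORD", "REMIT_TO", "PARENT_OF"
    target_id: str       # identifier of the linked entity or system record
    attributes: dict[str, str] = field(default_factory=dict)


acme = Supplier(
    supplier_id="SUP-001",
    legal_name="ACME Tools Ltd",
    tax_id="GB123456789",
    attributes={"preferred_currency": "GBP"},
)
acme.relationships.append(
    Relationship("ERP_RECORD", "ERP-A/100045", {"payment_terms": "NET30"})
)
```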

Data consolidation

The tech needs to support consolidation from multiple disparate systems into a single view and flexibly accommodate 100% of the attributes from those systems for 100% of your suppliers.

If you have five different ERPs, then there should be one supplier in your centralised solution with five relationships (these relationships could break down further into company codes, purchasing organisations or operating units – depending on your ERP) holding all of the data attributes.
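
As a rough, self-contained illustration (supplier names, IDs and attributes are invented), this is what that single view might look like for a supplier that exists in five ERPs:

```python
# A minimal sketch: the same supplier exists in five ERPs, and consolidation
# produces one central record with one relationship per source system,
# carrying every attribute from every source.
erp_records = {
    "ERP-1": {"local_id": "100045", "name": "ACME Tools Ltd",     "payment_terms": "NET30"},
    "ERP-2": {"local_id": "ACM01",  "name": "ACME Tools Limited", "payment_terms": "NET45"},
    "ERP-3": {"local_id": "50233",  "name": "Acme Tools Ltd.",    "payment_terms": "NET30"},
    "ERP-4": {"local_id": "V-7781", "name": "ACME Tools Ltd",     "payment_terms": "NET60"},
    "ERP-5": {"local_id": "900112", "name": "ACME TOOLS LTD",     "payment_terms": "NET30"},
}

golden_record = {
    "supplier_id": "SUP-001",
    "legal_name": "ACME Tools Ltd",
    # One relationship per source system, keeping 100% of its attributes;
    # in practice these could break down further into company codes or
    # purchasing organisations.
    "relationships": [
        {"source_system": system, **attributes}
        for system, attributes in erp_records.items()
    ],
}

assert len(golden_record["relationships"]) == 5  # one supplier, five relationships
```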

Data consolidation is one of the main risks and key success factors of any project that tries to centralise data from multiple sources into a single source. Don’t be fooled if anyone tells you this is simple; it isn’t.

Data flow

Creation and maintenance of supplier data need to start in the centralised solution, which ensures that you don’t duplicate effort. The data is then fed into transactional systems so business users can run workflows and generate purchase orders.

There is a reason why P2P or supplier network solutions normally receive data from the ERP first: they need a transactional supplier record to reconcile purchase orders and invoices back to the ERP. If they didn’t, the interfaces for purchase orders and invoices would fail.

This approach is unsuitable for centralised supplier data management because you only want to work on a single record, with the system keeping the other records and information in sync. You don’t want several versions of the same supplier in your centralised solution.
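
A minimal sketch of the intended direction of flow, with hypothetical system names and functions: the record is created once centrally and then pushed out to each transactional system.

```python
# A minimal sketch of the direction of flow: the supplier record is created
# once in the central solution, then pushed out to each transactional
# system, rather than each ERP or P2P tool creating its own copy first.
def create_supplier_centrally(legal_name: str, tax_id: str) -> dict:
    """Create the single master record in the centralised solution."""
    return {"supplier_id": "SUP-001", "legal_name": legal_name, "tax_id": tax_id}


def push_to_downstream(master: dict, systems: list[str]) -> dict[str, dict]:
    """Feed the master record into each transactional system so users can
    raise purchase orders there; the master stays the single point of change."""
    return {system: {**master, "source": "central_mdm"} for system in systems}


master = create_supplier_centrally("ACME Tools Ltd", "GB123456789")
downstream = push_to_downstream(master, ["ERP-Division-A", "P2P", "Supplier-Network"])
```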

Data governance workflow

Every single change to data should be governed. This means that there is a workflow process, so that when a change happens it will go through various approvals. You should be able to easily add your own data governance workflows in the system and extend any out-of-the-box functionality. Data governance is specific to how an organisation is structured and its particular industry.

Also, there needs to be clear differentiation between global (common) data and local data. A good example is that the tax ID and name are global data, so they should be common for everyone. Payment terms, on the other hand, are considered local data because different organisations may have different terms with the same supplier.

Data governance has to be fine-grained to be efficient. You can have a different type of data change request for a manufacturing location, remit-to location, ordering location, bank account, legal entity and so on. These examples all have key differences between them and normally go to different stakeholder groups.
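
As an illustration of fine-grained routing (the change types, fields and approver groups here are invented), the type of change decides who approves it, and global fields always pull in the central data owner:

```python
# A minimal sketch of fine-grained governance: the change type decides the
# approver groups, and global (common) fields such as tax ID or legal name
# are governed separately from local fields such as payment terms.
GLOBAL_FIELDS = {"legal_name", "tax_id"}

APPROVAL_ROUTES = {
    "bank_account":      ["treasury"],
    "remit_to_location": ["accounts_payable"],
    "ordering_location": ["procurement"],
    "legal_entity":      ["legal", "master_data_team"],
}


def route_change_request(change_type: str, field: str) -> list[str]:
    """Return the approver groups for a proposed change."""
    approvers = list(APPROVAL_ROUTES.get(change_type, ["master_data_team"]))
    # Changes to global data always need the central data owner, because
    # they affect every division that uses this supplier.
    if field in GLOBAL_FIELDS:
        approvers.append("global_data_owner")
    return approvers


print(route_change_request("bank_account", "iban"))        # ['treasury']
print(route_change_request("legal_entity", "legal_name"))  # ['legal', 'master_data_team', 'global_data_owner']
```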

Data staging

In order to not ‘garbage up’ your data, your solution needs to ‘stage’ it. This means that if a supplier or internal business user makes a change, it shouldn’t be applied immediately. Changes will only be visible to users and sent to downstream and upstream systems once they’ve been validated through the appropriate data governance channels.

This helps avoid duplicates by ensuring that records are only created after the right checks have verified that the new record doesn’t already exist somewhere else in the organisation.
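
A minimal sketch of staging, with a deliberately naive duplicate check based only on the supplier name: a proposed change stays pending and is only applied to the live data, and sent downstream, once it passes governance.

```python
# A minimal sketch of staging: a proposed change sits in a pending state and
# is only applied to the live record (and sent to other systems) once it has
# passed governance and a duplicate check.
pending_changes: list[dict] = []
live_suppliers = {"SUP-001": {"legal_name": "ACME Tools Ltd"}}


def propose_new_supplier(legal_name: str) -> dict:
    change = {"action": "create", "legal_name": legal_name, "status": "pending"}
    pending_changes.append(change)   # staged, not yet visible to users
    return change


def approve(change: dict) -> None:
    duplicate = any(s["legal_name"].lower() == change["legal_name"].lower()
                    for s in live_suppliers.values())
    if duplicate:
        change["status"] = "rejected_duplicate"
        return
    live_suppliers[f"SUP-{len(live_suppliers) + 1:03d}"] = {
        "legal_name": change["legal_name"]
    }
    change["status"] = "approved"   # only now would downstream systems be updated


change = propose_new_supplier("Acme Tools Ltd")
approve(change)
print(change["status"])  # 'rejected_duplicate' - the record already exists
```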

Data integration

To enable and fuel other systems, a master data management tool should have sophisticated abilities to pull and push data in order to eliminate duplicated manual efforts. Here’s a practical example to explain how this works in the real world.

Example:

You have three different ERPs (this example works with a single instance as well, but is easier to explain with multiple ERP instances): one each for Division A, Division B and Division C. All divisions use ACME Tools Ltd, so you have three records, one in each ERP. You have loaded these records into your P2P tool, so it also holds three records for the same supplier. A user in Division A decides to change the data of ACME Tools Ltd.

  • Will it only change it for their record in their ERP? Will this data no longer match with the other two records?
  • Will it change it for all three records? If the answer is yes then who will approve this change? Is it all three Divisions or just the one Division? What if the other Divisions don’t want their data to change?

The above example illustrates the need for the six data differentiators to create a single view across multiple locations and addresses.
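
As a rough sketch of how the scenario above could play out when the change is made centrally instead (division names and fields are illustrative): a global field is approved once and pushed to all three ERP records, while a local field changes only for the division that requested it.

```python
# A minimal sketch of centrally governed change: a global field is approved
# once and pushed to all three ERP records so they stay in sync, while a
# local field only changes for the division that asked for it.
GLOBAL_FIELDS = {"legal_name", "tax_id"}

erp_records = {
    "Division A": {"legal_name": "ACME Tools Ltd", "payment_terms": "NET30"},
    "Division B": {"legal_name": "ACME Tools Ltd", "payment_terms": "NET45"},
    "Division C": {"legal_name": "ACME Tools Ltd", "payment_terms": "NET60"},
}


def apply_central_change(field: str, value: str, requested_by: str) -> None:
    if field in GLOBAL_FIELDS:
        # One approved change, propagated to every ERP record.
        for record in erp_records.values():
            record[field] = value
    else:
        # Local data changes only for the requesting division.
        erp_records[requested_by][field] = value


apply_central_change("legal_name", "ACME Tools Limited", requested_by="Division A")
apply_central_change("payment_terms", "NET15", requested_by="Division A")
print(erp_records["Division B"]["legal_name"])     # 'ACME Tools Limited'
print(erp_records["Division B"]["payment_terms"])  # 'NET45'
```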

Silos are a challenging part of life in all large organisations and can often be seen as one of the main reasons why data is poor quality. However, data is rarely viewed in terms of the opportunities it presents. Data is the language in which you do business and, more than anything else, the one thing that binds the silos together.

Data quality issues are fixed like anything else: through change. Remember, this is a business transformation project. Poor data quality is a gap in understanding, not a problem with any single system.

You should embrace multiple systems because consolidation is a short-lived objective in today’s ever-changing business world. It won’t be long before new acquisitions, regulations or other business demands introduce new systems into your landscape.

Conclusion: People, Processes, Technology

Data governance is the people component within enterprise data management. You should focus on collaboration, connecting people and knowledge. When you do this, you can transform any single group’s capacity to imagine, understand and unleash a new way of working with data.

Understanding data usage and flow across silos creates a shared understanding of the business that allows information to be reused and connections to be optimised. A shared data understanding bridges the silos of the modern organisation to build a business-wide network. To understand your data is to understand your business and your internal customers’ needs.

Practicality is more important than purity. Too often IT, shared business services functions and master data management practitioners put too much emphasis on data theory rather than solving their business’s problem. The objective – solving the problem – should be the guiding principle, not the theory or a strategy, or even which brand of system to use.

If you found this interesting then take a look at our other resources on data and supplier data management here.
