Pure Procurement https://www.pureprocurement.ca/

How to Display Transaction Codes in SAP Menu
Sat, 18 Mar 2023

Are you tired of sifting through the SAP GUI menu just to find the right transaction? If so, I have some good news for you! Did you know that you can customize your SAP menu to display transaction codes (Tcodes) for all the available programs in SAP? It’s a simple process that will make your life easier over time.

Illustration of how to display the SAP TCodes in the menu

Why Show Tcodes in the SAP Menu?

As you familiarize yourself with the Tcodes you use most often, you will learn them by heart and no longer have to navigate through the menu. Displaying them is also helpful when trying to find a transaction you don’t use often. In this article, I’ll guide you through the steps to display Tcodes in your SAP menu.

Here’s what the end result looks like:

SAP Menu with Tcodes displayed

How to Display Tcodes in Your SAP Menu

Now that you understand the benefits of displaying Tcodes, let’s move on to how to display them. Here’s a step-by-step guide:

  • Open the SAP Graphical User Interface (GUI) and log in to your account.
  • Click on the ‘Extras’ button, followed by ‘Settings’ option in the SAP menu bar (or press ‘Shift + F9’).
Printscreen of 'Extras' and 'Settings' option in SAP GUI Menu Bar
  • Select the ‘Display Technical Names’ option and press the ‘Continue’ button at the bottom of the window (or press ‘Enter’).
SAP GUI menu settings window with options highlighted

That’s it! You should now see the Tcodes displayed next to the program names in your SAP menu.

The two other options (‘Display Favorites at end of List’ and ‘Do Not Display Menu (Only Favorites)’) are not as useful so I don’t recommend using them. The first changes the order of your ‘Favorites’ to be after the regular SAP Menu which doesn’t make much sense. The second removes ‘Favorites’ altogether which makes even less sense…

Tips for Using Tcodes

Now that you have Tcodes displayed in your SAP menu, here are some tips for using them effectively:

  • Use your ‘Favorites’ section: For the transactions you use most often, ‘Right Click’ on them and use the ‘Add to Favorites’ option. This will add them to your ‘Favorites’ section and make them more easily accessible.
  • Learn the Tcodes: Take some time to familiarize yourself with the Tcodes for the programs you use most frequently.
  • Use the ‘Command Field’: Once you know the Tcode for a given program, you can type it directly in the ‘Command Field’ and press ‘Enter’ to navigate to the transaction instead of using the menu.
  • Use the Tcode structure to your advantage: Most transactions have 3 variants (create, change and display). The ‘create’ transactions finish with a 1, the ‘change’ transactions finish with a 2 and the ‘display’ transactions finish with a 3. Use this to your advantage when navigating and searching transactions with the ‘command field’.
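To make the suffix convention concrete, here is a minimal sketch in Python (purely illustrative, not an SAP feature) that guesses a transaction’s action from its trailing digit. MM01, XK02 and VA03 are common examples of this pattern, though some transactions (e.g., newer ones ending in ‘N’) don’t follow it.

```python
# Illustrative sketch of the Tcode suffix convention: 1 = create, 2 = change, 3 = display.
SUFFIX_ACTION = {"1": "create", "2": "change", "3": "display"}

def describe_tcode(tcode: str) -> str:
    """Guess the action of a Tcode from its trailing digit."""
    action = SUFFIX_ACTION.get(tcode[-1], "unknown")
    return f"{tcode}: {action}"

print(describe_tcode("MM01"))  # MM01: create (material master)
print(describe_tcode("XK02"))  # XK02: change (vendor master)
print(describe_tcode("VA03"))  # VA03: display (sales order)
```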

Conclusion

By displaying Tcodes in your SAP menu and using the tips above, you can make your workflow more efficient. The steps are simple to follow, and once you’ve done it, you’ll wonder how you ever worked without it. Start using Tcodes today and see how much time and effort you can save.

———————————
Do you already know your Tcodes by heart? Does it make you sad that these Tcodes are etched into your soul? What other benefits do you see in listing the Tcodes in your menu? Let me know in the comments.

If you liked this post, why not Subscribe?

Last Updated on July 9, 2024 by Joël Collin-Demers

Without Quality Data, Artificial Intelligence Is a Shriveled-up Prune
Wed, 27 Jan 2021

Shriveled-up prunes resembling an AI application with no data
I’ve got nothing against prunes… Really.

Building Artificial Intelligence (AI) applications in your business without high quality data is like driving a Ferrari without gas. It’s expensive and it gets you nowhere.

Defining Artificial Intelligence

Before dissecting my claim, it’s important to first state that the field of AI is overwhelmingly vast. Currently, Limited Memory AI is the most promising and useful type of AI for business. Therefore, it is the focus of this article. Limited Memory AI uses machine learning (ML) algorithms that combine various data sets to establish probabilistic knowledge. These data sets can come from facts (e.g., measurement conversion rates), past events (e.g., previous system transactions), simulations (e.g., weather projections) or even previous algorithm executions (e.g., a toy car better recognizing a wall after crashing into it multiple times). In essence, the goal of these algorithms is to iteratively crunch data, recognize patterns and establish principles to make predictions within a confidence interval. Rules can then be built around those predictions and confidence intervals to automate actions within your systems.
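As a toy illustration of “rules built around predictions and confidence intervals”, the sketch below (hypothetical threshold, labels and action names, in Python) automates an action only when a model’s confidence clears a bar, and routes everything else to a human:

```python
# Toy illustration of automating an action from a prediction plus a confidence
# score. The 0.9 threshold and the action strings are invented for this example.
def act_on_prediction(label: str, confidence: float, threshold: float = 0.9) -> str:
    if confidence >= threshold:
        return f"auto-approve: {label}"
    return f"route to human review: {label}"

print(act_on_prediction("duplicate invoice", 0.97))  # auto-approve: duplicate invoice
print(act_on_prediction("duplicate invoice", 0.55))  # route to human review: duplicate invoice
```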

As you can see, to effectively harness Limited Memory AI, you need quality data. It must be complete, available & accurate/trustworthy. Without it, system predictions are worse than human predictions (which are not great to begin with…). So, how can you ensure you have a solid data foundation on which to build your organizational AI capabilities? The answer is simple enough but attainment is difficult…

Building a good data foundation requires mastery of data quality management processes for three types of system data: organizational data, master data and transactional data. In short, this involves establishing a set of standard processes and tools to govern your data’s lifecycle (creation to decommission). How do we do this?

Data Quality Management

Organizational Data Quality

First, there is organizational data. This is the easiest type of data to manage as it is the most static. Organizational data represents the entities that make up your organizational structure. These can be concrete entities such as physical plants, office buildings or production lines. These can also be conceptual entities such as divisions, customer segments or departments. Because of its foundational nature, organizational data is naturally updated when there are changes in the organizational structure. These are rarely forgotten as they enable operations and reporting according to the new structure. Everything else “hangs” on these data objects.

Nevertheless, ensuring data quality requires good governance. You should create processes and tools for requesters within the organization to submit their creation/modification requests. These tools should capture all the data required to action these objects within your systems (e.g., web forms). Furthermore, you should have mechanisms in place to verify the validity of organizational data on a consistent basis (e.g., automated yearly questionnaires to data owners). Even better if you have automated triggers to decommission old data when required (e.g., office closure events).

Master Data Quality

Second, there is master data. This is where things become more complex. This type of data changes continuously. Master data represents the objects needed to carry out business activities, or transactions. For example, in order to execute a purchasing transaction, you need at minimum a supplier, a product and a price. As these objects do not define the organization’s structure, nor can they represent a transaction on their own, they are categorized as master data. Furthermore, sources for master data can be internal to the organization (e.g., a bill of materials for a finished good as determined by the marketing department) or external (e.g., a supplier’s address, pricing, payment and banking information). This makes managing the quality of this type of data difficult.

Internal Master Data

For internal master data, assigning data ownership to the teams responsible for creating that data will help. For example, plant maintenance teams should be responsible for a manufacturing plant’s location address data (where everything is located in the plant). If teams are empowered, staffed and accountable for data maintenance, they will maintain address data as it changes. Why? When the digital world reflects the physical world, doing a good job is much easier. However, your teams need help transitioning to this model. Consider using a Data Centre of Excellence (CoE) to help them ramp up in competency. You can also turn to automation once your data quality processes are mature.

External Master Data

For external data, you also need data owners. However, as the “real data owners” are outside your organization, this is not sufficient. The ideal setup involves getting a data feed directly from the source that automatically updates your data (e.g. commodity pricing from the reference index). For distributed master data, you might be able to find solution providers that specialize in aggregating, enriching and publishing the type of data you’re looking for. They might even be using their own internal AI/ML technology that runs on a high-quality data foundation to provide this service. However, when this type of feed doesn’t exist, you can fall back on supplementing your process with user knowledge, judgements and/or the use of surveys. Bonus points if you partner with a solution provider to develop the feed you need!

For example, in the case of supplier data, it is the supplier who is the “true data owner”. It is their actions that impact data validity over time. This makes keeping a supplier database up to date difficult. While it’s not impossible to get thousands of your suppliers to regularly update their information in your systems, it’s pretty close. It’s also important to note that attempting it will drive you mad. So, if you can connect your systems to a service that does it automatically for you, doesn’t that make sense? Of course, you will always have internal contextual information to capture as well (e.g., your particular salesperson with a supplier). Additionally, these data feeds won’t always be 100% accurate. Therefore, coupling data owners with data feeds is crucial to ensuring high quality external data.

Transactional Data Quality

Finally, there is transactional data. This type of data is also difficult to manage. It represents all the different variations of business processes executed within your organization. Every day, every time someone interacts with a system to execute an action, they are logging a transaction. For example, purchase orders, goods receipts, invoices, sales orders, production orders, employee onboarding, creating a new user in a system, etc. are all part of transactional data. Each type of transaction, say creating a purchase order, can also contain multiple variations (e.g., Stock, Non-Stock, Subcontracting, Consignment, Free good, etc.). This represents hundreds if not thousands of different processes all generating transactional data. Without oversight, you quickly end up with a “data box of chocolates”… You never know what you’re going to get!

Governing Processes

In this case, it is not the transactional data itself that needs to be governed. Rather, it is the overarching processes that need owners. As processes are made up of transactions, if each process owner ensures a coherent end-to-end process from a business and application perspective, it will produce high quality data outputs. When analyzed, these will provide meaningful insights about how to further optimize your business. To achieve this, identify and catalogue your business processes, assign them to an owner, and then formally define and translate them to system processes and transactions.

In short, start by identifying and cataloging your processes in a business process hierarchy (BPH). You can use the APQC Process Classification Framework as a starting point. Assign a business owner who is responsible for defining, overseeing, managing and optimizing each key process. This should be a senior person within the associated function. By focusing on process measurements such as data quality output, continuous improvement of process outcomes will become top of mind. This will lead to increasingly sophisticated “application guardrails” as process and application weaknesses are identified and resolved with new process activities, system configurations, development or add-ons.

A Mountain to Climb

As I’m sure you’ve realized by now, mastering quality for these three types of data across your organization is no small endeavor. In fact, I don’t think I’ve ever seen it done across a whole organization.

Should that deter you?

No.

Do not surrender progress to perfection. Start small. Master data quality for the areas of your business where Limited Memory AI feasibility and benefits align. This type of approach does not need to be widespread within your organization to have massive impact.

Start Small

For example, let’s explore the purchasing (P2P) process. Why not start with data owners for your top 20 vendors? Or, by assigning process owners for your top 5 most executed purchasing processes? If these are already optimized, start with troubled vendors/processes. The Pareto principle usually applies here (80% of your problems come from 20% of your vendors/processes). Give your team a set of ground rules (a short training, document templates, terminology, etc.) and begin. Adjust as needed along the way. As you start cultivating a data quality mindset with your team and tracking the results, the initiative will snowball and grow by itself. As you create “believers” in your team, this will lead to uncovering opportunities for automation of the various data quality management processes you’ve built.
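The Pareto-style starting point could be sketched like this (vendor names and spend amounts are invented): pick the smallest set of vendors that covers roughly 80% of spend, and assign data owners to those first.

```python
# Invented spend figures per vendor; find the smallest set covering ~80% of spend.
spend = {"V1": 500, "V2": 300, "V3": 100, "V4": 60, "V5": 40}
total = sum(spend.values())

running, focus = 0, []
for vendor, amount in sorted(spend.items(), key=lambda kv: -kv[1]):
    focus.append(vendor)
    running += amount
    if running / total >= 0.8:
        break

print(focus)  # ['V1', 'V2'] -- two vendors already cover 80% of spend
```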

Develop Artificial Intelligence Literacy

In parallel, work on developing your team’s Artificial Intelligence literacy. Once you have acceptable data quality levels, the real difficulty becomes identifying valid opportunities for the use of Limited Memory AI within your business. You must be able to separate the art of the possible from the empty promises. As a starting point, you now know that high-quality data is required for the effective use of Limited Memory AI, regardless of its source. When AI initiatives come up in internal meetings or supplier sales pitches you can now ask the most important question to consider the proposal: “Where does the underlying data come from?”

How to “Mr. Miyagi” Data Quality

Once you understand data quality management principles and pair that knowledge with Artificial Intelligence literacy, you may start asking yourself: “Couldn’t we use Limited Memory AI to help automate the data quality management process?”. That’s when you’re ready to fight Johnny Lawrence in the All Valley Karate Championship…

As you design data quality management processes and assign data owners to different types of objects, you will undoubtedly realize that mastering data quality is hard. For master data, the job is never done because data is always changing/updating and you have imperfect information (e.g., supplier data). For transactional data, there are always new process variations that pop up because of specific just-in-time business requirements. These are typically executed outside the system and supplemented by accounting entries or by perverting existing processes in your systems. This all contributes to data quality degradation.

The Tools That Will Help

The most impactful Limited Memory AI tools available on the market today are tools that help you proactively manage these data quality problems at the source by automating a large part of the processes described above. These applications recognize that it is unreasonable to expect humans to keep data quality perfect and that you are most likely starting your journey with low quality data. They execute machine learning algorithms that continuously and iteratively improve data quality levels with a combination of available data and user inputs. Over time, as the application develops operational rules based on usage, these applications have the potential of getting you pretty close to perfect. Better data may not be perfect data, but it sure as heck is better than bad data.

In Conclusion

Once your quality data foundation is in place, this opens up additional opportunities to leverage Limited Memory AI to recognize patterns and (more reliably!) predict the future across your business.

Don’t think of AI initiatives as an all or nothing, now or never proposition. It is a journey with a promise of exponential returns but requires long term vision and continuous work.

Take your cue from the plum tree… On average, it takes 6 years for a plum tree to bear fruit. Then, you either put in the work to pick the fruit or you end up with a bunch of shriveled-up prunes.

What will your data be doing in 6 years?

Summary

  • Limited Memory AI is currently the most viable form of Artificial Intelligence for business purposes.
  • In order to benefit from these possibilities, data quality must be high.
  • To improve data quality, you must tackle three kinds of data: organizational, master and transactional.
    • For organizational and master data, position data creation and maintenance actions as close to the true data owners as possible.
    • For transactional data, use process owners and process management techniques to get high data quality outputs from your processes.
  • Start small, with the data objects and processes that will provide the most benefits with higher data quality levels and Limited Memory AI functionality.
  • As you augment your data quality management maturity, work on automating processes where possible and developing your organization’s Artificial Intelligence literacy.
    • Automate data consistency problem correction (e.g., setup business rules around “illegal” report line item combinations with notifications). Data owners can act as reviewers instead of researchers when this is setup.
    • You need to know how to spot real Limited Memory AI opportunities within your business. You also need to be able to spot empty promises to avoid building/purchasing AI “shelf-ware”.
  • The shortcut in this process is purchasing/building Limited Memory AI applications that help you automate data quality management processes (such as maintaining master data or mining & defining processes). These applications greatly reduce or eliminate the data owner’s operational task load in data quality management processes.
    • These are the types of “AI applications” that show the most concrete value today on the market.
    • As these applications are adopted more widely, new AI applications that need high data quality levels will emerge and be viable for use by organizations who have a data foundation in place.
  • How are you working on your data foundation? Remember the Plum tree!
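The “illegal combination” business rules mentioned in the summary might look like this in outline (field names and disallowed combinations are invented for illustration): define the combinations that should never appear on a report line and notify the data owner, so they act as reviewers instead of researchers.

```python
# Invented example of "illegal" report line-item combinations with notifications.
ILLEGAL_COMBOS = {("Capex", "Office Supplies"), ("Opex", "Machinery")}

def check_line(budget_type, category):
    """Return a notification message if the combination is disallowed, else None."""
    if (budget_type, category) in ILLEGAL_COMBOS:
        return f"notify data owner: {budget_type} + {category} is not allowed"
    return None

print(check_line("Capex", "Office Supplies"))  # triggers a notification
print(check_line("Opex", "Office Supplies"))   # None -> no action needed
```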

———————————
What other mechanisms have you used to master data quality? What other factors do you see contributing to the success of AI initiatives in organizations? What are your AI initiative lessons learned? Let me know in the comments.

If you liked this post, why not Subscribe?

Get Value From Your Spend Data Without a Source-to-Pay System with Susan Walsh
Thu, 07 Jan 2021

Montreal skyline with picture of Susan Walsh, data classification expert
“It doesn’t matter which software you implement. If you don’t have your data right BEFORE you implement it, there’s no amount of software in the world that’s going to fix it.” – Susan Walsh

Note: This post is the transcript of the episode. If you prefer listening to audio, you can listen to the episode on the podcast page.

On the last episode, we explored the criticality of vendor data for Source-to-Pay system benefits realization. On this episode, we widen our focus to see how you can quickly get value from the spend data you currently have in your organization with data classification to enable better decision-making.

In 10 years of working with clients on Procurement improvement mandates, I’ve never worked with an organization that had excellent spend data. For a multitude of reasons, organizations typically have their spend data all over the place. So, what’s the easiest, fastest way to fix that? How should you manage data cleansing in relation to a Source-to-Pay suite implementation like Ariba, Coupa, Ivalua or Jaggaer?

To help me explore this subject, I am joined by Susan Walsh, aka The Classification Guru. She’s worked with dozens of organizations over the last 8 years to help them find invaluable insights in what they considered to be useless data. Data cleansing, classification & taxonomies are the name of her game and, in our chat, she shared her no-nonsense approach to getting results on these fronts.

—————————-

*The transcript from this interview has been edited for brevity and clarity.

Introduction

J: Hi Sue, thanks for joining me today.

S: My pleasure.

J: I thought we would discuss how you can get value from your procurement data before you even start implementing a source-to-pay system like Ariba, Ivalua, Coupa, Zycus, all those big brand names on the market. First, before we dive into that subject matter, I wanted to ask you a question on how you developed such a deep interest in data and data flow modelling, data classification, etc. How did you get mixed up in such a niche world?

S: It was an absolute, complete accident. I had opened my first business which was a women’s clothing shop and had to shut that down after about eight months with a lot of debt. Being desperate for a job, I went online, found an ad for some data classification work with a spend analytics company. I had never had any kind of interactions with that world before but as soon as I started classifying the data, I just felt like it came really naturally to me. I picked it up really quickly. Because I had worked in companies previously, I felt like I was bringing in an added level of knowledge to the classification because I understood what the businesses were spending their money on…. And, that’s how it started.

After about five years of working there, managing a team and helping grow the business, I decided that there was an opportunity to offer just the data prep, the data classification, and the data cleansing side as a stand-alone service, not part of an overall software service or consultancy or a part of something else…. So far so good.

J: Awesome. Yeah, because you’ve been operating as the classification guru for a little while now, right?

S: Yeah. I’m coming up on my three-year anniversary, which in itself is pretty amazing.

J: Congratulations.

S: My first business didn’t make it to a year and I know a lot of businesses are lucky to make it up to three or five years, so I’m hanging on in there.

J: Awesome. It will give us a lot of meat to go through this interview. So, does the fact of getting a set of data all clean also get you excited?

S: Yeah, it does. Before I meet with a client, I’m excited on their behalf because it’s hard to see the potential until the finished product or data sheet is ready. Then there are endless opportunities in terms of supplier rationalization, cost savings and targeting any kind of rogue transactions. I love getting feedback on how they’ve been able to save money or improve processes because of something that I’ve done. That’s what I really enjoy.

Value Drivers for Data Classification

J: Would you say those are the main types of value drivers that data cleansing, classification, data flow modelling provide?

S: Yes. That as well as time-saving for the people doing it. I do this on a day-in, day-out basis and have done for eight years. I’m very efficient, very knowledgeable. When I come across a lot of suppliers, I’ve classified them many times before, so I’m very comfortable in knowing whether the classification is right or not, or what to classify them as. As well as that, there’s the harder-to-quantify side, which is helping to prevent costly mistakes that might happen if you haven’t got accurate data.

J: Actually, there’s a piece of risk management in there, risk mitigation as well.

S: Yeah.

The Process of Data Classification

J: Okay, great. All these terms we’re using can seem pretty intangible, right, to some of the listeners who haven’t necessarily played a lot in data. I’m really interested in getting your perspective. A company you come into would generally have, I’ll say, dirty data or incomplete data or a lot of duplicates, whether it be in vendors, invoices, purchase orders or even P-card data. When you show up at a company, what’s your process to say: “okay, what’s the state of the data today and what can we do to get to a perfect set of data that we’ll be able to pull those insights from”, whether it be cost savings or supplier-based or identifying tail spend or what have you.

S: Yeah. I treat it very much on a client-by-client basis. I don’t have a standard template that I use. Before I start it and when I’m at the quote process, I’ll ask to see a sample of the data. At that point, I can then look at what level of detail is available and gauge if I can meet their objectives with that level of data. A lot of my clients have never had classified data before, so it’s a really good starting point to work with me. They don’t have to learn any technology, they’re not responsible for doing anything. They can just trust me with it and they’ll get something back that’s useable and actionable.

J: Which steps would you usually start with if I had a bunch of invoice data or purchasing data and I wanted to get a better handle on what it represents for my company?

Supplier Normalization

S: The first port of call will always be supplier normalization. It’s a great first step because you’ll often find multiple versions of the same supplier in your files, which is particularly prevalent in global companies or companies with multiple divisions where the systems don’t talk to each other. You might have IBM, I.B.M. and IBM Inc. We standardize that to IBM. That then gives you a true picture of how much you’re spending with each supplier without having to classify anything, which in itself is invaluable. It also means that when I start to classify that data, it’s going to help me be consistent and accurate because I have more of the same suppliers under one normalized name. I’m not classifying IBM five times; I’m classifying it once under a normalized name.
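A rough sketch of what supplier normalization does under the hood (a toy Python example; real tools use fuzzier matching and curated rules): cosmetic variants like punctuation, case and legal suffixes are collapsed so spend rolls up under one name.

```python
import re

# Toy supplier normalization: strip punctuation, case and common legal suffixes
# so variants like "IBM", "I.B.M." and "IBM Inc." aggregate under one name.
LEGAL_SUFFIXES = {"inc", "ltd", "llc", "corp", "co"}

def normalize_supplier(name: str) -> str:
    tokens = re.sub(r"[^\w\s]", "", name).lower().split()
    tokens = [t for t in tokens if t not in LEGAL_SUFFIXES]
    return " ".join(tokens).upper()

spend = [("IBM", 120.0), ("I.B.M.", 80.0), ("IBM Inc.", 50.0)]
totals = {}
for raw_name, amount in spend:
    key = normalize_supplier(raw_name)
    totals[key] = totals.get(key, 0.0) + amount

print(totals)  # {'IBM': 250.0} -- one true picture of spend with this supplier
```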

J: Okay. It’s about getting your supplier base normalized and then from there you’re able to hook on all of the spend data to the different suppliers.

S: Yeah. It can be more efficient, more effective, yeah.

Attacking the Spend Data

J: After you’ve normalized your vendors, you have other data sources like P-Card data, invoices, maybe some PO data as well. I’m guessing the next step is taking all of those different sources of data and hooking them up on to your list of normalized vendors so that you have that picture of spend?

S: Yeah, that’s right. I have data modelling software and visualization software that I use so I can take multiple file sources and pull them all together.

J: Even if they are in different formats?

S: I tend to work in Excel but they may come from different systems.

J: Yeah, I meant like different columns, different data sets.

S: Yeah, no, that’s not a problem at all. Within my software, I can standardize the columns. You might find that in system A the name is in column 1, and in system B it’s in column 2. I can standardize it all and make sure it lines up. That’s when you can check for duplicate POs and also for PO numbers that are similar but not quite the same. For example, if one ends in a zero and another ends in the letter O but it’s actually the same PO, that’s potentially fraudulent activity.
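The two checks Susan describes, standardizing column names across source systems and catching PO numbers that differ only by a zero versus the letter O, could be sketched like this (the column names and PO formats are invented for illustration):

```python
# Invented mapping from per-system column names to one standard schema.
COLUMN_MAP = {"col_1": "supplier_name", "vendor": "supplier_name", "po_no": "po_number"}

def standardize_row(row: dict) -> dict:
    """Rename columns from any source system to the standard names."""
    return {COLUMN_MAP.get(k, k): v for k, v in row.items()}

def canonical_po(po: str) -> str:
    # Treat the letter O as the digit 0 so "45O1" and "4501" collide.
    return po.upper().replace("O", "0")

pos = ["4501", "45O1", "4502"]
seen = {}
for po in pos:
    seen.setdefault(canonical_po(po), []).append(po)

suspicious = {k: v for k, v in seen.items() if len(v) > 1}
print(suspicious)  # {'4501': ['4501', '45O1']} -- same PO, potentially fraudulent
```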

Handling Spend with No Commodity Code

J: Okay. The other piece where I’d be curious to hear you on is spend with no commodity code. Often, I’ll have either non-purchase-order invoices or even P-Card data that doesn’t have a commodity code attached to it, because that’s usually a purchasing-system principle (vs. finance systems). How do you get around that, or how do you go about assigning values to those different pieces of spend?

S: If there’s existing information once I’ve normalized the suppliers, I can then match on name and description, and if there’s an existing commodity code, I can map that over. That’s really simple. Then anything that’s left over, I will do manually to make sure that it’s correct. The next time that information shows up, I’ll be able to map it over again, so it will be semi-automated, if that makes sense.
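The semi-automated mapping step might look like this in outline (a hypothetical sketch; the supplier and commodity names are invented): codes from previously classified rows are reused automatically, and only the remainder goes to manual review.

```python
# Build a lookup from rows that were already classified (names invented).
classified = [
    {"supplier": "IBM", "commodity": "IT Services"},
    {"supplier": "DHL", "commodity": "Courier"},
]
lookup = {row["supplier"]: row["commodity"] for row in classified}

incoming = [{"supplier": "IBM"}, {"supplier": "Bobs Cars"}]
manual_queue = []
for row in incoming:
    code = lookup.get(row["supplier"])
    if code:
        row["commodity"] = code        # auto-mapped from prior classifications
    else:
        manual_queue.append(row)       # left over -> classify manually

print(manual_queue)  # [{'supplier': 'Bobs Cars'}] -- only the unknowns need a human
```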

J: Yeah, no, absolutely. Based on a business rule that would be for that business or for that industry.

S: If that’s what’s been specified, yes, I can do that as well.

J: Like I know we chatted previously a couple of times and you had mentioned the example of DHL where depending on the industry or how you are using that supplier, it might have different commodities that need to be attached to it, right?

S: Yeah, and hotels too. First of all, DHL, myself, possibly yourself, we might be using DHL as a courier or postal service but if you’re a manufacturer then it’s more likely to be logistics and warehousing. For me, because most of my process is more manual, that’s where my knowledge comes into play. That’s harder to automate. You have to know the industry, the company you’re classifying for to specify what it would be. I think you always need to just have that human eye to double check.

Same with hotels. A hotel is a hotel, but it might not be, which sounds a bit funny. If there’s, let’s say, 50 grand of spend with a hotel, the chances are that it’s going to be venue hire, room hire or some kind of function. If it’s $5,000, then that might just be accommodation. You can set a rule for that quite easily, but there’s a bit of knowledge and experience in there as well.
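Susan’s hotel rule translates directly into a threshold check (the $50k and $5k figures come from her example; as she notes, edge cases still deserve a human eye):

```python
# Threshold rule from Susan's example: large hotel spend is likely venue/room
# hire, small spend is likely accommodation. Figures are from her anecdote.
def classify_hotel_spend(amount: float) -> str:
    if amount >= 50_000:
        return "Venue / room hire"
    return "Accommodation"

print(classify_hotel_spend(50_000))  # Venue / room hire
print(classify_hotel_spend(5_000))   # Accommodation
```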

J: For sure. That’s what I’m getting a sense of, the value that you would bring in one of these mandates having done it a bunch of times for different clients and industries. You’re able to know, okay, well, I know in my mind I have this database of tens of thousands of vendors that I’ve seen over the years and I know that when I get to these vendors, they are specific cases because they operate in different industries and in different commodities as well, so there’s judgment to be put into place there.

S: Yeah. Sometimes, there are suppliers that have names that look obvious to what they do but it’s not actually what they do. I’m trying to think off the top of my head of an example but it could be Bob’s Cars. You would assume that it was a vehicle maintenance or a taxi firm but actually in this instance, it may be a toy shop. Again, that’s when your knowledge and experience comes in. You look at the supplier name, you look at the description and then you google just to make sure. That’s when you find actually it’s a toy shop. It’s not what I thought it was.

J: Yeah. Then those are the types of mistakes that people that will be looking at the data later on in reports and what not, they’ll know because they’re part of the business and these are suppliers that they are dealing with on a daily basis. If you didn’t catch that at the outset when doing your first spend exercise then they lose all confidence in the data, right?

S: Exactly. It highlights the need for looking not just at the description or the supplier name but both in conjunction with each other.

J: Right.

S: Another example, I had someone that I was training to work for me at my previous role. The supplier was LinkedIn. The description was restaurant and it had been classified as a restaurant.

J: Okay.

S: Now, I know talking here, you think that must be really obvious but if someone’s not trained in the right way, they might just look at the description and not think to look at the supplier name at all. That’s where the restaurant classification came from. That is a true example. That’s happening in real life right now within organizations.

Tools for Data Classification

J: Right. I see how that can be a problem. You mentioned tools a little bit earlier, and I’d be interested to know what tools you use to do that process and what value it brings in terms of being able to automate.

S: Yeah. It's been pretty valuable to me. I've been using Omniscope, which is made by Visokio, for eight years now. During that time, I've developed my own methodology on the best way to classify data and also put in place some really great checks to make sure that the data is accurate, whether it's already been classified or once it's finished, by doing final checks. I can very quickly spot where there are maybe multiple classifications against the one vendor where there shouldn't be. Say it's ABC Taxis: there might be four lines classified as taxi and one line classified as travel. Then, I know we can fix that and change that and make the data more accurate. Then ultimately, it filters up into reporting and analytics and decision making.
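The consistency check Sue describes (spotting vendors carrying more than one classification) can be sketched outside Omniscope in plain Python. The rows below are made-up examples, not real client data.

```python
from collections import defaultdict

# Sketch of a vendor-consistency check: flag any vendor whose spend lines
# carry more than one classification. Sample rows are illustrative.
rows = [
    ("ABC Taxis", "Taxi"), ("ABC Taxis", "Taxi"),
    ("ABC Taxis", "Travel"),          # the odd one out
    ("LinkedIn",  "Recruitment"),
]

codes_by_vendor = defaultdict(set)
for vendor, code in rows:
    codes_by_vendor[vendor].add(code)

# Vendors with conflicting classifications, ready for manual review
inconsistent = {v: sorted(c) for v, c in codes_by_vendor.items() if len(c) > 1}
# inconsistent → {"ABC Taxis": ["Taxi", "Travel"]}
```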

J: Okay. When you come in to do one of these mandates, the tool works as an ETL, I would imagine? The client sends you files to put into Omniscope, you work your magic in there with the different methods that we’ve outlined previously and then you are able to spit them out back to the client?

S: Yeah. I don’t have any connection to my client’s system. They will send me an Excel file, I will take that, I’ll put that into Omniscope. I’ll do the work I need to do and then I will export that back to Excel, and send it back to the client. Then they can do what they need to do with that. Sometimes they’ll put it back into their own system and sometimes they’ll just use it as a spreadsheet and do some reporting and analytics.

How to Get Around Excel Line Limit Restrictions

J: Okay. And I guess because Excel does have a line limit, I think it’s like around 65,000 or something like that.

S: I would say I struggle at about 50K I think…

J: Yeah, it starts getting slow.

S: Yeah. I did some database cleansing for a client at the end of last year and I put together nine sources of information. That came to 2.8 million rows. Omniscope can handle a significant amount of information, and it could still handle more than that, so it's really not an issue.

J: Okay. I guess then you get it in chunks if it’s from Excel and then you output it in chunks as well?

S: Yeah. Actually, when I came to sending the file back to the client, I deduped the 2.8 million rows down to 1.3 million rows. That was still too big for an Excel so then I had to further split the file into … I think it was a Mailchimp mailing list, so I had to split it into subscribed and unsubscribed so that they could actually open the file.

J: Right, okay. Yeah, Omniscope is not the problem, it’s the other tools.

S: Yeah. There’s always ways around these things though.

J: Okay. You’re piquing my curiosity here. Do you have an example?

S: Yeah. For example, there'll be a lot of people working at home right now. They'll maybe want to look at some large files in, say, Excel that they would normally access through a different system in the office but, for whatever reason, they can't do that at home. As we've just talked about, Excel can't handle a massive amount of information. It will freeze. It will crash. You might even lose some data. What I would suggest is that you split that data up by department or division or by country and look at it in chunks like that. I wouldn't suggest an A-Z split but if you can do it by department then that's the best way to try and keep consistency within the data. There are little tips and tricks that you can use to get around things.
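The "split by department, not A-Z" tip amounts to grouping rows by a business dimension before export. A minimal sketch (rows and department names are made up for illustration):

```python
from collections import defaultdict

# One chunk per department, so a corrupt file only ever affects a single
# department rather than slices of every region/division at once.
rows = [
    {"dept": "IT", "supplier": "Dell",      "amount": 1200},
    {"dept": "HR", "supplier": "LinkedIn",  "amount": 800},
    {"dept": "IT", "supplier": "Microsoft", "amount": 300},
]

chunks = defaultdict(list)
for row in rows:
    chunks[row["dept"]].append(row)

# Each chunks[dept] would then be written to its own CSV/Excel file,
# keeping every file comfortably under Excel's row limit.
```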

J: Yeah. If you do it that way, you’re able to take out your numbers by region, for example, and then build up your reporting that way if you need to.

S: Yes. Then, if for some reason something went wrong with the region file that you were working on, it would only be the region part that was affected whereas if you did an A-Z split or different parts of the alphabet, it would affect multiple different countries, regions, divisions if you’re working that way.

J: Then it’s a mess.

S: It just protects things. That's what we're trying to avoid.

Licensing for Data Classification Tools

J: Okay. Then just a last thing from like a technical perspective for licenses. You said you could work with a file that your client has given you and send them back. For a client, you can literally show up with your own software and the client doesn’t even have to get licenses?

S: Exactly. I buy the software license annually. They don’t have to get involved in it but another option is to take it in-house. I can do the first part of the project and then get their staff trained up so that they could then carry on themselves.

J: Okay, yeah.

S: At the end of the day, I think it’s really important that organizations own and understand and are familiar with their own data, so I’m really happy to help out and fix things. But, the best way to make sure that your data doesn’t get into that situation again is for the people that are working with the data to understand and be looking at it on a daily or regular basis.

J: Right. Yeah, because I would imagine that otherwise, they’re going to be calling Sue every month and that gets out of hand real fast I’m sure.

S: Well, I don’t mind that. I just think that’s not a sustainable solution for data.

How to Automate Data Classification

J: Okay. Let’s move on to that stuff then, right? You’ve come in and then this first step is to help them set up a process and rules to be able to classify, cleanse and put data together from multiple data sources so that it gives insights that we can actually action in the next months.

S: Yeah.

J: What are the next steps in terms of being able to automate that or bring it to a further maturity level?

S: Yeah. You’ve got this shiny new data. It’s fantastic, it’s almost perfect because I would never claim that there’s a 100% perfect data set out there. That data is continually changing and updating on a minute-by-minute basis. It’s not going to stay like that for very long. There’s going to be new information coming in all the time. The most important thing to do is to regularly maintain that data.

Depending on the volume that you are dealing with, I would suggest monthly or quarterly refreshes. The way that I would do it is, if I've classified that first set of data, I can then merge that with the new data, and when it matches on multiple data points, the classification will pull through. Then there will always be some new data that hasn't been seen before or hasn't been classified, so that would be manually classified by myself.
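The refresh step described here is essentially a lookup merge. A plain-Python sketch (a real run would match on multiple data points, not just the supplier name; the data below is illustrative):

```python
# Classifications from the previous cycle pull through onto matching new
# data; anything unseen is queued for manual classification.
known = {"ABC Taxis": "Taxi", "LinkedIn": "Professional Networking"}

new_lines = ["ABC Taxis", "LinkedIn", "Bob's Cars"]

classified, needs_review = {}, []
for supplier in new_lines:
    if supplier in known:
        classified[supplier] = known[supplier]   # classification pulls through
    else:
        needs_review.append(supplier)            # new vendor: classify by hand
```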

J: Or the people that you’ve trained, right?

S: Yes, exactly. From an internal point of view, if they don’t have the same software that I’m using, I can show them processes or they could write scripts that would pull through the existing classification and basically do it the same way that I do it.

J: Okay, interesting.

S: But, it is really important to have that human check it the first time around.

J: Yeah, no, absolutely. As you’ve said, you might have rules that don’t make sense in the context of your industry or your business and how you’re using suppliers or how you’re purchasing things.

S: Yeah. If there’s any kind of AI or automation involved, it has to learn from good data.

J: Right. Yeah. I think that’s something that people don’t realize often. They’ll say: “We’ll just put AI into the mix and it will solve all the problems like a magic bullet.” Right?

S: Press the magic button and everything's fixed, yeah. If only. I'm afraid, unfortunately, it doesn't work like that. So much money has been spent on great software where the data hasn't been prepped and cleaned before implementation. Either it's caused lots of problems and they've had to pay to fix those, or the staff just haven't engaged and adopted it, and they've had to abandon it. Then, you've spent all that money on software that doesn't get used.

Data Classification in the Context of a Source-to-Pay Implementation

J: That is a great segue to my next question actually. I mentioned at the outset, the tools like Ariba, Coupa, Ivalua and all those big source-to-pay suites that are on the market right now. I hear this a lot from folks out in the market as well, and I find it’s a bit short-sighted thinking… “Our spend data is bad right now so we need to implement one of these systems to get good data and therefore make good decisions”. Then there’s this huge mountain to climb where I have to roll out a global implementation of a big tool to be able to get that end game.

I feel like what we have been discussing is how to get to that endgame just by manipulating and cleansing and classifying the data in parallel. So you think that there’s a role for those systems to be in place so that you don’t have to do that exercise over and over again? That you can get to a state where the process is built up to a mature enough point so your data inputs are clean from the outset and you have less of a need for data classification.

S: Yeah. Again, it’s that working with your data on a regular basis and knowing it because then you start to recognize quite instantly when something’s not right. It makes everything easier in the long run. It really doesn’t matter what software that you implement. If you don’t have your data right before you implement it, there’s no amount of software in the world that’s going to fix it.

J: Do you see the services you provide (data classification, modelling, etc.) as something opposed to a source-to-pay suite, or something that works with a source-to-pay suite, if that makes sense?

S: It should absolutely be hand-in-hand. No matter what you are implementing, you have to make sure your data's right before you start. I think it needs to be seen more as an investment rather than a cost because, if it's carried out at the start of any project, there will be much less cost and time, and far fewer mistakes, further down the line.

J: Right. I see it as something you could potentially do in parallel. As you’re deploying different sites on your solution, you have one site on the solution but you still have 10 sites that aren’t on the solution so you can still employ the methods that you’ve outlined so far to join that with the cleaner data that’s in your solution until you get to that desired end state.

S: Yes! Again, I talk about consistency a lot. It is so important to be consistent throughout your whole company. Like you said, it doesn’t matter what systems people are using. If they are working using the same consistent principles and methods then it’s a lot easier.

Commodity Code Best Practices: How to Use the UNSPSC

J: Maybe we can get a bit nerdier here. We talked a little bit about commodity codes earlier on….

S: Yes. Oh, let’s do it!

J: Often, with these types of systems, when you’re starting to think about that, if you don’t already have an internal taxonomy, inevitably the UNSPSC code or, hold on, let me try my party trick, the United Nations Standard Products and Services Code.

S: Showing off… After all these years, I still don’t know the full title. I get stuck after United Nations.

J: You know, the UN code… That code… You know that code, right?

S: That one, yeah.

J: It often comes up in discussions, right? What’s your perspective on using that taxonomy? What are the advantages, the drawbacks when you’re a company looking to put further effort into data classification on the procurement side?

S: Yeah. I mean, I’ve worked with it an awful lot so I kind of know it inside out, which is possibly not a good thing and very, very nerdy. On the positive side, if you have quite a lot of information in your invoice description, then the UNSPSC is a really good place to start because there’s a lot of detailed information. You’ve got lots of different types of nuts, bolts and screws, and you can be very specific. It breaks down all the stationery, all the IT products. So, it can be good.

On the flipside, the version that I have, there’s around 1,000 different options within the taxonomy. There’s at least 10 different variants of Apple. I don’t work with many companies that need to know different variants of Apple to that degree. I think they’re trying to be everything to everyone, and at points it becomes too much. It can be too intimidating, I think, for companies to use especially if they’re using their staff to classify it. In the beginning, it could be hard to navigate.

J: Would you recommend, in that case, if I'm dead set on using the UNSPSC, that I do an exercise of rationalizing it first so that I only keep the codes that are significant to me?

S: That would be a good idea. But, there are also a couple of examples within UNSPSC where you have a couple of items that are repeated. I think real estate services is one of them. At the commodity level, you’ve got real estate services listed twice. I think one of them sits under real estate and one of them sits under sales management. I’ve probably got that confused but it’s something like that. I’ve used it as an example before. You have to know which level one or which segment it needs to sit under.

J: Just for those who haven’t interacted much with the UNSPSC. I think you’re referring to the four levels. There are four levels of depth in the commodity?

S: Yeah. You’ve got your segment which would be your level one, then family, then class, then commodity.

J: Okay. And so, you need to know the context of your spend within…?

S: Actually, there’s more than one right answer. Again, it’s about setting those standards and being consistent. In a lot of cases, it might not matter which version you pick as long as you stick with the one version.

J: Okay. Do you usually see businesses go all the way down to that fourth level as well?

S: I see them try… [laughs] I think sometimes they give up before that point.

J: The reason I ask is like I’m wondering, does it provide value to go down to that level because I think the whole exercise of data classification and cleansing is to get you to a point where you’re able to make good decisions, data-backed decisions.

S: I think it depends on the product. When I was talking about nuts, bolts and screws, it might be really important for you to know which type of nut you’re buying rather than just a nut. In that instance, that would be really important but to another business, they might only want to know that it’s hardware.

J: So, I think it goes back to it depends, which is the typical services consulting answer but it’s got to be contextual to your business and, back to your point, it’s got to be consistent over time.

S: Yes. I would advise that if you don't need that much detail but it's available to you, put it in, because you don't have to use it but it's there. If you've only classified to, say, hardware level, and you decide at a later date that you want to know the types of bolts, nuts and screws, you have to pay for the same exercise again. In my opinion, it's better to have too much detail than not enough, because you can always take out the excess detail, but adding in detail is time-consuming and costly.

J: Right. If it’s readily available to you. You wouldn’t necessarily go and gather additional detail in a category if you don’t have it and you don’t even need it.

Getting the Level of Detail Right

S: Yeah. Personally, I’m finding that most of my clients don’t want that much level of detail. They just want maybe topline like IT or professional services. The other thing that I would say is that depending on what industry you’re in, the UNSPSC won’t work for you. So, I’m working with a client right now who’s in the charity industry, so they have a very specific set of spending commodities and you won’t get that information in the UNSPSC at all. I’m building a customized taxonomy for them. It really depends on the industry, the company and what your objectives are as well.

J: Is there another piece that you’d consider in the decision? I think the relevance certainly, the one you’re pointing out right now, is super important. But when trying to put together a classification or pick a standard, do you think they should consider as well what their suppliers are using as standard? I know with catalogue for example or with EDI or CXML interchanges, it’s often easier to line up on a global standard.

S: No. I would say you always have to do what’s best for your business. If your suppliers are using a different catalogue, it can always be mapped to whatever you need or whatever your taxonomy is but the most important thing is to always have a taxonomy that is suited to the needs of your business, not someone else’s.

J: Awesome, and who’s going to be using it within your business, I would imagine. If it’s accounting, procurement and maintenance, for those MRO nuts and bolts.

S: Yeah. I don’t know if you find this but what procurement needs to see from their data and what finance needs to see are generally two very different things. Also, how they class the data as well is very different.

On Classifying via General Ledger Accounts

J: Yeah because your finance folks are trying to get those balance sheet reports out of the door at the end of the month based on the GLs whereas procurement is trying to negotiate better deals over time, right?

S: Yeah. They need more detail. Personally, I’ve worked with a lot of GLs and I find that they’re notoriously unreliable. I’ve worked in businesses where a GL could also be a budget or a project. It doesn’t necessarily have to be an item. I know from my own experience where sales have run out of budget, so they’ve asked to put something under marketing’s budget but it’s actually sales spend. You wouldn’t know that with a GL code. However, if you’re classifying your data based on the supplier, it would be more apparent where the spend should sit.

J: That’s interesting because as soon as you said that, I told myself that this happens probably in 100% of the projects I’ve ever worked in.

S: Yeah, it’s really common.

J: Yeah. I didn’t realize it. Then if you’re making commodity-based decisions based on the GL information then you’re probably making some decisions that are based on wrong data.

S: Yeah.

Conclusion

J: Okay. Interesting. Super interesting actually. I don’t want to take too much of your time here. I appreciate you talking with me. Do you have any key messages that you’d like to share with the audience in terms of data journeys if they’re starting to embark on one or they’re looking at how to get better?

S: Yeah. Start simple. Don't go all in with the software. Get to know your data and get familiar with it. Your data should have a COAT: put a COAT on your data before it goes into software. It should be Consistent, it should be Organized, it should be Accurate and it should be Trustworthy. If it's not those things before it goes into software, it's certainly not going to be those things when it's in the software.

J: Right. I like that image of putting a COAT on your data. It's very UK of you.

S: There’s something special coming on that soon so watch this space.

J: Okay. When you say watch this space, where’s the best place people can get a hold of you and your material?

S: I tend to hang out mostly on LinkedIn. You'll find me at Susan Walsh – The Classification Guru, but you can also find me at theclassificationguru.com and the Classification Guru YouTube channel as well. Whatever your platform of choice is. I'm also on Twitter at @ClassificationG.

J: Cool. I know you run some pretty fun little contests there.

S: Yes, yes. I do like my Fun with Words.

J: Yeah, Fun with Words, sorry. I was looking for the word there.

S: Yeah. I’ve done replace a song with data, replace a book with data, replace a TV show with data and replace a film with data. It always get such great engagement. I’ve done the same with procurement as well. Replace a song with procurement. It’s been great fun and I really enjoy that everyone gets involved. Some people have even written whole lyrics based around data or procurement. It’s really good, yeah.

J: All right. Well, thanks a lot for taking the time to chat, Susan, I appreciate it.

S: My pleasure.

J: I know I’ll be reaching out on LinkedIn, and I hope that others do as well to join into the fun and get more literate on data.

S: You’ve reminded me that I need to do a new post about that soon as well.

J: Yeah, it’s my pleasure. Talk to you again soon, Sue. Take care.

S: Yeah, it’s been great. Thank you.

———————————
What have been your biggest challenges in trying to round up your spend data? In your opinion, what is the biggest hurdle to clear to put in place good data quality assurance processes in your organization? What other tools are you using to cleanse and classify your spend data? Let me know in the comments.

If you liked this post, why not Subscribe

How to Buy the Right Procurement Systems
https://www.pureprocurement.ca/how-to-buy-the-right-procurement-systems/ (Tue, 20 Oct 2020)
Sign Post in the wind
It’s easy to get lost in the wild when buying Procurement software

With the proliferation of enterprise procurement systems available today, buyers are confronted with an anxiety-inducing paradox of choice. There are well over 100 procurement software companies worldwide today, each focusing on different parts of the Procurement value chain. Certain larger providers even argue they can be your "one-stop shop" for the entire Procurement value chain. This plethora of options reflects the increased focus on Procurement organizations as value drivers for the business. Generally speaking, this is great news for supply chain professionals. However, it makes buying Procurement systems and the associated Procurement system architecture decisions more complex.

To make things simpler, it is important to understand that there are 3 main categories of Procurement software solutions: ERP systems, Source-to-Pay (S2P) suites and other "best of breed" solutions. Considering each category along with its strengths and weaknesses is important when crafting a holistic Procurement system architecture. More importantly, establishing a strategy at the start of your digital transformation will guide optimal system purchasing decisions. This will preserve precious organizational resources for other impactful initiatives instead of investing in "shelfware".

The worst outcome in any digital transformation initiative is realizing that backtracking is necessary. It is devastating for morale, disastrous for the pocketbook and cruel to your calendar. Let’s discuss the 3 categories you need to master to avoid this fate:

Category 1 – ERP Systems

The first category consists of traditional Enterprise Resource Planning (ERP) systems. First and foremost, ERPs are designed to support a business's core processes: finance, planning, procurement, inventory management, production, sales, HR, etc. The main function of an ERP system is to ensure data integrity between these processes by enforcing their integrated nature. For example, the receipt of purchased goods automatically updates inventory levels. Financial general ledger accounts are also automatically updated to reflect new assets (stock) and liabilities (accounts payable to the supplier).

Whether you choose Microsoft Dynamics, SAP, Oracle, Workday, NetSuite, JDE, etc., the ERP system acts as the nervous system of your business. It is always a mission critical system that supports your core functional operations, such as the production of financial statements. In larger organizations, this makes the ERP a non-negotiable building block to consider in your architecture. This dynamic gives ERPs unique characteristics:

Strengths

  • Robustness. Leading ERP systems have been around for decades. Therefore, they are extremely robust at all levels (error management, data integrity, security, etc.). They’ve typically also been built natively from the ground up which improves the compatibility and stability of their code base.
  • Scope of Integration with Other Functions. As previously mentioned, in ERP systems you can't carry out operations without automatically affecting other functions tied into your transaction. This ensures information integrity and accuracy across business functions, leading to more efficient workflows.
  • Good support of direct purchasing processes. In this context, I define “direct purchasing” as the purchase of goods or services where at least one of the following is required to support the lifecycle of the goods/service purchased:
    • A material/service master record
    • Inventory management functionality
    • Integration with Material Requirements Planning (MRP)
    • Integration with maintenance work orders (MRO parts are therefore considered direct in this context)
  • Direct procurement is highly integrated with other functions (production planning, inventory management, etc.). Therefore, running your direct procurement processes in ERP is a good general rule of thumb. Of course, as user interfaces are generally less intuitive in these types of systems, you may need to optimize certain parts of the process with additional systems and add-ons (Category 3).

Weaknesses

  • User experience. Because of the sheer size and complexity of a typical ERP (e.g. SAP ERP consists of more than 250 million lines of code developed over 14 years), modernizing the look and feel of such a system is a long, slow and arduous process. Smaller products/companies (as in other categories) have a great advantage on this front because they do not have to consider this baggage when developing new solutions.
  • Speed of innovation. Similarly, the integrated nature of these systems makes speedy innovation complex. Small changes can have deep ramifications on other parts of an integrated process. This means that development, testing and deployment of changes is longer than in smaller systems, all other things being equal. However, with the development of cloud ERP offerings from most providers, this is slowly changing. Vendors have tighter control on how the system evolves in customer instances and therefore more flexibility to push updates while controlling the code base.
  • Workflows. Typically, when using an ERP, a user will need to be heavily trained to use and navigate through the system ahead of time to carry out processes. As ERP systems are usually transaction-based, users must execute the different transactions that make up their business process sequentially from memory (e.g. create requisition, convert requisition into PO, approve PO, expedite order, goods receipt, etc.). ERP systems do have certain workflow capabilities but they are typically limited to a specific transaction (e.g. PO approval) and do not govern the overarching process end-to-end.
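The transaction-by-transaction flow described in the last bullet can be sketched as a simple ordered process. This is an illustration only; the step names are simplified, not real ERP transaction codes:

```python
# The sequential transactions a trained ERP user must run from memory,
# modeled as an ordered list with a "what's next" helper.
PROCESS = [
    "create requisition",
    "convert requisition into PO",
    "approve PO",
    "expedite order",
    "goods receipt",
]

def next_step(completed):
    """Return the next transaction to run, or None when the process is done."""
    if len(completed) >= len(PROCESS):
        return None
    return PROCESS[len(completed)]
```

A workflow-driven system effectively encodes this ordering for the user; in a bare ERP, the user carries it in their head.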

Category 2 – Source-to-Pay Suites

The second category of systems to consider are Source-to-Pay (S2P) suites (and their corresponding Supplier Networks). Today, these systems are almost exclusively sold in a Software as a service (SaaS) model. As opposed to ERPs, S2P systems cater directly (and singularly) to the Procurement and Accounts Payable functions. Their main function is to support the widest variety of Procurement use cases possible to help organizations support the end-to-end Procurement value chain. For example, S2P suites will typically offer Sourcing and/or Contract Management modules which help structure and optimize processes that are typically not well supported in ERP systems.

Whether you choose SAP Ariba, Ivalua, Coupa, Jaggaer, Zycus, etc., Source-to-Pay suites act as a “functionality extender” for your ERP. They support the Procurement and Accounts Payable processes where ERP doesn’t do the job (or does a very deficient job). However, it is important to note that if you do not require support for a direct purchasing process (as defined earlier in this article), you may also be able to carry out all your Procurement activities in a Source-to-Pay system with only minimal integration back to the accounting modules of your ERP.

With this in mind, it’s possible to see why S2P suites have their own characteristics:

Strengths

  • Speed of innovation. Given their cloud architecture, S2P systems are evolving very quickly to meet user needs, industry-specific requirements and/or integrate innovative concepts, such as Machine Learning, into novel use cases. For example, SAP Ariba is releasing new functionalities every quarter at the time of this writing (vs. yearly or longer for the SAP S/4HANA ERP solution). This gives S2P suites the ability to create additional value for Procurement organizations at a faster pace than ERP systems. Their unique focus on Procurement and Accounts Payable use cases only further accentuates this reality.
  • User experience. As most S2P systems are operated via a web browser, their user interface is much more intuitive than that of legacy ERP systems. As web applications are now the norm in our personal lives, the learning curve is flatter for new users as they can leverage existing skills.
  • Mobile Integration. Along the same lines, because of their web architecture, it is much easier to set up and use mobile apps for S2P systems than it is for ERP. Fewer technological layers are involved, so they require less expertise to implement and maintain.
  • Workflows. Source-to-Pay systems take the reverse approach from ERP systems on this front. In most S2P modules, a workflow guides the user through the end-to-end process (vs. executing transactions). For example, to run a sourcing event, a user would typically answer a set of questions about the event and be led through a set of tailored workflow items to complete it (e.g. find suppliers, prepare tendering documents, send documents to suppliers, collect responses, evaluate and award, etc.). This is one of the great strengths of S2P systems vs. ERP systems. It is harder for users to veer off the established process because of the boundaries established by workflows.
  • Modularity. I often compare Source-to-Pay systems to Swiss Army Knives for Procurement. You can pick and choose the modules that generate value for your organization and pay license fees accordingly (e.g. implement only Contract Management but not Sourcing and Procure-to-Pay). In ERP systems, you will typically have a single license cost for the entire application, whether or not you use all the modules available. Therefore, S2P systems give you greater flexibility. You can selectively use S2P systems for processes not covered by your ERP.
  • Good support of indirect purchasing. In this context, I define "indirect purchasing" as everything that does not fall into direct purchasing as defined above. Typically, this covers purchasing requirements that are user-driven, ad-hoc purchases. These are good candidates for execution in the Procure-to-Pay module of S2P systems. However, a good Supplier Network solution should be able to cater to all your purchasing needs.
  • Sourcing & Contract Management. Typically, ERP systems don’t support Sourcing & Contract Management processes very well (for direct or indirect purchasing). Therefore, when looking to support these processes via software, you must turn to S2P systems or “best of breed” applications.

Weaknesses

  • Fragility. In comparison to “older”, robust ERP systems, S2P systems are much “younger”. Therefore, system administration functionality around security roles, error management, data integrity, etc. is less developed, more restrictive and more fragile. It is much easier to end up with missing or corrupt data in a Source-to-Pay system than in an ERP system.
  • Modularity. To further my Swiss Army Knife analogy, it is also possible to unwittingly take out the knife attachment and stab yourself with it when implementing a S2P system… Since you can pick and choose the modules you want to implement in a S2P system, you need to ensure you are picking the right ones to marry with your other systems (e.g. your backend ERP). Otherwise, you risk harming your ability to support your end-to-end Procurement value chain. For example, you could end up with functionality gaps and/or duplicate functionality that will leave you worse off than before. Why? Because you will have added complexity and/or manual steps to make processes work that weren’t there before the implementation of a S2P suite.
  • Integration with other systems. Any time you introduce additional systems into a technological landscape, you are adding complexity. Over and above the technical challenges that come with integrating systems, each new system also has its own organizational data structure that needs to be understood and adopted by your business teams. For example, a business unit could be named a “plant” in one system and a “purchasing unit” in another. Users need to know this and to understand the links and nuances between different concepts in different systems. Furthermore, you might not be able to draw 1-to-1 correlations for all concepts across the systems you are trying to integrate. This potentially means you will need to create data/business rules in the interfaces to ensure each system has the data it needs to function properly at a conceptual level. This is getting better as S2P systems aim to provide standard connectors for all the big ERP systems, as well as standard APIs that can be called to retrieve/input data to integrate business processes across systems. However, integration remains the most difficult task when you add a Source-to-Pay application to your Procurement system architecture.
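To make the concept-mapping problem concrete, here is a minimal, purely illustrative sketch of the kind of translation rule an interface might apply when one system’s “plant” must become another system’s “purchasing unit”. All names and values are hypothetical, not taken from any real ERP or S2P product.

```python
# Illustrative sketch of a cross-system concept mapping rule.
# All plant codes and purchasing unit names are hypothetical.

# Explicit mappings where a 1-to-1 correlation between systems exists
ERP_PLANT_TO_S2P_UNIT = {
    "1000": "PU-NORTH",   # ERP plant 1000 -> S2P purchasing unit PU-NORTH
    "2000": "PU-SOUTH",
}

def map_plant_to_purchasing_unit(plant: str) -> str:
    """Translate an ERP 'plant' code into an S2P 'purchasing unit'.

    Where no direct mapping exists, a business rule assigns a
    catch-all unit so the receiving system always gets the data
    it needs to function.
    """
    if plant in ERP_PLANT_TO_S2P_UNIT:
        return ERP_PLANT_TO_S2P_UNIT[plant]
    # Business rule for unmapped plants: route to a default unit
    # that a master data team reviews periodically.
    return "PU-UNASSIGNED"
```

Even a trivial rule like this one must be designed, documented and maintained somewhere in the interface layer, which is exactly the hidden cost this section describes.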

Category 3 – “Best of Breed” Applications

Thirdly, you have the best of the rest: “best of breed” systems. In this category, you will find niche systems that focus on functionality that caters only to specific slivers of the end-to-end Procurement value chain. This is also the category where you will find all the innovative startups and niche players trying to apply the following concepts to Procurement applications:

  • Machine Learning
  • Artificial Intelligence
  • Blockchain
  • Microservices
  • Process Mining
  • Market Intelligence
  • Data governance & Management
  • Ticketing tools
  • Career development
  • Etc.

These smaller players can both develop functionality for processes already covered by ERP and S2P systems (e.g. Sourcing, Contract Management, Purchasing, etc.) and functionality not covered by other system categories (e.g. Smart supplier master in the cloud, sustainability optimization, commodity market intelligence, etc.). However, to survive, a key assumption must ring true: your “best of breed” solution generates more value in its niche than any other ERP or S2P solution. Otherwise, they won’t stay in business very long…

Whether you are looking at systems like Tealbook, Celonis, SirionLabs, etc., including “best of breed” systems in your Procurement system architecture makes sense if it will help generate benefits that can’t be achieved with the bigger, wider scope ERP and S2P systems. However, it is not that simple. You still need to consider your ERP/S2P architecture when looking at these systems, as they will most likely be the providers and consumers of your “best of breed” systems’ data. Strengths and weaknesses of these types of systems are as varied as the systems themselves. However, there are two specific common characteristics of this category of systems that should be noted:

Strength

  • Fit for purpose. A “best of breed” application should cater to an exact need or generate a very specific benefit for a Procurement organization. As it is built for this single purpose, it will do a great job at executing on its promise. If you are not convinced this will be the case when looking at a system that isn’t an ERP or S2P suite, then file it in a 4th category: systems to avoid…

Weakness

  • Integration with other systems / processes. As “best of breed” systems have a singular focus, it is often difficult to integrate them with the rest of your Procurement system architecture (your ERP and S2P suite). This consideration should always be top of mind when looking at systems in this category: How does it fit into the whole? Will it create more manual work? Even if the answers to these questions are unfavorable, it might still make sense to include the system in your landscape. However, you should account for the additional operational costs it will generate in your business case. The best systems in this category know integration is often their biggest issue and provide an integration toolbox (APIs, data dump mechanisms, standard connectors for leading ERP/S2P systems) to help with the integration activities. But again, technical integration is only one half of the integration equation. The other half lies in the conceptual integration of the new application into the organization’s processes and jargon.

That’s Great but How Do I Decide Which Procurement Systems to Buy?

Every system added to your Procurement system architecture adds complexity. More complexity means more administrators, more master data, more training material, more interfaces, more support tickets, etc. Therefore, the best Procurement system architecture is as simple as possible to meet the requirements of your organization. No simpler and no more complex. This may seem esoteric but it is a great guiding principle to refer back to when thinking about adding/removing systems.

This means that when in doubt in a purchasing decision, you should err on the side of fewer systems. That may sound funny coming from someone who implements systems for a living, but the truth is that a system is only as good as the process it supports. Chaotic processes supported by systems will simply produce chaos more systematically…

Sequence of System Purchase/Implementation Is Also Important

The sequence of purchase is also important based on the system category characteristics discussed above. Here are some general guidelines to follow in the rollout of your systems:

  1. Define, document and understand your business processes before doing anything else. A process culture goes a long way to making system implementations go much smoother.
  2. With knowledgeable partners, craft a desired end state Procurement system architecture. This exercise is about marrying internal requirements and constraints with external offerings. It should take into consideration:
    1. Your organization’s purchasing profile (the commodities you buy and processes currently executed)
    2. The strengths and weaknesses of the system categories outlined above.
  3. Implement and optimize your ERP first. Remember the ERP is usually a non-negotiable foundational building block that will dictate where you should go from there.
  4. Identify process areas that are lacking support or are low performing when supported with ERP.
  5. Implement S2P modules or “best of breed” systems to address the identified process areas.
    1. Sequencing S2P and/or best of breed system implementations will depend on data/technical constraints and where the most business benefits lie. In a perfect world, your first implementations will help cultivate savings that can be used on the next initiatives.
    2. When there is functional crossover between a S2P module and a “best of breed” system, keep in mind that ongoing system maintenance costs multiply with the number of different systems in your landscape. Therefore, unless there is a “game changing” reason to go with the “best of breed” system, it is best to go ahead with the equivalent S2P module. The exception is when you won’t use any other S2P modules for other process areas in your desired end state architecture. In that case, choose the best solution based on functionality and ERP integration capabilities alone.
  6. Bask in the operational glory of your completed Procurement system architecture.
  7. Cultivate a continuous improvement culture. Stay abreast of new developments in the Procurement system space to identify new opportunities.

Conclusion

When laid out in the 7 points above, building a sound Procurement system architecture may sound simple. However, it is anything but… This is often a difficult, multi-year process which requires alignment of multiple stakeholders with conflicting interests and views. Therefore, it is essential that you create a plan and ensure stakeholder buy-in before you start making buying decisions. It will be the difference between glory and gory.

Note: I did not include middleware in this article as it is more of an IT system than a Procurement system. However, it is my belief that middleware should be part of any 21st century Procurement system architecture.

———————————
Do you see any other categories that should be included? Would you create subcategories in each main category? Do you see other guiding principles that should guide Procurement system buying decisions? Let me know in the comments.

If you liked this post, why not Subscribe

Last Updated on January 4, 2021 by Joël Collin-Demers

Purchase Order Text Copying Rules in SAP
https://www.pureprocurement.ca/sap-purchase-order-copying-rules/ — Wed, 09 Sep 2020

The following article shows you how to automatically copy texts to the SAP purchase order header. With copying rules, you can automatically copy texts from master data objects such as the vendor master.

Purchase Order Header Text Copying Rules Configuration

Ideally, you’ve built a Vendor master data application architecture that continuously and dynamically updates your Vendor information with publicly available information. This ensures you always have access to the most accurate and up-to-date information. However, there is always information that only your company knows about a vendor. For example, contractual terms or conditions between your company and the vendor that should always be specified on your purchase order documents. But how do you store this information when SAP doesn’t have a specific field to capture your data? You configure a copying rule and store the data in the vendor master.

Alternatives: It is also possible to copy texts to the purchase order header from the following objects when it is more appropriate (e.g. it doesn’t make sense to copy the information 100% of the time):

  • Contract
  • Purchase Order
  • RFQ/Quotation

Constraints: Standard SAP does not let you copy Vendor Master texts to a Purchase Order item text. Only the following objects can act as the source for Purchase Order item texts:

  • Contract
  • Info Record
  • Material Master
  • Purchase Order
  • Purchase Requisition
  • RFQ/Quotation
  • Sales Order

To create and copy a custom text from the vendor master to the Purchase Order, carry out the following steps:

1) Define a new text type in the vendor master

You can define your text as a Central Text if it applies to all purchasing organizations. Otherwise, you can define your text as a Purchasing Organization Text and the text will only appear when the specific Purchasing Organization is used.

In both cases, ensure the IDs for your new texts are in the SAP Customer Namespace (starting with Y or Z) to avoid future issues with service packs or upgrades. It is also important to note that these text objects are cross-client objects. This means you will create/delete them for all clients in the system concurrently. Execute your cross-client steps first in your configuration sequence (or transport import sequence).
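As a quick illustration of the namespace rule (this is not an SAP tool, just a sketch), you could sanity-check a list of proposed text IDs before configuring them:

```python
# Illustrative check: flag proposed text IDs that fall outside the
# SAP customer namespace (IDs should start with Y or Z).

def in_customer_namespace(text_id: str) -> bool:
    """Return True if the text ID starts with Y or Z."""
    return text_id[:1].upper() in ("Y", "Z")

# Hypothetical candidate IDs for new vendor master texts
proposed_ids = ["Z001", "Y_PO_TERMS", "0001"]

# Any ID here would collide with SAP's own namespace and risks
# being overwritten by a service pack or upgrade.
invalid = [t for t in proposed_ids if not in_customer_namespace(t)]
```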

  • To define a new Central Text, access the following configuration point and create your new text:

SPRO > Materials Management > Purchasing > Vendor Master > Define Text Types for Central Texts

Configuration Table for New Central Text
  • To define a new Purchasing Organization Text, access the following configuration point and create your new text:

SPRO > Materials Management > Purchasing > Vendor Master > Define Text Types for Purchasing Organization Texts

Configuration Table for New Purchasing Organization Text

2) Maintain Purchase Order Copying Rules for your New Text

Access the following configuration point:

SPRO > Materials Management > Purchasing > Purchase Order > Texts for Purchase Orders > Define Copying Rules for Header Texts

Text Linkage Configuration Table
  1. Select the Purchase Order text where you want your New Custom Text copied
  2. Double-click on Text Linkages

Once in the Text Linkages dialog window, ensure you are in maintenance mode and click the “New Entries” button. Fill the new entry with the vendor master as your Source Object and your new text as the Source Text. If the chosen text already has a copying rule for the source object, you will have to change the existing one.

If the text should always be copied, leave the Fixed indicator blank. Use an asterisk (‘*’) if the text needs to be manually adopted by the user. Use the Fixed indicator ‘N’ instead of deleting an entry where applicable, as outright deleted entries remain in the underlying tables in the background.
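The Fixed indicator behavior described above can be summarized with a small toy model. To be clear, this is illustrative Python, not SAP code; it simply encodes the three indicator values and what ends up on the purchase order:

```python
# Toy model of the 'Fixed' indicator semantics for purchase order
# header text copying rules, as described above. Illustrative only.

def resolve_copied_text(fixed_indicator: str, source_text: str,
                        user_adopted: bool = False):
    """Return the text that lands on the purchase order, or None.

    fixed_indicator:
      ''  (blank) -> text is always copied automatically
      '*'         -> text is displayed but only included if the
                     user manually adopts it via the copy button
      'N'         -> text is never copied (preferred over deleting
                     the copying rule entry)
    """
    if fixed_indicator == "":
        return source_text
    if fixed_indicator == "*":
        return source_text if user_adopted else None
    if fixed_indicator == "N":
        return None
    raise ValueError(f"Unknown Fixed indicator: {fixed_indicator!r}")
```

For example, with ‘*’ the text appears in the text box but is only included on the document once the user clicks the copy button, which matches the behavior shown in the screenshots below.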

Purchase Order Header Text Copying Rules Configuration

After configuration, it’s time to test.

3) Bask in the Glory of Your New Text

  1. Start by maintaining your new text in a vendor master (XK02). Maintain a Central Text via General Data views or a Purchasing Organization text via Purchasing Organization data views:
Vendor Master Central Texts
Vendor Master Purchasing Organization Texts
  2. Create a Purchase Order (ME21N) for the vendor in question. The text you maintained should appear automatically in the Header Texts according to your configuration.
Copied Text in Purchase Order

If you selected “*” in your configuration for the Fixed Indicator, the following button will appear when the text is selected in the Purchase Order. Click the button to copy the text to the document. Otherwise, the text won’t be included in the purchase order even though you see it in the text box.

Option to Copy Text in Purchase Order

There you have it! Enjoy your new text. From here, you can include this new text on the Purchase Order form, in outgoing IDocs, and/or send it to the Ariba Network to transmit it to the vendor via your Purchase Order.

Additional Notes:

  • You can also use this process to modify the copying rules for existing texts (system delivered)
  • If you have trouble with the copying rules configuration (e.g. the newly configured text is not available in the copying rules), try saving, logging out of the system and logging back in. Alternatively, sign in with another language.
  • If you run into issues with deleted entries, you can display deleted entries via the following menu option:
How to Retrieve Deleted Entries

———————————
What sort of information have you needed to maintain and copy from the vendor master? What use case does it satisfy? Have you found other ways of supporting this requirement? Let me know in the comments.

If you liked this post, why not Subscribe

Last Updated on March 17, 2023 by Joël Collin-Demers

Enriching Vendor Data with Machine Learning Tools with Stephany Lapierre
https://www.pureprocurement.ca/enriching-vendor-data-with-machine-learning-tools-with-stephany-lapierre/ — Fri, 19 Jun 2020
Title image for podcast episode 3
“Vendor data should be at the core of your Source-to-Pay transformation and technology should just be a way to drive compliance and enable the execution of your strategy.” – Stephany Lapierre

Note: This post is the transcript of the episode. If you prefer listening to audio, you can listen to the episode on the podcast page.

On the last episode, we explored the critical success factors for your Source to Pay system implementation. One of the factors that came up and got a strong reaction was vendor master data quality. Everyone agrees this is an important topic. However, when it comes down to the specific actions that should be taken to tangibly set yourself up for success, there aren’t many experts on the topic.

Today, I am fortunate enough to be joined by one of these rare gems. In fact, my guest today has built a company around the fact that being the master of your vendor information is the backbone of success in Source to Pay.

Stephany Lapierre is the Chief Executive Officer at Tealbook, a Canadian vendor network software provider based in Toronto, Ontario. Stephany and her team have been working since 2014 on putting together a platform that leaves your ERP vendor master in the dust. By doing away with limited and static vendor information and developing tools and processes to deliver up-to-date, granular and contextually specific information to buyers and sellers, Tealbook is making a name for itself.

While on this journey, let’s just say she’s gotten very intimate with the good, bad and the ugly of the vendor master. That’s why I’ve asked her onto the show – to discuss how companies can effectively cleanse vendor master data, how to measure progress and how to keep consistent vendor data quality levels.

We dive into the problems with traditional thinking about vendor data quality management and walk through the process and tools that can be used to really see progress on this front in your company.

—————————-

*The transcript from this interview has been edited for brevity and clarity.

Introduction

J: Thanks for joining me today, Stephany. It’s really a pleasure to have you on the podcast to discuss one of my favorite topics: vendor relationship and data management. It plays a big role in the success of folks in a bunch of different endeavors whether it be transformation projects or operational activities on the buy side or even the sell side for suppliers.

S: Thanks for having me.

J: I thought we could discuss what you guys do at Tealbook but also why vendor master data is important and why nerds like us in the procurement space get so excited about it. So, maybe we could start there. How did you develop such a deep interest in managing vendor data?

S: (laughs) I won’t bore you with all the details of what inspired me… However, let me share something that happened recently that was such a validation of what we’re doing. I was at a procurement conference and I was speaking to about 90 heads of procurement. There were some of the largest companies in the world. I asked one simple question, I said: “Raise your hand if you have some confidence in the quality of your supplier data.” Everyone laughed. Nobody raised their hands. They were all looking at each other, smiling. I said: “Raise your hand if you think supplier data is critical to the digital transformation.” Everyone raised their hands. I said: “Can we all agree that there’s a major data crisis and that we have a massive gap.” That led to, after my presentation, spending six hours at a table talking to procurement teams about what that actually means. I think that comes from the fact that if you look at the history of procurement, it has traditionally been a highly transactional function.

Procurement has always been about automating transactions or relationships; a kind of a layer with suppliers. But, suppliers are really critical to competitiveness. They’re really critical to an organization as much as their customers and as much as their [internal] talent. I mean, look at the sales and marketing side… They’ve been using data and analytics for a really long time. Even over the past few years on the talent side, you’re getting a lot of technology investments on big data to understand talent and trends to try to get the most out of your talent. However, very little has been done for suppliers other than software to try to automate parts of the workflow or a part of the transactions.

So, I think it’s left a really big gap for executives in organizations who are demanding speed, agility, innovation, competitiveness. Especially now that so many industries are being disrupted. It’s really leaving procurement to think about how they are going to manage this transformation… How can they digitize the function in a way that’s scalable and brings more value to the organization? In that regard, I think what we’ve seen over the last few years is most procurement teams have run to P2P companies because they have been telling a good story.

I think there’s certainly a good story around adopting cloud based technology to help give you more visibility into process, invoicing, etc. However, it’s left a big gap on the data front. A lot of the “data thinking” has been done too late or has been overlooked. Mostly, it’s because it’s not easy to reconcile data. It’s not easy to have the data that you need to feed those [Source to Pay] systems and to keep the data in those systems as relevant or accurate or transparent as it needs to be. I mean, a year and a half ago, we were doing webinars to educate procurement teams on what Machine Learning and AI means. Now, a year and a half later, we are talking about failed digital procurement transformations.

I think we’re coming to a point where there’s a big crisis and there’s a demand for a solution. Luckily for us, we foresaw this happening a few years ago and started building our technology so we could provide a data solution.

J: Yes! I can see that happening as well on the different mandates I carry out with clients. I’m curious to get your opinion on what the ideal state looks like. You’re saying it’s hard to sync that data across systems and to have up-to-date data. What does that ideal state look like? Is it very accurate data for 100% of your suppliers? Is it for a certain slice of them? How can you tell yourself that you’re ready for a Source-to-Pay transformation or implementing other S2P technology from a data perspective?

S: I think that data should come first. I think you should have visibility into 100% of the supply base. It’s been so difficult, I’d say even impossible, to maintain information in a way that was useful. Typically, we’ve had to prioritize suppliers which are more at risk or where we’re spending more. However, there’s now a lot of opportunities to get data for 100% of the supply base.

J: When you say maintaining that information was difficult, is it because the companies are doing that manually?

S: Yeah. We have customers who have a small vendor base but some who have 200K or 300K global suppliers. So, we’re talking about onboarding 200K companies, collecting data from those 200K companies, maintaining the records, validating those records; and some on an annual basis. Then you need to maintain everything about those suppliers like their category. How similar are they to other suppliers that you’re already doing business with? What is the level of risk? What is the level of performance and relevance and trust? Are they certified? For what? Is that record accurate, valid, reportable? There are so many components and all those components have been collected by companies but they’ve been collected by different functions across different systems.

The data stack has lived in those systems in a way that makes it really difficult for organizations to try to reconcile the data. It’s been done through data cleansing but then it only gives you your spend data. That’s a great snapshot but it’s not a dynamic record that gets better over time. It’s not a record that will give you insights so that you can start becoming more predictive, you can start foreseeing things that may happen or opportunities that you’re missing because you don’t have that visibility.

J: Yeah. And to add, you don’t have diversity information if that’s something you’re using as a criteria for spending decisions. Also, as soon as you have that data captured, it starts ageing and losing its accuracy over time.

S: The most amazing thing for us when we work with clients is getting a list of vendor masters, and [showing them the true picture]. Let’s use an example of a customer with whom we recently deployed the solution. They said they had 19K suppliers. They shared the 19K records and, within days, we showed them that they actually have 5.9K companies in their file. You know there’s so much duplication [out there…]. 570 of those suppliers had duplicate records. So they had, in some cases, 30 different contracts across the different businesses for the same vendor and they were not aware of this. Sometimes, there’s good reasons for that. In others, there’s massive opportunities to leverage this information in negotiations to get economies of scale.

There’s also the opportunity to see what your accurate diversity data really is. This same company had a pretty strong diversity mandate but they were only spending 3% of their spend with small and diverse businesses. We were able to give them a pipeline of 84 million dollars of validated spend that could be contracted to small and diverse suppliers. We also found another 60 million dollars of potential pipeline that looked like it could be attributed to small and diverse but needed validation. With this information, you can start to be more proactive about realistically [achieving your targets].

In this instance, what was really interesting is that they had a very big focus on aboriginal businesses but only about $160K out of their $84 million dollars in spend was with aboriginal businesses. So, to be able to see that and say, “Okay, now we can take action” is great!

So, these are all snapshots… However, once you turn the light on to your data, you can see things that allow you to drive a strategy by seeing your categories. Because we look at 300 different dimensions of trends and similarities between companies, we’re able to show you clusters where you have suppliers that do the same thing or look very similar to one another.

Then you can start to look at things in categories. For example, we had a client who had two hundred something translation service suppliers. Those are all suppliers where you collect data, you maintain information, you validate records [over time], you pay separately with no economies of scale, and you still have people in the company going out to Google to find translation services for Spanish or Mandarin because they don’t know… They don’t have access to the information about who we already have under contract. You’re creating this unnecessary burden on the business, on finance and on legal, and you’re introducing unnecessary risks.

Seeing those clusters helps you execute or build your strategy and how you drive your technology [initiatives]. We also have an interface but the way that you can drive that is to consolidate [your spend amongst vendors]. How can policies and processes in each major category drive consolidation to make sure that we’re leveraging those suppliers more effectively? Which ones are more valuable for what so that we can continuously deliver that insight to the business and reduce the cost, the risk, increase savings and build better partnerships?

There are also other categories where you see that you don’t have a lot of suppliers and sometimes with good reasons. Most of the time it’s complacency but if you’re saying, “Hey, we don’t have a lot of suppliers in these categories” you can look to a global network of clusters we are identifying. You will find clusters of similar suppliers that do the same thing and are doing business with companies that are very similar to yours based on what you’re buying. These could be clusters where your strategy should be about increasing competitiveness. You may adjust your policies so that you need at least three or ten or however many bids per sourcing event [to drive diversity in this category].

Another use case may be that you’re driving compliance around savings and creating that hyper competitiveness. To me, data should be at the core of your transformation and the technology should just be a way to drive compliance to be able to execute on that strategy. The technology should be adaptable because it’s going to keep changing. There’s thousands of niche new digital solutions that are coming to market. You need to be able to plug in those technologies to your data so that you can drive compliance. You need to be able to change and evolve with technology. I think is really critical to the future of procurement.

J: I agree with everything you’re saying… It’s scary. I think what you’re describing has been the “Holy Grail” as we’ve gone through these different generations of source-to-pay tools over the last couple of years. We’re also seeing constant consolidation in that industry as well.

People have been trying to consolidate their vendor data in their back-end ERP system or in their data stack somewhere. I’ve seen organizations try to do this in the middleware where all the applications can come in and query a single source of truth for vendor data. What are the common pitfalls that you see with those approaches and what are you doing with Tealbook from a process perspective to address those issues?

S: There’s a couple of angles to this. To quote one of our Fortune 100 clients, “We don’t want to collect, maintain, validate records of suppliers anymore. It’s really impossible. We want to be able to access the right data. We want it to be maintained, not because we’re maintaining it but the record is maintaining up to date. We also want that record to be validated not by a third party somewhere in a third world country, we want it to be validated by other buyers like us and we need the insight to drive our business forward. Our insight is very limited.”

So, when you’re building your processes and you’re adopting technology like a P2P, how is that data being cleansed, enriched, distributed to your system in a way that’s really effective and dynamic, and continuously refreshed? It’s impossible right now. It’s hard. So, the mentality of different companies is: “Well, we’re going to push that down to our suppliers. Our suppliers are going to do it so that we don’t have to do it.” Or: “we can hire a third party supplier to do it.” The reality is even the largest companies in the world will tell us, “We thought we were influential enough to get suppliers to do it but the reality is the majority don’t.”

That’s because suppliers have been asked to update hundreds of thousands of different systems, and each instance of the same software across their customers, in ways that don’t really help their ROI. Yes, they’re motivated when it comes to an invoice but then you’re getting very limited information and even that’s hard to maintain… Banking information, contact information, etc. So, when you’re trying to manage all of the data at the speed at which things are changing in the market, it makes it incredibly difficult to track. So, we’ve taken an approach, obviously we’ve used technology to do this, primarily based on Machine Learning (ML) to proactively build records on each company in the world.

The footprint is expanding continuously because we're always crawling and finding companies that are as similar and relevant as possible to the ones already doing business with our customers. So, we work on breadth, but we are also working on the completeness of that information; depth. We've built algorithms to be able to do this, but we also need to consider how you layer that with customer data so that the structured data we provide becomes more relevant to them. Maybe I want to sort suppliers by capabilities, by spend, by relevance to me as a buyer within my organization, by suppliers who have an MFA, by tiering them as preferred or strategic, and then layer in other data sets around your diversity targets, or GDPR compliance, whatever it may be…

All that data should be coming together in one place and be useful. We do this in many different ways but mostly, it’s a combination of all of that, and the power to be able to have a record that continuously improves. That’s a big change from the past when I could buy data lists.

J: Something like Dun & Bradstreet?

S: Exactly. Then if you see records that are wrong, there's nothing you can do about it. Then you don't trust it… You're going to react: "This is garbage, I'm not trusting it." You're reporting using that data hoping it's as accurate as it can be. You know it's not perfect, but you're hoping it is. I think the big difference is that when we're working with customers, it's a journey. Whatever they give us from their vendor masters is really bad. They all apologize; it's okay, it's bad. Everyone has a bad vendor master. But now we can "turn the light on" and add to it. Say you give us 2% of the completeness of all the information you need to know about a supplier; we'll add 10% to that. We'll add capabilities, contact information, location, certificates, relevance. We'll add things that are really valuable to your business, but now we may have 85% to 88% of completeness that we still need to build. That will depend on how you build your strategy, what systems you integrate, and how you're communicating with your suppliers. If you decide to roll out Tealbook as an interface, and that's an option, how are you rolling it out? What type of data or compliance processes are in place? How are your users interacting with the platform? All of that is to get to 100%. You'll never get to 100%, but you'll never go backward.
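To make the completeness arithmetic above concrete, here is a minimal sketch. The field names and the scoring rule are invented for illustration; they are not Tealbook's actual data model:

```python
# Hypothetical sketch: scoring how "complete" a supplier record is, as the
# share of tracked fields (names invented here) that actually hold a value.

TRACKED_FIELDS = [
    "legal_name", "address", "contact_email", "capabilities",
    "certificates", "payment_terms", "diversity_status",
]

def completeness(record: dict) -> float:
    """Return the fraction of tracked fields with a non-empty value."""
    filled = sum(1 for field in TRACKED_FIELDS if record.get(field))
    return filled / len(TRACKED_FIELDS)

# A sparse record like the "2%" example: only two of seven fields known.
vendor = {"legal_name": "Acme Translations Inc.", "address": "Toronto, ON"}
print(f"{completeness(vendor):.0%}")  # prints 29%
```

In practice the interesting part is watching this number only ever go up: each enrichment source, integration, or user interaction fills in fields and raises the score.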

J: You can get closer.

S: Exactly.

J: It's so interesting that you mention that big organizations will push that task out to suppliers. The thinking is that suppliers know their own information better than we do, so they should be the guardians of that information. But at the same time, when you're dealing with huge multinational organizations, each one is a collection of individuals, and each individual in that company will know different bits and pieces. You might be talking with a supplier about another supplier in a competitive scenario, and they're both part of the same company through some corporate structure, but they don't even know it themselves.

S: They don’t even know their customer, right? The supplier may not even know their customers. There’s very little visibility.

J: So, if we can’t rely on companies to maintain their vendor data and we can’t rely solely on suppliers to maintain it either, we need a collection of everything that’s out there. Every little piece that everybody knows in the ecosystem should be consolidated to a central vendor master. At least that’s how I understand what you’re saying, right?

S: Exactly. That's our thing. Tealbook has been named the source of [truth], the golden record, the digital vendor master, whatever you want to call it. We define ourselves as a really smart supplier data cloud. It's very similar to how your phone used to hold all your pictures and contact numbers. If you lost your phone, you had to email people to tell them and ask them to send their contact information again. (laughs) So painful. Now, all your data is in the cloud. So, no matter what laptop, iPad, or latest iPhone you may get, it's just a matter of connecting it to your cloud and all the data gets populated. You're not bound to the technology anymore.

You can actually evolve much faster and the friction and change in the technology is seamless. That’s the way that you should think about your data. How can your organization, how can procurement have really good supplier data that continuously generates more transparency and insights? Over time, you’re becoming a lot more strategic, you’re becoming a lot more predictive and you’re not bound to the technology.

Inevitably, each function will ask for its own latest system: one will ask for the latest AI-driven contract management system; Quality will ask for the latest quality management system. They all require supplier data.

How can you set up that new technology in a way that lets you grab data to manage the process and update it with new insights while executing processes? Another important point is that everyone wants to know that the supplier data being used is actually accurate and up to date. Every process should contribute to making your data better, not just consume it.

J: Okay. Could I extrapolate that you see technology like Tealbook sitting as the single source of truth of that vendor master data and pushing updates to systems like your contract management system you were mentioning earlier to say: “Here’s the latest vendor master and here’s how you should update it in the system.” Even the fields that may be unique to that contract management system in the vendor master would be maintained in Tealbook or… How do you see that relationship?

S: Yeah. We are going into the deep, deep details. Typically it starts with the [Source to Pay system]. What system are we using? How accurate is that data? How easy is it to distribute? Especially if you haven't implemented your S2P tool yet, you're depending on your system integrator to do that data cleansing, enrichment, and distribution. It's highly variable and it's not really their strength. They're really good at doing the implementation, the integration, the management.

J: Yeah. Design, build, deploy.

S: The data shouldn't be in their hands. If you can remove that variable by adding a product or a solution, suddenly you have better data that's easier to distribute, and you'll get a much more effective investment out of your S2P system. In one example that came out last spring, I think it was the City of New York: they were $54 million over budget on their S2P implementation with a system integrator.

A lot of that was caused by poor data quality – not because of the software itself. If you’re feeling like: “my neck is on the line, I want to make sure that my multi-million dollar investment and my S2P solution is effective. I want to ensure people are going to engage, that the data is going to be up-to-date…”, then that’s a really good way to start engaging with us. For example, we can look at data flows to an ERP that can be adaptable to all these different niche solutions that maybe you’ve already invested in or you’re looking to invest in the future.

[You need to ask yourself] how you are building a technology ecosystem that can plug into the same data and, to your point, what fields need to be updated. It can be done. We don't start with a full integration upfront. We typically start with "let's turn the light on your data," then we can partner with a change management firm, or work directly with customers if they have the resources, to look at the data. Then, we start thinking about their strategy, how they're going to roll it out, and what systems they're going to integrate the data into… We'll typically start standalone because you want that data to start getting better right away. You can export the data out of Tealbook and load it into other systems as an interim measure while you're figuring out what that integration roadmap will look like long-term.

S: Now, we're also in heavy discussions, and moving forward on partnerships, with S2P and P2P providers that don't have a supplier ecosystem and are looking for a way to generate more value. They want to remove that highly variable factor in implementation success: data maintenance. We can even help when it comes to supplier discovery, onboarding vendors to a network, or whatever you're looking for to be more competitive. In this case, Tealbook would be integrated [out of the box] and come with your P2P or S2P system. That's the future for us.

J: Cool. The way I see it, to use your expression, "turning the light on your data" means that as soon as I import data into Tealbook, even if my suppliers aren't logging in to add more info and I'm not adding more info, because of your machine learning algorithms, the data is going to start getting better on its own, right?

S: Oh yeah, it’s really cool. In sectors where we have more customers, we use that community aggregated knowledge to give even more insight. So those customers get so much value because it’s beyond just what we were able to generate on their suppliers. We have that community insight that’s aggregated, so you don’t need each other’s data. It gives you more relevance, it gives you more analytics around benchmarks and things like that. And, the suppliers have been typically invited by multiple customers to come to the same place so if they update their certificate once, it’s updated for everyone. If it’s validated once, it’s validated for everyone. Now, you may be a company that needs the data to be validated based on your compliance requirements. Well, then you can validate it in Tealbook and the record is now validated twice: once for you and once for the community.

But, in sectors where we don’t [have a big footprint yet], we’re still able to generate a lot of value upfront. The way that we’ve approached that with chief procurement officers has been: “yes, you are the first one [in your sector] and we can generate a lot of value but now you can help us build that community data [with others in your sector].” We already have, for example, within a couple of weeks, three deals that are moving forward in a particular sector. We’ve got vendor masters from three companies that expand our machine learning extensively in those sectors, in those regions, in those capabilities. Now, those three customers are going to start really generating a lot of value [from the network effect].

J: And leveraging each other’s knowledge, right?

S: Yeah, and without sharing proprietary information. That’s really critical to our customers because we have clients in highly regulated or proprietary industries. They don’t want to share secrets but at the end of the day, they’re getting a lot of insights and value that’s so beneficial without feeling that they’re making their data visible to others. 

J: Okay. Then what are the challenges you’re facing in developing this technology? I’m sure there’s issues and road blocks that are important like any new endeavor?

S: Well, just building a tech company… We could have a whole other podcast on that subject… or ten. Prior to this, I built a consulting firm in procurement, doing change management and building procurement functions for hyper-growth companies. So, I've been an entrepreneur for a long time, but building a tech company is a completely different beast. Especially since I think we were probably a little ahead of the market. Because of my consulting background, I saw what was happening, and I was looking for solutions for my clients. How can we build this transparent, enabling, scalable, agile procurement function? It's like… "we can't". (laughs) You start introducing systems and tools and suddenly there's no easy way to reconcile the data…

I sat on the idea for Tealbook for nine years until cloud technology started becoming adopted, and a lot of the S2P software moved to the cloud. Then, I knew that there was opportunity to use big data. I didn’t know it was called Machine Learning at the time – that was four and a half years ago. So, building the technology itself, I was really fortunate to work with a lot of customers. Coming from the space definitely helped me understand the actual business drivers, the ROI, the use cases as to why you need good data.

Again, there's a lot of magic in building companies. One of the magic moments for me: we had an MVP, we had six customers, and a very small team. Our MVP was developed by a third party, and then I met my CTO, Geoff Peddle. Geoff had worked at Google and had done two Masters in computer science; his second was in Machine Learning. Before that, he was building social media platforms, using big data to make them really usable in the social media environment and selling the analytics to media companies. Before that, he spent 10 years at Ariba building the catalog and supplier network. Before that, he was at IBM. And he was in Toronto and available.

J: So, you had to jump on him, right?

S: Yes! That was really magic: to find someone who completely understood what we were building; someone who had all the different pieces we needed to build our own data science team in-house and build software that was massively scalable, that we could use with banks and highly regulated industries. There are so many components that go with that (security, scale, etc.). Then there's the Machine Learning; it's all about data. I hear this all the time at conferences, or when talking to procurement teams, or even from analysts who tell procurement teams they should build their own data science team.

I always cringe because it's really competitive to get good data scientists. Data scientists, if you're not familiar with the role, can be challenging to hire for and to manage in a way that motivates them. Ours are really motivated because of the way our CTO manages that team. It's very different from our software team; it's almost more academic than project-based.

When you hear [about hiring data scientists in organizations], it means data scientists are working on a small set of internal data. There’s not much that they can really do, right? Whereas, for us, it’s about continuously finding new sources of data. More, more, more!

J: Yeah.

S: We do a great job. We scour hundreds of millions of websites. We built it on the Google cloud, so we have access to a lot of data and Machine Learning. [For us], it's about finding more sources that complete the buckets of data we're looking for. From general supplier profile information, to accredited data, to vendor master data, to risk data, we are always building out the completeness of the record. So, to answer your question, the challenge is how we find more sources of data, and that's what we continuously look for.

J: You're saying you'll never get to 100% [completeness]. However, you must see that percentage grow over time, not just within a specific client's account but also in what you're able to deliver to clients overall, right?

S: Yeah, it's so cool. For example, we have a client that's a Fortune 50 company. The way we started engaging is that I asked the transformation team a very simple question. I said: "When an employee needs a supplier today, what's the process for that employee to start their project, whatever it is, and get that supplier on board to start working with them?"

The chief procurement officer spoke for about half an hour on all the things they've put together to help that stakeholder get the information they need through procurement. All I said in that meeting was that all of this, not overnight but over time with Tealbook, will happen at the snap of my fingers. There's no reason why that employee cannot find all the information, with the insight they need, to make good decisions. That spiraled into talking about use cases: okay, if you have the data, what's the priority? Supplier diversity was the priority.

We're talking about a company that sits at the Billion Dollar Roundtable; a very well-established company with a 30-year-old supplier diversity program. Still, they're looking at 200K suppliers, and they're like, "We think we're missing some. The way we're doing this today requires a lot of effort. It's very manual, so we're also missing opportunities."

J: It’s also based on the skill of the senior buyers they have on their team I would imagine, right? The relationships, the knowledge they’ve built over time?

S: Well, there is software that does supplier diversity, but it only covers suppliers that upload their certificates or whose certificates it captures. So, we did the exercise. When we started with them, it was clear that we did not have supplier diversity data at the time. What they said was: "We don't care, but if you could get it and make the data better, that would be a good first use case to show us that you can improve the quality of our data over time." They challenged us, so we started looking for sources of supplier diversity data. In 10 business days, we were able to find over 800K small and diverse business certificates, read those certificates, unify them back to supplier profiles, crawl all the information about those companies, and create almost 500K small and diverse business profiles that were available on Tealbook.

We matched that against their records and found 1,600 suppliers that met the requirements but had been missed in their reporting. We improved their reporting by 20% in 10 business days. That's crazy.
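The matching exercise described above can be pictured as a name-normalization join. Everything below (the company names, the suffix list, the normalization rule) is a made-up illustration of the general technique, not Tealbook's actual algorithm:

```python
# Hypothetical sketch of the matching step: normalize legal names, then
# check which certified diverse suppliers already appear in a vendor master.
import re

def normalize(name: str) -> str:
    """Lowercase, strip punctuation, and drop common legal suffixes."""
    name = re.sub(r"[^a-z0-9 ]", "", name.lower())
    for suffix in (" inc", " ltd", " llc", " corp"):
        name = name.removesuffix(suffix)
    return name.strip()

vendor_master = ["Acme Supplies Inc.", "Northern IT Corp", "Maple Foods Ltd."]
certified = ["ACME SUPPLIES", "Maple Foods, Inc.", "Prairie Logistics LLC"]

master_index = {normalize(v) for v in vendor_master}
# Certified suppliers already in the master: candidates missed in reporting.
found = [c for c in certified if normalize(c) in master_index]
print(found)  # prints ['ACME SUPPLIES', 'Maple Foods, Inc.']
```

Real matching has to handle far messier cases (DBAs, subsidiaries, addresses), which is where the machine learning comes in, but the join idea is the same.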

J: Those are all publicly available sources of data that you just started mining?

S: That was just the publicly available data. Imagine if you could tap into paid sources!

J: Yes, StatsCan or D&B, etc.

S: Exactly. In the US, it's easier because those suppliers have to report at the city, state, or national level. They either have to be registered with the government or they upload the actual accreditation certificate. It's so much easier in the US. Canada is not as easy, but we now have groups of customers with [diversity] mandates, and the challenge with the national associations is that although they do a great job of supporting their communities, which we could never replace, their databases are typically dated. Say I'm a buyer with a [diversity] mandate covering veteran-owned, small, woman-owned, gay- or lesbian-owned, African-American-owned, or aboriginal-owned businesses, etc. I have to subscribe to these national associations to get access to their databases, and their databases are not super awesome. If I'm a buyer just looking for, say, a translation service, an IT consulting firm, or an ad agency, and I don't have a strong diversity mandate, I don't really care. I'm not going to go to 10 different databases… Maybe my company pays the fee, but it's still a lot of time…

So, there's a lot of limitation to scale. To search for translation services [on Tealbook], I go through the following process: I'm looking for translation services in Canada, ideally in Toronto. I'd like them to be woman-owned. If they're also aboriginal-owned, are they certified? Are we working with them already? I can search by spend, find similar companies that are certified, and build a list in seconds. Then I can also show my manager: "Hey, every time I run a sourcing event, I've included all the suppliers required by my compliance and [diversity] mandate." I can actually be more accountable to it.
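That kind of multi-attribute search boils down to filtering records on several criteria at once. This sketch uses invented supplier data and field names purely to illustrate the buyer's query:

```python
# Hypothetical sketch of the buyer's search: shortlist suppliers matching
# category, location, and diversity criteria in a single pass.

suppliers = [
    {"name": "TransNord", "category": "translation", "city": "Toronto",
     "woman_owned": True, "certified": True},
    {"name": "LinguaPlus", "category": "translation", "city": "Montreal",
     "woman_owned": False, "certified": True},
    {"name": "CodeWorks", "category": "it_consulting", "city": "Toronto",
     "woman_owned": True, "certified": False},
]

shortlist = [
    s["name"] for s in suppliers
    if s["category"] == "translation"
    and s["city"] == "Toronto"
    and s["woman_owned"]
    and s["certified"]
]
print(shortlist)  # prints ['TransNord']
```

The value proposition in the interview is that the data behind these filters (certificates, ownership, spend) is kept current by the platform rather than by the buyer.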

I think it's a big mindset [shift], and it should be an opportunity for the national associations, because if they're in there, it adds more credibility; I can sort by NMSCC or Reconnect or Rebank or VA or whatever it is. It confirms that the certificate is valid. It's a credible source, and it's in the hands of all buyers, and possibly employees outside procurement. Now you're completely scaling access to those suppliers, putting them in the hands of the people making decisions every day.

J: Those people making decisions every day based on that data, do they have access to the source of that data? Where Tealbook got this piece of data?

S: Yeah. You can click through to the links at the national associations. Right now, you can click through to the FDA or whatever it is. You can see it. You can even click "verify" so it's verified for your entire organization as you do it, or we can automate that process for clients who don't want to do it manually. Our clients can now report on supplier diversity from their dashboard. They can take their quarter and the classifications based on their requirements, and it's a click of a button to generate that report and submit it, which is, again, another game-changing opportunity.

J: So, for the people we've gotten super excited (laughs), or maybe it's just me, I don't know… Sometimes I get excited about stuff that other people look at me funny for… What does an implementation project look like for a tool like Tealbook?

S: It's simple. It's really easy. To get started, we just need a vendor master list (names, addresses, etc.). Ideally, we need more than a legal name to automate the process; if we don't have more, we have to do some manual manipulation to make sure we're getting a 100% match. Our clients send us their messy files and we take them and implement. Then, for the strategic component, where I'd say we've been very successful more recently, we partner with change management firms who look at the data.

They're helping our customers prioritize and then build the strategies and processes. We did that initially but, as a tech company, we're not super equipped for it… We are building a more sophisticated customer success [model], and we really want to enable partners to do this work. We're building partnerships where our partners can build some really nice businesses on the change management and the integration. However, it's nothing like what you would pay, or the heavy lifting you'd have to do, with a larger firm.

J: Like a manual spend analysis mandate or something like that…

S: Exactly. It's simple. Then it's really more about the change management and how you're driving that change. In Tealbook, you have the data and you pick your filters. Our interface is optional; you don't have to use it. However, we find customers really like it. It looks like LinkedIn, so there may be a good use case for non-procurement users. We have a client, a large insurance company, with 4K marketing suppliers right now. They want to start using the interface to capture connections. They want their buyers to start rating and tagging suppliers, adding themselves as category leads, and rating those 4K suppliers. That's a really good use case.

Another client, who had Ariba, said, "No, no, we've committed to Ariba; we're not going to introduce another interface. We don't want to confuse our buyers." But when they saw Tealbook, coupled with the fact that they had a really strong diversity mandate, they rolled it out to help those buyers find small and diverse businesses, and that's how it started. Now, they've expanded their use case to integrate their category strategies: all the preferred suppliers for a category come up first, and they were able to use Tealbook to do that.

Again, these are more strategic decisions on how you're going to roll it out. For me, as a buyer, Tealbook is an app. As long as my company subscribes to it, I upload all my contacts, I can add all my suppliers, and basically build a visual rolodex of suppliers. I can sort by who's doing business with my organization, by spend, by geography, by diversity, by similarity. Then, if I have the network, I can expand and find suppliers similar to the ones I'm already doing business with. If my mandate is discovery or innovation or diversity, it's super easy to use.

J: So, if I'm a buyer, it would be as simple as calling up my favorite IT person for my ERP of choice and getting them to send me an extract of my vendor database to start? If my company's subscribed to Tealbook, I can upload that data and start getting more insights.

S: Yeah. It's really cool. We used to provide a report. Now, it's in the product as a dashboard. You see all your data come to life. You see how many unique suppliers you have. You see your validated diversity spend. [You see your] pipeline, the potential. You see categories, clusters, and unclassified suppliers with suggested classifications. The "wow factor" comes upfront, and then it's driven by your strategy and how you're using it.

J: Very cool. Well, I want to be respectful of your time. I know we’ve been talking for a while now. Is there anything that you want to leave with my audience or people listening now in terms of how they can reach you or anything you’re currently working on that would be of interest?

S: Listen, I really appreciate you inviting me to speak. We talked initially about not being promotional so I’m sorry it’s a lot about Tealbook. I’m just so passionate about what we’re doing.

J: No. It feels like you're early in a market where there aren't a lot of players, right? So, when we talk about tools to address these issues, well, your company is the natural use case to point to, right?

S: I think the message for the audience is think about your data strategy.

Can you answer, with confidence, that you have good data, or even that you have a data strategy? If the answer is no, really question why not. Question your leadership. If you're an executive, question your team. Why don't we have a data strategy? What is our data strategy? How do we prepare for an ever faster-changing environment and add more value to the organization in a way that scales? I'd start there.

If you’re interested in contacting us, we’ve got a great team here that would love to dive more into some of your challenges and come back with the best way we can leverage our technology to help you prioritize those challenges and address them.

You can contact us on our website, tealbook.com. You can request a demo. I’m also very outspoken on LinkedIn. If you’re interested in following me, it’s Stephany with a “Y”, obviously with Tealbook. You can contact me. I’d be happy to have a conversation and make sure that you’re well taken care of and speaking to the right people on our team.

J: Awesome. Well, thanks so much for taking the time Stephany, I appreciate it. I know you got me excited on a couple of fronts that now I need to do further reading on. I’m sure it’s the same for the audience as well, so thanks again for taking the time.

S: Yeah. Thank you very much. Take care. Bye.

———————————
Do you believe good vendor data is critical to reaching your digitization objectives? What hurdles have you faced on the cleansing journey? What lessons learned can you share? Let me know in the comments.

If you liked this post, why not Subscribe

Last Updated on January 6, 2021 by Joël Collin-Demers

6 Rules for Your Procurement Digital Transformation
https://www.pureprocurement.ca/rules-for-your-procurement-digital-transformation/ Thu, 02 Apr 2020

Tanker plane refueling a jet below
Switching from a legacy system to a new system is a lot like transferring materials from one plane to another in flight! It’s not simple.

In Deloitte’s 2019 Global CPO Survey, 53% of the 481 top procurement leaders surveyed reported that they are not satisfied with the results of digital technology implementations within their organizations. Why is it still so hard to get a Procurement digital transformation right? I think this is due to two main factors.

First, the general maturity of the Procurement technology market. All the big “best of breadth” Source to Pay software providers (SAP Ariba, Coupa, Ivalua, Jaggaer, etc.) are still developing their solutions at full speed. Loads of functionality is released every year. This tells us providers are still playing catch up to support a wide array of existing complex requirements in Procurement organizations in all industries. If they were developing functionalities to support novel use cases (e.g. leveraging Artificial Intelligence or Blockchain), the pace of releases would be slower. True “innovation” takes time.

This is important because it means any approach that relies on a single vendor will result in gaps. However, this is not what vendors would have you believe during the sales process. You can't really blame them, as they face a prisoner's dilemma when pitching potential customers: if their solution is not marketed as the best on the market, their competitors' solutions certainly will be. This creates a lack of transparency for organizations that complicates making good decisions.

Second, the tendency for organizations to equate digital transformation with software implementation. If you, for example, launch a digitalization initiative for your requisition approval processes without first looking at organizational structure harmonization, process/workflow optimization, data governance and identity/user management, you will quickly run into trouble. Instead of having transformed your legacy procurement processes with the help of digital tools, you will have digitalized your legacy processes (and all current exception management efforts). The nuance is a subtle one but it is the difference between success and failure in these types of endeavours.

So how can you pull off a successful digital transformation in procurement on the first try? Here are 6 rules to consider:

1. Catalog, Understand and Optimize Your Organizational Structure and Processes Before Thinking About a Tool

Whether it be for sourcing, contract management, purchasing, receiving or accounts payable, you should take stock of your organization structure and the complete list of processes executed by each business unit. What differences exist? Are they justified by specific requirements or could structures and processes be harmonized? Working on process harmonization before implementing a tool will minimize your configuration/development costs. All units will be able to execute similar processes in the new solution. If you can’t harmonize, you will know why and can plan accordingly.

2. Data First, Technology Second.

If you don't have a handle on the rules and processes (creation, changes, approvals, etc.) governing your key Procurement data (vendors, prices, catalogs, payment terms, materials, etc.), address this before introducing new technology into the picture. Data quality is essential to extracting the full benefits of a Source to Pay solution. Furthermore, high data quality is only possible when owners are identified, made responsible for specific data items, and measured on data quality metrics for those items. Otherwise, you will find yourself in a Ferrari with no tires. You can still drive it on the rims, but the ride is definitely not what you had envisioned at purchase!
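One way to start putting numbers on vendor master quality, before any tool is selected, is to count likely duplicates. This is a generic illustration with made-up vendor names, not a prescription for any particular system:

```python
# Hypothetical sketch: flag likely duplicate vendors by grouping on a
# normalized name key; duplicate count is one data-quality metric an
# accountable data owner could be measured on.
from collections import defaultdict

vendors = ["Acme Inc.", "ACME, Inc", "Beta Ltd", "Gamma Corp", "acme inc"]

groups = defaultdict(list)
for v in vendors:
    # Keep only lowercase letters and digits so punctuation and
    # capitalization differences collapse to the same key.
    key = "".join(ch for ch in v.lower() if ch.isalnum())
    groups[key].append(v)

duplicates = {k: names for k, names in groups.items() if len(names) > 1}
print(duplicates)  # prints {'acmeinc': ['Acme Inc.', 'ACME, Inc', 'acme inc']}
```

Tracking a metric like "duplicate groups per 1,000 vendors" quarter over quarter gives the data owner something concrete to be measured on.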

3. Buy for Functionality and Total Cost of Ownership (TCO), Not for Lowest Up Front Price

When the time comes to think about software, do your research and trust your instincts. Yes, the typical RFI process can be useful, but it is not sufficient. Find ways to see the software from your shortlisted vendors in action at other companies. Ask questions to identify the gaps versus the processes you catalogued previously. Which modules will work for you? Which ones will come up short? Involve IT to consider the context of your ERP architecture.

Be realistic and only select the package of modules that will work for you from a "best of breadth" vendor. For the rest, I advise you to go the "best of breed" route. This approach will probably cost you more on paper from a license fee perspective, but you can be sure the total cost of ownership (TCO) will be higher if you force everything into a single solution. After implementation, you will struggle for years trying to fit a square peg into a round hole.

4. Start Your Software Implementation Slowly

Once you’ve secured your software licenses, take your time to implement. Again, it will be more expensive on paper to do it this way, but less expensive than having repeated delays and project reboots. Organizations regularly overestimate their capacity to adopt change, especially if they haven’t migrated to new systems in the recent past. Implementing new IT systems is like transferring passengers from one plane to another while both planes are flying… Easier said than done. A great place to start is with Sourcing and/or Contract Management modules. These solutions usually involve smaller populations of users and allow your Procurement digital transformation team to cut its teeth before attacking more complex, integration-heavy modules such as catalog management, purchasing, goods receiving, invoice management, etc.

5. Build Experience into Your Transformation Team

Ideally, when preparing for a multi-year digital transformation initiative, you should try to recruit team members with previous experience in Procurement digital transformation. This will give your organization instant maturity that will serve you well as you run into issues during solution deployment. Experience on the specific technology you are implementing is a plus but not essential. You are looking for reflexes. Just make sure that from a timing perspective, you give these new additions to your team enough time (ideally 12 to 18 months) to assimilate your business before they are tasked with designing its future.

6. Begin with the End in Mind

Once you’ve “switched passengers from one plane to another” and deployed your new system, who will be the flight crew? Who will be the maintenance crew? Before designing a system, you should have an idea of what your operations will look like once it is deployed. A new system means new types of tasks to support and maintain the applications you’ve deployed. The folks who will execute administrative tasks related to the tool should be involved in the project from the beginning to start climbing their learning curve early.

Conclusion

Of course, there are other factors to think about as you start a digital transformation journey. However, by nailing down these 6 main ones, you’re much more likely to find yourself in the “satisfied CPO” column the next time a digital transformation survey is sent your way.

———————————
Do you see any rules to add to this list? Do you disagree with any of my rules? Why? Let me know in the comments.

If you liked this post, why not Subscribe

Last Updated on January 4, 2021 by Joël Collin-Demers

Top 5 Ways to Build Procurement Stakeholder Engagement
https://www.pureprocurement.ca/top-5-ways-to-build-procurement-stakeholder-engagement/ Fri, 14 Feb 2020 13:34:34 +0000

Executive being engaged by Procurement department
Demonstrate the VALUE you’ll deliver for your Procurement stakeholder in order to move the conversation forward.

As Procurement, we generate value for the business in a number of ways. These all require Procurement stakeholder engagement:

  • Stimulating supplier innovation
  • Finding cost savings opportunities
  • Negotiating terms & conditions that create better outcomes for the business
  • Etc.

Without your internal customers’ involvement, it’s almost impossible to generate value for the business. You are left in the dark with regard to business needs and requirements. Correctly sizing these elements is critical to defining how you will generate value.

So what are you to do when your stakeholders are not engaged to the level you need to be effective? Here are 5 tactics you can use to boost procurement stakeholder engagement within your business:

#5 – Organize a Procurement Requirements Review Meeting

One of the ways you can work on Procurement stakeholder engagement is with a quarterly (or monthly if more appropriate) procurement requirements / pain point review meeting.

This is effective even if you don’t have an ongoing sourcing project for your stakeholder’s category. It allows you to ask questions about the products and services that they use and how they could be better. Which suppliers do they come from? Would it be better for them if X, Y or Z? The answers will give you leads to investigate how you can improve your internal client’s outcomes. It might require a new RFP, or it could be as simple as making a few phone calls.

If you follow up after the meeting and deliver value, you can bet engagement will be higher with your stakeholders.

#4 – Have Lunch With Your Stakeholders

As you may have guessed while reading #5, Procurement stakeholder engagement is built on the underlying trust in the relationship. So, what do you do if a stakeholder refuses your invitation for a requirements review? “I’m too busy.” “Maybe another time.” This means you need to work on trust before you work on Procurement.

That’s why the second method I am proposing is the good old-fashioned lunch. If you are met with resistance to an initial requirements meeting, propose to simply get to know them a little better over lunch. Make it impossible to refuse. If they are at another location, go to their location. They only have 30 minutes? No problem. You don’t seem like someone they want to collaborate with after the lunch? “You’ll never have to speak to me again!” Everyone has to eat.

Make the exercise about genuinely getting to know them. What is their backstory? How long have they been at the company? What were they doing before? What do they enjoy/find most difficult about their current role? I guarantee pain points will eventually pop up. Then it becomes a question of finding an appropriate time to make your elevator pitch.

If they are concerned that Procurement will simply be another bottleneck in their operations, you need to be ready to demonstrate how you will deliver the promised value.

#3 – Create an Internal Procurement Newsletter

Once you’ve initially engaged a stakeholder, the challenge then becomes how you will keep them engaged on an ongoing basis. One great method that requires no additional technology is creating an internal, highly contextual monthly newsletter for your key stakeholders (similar to the one you get when you subscribe to PureProcurement.ca!).

Create an email format you will reuse from month to month (e.g. ongoing projects with status, supplier spotlight of the month, key procurement department announcements, key industry news, etc.). Provide VALUE, be consistent and, after a few months, seek feedback. I’m certain you will find that a portion of your stakeholders really appreciate the time taken to inform them. It also keeps you top of mind for their next procurement needs.

#2 – Shadow Your Stakeholders on Their Turf

Another technique you can use to boost Procurement stakeholder engagement is shadowing. If your stakeholders’ main opposition to meeting with you is lack of time, offer to shadow them, a colleague or a resource on their team for a day or two to get a sense of their day-to-day reality.

From there, use your observation skills to see what value you could bring to the table with your procurement expertise. If they are so strapped for time it is probably because they are dealing with lots of exceptions/firefighting. Can any of these be attributed to goods/services purchased? What about the underlying purchasing process (e.g. a really inefficient goods receipt process)?

Sell the exercise as having zero downside for your stakeholder: “I will spend a day getting to know you without disturbing you. I’ll show you any interesting findings. You get the final say on whether we launch a procurement initiative together as a result of my findings”. If you remove the obstacles in the way of your involvement in good faith, you should be able to open doors that have been closed so far.

#1 – Seek Executive Support

You’ve tried organizing a requirements review session, a lunch, a newsletter and a shadowing session, but your stakeholders are still not giving you an inch to work with them? That’s when it’s time to seek management support.

When you’re not able to get an opening to start building trust yourself, you may be able to leverage the trust others have built up with your stakeholder. Chances are your manager will know the best way to approach the given team/business unit to reach a given outcome. An introduction/follow-up from a manager can go a long way to opening up the dialogue.

Your management will certainly help you if you illustrate the steps you’ve already taken to attempt to build Procurement stakeholder engagement. After all, they should be the people who most want to see you succeed: you are working to help them reach their organizational objectives!

Conclusion

Regardless of the tactic you use to build Procurement stakeholder engagement, remember that you are always working on improving two things:

  • Trust
  • Value you generate

You build trust by understanding your stakeholder and delivering on your promises. Your promises need to be linked to generating value for your stakeholder. Do this and you will eventually break through any barriers.

———————————
What other tactics have you used to keep your Procurement stakeholders engaged? How successful were you at generating long term engagement? What were the keys to your results? Let me know in the comments.


Last Updated on January 6, 2021 by Joël Collin-Demers

Top 5 Agile Methods to Leverage in Procurement
https://www.pureprocurement.ca/top-5-agile-methods-for-procurement/ Mon, 27 Jan 2020 01:17:16 +0000

kanban board being used for management of Procurement department
Using a Kanban board in procurement will give you instant visibility into the status of ongoing activities.

In nature, cross pollination can yield wonderful new varieties of plants. As bees and other insects buzz between different plant species, magic can happen and a new type of plant is born.

I think this is a great analogy for one of the ways you can bring innovation to your field; that is, getting inspired by methods and tools that work in other fields, applying them to yours when relevant and evaluating the results to see if you’ve got something.

In this spirit, I think there is tremendous potential to “level up” your procurement operations by applying Agile methods to them. At their core, Agile methods are centered around improving communication, collaboration, feedback and trust in software development. However, who says we can’t use these tools for our procurement teams?

So here we go, the Top 5 Agile methods for use in Procurement. Most of these work in a Procure-to-Pay setting but all of them apply in a strategic sourcing and contracting setting.

#5 – The “Stand Up” Meeting

One of my favorite Agile tools that transfers well to Procurement is the ‘Stand Up Meeting’. Book a periodic (2-5 times a week), short meeting (15 min max) with your immediate team where each member answers the following 3 questions:

  • What was accomplished during the last period
  • What is on the “To Do” list for the next period
  • What, if any, “blockers” are preventing them from doing their work

A timer is set and a scribe has a calendar/email application open. When action items emerge from the exercise (need for an offline meeting, need to notify someone, new task, etc.), the scribe plans/notes the actions and involves the appropriate people. Once everyone has answered the 3 questions, the meeting ends and everyone goes about executing their next period and addressing the action items.

No details should be discussed at the meeting. It’s an insurance policy to ensure everyone is spending their time on the appropriate items and to plan activities to course correct as needed.

It takes a while before a team gets the hang of the stand up meeting. The leader and team members need to keep each other honest and on point. By experimenting, you’ll find the right periodicity for maximal value. This might also change over time depending on how busy you are as a team.

#4 – Mood Marbles

Another very easy Agile tool to implement in procurement is the “mood marbles” concept. At its core, this tool is about getting a sense of the long-term mood trends of your team.

The original concept is implemented by having three sets of marbles (green, yellow and red) for each member of your team and having them periodically select one to place it in a centrally placed transparent jar. Green is happy, yellow is neutral (or trending to red), red is angry/sad/overworked. With this simple system, everyone on the team gets a visual on the overall mood of the team and can collectively react and adjust when needed.

Of course, you don’t need marbles to implement this concept. It can be as simple as a piece of paper. You could also use one of those airport bathroom cleanliness “gizmos” and ask everyone to press it after your stand-up meeting. It could also be implemented using sophisticated software tools such as Officevibe (made in Montreal!).

Regardless of how you implement the concept, the important part is how you react to tangible information about the mood trends of your team. A happy team is a team that stays and grows together.

#3 – The Backlog

The Backlog is an essential Agile software development tool that can be easily repurposed for Procurement. It is essentially a list of prioritized items with a short description of the task to complete.

However, there are key differences between a backlog and a list:

  • The items are prioritized by the requesters for each item (based on the criteria you determine)
  • Each item needs a rough effort estimate assigned to it (so the backlog can be used for high-level capacity planning)
  • There need to be defined rules to update/change the backlog (e.g. a monthly review session)

This tool can be used for items as simple as capturing the continuous improvement wish list from team members and working on them in priority. Or, it can be used for items as essential as the planning of sourcing activities with your key procurement stakeholders (to complement your Sourcing/Contracting workflow system, for example).

A backlog can also be implemented with technology as simple or sophisticated as needed. It can be an Excel file with permissions, a Trello board (free web-based tool) or a Jira board (enterprise tool). You could even build it via reports in your S2P suite (e.g. Ariba, Coupa, Ivalua, etc.).
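To make the structure concrete, here is a minimal Python sketch of a procurement backlog with requester priorities and rough T-shirt-size effort estimates. The size-to-days mapping, item names and numbers are all illustrative assumptions, not a standard:

```python
from dataclasses import dataclass

# T-shirt sizes mapped to rough effort estimates in person-days
# (made-up values; calibrate against your own team's history).
TSHIRT_DAYS = {"S": 2, "M": 5, "L": 10, "XL": 20}

@dataclass
class BacklogItem:
    description: str
    requester: str
    priority: int      # 1 = highest, set by the requester
    tshirt_size: str   # rough order of magnitude estimate

    @property
    def effort_days(self) -> int:
        return TSHIRT_DAYS[self.tshirt_size]

def prioritized(backlog):
    """Return the backlog sorted by requester priority."""
    return sorted(backlog, key=lambda item: item.priority)

backlog = [
    BacklogItem("RFP for MRO gloves", "Plant A", priority=2, tshirt_size="M"),
    BacklogItem("Renew IT staffing contract", "CIO office", priority=1, tshirt_size="L"),
    BacklogItem("Benchmark freight rates", "Logistics", priority=3, tshirt_size="S"),
]

# Rough capacity planning: total effort currently sitting in the backlog.
total_effort = sum(item.effort_days for item in backlog)
print([item.description for item in prioritized(backlog)])
print(f"Total backlog effort: {total_effort} person-days")
```

The same fields (description, requester, priority, estimate) map directly onto columns in an Excel file or cards in Trello/Jira.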

#2 – The Kanban Board

Next up on the list is the Kanban board. Whether or not you have a backlog set up, you can use a simple Kanban board in a central space to give everyone on the team a sense of the status of all ongoing projects.

In the sourcing or contract management context, the board can be as simple as five columns on a whiteboard with the following statuses:

  • To Do
  • In Progress
  • Blocked
  • Done
  • Cancelled

Each sourcing/contracting activity is a large post-it on the board and includes the owner’s name and key project info. During your periodic stand-up meeting, these post-its act as a checklist to ensure the team has touched on all important items in flight. If the status has changed since the last time the team was briefed on it, the post-it is moved to the column that reflects its new status. Any stakeholder can also get a sense of the high-level status of any project and the volume of work being done by your team by glancing at the board.
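For teams that prefer a digital board over a whiteboard, the five columns above can be modeled with something as simple as a dictionary. A minimal Python sketch, where the card names are invented for illustration:

```python
# The five statuses from the whiteboard version of the board.
COLUMNS = ["To Do", "In Progress", "Blocked", "Done", "Cancelled"]

board = {column: [] for column in COLUMNS}
board["To Do"] = ["Laptop RFP (J. Smith)", "Cleaning services contract (A. Lee)"]

def move(board, card, new_status):
    """Move a card to a new column, e.g. during the stand-up meeting."""
    if new_status not in board:
        raise ValueError(f"Unknown status: {new_status}")
    for column in board.values():
        if card in column:
            column.remove(card)
            break
    else:
        raise ValueError(f"Card not found: {card}")
    board[new_status].append(card)

move(board, "Laptop RFP (J. Smith)", "In Progress")
print(board["In Progress"])  # ['Laptop RFP (J. Smith)']
```

Tools like Trello implement exactly this column-and-card model, so the whiteboard, the dictionary and the SaaS board are interchangeable representations of the same information.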

#1 – Sprints and Retrospectives

As you work on running sourcing and contracting events, other Agile tools to consider are sprints and retrospectives. A sprint is a predefined amount of your team’s capacity (e.g. two weeks). A retrospective is a meeting to look back at the sprint that just finished and brainstorm how the team can do better on the next one (productivity, morale, external perceptions, etc.). The retrospective can also be combined with a sprint planning review to see what is scheduled in the next sprint.

In that planning session, each category manager and their team would pick and choose items to deliver from the backlog based on the priority set by your stakeholders. They would come to a consensus on what they believe they will be able to fit in the sprint based on the initial rough order of magnitude (RoM) estimate for each task (often referred to as the item’s “T-shirt size”). Rinse and repeat over time.

The big differences between traditional management and this way of working are that:

  • The reality of changing priorities and dynamics is at the center of your capacity management activities
  • Continuous improvement is built into the process
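The pick-and-choose step can be sketched as a simple greedy capacity-filling exercise in Python. The item names, priorities and effort numbers below are hypothetical placeholders:

```python
def plan_sprint(backlog, capacity_days):
    """Greedily fill a sprint with the highest-priority items that fit.

    `backlog` is a list of (description, priority, effort_days) tuples,
    where priority 1 is the highest and effort is a rough RoM estimate.
    """
    sprint, remaining = [], capacity_days
    for description, priority, effort in sorted(backlog, key=lambda t: t[1]):
        if effort <= remaining:
            sprint.append(description)
            remaining -= effort
    return sprint, remaining

backlog = [
    ("Renew IT staffing contract", 1, 10),
    ("RFP for MRO gloves", 2, 5),
    ("Benchmark freight rates", 3, 2),
]
sprint, slack = plan_sprint(backlog, capacity_days=12)
print(sprint)  # ['Renew IT staffing contract', 'Benchmark freight rates']
print(slack)   # 0
```

In practice the team negotiates this selection by consensus rather than by algorithm, but the sketch shows why rough estimates and explicit priorities are prerequisites for sprint planning.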

BONUS – The Playback

The last tool I would add to the list is the “Playback”. The playback is a meeting or conversation where you re-state your comprehension of a topic in your own words to your stakeholder(s) to validate your understanding. In the context of a sourcing project, for example, you would hold a playback session after you’ve gathered requirements from all your stakeholders to confirm a single version of the requirements and the tradeoffs to consider for the event. This confirms you didn’t miss anything, clears up any inconsistencies and puts all your stakeholders on the same page before going out to market. This certainly isn’t a new concept, but it is formalized in the Agile methodology.

Conclusion

While not all procurement tasks lend themselves to integrating Agile methods, there’s a whole lot that do. By playing with these methods in your procurement teams, I am certain you will learn valuable information about how your team can achieve more with less while having more fun in the process.

———————————
What other Agile methods do you think could be used in procurement? Do you see value in using Agile methods and tools in procurement? What other domains could bring value to procurement? Let me know in the comments.


Last Updated on January 4, 2021 by Joël Collin-Demers

Mastering Maintenance Catalogs with SAP Ariba Catalogs
https://www.pureprocurement.ca/ariba-catalogs-for-maintenance/ Mon, 20 Jan 2020 03:12:28 +0000

Maintenance teams are underserved by the standard indirect purchasing Source to Pay (S2P) technology use case. This is true of all leading S2P suites (SAP Ariba, Ivalua, Coupa, Jaggaer, Zycus, etc.). Thankfully, you can drive benefits in the maintenance department by leveraging the catalog management modules of S2P technologies. By using maintenance catalogs, you can drive efficiency in your maintenance planning process and get through your maintenance backlog. The following article details how you would use a catalog management platform (CMP) to achieve this in the maintenance context. Furthermore, I touch on the benefits associated with such an endeavor and how to go about building your business case.

I’ve used SAP Ariba Catalogs with an SAP Materials Management / Plant Maintenance back end system in the article examples. However, the general concepts could apply to any catalog management platform / back end combination.

Problem

The typical process usually starts in the S2P application with a requisition and/or shopping cart. From there, users search materials and/or services to build their requisitions. The requisition is then submitted for approval to initiate the rest of the process (PO, GR, Invoice).

This poses a few problems for maintenance purchases as they usually need to be tied to a work order to track maintenance costs.

First, to associate a purchase to a work order with the above use case, you would need to import all active work order numbers into your S2P tool. This is necessary so users can select these orders as account assignment objects to assign the costs of their purchase to the work order. This usually isn’t practical because of high work order volumes, turnover and lack of additional descriptive data on the work order.

Second, while you may be able to make this approach work from a technical perspective, it is still impractical from a user experience perspective. As maintenance planners spend most of their time in the maintenance system to plan work orders (e.g. SAP PM), having them log into a separate system to do their purchasing is inefficient and prone to error.

This approach would also negatively impact maintenance reporting on work orders as we would only see the part/service cost to complete work orders in the maintenance module and would need to log into another system to see the line item detail.

Third, the above use case forces you to have different processes to buy consumable items (purchases) and to reserve inventoried items (reservations) for the same work order, since inventoried items must be reserved through the work order itself.

Given these reasons, applying the typical S2P use case to maintenance purchasing is not optimal. Thankfully, you can improve outcomes for maintenance using another use case provided by most S2P technologies: maintenance catalogs.

The Proposed Solution : Maintenance Catalogs

The great thing about S2P solutions is that they are very modular. Like a Swiss Army knife, we can pick and choose the tool that serves our purpose for the job at hand.

One of the S2P implementation guiding principles I strongly believe in is keeping users in a single system as much as possible (or at the very least making it seem so). In a maintenance purchasing scenario, this means keeping users in the maintenance system where the work order lives. Therefore, maintenance purchases should be managed with a “direct” purchasing use case. Translation: the purchasing process (Req – Approval – PO – GR – Invoice) should be carried out in your back end system (i.e. SAP MM) and stem from the work order, just as your direct purchases would stem from a Material Requirements Planning (MRP) system.

Therefore, to better serve maintenance up front, before we even get to the requisition, the idea is to integrate purchasing catalogs directly with the work order. This way, maintenance planners can lookup parts and services in catalogs as they are building their work orders instead of searching through material masters as they do today. This works by integrating a catalog management platform (CMP) with your back end system (i.e. SAP PM) so that users can punch out to catalogs directly from the work order without having to sign into a new system. Here is an illustration of how this use case would function in an SAP system:

High level architecture of the use of a catalog management platform such as SAP Ariba Catalogs for maintenance parts and services
Using a Catalog Management Platform for maintenance parts and services is a game changer.

1. SAP Master Data.

The SAP master data needed to support the catalog and requisition creation process (vendors, material masters, units of measure, etc.) must be imported into your CMP (i.e. Ariba Catalogs). Ideally, you also set up a periodic delta data load to keep data up to date in the catalog management platform.

2 & 3. Hosted Catalogs & Punch-Out Catalogs.

With data loaded into the platform, you can start creating maintenance parts and services catalogs that your users will be able to leverage when building work orders.

Hosted catalogs are catalogs you build yourself and manage directly on the CMP on the vendor’s behalf. You simply build these catalogs with the data you’ve loaded from your system. This can involve a lot of work but is the simplest way to get your maintenance catalogs up and running quickly.

Punch-Out catalogs are provided and hosted by your vendor. They should be representative of signed agreements with vendors. Usually vendors will be able to provide up-to-date and relevant information with less effort than if you put together a hosted catalog (i.e. pictures, specifications, stock amounts on hand, etc.).

However, this type of catalog requires much more work to implement because you need to coordinate the activity with an external entity. Then, you must make sure that you have mappings set up between all the items in the vendor’s catalog and the material masters in your CMP. This is required because users will be browsing the vendor’s site for parts, and when they “check out” to come back to the CMP, we need to know which SAP materials they’ve selected.

Also, you need to have an index of the vendor catalog in your CMP if you are aiming to provide a central search experience. Otherwise, when a user searches for materials in the CMP, they won’t find anything in the vendor punch-out catalogs. They would need to search each vendor catalog individually…
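To illustrate why both the mapping and the search index matter, here is a toy Python sketch of the two structures side by side. The vendor names, part numbers and material numbers are invented for the example; a real CMP manages these structures for you:

```python
# Mapping from (vendor, vendor part number) to the SAP material master,
# needed so that punch-out cart items can be resolved to SAP materials.
vendor_to_material = {
    ("ACME", "GLV-1001"): "MAT-000123",  # nitrile gloves
    ("ACME", "BRG-2210"): "MAT-000456",  # ball bearing
}

# Local index of the vendor's catalog content, needed so that punch-out
# items show up in the CMP's central search instead of being invisible
# until the user browses that specific vendor's site.
search_index = [
    {"vendor": "ACME", "part": "GLV-1001", "text": "nitrile gloves large box 100"},
    {"vendor": "ACME", "part": "BRG-2210", "text": "ball bearing 22mm sealed"},
]

def central_search(term):
    """Return (vendor, part, SAP material) for index entries matching a term."""
    term = term.lower()
    hits = []
    for entry in search_index:
        if term in entry["text"]:
            key = (entry["vendor"], entry["part"])
            hits.append((*key, vendor_to_material.get(key)))
    return hits

print(central_search("bearing"))
# [('ACME', 'BRG-2210', 'MAT-000456')]
```

Without the index, `central_search` would return nothing for punch-out content; without the mapping, a hit could not be turned into a requisition line against the right material master.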

Once you’ve defined your catalog maintenance strategy, you can also maintain the prices and permissions for these catalogs if you don’t want all your users to have access to all the data.

In any case, I’m sure you can see how the complexity of managing such a solution can rapidly grow out of control. So, start small, start simple and go from there.

4. Work Order

Just as your maintenance planners do today, they will plan their work orders directly in the maintenance system (i.e. SAP PM). However, when it comes time to plan parts and services, they can enter material masters as they do today or they can press a new “catalog” button that will give them access to the functionality we are discussing in this article.

5. Search & Shopping Cart.

When maintenance planners click the “catalog” button in their work order, they punch-out to the CMP where they land on a search page. From there, they can:

  • Search for parts and services with free text
  • Drill down in the catalog item hierarchy to find a specific part/services
  • Punch-out directly to a vendor catalog (if they know which vendor catalog they want to browse)

With a free text search, if you’ve got a combination of hosted and punch out catalogs (with search indexes) setup, your planner will see all parts and services that match the search term (think Amazon search).

If they select a hosted item, they can add it to their cart then and there. However, if they select an item in a punch-out catalog, they will be sent to the vendor’s site to finalize the selection and, when finished, the item will be added to their cart in the CMP.

This continues until the planner has added everything they need to their cart. At this point, they “check out” of the CMP. This brings all the selected items back to the SAP work order and enters them just as if they had been keyed in manually. The end result is a work order with all the parts and services to be purchased/consumed for this work order. Rinse and repeat for each new work order.

I’m assuming here that you would set up catalog items for both inventoried and non-inventoried items (which makes sense as it gives the planner a single user experience). This means that the information on which items are to be consumed/reserved vs. which items are to be bought needs to be maintained in your catalog items as well. Once the line items are in the Work Order, the SAP configuration settings take care of the rest (just as before implementing a CMP).

6. Requisition to Pay.

You would carry out the rest of the purchasing process As-Is in your SAP system using the standard transactions (ME51N, ME21N, MIGO, MIRO, F110). If you have approvals (Release Strategy) configured on your requisition or purchase order, they would trigger as usual. The only other thing we could change here is to put Vendor Network technology in place to help automate the Procure-to-Pay process but that’s a subject for another article…

The same general architecture could apply with any catalog management tool set. Of course, the functionalities and limitations would depend on the specific tool chosen.

Maintenance Catalog Benefits

Tangibly, the main benefit of this improvement in your maintenance purchasing process is an improved search experience for maintenance planners and technicians when looking for parts and services. With better, more intuitive search (all parts in one place, pictures, specs of parts, vendor stock levels, etc.), the maintenance department can find what they are looking for faster and move onto the next task at hand.

And, it is my belief that a better search experience will lead to better performance in the maintenance department overall:

  • Better search = Faster planning.

Why?

  • Better search = Fewer duplicate material and service master creation requests and fewer “text-only” items in work orders when frustrated planners give up on their search
  • Fewer duplicates and “text-only” items = Better, more accurate reporting of consumption and overall maintenance spend
  • Better reporting = Better maintenance and procurement outcomes because of better decision making enabled by accurate information.

And, as a bonus:

  • Better outcomes = Happier teams where members stay for the long term!

However, to ensure this logic applies, it’s important to ensure you are enabling a critical mass of content with the first maintenance catalogs you put in place. Otherwise, you will lose the engagement of maintenance planners that log in and find a bare catalog platform the first time they try to search parts and services.

It’s also important to think about how you will maintain and support the catalog solution over time. Catalogs and user needs will evolve and change over time. When implemented, catalog management becomes an ongoing activity.

Building the Business Case for a Catalog Management Platform Project

The key to building the business case for implementation of a CMP tool is answering the following questions:

  • How much time are maintenance teams spending on search when planning work orders?
  • How much time would they spend if they found what they were looking for within 30 seconds?

The difference between the two multiplied by your average maintenance planner salary is your benefit in dollars. How much additional maintenance time does that buy? How big is your maintenance backlog? How many more work orders could you get to with this planning time optimization?

Once you start looking at the numbers, a maintenance catalog initiative could be a very interesting proposition indeed!
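As a worked example of that arithmetic, here is a quick back-of-the-envelope calculation in Python. Every figure below is a placeholder to swap for your own data:

```python
# Hypothetical inputs; replace with numbers from your own organization.
planners = 20                 # maintenance planners in scope
search_hours_per_week = 5     # current time spent searching, per planner
search_hours_after = 1        # target once catalog search is in place
loaded_hourly_rate = 55       # average loaded planner cost, $/hour
weeks_per_year = 48           # working weeks per year

# Hours redirected from searching to planning, and their dollar value.
hours_saved = (search_hours_per_week - search_hours_after) * planners * weeks_per_year
annual_benefit = hours_saved * loaded_hourly_rate

print(f"Planning hours redirected per year: {hours_saved}")
print(f"Annual benefit: ${annual_benefit:,}")
```

With these placeholder inputs the model frees up thousands of planning hours a year; dividing that by your average time to plan a work order tells you how many additional backlog items you could get to.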

———————————
What has your experience been with Catalog Management Platforms? Do you see other benefits of using a CMP for maintenance teams? What should folks watch out for when implementing these solutions? Let me know in the comments.


Last Updated on January 8, 2021 by Joël Collin-Demers
