How to Discover New Suppliers Using Community Intelligence
You would think that in the age of the internet, companies would have no trouble finding suppliers for just about anything they might need for their business. It turns out it’s not so simple. Internet searches can turn up potential suppliers, but are those suppliers financially sound? Can they ship to your desired location? What is their performance record? Do they have a lot of returns, or invoice disputes?
There’s a lot of effort that goes into finding trustworthy suppliers, so much so that companies sometimes engage market research or analyst firms to help. Sometimes it even turns out that the best supplier is one your company already buys other categories of goods or services from, but the buyer with the new need doesn’t always have the insight to know that an existing supplier could fulfill it. Ouch!
Thinking About the Problem
We learned a lot about this problem last year while we were working with customers to develop Perfect Fit Insights, and that got us thinking about how we could use some of the work we were doing on Perfect Fit to help solve it.
For several years now, Coupa has published a benchmark report based on community intelligence from the transactional data on cumulative customer spending that has run through our platform. What we were doing with Perfect Fit Insights is bringing those performance benchmarks right into the platform and surfacing them at the relevant transactional moments. For example, if you’re approving expense reports, your company’s average time to approve expense reports is displayed alongside the cross-company best-in-class average. If you’re under-performing the benchmark, you can drill down to see prescriptions for improvement.
While we were working on Perfect Fit, we were also working on a capability called Risk Aware, which pulls in community intelligence about suppliers and supplements it with third-party information such as credit ratings, financial and judicial information, and public news and sentiment, in order to give a more complete picture of supplier health.
We realized that we could mine the same data sets to help our customers find qualified suppliers and leverage the Perfect Fit technology we just built to surface those insights during the sourcing process. That inspired us to develop our second community intelligence-based application, Supplier Insights.
Simple and Well-Defined
Our concept was simple and well defined: Build a faceted search capability that would allow you to type the name of an item or a service that you’re trying to buy, and find a list of potential suppliers. Then you can filter to find those that meet certain criteria that are important to you.
Our plan was to add a “Supplier Insights” tab to the sourcing module to help sourcing managers discover additional suppliers and either email them to ask them for more information, or just automatically invite them to the event.
We knew we had the data we needed and the ability to surface the insights, but there was still some suspense around exactly how we were going to build the functionality.
User Centricity and Suite Synergy
In keeping with one of our guiding design principles, user-centricity, we got feedback from some customer power users and internal subject matter experts, and quickly realized there were other users besides sourcing managers who could benefit from this capability. Though it meant more work, we were excited about this, because suite synergy (maximizing the use of capabilities we build across the platform) is another of our guiding principles.
So, we added a “Supplier Insights” tab to the Suppliers module.
We also embedded it in the procurement workflow in two different places. There’s one placement in the requisitions workflow. In that workflow, when an end user cannot find what they want in a catalog, they can submit a freeform request which is then routed to a buyer for review.
Typically, the user will just indicate which items they need to buy but they will not pick a supplier, since they don’t have any information to help them do that. So, we added an option for the buyer to launch “Supplier Insights” to try to find good matches for the requested item.
Finally, we also added it to search analytics, which provides buyers with all sorts of interesting information about what people are trying to buy. We added the ability to launch Supplier Insights from two different reports: one that shows the most searched-for items, and one with the lowest-converting catalog items. The buyer can click on any item in the table to start the capability and search for suppliers that might better satisfy end users.
In the finished capability, you can look specifically for new suppliers from the community or you can look specifically for your existing suppliers to see if you can buy more from them. You can filter results based on star ratings from other Coupa customers. You can look for Coupa Advantage suppliers. You can also filter based on ship to countries. Embedding this capability into the procurement and sourcing journeys delivers it to the right users in the context in which they would use it.
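As a rough illustration of the faceted filtering described above, here is a minimal sketch in Python. The record fields (`item_keywords`, `ship_to_countries`, `rating`, `existing`) and the sample data are assumptions for the example, not Coupa's actual schema:

```python
# Minimal sketch of faceted supplier search. Field names and sample
# records are illustrative assumptions, not Coupa's actual data model.
suppliers = [
    {"name": "Acme Logistics", "item_keywords": {"pallets", "crates"},
     "ship_to_countries": {"US", "CA"}, "rating": 4.5, "existing": True},
    {"name": "Globex Packaging", "item_keywords": {"crates", "foam"},
     "ship_to_countries": {"DE", "FR"}, "rating": 3.9, "existing": False},
]

def search(term, min_rating=0.0, country=None, existing_only=False):
    """Return supplier names matching the search term and the selected facets."""
    results = []
    for s in suppliers:
        if term not in s["item_keywords"]:
            continue                                  # keyword match
        if s["rating"] < min_rating:
            continue                                  # star-rating facet
        if country and country not in s["ship_to_countries"]:
            continue                                  # ship-to-country facet
        if existing_only and not s["existing"]:
            continue                                  # existing-supplier facet
        results.append(s["name"])
    return results

print(search("crates"))                  # both suppliers sell crates
print(search("crates", min_rating=4.0))  # only the higher-rated one
```

Each filter simply narrows the candidate set, which is what makes faceted search easy for users: they start broad and layer criteria until the shortlist is manageable.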
We built a simple, intuitive workflow with just a few screens. It’s actually pretty difficult to build such a streamlined experience on the front end. It’s even more work on the back end.
The “Oh, sh*t” Moment
The hardest part was building the index to consolidate all of the supplier data. Our data set had grown considerably since we worked on Perfect Fit, and there was a very high volume of data that had to be not only collected and analyzed, but sometimes also normalized, to make sure we wouldn’t repeat the same supplier under different formulations of its name.
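The idea behind that normalization step can be sketched simply. The rules and suffix list below are illustrative assumptions (the real pipeline uses machine learning and is far more nuanced), but they show how name variants collapse to one supplier:

```python
import re
from collections import defaultdict

# Illustrative normalization to collapse name variants of one supplier.
# The suffix list and rules are assumptions, not Coupa's actual pipeline.
LEGAL_SUFFIXES = {"inc", "incorporated", "llc", "ltd", "limited", "corp", "gmbh"}

def normalize(name):
    """Lowercase, strip punctuation, and drop common legal-entity suffixes."""
    tokens = re.sub(r"[^\w\s]", " ", name.lower()).split()
    return " ".join(t for t in tokens if t not in LEGAL_SUFFIXES)

# Group raw name variants under their normalized key.
groups = defaultdict(list)
for variant in ["Acme, Inc.", "ACME INC", "Globex GmbH", "globex"]:
    groups[normalize(variant)].append(variant)

print(dict(groups))
# {'acme': ['Acme, Inc.', 'ACME INC'], 'globex': ['Globex GmbH', 'globex']}
```

In practice rule-based matching like this only gets you part of the way, which is why the post's larger point about machine learning matters: misspellings, abbreviations, and subsidiaries need fuzzier matching than a suffix list can provide.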
For many years now, we have been investing in building data processing and analysis capabilities using machine learning and artificial intelligence for just this purpose. Here, there was not only a high volume of data, but also a lot of complexity because we wanted to be very careful with it. We needed privacy filters to ensure that we didn’t show any sensitive data. We wanted to make sure that the suppliers returned in results would be suppliers that several Coupa customers had done business with, and that the items had been purchased more than once and by several Coupa customers.
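Those community-visibility rules amount to a threshold filter over the transaction data. A toy sketch, where the threshold value and record shape are assumptions for illustration only:

```python
from collections import defaultdict

# Sketch of a community-visibility filter: only surface a supplier once
# several distinct customers have transacted with it. The threshold and
# the record shape are illustrative assumptions, not Coupa's actual rules.
MIN_CUSTOMERS = 3

purchases = [
    ("supplier_a", "cust_1"), ("supplier_a", "cust_2"), ("supplier_a", "cust_3"),
    ("supplier_b", "cust_1"), ("supplier_b", "cust_1"),  # one customer, repeated
]

# Count distinct customers per supplier (a set deduplicates repeat buys).
customers_by_supplier = defaultdict(set)
for supplier, customer in purchases:
    customers_by_supplier[supplier].add(customer)

visible = {s for s, custs in customers_by_supplier.items()
           if len(custs) >= MIN_CUSTOMERS}
print(visible)  # {'supplier_a'}
```

The same pattern applies at the item level: counting distinct buyers rather than raw transactions is what keeps any one customer's purchasing behavior from being exposed to the rest of the community.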
There is a lot of nuanced data processing needed to support a feature like this. When you’re using machine learning, the whole system gets more intelligent over time, but the more data you have, the harder it gets, because the data keeps growing and changing underneath you. On top of that, we were really pushing our limits in terms of providing the best matches and the most comprehensive result sets, and doing it in real time. It’s not as easy as taking data, processing it, and showing it. You have to run some logic to extract just the right bits of information, which turned out to be much harder than we thought it would be.
Every project has its “Oh sh*t” moment. This was ours.
We were stuck for about four days, evaluating possible solutions we could buy or build. In the end, we added a new technology stack that would enable us to do faster indexed search. We had to take some extra time to learn how to work with it, but we considered it a good investment because we also saw future applicability to scenarios we are envisioning for different modules of the platform.
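The post doesn't name the stack we adopted, but the core idea behind indexed search is easy to illustrate: instead of scanning every record per query, an inverted index maps each token to the documents containing it, so a query just intersects small posting sets. A toy version:

```python
from collections import defaultdict

# Toy inverted index, to illustrate why indexed search is fast. This is
# not the technology stack referenced in the post, just the general idea.
docs = {
    1: "industrial pallets and crates",
    2: "foam packaging and crates",
    3: "office furniture",
}

# Build the index once: token -> set of document ids containing it.
index = defaultdict(set)
for doc_id, text in docs.items():
    for token in text.split():
        index[token].add(doc_id)

def lookup(*terms):
    """Return the ids of documents containing every query term."""
    postings = [index.get(t, set()) for t in terms]
    return set.intersection(*postings) if postings else set()

print(lookup("crates"))          # {1, 2}
print(lookup("crates", "foam"))  # {2}
```

Production search engines layer tokenization, ranking, and distribution on top of this structure, but the build-once, query-many trade-off is the same one that made a dedicated search stack worth the learning curve.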
Putting the Plane Together
Building something like this is a bit like building an airplane. There are different teams putting together the components that will eventually be assembled into an aircraft. Along the way, we always do demos of different components to different groups, so you can see the wings and the landing gear and other parts being perfected, but there’s still some anxiety about how they’re all going to come together. Nobody has yet seen the whole thing; it still exists only in the imagination.
Once you get to the final assembly line, when all the pieces are nearing completion and it looks like they’re all going to fit, the excitement starts to build. At that point, we scheduled one big demo where everyone who had worked on the product was invited. We were able to touch the different pieces and enter keywords. The results seemed to be relevant, the displays of information seemed complete, and the filtering worked. It’s a moment of deep satisfaction when you see everything coming together like that, and everyone can feel proud of their work, and of being part of the team.
It’s even more satisfying when you can start demoing the whole capability to internal people, partners, and customers who were excited about it but not necessarily able to visualize it, and the feedback is good:
“This is exactly what we expected.”
“This adds tremendous value to the existing workflows in Coupa.”
Only the Beginning
While we considered the project a success, really it is only the beginning. As the system gets smarter, speed and relevance will get even better over time. But, you cannot walk away and blindly trust the machine. It’s very complex and will constantly need some tuning, especially as the data set from which we draw community intelligence grows. We already have plans to surface even more metadata about suppliers, perhaps as soon as the next release.
At Coupa we always like to quote a bit of wisdom from “The One Minute Manager” author Kenneth Blanchard: “None of us is as smart as all of us.”
That is the philosophical underpinning of what we are building on top of community intelligence. It’s also the story of Supplier Insights, and every great product: No one person is capable of doing it on their own. Even if you have very smart people that can do things very well individually, it’s the sum of all of them that delivers something that’s really outstanding.
Based on our growing ability to use machine learning and our enhanced search capabilities, we already have some new ideas about where we can apply community intelligence to bring new insights to our customers that they have never before seen in Coupa, or anywhere else in business spend management.