Using AI to secure the global supply chain

With global supply chains becoming increasingly complex, manufacturers are starting to turn to AI to help untangle the web. The results could have huge national security implications.
Written by Tony Baer (dbInsight), Contributor

According to leading supply chain analyst Louis Columbus, the biggest challenge facing the high tech industry in 2019 will be securing its supply chains. The challenge is that, as supply chains grow more complex, manufacturers may not be able to answer with confidence the question of who their suppliers really are. In the good old days, supply chain issues centered on product quality and cost; with today's high tech products, national security could be at risk. As fresh evidence, Columbus cites a recently published Bloomberg Businessweek investigation into how Chinese subcontractors snuck rogue spyware chips onto motherboards for servers allegedly supplied to high profile clients like Apple and Amazon – incidents that the companies have not publicly acknowledged.

PricewaterhouseCoopers (PwC) has launched Know Your Vendor, a platform service for manufacturers to reduce uncertainty surrounding vendor risk and reliability. For complex global supply chains, that's a task that's much harder than it sounds. You might identify your primary suppliers, but when they run out of capacity, most downstream manufacturers may not know who actually supplied a particular subcomponent upstream. That lack of visibility is what allowed the rogue spyware chips to sneak their way into finished products before anyone noticed.

The PwC service harnesses machine learning to help manufacturers open a window on their supply chains and understand and mitigate risk exposure and potential compliance issues. Specifically, it helps identify the vendors in your supply chain, which can be a challenge when your primary suppliers subcontract out work (which is how those rogue chips turned up in the Bloomberg-cited case). Based on data from a variety of sources, the service also helps manufacturers quantify the risk of conducting business with those vendors and the resulting impacts on compliance mandates.

The task is certainly complex, but what differentiates PwC's approach is that it assembles big data and then trains machine learning models to help deliver the conclusions. According to a principal on the project, today's highly complex, often changeable global supply chains make such tasks too big for humans alone to get their arms around. Automating the solution requires the familiar combination of data science, data engineering, and domain expertise.

PwC built its service on the Koverse Intelligent Solutions Platform. Koverse, a Seattle-based company founded by NSA veterans, provides a data integration, indexing, and search platform built atop Hadoop, using Apache Accumulo instead of HBase. Accumulo is a swap-in replacement for HBase, a key-value store running in Hadoop that supports security down to the "cell" (column and row) level. It was originally developed for the NSA because, at the time, Apache Hadoop lacked adequate security, and NSA's requirements for data security were far more stringent than most private sector companies needed. The Koverse platform differs from traditional data warehouses in that it labels and indexes data on ingest. That's key because of the diversity of data sources it ingests, ranging from known sources such as internal financial systems to email, customer and supplier portals, social media chatter, and sources from the "dark web."
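To make cell-level security concrete, here is a minimal, illustrative sketch of the idea in Python. This is not the Accumulo API (which evaluates boolean visibility expressions server-side during scans); the labels, authorizations, and data are all made up for illustration.

```python
# Simplified model of Accumulo-style cell-level visibility (illustrative only).
# Each cell carries its own visibility label; a scan returns only the cells
# whose label is satisfied by the reader's authorizations.

def can_read(visibility, user_auths):
    """Return True if a user's authorizations satisfy a cell's visibility label.
    Supports simple '&' (AND) and '|' (OR) forms, e.g. 'audit&legal'."""
    if not visibility:
        return True  # unlabeled cells are readable by anyone
    if "|" in visibility:
        return any(term.strip() in user_auths for term in visibility.split("|"))
    return all(term.strip() in user_auths for term in visibility.split("&"))

# One table can mix sensitivities because labels live on individual cells.
cells = [
    {"row": "vendor-001", "col": "price", "visibility": "",             "value": 12.5},
    {"row": "vendor-001", "col": "risk",  "visibility": "audit",        "value": "HIGH"},
    {"row": "vendor-002", "col": "risk",  "visibility": "audit&legal",  "value": "LOW"},
]

def scan(cells, user_auths):
    """Filter a scan down to the cells this user is cleared to see."""
    return [c for c in cells if can_read(c["visibility"], user_auths)]
```

In this sketch, a user holding only the `audit` authorization would see the unlabeled price cell and the `audit`-labeled risk cell, but not the cell requiring both `audit` and `legal`.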

For instance, a manufacturing customer seeking to identify instances of product counterfeiting could scan customer portals, social media, and other sources, using natural language processing to surface relevant complaints. In this scenario, it might find complaints from retailers reporting that other stores are selling the same product for 90% less.
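A toy version of that filtering step might look like the following. This is a deliberately trivial pattern-based stand-in for the trained NLP models the article describes; the cue words and the 50% discount threshold are invented for illustration.

```python
import re

# Illustrative complaint filter: flag text from portals or social media that
# suggests counterfeit or gray-market goods. A real pipeline would use trained
# NLP classifiers, not hand-written rules.

PRICE_CUE = re.compile(r"(\d+)\s*%\s*(less|cheaper|below|off)", re.IGNORECASE)
COUNTERFEIT_CUES = ("fake", "counterfeit", "knock-off", "not genuine")

def flag_complaint(text):
    """Return a label for suspicious complaints, or None if nothing stands out."""
    text_l = text.lower()
    if any(cue in text_l for cue in COUNTERFEIT_CUES):
        return "counterfeit-language"
    m = PRICE_CUE.search(text)
    if m and int(m.group(1)) >= 50:  # deep discounts are a counterfeiting signal
        return "suspicious-discount"
    return None

complaints = [
    "A store down the street sells the same model for 90% less.",
    "Shipping took two weeks but the product works fine.",
    "Pretty sure the unit I got is a knock-off, the logo is wrong.",
]
flags = [flag_complaint(c) for c in complaints]
```

The 90%-less complaint from the article's scenario trips the discount rule, the neutral shipping complaint passes through, and the knock-off report is caught by the counterfeit-language check.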

Getting to that point requires a combination of supervised and unsupervised learning in a walk-before-you-run approach. Starting with a known data set, such as an internal financial system where questionable transactions are already flagged, the first task is supervised learning to build an initial model, which is then gradually expanded by adding new training data sets. Once the initial model reaches accuracy rates in the 90th percentile, it can be unleashed in unsupervised mode on new or lesser-known data sets and new scenarios.
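The walk-before-you-run pattern can be sketched in a few lines of standard-library Python. The single-threshold "model," the transaction amounts, and the accuracy gate are all illustrative; a production system would use real ML tooling and far richer features.

```python
# Walk-before-you-run sketch: learn from labeled data first, then apply the
# model to unlabeled data only after it clears an accuracy gate.
# All names, numbers, and thresholds here are illustrative, not PwC's method.

def train_threshold(labeled):
    """Supervised step: find the transaction-amount threshold that best
    separates already-flagged questionable transactions from clean ones."""
    best_t, best_acc = 0.0, 0.0
    for t in sorted({amt for amt, _ in labeled}):
        acc = sum((amt >= t) == flagged for amt, flagged in labeled) / len(labeled)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t, best_acc

def apply_model(threshold, amounts):
    """Scoring step: once accuracy clears the gate, score unlabeled data."""
    return [amt >= threshold for amt in amounts]

# Known data set: transactions with existing questionable/clean flags.
labeled = [(120, False), (95, False), (15000, True), (40, False), (9800, True)]
threshold, accuracy = train_threshold(labeled)

scores = []
if accuracy >= 0.9:  # stand-in for the article's 90th-percentile accuracy gate
    scores = apply_model(threshold, [50, 12000, 300])  # new, unlabeled data
```

The key design point mirrors the article: the model earns its way onto unfamiliar data sets only after proving itself on data where the answers are already known.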

While getting the model right sounds complex, the surprising thing is that "the math is easy" compared to the task of wrangling data. That's consistent with our findings that data scientists spend upwards of 80 to 90% of their time cleansing and harmonizing data – a task that, even in the best of circumstances, could be reduced to perhaps half their load.
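As a small example of what that harmonization work looks like in practice, consider reconciling supplier names that arrive in different forms from different source systems. The normalization rules and company names below are made up for illustration.

```python
import re

# Illustrative data-harmonization step: collapse case, punctuation, and
# legal-suffix variants so the same supplier matches across source systems.
# Rules and names are invented; real entity resolution is far more involved.

SUFFIXES = re.compile(r"\b(incorporated|corporation|corp|inc|ltd|llc|co)\.?$")

def normalize_supplier(name):
    """Reduce a supplier name to a canonical form for matching."""
    n = name.strip().lower()
    n = re.sub(r"[.,]", "", n)          # drop punctuation
    n = SUFFIXES.sub("", n).strip()     # drop trailing legal suffixes
    return re.sub(r"\s+", " ", n)       # collapse whitespace

raw = ["Acme Corp.", "ACME Corporation", "acme,  corp", "Globex LLC"]
harmonized = {normalize_supplier(n) for n in raw}
```

Four raw records collapse to two canonical suppliers, which is exactly the kind of tedious but necessary cleanup that dominates the workload before any model sees the data.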

With the service now running for just over a year, there are tangible results to report. For a large consumer products manufacturer, PwC was able to literally map the customer's supply chain, which in a number of cases surfaced obscure suppliers with fourth- or fifth-party subcontracting relationships of which the CPG company was not previously aware. Using machine learning models, the service assigned confidence levels to the risk posed by each vendor and, in some cases, identified alternative sourcing strategies that reduced risk and regulatory compliance issues.

While you cannot (and of course should not) take humans completely out of the equation, deciphering global supply chains has grown sufficiently complex that it takes a machine to help figure things out. 
