Decades of Lean and continuous improvement initiatives have profoundly changed the way many companies operate.
We have moved from a traditional organization, with a hierarchy of leaders who make decisions and implementers who obey, to a network of men and women who each bring their knowledge to improve performance a little more each day.
At the end of the 1990s, I attended a presentation by a Japanese executive who compared the approaches then dominant in Western countries with those in use in Japan. He pointed out that in the West we progressed mainly through successive step changes: major investments and new technologies that each time brought us to a new level, but that once there, we tended to stagnate while preparing for the next breakthrough innovation. We would invest in new equipment, then let it deteriorate, get dirty and rust. Conversely, he said with a hint of arrogance, we Japanese never stop trying to improve once we have reached a level, which ensures that we stay ahead of you, and that the step to the next level will be smaller.
We now recognize the power of this approach and have adopted it widely. We are smarter together. We also recognize the power of training and managerial support to help all of our teams progress, on the one hand through the knowledge of methods and tools – problem solving, statistics, 5S, DDMRP, etc. – and on the other hand by sharing clear visions on common values and the direction in which the company is heading.
I was thinking about this recently when I heard a company’s objection to DDMRP: the sizing of the buffers, and in particular the red zone, seems too simple, even simplistic. This company prefers to explore more sophisticated techniques: probabilistic approaches, artificial intelligence. It seems more reassuring to them, more scientific.
In addition, this company manages tens of thousands of references, so automation is required to productively handle this complexity. This is legitimate.
Those who know me know that I am a bit of a geek. I have been interested in neural networks since the 90s, I have tried to learn the basics of data science, and I enjoy popularizing these approaches through tools like Power BI to make data sets talk. Any process of continuous improvement starts with observation and measurement, and we are better and better equipped for that.
AI And The Supply Chain
I am convinced that Artificial Intelligence can help improve a supply chain management model. Our work at Demand Driven Technologies confirms this, but we do not think it will be the alpha and omega of the supply chain! While artificial intelligence will improve supply chain performance, there will remain a critical role for human intelligence and involvement.
For example, the industry is already familiar with the black-box phenomenon that planners are often confronted with, and which leads them to rebuild an understandable logic outside the system, in Excel. Artificial intelligence carries the same risk. We have not come all this way to replace the “boss who knows” with an “artificial intelligence that knows.”
Let us not forget that the data we manipulate is not purely scientific. For example, a supplier’s delivery performance may depend on intangible elements: the quality of the visibility we give them, the attention we pay to them, the quality of the relationship between the contacts on both sides. If you apply a machine learning algorithm to your supplier’s delivery history, or to customer demand, to assess variability, will it tell the whole story, and will it give you the real levers for improvement?
I have no doubt that Artificial Intelligence can help in decision making, mainly by helping planners to identify exceptions to analyze, and by giving clues on settings to adjust.
However, the real driver of progress is visibility, which feeds into team-driven improvement loops (PDCA). Do not be afraid if the initial setting of your DDMRP buffers is “about right”. This is exactly the logic that has brought considerable progress through the Kanban method: you start pragmatically, and then you improve.
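To make the “about right” point concrete, here is a minimal sketch of the conventional DDMRP buffer-zone arithmetic: the yellow zone covers demand over the decoupled lead time, the red zone adds protection scaled by lead-time and variability factors. The numbers used are illustrative assumptions, not recommendations.

```python
# Minimal sketch of conventional DDMRP buffer-zone arithmetic.
# All inputs below (ADU, DLT, factors) are illustrative assumptions.

def ddmrp_buffer_zones(adu, dlt, lead_time_factor, variability_factor, moq=0):
    """Compute green/yellow/red zones for one stocked item.

    adu: average daily usage (units per day)
    dlt: decoupled lead time (days)
    lead_time_factor, variability_factor: profile coefficients in (0, 1]
    moq: minimum order quantity, which may raise the green zone
    """
    yellow = adu * dlt                       # demand over the decoupled lead time
    red_base = yellow * lead_time_factor     # base protection against lead time
    red_safety = red_base * variability_factor  # extra protection for variability
    red = red_base + red_safety
    green = max(yellow * lead_time_factor, moq)  # order-cycle zone
    return {
        "red": red,
        "yellow": yellow,
        "green": green,
        "top_of_green": red + yellow + green,  # the full buffer height
    }

zones = ddmrp_buffer_zones(adu=20, dlt=10,
                           lead_time_factor=0.5, variability_factor=0.5)
# yellow = 200, red = 100 + 50 = 150, green = 100, top of green = 450
```

The point of the sketch is that the factors are deliberately coarse: they give a starting point that the team then refines through the improvement loop, exactly as with Kanban.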
We use a “Smart Buffer Profiler” in our solutions. This wizard analyzes historical data and proposes buffer profiles for all stocked items. Experience shows that this process is very efficient, even on data sets containing tens of thousands of items, and that it establishes relevant sizing. Not exact, but relevant!
But this is only the beginning! The software is not that “smart”, it doesn’t really know your suppliers, your means of production, the life of your items, the behavioral biases of the players in your supply chain, it doesn’t know how to question the constraints of batch size or lead time, etc. You must rely on your teams who do have this knowledge.
Much more than inventory sizing or forecasting algorithms, what you need is visibility, ease of reading and analysis, collaboration and a shared vision of what’s important for your teams to drive continuous improvement, your best ally for an ever more efficient and adaptable supply chain.