"What if?" is perhaps one of the more critical and, in some ways, daring games IT, operations, logistics and business managers can play. Performing well as a company today is largely a product of reliably guessing what will happen tomorrow.
Knowing the future is also useful for evaluating upgrades to existing processes and new technologies -- gaming out the practical results of those changes before committing to them. And to play out all of these what-if scenarios, savvy managers now have powerful analytics tools and methodologies at their disposal, including business planning simulation, that make the guessing less daring but no less critical.
Forecasting is not nearly enough
In many industries, the practice of envisioning what the world of tomorrow might look like has long been called forecasting -- projecting next year's numbers based on this year's real numbers with some new hypothetical factors mixed in. For example: "What can sales reach if we increase our marketing budget by n dollars?" Or, "How much can we reduce inventory if we implement a new just-in-time production model?"
Forecasting has its place, but it suffers from inherent linearity, rooted in metrics already well-known from a company's current models. The entire concept of forecasting is essentially incremental, searching for safe places where the status quo can be nudged.
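To make that linearity concrete, a forecast of this kind is often little more than a straight-line extrapolation. The sketch below is a minimal illustration, not any vendor's method; the revenue figures and the assumed return per marketing dollar are invented for the example.

```python
# A minimal linear forecast: project next year's sales from this year's
# actuals plus one hypothetical lever (marketing spend). The coefficient
# is an assumption baked into the model -- the root of forecasting's
# inherent linearity.

def forecast_sales(current_sales, extra_marketing, return_per_dollar=4.0):
    """Project sales as current actuals plus a fixed, linear uplift."""
    return current_sales + extra_marketing * return_per_dollar

# "What can sales reach if we increase our marketing budget by n dollars?"
baseline = 10_000_000       # this year's real number
uplift_budget = 250_000     # hypothetical extra marketing spend
projection = forecast_sales(baseline, uplift_budget)
print(projection)  # 11000000.0 -- the model can only nudge the status quo
```

Whatever coefficient is chosen, the projection moves in lockstep with the input; the model has no way to represent the interactions and feedback loops discussed below.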
But the complexity of the modern enterprise has made that technique less effective: relationships with partner companies have grown more interdependent, and data more transitory and intertwined. Timing is now the biggest obstacle to capturing an accurate snapshot of the status quo, and technology, strangely, has often been more a barrier than an accelerator of communication between partner companies.
Simulation now supplants forecasting. Today's complex models may be altogether nonlinear: butterfly effects caused by slight disruptions, shortages or slow in-the-moment decision-making can rapidly cascade into serious performance downturns. Effective models of business operations and performance simulations -- far more complicated to build and use than mere forecasts -- are mission-critical tools today.
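That nonlinearity is easy to demonstrate with a toy model. The sketch below is a hypothetical illustration, not a production simulation: each stage of a five-stage supply chain inherits the slowdown of the stages before it and may add its own, so a small per-stage disruption compounds into a disproportionate end-to-end slip.

```python
import random

def simulate_lead_time(stages=5, base_days=2.0, disruption_prob=0.1,
                       disruption_factor=3.0, runs=10_000, seed=42):
    """Monte Carlo estimate of end-to-end lead time when each stage's
    delay multiplies through to every downstream stage."""
    rng = random.Random(seed)
    totals = []
    for _ in range(runs):
        carry = 1.0   # compounding slowdown inherited from upstream
        total = 0.0
        for _ in range(stages):
            if rng.random() < disruption_prob:
                carry *= disruption_factor   # a slight disruption compounds
            total += base_days * carry
        totals.append(total)
    return sum(totals) / runs

smooth = simulate_lead_time(disruption_prob=0.0)  # no disruptions: 10 days
noisy = simulate_lead_time(disruption_prob=0.1)   # 10% chance per stage
print(smooth, noisy)  # the noisy mean far exceeds a linear 10% adjustment
```

A linear forecast would pad the 10-day baseline by roughly 10%; the simulation shows the compounding disruptions push the expected lead time far higher -- exactly the kind of result incremental forecasting misses.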
In-house simulation of a business's operations -- with its intricate and expensive modeling and data support requirements -- is tough enough. But how can a planning simulation be expanded beyond the enterprise to encompass the world it serves?
Partners in simulation
As more companies enter into partnerships, the data needed to run simulation models is no longer sourced entirely in-house but comes from partner companies. Simulations that can project tomorrow's status quo -- and the impact of possible methodology and technology changes -- can't be fueled entirely from home base; they need critical input from outside partner systems. And, increasingly, the shelf life of that input is shrinking as lead times contract and the reliability of those outside numbers declines.
If a business model needs fresh numbers now to feed a planning simulation, then the numbers from partnership companies are likely to be from simulations as well. So, how is it possible to keep those simulations in sync?
The answer is simple and so is the hands-on solution. Partnership companies in areas such as supply chain relationships, healthcare consortiums and finance industry alliances can create simulations of their collective processes together. They can rebuild their worlds together, and learn together how to improve their processes, strengthen their relationships and optimize their collective performance.
Just a short time ago, this level of integration would have been impossible, or at least prohibitively expensive and time-consuming. It's now possible for partner companies to create elaborate, fine-tuned simulations of their collective activity for predictive and remedial purposes, using cloud technology that's convenient, easy to use and readily available. It's just a matter of discussing it with the partners and pulling the trigger.
Launching a planning simulation
We'll use Microsoft as our example simply because its terminology is widely known: A shared Azure instance hosts the simulation resources. Data from partnered clouds and systems enters via Azure Data Factory -- a configurable pipeline service -- and is deposited in a data lake, where it can be processed on Spark or Hadoop clusters and loaded into Azure Databricks.
There, the data can be accessed, reworked and fashioned into a wide range of models, all of which can be manipulated in SQL, Scala, Python or R. Put another way, participating partners can code their portions of the simulation in their language of choice.
Participants each maintain their code in notebooks, and notebooks can call other notebooks to execute complex processes, even when those notebooks are written in different languages. Large-scale simulations can thus be constructed on a common platform, drawing data as needed from external sources. Finally, reporting platforms can tap directly into the data lake.
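On Databricks specifically, a driver notebook can invoke others with `dbutils.notebook.run()`, but that API only exists inside the platform. As a platform-neutral stand-in, the sketch below models the same pattern in plain Python: a parent "notebook" passes parameters to partner-owned "notebooks" and combines their results. The partner names, functions and figures are invented for illustration.

```python
# Platform-neutral stand-in for notebook chaining: each partner exposes
# its portion of the simulation behind a common callable interface, and
# a driver composes them -- analogous to dbutils.notebook.run() on
# Databricks, where each callee could be written in a different language.

def supplier_notebook(params):
    """Hypothetical partner simulation: projects component lead time."""
    return {"lead_time_days": params["base_lead_time"] * params["demand_factor"]}

def logistics_notebook(params):
    """Hypothetical partner simulation: projects shipping time."""
    return {"transit_days": 3.0 + 0.5 * params["lead_time_days"]}

def driver(scenario):
    """Parent 'notebook': chains partner simulations into one what-if run."""
    supply = supplier_notebook(scenario)
    logistics = logistics_notebook({"lead_time_days": supply["lead_time_days"]})
    return supply["lead_time_days"] + logistics["transit_days"]

total = driver({"base_lead_time": 4.0, "demand_factor": 1.5})
print(total)  # 12.0 -- end-to-end days for this what-if scenario
```

On the actual platform, each of these functions could live in its own notebook -- in SQL, Scala, Python or R -- with the driver passing parameters instead of dictionaries.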
We don't need to single out Microsoft. This kind of notebook-based analytics doesn't have to be staged in Azure; a number of other cloud platforms offer similar functionality and ease of use, including Amazon EMR, the Domino Data Science Platform and H2O.ai's advanced analytics platform.
The power to share simulation resources is a strategic planning advantage that can reshape industries. For once, the technology is not only available, but it's also less complex and less costly than simulation itself. So now is the time for competitive businesses to consider taking the planning simulation route.