In the early 1990s, executives and managers welcomed information technology — databases, PC workstations, and automated systems — into their offices. They saw the potential for significant business gains. Computers wouldn’t just speed up processes or automate certain tasks — they could upend nearly all business processes and allow executives to rethink operations from the ground up. And so the reengineering movement was born.
Now it’s happening again. Powerful machine-learning algorithms that adapt through experience and grow more capable with exposure to data are driving changes in businesses that would have been impossible to imagine just five years ago. The PCs and databases introduced during the reengineering of the 90s have grown up: the rules-based code written by engineers is giving way to learning algorithms driven by the machines themselves. As a result, business processes are being machine-reengineered.
Algorithms aim to redesign business processes just as humans did during the original reengineering movement. Then, reengineering was limited by the speed of humans. Managers noted historical trends and revised processes, and engineers developed code that was then baked into computing systems. Every update or response to the market required multiple steps, each costing time and performance. Sometimes, by the time changes were in place, the market had already moved. With machine-reengineering, process changes are constant and driven not just by history but also by the predictive capabilities of machine-learning algorithms. Machine-reengineering asks people to train and actively manage the performance of the algorithms and data models that drive process change, rather than drive process change themselves.