A study at Zara shows what companies must do to optimize human-algorithm interactions.
The human-algorithm relationship is crucial to the tools’ effectiveness.
Companies are turning to algorithms for recommendations on everything from which job candidate may be the best fit to what products will tempt their customers. But to get the most out of these advanced analytical tools, companies must remember a key element: the human managers who will be using them. How those managers interact with the algorithm is crucial to its effectiveness.
To understand this interaction, IESE Business School’s Anna Saez de Tejada Cuenca and UCLA Anderson’s Felipe Caro examined how managers used an algorithm-based decision support system (DSS) during seven clearance markdown campaigns at Zara. The algorithm made the recommendations, but it was up to the human managers to adopt or override them.
In a pilot test, Zara found that managers who followed the DSS recommendations increased revenue from clearance sales by almost 6%. That led Zara to roll out the DSS across its stores and franchises worldwide.
But then something changed: managers started ignoring the DSS recommendations, sometimes more than half the time. Moreover, they would lower prices when the system recommended keeping them the same, or apply more aggressive markdowns than recommended. The result: managers who deviated more often from the DSS advice generated less revenue.
What happened? Human biases kicked in. Specifically, managers were used to receiving weekly reports showing current inventory levels, and they typically made decisions aimed at selling off nearly all remaining stock as quickly as possible.
[This article has been reproduced with permission from IESE Business School. www.iese.edu/ Views expressed are personal.]