A powerful explosion rocked a port terminal at a BASF facility in Ludwigshafen on October 17, 2016. Three people died and at least 30 were injured, eight seriously. The investigation is only in its initial phase, and only an interim report has been made available so far. Preliminary indications suggest the deadly explosion may have been caused by a worker cutting the wrong pipeline during routine maintenance and repairs.
Management at BASF, especially Chief Executive Kurt Bock and the human resources director, Margret Suckale, have spoken openly about the event, without assigning blame. Both have expressed a concern for the victims and their families and the desire to determine the cause of the accident.
This approach contrasts sharply with the response of Volkswagen, Europe's largest automaker, to its global diesel emissions scandal. In its communications since the fraud was uncovered by U.S. environmental regulators in September 2015, VW has referred only to its "diesel issue." At its core, Volkswagen admitted to having manipulated its diesel engines for years so that their emission levels would appear to meet environmental laws in the United States.
Since the wrongdoing was revealed, the public has found out little about how the breach occurred. We don't even know if the company acted negligently or deliberately. Instead, Volkswagen has focused from the outset on blaming a small group of low-level managers and engineers who are alleged to have operated without the knowledge or direction of the board.
If that indeed is the case, there remains the question of how Volkswagen plans to ensure in the future that information about critical developments reaches the management and supervisory boards so that it can be discussed and dealt with in a constructive manner.
But such failures of critical crisis communication in large organizations are not limited to Volkswagen. There have been a series of occurrences in recent years in which negative developments have "surprised" top management. These include the Siemens corruption scandal and ThyssenKrupp's disastrous steel mill projects in Brazil and Alabama, as well as Deutsche Bank's involvement in manipulating the Libor interest rate benchmark. After each came to light, senior managers expressed shock and claimed that they had not been informed of the problems. In some cases, the CEOs resigned.
But do such departures actually change anything within a company? Does the company take effective steps to prevent a repeat?
If we look closer at these crises, we see that the causes always involved multiple layers of management and that the negative developments did not creep in and spread unnoticed. What all the cases share is that there were no effective controls for detecting and preventing wrongdoing.
But sacking managers is not enough to save the day. Rather, the entire corporate and management culture must be scrutinized and changed. What is needed is a move from a culture in which errors are hushed up and punished towards an "open error" management climate, based on a sanction-free reporting system that prevents "surprises."
But what actually happens when someone spots an error or a wrong decision in a company? At ESMT business school in Berlin, we looked at how errors are dealt with inside companies, asking more than 300 European managers about their experiences. Almost all stated that they saw errors as completely normal and would not think twice about raising errors committed by others in the company, even ones made by superiors. Yet 88 percent said they would only discuss these errors with the person involved in private, in spite of their supposed normality.
Interestingly, only 54 percent wanted their own errors pointed out in private. Nineteen percent felt that their errors should be discussed openly with employees and colleagues.
Often, a so-called "open" discussion of errors is taken to mean a one-to-one conversation, even though with any fewer participants it would be a monologue. But why do superiors and employees seek a confidential conversation? Most have been conditioned to be ashamed of errors and to fear sanctions, so it is little wonder that many do not want to admit their errors openly. From an organization's perspective, however, this mindset impedes people from learning from their mistakes.
Amy Edmondson, the Novartis Professor of Leadership and Management at Harvard Business School, has found that many workers are only prepared to openly admit negative incidents or developments if they can be sure that neither they nor their colleagues will be sanctioned. A corporate culture that offers psychological safety, she has written, is a prerequisite for such openness. I would call it a basic requirement, but not enough to create a culture in which errors are seen completely differently, namely as something normal. Until such a culture exists, there will always be reasons why employees prefer to remain silent rather than admit or report errors, even with psychological safety.
In steep hierarchies, many employees keep quiet about their superiors' errors not simply out of fear, but out of respect for people in authority. A higher-level position is often associated with authority and expertise. An employee may discover an error by a superior but question whether it really is an error, and will often not raise it for fear of appearing foolish.
After several accidents, the U.S. Federal Aviation Administration and the National Aeronautics and Space Administration carried out a general investigation into air accidents in the early 1980s to gather more evidence and analyze their causes. The results indicated that accidents were largely caused by human error. Other investigations, conducted by the National Transportation Safety Board, showed that the great majority of air accidents happened when the captain was flying the plane. This finding came as a shock: the captain was, and is, the person with the most flying experience in the cockpit. As it turned out, the problem was the hierarchical disparity between the captain and the rest of the crew, none of whom dared to openly point out his mistakes. An open culture allowing the cockpit and cabin crew to question a captain's decisions and point out errors was lacking.
Under the leadership of the FAA, NASA, and the NTSB, and with the cooperation of several universities, airlines, and the U.S. Air Force, a program called "Crew Resource Management" was developed for the airline industry in the early 1980s. It requires that flight crews be trained not only in flying skills but also in soft skills such as communication and modern management. The introduction of the program was anything but smooth. Captains were upset; they felt that the new program threatened their authority and decision-making power, and many found the behavioral training insulting. It took almost a decade for them to accept the new protocols and recognize that open communication in the cockpit could benefit everyone, including them.
Following the string of accidents and scandals involving German companies, the question is whether such a radical approach can be introduced to boost openness and reduce errors. Aviation is a high-risk industry in which errors can have disastrous consequences. In other industries, most managers do not arrive at work each day knowing that they are responsible for the physical safety of hundreds of people. But they are in charge of business processes, the success of their departments, the job security of employees, ethical behavior, and the reputation and value of their company. So they have every reason to establish an error management culture in which errors are dealt with openly, and to put in place a reporting system that tells them when there are problems.
At the end of the day, communicating openly about errors is relevant to any organization. Errors must no longer be seen as weaknesses but accepted as normal; they should be identified, analyzed, and dealt with in a timely manner. BASF is proving a role model in how it is going about learning from the October 17 accident and avoiding a potential repeat. Other German companies would be well advised to follow its lead.
(Jan U. Hagen is Associate Professor and Head of the Practice Group Financial Services at ESMT.)