Simply Speaking: Fail better and you can fly higher

What if marketing had a black box? What if marketers ran campaigns the way pilots fly planes? Our mission-directed behaviours would be seriously exposed. We would crash a lot, but would we also learn a lot?

Published: Jun 27, 2022 07:30:19 PM IST
Updated: Jun 27, 2022 07:32:27 PM IST

Marketers should actively look for data on their errors so they can drive improvements. Image: Shutterstock

‘In an information-rich world, progress does not lie in the direction of reading information faster, writing it faster, and storing more of it. Progress lies in the direction of extracting and exploiting the patterns of the world… And that progress will depend on … our ability to devise better and more powerful thinking programs for man and machine.’
 
Herbert Simon, ‘Designing Organizations for an Information-rich World’, 1969.

One of the best accounts of the Middle Eastern part of the Silk Road is that of Ruy Gonzalez de Clavijo (d. 1412), who was sent as ambassador to Tamerlane by King Henry III of Castile and Leon in Spain. In his multiple audiences with the barbarian conqueror, never once did the taciturn Tamerlane seek praise; he inquired instead about his own deficiencies. The Castilian delegates were asked to study the supply chain of the Chagatai Mongols end to end and suggest improvements.

Napoleon Bonaparte rarely held a council of war, but he always had detailed post facto battle evaluations written up. What was done wrong? Why did it happen, and how could it be prevented from recurring? So detailed was his grasp of past events that the memoir he dictated in exile after Waterloo, The Memorial of Saint Helena, transformed his image from that of a bloody-minded despot to that of a fair-minded constitutionalist who had saved the Revolution and liberated Europe’s peoples. Published in 1823, the memorial rocketed to the top of the century’s bestseller lists. In it, he dwelt at length on his losses and faulty assumptions.

These examples, like numerous others, show us that great organisations and leaders tend to regard success as a journey of introspection. They want to discover their weaknesses and take care of them. They actively look for data on their errors so they can drive improvements. This flips the dynamic from ‘closed and defensive’ to ‘open and adaptive’. In many ways, this is the essence of the scientific method.
 
In 1953 and 1954, the British-made de Havilland Comet 1, the world's first commercial jet airliner, suffered a series of crashes in which aircraft broke up in mid-air. The root cause was metal fatigue: hairline cracks spreading from the corners of its square window frames. The Comet 1 is the reason airliner windows are rounded today. More significantly, Australian scientist David Warren, prompted by the Comet accidents, suggested that a near-indestructible flight data recorder, later called a ‘black box’, be installed in every aircraft. And so it came to be. The black box records thousands of pieces of data per second, along with the pilots' conversations in the cockpit, making it far easier to determine the exact cause of a crash. When a crash happens, the first thing to look for is the black box. In it lie the answers.

 
Air travel is a triumph of man’s scientific progress. With each crash, future flights become safer. The same is true of many other domains: nuclear reactor management, air traffic control, missile control on an aircraft carrier, navigating a submarine. All demand extreme attention to detail, error-free operations and deliberate planning against mistakes.
 
Where does marketing stand in comparison?
 
Alas, marketing can claim no such scientific method. We improve only spasmodically and often regress to earlier norms. What if marketing had a black box? What if marketers ran campaigns the way pilots fly planes? Our mission-directed behaviours would be seriously exposed. We would crash a lot, but would we also learn a lot? Marketing celebrates awards, impact and the creation of IP. It rarely even mentions failure, let alone spotlighting it. In corporations big and small, we rarely take such a process approach to content development or marketing innovation.

Human beings are wired by evolution to be inattentive to detail. We are reflexive, instinctual and lack radical acceptance. Bertrand Russell observed that self-deception is incompatible with the good life. Our feelings defend our inadequacies. Accepting reality is easy when you like what you see. But we must accept reality even when we do not like what we see, especially when it is bitter.
 
We live more than we remember. That is why keeping a logbook matters. Whenever you make a big decision, write down what is going through your mind: assumptions, conditions, decisions and conclusions. If the decision turns out to be a dud, open your flight data recorder and analyse precisely what led to the mistake. With each crash, remedial measures ought to make your future performance better. Self-criticism should be tested against the views of stakeholders. Invite criticism. Don’t retreat into a bunker.
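A marketing ‘black box’ does not need special tooling. As a purely illustrative sketch, and assuming nothing more than a shared, append-only file (the field names, file name and format below are my own assumptions, not a prescribed standard), a decision log can simply be structured entries written before the outcome is known and re-read when a campaign turns out to be a dud:

```python
import json
from dataclasses import dataclass, field, asdict
from datetime import date
from pathlib import Path

# Hypothetical file name; any shared, append-only location would do.
LOG_FILE = Path("marketing_black_box.jsonl")

@dataclass
class DecisionRecord:
    """One 'flight data recorder' entry for a major marketing decision."""
    decision: str                # what was decided
    assumptions: list            # what we believed at the time
    conditions: list             # market, budget and team context
    expected_outcome: str        # what we thought would happen
    decided_on: str = field(default_factory=lambda: date.today().isoformat())
    actual_outcome: str = ""     # filled in later, at review time
    lessons: str = ""            # what the 'crash investigation' found

def log_decision(record: DecisionRecord, path: Path = LOG_FILE) -> None:
    """Append the record as one JSON line so history is never overwritten."""
    with path.open("a", encoding="utf-8") as fh:
        fh.write(json.dumps(asdict(record)) + "\n")

def review(path: Path = LOG_FILE) -> list:
    """Re-read every past decision when it is time to analyse a failure."""
    return [DecisionRecord(**json.loads(line))
            for line in path.read_text(encoding="utf-8").splitlines() if line]
```

The specifics matter far less than the habit: record the assumptions before the outcome is known, and revisit them honestly after it is.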

Psychologist and Nobel laureate Daniel Kahneman is deeply pessimistic about our ability to change our behaviour; Harvard psychologist Steven Pinker is more optimistic, arguing in his 2011 book The Better Angels of Our Nature that rationality will overcome our baser instincts. Matthew Syed takes on this tension in Black Box Thinking: Why Most People Never Learn From Their Mistakes — But Some Do. His core premise is that detailed, independent investigation of our everyday screw-ups can prevent recurrences. If more data creates better outcomes, good evidence-based analysis in all areas of human behaviour should lift the blinders that prevent us from learning from our mistakes.


Things get more complex when we deal not with individuals but with groups, hierarchies and institutions. Geographies, divisions and functions within large corporations typically combine collegial protectiveness with an attitude of superiority that fends off outside scrutiny and regulation.

Marketers must recognise and showcase their cognitive flaws. The empirical method is superior to gut feel and unconfirmed experience. We cannot take a siloed, indistinct approach to complex issues while failing to rise above our own intellectual biases. For too long, marketing has celebrated ‘I feel it in my bones’ as a reason. As a matter of routine, nobody is held responsible for errors, and most management signals say ‘give me good news, not bad news’. Neither culture nor incentives focus effort on eliminating marketing errors.

Failure: Why Science Is So Successful, by Stuart Firestein of Columbia University, is at heart a book about ignorance. When we are blind to evidence, we cannot tell failure apart from success. At the end of that arc, marketing begins to look like fashion or art.

Gertrude Stein said that ‘a real failure does not need an excuse. It is an end in itself.’ Indeed, good failures are those that lead to thoughts, doubts, puzzles, paradoxes and inconsistencies. Great discoveries arise out of failed attempts that continue to incite and mystify. Success requires embarrassing failures, and it requires embracing them.

To those who claim that marketing is a living craft and cannot be dealt with as a gated, measurable process, I present a most compelling argument derived from evolution. Mutations are failures of DNA replication. Most have led nowhere or to extinction, while only a very small percentage have unpredictably moved us up the evolutionary ladder. We are the ‘work in progress’ of numerous genetic mishaps. No one should expect great marketing to arise without a similarly vast number of intervening failed steps.


Nuclear power stations are safety-critical, complex systems. Three Mile Island, Chernobyl and Fukushima are examples of occasions when things have gone wrong. The measures for safety are focussed on the system itself. The process aims at a multi-layered, multi-factor defence to prevent mistakes from becoming catastrophes. But control layers can increase the complexity, and even the unpredictability, of the system. What is needed, above all, is a learning culture, and that relies on the ‘error paradox’.
 
The error paradox was first noticed in healthcare, where hospitals that report the most errors and near misses turn out to be the safest, because hospitals that are open and honest about their mistakes learn lessons and reform. Hospitals that report fewer errors are not safer; they are more likely to be ignoring, covering up or obfuscating issues. This is not just about system design; it is about a culture that regards errors not as things to be spun or covered up, but as learning opportunities.
 
To learn from mistakes, a very particular kind of organisational culture needs to be created. Here are some of its characteristics:

1. Organisation-wide orientation

The organisation needs a common view of relevant data, with interlocking and coherent goals.
 

2. Integration

Integrative policy, management, and communications. Failures in complex projects often occur at interfaces between parts. Stress fractures occur at boundaries.
 

3. Extreme transparency and communication

Communication should flow horizontally as well as hierarchically: more, richer, deeper communication. Everyone must understand what is going on throughout the programme.
 

4. Configuration management

There must be a process in which huge effort goes into the initial design of a complex system, and in which interdependencies are tested, wherever possible by the relevant people, before any change is agreed upon.
 

5. Reinforce open communication

Build novel control centres to reinforce extreme communication. Spend money and time on new technologies and processes to help spread orientation and learning through the organisation.
 

6. Optimised structure

You need a complex mix of centralisation and decentralisation. While overall vision, goals, and strategy usually come from the top, extreme decentralisation must dominate operationally so that decisions are fast and unbureaucratic. Big complex projects must empower people throughout the network and cannot rely on issuing orders down through a hierarchy.
 

7. Fluid and purposeful system management

An obsession with approval processes is a sure sign of buck-passing, and it works against simplification and focus.
 

8. People and ideas are more important than technology

Remember that Colonel Boyd’s dictum holds: people, ideas, technology, in that order.

To conclude, each day we ought to remind ourselves of Beckett’s famous line: “Ever tried. Ever failed. No matter. Try again. Fail again. Fail better.”

For great success, we must keep trying, and failing, because that is the only strategy that avoids repeating the obvious.



Shubhranshu Singh is vice president, marketing - domestic & IB, CVBU, Tata Motors. He writes Simply Speaking, a weekly column on Storyboard18. Views expressed are personal.