Labyrinth: The Art of Decision-Making

by Pawel Motyl


  Let’s return to our business example, though. In the case of the company I was describing earlier, the diffusion of responsibility also led to a serious incident that could have ended in tragedy. The chemicals used in the production process were stored in a warehouse. Out of laziness, one of the employees decided to store one particular fluid close to the entrance, so it would be easier to transport it to the production area. The barrels were placed below a row of shelves where some canisters containing another substance were stored. This was against the rules, because the two chemicals would react if they came into contact, creating an asphyxiating gas. Pretty much all of the warehouse employees knew about this, but in true turkey mode, they assumed that since nothing bad had happened so far, the risk was acceptable. On top of that, owing to the collective indifference, nobody felt any need to inform anyone about the dangerous situation.

  Then, one day, while maneuvering in the warehouse, a forklift operator punctured one of the containers, and its contents started to drip onto the barrel below. As bad luck would have it, the liquid dripped directly onto a leaky barrel that no one had seen fit to report (“it’s always been OK”). The two substances reacted, and the resulting compound poisoned four workers, three of whom were hospitalized. According to the doctors who treated them, they owed their lives to the rapid responses of their colleagues, as breathing the gas in for only a few minutes more would have irreversibly damaged their central nervous systems.

  When an internal health and safety team spoke to warehouse workers during their investigation of the accident, they encountered identical attitudes to those exhibited by the thirty-eight people who stood back and let Kitty Genovese die. All the employees questioned denied responsibility:

  “It’s not part of my job description to report such things.”

  “Everybody knew about it. I was sure that management knew, too; after all, the logistics director was here not long ago.”

  “Someone must have reported it to management; seeing as they did nothing about it, it must have been okay to store it like that.”

  “No one else did anything about it, so why should I? That’s above my pay grade.”

  “There were more important matters.”

  The board took a lot of convincing that the statements were genuine.

  Decision-making in organizations is a complicated game between often conflicting forces and interests that can pull the decision-maker in different directions. The process is rooted in the evolution of the work environment, replete with conscious and unconscious habits, practices, and behaviors, many of which can seriously compromise the objectivity of the final decision. More and more often, then, the key to making the right choice is an acute awareness of these covert games and an ability to resist emotional pressures with the hard logic and concrete facts that only an inquiry approach can provide. This combination is what distinguishes authentic leaders—those who, regardless of the circumstances and pressures, have the courage to make Drucker-like correct choices. They not only introduce sound decision-making processes and create systems for analyzing data, but also shape an organization’s culture to promote an inquiry approach throughout the company, engage disaffected groups of workers who have unique knowledge bases, and prohibit personal emotions from interfering with a rational, cool-headed assessment of a situation.

  The NASA example shows that even the most professional and competent organization can undergo an invisible and dangerous cultural metamorphosis when it submits to the influence of external and internal pressures, exchanging former behaviors and attitudes for new ones with which everyone feels comfortable, even if that comes at the cost of safety, quality, or profits. This is actually characteristic of two very different types of business: large corporations and relatively small private companies. In the case of corporations, there comes a point at which purely rational arguments are overtaken by a desire for stability and maintenance of the status quo, and procedures and regulations are introduced, which gradually shape a culture of blind obedience. The same thing can happen in privately owned businesses, except that instead of procedures, it is the opinions and views of the owner that prevail. Here, too, cultural change evolves to ensure employees enjoy a quiet life, rather than being blinded by the harsh light of reality. The true purpose of leadership—to keep everyone focused and on task—becomes lost.

  Thus, in both large and small businesses, the cultural re-education process eliminates people bold enough to express their opinions openly, people who by nature are constructive devil’s advocates and whose presence is invaluable in a proper inquiry approach. The phenomenon is lethal, not only for the quality of communication and the depth of discussions, but also for leadership and the future of those with the potential to lead the company through important changes. Cultural inertia and pressure to conform can kill the engagement of even the most competent person...

  Unless, of course, they’re an authentic leader.

  7

  In Search of Authentic Leaders

  Leadership. There’s probably no other topic in business that exerts such fundamental influence over the quality of the decisions taken within a company and creates such a strong feeling of unease in me. Everyone talks about it. Google will find you over half a billion pages if you search for the word leadership. The number of self-proclaimed experts running training sessions on it seems to grow with every passing hour, at every turn we can apparently learn “how to become an outstanding leader in a weekend,” and more and more people, whom you might not immediately associate with the subject, are crawling out of the woodwork to offer their two cents. The debate about whether leadership is innate or learned, or whether being a leader is a question of personality or of experience, resounds at ever more conferences and seminars, leaps at us from the pages of magazines, and rampages through the blogosphere and social media.

  However, while we can gorge ourselves on theories in one form or another, we are positively starving when it comes to people who can take those theories and put them into practice. Thus, we find ourselves constantly repeating the mantra about a lack of leaders. We say there are no authorities, no clear vision, no strong characters to inspire us to challenge our own limitations and improve ourselves. According to many, this problem, this lack of real, authentic leaders, has even intensified in recent years. They’re right. Authentic leaders have always been a rare thing in the business ecosystem, and the pace of change in the environment means we need more of them than ever before, especially as a lack of leadership can have only two consequences: if we’re lucky, stagnation and a lack of development; if we’re unlucky, catastrophic errors like those that hit NASA in 1986 and 2003.

  Today, sound management isn’t enough. It’s no longer enough to just “do things in the right way”—in the new normal, even the error-free implementation of decisions and adherence to procedures will get you nowhere when you’re faced with a black swan. The second part of Peter Drucker’s definition of leadership has become key: success or failure is decided by the ability to “do the right things,” to make the right decisions and the right choices, and that is what we expect our leaders to do. The psychology of change has also begun to play a role: the vast majority of people are afraid of change and are not keen to leave their comfort zone, yet staying there amid today’s macroeconomic, political, technological, and social turbulence is a pipe dream. People need a clear vision, especially in times of uncertainty; that is why they are looking for visionary leaders they can trust.

  Leadership ability has been admired and debated over millennia. In the beginning were the men, and the occasional woman, who changed the world, generally military, political, or religious leaders, or great scientific minds. Their common denominator was the ease with which they attracted a following, even if what they were offering wasn’t immediately attractive (take religions, for example, which involve respecting a set of rules and consciously limiting pleasure today in return for a very remote promise of reward in the afterlife). The power to inspire and motivate others to behave in specific ways is both fascinating and terrifying—just look at the ease with which Adolf Hitler persuaded the German nation in the 1930s of the need to increase their Lebensraum, their living space, which led to the bloodiest conflict in human history. There are numerous examples of authentic leaders who attracted masses of people to follow them: Benito Mussolini, Chairman Mao, and even Osama bin Laden, to name but three.

  The definition of a leader is only simple at first glance. If we consider the multiplicity of roles a leader has to play, it’s actually extremely difficult to capture the essence of leadership in one or two sentences.

  For many years, the key words were vision and inspiration, and leaders were those who had a clear idea and were able to convincingly present it to others. Over time, those ideas were supplemented by authority, built on a range of factors: competence, personality, experience, and success. The next stage was the aspect of initiating change and leading others through crises, as well as making decisions in situations of uncertainty, which have all taken on a great deal of significance in recent years.

  In a business context, there is a clear distinction between managers and leaders. If we adhere to Drucker’s definition, we could say that managers are an organization’s “stabilizers”—they ensure that processes run as intended, making sure that previously agreed-upon goals are achieved and that tightly defined procedures are followed. Managers ensure that an organization behaves predictably. Leaders, on the other hand, make strategic choices, set out a vision, and inspire others to follow them, often challenging the organizational status quo. Note, though, that leaders often have no formal position of power in an organization’s structure; sometimes they are employees in specialized positions who have earned their place as authorities in the eyes of others. This clearly delineates the different sources of power and influence in the two roles: a manager uses the power invested in them by virtue of position, while a leader influences others by inspiring them to follow. Leadership is becoming more and more diffuse within an organization and is embodied by people at all levels of the hierarchy and in different areas of operation—even if at first sight we perceive the person at the head of the firm to be the leader.

  Of course, for an organization to grow healthily, there must be a balance between these two forces, as we require both inspiring visionaries and thorough practitioners. The former drive development, shaking the company out of its equilibrium (an ever more dangerous state to linger in nowadays); the latter ensure that plans are precisely executed, goals are achieved, and actions remain transparent.

  From the point of view of decision-making, then, leaders must keep a lot of balls in the air. They are strategists, responsible for formulating a convincing vision and making decisions that affect the long-term direction of development. Leaders mold the organizational culture, thus affecting the attitudes and behavior of others. In many cases, they also become the foundation for innovation as initiators of and the driving force behind organizational change, encouraging the company to develop; they are also responsible for creating tools and nonstandard procedures that improve decision-making processes.

  The Leader as a Visionary and Strategist

  The most obvious role of a leader in the decision-making process is to make the most important, breakthrough decisions. We look to leaders to make the final call on vital matters and expect them to shape the strategic perspective. It is the leaders who are responsible for the long-term consequences that result from decisions.

  From the strategic perspective, the prevailing definition of a leader is also the most obvious one: a person from whom we expect a clear vision and direction. Human history is awash with visionary leaders. Think about President John F. Kennedy. In May 1961, he articulated an inspiring vision, which we’ve already discussed, that essentially set in motion the long-term, extraordinarily bold project to put humans on the Moon.

  Life isn’t a bed of roses, of course, and history also offers a plethora of examples of leaders making a strategic decision that was far from successful. Take, for instance, 1964, when President Lyndon Johnson decided to send US troops into Vietnam.

  The 1960s were characterized by an increasingly ruthless arms race accompanied by a political struggle between two superpowers for global domination. One of the regions that underwent powerful changes but remained something of a no-man’s-land in terms of spheres of influence was Southeast Asia. This area had a stormy history: since the second half of the nineteenth century, it had been part of the French colony of Indochina, composed of today’s Cambodia, Laos, and Vietnam, as well as a dozen or so other protectorates, which were gradually incorporated into the expanded colony. The second-most powerful colonial power in the region was the UK, which exerted its influence in Burma (Myanmar) and Siam (Thailand). The golden age of Indochina in the 1920s and ’30s didn’t last long. The German invasion of France in 1940 rocked French authority in the region, and the British had far more important things to worry about by then than some remote Asian colonies. The Japanese took advantage of French weakness and attacked the northern part of Indochina at the end of 1940, moving forward in the following months to eventually take control of the entire area. In a bit of a twist, the French managed to negotiate with the Japanese, who, busy battling the USA, ultimately left the entire French administration in Indochina in place. This lasted until 1945. At that point, the situation turned around—defeated by the Americans, the Japanese, who had just had atom bombs detonated over their cities, were helpless, and the French decided once again to take full control over their colonies. It turned out, though, that the locals had their own ideas about that, and in 1945, Hô Chi Minh declared Vietnam’s independence, while Norodom Sihanouk did likewise in Cambodia. Because the French weren’t inclined to lose the opportunity to regain control over the recently lost territories, an eight-year war broke out (the First Indochina War), which ended in France’s spectacular defeat in the Điện Biên Phủ valley in 1954. 1 The loss led not only to the withdrawal of Europeans from the contested territories, but also to significant political decisions at the Geneva Conference that same year. During the conference, it was decided to split the former territory of Indochina into four independent nations: Laos, Cambodia, and North and South Vietnam.

  In this way, Southeast Asia became a mosaic of highly politically discrete countries. Thailand favored the USA, which ended up launching the majority of its flights over Vietnam from there. Burma was shaken by internal conflict (in 1947, the ruling General Aung San was murdered), fighting off the Chinese Kuomintang, who were advancing more and more boldly into the north of the country, and trying to remain independent of any of the world superpowers. So, while the Burmese were fighting communist movements, they also bluntly refused to join the Southeast Asia Treaty Organization (SEATO), an American initiative set up in 1954. In Laos, which gained independence in 1953, the Pathet Lao communist movement was powerful—so powerful, in fact, that it seized full power in the 1970s. The story was similar, albeit far bloodier, in Cambodia, where at the end of the 1970s, about 25 percent of its 8 million inhabitants were killed in a grisly civil war started by the Khmer Rouge. In the years prior to the war in Vietnam, Cambodia, ruled by Prince Norodom Sihanouk, tried, like Burma, to remain independent and neutral; it had some success in this respect until a certain Saloth Sar, a.k.a. Pol Pot, took power.

  Given the culturally diverse and vivid political patchwork of postcolonial Southeast Asia, the schism in Vietnam’s worldview should come as no surprise. The pro-American south, with its capital in Saigon, soon came into conflict with the North Vietnamese Viet Minh movement. The Geneva Accord had assumed there would be only a temporary division of the country along the 17th parallel and had required both sides to conduct elections in preparation for reunification by the end of 1956, but things didn’t quite turn out that way. The ruler of South Vietnam, Ngô Đình Diệm, called off the elections in his part of the country, under the pretext that the communist regime in the North would interfere with the process. The country remained divided and the conflict between Saigon and Hanoi became more apparent, occasionally leading to partisan fighting incited by units from the communist North, which at the same time was trying to foster the communist movement spreading through South Vietnam. The communist partisans grew stronger, which disconcerted the Americans, who were unconcerned about the communists coming to power in little Laos but balked at the prospect of a 40-million-strong Vietnam supported by the Soviets, considering it a serious threat to their global interests. So, when a frightened Ngô Đình Diệm turned to them for help, a significant military force was added to the group of military and political advisors who had already been supporting him: at the end of 1961, the USS Core sailed into the port of Saigon, with over thirty assault helicopters and several hundred crew onboard. The American contingent in South Vietnam thereby crept above a thousand. It didn’t help Ngô Đình Diệm, though. The government in Saigon was growing weaker, and attacks by the communist partisans from the North became more and more audacious. Communist movements in the South grew stronger, and Ngô Đình Diệm paid the ultimate price—he was murdered in November 1963 during a military coup d’état.

 
