Saturday, September 16, 2023

Paul O’Neill – The art of making good mistakes – In Australia we are tearing down trees at Daunt Avenues while the world is burning 🔥

Feds spread $1 billion for tree plantings among US cities to reduce extreme heat and benefit health AP. We don’t need trees so much as we need forests (and blue zones and Bachs).


These days, Australian suburbs are simply wasteland.





Confessions of a McKinsey Whistleblower

The Nation – Inside the soul-crushing, morally bankrupt, top-secret world of our most powerful consulting firm. “…What does McKinsey do? Generally, it deploys teams of sleep-deprived, overeducated young people to solve tough problems for organizations—typically for-profit businesses, though the firm also serves many governments and large nonprofit organizations. If you’re a CEO who wants help evaluating whether to enter a new market or lay off thousands of employees, you might hire McKinsey. McKinsey made the prescient decision to avoid credit for its work, keeping its client and project lists secret. In practice, this has insulated the company from the disasters it was party to, such as the collapse of Enron.

 (This secrecy also serves to deter nearly all current and former McKinsey employees from speaking to reporters, meaning that, despite my best efforts, some of the details in this piece are based solely upon my own recollections.) McKinsey’s recruiting materials offer you the chance to “Change the world. Improve lives.” Naïve as it seems in hindsight, I came to McKinsey believing those words. But after a year and a half there, I eventually understood that not only does McKinsey fail to make the world better—it often colludes with those who make the world worse…”



Paul O’Neill: The art of making good mistakes FT

Do good teams make fewer mistakes? It seems a reasonable hypothesis. 

But in the early 1990s, when a young researcher looked at evidence from medical teams at two Massachusetts hospitals, the numbers told her a completely different story: the teams who displayed the best teamwork were the ones making the most mistakes. What on earth was going on?

 

The researcher’s name was Amy Edmondson and, 30 years after that original puzzle, her new book Right Kind of Wrong unpicks a morass of confusion, contradiction and glib happy talk about the joys of failure. She solved the puzzle soon enough. The best teams didn’t make more errors; they admitted more to making errors. Dysfunctional teams admitted to very few, for the simple reason that nobody on those teams felt safe owning up.


The timeworn euphemism for a screw-up is a “learning experience”, but Edmondson’s story points to a broad truth about that cliché: neither organisations nor people can learn from their mistakes if they deny that the mistakes ever happened. Such denial is common enough, particularly at an organisational level, and for the obvious backside-covering reasons. But it can be easy to overlook the implications. 

For example, Edmondson recalls a meeting with executives from a financial services company in April 2020. With hospitals across the world overwhelmed by Covid-sufferers in acute respiratory distress, and many economies in lockdown, they told Edmondson that their attitude to failure had changed. 

Normally, they explained, they were enthusiastic about sensible risk-taking and felt it was OK to fail if you learnt from that failure. Not during a pandemic, however. They had decided that failure was temporarily “off-limits”. What nonsense. The moment that Covid turned the world upside-down was exactly the time to take calculated risks and learn quickly, not to mention a time when failures would be inevitable. 

Demanding perfection against such a backdrop guaranteed ponderousness and denial. It can be wise to aim for perfection, explains Edmondson, but not without laying the groundwork for people to feel safe in admitting mistakes or in reporting mistakes from others. 

For example, when Paul O’Neill became the boss of the US aluminium company Alcoa in 1987, he set the apparently unachievable target of zero workplace injuries. That target lifted the financial performance of Alcoa because it helped to instil a highly profitable focus on detail and quality. 

The case is celebrated in business books. But it would surely have backfired had O’Neill not written to every worker, giving them his personal phone number and asking them to call him if there were any safety violations.

Another famous example is Toyota’s Andon Cord: any production line worker can tug the cord above their workstation if they see signs of a problem. (Contrary to myth, the cord does not immediately halt the production line, but it does trigger an urgent huddle to discuss the problem. The line stops if the issue isn’t resolved within a minute or so.) The Andon Cord is a physical representation of Toyota’s commitment to listen to production-line workers. We want to hear from you, it says.

Creating this sense of psychological safety around reporting mistakes is essential, but it is not the only ingredient of an intelligent response to failure.


Another is the data to discern the difference between help and harm. In the history of medicine, such data has usually been missing. Many people recover from their ailments even with inept care, while others die despite receiving the best treatment.

And since every case is different, the only sure way to decide whether a treatment is effective is to run a large and suitably controlled experiment. This idea is so simple that a prehistoric civilisation could have used it, but it didn’t take off until after the second world war. As Druin Burch explains in Taking the Medicine, scholars and doctors groped around for centuries without ever quite seizing upon it. 

A thousand years ago, Chinese scholars ran a controlled trial of ginseng, with two runners each running a mile: “The one without the ginseng developed severe shortness of breath, while the one who took the ginseng breathed evenly and smoothly.” With 200 runners they might have learnt something; comparing a pair, the experiment was useless.
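Harford’s point about sample size can be made concrete with a quick simulation. This is a minimal sketch with made-up numbers (the effect size and noise level are illustrative assumptions, not data from any real ginseng trial): even when a treatment genuinely helps, comparing one runner against one other is barely better than a coin flip, while a hundred per group reveals the effect almost every time.

```python
import random

random.seed(0)

def trial(n_per_group, effect=0.5, sd=1.0, reps=2000):
    """Estimate how often a trial with n runners per group 'detects'
    a real benefit (treated group's mean beats the control's mean).
    Effect size and noise are illustrative assumptions."""
    wins = 0
    for _ in range(reps):
        control = [random.gauss(0.0, sd) for _ in range(n_per_group)]
        treated = [random.gauss(effect, sd) for _ in range(n_per_group)]
        if sum(treated) / n_per_group > sum(control) / n_per_group:
            wins += 1
    return wins / reps

# One runner per group (the ginseng trial): barely better than chance.
print(trial(1))
# A hundred per group: the real effect is visible almost every time.
print(trial(100))
```

With one runner per group the “trial” detects the benefit only around 60 per cent of the time, so a single comparison tells you almost nothing; with a hundred per group, detection is nearly certain.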

 The Baghdad-based scholar Abu Bakr al-Razi tried a clinical trial even earlier, in the 10th century, but succeeded only in convincing himself that bloodletting cured meningitis. 

One plausible explanation for his error is that he didn’t randomly assign patients to the treatment and control group but chose those he felt most likely to benefit. In the end, the idea of a properly randomised controlled trial was formalised as late as 1923, and the first such clinical trials did not occur until the 1940s. As a result, doctors made mistake after mistake for centuries, without having the analytical tool available to learn from those errors. 
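The selection-bias explanation for al-Razi’s error can also be sketched in a few lines. In this toy model (all numbers are assumptions for illustration), the treatment does nothing at all and survival depends only on how sick a patient is to begin with; yet if the healthiest-looking patients are steered into the treated group, the treatment appears to work, while random assignment shows the true null effect.

```python
import random

random.seed(1)

# Frailty in [0, 1): sicker patients are more likely to die.
patients = [random.random() for _ in range(10_000)]

def survives(frailty):
    # Survival depends only on frailty; the 'treatment' has zero effect.
    return random.random() > frailty

# Biased assignment: treat the patients who look most likely to recover.
by_health = sorted(patients)
treated, control = by_health[:5_000], by_health[5_000:]
biased_gap = (sum(map(survives, treated)) - sum(map(survives, control))) / 5_000

# Randomised assignment: a coin flip decides who is treated.
shuffled = patients[:]
random.shuffle(shuffled)
random_gap = (sum(map(survives, shuffled[:5_000]))
              - sum(map(survives, shuffled[5_000:]))) / 5_000

print(f"biased assignment gap: {biased_gap:+.2f}")  # large spurious 'benefit'
print(f"random assignment gap: {random_gap:+.2f}")  # near zero
```

The biased comparison manufactures a survival gap of roughly 50 percentage points out of nothing, which is exactly the trap of choosing the patients “most likely to benefit”; randomisation makes the gap vanish.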

Nearly 2,000 years ago, the classical physician Galen pronounced that he had a treatment which cured everyone “in a short time, except those whom it does not help, who all die . . . it fails only in incurable cases”. Laughable. But how many decisions in business or politics today are justified on much the same basis? 

A culture in which we learn from failure requires both an atmosphere in which people can speak out, and an analytical framework that can discern the difference between what works and what doesn’t. Similar principles apply to individuals. 

We need to keep an open mind to the possibilities of our own errors, actively seek out feedback for improvement, and measure progress and performance where feasible. We must be unafraid to admit mistakes and to commit to improve in the future. That is simple advice to prescribe. It’s not so easy to swallow. 


Tim Harford’s new book for children, “The Truth Detective” (Wren & Rook), is now available 




*How to Know a Person*

The author is David Brooks, and the subtitle is The Art of Seeing Others Deeply and Being Deeply Seen.  I think of it as a book on how to appreciate others, even if you do not necessarily deeply know them, which is slightly different from David’s subtitle.  (Am I too skeptically Freudian when it comes to “knowing people”?)  An excellent book, I read it straight through, and I view it as a milestone in David’s career.  Does that mean I appreciate him?  Know him even?  Maybe just the former!

Due out October 24, you can pre-order here.

As I wrote to a friend: “If those who needed it would heed it, it would be one of the most useful books.”  The rest is up to you.