Wired scientist

Wired did a whole piece on Seth’s comment from yesterday (“Here’s what doesn’t work: hacking around and ignoring what doesn’t work”), titled “Accept Defeat: The Neuroscience of Screwing Up,” by Jonah Lehrer. Lehrer looks particularly at the dynamics of failure in scientific experimentation. In principle, such investigations should embody the embrace of failure that permeates the Scientific Method, but in practice even a scientist’s human nature can get in the way. Or, in scientific terms, their neuroscience can get in the way of their science.

I’ve excerpted some rather long passages from the article because (a) the article itself is long, and (b) it has lots of brilliant stuff. The elevator summary is this: even though over half of all useful data from experiments comes from unexpected results, that data often gets neglected because our brains are ‘wired’ (pun intended) to dismiss it as failure. A powerful remedy is to assemble research teams from diverse disciplines, which helps alleviate blind spots by introducing different perspectives.

  • Insights from Aberrations – “Kevin Dunbar is a researcher who studies how scientists study things — how they fail and succeed…Dunbar decided to launch an ‘in vivo’ investigation, attempting to learn from the messiness of real experiments…Dunbar came away from his in vivo studies with an unsettling insight: Science is a deeply frustrating pursuit. Although the researchers were mostly using established techniques, more than 50 percent of their data was unexpected. (In some labs, the figure exceeded 75 percent.) ‘The scientists had these elaborate theories about what was supposed to happen,’ Dunbar says. ‘But the results kept contradicting their theories. It wasn’t uncommon for someone to spend a month on a project and then just discard all their data because the data didn’t make sense.’ Perhaps they hoped to see a specific protein but it wasn’t there. Or maybe their DNA sample showed the presence of an aberrant gene. The details always changed, but the story remained the same: The scientists were looking for X, but they found Y…The lesson is that not all data is created equal in our mind’s eye: When it comes to interpreting our experiments, we see what we want to see and disregard the rest.”
  • Objectivity Myth – “The reason we’re so resistant to anomalous information — the real reason researchers automatically assume that every unexpected result is a stupid mistake — is rooted in the way the human brain works. Over the past few decades, psychologists have dismantled the myth of objectivity. The fact is, we carefully edit our reality, searching for evidence that confirms what we already believe. Although we pretend we’re empiricists — our views dictated by nothing but the facts — we’re actually blinkered, especially when it comes to information that contradicts our theories. The problem with science, then, isn’t that most experiments fail — it’s that most failures are ignored.”
  • Diversity Difference – “But not every lab meeting was equally effective. Dunbar tells the story of two labs that both ran into the same experimental problem: …‘One of the labs was full of people from different backgrounds,’ Dunbar says. …‘The other lab, in contrast, was made up of E. coli experts. They knew more about E. coli than anyone else, but that was what they knew,’ he says. Dunbar watched how each of these labs dealt with their protein problem. The E. coli group took a brute-force approach, spending several weeks methodically testing various fixes. ‘It was extremely inefficient,’ Dunbar says. ‘They eventually solved it, but they wasted a lot of valuable time.’ The diverse lab, in contrast, mulled the problem at a group meeting. None of the scientists were protein experts, so they began a wide-ranging discussion of possible solutions. At first, the conversation seemed rather useless. But then, as the chemists traded ideas with the biologists and the biologists bounced ideas off the med students, potential answers began to emerge. ‘After another 10 minutes of talking, the protein problem was solved,’ Dunbar says. ‘They made it look easy.’ When Dunbar reviewed the transcripts of the meeting, he found that the intellectual mix generated a distinct type of interaction in which the scientists were forced to rely on metaphors and analogies to express themselves. (That’s because, unlike the E. coli group, the second lab lacked a specialized language that everyone could understand.) These abstractions proved essential for problem-solving, as they encouraged the scientists to reconsider their assumptions. Having to explain the problem to someone else forced them to think, if only for a moment, like an intellectual on the margins, filled with self-skepticism. This is why other people are so helpful: They shock us out of our cognitive box. ‘I saw this happen all the time,’ Dunbar says. ‘A scientist would be trying to describe their approach, and they’d be getting a little defensive, and then they’d get this quizzical look on their face. It was like they’d finally understood what was important.’”
  • Silver Linings – “What turned out to be so important, of course, was the unexpected result, the experimental error that felt like a failure. The answer had been there all along — it was just obscured by the imperfect theory, rendered invisible by our small-minded brain. It’s not until we talk to a colleague or translate our idea into an analogy that we glimpse the meaning in our mistake. Bob Dylan, in other words, was right: There’s no success quite like failure.”