We live in a society where it’s almost impossible to give science too much credit. Ever since the atom bomb and the space race, it’s just been taken for granted that civilization advances through the progress of science. Science—we are told—grows our food, cures our diseases, creates our new technologies, and just generally propels the human race forward.
If science is the engine of progress, then those who have not been captured under its spell must be dusty relics of prejudice and caprice. Fields under the sway of hidebound tradition must be bulldozed and renovated in the image of science. Thus doctors, instead of making decisions by random whim, must be forced to practice “evidence-based medicine” where all their prescriptions are backed by randomized controlled trials. Policymakers, instead of just being bleeding-heart do-gooders, must temper their enthusiasm for regulation by doing cost-benefit analyses to see if their proposals make sense. Managers, instead of following their intuition, must subject their strategies to rigorous experiment—through A/B tests in the market.
But what’s weird about this mania for science is how unscientific it all is. As far as I know, no studies have shown that evidence-based medicine leads to better patient outcomes or that companies which practice comprehensive A/B testing are more profitable than those which follow their intuition. And the evidence that science is responsible for stuff like increased life expectancy is surprisingly weak.
But there’s such a mania for science that even asking these questions seems absurd. How could there possibly be evidence against evidence-based medicine? The whole idea seems like a contradiction in terms. But it is not.
Recent decades have seen science encroach on the kitchen, with scientific approaches to cooking and cuisine. Where other chefs might simply follow instructions they found on a yellowing scrap of paper, the new modernists seek to understand the physics behind their actions. This approach has led to some interesting new techniques, but it’s also led us to understand that some of those silly traditions aren’t so silly after all.
Eggs, for example, were often beaten in copper bowls. Why copper bowls? Chefs might have been able to give you some kind of reason, but it would have sounded silly to scientific ears. But the modernists discovered that the ions in the copper ended up forming complex bonds with the conalbumin in the eggs.
This was not something that chefs had ever established as scientific knowledge—no aproned Isaac Newton ever discovered this was the right way to cook the eggs—but it was knowledge chefs had nonetheless. It was, in Polanyi’s phrase, tacit knowledge, part of the things society genuinely knew but was never able to write down or clearly prove.
Scientism systematically destroys tacit knowledge. If chefs were forced to follow “evidence-based cooking”, not using anything special like a copper bowl until there was a peer-reviewed double-blind randomized controlled trial proving its effectiveness, the result surely would be worse food. So why is it crazy to believe the same attitude leads to worse medicine?
In business, too, scientism could be quite destructive. Can Steve Jobs provide a proof for the rightness of every iPhone feature? Can Doug Bowman do a scientific experiment to justify his every shade of blue? Forcing them to could well make their work far worse instead of better.
Scientism even fails just within our own heads. If you’re struggling with a decision, you’re taught to approach it more “scientifically”, by systematically enumerating pros and cons and trying to weight and balance them. That’s what Richard Feynman would do, right? Well, studies have shown that this sort of explicit approach repeatedly leads to worse decisions than just going with your gut. Why? Presumably for the same reason: your gut is full of tacit knowledge that it’s tough to articulate and write down. Just focusing on the stuff you can make explicit means throwing away everything else you know—destroying your tacit knowledge.
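To make the “explicit” procedure concrete, here is a minimal sketch in Python of the sort of weighted pros-and-cons tally being described; the options, criteria, weights, and scores are invented purely for illustration.

```python
# A toy version of the explicit decision method: score each option on a few
# named criteria, weight the criteria, and pick the highest total.
# Every option, criterion, weight, and score below is hypothetical.

weights = {"salary": 0.5, "commute": 0.2, "interest": 0.3}

options = {
    "job A": {"salary": 7, "commute": 3, "interest": 9},
    "job B": {"salary": 9, "commute": 8, "interest": 4},
}

def total(scores):
    """Weighted sum of an option's scores across the criteria."""
    return sum(weights[c] * s for c, s in scores.items())

for name, scores in options.items():
    print(name, round(total(scores), 2))

best = max(options, key=lambda name: total(options[name]))
print("explicit method picks:", best)
# Whatever your gut knows but can't be turned into numbers like these
# is simply left out of the calculation.
```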
Of course, there’s no guarantee that just trusting your gut will work either. Intuition and tradition are often just as wrong as scientific cluelessness. And in the cases where they genuinely have little to contribute, throwing them away (or quarantining them until they’re proven by scientific test) might not be such a bad idea. But I’ve always just assumed that this was always true—that tradition and intuition had nothing to contribute, unless carefully coached by scientific practice. That science was the only way to get knowledge, rather than just another way of codifying it. Now, instead of throwing it all away, I’m thinking I ought to spend more time finding ways to harness all that tacit knowledge.
Comments

“Scientism systematically destroys tacit knowledge. If chefs were forced to follow ‘evidence-based cooking’, not using anything special like a copper bowl until there was a peer-reviewed double-blind randomized controlled trial proving its effectiveness, the result surely would be worse food.”
Really? Why? I think you’re hiding some sort of massive background assumption here, and it ought to be dragged out into the light and examined. (Much like how experiments drag out all sorts of unfounded beliefs, or evidence-based medicine examines expensive or deadly quackery.)
‘One day when I was a junior medical student, a very important Boston surgeon visited the school and delivered a great treatise on a large number of patients who had undergone successful operations for vascular reconstruction. At the end of the lecture, a young student at the back of the room timidly asked, “Do you have any controls?” Well, the great surgeon drew himself up to his full height, hit the desk, and said, “Do you mean did I not operate on half the patients?” The hall grew very quiet then. The voice at the back of the room very hesitantly replied, “Yes, that’s what I had in mind.” Then the visitor’s fist really came down as he thundered, “Of course not. That would have doomed half of them to their death.” God, it was quiet then, and one could scarcely hear the small voice ask, “Which half?”’
posted by gwern
on August 10, 2012 #
I want to say a lot but the loudest thoughts are asking a series of questions that play Battleship around the central theme I’m trying to tease out: where did this confusion come from? How could someone believe that reliance on the scientific method (or rationality in general) means embracing a belief that the scientific method (or rationality) is the only path to knowledge?
You seem to take a narrow view of what science is and how it’s done. Is this rhetorical? Are you representing an extreme reductivist worldview to try to make a point?
posted by Jrbl
on August 10, 2012 #
“I meant in the short-term (because AFAIK there are no peer-reviewed double-blind RCTs of food science), not in the long-term. Do you disagree with that or was I just unclear?”
Yes, you were unclear and still are. See, this is exactly what I mean by unstated assumptions: why does this matter in the short-term at all? If one does a quality RCT and can then reject use of copper bowls, what’s the problem? Aren’t we unambiguously better off? Yes, of course we are since we need no longer worry about using a non-copper bowl and can save money by not buying niche copper bowls.
Why wouldn’t we be better off in the short term and long-term both? The only explanation I can come up with is that you are appealing to some sort of hidden background premise like ‘I believe that anything without RCT backing ought to be immediately banned and not allowed at all’, which would indeed crush the space of possible bowls or foods down to a tiny one and make us much worse off.
Stated baldly like this, this background premise is obviously a terrible idea and doesn’t map onto the thinking of any decent proponent of any ‘scientism’.
It might make sense if a field is so contaminated by bad ideas that, adding up all its practices, we find that the field makes things worse on net. Then by essentially banning the field, we change our net loss to neither loss nor gain, which at least is an improvement. But such fields are very rare!
Even medicine manages to improve health on net. Consider Hanson’s notorious views on health: he thinks we waste a lot of money and are often made sicker by health care… but only around a third of it is wasted, and spending no money on health care would be much worse than the current situation. So a blanket ban on non-RCT-proven stuff would make us worse off if there were no RCT-proven stuff, but of course there is a lot of such stuff, so to conclude that such a ban would be a good or bad idea would require a lot of investigation to see what % of current spending is RCT-approved and what % is not approved and what the final bang per buck estimate might look like… Spending much more on EBM would be a good idea, and better heuristics like ‘new treatments need a higher burden of evidence’ may also be a good idea - but when we stop looking at an apparent strawman, the issue is much subtler.
In the case of food, we generally have a pretty high confidence that many foodstuffs are tasty and you don’t want to burn them, and things like that. Examples like copper bowls are by far the minority in cooking, so even if every such example were really false, your standard cookbook would still be better than nothing.
(Exercise: grab a cookbook like Joy of Cooking; open to a random page, and see what % of wordcount is discussing apparently unjustified tips like copper bowls. I remember spotting one recently, in the yogurt recipe, cautioning me not to add in too much starter because this would be bad for some unknown reason. I remember it, you see, precisely because it seemed so arbitrary and out of line with the rest of the straightforward instructions. But even this is pretty easily RCTed, if anyone cared! And I’d bet that there’s already results on this either in the biology literature or the yogurt makers’ internal studies, if one knew where to look, since improving quality by not messing up starter culture volumes is an obvious optimization.)
“I don’t know if you got that from me, but I frequently quote it.”
I actually got it from Tufte’s Data Analysis for Politics and Policy, sorry.
posted by gwern
on August 10, 2012 #
These people very much exist, and they tend to troll threads to elicit responses. A recent HN thread had users questioning the value of philosophy on the grounds that it isn’t scientifically testable. When others tried to make an appeal for intellectual honesty, the appeal was ignored.
posted by MG
on August 10, 2012 #
The problem is, we have a bunch of bogus tacit knowledge as well.
Take, for example, the Monty Hall problem. For anyone not familiar with it, it’s a game show. You have three doors, behind two are goats and behind the third is a prize. You select one of the doors, and the host opens an unselected door with a goat behind it. The host then offers to let you change your choice.
It is, strangely, significantly more favorable to switch. Your selected door has a 1/3 chance of containing the prize, and the other unopened door has a 2/3 chance. You can find this out with some math, or you can just set up a computer simulation and watch what happens, but it runs completely counter to most people’s intuition about the world.
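For anyone who would rather watch it happen than take the arithmetic on faith, here is a minimal simulation of the kind mentioned above (a Python sketch; the trial count is arbitrary):

```python
import random

def play(switch, trials=100_000):
    """Simulate Monty Hall games and return the fraction of wins."""
    wins = 0
    for _ in range(trials):
        prize = random.randrange(3)   # door hiding the prize
        choice = random.randrange(3)  # contestant's first pick
        # Host opens a door that is neither the pick nor the prize.
        opened = next(d for d in range(3) if d != choice and d != prize)
        if switch:
            # Switch to the one remaining unopened door.
            choice = next(d for d in range(3) if d != choice and d != opened)
        wins += (choice == prize)
    return wins / trials

print("stay:  ", play(switch=False))  # comes out near 1/3
print("switch:", play(switch=True))   # comes out near 2/3
```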
posted by emcmanis
on August 10, 2012 #
As per your argument, the evidence-based approach introduced by science is leading people to suppress their tacit knowledge. Even though it happens to be true, it doesn’t look like science’s or scientists’ fault. What’s beautiful about science, though, is the fact that whenever a theory fails, that’s taken as progress. That’s knowledge being crafted. And the interesting point here is that (as far as human history can tell us) virtually all progress in science has started from insights based on tacit knowledge. (Therefore, there is still a clear difference between intuition that leads to empirical learning and random emotion-based nonsense.)
posted by Vini
on August 11, 2012 #
science helps to cut the downside, sloppy mistakes and prejudice. but it is not so good at taking things from good to great.
how to use gut feeling when sending some rover to mars?
posted by toivo
on August 11, 2012 #
I very much agree with you. In a sense our culture over-emphasises rational knowledge and the scientific method. However, there is a whole history of ‘tacit knowledge’ and ‘unconscious empiricism’, or what could be described as ‘knowing without understanding’, similar to your example with the copper bowl. The ancient Romans used cement without understanding how it worked; the same is true for breeding livestock and plants or brewing beer. Just think of the highly complicated process of indigenous drug preparation involving enzyme inhibitors. I would say that it’s a good time (as described in Sennett’s ‘The Craftsman’) to remember that the scientific method is not the only way of approaching problems.
posted by Michael Hohl
on August 13, 2012 #
you’re right to doubt your faith in science. as someone raised in a “scientism” family, and who studied math/science, i’ve been thinking about this for a long time. i haven’t written anything on it for fear of the knee-jerk-ish responses like the ones you’ve gotten.
i am a skeptic above all else. and while i think the scientific method has value and its place, there’s a huge amount of knowledge embedded in cultural practices, knowledge that has been built up over time - generations, millennia - that sometimes even science cannot explain. it’s not just tacit knowledge that a person has, but tacit knowledge that a community or culture has.
some thoughts i’ve had on this that are not fully-formed:
1) the scientific method can only tell us about things where situations can actually be controlled. but real life, involving humans in particular, is much more complex than that. just as an example - it’s pretty much impossible to do an RCT study of a food substance, because human beings do not live in labs and do not eat the same exact meals and do the same exact things day in, day out. (if you get into the literature, lots has been written on this).
2) the law of unintended consequences. the idea that clinical trials will tell us everything that might possibly go right or wrong with a medication continues to be disproven. one of the most egregious examples, in my opinion, is the osteoporosis drugs which have been shown to actually increase certain bone fractures. (i’m paraphrasing - google for the details.)
3) however much we think we can know through science, we will never know what we DON’T know. for example: we have some notion about what-all nutrients our bodies need. we have RDAs for vitamins and minerals. but what other substances exist in foods that we need, but we haven’t yet discovered? the notion of antioxidants being something our bodies need is a very recent thing. or what if it’s about certain relationships between substances in foods, or certain balances between substances? i could go on, but as i said, these are fuzzy, half-formed ideas… can’t really get it all out of my head yet.
4) from my personal exploration of the relationship between food and health, i’ve come to the conclusion that we as humans have co-evolved with the foods that we eat. but this is such a complex relationship, and the “effect” that any one food might have on our body is so difficult to tease out from the effect any other might have… i just don’t see how “science” or the scientific method could ever really get to explaining this. especially since (my theory, based in some evidence, is) that this relationship is constantly shifting anyway.
commenting on some of the comments:
“Examples like copper bowls are by far the minority in cooking,”
how do you know this? there are actually plenty more examples of this than you may realize. there may even be examples that you don’t know about.
“I’d bet that there’s already results on this either in the biology literature or the yogurt makers’ internal studies, if one knew where to look, since improving quality by not messing up starter culture volumes is an obvious optimization.”
having spent a lot of time making yogurt and trying to learn more about/optimize the process (and understand the reasoning behind “standard practice”), i can tell you that precious little about the process has been studied in a way that would be useful to non-corporations, or in general, in an unbiased way. and if anything has been done internal to companies, good luck finding that data. “science” in this country is incredibly shaped by society.
the monty hall problem is a math/probability problem. it doesn’t really involve the scientific method, which is what’s really being discussed here.
posted by a.
on August 15, 2012 #
I think you are confusing science with something else.
Why would science tell cooks to stop using copper bowls to whisk eggs? A good scientist would do tests first, and those tests would have established that whisking eggs in copper bowls DOES make sense. It would not make sense to tell a cook to stop using copper just because no formal scientist had done a test that it was better.
Being pro-science doesn’t mean never doing anything which hasn’t been double-blind tested first. That’s an absurd argument.
“that tradition and intuition had nothing to contribute, unless carefully coached by scientific practice.”
What do you think tradition is? While much is hocus pocus, some of it is just things done over and over again and refined over time because they worked. Look at the development of agricultural crops over the last 10,000 years - that’s science in action, before the development of the formal scientific method.
Where reality intrudes heavily, tradition is bound to be shaped by it, and to follow something akin to scientific principles.
posted by KO
on August 16, 2012 #
“I’ve always just assumed that this was always true—that tradition and intuition had nothing to contribute, unless carefully coached by scientific practice.”
That assumption is what you’re wrestling with, and I hear ‘scientism’ as a name for that assumption. I guess I think of intuition and tradition as deep wells that we can draw on as we choose, and when the bucket comes up we depend on our empirical faculties to judge its content.
Scientism, though, would pave over the well.
posted by Brian
on August 17, 2012 #
KO says Swartz is “confusing science with something else” — I believe that “something else” is Frequentism [1], specifically hypothesis testing where the null hypotheses are completely skeptical of “tacit knowledge.”
Certainly some fields of science follow this approach more than others (though I’d be hard pressed to produce an ordered list). On the other hand, as a practicing Bayesian [2], I stand with others’ objection that what Swartz describes is not scientism.
[1] http://en.wikipedia.org/wiki/Frequentist_inference
[2] http://en.wikipedia.org/wiki/Bayesian_inference
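To illustrate the distinction in a toy way, here is a Python sketch with entirely made-up tasting data: a significance test against a skeptical “no difference” null, next to a Bayesian update whose prior is free to encode some of the tradition’s tacit knowledge. The 14-of-20 result and the Beta(8, 4) prior are assumptions for illustration only.

```python
from math import comb

# Illustrative only: suppose a blind tasting prefers the copper-bowl foam
# in k of n trials.  These numbers are made up.
n, k = 20, 14

# Frequentist-style check: skeptical null of "no difference" (p = 0.5);
# one-sided p-value for seeing k or more preferences by chance alone.
p_value = sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n

# Bayesian-style check: a Beta(a, b) prior on the preference probability,
# updated on the same data (posterior is Beta(a + k, b + n - k)).
# Beta(8, 4) loosely encodes "the tradition is probably on to something";
# Beta(1, 1) is a flat prior for comparison.
def posterior_mean(a, b):
    return (a + k) / (a + b + n)

print(f"one-sided p-value under skeptical null:  {p_value:.3f}")
print(f"posterior mean, flat Beta(1,1) prior:    {posterior_mean(1, 1):.2f}")
print(f"posterior mean, Beta(8,4) 'tacit' prior: {posterior_mean(8, 4):.2f}")
```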
posted by A Strauss
on August 19, 2012 #
You can also send comments by email.