Saturday, 24 July 2010

Max Delbruck's opinion on the moral qualities of science

Max Delbruck - 1906-1981. Nobel Prize 1969.

Question (1971): "Does scientific research by itself foster high moral qualities in men?"

Delbruck's answer: "Scientific research by itself fosters one high moral quality: that you should be reasonably honest.

"This quality is in fact displayed to a remarkable extent.

"Although many of the things that you read in scientific journals are wrong, one does assume automatically that the author at least believed he was right." 

(Quoted on p. 282 of Thinking about Science: Max Delbruck and the Origins of Molecular Biology, E.P. Fischer & C. Lipson, 1988)

***

Comment: that was written in 1971, by a man who was one of the best-connected scientists of the twentieth century, a kind of godfather to molecular biology, and a person of great personal integrity.

So Delbruck was in a position to know what he was talking about.

And, in 1971, he was able to state that scientific research by itself fosters the high moral quality that you should be reasonably honest. And that this quality is *in fact* displayed to a remarkable extent. And that when reading journals scientists could and did assume that the authors were telling the truth as they saw it.

Only 40 years ago Delbruck could state that scientists were in fact, in reality, in practice - honest...

4 comments:

dearieme said...

When I started my career and first refereed manuscripts, it never occurred to me that I would need to check that the authors had written an abstract that was an honest summary of their paper. By the time I semi-retired, there was a box to tick to that effect on the referees' forms - though, of course, the form would refer to "accurate" rather than "honest". Which was itself accurate, but perhaps not entirely honest.

Bill said...

Consider p<0.05. As every competent person knows, two papers with the same results, one with p=0.045 and one with p=0.055, contain essentially the same information, cause a rational person to move their beliefs by the same amount, and are similarly worthy of publication. But they are not remotely similarly publishable.
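
To make the arithmetic concrete, here is a minimal sketch in Python, assuming a two-sided z-test (the z-test framing and the code are added for illustration, not part of the original comment): the test statistics implied by p = 0.045 and p = 0.055 are nearly identical.

```python
# Minimal sketch: the |z| statistics implied by two-sided p-values of
# 0.045 and 0.055 (assuming a z-test purely for illustration).
from scipy.stats import norm

for p in (0.045, 0.055):
    z = norm.ppf(1 - p / 2)  # |z| corresponding to a two-sided p-value
    print(f"p = {p:.3f}  ->  |z| = {z:.3f}")

# The two |z| values differ by less than 0.1 standard errors, yet only one
# of the two papers clears the conventional publication bar.
```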

It is also the case that, if you get p=0.055, there is some way you can torture your data or analysis to get it down to p=0.045. You can find a reason to throw out some cases. You can use a slightly different statistical method. You can control for a slightly different set of variables or in a slightly different way. Or you can just lie straightforwardly, either by changing data or by lying about the p value.
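
As a hypothetical illustration of these researcher degrees of freedom (the data, sample sizes, and analysis choices below are invented, not taken from any real study), a short Python sketch can run one simulated data set through several defensible analyses and report the resulting p-values:

```python
# Hypothetical sketch: one simulated data set, several defensible analyses,
# several p-values. All numbers are invented for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
control = rng.normal(loc=0.0, scale=1.0, size=40)
treated = rng.normal(loc=0.4, scale=1.0, size=40)  # a modest true effect

analyses = {
    "Student t-test, all data":
        stats.ttest_ind(treated, control, equal_var=True).pvalue,
    "Welch t-test, all data":
        stats.ttest_ind(treated, control, equal_var=False).pvalue,
    "Mann-Whitney U, all data":
        stats.mannwhitneyu(treated, control, alternative="two-sided").pvalue,
    "Student t-test, two most extreme controls dropped":
        stats.ttest_ind(treated, np.sort(control)[1:-1], equal_var=True).pvalue,
}

for name, p in analyses.items():
    print(f"{name}: p = {p:.3f}")
```

With an effect of this size, the p-values from a single draw typically scatter around, and sometimes across, the 0.05 line; reporting only the analysis that lands below it is exactly the torture described above.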

But, why is there the p<0.05 standard, given its transparent stupidity, uselessness, and incitement to dishonesty? Because careerists need concrete goals to plod towards. Referees and editors want concrete standards to use to reject papers.

The cognitive dissonance just on this one aspect of scientific work is amazing. If you approach things one way---with a smile and a nod and a wink and an absence of judgment---you can get working "scientists" to admit that "everyone" tortures their data to get to the magical p<0.05. If you approach things another way---question the utility of p<0.05 directly or don't, yourself, bother to do the torturing to get from p=0.06 to p<0.05---you get outraged condemnation and intimations of incompetence.

The lying and the careerism are intimately causally entwined throughout the enterprise.

a Finn said...

What does honesty mean?

Honest, upright, low-IQ people might be the most honest people there are. They mostly know what they don't know, and they are often ready to admit they don't know something. They don't make inflated logical claims, especially when they know that higher-IQ people would prove them false. They use simple reasoning that delivers practical results in their lives; that reasoning is not connected to intricate theorizing.

Let me illuminate this topic a little with an example. I often debate on a certain Finnish blog. One of the debaters is a mathematically oriented computer-programming researcher. No matter what the topic, he (X) produces complex and beautiful arguments starting from his own point of view. So impressive were they that people who held the opposite opinion would often immediately change their views to conform to X's (his openness about his status as a scientist of course helped; the psychological principle of authority). I happen to see through those arguments, so again and again I pointed out the errors and presented counter-arguments. After a while, people started to wait for my answers before evaluating X's comments.

Rarely was there any feeling that X lied. He just produced the maximum verbal complexity he could, and it was enough to mesmerize even him. To see errors in an argument, especially profound errors and not just small ones, one has to rise above that argument with one's knowledge and/or IQ. While there are many people who can point out logical errors in lower-IQ people's reasoning, there are considerably fewer who can point out profound errors in scientists' reasoning.

The other extreme is politicized science: all those Marxist, liberal, etc. scientists. We should not underestimate the self-deception capabilities of humans. Arguments are often elaborate dressings that rationalize income sources and status, hiding the truth from consciousness. But here the distortions, lies, framings of issues, compulsory views, etc. are so vivid that intelligent people cannot fail to see them, at least sometimes, for what they are. The disturbing aspects might then require self-soothing explanations and convincing. It is easy to see that something has gone seriously wrong when, for example, UC Berkeley is called, even in the liberal media, the "People's Republic of Berkeley".

Continued ...

a Finn said...

Part 2.

People try to be consistent with, and to continue, what they have publicly said and done. Consistency is interpreted as a sign of sanity, reliability, logic, etc., so people often energetically defend their views even when those views are faulty.

Rationality is generally the lowest in psychological importance, in urgency, as a shaper of self-image and identity, etc. Generally the order of psychological importance is: 1. what people do (and perhaps say); 2. emotions; 3. rationality.

This is rational, because that is generally the order of importance of survival knowledge to people. Say a person is a ropedancer at great heights. He has danced on ropes many times, and he feels fear when he does so. He has overcome that fear, which urges the opposite of ropedancing, many times. His ropedancing is here the most reliable knowledge about his capabilities, and it thus wins out over fear. If he were ropedancing at great heights for the first time, overcoming fear might require more psychological effort and/or preparation. Fear and other emotions are mostly independent of the will and serve as a reality check in many situations. Rationality is a survival tool, but also a dangerous teller of fairy tales. A person might rationally convince himself that he would be a capable ropedancer at great heights if he just did it, although he has never walked on a rope. When he climbs up and the rope is before him, sudden fear serves as a reality check and a warning. Thus emotions generally win out over rationality.

Despite its generality, it is possible for people to change that order.

People cannot do even the boring, ostensibly non-emotional and rational sorting of technical papers without the help of emotions (Damasio, 1994).

This order of psychological importance also corresponds to the evolutionary order in which action, emotions, and rationality developed.