

Peer Review and the Scientific Process


IainB:
@Renegade: Yes, as you say, of course it's garbage science, but some people might quite rightly ask: "What's wrong with publishing it?" The alternative might be, for example, to call it "fake news" or something, and then censor and censure the authors and put them in prison for nothing more than being nutjobs or simply exceedingly barmy.

Fortunately we have stopped the inhumane process of locking all barmy people and nutjobs up. It's not their fault that they seem to occupy a widely different reality to the rest of us.
As a child, I was fascinated by the concept of monasteries and convents. I asked my mother why these men and women would lock themselves up like that with groups of people who all believed the same sorts of things. She said that they had anomalous beliefs and thus had difficulty fitting in with a society that did not share those beliefs, and found that they were happier in the sort of "tribal lockups" (echo chambers) afforded by the convent/monastery. There they were happy and their collective energies could often be directed to doing useful/good things for others - enabling them to live potentially productive/useful lives - rather than being inhumanely locked up and whiling away their lives in the quiet Hell of a lunatic asylum.

The problem comes, though, when people with strongly-held anomalous beliefs and views take to a public stage and adopt a charismatic, messianic stance, and people start to fall under their spell. So an Austrian nutjob like Hitler, for example, could lobotomise a whole nation of apparently intelligent German people, who would then blindly follow him to perdition. He could do no wrong, and therefore, by extension, neither could those who followed him (QED) - as the Nuremberg trials revealed. (Ring any bells?)

Such leaders do this by creating a hateful, false and artificial dichotomy between "people who think correctly and believe correctly as we do" and those who don't - the "Others". The "Other" will be variously regarded as being stupid, evil, deplorable unbelievers, or similar, and thus they are not real people deserving of a life of freedom and tolerance, and so must be variously converted, rehabilitated, re-educated, imprisoned, beaten up, or (at worst) tortured and beheaded/killed, and have their property expropriated. Since the "Other" is not a real person as we are, there is no ethical wrong in treating them like the scum/deplorables/unbelievers that they are - it is entirely legitimate to do so. This is simple fascism, and it seems to have invaded and permeated religion, Science and politics alike - witness (say) recent scary examples in the US, the EU, and of course the Middle East.

IainB:
Fascinating paper here with more on the “replication crisis”. It helps to explain how we can erroneously come to take something as "fact" when it has not been proven.
Research: Publication bias and the canonization of false facts
Silas Boye Nissen, Tali Magidson, Kevin Gross, Carl T. Bergstrom
University of Copenhagen, Denmark; University of Washington, United States; North Carolina State University, United States
DOI: http://dx.doi.org/10.7554/eLife.21451
Published December 20, 2016
Cite as eLife 2016;5:e21451

Abstract
Science is facing a “replication crisis” in which many experimental findings cannot be replicated and are likely to be false. Does this imply that many scientific facts are false as well? To find out, we explore the process by which a claim becomes fact. We model the community’s confidence in a claim as a Markov process with successive published results shifting the degree of belief. Publication bias in favor of positive findings influences the distribution of published results. We find that unless a sufficient fraction of negative results are published, false claims frequently can become canonized as fact. Data-dredging, p-hacking, and similar behaviors exacerbate the problem. Should negative results become easier to publish as a claim approaches acceptance as a fact, however, true and false claims would be more readily distinguished. To the degree that the model reflects the real world, there may be serious concerns about the validity of purported facts in some disciplines.

DOI: http://dx.doi.org/10.7554/eLife.21451.001
(...read more at the link)

--- End quote ---
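To make the mechanism in the abstract concrete, below is a minimal simulation sketch of the dynamic it describes - the community's belief in a claim evolving as a Markov chain driven by published results, with negative results partially censored. This is not the eLife authors' actual model; the function and all parameter names and values (p_pos_if_true, p_publish_neg, etc.) are illustrative assumptions only.

--- Code: ---
# Minimal sketch (assumptions, not the eLife authors' model) of the dynamic
# the abstract describes: the community's belief in a claim is updated by a
# stream of published experimental results, but a negative result is only
# published with probability p_publish_neg (publication bias). The community
# updates naively, as if all results were published, so a false claim can
# drift toward canonization as "fact".
import random

def simulate_claim(true_claim: bool,
                   p_pos_if_true: float = 0.8,   # assumed power of each experiment
                   p_pos_if_false: float = 0.2,  # assumed false-positive rate
                   p_publish_neg: float = 0.1,   # chance a negative result is published
                   prior: float = 0.5,
                   canonize_at: float = 0.99,    # belief threshold for "fact"
                   reject_at: float = 0.01,      # belief threshold for rejection
                   max_experiments: int = 10000) -> str:
    belief = prior
    for _ in range(max_experiments):
        # Each experiment yields a positive or negative outcome.
        positive = random.random() < (p_pos_if_true if true_claim else p_pos_if_false)
        # Publication bias: unpublished negatives never shift community belief.
        if not positive and random.random() > p_publish_neg:
            continue
        # Naive Bayesian update from the published result (ignores the censoring).
        if positive:
            num = belief * p_pos_if_true
            den = num + (1 - belief) * p_pos_if_false
        else:
            num = belief * (1 - p_pos_if_true)
            den = num + (1 - belief) * (1 - p_pos_if_false)
        belief = num / den
        if belief >= canonize_at:
            return "canonized"
        if belief <= reject_at:
            return "rejected"
    return "undecided"

# How often does a FALSE claim become canonized, at two publication-bias levels?
random.seed(1)
for p_neg in (0.1, 0.5):
    results = [simulate_claim(true_claim=False, p_publish_neg=p_neg)
               for _ in range(1000)]
    share = results.count("canonized") / len(results)
    print("p_publish_neg=%.1f: %.0f%% of false claims canonized" % (p_neg, share * 100))
--- End code ---

With these assumed rates, the drift in belief flips sign at roughly p_publish_neg = 0.25: publish fewer negatives than that and false claims tend to drift up to "fact"; publish more and they tend to be rejected - which is the paper's point about making negative results easier to publish.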

I think this could also be relevant to the condition Scott Adams refers to with a new term - "cognitive blindness":
"It probably does have a name. It’s a mix of cognitive dissonance and confirmation bias at the least, but a special case in my opinion."
...
"It isn’t that you disagree with the strong form of the argument on the other side so much as you don’t know it exists no matter how many times it is put right in front of you."

--- End quote ---

Such blindness may extend to self-serving fraud. In this example report, scientists/management at the US DOE had apparently planned a scheme to deliberately withhold relevant information from Congress, so that members of Congress could not see the strong form of an argument that risked disproving DOE management's preferred "truth" and politicised line of research funding: Dept of Energy gov’t scientist fired for answering Congressional questions contrary to DOE management views.

Renegade:
Such blindness may extend to self-serving fraud. In this example report, scientists/management at the US DOE had apparently planned a scheme to deliberately withhold relevant information from Congress, so that members of Congress could not see the strong form of an argument that risked disproving DOE management's preferred "truth" and politicised line of research funding: Dept of Energy gov’t scientist fired for answering Congressional questions contrary to DOE management views.
-IainB (January 15, 2017, 10:34 AM)
--- End quote ---

I think you're right about outright fraud, but what might be even worse is simple error in "settled science".

I'm quite a fan of the Thunderbolts Project.

http://www.thunderbolts.info/

http://www.holoscience.com/

https://www.youtube.com/channel/UCvHqXK_Hz79tjqRosK4tWYA

If you go through their materials, it's flat-out terrifying that we've made so many fundamental errors in science.

The most recent video walks through some issues:

https://youtu.be/Bl4fVY2d5ok

In particular, here:

https://youtu.be/Bl4fVY2d5ok?t=400

The 3 points he makes at 7:30 can't be ignored.

We should re-examine basic science as it appears that much of what we think we know is simply wrong.

IainB:
@Renegade:
The Thunderbolts Project looked interesting. Thanks. Good questioning approach as well - useful for educational purposes and for stimulating the enquiring mind (like my 15 y/o daughter's).

If scientific error seems to be present as:
(a) deliberate error, in cases of outright fraud - apparently in some areas of commercially-driven science and in at least that one US government department (QED);
(b) errors in the application of the scientific method - e.g., in cases of gross error in other supposedly scientific institutions, "settled science", etc.

 - then I find it not so much "terrifying" as simply appalling that, when one goes through their materials, one can see so many egregious fundamental errors in scientific conclusions.

The quote they give is rather pertinent:
"l know that most men, including those at ease with problems of the greatest
complexity, can seldom accept even the simplest and most obvious truth, if
it be such as would oblige them to admit the falsity of conclusions which
they have delighted in explaining to colleagues, which they have proudly
taught to others, and which they have woven, thread by thread,
into the fabric of their lives." — Leo Tolstoy

--- End quote ---

We seem to invest so much of ourselves into our beliefs/paradigms that we inhibit our ability to think freely/critically: nothing must be allowed to contradict those beliefs, which have (to us) become seemingly sacred truths. Edward De Bono called this "intellectual deadlock", and saw it as the single biggest obstacle to improving one's thinking skills (ref. "Teaching Thinking", by Edward De Bono). He theorised that, at bottom, it was the ego protecting itself.

As far as De Bono could see, this self-crippling phenomenon had a tendency to occur in the minds of most people but, from experience, was more likely in intelligent people - regardless of profession (i.e., whether they were business people, politicians, scientists, peer reviewers, etc.).

EDIT: 2017-08-03 - Major correction to wording in the last para re "fallacy".
And this is before one considers the self-evident fallacy that peer review proves the truth of some research, when in fact it proves nothing except that a review has taken place (QED). The material reviewed could still be false/flawed.

Renegade:
Ego is definitely an issue. And a difficult one. Admitting that you're wrong is hard. The first point in being honest is being honest with oneself.
