It’s been a while since I last posted, so I’ve collected a number of interesting links on subjects I frequently discuss here, which I believe might interest my two regular readers.
By all means click on the links and read the original articles; they contain far more detail and additional sources, and what I provide below are just a few samples. We begin with an article that shows how gullible conspiracy theorists are, and how little scrutiny they apply to material that confirms their beliefs:
Facebook conspiracy theorists fooled by even the most obvious anti-science trolling: study
"Anti-science conspiracy theorists are so credulous they can’t determine when they’re being purposefully duped, according to a new study. A team of Italian and American researchers tested the social media biases feeding belief in conspiracy theories such as chemtrails, shape-shifting reptilian overlords, and the Illuminati, reported Motherboard. The researchers found that adherents to conspiracy theories are highly receptive to claims that support their views and rarely engage with social media pages that question their beliefs. The ongoing measles outbreak linked to unvaccinated children has exposed one danger posed by hostility toward science, which is promoted in large part through social media."
Continuing, a suite of articles that examine why some people put excessive trust in bogus claims and/or unduly doubt perfectly trustworthy science, including a discussion of the backfire effect: when people are confronted with actual data that contradicts their beliefs, they tend to dig in their heels even further on the bogus stuff.
Why Do Many Reasonable People Doubt Science?
"We live in an age when all manner of scientific knowledge—from the safety of fluoride and vaccines to the reality of climate change—faces organized and often furious opposition. Empowered by their own sources of information and their own interpretations of research, doubters have declared war on the consensus of experts. There are so many of these controversies these days, you’d think a diabolical agency had put something in the water to make people argumentative. And there’s so much talk about the trend these days—in books, articles, and academic conferences—that science doubt itself has become a pop-culture meme. In the recent movie Interstellar, set in a futuristic, downtrodden America where NASA has been forced into hiding, school textbooks say the Apollo moon landings were faked. In a sense all this is not surprising. Our lives are permeated by science and technology as never before. For many of us this new world is wondrous, comfortable, and rich in rewards—but also more complicated and sometimes unnerving. We now face risks we can’t easily analyze."
Why Do People Believe Stupid Stuff, Even When They’re Confronted With the Truth?
"Once something is added to your collection of beliefs, you protect it from harm. You do it instinctively and unconsciously when confronted with attitude-inconsistent information. Just as confirmation bias shields you when you actively seek information, the backfire effect defends you when the information seeks you, when it blindsides you. Coming or going, you stick to your beliefs instead of questioning them. When someone tries to correct you, tries to dilute your misconceptions, it backfires and strengthens them instead. Over time, the backfire effect helps make you less skeptical of those things which allow you to continue seeing your beliefs and attitudes as true and proper."
Why People "Fly from Facts"
"As public debate rages about issues like immunization, Obamacare, and same-sex marriage, many people try to use science to bolster their arguments. And since it’s becoming easier to test and establish facts—whether in physics, psychology, or policy—many have wondered why bias and polarization have not been defeated. When people are confronted with facts, such as the well-established safety of immunization, why do these facts seem to have so little effect? Our new research, recently published in the Journal of Personality and Social Psychology, examined a slippery way by which people get away from facts that contradict their beliefs. Of course, sometimes people just dispute the validity of specific facts. But we find that people sometimes go one step further and, as in the opening example, they reframe an issue in untestable ways. This makes potential important facts and science ultimately irrelevant to the issue."
Lest this lead to the idea that everybody just speaks out of their own set of biases, there is such a thing as expertise, and experts do have the right to claim a better understanding of (and consequently better reasoning about) issues within their domain of expertise.
On the “right” to challenge a medical or scientific consensus
"There’s a not-insignificant difference between saying ‘you have no business challenging scientific experts’ and ‘you have no right to challenge scientific experts.’ The first is a warning to lay people and people without the appropriate expertise about why they should be very careful challenging a scientific consensus without saying that they have no right to make such challenges. What Mooney calls for is the recognition that there is such a thing as expertise and challenging it requires more than just a Google education." [This article earns bonus points: it cites the work of the sociologist of science Harry Collins, one of my key inspirations]
And the fact is, there is actual empirical, experimental evidence that we should trust experts on issues within their domains:
The science of why you really should listen to science and experts
"[J]ust as it was once academically fashionable to dis experts, the worm is now turning, and many are now standing up for them again. And to that trend, we can now add empirical evidence in experts’ favor, thanks to a fascinating new study out by Yale law professor and science communication researcher Dan Kahan and a team of researchers and legal scholars. (…) Kahan thinks the findings are generalizable beyond judges to other kinds of domain experts – at least when they are in their realm."
The next article examines the problem of how journalists should deal with pseudoscientific claims in their reporting, and although I do not quite agree with it, it’s important that at least some journalists are concerned with the issue:
Reporting on quacks and pseudoscience: The problem for journalists
"(…) How to report on popular purveyors of scientific nonsense without ending up giving them even more exposure — that is, spreading the disease of misinformation in the process of trying to wipe it out. (…) Yet the very act of debunking carries the implication that the quack is important or credible enough to warrant attention in the first place. At Vox, Belluz offers a few rules for crank watchers. She advises going after not only cranks, but their enablers: Who launched the career of Dr. Oz? It was Oprah Winfrey. She largely escapes blame for Oz’s promotion of quackery, but as Brendan Nyhan of Dartmouth told Belluz, we should be ‘not just naming and shaming the public figures who mislead people but the institutions that give them platforms.’ Belluz argues, sagely, that we should ‘avoid giving equal weight to both sides of an argument that aren’t actually equal according to science.’ Case in point: climate change, which is accepted by the vast majority of scientific experts. Nor should journalists elevate every ginned-up scientific ‘controversy’ into one warranting intensive debunking — that’s a prime means by which fringe claims gain a foothold in popular consciousness. The media should reserve its fire for pseudoscience promoted by public figures and government institutions."
Another threat to science comes from the deliberate manipulation of uncertainty by commercial interests that see scientific knowledge as a threat to their bottom line, the tobacco industry being the first and foremost example. This is what Shermer calls "pseudoskepticism"; others have named it "manufactroversy":
What Can Be Done about Pseudoskepticism?
"What do tobacco, food additives, chemical flame retardants and carbon emissions all have in common? The industries associated with them and their ill effects have been remarkably consistent and disturbingly effective at planting doubt in the mind of the public in the teeth of scientific evidence. Call it pseudoskepticism. (…) Manufacturing doubt is not difficult, because in science all conclusions are provisional, and skepticism is intrinsic to the process. But as Oreskes notes, ‘Just because we don’t know everything, that doesn’t mean we know nothing.’ We know a lot, in fact, and it is what we know that some people don’t want us to know that is at the heart of the problem."
Finally, a way anyone can help fight the rising tide of ignorance, misinformation and crazy conspiracy theories: get informed, but please, from decent sources. A first step might be looking into the subjects listed in the article below: Proof; Theory; Quantum Uncertainty and Quantum Weirdness; Learned vs. Innate; Natural; Gene; Statistically Significant; Survival of the Fittest; Geologic Timescales; and Organic. Quantum theory and the fallacy of the "Natural" are two of my pet peeves, by the way:
10 Scientific Ideas That Scientists Wish You Would Stop Misusing