Because I suspected that “flat earth” (and for that matter, UFOs, despite their recent popularity) was a psyop used to discredit legitimate conspiracy theories, I never delved further into the details. Did flat earthers envision the earth as a rectangle floating in space? When an online acquaintance recommended the 2018 documentary Behind the Curve, I took the opportunity to learn more, although from the beginning I was wary of the film’s true purpose. “The NASA con job is somehow connected to various bugaboos: dangerous vaccines, chem trails, GMO foods, and a ‘transgender push in the media,’ as one young guy puts it,” a film reviewer states in a piece I found online. “Here we go,” I thought.
One of the first things that the documentary illuminates is that flat earthers view the earth as a dome, or a kind of “snow globe.” Disappointingly, the documentary provides no further explanation of flat earth theory nor any scientific refutation of the theory. Nor does it really explore what flat earthers believe the motivations behind an “earth is round” conspiracy might be; in most conspiracies, these are typically money, power, and/or cover-up of a crime. Instead, we get a lot of talking head analysis of the psychology of flat earthers. As I predicted, the talking heads associate flat earthers with other beyond-the-mainstream-pale types such as “anti-vaxxers” and JFK assassination conspiracy theorists (this despite the fact that more than 60% of Americans believe that Oswald did not act alone). Surprisingly, no one in the film ever brings up the moon landing.
I did discover that flat earth theory is a lot more popular than I assumed. There are flat earth conventions, dating sites, and celebrities. There’s a flat earther radio personality named Patricia Steere who has been accused by other flat earthers of being a CIA asset (and frankly, if I were to cast a honeypot agent, she would fit the bill).
Watching this documentary in 2023 was particularly thought-provoking, as many recently discredited “conspiracy theories” slowly gain credence within the mainstream press, and the public is asked, by some of the same outlets that initially discredited those conspiracy theories, to accept 33 gender identities as fact. A further accusation leveled against the flat earthers in the documentary is that they work backwards from their conclusion. This in an era when “captured industries” stand accused of the same thing, due to the corrupting power of funding sources. One of the flat earthers charges mainstream scientists with “scientism,” another timely concept.
At the end of the documentary, the flat earthers conduct an elaborate experiment to prove their hypothesis. I won’t spoil the ending, but it did make me think once again that examining conspiracy theories could be a useful tool for motivating student learning. What experiments might students dream up to test flat earth?
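One experiment students could work out on paper first: if the earth is a sphere, a distant object should be hidden below the horizon by a predictable amount. Here is a minimal sketch of that calculation, assuming a smooth sphere of the earth's mean radius and ignoring atmospheric refraction (which matters in real observations); the function name and the boat scenario are my own illustration, not anything from the film:

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean radius of the earth; smooth-sphere idealization

def hidden_height(distance_m: float, observer_height_m: float) -> float:
    """Meters of a distant object hidden below the horizon, for an observer
    whose eye is observer_height_m above the surface of a smooth sphere."""
    R = EARTH_RADIUS_M
    # Straight-line distance from the observer's eye to the horizon.
    d_horizon = math.sqrt(2 * R * observer_height_m + observer_height_m ** 2)
    remaining = distance_m - d_horizon
    if remaining <= 0:
        return 0.0  # object is nearer than the horizon: nothing is hidden
    # Height obscured by the bulge beyond the horizon.
    return math.sqrt(remaining ** 2 + R ** 2) - R

# A boat 10 km away, viewed from 2 m above the water:
print(round(hidden_height(10_000, 2.0), 1))  # ≈ 1.9 m hidden
```

On a globe, roughly the bottom two meters of that boat should vanish; on a flat earth, the prediction is zero. Students could test the two predictions with a zoom lens across a lake.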
Due to years of media fear mongering around “mis” and “dis”information, I am finding that people are increasingly intolerant of views that dissent from mainstream consensus. In this environment, I do find the flat earthers rather endearing. The true believers, anyway.
Top Image: A snow globe that contains a spiral galaxy, digital art.png/ Wikimedia Commons
"I am finding that people are increasingly intolerant of views that dissent from mainstream consensus."
Your intuition contradicts lots of converging evidence of *greater* mistrust of mainstream institutions -- media, higher ed, government, etc. -- in the US especially, and, I think, to varying degrees, throughout much of the industrialized West. (In fact, the most profitable way I've found to read most of your contributions to this Substack is as a colorful illustration of this troubling phenomenon.) And in the US especially, in which affective polarization is so high, it's far from clear what is meant by "mainstream consensus" in abstraction from particular issues (many of which themselves are -- increasingly? -- fraught with contention).
As for making sense of flat-earthers and others who espouse fringe beliefs, I think Hugo Mercier hits the nail on the head:
(1) We should wonder about the *social goals* such people are pursuing (e.g. bridge burning, commitment-signaling):
https://press.princeton.edu/ideas/what-do-you-really-know-about-gullibility
(2) We should generally default to assuming that such (more or less outlandishly) fringe "beliefs", in contrast to our mundane beliefs, are usually cognitively insulated from the inferential and behavior-guiding processes that might otherwise lead to costly actions:
https://drive.google.com/file/d/1WGlyX7_6YvlYd38GqtfeHnv6KtO_4oT7/view?usp=sharing
There are also some interesting personality dimensions conspicuously associated with folks who espouse fringe beliefs, such as narcissism and -- I think this one is especially salient -- need for uniqueness:
https://www.sciencedirect.com/science/article/pii/S2352250X22001051
Your last point has been my general interpretation of every individual espousing these kinds of fringe theories I have personally run across save one, and he was clearly schizophrenic to the point of hearing voices and talking to invisible creatures; I give him a pass.
Philosopher Lisa Bortolotti (author of Why Delusions Matter, my favorite book of the year so far) argues, quite compellingly to me, that we should resist the temptation to pathologize misbelievers. The cognitive biases at work in their belief-formation processes are, she argues, *not* equivalent to dysfunctional processes, but generally continuous with biases shared by all of us; they can serve important needs that are sometimes produced by challenging circumstances. So she counsels empathic engagement.
Today, coincidentally, she has an open-access paper featuring her ideas from the book that apply to conspiracy misbelievers (including vaccine hesitancy and anti-vaccination beliefs):
"Is it pathological to believe conspiracy theories?"
I read the article. It is quite interesting. And - methodologically - I agree with a lot of it. However, I can't help feeling it suffers from one amazingly huge flaw that runs all through it, bordering on a kind of hubris. I'm surprised she didn't address it, frankly. Here's a paragraph that illustrates it nicely:
////
"Let's consider the first position, where the deficit is an inability to inhibit an implausible hypothesis. In the two-factor theory, the first factor tells us why people come up with a hypothesis with its distinctive content to account for a significant event and do not accept the “official” explanation of the event. The first factor is often taken to be a form of epistemic mistrust towards the sources of mainstream information. The second factor tells us why the hypothesis, once formulated, is not rejected and is instead adopted as a belief in spite of its implausibility. In the case of conspiracy beliefs emerging at a critical time, the content of the hypothesis is that, even if authorities and other citizens do not acknowledge it, there is a plot by someone powerful and ill-intentioned which explains the crisis (e.g., the Chinese created the coronavirus in a lab in Wuhan). On this view, the hypothesis about the conspiracy is formulated because people do not trust the official account of the event and it is endorsed because people are unable to reject it on the basis of its implausibility."
////
She offers up an illustration of a conspiracy theory - in this case that the coronavirus was created in a lab in Wuhan - and it is clear as day that she assumes it to be implausible, so she tries out these various theories to explain why people subscribe to this idea in spite of an "official account" that denies it. She picks the theory she favors. But she never once entertains the idea that it might in fact be true. Indeed, by all reputable accounts at this point, it probably is true. And there actually was a conspiracy to cover it up. That is pure hubris. I actually love it. It's perfect.
And I don't as a rule subscribe to conspiracy theories. In fact, I frequently shoot them down when relatives and associates bring them up.
Nice catch, Jeff. I had similar reservations about that, too -- both in the paper and in her book from which the paper is closely drawn. My guess is that she may have written that when the credibility of the Wuhan lab-origin hypothesis seemed to her of much lower probability than many experts now accept.
So my approach to that part of the text was to ask myself if it undermines her gloss on that version of the two-factor model. And then ask whether it reflects bias pervading the rest of the paper (or book). My impression is that it doesn't, on either count. But yes, there's irony there you picked up on!
By the way, in case you're interested, another paper today co-authored by Bortolotti that focuses on one of the central goals of her book (and why I admire it so much):
"Why We Should Be Curious about Each Other"
https://www.mdpi.com/2409-9287/8/4/71
I have a hard time with the idea of “misbeliefs," particularly when they’re lumped together - “The battle against fake news, conspiracy theories, anti-vax rumors, and other popular misbeliefs.” How many news sources need to be shown to be untrustworthy; how many conspiracies must be brought to light; how many questions about vaccines need to be raised for these to stop being considered misbeliefs? As librarians, shouldn’t we encourage people to dig into ideas that contradict their own and that contradict the dominant narrative?
I thought this was a good conversation between Michael Shellenberger and Bret Weinstein regarding conspiracy theories--https://www.youtube.com/watch?v=mY591Ax0Bms
We have very different priors. I think our priority should be to support people getting to more reliable information and information sources, not furthering distrust.
From my vantage, you and Susan seem to share the same basic pieties about human gullibility with those on the "fighting misinformation" crusade. They think we should increase distrust by promoting so-called "critical thinking skills" and "intellectual virtues", and/or various censorship (or censorship-adjacent) social media policies; you two seem to think the masses are gullible to "the dominant narrative", "mainstream media", etc. I think all of you are wrong about our psychology and the general thrust of the misinformation research base that has emerged over the past 5 years. So if anyone around here is a "heterodox thinker" or whatever, it might well be me, the sheeple guy who pushes against "the dominant narrative" of epistemic individualism/superheroics/cosplay, etc.
Here, for any curious readers who stumble upon this, is a sample of how my priors are shaped (and why I think so much content here is far from anything I would regard as "heterodox"):
Misinformation on Misinformation
https://journals.sagepub.com/doi/full/10.1177/20563051221150412
Fighting misinformation or fighting for information
https://acerbialberto.com/post/2022_fake_news/
(Why) Is Misinformation a Problem?
https://theconversation.com/misinformation-why-it-may-not-necessarily-lead-to-bad-behaviour-199123
How Effective Are Interventions Against Misinformation?
https://psyarxiv.com/sm3vk
The Fake News about Fake News
https://www.bostonreview.net/articles/the-fake-news-about-fake-news/
Not Born Yesterday: The Science of Who We Trust and What We Believe
https://press.princeton.edu/books/hardcover/9780691178707/not-born-yesterday
I agree that we should support people getting to more reliable information and information sources, absolutely! Which is why I said we should encourage people to dig into ideas. I just happen to think that looking at alternative sources and challenging assumptions is a vital part of that journey.
But to clarify, I never said the majority of what your comment accuses me of. I agree that the "fighting misinformation crusade" is misguided. And I'm not sure who "all of you" are - I for one never mentioned psychology or anything about the masses being gullible. Rather, I pointed out an issue with the idea of "misbeliefs" and asked whether librarians should encourage people to look into ideas from different viewpoints.
You may be right in terms of a new divide between the section of the population who are becoming increasingly intolerant of non-mainstream points of view and the section of the population who are increasingly losing trust in institutions and the media. I have certainly encountered both.
I don't see how a general trend in increasing intolerance of non-mainstream points of view would be consistent with the well-evidenced trend of greater mistrust of mainstream institutions. Maybe a lack of imagination on my part.
One case of intolerance to non-mainstream points of view during the pandemic that springs to mind:
"Here we show that individuals who are vaccinated against COVID-19 express negative attitudes against unvaccinated individuals in the form of antipathy, stereotypes, support for exclusion from family relationships and support for removal of political rights. In total, these four forms of discriminatory attitudes are consistent with the observation of prejudice according to standard definitions in social psychology."
Perhaps this piece of "mainstream media" from today provides some support for the conjecture about a trend in increasing intolerance of views that dissent from mainstream consensus (though here it is directed at a purveyor of misinformation -- a variety of intolerance I endorse):
"Anti-vaccine activists, some who work for Kennedy’s nonprofit group Children’s Health Defense, sat in the rows behind him, watching as he insisted 'I have never been anti-vaxx. I have never told the public to avoid vaccination.'”
I do agree with her here--"Pathologisation has consequences and should not be embraced lightly: if the belief is considered to be the outcome of a dysfunction, then the person reporting it may not be taken seriously. Rather than the belief being challenged and argued against, it may be merely treated as the symptom of a disorder—as something to get rid of by attempting to restore functionality in the cognitive mechanisms responsible for the adoption and maintenance of beliefs." The best way to react to a conspiracy theory is to provide evidence to the contrary, if that exists. Something the film failed to do.
True. However, some are constructed in such a way that they are impossible to disprove - in the sense of Karl Popper: some theories are unfalsifiable. The flat earthers' arguments have generally struck me that way. I suspect most of the people who advocate the unfalsifiable variants know this on some level and use the beliefs to stand out and be seen. Attempting "empathic engagement" is precisely what they want. There is an analogy with hypochondria, where what is sought is medical engagement.
I would think at this point that "round earth" could be proven, but certainly there is a lot about space that is unknown. I did watch a video on why "flat earth" matters, and to summarize a few points-- because round earth and associated beliefs deny a more religious view of the universe with a creator and humans at the center; because "scientism" should be rejected; because one should not base one's life on lies; because the experts are fallible and expertise is often wrong; because believing that there is a lot we don't know yet makes for a more exciting and engaging life. I am most sympathetic to the latter two, but of course, there IS a lot we don't know about space (and biology, and physics, etc.) yet. On another note, there are a few aspects about quantum physics that I remember thinking at the time might be planted misdirections (although I can't recall exactly what those are now).
Someone sent me a more scientific film about flat earth, but I have such a poor background in science that it's difficult for me to evaluate much: https://www.youtube.com/watch?v=Jv-G-edMc5E
I've always been struck by the work of Kurt Gödel. I don't claim expertise in that field, but I've read a fair bit about it, and it seems to me that it has philosophical and perhaps even religious implications far beyond its mathematical origins. His two incompleteness theorems, I'm referring to. He was a mathematician in Vienna in the 1930s (he later moved to Princeton), working in the same area Bertrand Russell and Alfred North Whitehead were studying: mathematical systems of knowledge.
Russell and Whitehead were convinced that one could - theoretically, at least - come up with a way of completely describing the set of all possible rules for a given axiomatic system. It could be infinite. Even uncountably infinite. But one ought to be able to come up with a way to describe the compass and requirements of such a system in terms of completeness (any valid rule is in the system) and consistency (one rule can't contradict another rule). What Gödel did was come up with an insanely clever way of describing the mechanics such a system would have to have. And he realized in doing so that it could be complete, but would then necessarily contain inconsistent elements; or it could be consistent, but would then be missing valid elements. It could not be both.
He presented his original idea in a roundtable discussion at a 1930 conference in Königsberg - not a formal presentation. It wasn't published until the following year. People were resistant to it at first. Especially Russell and Whitehead, which is understandable, since they had been working on proof of the opposite result with much fanfare for many years. David Hilbert understood it very quickly. So did von Neumann, who had been in the Russell-Whitehead camp but proved the hard part of it to himself in the months after the conference and sent it to Gödel. There have been many books and essays written about it since then, most notably the best seller "Gödel, Escher, Bach," published in 1979.
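The "insanely clever" device at the heart of the proof is Gödel numbering: encoding every formula as a single integer, so that a system of arithmetic can make statements about its own syntax. Here is a toy sketch in Python; the symbol table and codes are illustrative inventions, not Gödel's actual scheme:

```python
# Toy Gödel numbering: a string of symbols becomes one integer by raising
# the n-th prime to the code of the n-th symbol. Unique prime factorization
# guarantees the encoding is reversible.

SYMBOLS = {"0": 1, "S": 2, "=": 3, "+": 4, "(": 5, ")": 6}  # illustrative codes
DECODE = {code: sym for sym, code in SYMBOLS.items()}

def primes():
    """Yield primes 2, 3, 5, ... by trial division (fine for toy inputs)."""
    found, n = [], 2
    while True:
        if all(n % p for p in found):
            found.append(n)
            yield n
        n += 1

def godel_number(formula: str) -> int:
    g = 1
    for p, ch in zip(primes(), formula):
        g *= p ** SYMBOLS[ch]
    return g

def decode(g: int) -> str:
    out = []
    for p in primes():
        if g == 1:
            break
        e = 0
        while g % p == 0:
            g //= p
            e += 1
        out.append(DECODE[e])
    return "".join(out)

n = godel_number("S0=S0")  # "the successor of zero equals the successor of zero"
print(n, decode(n))        # prints 808500 S0=S0
```

Because factorization into primes is unique, the integer can always be decoded back to the original string - which is what lets a theory of arithmetic "quote" and reason about its own formulas.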
For me (a math grad-student for several years before finishing in electrical engineering), I have always thought of parallels in general theories of knowledge. We always strive for consistency - it's the essence of syllogistic reasoning. But the whole thrust of the scientific enterprise is the quest for completeness. We are always trying to find the holes and fill them. And yet Gödel seems to have shown that in some sense there are limits - that we have to decide what will have to give. It wouldn't surprise me to read someday that the whole modern physics/quantum mechanics endeavor is subject to the same constraints. The Heisenberg uncertainty principle seems like it could be looked at this way. And the business about electrons orbiting atomic nuclei. You can apparently measure very accurately where one is at a given time, but not how fast it's going or which direction. Or you can measure speed and direction, but not where it is. Similar idea, seems to me. Or analogous, at any rate. And when I read a lot of Buddhism, I find myself thinking about this theory from the other end: completeness without consistency.
Peter Turchin has a good working definition of "conspiracy theory" that he references in his excellent new book "End Times."
1. The conspiracy/effort is believed to be masterminded consciously by a relatively small and anonymous group;
2. Their motivations are vague or shifting;
3. It's plausible enough to be believed before one does any real research or if only certain facts are presented.
He warns that they historically pop up and eventually take on political import as societies de-stabilize and authorities and "media" are trusted less.
That last part is the key point--people are not less tolerant of dissent from mainstream sources these days. People are ignoring or outright distrusting those sources. The sources themselves are less tolerant--the outlets, the corporations who own them, the politicians who collude with the corporations, etc. Americans are learning to do "Soviet news math," a phenomenon described by dissidents in the days of the Cold War. If Soviet news said 40 people died in the crash, it was probably really 75. If they said Soviet forces were welcomed in Kandahar, they actually had a firefight to blast their way in. If they said Soviet forces were retreating, then they were probably massacred.
While I agree there is growing distrust (and I believe Pew Research bears this out), I have also encountered an increasing hostility in other people to consider anything that doesn't have the stamp of approval of, say, NPR or the New York Times. This may be the flip side of the coin. The growing distrust "out there" creates fear in some people and a greater determination to believe their trusted sources.
Theirness/ourness is a predictable and sad outcome resulting from piss-poor journalism instructors encouraging j-students to abandon neutrality. That same piss-poorness is in the library schools these days.
I really like that phrase "Soviet news math." It certainly captures my experience interacting with news sources - it takes a lot of effort to try to suss out the truth within the obviously biased sources. I'll have to check out End Times!
"I am finding that people are increasingly intolerant of views that dissent from mainstream consensus."
Your intution contradicts lots of converging evidence of *greater* mistrust of mainstream institutions -- media, higher ed, government, etc. -- in the US especially, and, I think, to varying degrees, throughout much of industrialized West. (In fact, the most profitable way I've found to read most of your contributions to this Substack is as a colorful illustration of this troubling phenomenon.) And in the US especially, in which affective polarization is so high, it's far from clear what is meant by "mainstream consensus" in abstraction from particular issues (many of which themselves are -- increasingly? -- fraught with contention).
As for making sense of flat-earthers and others who espouse fringe beliefs, I think Hugo Mercier hits the nail on the head:
(1) We should wonder about the *social goals* such people are pursuing (e.g. bridge burning, commitment-signaling):
https://press.princeton.edu/ideas/what-do-you-really-know-about-gullibility
(2) We should generally default to assuming that such (more or less outlandishly) fringe "beliefs", in contrast to our mundane beliefs, are usually cognitively insulated from the inferential and behavior-guiding processes that might otherwise lead to costly actions:
https://drive.google.com/file/d/1WGlyX7_6YvlYd38GqtfeHnv6KtO_4oT7/view?usp=sharing
There are also some interesting personality dimensions conspicuously associated with folks who espouse fringe beliefs, such as narcissism and -- I think this one is especially salient -- need for uniqueness:
https://www.sciencedirect.com/science/article/pii/S2352250X22001051
Your last point has been my general interpretation of every individual espousing these kinds of fringe theories I have personally run across save one, and he was clearly schizophrenic to the point of hearing voices and talking to invisible creatures; I give him a pass.
Philosopher Lisa Bortolotti (author of Why Delusions Matter, my favorite book of the year so far) argues, quite compellingly to me, that we should resist the temptation to pathologize misbelievers. The cognitive biases at work in their belief-formation processes are, she argues, *not* equivalent to dysfunctional processes, but generally continuous with biases shared by all of us; they can serve important needs that are sometimes produced by challenging circumstances. So she counsels empathic engagement.
Today, coincidentally, she has an open-access paper featuring her ideas from the book that apply to conspiracy misbelievers (including vaccine hesitancy and anti-vaccination beliefs):
"Is it pathological to believe conspiracy theories?"
https://journals.sagepub.com/doi/10.1177/13634615231187243
Sounds like interesting work, Ill give it a look. Thanks, Rob.
I read the article. It is quite interesting. And - methodologically - I agree with a lot of it. However, I can't help feeling it suffers from one amazingly huge flaw that runs all through it, bordering on a kind of hubris. I'm surprised she didn't address it, frankly. Here's a paragraph that illustrates it nicely:
////
"Let's consider the first position, where the deficit is an inability to inhibit an implausible hypothesis. In the two-factor theory, the first factor tells us why people come up with a hypothesis with its distinctive content to account for a significant event and do not accept the “official” explanation of the event. The first factor is often taken to be a form of epistemic mistrust towards the sources of mainstream information. The second factor tells us why the hypothesis, once formulated, is not rejected and is instead adopted as a belief in spite of its implausibility. In the case of conspiracy beliefs emerging at a critical time, the content of the hypothesis is that, even if authorities and other citizens do not acknowledge it, there is a plot by someone powerful and ill-intentioned which explains the crisis (e.g., the Chinese created the coronavirus in a lab in Wuhan). On this view, the hypothesis about the conspiracy is formulated because people do not trust the official account of the event and it is endorsed because people are unable to reject it on the basis of its implausibility."
////
She offers up an illustration of a conspiracy theory - in this case that the coronavirus was created in a lab in Wuhan - and assumes, it is clear as day that she assumes it to be implausible and so she is trying out these various theories to explain why it is people who ascribe to this idea in spite of an "official account" which denies it. She picks the theory she favors. But she never once entertains the idea that it might be in fact true. In by all reputable accounts at this point, it probably is true. And there actually was a conspiracy to cover it up. That is pure hubris. I actually love it. It's perfect.
And I don't as a rule subscribe to conspiracy theories. In fact, I'm shooting them down from my relatives and associates frequently.
Nice catch, Jeff. I had similar reservations about that, too -- both in the paper and in her book from which the paper is closely drawn. My guess is that she may have written that when it the credibility of the Wuhan lab origin hypothesis seemed to her of much lower probababilty than many experts now currently accept.
So my approach to that part of the text was to ask myself if it undermines her gloss on that version of the two-factor model. And then ask whether it reflects bias pervading the rest of the paper (or book). My impression is that it doesn't, on either count. But yes, there's irony there you picked up on!
By the way, in case you're interested, another paper today co-authored by Bortolotti that focuses on one of the central goals of her book (and why I admire it so much):
"Why We Should Be Curious about Each Other"
https://www.mdpi.com/2409-9287/8/4/71
I have a hard time with the idea of “misbeliefs," particularly when they’re lumped together - “The battle against fake news, conspiracy theories, anti-vax rumors, and other popular misbeliefs.” How many news sources need to be shown to be untrustworthy; how many conspiracies must be brought to light; how many questions about vaccines need to be raised for these to stop being considered misbeliefs? As librarians, shouldn’t we encourage people to dig into ideas that contradict their own and that contradict the dominant narrative?
I thought this was a good conversation between Michael Shellenberger and Bret Weinstein regarding conspiracy theories--https://www.youtube.com/watch?v=mY591Ax0Bms
thank you!
We have very different priors. I think our priority should be to support people getting to more reliable information and information sources, not furthering distrust.
From my vantage, you and Susan seem to share the same basic pieties about human gullibility with those on the "fighting misinformation" crusade. They think we should increase distrust by promoting so-called "critical thinking skills" and "intellectual virtues", and/or various censorship or -adjacent social media policies; you two seem to think the masses are gullible to "the dominant narrative", "mainstream media", etc. I think all of you are wrong about our psychology and the general thrust of the misinformation research base that has emerged over the past 5 years. So if anyone around here is a "heterdox thinker" or whatever, it might well be me, the sheeple guy who pushes against "the dominant narrative" of epistemic individualism/superheroics/cosplay, etc.
Here, for any curious readers who stumble upon this, is a sample of how my priors are shaped (and why I think so much content here is far from anything I would regard as "heterodox"):
Misinformation on Misinformation
https://journals.sagepub.com/doi/full/10.1177/20563051221150412
Fighting misinformation or fighting for information
https://acerbialberto.com/post/2022_fake_news/
(Why) Is Misinformation a Problem?
https://theconversation.com/misinformation-why-it-may-not-necessarily-lead-to-bad-behaviour-199123
How Effective Are Interventions Against Misinformation?
https://psyarxiv.com/sm3vk
The Fake News about Fake News
https://www.bostonreview.net/articles/the-fake-news-about-fake-news/
Not Born Yesterday: The Science of Who We Trust and What We Believe
https://press.princeton.edu/books/hardcover/9780691178707/not-born-yesterday
I agree that we should support people getting to more reliable information and information sources, absolutely! Which is why I said we should encourage people to dig into ideas. I just happen to think that looking at alternative sources and challenging assumptions is a vital part of that journey.
But to clarify, I never said the majority of what your comment accuses me of. I agree that the "fighting misinformation crusade" is misguided. And I'm not sure who "all of you" are - I for one never mentioned psychology or anything about the masses being gullible. Rather, I pointed out an issue with the idea of "misbeliefs" and asked whether librarians should encourage people to look into ideas from different viewpoints.
You may be right in terms of a new divide between the section of the population who are becoming increasingly intolerant of non-mainstream points of view and the section of the population who are increasingly losing trust in institutions and the media. I have certainly encountered both.
I don't see how a general trend in increasing intolerance of non-mainstream points of view would be consistent with the well-evidenced trend of greater mistrust of mainstream institutions. Maybe a lack of imagination on my part.
One case of intolerance to non-mainstream points of view during the pandemic that springs to mind:
"Here we show that individuals who are vaccinated against COVID-19 express negative attitudes against unvaccinated individuals in the form of antipathy, stereotypes, support for exclusion from family relationships and support for removal of political rights. In total, these four forms of discriminatory attitudes are consistent with the observation of prejudice according to standard definitions in social psychology."
https://scholar.google.com/citations?view_op=view_citation&hl=en&user=kmkQqPMAAAAJ&sortby=pubdate&citation_for_view=kmkQqPMAAAAJ%3AzCpYd49hD24C&inst=6416714965532506866
Perhaps this piece of "mainstream media" from today provides some support for the conjecture about a trend in increasing intolerance of views that dissent from mainstream consensus (though it's a laudatory variety of intolerance for a purveyor of misinformation that I endorse):
"Anti-vaccine activists, some who work for Kennedy’s nonprofit group Children’s Health Defense, sat in the rows behind him, watching as he insisted 'I have never been anti-vaxx. I have never told the public to avoid vaccination.'”
https://apnews.com/article/rfk-kennedy-election-2024-president-campaign-621c9e9641381a1b2677df9de5a09731?user_email=1d3fb4ae7a7ca9e7b2ad2d034fd43172bee10868f7048e6307d2846c66da50a9&utm_medium=Afternoon_Wire&utm_source=Sailthru&utm_campaign=AFTERNOON%20WIRE%20JULY%2031&utm_term=Afternoon%20Wire
I do agree with her here--"Pathologisation has consequences and should not be embraced lightly: if the belief is considered to be the outcome of a dysfunction, then the person reporting it may not be taken seriously. Rather than the belief being challenged and argued against, it may be merely treated as the symptom of a disorder—as something to get rid of by attempting to restore functionality in the cognitive mechanisms responsible for the adoption and maintenance of beliefs." The best way to react to a conspiracy theory is to provide evidence to the contrary, if that exists. Something the film failed to do.
True. However some are constructed in such a way that they are impossible to disprove - in the sense of Karl Popper: some theories are unfalsafiable. The flat earther's arguments have generally struck me that way. I suspect most of the people that advocate the unfalsafible variants know this on some level and use the beliefs to stand out and be seen. Attempting "empathic engagement" is precisely what they want. There is an analogy with hypochondria where what is sought is medical engagement.
I would think at this point that "round earth" could be proven, but certainly there is a lot about space that is unknown. I did watch a video on why "flat earth" matters, and to summarize a few points--because round earth and associated beliefs deny a more religious view of the universe with a creator and humans at the center; because "scientism" should be rejected; because one should not base one's life on lies; because the experts are fallible and expertise is often wrong; because believing that there is a lot we don't know yet makes for a more exciting and engaging life. I am most sympathetic to the latter two, but of course, there IS a lot we don't know about space (and biology, and physics, etc.) yet. On another note, there are a few aspects of quantum physics that I remember thinking at the time might be planted misdirections (although I can't recall exactly what those are now).
Someone sent me a more scientific film about flat earth, but I have such a poor background in science that it's difficult for me to evaluate much: https://www.youtube.com/watch?v=Jv-G-edMc5E.
I've always been struck by the work of Kurt Gödel. I don't claim expertise in that field, but I've read a fair bit about it, and it seems to me that it has philosophical and perhaps even religious implications far beyond its mathematical origins. His two incompleteness theorems, I'm referring to. He was a mathematician working in Vienna in the 1930s (he later moved to Princeton's Institute for Advanced Study), in the same area Bertrand Russell and Alfred Whitehead were studying: axiomatic systems of knowledge. Russell and Whitehead were convinced that one could - theoretically, at least - come up with a way of completely describing the set of all possible rules for a given axiomatic system. It could be infinite. Even uncountably infinite. But one ought to be able to describe the scope and requirements of such a system in terms of completeness (any valid rule is in the system) and consistency (no rule can contradict another rule).
What Gödel did was come up with an insanely clever way of describing the mechanics such a system would have to have. And he realized in doing so that it could be complete but would then necessarily contain inconsistent elements, or it could be consistent but would then be missing valid elements. It could not be both. He presented his original idea in a roundtable discussion at a 1930 conference in Königsberg, Germany - not a formal presentation - and it wasn't published until the following year. People were resistant to it at first, especially Russell and Whitehead, which is understandable, since they had been working toward a proof of the opposite result with much fanfare for many years. David Hilbert understood it very quickly. So did von Neumann, who had been in the Russell-Whitehead camp but proved the hard part of it to himself in the months after the conference and sent it to Gödel. There have been many books and essays written about it since then, most notably the best seller "Gödel, Escher, Bach," back in 1979.
For me (a math grad student for several years before finishing in electrical engineering), there have always been parallels in general theories of knowledge. We always strive for consistency - it's the essence of syllogistic reasoning. But the whole thrust of the scientific enterprise is the quest for completeness: we are always trying to find the holes and fill them. And yet Gödel seems to have shown that in some sense there are limits, and we have to decide what will have to give. It wouldn't surprise me to read someday that the whole modern physics/quantum mechanics endeavor is subject to the same constraints. The Heisenberg uncertainty principle seems like it could be looked at this way. And the business about electrons orbiting atomic nuclei: you can apparently measure very accurately where one is at a given time, but not how fast it's going or which direction; or you can measure speed and direction, but not where it is. A similar idea, it seems to me. Or analogous, at any rate. And when I read a lot of Buddhism I find myself thinking about this theory from the other end: completeness without consistency.
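For what it's worth, the position/speed trade-off described above has a precise textbook form (this is my addition for clarity, not something from the film or this thread). Heisenberg's relation bounds the product of the two uncertainties from below:

```latex
% Heisenberg's uncertainty relation: \sigma_x and \sigma_p are the standard
% deviations of position and momentum measured over identically prepared systems.
\sigma_x \, \sigma_p \;\ge\; \frac{\hbar}{2}, \qquad \hbar = \frac{h}{2\pi}
% Driving \sigma_x toward zero (pinning down "where it is") forces \sigma_p,
% and with it the spread in speed and direction, to grow without bound.
```

So the analogy to Gödel is loose but suggestive: in both cases, sharpening one side of the description necessarily costs you something on the other side.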
I would also add that the belief that conspiracy theories are always wrong is itself a bias.
Peter Turchin has a good working definition of "conspiracy theory" that he references in his excellent new book "End Times."
1. The conspiracy/effort is believed to be masterminded consciously by a relatively small and anonymous group;
2. Their motivations are vague or shifting;
3. It's plausible enough to be believed before one does any real research or if only certain facts are presented.
He warns that they historically pop up and eventually take on political import as societies de-stabilize and authorities and "media" are trusted less.
That last part is the key point--people are not less tolerant of dissent from mainstream sources these days. People are ignoring or outright distrusting those sources. The sources themselves are less tolerant--the outlets, the corporations who own them, the politicians who collude with the corporations, etc. Americans are learning to do "Soviet news math," a phenomenon described by dissidents in the days of the Cold War. If Soviet news said 40 people died in the crash, it was probably really 75. If they said Soviet forces were welcomed in Kandahar, they actually had a firefight to blast their way in. If they said Soviet forces were retreating, then they were probably massacred.
While I agree there is growing distrust (and I believe Pew Research bears this out), I have also encountered an increasing hostility in other people to consider anything that doesn't have the stamp of approval of, say, NPR or the New York Times. This may be the flip side of the coin. The growing distrust "out there" creates fear in some people and a greater determination to believe their trusted sources.
"Their trusted sources."
Theirness/ourness is a predictable and sad outcome resulting from piss-poor journalism instructors encouraging j-students to abandon neutrality. That same piss-poorness is in the library schools these days.
I really like that phrase "Soviet news math." It certainly captures my experience interacting with news sources - it takes a lot of effort to try to suss out the truth within the obviously biased sources. I'll have to check out End Times!