Pseudohistory covers a variety of theories that conflict with the view of history accepted by mainstream historians and that are typically not properly researched, peer-reviewed, or supported by standard historiographical methods. One of the primary examples of pseudohistory is annexation denial, but many conspiracy theories are also properly classed as pseudohistory. One characteristic distinguishing pseudohistory from history is shared with other forms of pseudo-scholarship: the choice of medium. Normal scholarly debate, including legitimate historical revisionism, is conducted in specialised publications such as journals. Many pseudohistorians skip that step and publish their claims directly in a popular format, in books and articles aimed at a non-specialist general public that cannot effectively evaluate their plausibility.
Though "real" history has many gaps and plausible assumptions are sometimes necessary, the historians behind it seek the truth and honest mistakes can be made. Pseudohistory is the work of intentional revisionism or deluded attempts to desperately prop up beliefs. This isn't to say different presumptions of history are unreasonable given ambiguous findings, but reasonable historians don't try to shoehorn their agenda into the past. Honest historical research tries to find the blanks that need to be filled in. Pseudohistory treats past events like Mad Libs.
The first principle of understanding history, I was taught, is to sympathise with the historical actors, to immerse oneself in their context and perspective. Otherwise, history becomes a fabricated reconstruction, more about the writer's ideology than about the events of the past. Such a benchmark can be challenging when addressing advocates of false knowledge: how can one portray their claims as both reasonable and false? One seems to risk abandoning rationality and slipping into relativism.
The epithets pile up: naive, biased, prejudiced, cynical, gullible, undiscriminating, unscrupulous, undisciplined, unorthodox, irrational, spiritual, flawed, fallacious, sensationalistic, amusing, quirky, eccentric, crazy, bizarre, embarrassing, pathetic, off-beat, audacious (or 'almost unimaginably audacious'), outrageous, rhetorically clever, wild, extremist, over-eager, obsessive, manic, nefarious, reprehensible, and contemptible, not to mention communist and obfuscating. Pseudohistory is a 'charlatan's playground' of 'opportunists', targeting those all too 'willing to suspend disbelief' and slip into an 'abyss of fantasy'. It is 'corrosive of concepts of authority, objectivity and factual evidence', an 'enemy unto Knowledge'. A triumph for those who revel in others' errors and credulity.
My historiography, however, seems to rest, in part, on a once popular but now outmoded epistemological model. Philosophers today acknowledge that human minds are the product of evolution, with various cognitive patterns and limitations. The conventional ideal of transcendental rationality (whether in philosophy, economics or other disciplines) is simply unrealistic. Epistemology has become naturalised. Cognitive science or psychology is now integral to understanding how we can know what we know – or what we don't know.
I ask, 'how can a person know what is truth and fact, and what is lie and error in history, or science for that matter?', and I answer: 'the answer is evidence'. Any 'educated person' or 'competent reader can and should be able to identify it [pseudohistory]'. This is the conventional rationalist's stance, echoed in other books about pseudoknowledge for a popular audience. Of course evidence is foundational. But once epistemology is naturalised, the problem is not so simple. One major cognitive phenomenon is confirmation bias: early perceptions and interpretations tend to shape later perceptions and interpretations. As a consequence, we often draw conclusions before all the relevant information is available, or when evidence is essentially incomplete (the conventional fallacy of 'hasty generalisation'). In addition, our minds unconsciously filter observations, tending to select or highlight confirming examples while discounting or peripheralising counterexamples. Ultimately, not all the 'available evidence' is really cognitively available. The believer in pseudohistory typically does respect the need for relevant evidence, and believes that it has been secured (witness their expansive volumes). Merely rehearsing the evidence against pseudohistorical claims is hardly sufficient for remedying those beliefs, or for understanding why anyone holds them.
One cannot know everything. Typically, one relies on experts. Even experts rely on other experts. One inevitably depends on the testimony of others. But who is an expert? And how does the non-expert know? Even if reliability of evidence is the ultimate aim, assessing credibility becomes the proximate tool. The foremost challenge for most people becomes deciding whom to believe, not what the evidence indicates. Trust is essential. (Ironically, believers in pseudoscience often parade their skepticism, challenging acknowledged experts.) In the practical pursuit of reliable knowledge, then, well-placed trust seems more important than the rationalist's widely touted skepticism. Social assessments of credibility loom larger than logic. In such contexts, attributions of gullibility offer little insight or guidance.
Equally important, when one addresses pseudohistory as a set of beliefs, one implicitly adopts the challenge of interpreting a psychological phenomenon. Beliefs need not be rational or grounded in evidence at all. Indeed, beliefs sometimes (often? nearly always?) precede the 'justifications' that are developed post facto to rationalise them. It should surprise no one that a religious orientation will generate a history that legitimates its views, even if that history is false, or, further, that believers will seek to inscribe that account into unassailable nature or imbue it with some other form of irrefutable authority. No wonder, then, that counterevidence fails to weaken those beliefs. Given how our minds function, evidence and 'rational' thought are often secondary to belief. 'Kooks' are everywhere.
Even in the best of all possible (human) worlds, then, individuals can be mistaken. Even Nobel Prize winners. Smart people can advocate false history, paranormal phenomena and other 'weird things'.
Several academics have espoused 'wrong' ideas: Barry Fell, Charles Hapgood, Martin Bernal and others. They remain a conundrum for me: their irrationality contravenes the academic mantle of absolute authority. Alternatively, one may well wonder how we manage to construct academic and other institutions that seem to buffer themselves against spurious thinking. We might want to celebrate how public institutions, for the most part, do not succumb to pseudohistory even when it is so rampant in the culture, and to ask why this is the case. For conventional rationalists, rational belief is the expected norm, and 'deviation' from that norm is attributed to social and psychological factors. But the real challenge seems to be the symmetrical explanation: how does something as unusual and fragile as rationality, empirical science or reliable history emerge psychologically and socially? Most recent analyses by professional historians, philosophers and sociologists of science highlight the significance of the scientific community. Error is regulated socially, through a system of checks and balances. Individuals may each err or adopt idiosyncratic perspectives, but collectively, through critical discourse, they expose and accommodate each other's blind spots. The locus of scientific rationality (if one finds such a concept fruitful at all) thus lies at the community level, where different perspectives interact: a social epistemology. Contrasting cultural perspectives, properly deployed, are a source of epistemic strength, not a handicap. Indeed, all the cases I describe were resolved within the academic community through this social system, not through raw facts or brute methodology alone. In a healthy intellectual community, individuals who espouse pseudohistory become isolated and ineffectual. So one may ask: in what ways is popular culture structured similarly or differently, and what types of mutual accountability result?
Scientific communities and their conclusions can still express bias. Witness historical cases of sexist theories of hysteria and other female behavior, racist theories of intelligence and human evolution, religious theories of Earth's age and history, and theories of eugenics, biological determinism and others. When scientists all share the same cultural perspective, bias can persist unchecked, even amidst claims of objectivity and evidence. In pursuing reliable knowledge, science thus relies epistemically on diverse communities. One might well apply that insight to non-academic communities, as well: conceptual or ideological homogeneity is generally not congenial to securing truth.
My concerns about pseudohistory include the popular context. On several occasions, I call upon the diffuse theme of a 'cultic milieu': a nebulous subculture (or counterculture) that serves as a reservoir for false beliefs and somehow nurtures their continuity. According to Colin Campbell, who introduced the concept, the cultic milieu is inherently heterodox and dissident. It thus celebrates free expression, the liberty of belief, and resistance to authority. In this view, pseudohistory, pseudoscience and pseudoreligion all reflect a similar and ineliminable feature of secular society: expect no remedy. Here (finally!) is an analytical claim to seed a potentially fruitful psychological and sociological analysis. A feature of singular value in my account is the dogged tracing of 'intellectual pedigrees'. Ideas discredited on one occasion, I show, tend to reappear later in another guise. The same false ideas are recycled time and again. That might lead one to explore, following Campbell's sketch, the social structures and communication networks that convey such ideas. How do certain cultural contexts or power relationships foster pseudohistory, eclipsing or peripheralising available evidence? I set aside this opportunity in favour of purely 'rationalist' criticism: apparently, we are simply to lament the 'vague anti-intellectualism' and fret about the 'nadir of objective and empirical knowledge'. The disparaging connotations of the term 'cultic' seem more valuable here than sociological analysis. My view of authority may be reflected, perhaps, in the immoderate use of epigraphs.
While focusing primarily on threats to the integrity of academic rationality, I have also touched upon the cultural consequences of false beliefs. Unscrupulous profit, foremost. Power, too. Fun, amusement and entertainment, perhaps (The Da Vinci Code, Indiana Jones, etc.), though apparently unjustifiably, given the lies. Add mass suicides, racist serial killings and civil war, and you have quite a spectre. Yet the causal power or role of the historical claims is usually overstated. The history typically seems to rationalise other, deeper motives, such as in-group identity, out-group blame or political power. Thirty-nine members of the Heaven's Gate cult died in 1997 believing they were the privileged team to board a spaceship that had arrived to annihilate Earth. But their motives were surely more about belonging to something larger than themselves than adhering to some alien apocalyptic tale. One can find flaws in the Nation of Islam's historical claims, too. But, as I note, members also found personal stability and purpose, adopting a healthy and abstemious diet while refraining from alcohol, tobacco, illegal drugs, promiscuity, adultery, prostitution and gambling. One may ask how the harm of some subsidiary details of a derived false history (which surely had little to do with promoting racist or religious behavior) compares with such benefits. I imply that if pseudohistory and its kin were remedied by rational (factual and methodologically correct) thinking, we would forestall racism, anti-Semitism, religious cults, capitalistic exploitation and the rest. This causal connection is, of course, far from established.
Some claims of pseudohistory or pseudoscience do warrant public concern: for example, Holocaust denial, the teaching of creationism or 'intelligent design', or the portrayal of global warming as a hoax. These are significant to both public and personal decision making. To solve such problems, I suggest more teaching of critical thinking. This would seem plausible, were it not for the empirical evidence otherwise. Belief in the paranormal is extraordinarily resilient to such teaching (as currently taught). Levels of belief, for example, have remained steady over several decades even as 'critical thinking' instruction has spread. Deeper thinking seems to be correlated, rather, with personalities that are open to new experiences and that also exercise high standards of proof, reflecting a natural-selection epistemic model coupling blind variation with selective retention. Education surely seems appropriate, but only if we use a better model of 'critical thinking' (or 'rationality').
Deeper historical understanding of the cognitive and social origins of error, as profiled above, can guide reform. First, we need to dramatise the cognitive flaws associated with appeals to 'evidence' and 'logical thinking'. Too often, imagined justification is merely superficial rationalisation. We need to instill some appreciation of the fallibility of our minds. Evidence can be deceptive. Awareness of confirmation bias is foundational. Second, we need to underscore the role of social checks and balances and of distributed expertise (and of sanctioned, or registered, credibility). That means profiling occasions for trust and respecting others' perspectives. Ironically, echoing the experience of the ancient Greeks, we may need to deflate epistemological hubris and instill greater intellectual humility. Third, we need to foster tolerance, the valuing of heterogeneous perspectives and even the seeking out of alternative views, along with the responsibility of engaging critics. That, in turn, may involve nurturing a strong sense of self and personal security (a worldview not threatened by difference). More may be needed, but let these three benchmarks help us begin to restructure the meaning of teaching effective thinking skills.
More effective education will also need to respect the research literature on teaching and learning. Informed educators now reject the model of authority whereby teachers list known fallacies, provide illustrations, and test for recall and taxonomy. Professional educators underscore the value of problem-based learning. Students need to be engaged in the process to develop skills: by following exemplars and practising applications. Episodes such as those discussed here could be valuable case studies in such an education. But to be effective, one must recreate the historical contexts, problems and information at hand, as in historical simulations. One must sympathise with the central characters and appreciate the reasonableness of error, given an incomplete view. That can then be contrasted with the later, more complete view. One must appreciate the 'ironic diptych' of reasonableness and falsity that often characterises history. Understanding the awkward relationship of alternative perspectives is how one can tame relativism without resorting to artificial absolutes. My account, unfortunately, leaves you wanting just such an enriched historical understanding.
Pseudohistory is all of that. Pseudohistorians capitalise on and exploit anomalies in evidence to support their claims, rather than examining the preponderance of research as a whole.