People have a tendency to assume that anything said by figures of authority must automatically be correct, at least when they are talking about their own fields. This, on its face, seems to be a logical assumption. After all, an expert has studied his field far more thoroughly than a layman, and with his greater knowledge he should be more capable of drawing conclusions.
Alas, this is not correct, as history has often shown. Plato and Aristotle taught that the Earth is the center of the universe; the Roman philosopher Lucretius taught that heavier objects fall faster than lighter ones; the United States Deputy Secretary of Defense said in 1998, “The Y2K problem is the electronic equivalent of the El Niño and there will be nasty surprises around the globe.” And the claim that “Human beings only use 10% of their brains” is popularly (if dubiously) attributed to Albert Einstein.

The educational system, as it is, is based on indoctrination – people are taught not how to think, but what to think. While exceptions exist, as a rule any thought that does not conform to the prevailing dogma is discouraged, ridiculed or outright forbidden. This means that questioning assumptions and checking “facts” is an uphill battle, one that most people are loath to fight. Consequently, most people – “experts” included – find it easier to “go with the flow” and accept commonly held views as facts, bowing to authority and disregarding the evidence of their own eyes. These “facts” are the same ones they were taught in school or through the media. That is to say, they are commonly accepted because it is in somebody’s interest for them to be accepted, not necessarily because they are correct. Students accept “facts” on the authority of a writer or a teacher, not on evidence – because it is easier, because it has to be done, and because they entered education believing they would be properly informed.

Even disregarding all this, a legitimate authority speaking within their area of expertise is not necessarily correct. No one knows everything, and humans have a tendency to fill gaps in their knowledge with related knowledge, experiences or their own personality; consequently, their conclusions can always be incorrect. Scientists, always held up as impartial, are just as susceptible to the standard human vice of closing one’s eyes to the evidence as anybody else.
They are just as motivated by emotion as the next person. As a result, scientists who contradict the prevailing dogma face isolation, ridicule or outright attacks. In the 1950s United States, the scientist Wilhelm Reich was jailed and his books were burned because he published research contradicting the prevailing dogma (his name certainly did not help matters either). In fact, the real problem of the West is scientism, the belief that other disciplines are “worthwhile only insofar as they conform their techniques of investigation to those of the physical and biological sciences” – which, in practice, means numbers. However, the human sciences, for example, cannot achieve their goals if limited to numbers, because humans are not computers but living beings; they act based on emotions, experiences and expectations, not just on what is mathematically logical or more profitable. Scientism is thus opposed to science, and is in fact a very militant religion.
Further, in many cases authorities are not in positions where they can be trusted to be objective (this fallacy is called “appeal to biased authority”), despite having both access to raw data and the knowledge and expertise to interpret them. To take a military example, military personnel are commonly accepted to be an authority on weapons systems. However, due to the nature of the procurement system – particularly in the United States – generals have a vested personal interest in seeing that expensive weapons procurement projects succeed, regardless of the actual performance of those weapons, as doing so secures them lucrative positions in the weapons industry after retirement. Even without such incentives, the fact remains that air forces are bureaucracies, and generals are top-notch bureaucrats. Bureaucrats tend to feel threatened by anything out of the box that disrupts their carefully set-up routine. Generals are also in a position to exert pressure on lower-ranking personnel and force them to support the “party line”. Consequently, statements by military personnel, especially generals, about the performance of weapon systems should not be accepted at face value, as they are likely to be intentionally false due to personal goals and concerns, or simply due to wishful thinking (“A doctor who treats himself has a fool for a patient.”). In one of many examples of such pressure, John Boyd was court-martialled for proving that the original Sidewinder could be dodged even without countermeasures. Robin Olds was ordered to cease BFM training in order to prevent mishaps; and even later, when Suter got approval to institute the Red Flag exercises, these were initially done well but quickly became riddled with bureaucratic safety measures, such as a 500 ft altitude limit. Many performance parameters will also be classified, which means that military personnel cannot reveal them even when they want to and no general is breathing down their necks.
Another form of the appeal to authority is the appeal to the masses: “the majority believes this to be true, so it is true” (the bandwagon fallacy). However, the majority is often not correct. While even the ancient Greeks knew that the Earth is round, and the scientific community in Europe never actually believed the flat-Earth myth, the majority of people believed the Earth to be flat. Humans are psychologically predisposed to accept views that are held by an authority, held by the majority, or made first on some subject, without much logical consideration or fact-checking; failure to do so creates considerable psychological distress, in good part due to peer pressure. High-status individuals also make it more likely that a subject will agree with an obviously false conclusion, even when the subject would normally be able to see clearly that the answer is incorrect. This can create a ripple effect and cause the vast majority of people, experts included, to agree on an incorrect assumption or position.
A related issue, especially problematic in extremely structured organizations (such as the military) due to their reliance on interdependence, respect for authority and the chain of command, is “groupthink”. In this case, a desire for the smooth functioning of a system results in dysfunctional, and often irrational, decision-making outcomes. Minimizing conflict is seen as an imperative, leading to the intentional suppression of critical evaluation and dissenting viewpoints. This leads to an unwillingness to question authority, and to a loss of individual creativity and independent thinking. The result is an illusion of invulnerability and an irrational overestimation of the group’s ability to make a correct decision – and, correspondingly, an underestimation of the ability of outsiders to notice flaws in the group’s decision-making cycle. Oftentimes the process is subconscious, caused by education and ingrained patterns of thinking, and is thus not noticed by members of the group. Groupthink led to the US Navy’s illusion of invincibility and thus directly to the Pearl Harbor fiasco: while Japanese preparations to attack the US were well known within the US military, nobody seriously considered the possibility of an attack against Hawaii, due to overestimation of the fleet’s ability to defend against air attack, underestimation of Japanese technological adaptability, and an illusion of safety due to the distances involved. Again, groupthink typically starts with figures of authority and makes its way down the ranks.
Consensus science, a related “scientific” approach, is itself a fallacy. It is little more than institutionalized groupthink, used to push ideas forward without proper debate. In that way, science is turned into politics: whereas politics requires consensus, science only requires a person to be verifiably correct. As Michael Crichton puts it: “Historically, the claim of consensus has been the first refuge of scoundrels; it is a way to avoid debate by claiming that the matter is already settled.” This is true for any field and any theme. Scientific consensus has often been wrong: claims that fevers are contagious were rejected by the scientific consensus in 1795, 1843 and 1849. In the 1920s, the scientific consensus was that pellagra was contagious, despite proof that it is caused by poor diet. Continental drift was denied by scientists for 50 years, until 1961, despite being obvious to a ten-year-old schoolchild looking at a map. Such examples are countless, yet people still rely on “most people agree on X” and “most [insert authority figures] agree on X” to counter logical arguments. Science has no shortage of charlatanry – weather science is incapable of predicting the weather twelve hours ahead, yet claims to be able to predict weather patterns hundreds of years into the future. Modern science tends to act like a medieval or early-modern religion, attacking anyone who disagrees with scientific dogma not through arguments but through raw force, condemning dissenters for heresy.
People, both the masses and authorities, often refuse all evidence and logic if it goes against their political beliefs. Reason is often just a way of justifying decisions made on an emotional basis, and when emotions and facts disagree – so much the worse for the facts. But even when a person genuinely wants to be objective, emotions and prejudices are often used – even unintentionally, or unconsciously – to fill the ever-present gaps in knowledge. This is true for all people, regardless of their background or education.
Even when it is reasonable to believe an authority, more direct evidence always takes precedence. For example, aircraft turn rates can be compared by comparing wing loading, g limits and aerodynamic configuration. Done correctly, such a comparison carries more weight than the claims of experts. If, however, turn rates have actually been measured and the data are available, those data take precedence over anything else.
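As a rough illustration of reasoning from physical parameters rather than from expert claims, the standard level-turn relation ω = g·√(n² − 1)/V gives turn rate from true airspeed and load factor. The sketch below uses that textbook formula with made-up numbers for two hypothetical fighters – the speeds and g-limits are illustrative assumptions, not data for any real aircraft.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2


def turn_rate_deg_s(speed_ms: float, load_factor: float) -> float:
    """Level-turn rate in deg/s from true airspeed (m/s) and load factor (g),
    using the standard relation omega = g * sqrt(n^2 - 1) / V."""
    return math.degrees(G * math.sqrt(load_factor ** 2 - 1.0) / speed_ms)


# Two hypothetical fighters at the same speed; the higher sustainable
# load factor yields the faster turn.
fighter_a = turn_rate_deg_s(speed_ms=180.0, load_factor=9.0)
fighter_b = turn_rate_deg_s(speed_ms=180.0, load_factor=7.0)
print(f"Fighter A: {fighter_a:.1f} deg/s, Fighter B: {fighter_b:.1f} deg/s")
```

Of course, this is exactly the kind of first-order comparison the paragraph above describes: measured flight-test turn rates, where available, override any such back-of-the-envelope estimate.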
On the other side of the coin, the fact that a person is arguing from a position of no authority does not mean that the argument is invalid. Oftentimes, people inside a system are incapable of seeing the forest for the trees they are surrounded by, and it takes a person from outside to make obvious observations about the system as a whole. Further, people outside the system are less likely to have a personal interest in either side of the argument, making it less likely that their observations will be biased. Of course, in any case, the quality of observations depends on the quality of the data available, meaning that a person from outside the system has to do more work “connecting the dots” to have a sufficient basis for their conclusions than a person who works within the system.
All of this does not mean that the claims of experts can simply be disregarded. It does mean that they should be judged on the basis of the available evidence, not on the basis of who said them, and that this evidence should be carefully scrutinized. If such evidence is not available, then the authority figure itself should be evaluated, especially for any possible causes of bias, such as connections to groups that stand to profit from the authority figure’s claims. If such connections exist – as they do between e.g. the USAF and the US defense industry, and indeed between most of the world’s militaries and their countries’ defense industries – then the authority figure immediately loses relevance. The authority figure should also be making statements about a field they have knowledge in; otherwise their statements are no more valid than those of laymen.
The above is actual practice in courts. While expert testimony is accepted, it is not accepted by itself. Rather, after an expert gives testimony, the facts and methodology he used to reach his conclusion are very carefully scrutinized. If they are found unsound, the expert’s opinion is rejected, regardless of his credentials.
The reason people have a habit of taking for granted anything stated by an expert or another figure of authority is nothing but good old intellectual laziness. Thinking is, whatever people may believe, not easy: it requires finding large amounts of data, then sorting and analyzing them, before drawing conclusions. Relying on experts’ opinions is just a shortcut to avoid the work of finding evidence and connecting the dots. It is also often used when a person is incapable of finding evidence, and thus uses an appeal to authority as a substitute. The same intellectual laziness is the reason why people, when discussing weapons systems, focus on the hardware specifications of weapons and ignore their meaning in actual combat usage, as well as the overwhelming importance of the human factor – that is, the user’s abilities and competence, and his interaction with the weapon. This is especially prevalent in assessments of weapons’ performance in combat, where the human factor is oftentimes ignored and the results of one side’s superiority in training, organization and adaptiveness are attributed to its – oftentimes nonexistent – superiority in hardware (see the invasion of France in 1940 or both Gulf Wars for classic examples of this fallacious approach).