April 30, 2017

Expert Speech

What we see throughout history is a nearly unbroken chain of episodes in which some theory that will ultimately overtake and revolutionize a field is marginalized by the current experts in that field.

Line By Line

Jean Astruc, physician to Louis XV, set out to stamp out germ theory once and for all.

“There are some, however, whom I forbear now to spend time in confuting, such as Augustus Hauptman and Christian Langius, who think that the venereal disease is nothing else but a numerous school of nimble, brisk, invisible living things, of a very prolific nature, which when once admitted, increase and multiply in abundance, which lead frequent colonies to different parts of the body and inflame, erode and exulcerate the parts they fix on… In short, which without regard to the particular quality of any humor occasion all the symptoms that occur in venereal disease. But as these are mere visionary imaginations, UNSUPPORTED BY ANY AUTHORITY, they do not require any argument to refute them… If it were once admitted that the venereal disease could be produced by invisible living things swimming in the blood, one might with equal reason allege the same thing, not only of the plague, as Athanasius Kircher, the Jesuit, and John Saguens, a Minim, lately have done. Germs could also be said to cause hydrophobia, itch, tetters and other contagious diseases, and indeed all the distempers whatsoever; and thus the whole Theory of Medicine would fall to the ground, as nothing could be said to prove the venereal disease depending upon little living things which might not be urged to prove that all other diseases were derived from the like little living things, though of a different species – THAN WHICH NOTHING COULD BE MORE RIDICULOUS.”

Between 1546 and 1846, scientists had to battle to get germ theory to see the light of day. For most people, the quote above is rather humorous. But to those who have said unpopular things and been mobbed, it is close and real.

The story of Ignaz Semmelweis is also instructive.

Semmelweis was a Hungarian obstetrician who worked in the Vienna General Hospital in 1847. He discovered that the women in ward 1 had a much higher incidence of puerperal fever than those in ward 2. It was so much higher that ward 1 was called the ward of death, and women would physically resist being put in that ward.

What was happening was that doctors would work on corpses, then deliver babies in ward 1. Semmelweis found that the incidence of puerperal fever could be reduced to nearly zero if the doctors would just wash their hands in a chlorinated lime solution.

Despite puerperal fever being virtually eradicated as a result of Semmelweis’ policy, his idea never caught on in his lifetime. Semmelweis spent the rest of his life arguing for doctors to wash their damned hands, went insane, and died in a nuthouse. We can only speculate as to why he went insane.

The popular explanation of puerperal fever at the time was that it was psychosomatic. Mid-1800s. Psychosomatic. Not a new idea.

In 1854 there was a cholera outbreak in London. John Snow went to investigate the cause, and he operated on the then-unfashionable germ theory. By plotting the cholera cases on a map of London, he narrowed the source of the outbreak to the Broad Street water pump where people went to get water.

The pump was disabled, and the outbreak promptly died down. As it turned out, the pump had been built over a cesspit that had been forgotten about.

John Snow then tried to get the city to adopt a general policy to reduce fecal contamination of water. The officials refused, rejecting the germ theory of disease on the grounds that it was too depressing to even contemplate. So even when the theory was applied and worked, it didn’t sway the authorities.

But Snow did manage to persuade the Reverend Henry Whitehead, who had set out to debunk him. Whitehead held to the miasma (bad air) theory of disease, and in the process of trying to refute Snow, came to be convinced by him and became a believer in germ theory. This was a time when the church and the university weren’t as separate as they are now.

Let’s go back to William Harvey. In 1628, William Harvey challenged Galen, claiming that the heart pumped blood throughout the body. Galen had believed that blood was generated in the heart and liver and then consumed by the organs, with new blood continually created to replace it. Harvey wasn’t taken seriously and was generally seen as an eccentric joke for going up against the established theories of Galen.

It wasn’t until Marcello Malpighi, looking at a bat’s wing, found capillaries and could show how blood flowed from the arteries into the veins that the circulation theory of blood gained ground. But it’s instructive that Galen’s idea had zero evidence behind it, just tradition, and yet uprooting that evidence-less tradition required evidence.

Luigi Galvani was referred to as “The Frog’s Dancing Master” by his contemporaries for arguing for the existence of animal electricity. Alessandro Volta managed to produce electricity by chemical means, and so Galvani was seen as something of a joke, his experiments dismissed as merely producing electricity chemically near a frog’s legs and mistaking it for animal electricity.

Of course we now know that electrical signals are vital to move muscles in the body, but he was ridiculed at the time. On a side note, Galvani opposed the French Revolution and predicted it would end in disaster.

In 1794, Chladni was ridiculed for his belief in the existence of meteorites.

In 1810, Goethe was ridiculed for his theory of colors.

In the early 1800s, Carl Gauss spoke of but never published anything on non-Euclidean geometry, for fear of ridicule. Nikolai Lobachevsky later published writings on non-Euclidean geometry and was ridiculed, and it still took decades for non-Euclidean geometry to be generally accepted.

In 1827, Georg Ohm set up a series of experiments to measure electrical resistance, eventually leading, in 1881, to the ohm being adopted as the official unit of electrical resistance.

During his lifetime, however, he got no respect. One reviewer said of Ohm’s work,

“[It] is a web of naked fancies, which can never find the semblance of support from even the most superficial observation of facts; he who looks on the world with the eye of reverence must turn aside from this as the result of an incurable delusion, whose sole effort is to detract from the dignity of nature.”

The Prussian minister of education in 1830 said of Ohm’s work, “A professor who preached such heresies was unworthy to teach science.”

In 1842, Christian Doppler proposed the optical Doppler effect, the red-shift and blue-shift, but was opposed for 26 years because his theory didn’t fit with the then established Luminiferous Aether theory. Doppler was proven right in 1868 when Huggins found red and blue shifts in stars.

In 1887, Svante Arrhenius introduced the idea that atoms could form ions – that is, that they could be missing an electron or carry an extra one. This was at odds with the conventional thinking that atoms were indivisible, because of the Greek origin of the term. Those Greeks I tell ya. They had things to say.

In the 1920s, John Logie Baird and his television camera were laughed at.

Here’s something few have heard about: the Wright brothers actually spent a year after Kitty Hawk flying their plane next to a busy rail line in Dayton, Ohio. Authorities refused to come to the demonstrations, Scientific American published stories debunking the “Lying Brothers”, and the local newspaper never bothered to even send a reporter, though it did complain about the local crazies who swore they saw the thing fly. It wasn’t until the Wright brothers went to Europe that the invention became an overnight sensation.

The US authorities weren’t just “slow to react”, as the History Channel will say. They downright denied it.

In 1912, Alfred Wegener pushed the idea of continental drift, the forerunner of plate tectonics. Dr. Rollin T. Chamberlin of the University of Chicago said,

“Wegener’s hypothesis in general is of the footloose type, in that it takes considerable liberty with our globe, and is less bound by restrictions or tied down by awkward, ugly facts than most rival theories.”

Pierre Termier, the director of the Geological Survey of France, said Wegener’s work was,

“a beautiful dream, the dream of a great poet. One tries to embrace it, and finds that he has in his arms but a little vapor and smoke; it is at the same time both alluring and intangible.”

The president of the American Philosophical Society said,

“Anyone who valued his reputation for scientific sanity would never dare support such a theory.”

Edward Berry, a professor of paleontology at Johns Hopkins, said,

“My principal objection to the Wegener hypothesis rests on the author’s method. This, in my opinion, is not scientific, but takes the familiar course of an initial idea, a selective search through the literature for corroborative evidence, ignoring most of the facts that are opposed to the idea, and ending in a state of auto-intoxication in which the subjective idea comes to be considered as an objective fact.”

In 1926, Rollin Chamberlin said,

“If we are to believe Wegener’s hypothesis, we must forget everything which has been learned about earth science in the last 70 years and start all over again.”

The theory of plate tectonics wasn’t accepted until the 1960s.

On the topic of geology: in 1923, J Harlen Bretz put forth the theory, based on the Washington scablands, that the landscape could be shaped by massive, catastrophic events such as floods. This was at odds with the uniformitarian view. Bretz was not kindly received by the establishment, which immediately pounced on his theory as outrageously wrong and in need of debunking.

Bretz came into contact with Joseph Pardee, who was convinced by Bretz’s work. Pardee wanted to work with Bretz, but was dissuaded by his employer, who threatened to fire him if he associated with Bretz.

Then in 1927, the matter came to a head when Bretz was invited to a debate at the Geological Society of Washington. Six authoritative geologists were lined up to oppose him, and they completely trounced him; many of Bretz’s former supporters abandoned him after this debate.

In 1940, that same society organized a field trip to the scablands designed to put any doubt to rest. Bretz was invited, but declined for fear of further ridicule. Eight geologists went on the trip, and they all sat around and reported their findings, until Pardee stood up and quietly explained certain formations that could only have formed in a massive, catastrophic event.

Pardee was not blackballed for this act; his input was merely quietly acknowledged, then ignored. It wasn’t until the 1970s that Bretz’s ideas were generally accepted.

In 1919, Robert Goddard had been experimenting with rockets, and hypothesized that with the right design and scale a rocket could escape the Earth’s atmosphere and perhaps even reach the moon. That year he published a book entitled “A Method of Reaching Extreme Altitudes”, in which he outlined all of this.

He was criticized by the New York Times, which said that a rocket could never work in the vacuum of space because it would have nothing to push against, and claimed Goddard didn’t understand basic high school physics. Goddard had his funding cut, was blackballed, acquaintances turned away – you know the drill.

In 1931, Karl Jansky, an engineer and amateur astronomer, had been experimenting with radio equipment and built a radio telescope. He was not blackballed, but he was ignored by contemporary astronomers, who were all using optical telescopes.

The case of Ernst Stückelberg is illustrative not because of any mobbing or blackballing like the others, but because of what he discovered and how the credit went to others who were better at presenting their work:

1934: He devised a fully covariant perturbation theory for quantum fields that was more powerful than other formulations at the time.

1935: He gave vector boson exchange as the theoretical explanation for the strong nuclear force. This is normally credited to Yukawa, who discovered it independently at around the same time.

1938: He discovered that massive electrodynamics contains a hidden scalar, which would later become known as the Abelian Higgs mechanism.

1942: He proposed the interpretation of the positron as a negative-energy electron traveling back in time, an observation now credited to Feynman.

1943: He came up with a renormalization program to attack the problem of infinities in quantum electrodynamics. This was a precursor to the work for which Feynman, Schwinger and Tomonaga won a Nobel Prize, but he got no mention.

1953: He and Andre Petermann discovered the renormalization group, but Kenneth Wilson took the Nobel Prize for it after applying it to something important.

The point here is the inherently social nature of academia. Zwicky and Zweig are also examples of brilliant people who just didn’t get on well with their peers.

In 1948, Barbara McClintock discovered transposons and the role of Activator and Dissociation elements in the DNA sequence. She wrote of her experience:

“Over the years I have found that it was difficult if not impossible to bring to consciousness of another person the nature of his tacit assumptions when, by some special experiences, I have been made aware of them. This became painfully evident to me in my attempts during the 1950s to convince geneticists that the action of genes had to be and was controlled. It is now equally painful to recognize the fixity of assumptions that many persons hold on the nature of controlling elements in [corn] and the manners of their operation. One must await the right time for conceptual change.”

Basically she’s saying researchers were unaware of their assumptions, even when she would point them out.

We saw similar things happen with Francis Rous and his idea of viral cancer in 1911, Joseph Goldberger regarding pellagra, Fritz Zwicky on dark matter, George Zweig on quarks, and Chandrasekhar on black holes.

Lynn Margulis on endosymbiotic organelles: today, textbooks usually present her discovery as fact, but don’t mention the dogmatic opposition to it, or how she was denied funding and told never to apply again.

Fernando Nottebohm did research on birds showing that neurons grow in adulthood. At first he was ridiculed, but 20 years later we know that adult humans can grow new neurons too.

Theodore Maiman on the laser. Stanley Prusiner on prions, an idea that was initially scorned but rapidly vindicated when mad cow disease began to spread to humans.

Which reminds me of Barry Marshall, who was trying to show his opponents that the bacterium Helicobacter pylori was the cause of stomach ulcers. He finally decided to ingest the bacteria and quickly developed severe gastritis, and that effectively ended the debate.

Then there was Josiah Nott, who pushed the idea that yellow fever was transmitted by mosquitoes. He was ignored, and the group of men who championed his theory at the turn of the century were derisively called the “Mosquito-men” and seen as crackpots. It wasn’t until the eradication of mosquitoes during the construction of the Panama Canal, and the resulting collapse in yellow fever and malaria, that the mosquito transmission theory was accepted.

The last item on the laundry list I want to go over here is the story of Warren S. Warren. Warren discovered spin interactions between distant molecules, which apparently means something for magnetic resonance imaging machines. His colleagues knew he was wrong, warned him that he was endangering his career if he kept up such nonsense, and actually held a roast where they mocked and ridiculed his work.

Seven years later his results were vindicated; they are used to improve MRI techniques today, and nothing happened to those who mocked him.

That’s what I could find. How much more has been forgotten? How many episodes of this academic thuggery have gone down the memory hole?

Bretz opposed uniformitarianism; how many other researchers wanted to point to the same thing but were shut down?

Many of these people were only vindicated after they had died.

No Supermen

What many people forget, when looking at the theories that worked marvelously and the people who came up with them, is the many wrong theories those same people ALSO held.

Julius von Mayer – the guy who gave us the conservation of energy – also believed that the Sun would burn out in 5,000 years if not replenished by a new energy source, and that it was able to keep burning because of meteorites that hit it.

Isaac Newton – Isaac Newton’s main interest was not physics. It was studying hidden meanings in the Bible, and alchemy. He believed that if he knew the dimensions of the Temple of Solomon he could predict the date of the Apocalypse, and that it was possible to create the philosopher’s stone, which granted immortality and could turn any metal into gold.

Alfred Russel Wallace – came up with evolution by natural selection independently, and even helped Charles Darwin work through several concepts. Wallace believed that we could talk to spirits on the other side and held séances to try to do so.

Of course, then there is the fact that nearly everyone, at least before 1900, was a professed Christian, and probably a young-earth creationist.

I also want to point to three figures who epitomize this: Karl Marx, Sigmund Freud, and Noam Chomsky.

The genius of Karl Marx was to recognize antagonistic classes. Prior to Marx, the predominant thinking had been of classes of society that were harmonious, that existed with pre-set functions and worked together as a seamless whole.

When we look at the medieval pictures depicting those who fought, those who prayed, and those who worked, we read them as descriptive, a record of the way things happened to be at that point in history, not of some divinely or naturally inevitable order. The modern eye doesn’t realize that this was PRESCRIPTIVE, meant to lay out the way things SHOULD be.

The modern eye sees that those who worked were getting screwed; in that sense we are all Marxists.

Wedded to this is a bunch of baggage, e.g. the Marxist labor theory of value, which treats profit as inherently the seizure of surplus value by the capitalist, even though Marx later recognized that the capitalist serves some function. So you had a breakthrough in thinking + horrible baggage.

Sigmund Freud’s great breakthrough was the idea of the subconscious. Prior to this, people generally didn’t believe in a subconscious. Today the idea of the subconscious is common coin. Of course, Freud then went on to postulate anal and oral stages of development, penis envy, and all sorts of weird psycho-sexual nonsense that doesn’t really work. Again, a breakthrough in thinking + silly baggage.

Noam Chomsky, in the field of linguistics, made a tremendous break from his contemporaries, who viewed linguistics as verbal taxonomy; Chomsky instead looked at language as merely a clue to the underlying psychology. You look at the language people speak, and how they learn it, as a clue to what’s going on down below.

Today Chomsky isn’t really on the cutting edge anymore, even though he set the paradigm. And I believe his more radical conception of deep structure will be remembered as a bit silly.

The reason I bring up these geniuses who believed wrong things is not to malign Newton’s genius. It is to say that the superman doesn’t exist, to point out just how mortal these scientists are.

If you go to a university, there is no doubt you have good professors. Great teachers, great faculty who help you and have genuine concern for you.

And so it is important to keep separate criticism of the institution from criticism of the individuals in those institutions, and to recognize that at the end of the day we’re still just hairless apes.

Wrong Shifts

One cannot deny that, on the whole, scientific consensus progresses toward truth. But when someone says for example that global warming is a sham, they aren’t making a claim that scientific consensus in general is more often wrong than right. Just that it is wrong in a specific instance, and that this is not unprecedented.

Scientific consensus has shifted in incorrect ways before. The easiest way to show this, without having to actually go into the deep and endless complexity of actual scientific debates, is with examples of scientific consensuses that went back and forth.

For example, if scientists held to theory A, then moved to theory B, then back to theory A, we would know that at least ONE of those movements had to have been in the wrong direction.

Fat vs. Carbohydrates as cause of obesity –

In 1976 Ancel Keys wrote the “Seven Countries” study, which argued that a diet high in saturated fat caused heart disease. Prior to that, the conventional wisdom had been that carbohydrates caused obesity and heart disease.

In 1988, Surgeon General C. Everett Koop made a statement that a diet high in saturated fat caused obesity and heart disease, and by around the 1990s the “low fat” craze began, where foods that were high in fat had the fat removed and replaced with carbohydrates.

However, since 2005 the consensus has moved back in the direction of carbohydrates as the chief culprit, and in 2010 a symposium of nutritionists meeting in Copenhagen came to no firm conclusion on the topic.

I am not here to argue who is correct or incorrect. The point is that consensus was pro-animal fat, then it shifted to pro-carb, now it’s shifting back to pro-fat.

This means the consensus had to have shifted in an incorrect direction at one of these times.

Minimum Wage –

Adam Smith supported a minimum wage on the grounds that capital could collude much more easily, and hold out in the short term more comfortably, than labor. While in the long run capital probably needed labor more, in practice the laborers would feel hunger pangs and break first, and so needed legislation to keep their wages up.

David Ricardo argued against it, saying,

“These then are the laws by which wages are regulated, and by which the happiness of far the greatest part of every community is governed. Like all other contracts, wages should be left to the fair and free competition of the market, and should never be controlled by the interference of the legislature.”

And as far as we can tell, by the early 1900s economists were split on the question of the minimum wage, as outlined in “The Very Idea of Applying Economics: The Modern Minimum-Wage Controversy and Its Antecedents” by Thomas C. Leonard.

However, in 1936 John Maynard Keynes published “The General Theory of Employment, Interest and Money”, in which he was very much opposed to minimum wages; in fact, “sticky wages” were a central problem in his theory.

The first survey of economists on the minimum wage that I know of was in 1978, from the American Economic Association, and it showed that 68% of economists believed the minimum wage decreased employment, 22% agreed with provisions, and 10% generally disagreed.

That same question, asked again in 1992, found that 56.5% of economists believed the minimum wage decreased employment, 22.4% agreed with provisions, and 20.5% generally disagreed.

In 2003, 45.6% of economists believed the minimum wage decreased employment, 27.9% agreed with provisions, and 26.5% generally disagreed.

In 2006, Robert Whaples surveyed members of the American Economic Association and asked what they supported. 46.8% of economists supported eliminating the minimum wage, 1.3% wanted it decreased, 14.3% wanted it to stay the same, and 37.7% wanted it increased.

In 2013, the IGM economics survey panel found that 47% of economists thought that raising the minimum wage was worth the distortionary costs, 32% were uncertain, and only 11% disagreed.

Whatever your view on the minimum wage, it must be the case that the consensus moved in the wrong direction at some point. When Keynes burst onto the scene, there was certainly an increase in opposition to the minimum wage among economists from whatever it was before, which is unclear since there are no surveys from back then.

And we know that from 1978 to 2013, economists have been shifting more in favor of the minimum wage. Since these two shifts were in opposite directions, one of them had to be wrong.

Behaviorism vs. Deep Structure in language –

Prior to Noam Chomsky, the consensus view in the field of linguistics was the “big brain” theory, or “behaviorist” theory of language, which was simply that the brain is an all-purpose computer that, through conditioning, learns language the way one learns math or learns how an engine works. The primary champion of this view was Burrhus Frederic Skinner.

Noam Chomsky in 1957 argued against this, saying that underlying all language was a universal grammar that all humans possessed. This “universal grammar” was not something that could be learned or specified, but was more of a pre-language structure of thought into which specific languages were plugged. By the 1960s this became the dominant school of thought in linguistics.

Increasingly, however, linguists are coming back to behaviorist views of language. The reason is that behaviorist views of language generally have more practical applications, and machine learning for language uses simple statistical models that have no underlying universal grammar. (Keep in mind that the existence of a speech area of the brain is not, by itself, evidence of universal grammar.)
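To make concrete what a “simple statistical model with no underlying grammar” looks like, here is a minimal bigram-model sketch in Python. The tiny corpus and every detail are invented for illustration; real systems are vastly larger, but the principle is the same: nothing but counts of which word tends to follow which.

```python
import random
from collections import defaultdict

# Minimal sketch of a bigram language model: it learns nothing except
# "which word tends to follow which word" from raw text. There is no
# grammar, universal or otherwise, anywhere in the model.
# The toy corpus below is made up purely for illustration.
corpus = "the dog chased the cat . the cat chased the mouse . the mouse ran away .".split()

# Count how often each word follows each other word.
counts = defaultdict(lambda: defaultdict(int))
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def next_word(prev):
    """Sample the next word in proportion to how often it followed `prev`."""
    words, weights = zip(*counts[prev].items())
    return random.choices(words, weights=weights)[0]

# Generate a short "sentence" purely from co-occurrence statistics.
word = "the"
output = [word]
for _ in range(8):
    word = next_word(word)
    output.append(word)
print(" ".join(output))
```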

I don’t know where the field as a whole stands today, but I do know that it is moving back toward the behaviorist side relative to where it was in say 1970, which would be another example of opposite shifts in the field of linguistics, meaning that one of the shifts must have been in the wrong direction.

Leeches and Bloodletting – The use of leeches to treat injuries and increase blood flow has been documented as far back as 2,500 years ago in ancient India and Greece. Their use peaked roughly around 1830 to 1850, then declined, and leeching came to be seen as quackery until the 1980s, when the rise of plastic and reconstructive surgery led to a rebound in leech usage; in 2004 the FDA approved leeches as a medical device.

Whatever your opinion on leeches, the fact is that they were widely used, then considered quackery, and are now making a comeback, meaning that at least one of these shifts had to have been in the wrong direction.

Lamarckian / Lysenkoist Evolution – Lamarckian evolution is the belief that long-term evolution occurs in response to the direct environmental stimuli experienced by the parent.

For example, if the parent faces a cold climate, the offspring will be directly adapted to a colder climate.

This view of evolution fell out of favor following the rise of Darwinian evolution, which posits that adaptation occurs in populations through the dying off of less-adapted individuals and/or the higher reproduction of individuals better adapted to an environment.

Because each generation is somewhat of a dice roll in terms of inherent traits, some individuals will be more or less adapted to, for example, a cold climate, and those better adapted will reproduce more, and so over time the population will be better adapted to the cold.
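As a toy illustration of that mechanism, here is a sketch in Python. Every number in it is made up; it only shows the logic: each offspring’s cold tolerance is a dice roll around its parent’s, the better adapted are more likely to survive to reproduce, and the population average drifts toward cold adaptation over generations.

```python
import math
import random

# Toy model of Darwinian adaptation to a cold climate. All numbers are
# invented for illustration; the point is only the mechanism: random
# variation each generation ("the dice roll") plus differential survival
# and reproduction of the better adapted.
random.seed(0)

# Each individual is just a number: its cold tolerance.
population = [random.gauss(0.0, 1.0) for _ in range(500)]

for generation in range(1, 21):
    # Individuals with higher cold tolerance are more likely to survive winter.
    survivors = [t for t in population if random.random() < 1 / (1 + math.exp(-t))]
    # Survivors reproduce; offspring inherit the parent's tolerance plus a small dice roll.
    population = [random.gauss(parent, 0.3) for parent in random.choices(survivors, k=500)]
    average = sum(population) / len(population)
    print(f"generation {generation}: average cold tolerance = {average:.2f}")
```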

However, it has since been discovered that the expression of genes can be modified through processes that change the kinds of proteins that genes produce. These are known as “epigenetic effects”, and are triggered all the time by environmental factors such as nutrition, stress levels, exercise, et cetera.

You can think of these “epigenetic effects” as dipswitches that modify how the genes are expressed, and the positioning of these dipswitches on the DNA can be inherited. Of course this cannot be the primary mode of evolution in the long run, as it is ultimately limited by the DNA pattern (the ACTG base pairs along the double-helix ladder), but it can in theory produce much bigger effects in the short run than natural selection can.
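Here is a rough sketch of the dipswitch picture, with invented gene names and no claim to biological accuracy: the sequence itself never changes, but the switch positions can be flipped by the environment and are copied into the offspring along with the sequence.

```python
# Rough sketch of the "dipswitch" picture of epigenetics. The gene names and
# the on/off mechanics are invented for illustration; no claim is made about
# how real methylation or histone marks work.

class Organism:
    def __init__(self, sequence, switches):
        self.sequence = sequence    # the ACTG pattern: never modified here
        self.switches = switches    # epigenetic "dipswitches": gene -> on/off

    def expressed_genes(self):
        # Which genes actually get expressed depends on the switch positions,
        # not just on the sequence.
        return [gene for gene, on in self.switches.items() if on]

    def experience_cold(self):
        # The environment flips a switch; the underlying sequence is untouched.
        self.switches["cold_response_gene"] = True

    def offspring(self):
        # The sequence is copied exactly, and the switch positions are
        # inherited along with it.
        return Organism(self.sequence, dict(self.switches))


parent = Organism("ACTGGTCA", {"cold_response_gene": False, "growth_gene": True})
parent.experience_cold()
child = parent.offspring()
print(child.expressed_genes())  # the child expresses the cold response with no change to the DNA
```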

And so the complete dismissal of the heritability of environmental effects was an overstretch. I don’t think it represented a shift in the wrong direction like the other four examples, as pure Darwinian evolution is more correct than pure Lamarckian evolution, but it’s an example of a partial return to an older idea that was completely dismissed, even though it was partially correct.

What’s the point of this?

Obviously bringing up a string of anecdotes is not the best kind of argument. Data is almost always better. So why even create an assembly of anecdotes like this?

The reason is the positive effect that learning about these episodes has had on my view of the nature of groups of experts. Because while you can know, by the data, that you shouldn’t believe them all the time, it is very easy to get sucked back in if expert-groups can create a feeling that they have addressed everything in good faith.

This is because someone who is devoting their life to a narrow topic will always know more than you do on that topic, and will always be able to cite more things. And so even if you know you shouldn’t be, you will be, at an emotional level, drawn to rank and the trappings of authority.

And the point of these anecdotes is to create an anti-authoritarian feeling to counter the feeling of comfort that authority provides.
