A few years ago a group of researchers in Scotland studying learning in apes did some experiments (involving opening boxes to get a piece of candy inside) that showed that chimpanzees learn in a variety of “flexibly adaptive” ways, and that 3-year-old children presented with a similar task most often approached it in ways that appear less intelligent than the apes'. They “suggest that the difference in performance of chimpanzees and children may be due to a greater susceptibility of children to cultural conventions” (Horner and Whiten, 2005; Whiten, et al., 2004).
In my newsletter on puberty, I described some of the effects of foods and hormones on intelligence. Here, I want to consider the effects of culture on the way people learn and think. Culture, it seems, starts to make us stupid long before the metabolic problems appear.
For many years I described culture as the perceived limits of possibility, but people usually prefer to think of it as the learned rules of conduct in a society. In the late 1950s I was talking with a psychologist about the nature of “mental maps,” and I said that I found my way around campus by reference to mental pictures of the locations of things, and he said that his method was to follow a series of rules, “go out the front door and turn left, turn left at the first corner, walk three blocks and turn right, ....up the stairs, turn right, fourth office on the left.” He had been studying mental processes for about 40 years, so his claim made an impression on me.
I thought this style of thinking might have something to do with the growing technological preference for digital, rather than analog, devices. The complexity and continuity of the real world is made to seem more precise and concrete by turning it into rules and numbers.
Around the same time, I found that some people dream in vivid images, while others describe dreams as “listening to someone tell a story.”
Several years later, a graduate student of “language philosophy” from MIT told me that I was just confused if I believed that I had mental images that I could use in thinking. His attitude was that language, in its forms and in the ways it could convey meaning, was governed by rules. He was part of an effort to define consciousness in terms of rules that could be manipulated formally. This was just a new variation on the doctrine of an “ideal language” that has concerned many philosophers since Leibniz, but now its main use is to convince people that cultural conventions and authority are rooted in the nature of our minds, rather than in particular things that people experience and the ways in which they are treated.
George Orwell, whose novels showed some of the ways language is used to control people, believed that language should be like a clear window between minds, but knew that it was habitually used to distort, mislead, and control. Scientific and medical practices often follow the authority of culture and indoctrination, instead of intelligently confronting the meaning of the evidence, the way chimpanzees are able to do.
Not so many years ago, people believed that traits were “determined by genes,” and that the development of an organism was the result of--was caused by--the sequential expression of genes in the nucleus of the fertilized egg. When B.F. Skinner in the 1970s said “a gestating baby isn't influenced by what happens to its mother,” he was expressing a deeply rooted bio-medical dogma. Physicians insisted that a baby couldn't be harmed by its mother's malnutrition, as long as she lived to give birth. People could be quite vicious when their dogma was challenged, but their actions were systematically vicious when they weren't challenged.
An ovum doesn't just grow from an oocyte according to instructions in its genes, it is constructed, with surrounding nurse cells adding substances to its cytoplasm. Analogously, the fertilized egg doesn't just grow into a human being, it is constructed, by interactions with the mother's physiology. At birth, the environment continues to influence the ways in which cells develop and interact with each other.
Even during adulthood, the ways in which our cells--in the brain, immune system, and other organs--develop and interact are shaped by the environment. When Skinner was writing, many biologists still believed that each synapse of a nerve was directed by a gene, and couldn't be influenced by experience.
Our brain grows into our culture, and the culture lives in our nervous system. If a person grows up without hearing people speak, he will have grown a special kind of brain, making it difficult to learn to speak. (Genie, wolf boy, Kaspar Hauser, for example.)
When we ask a question and find an answer, we are changed. Thinking with learning is a developmental process. But many people learn at an early age not to question. This changes the nature of subsequent learning and brain development.
In the 1960s, many textbooks were published that claimed to use scientific language theory to improve the instruction of English, from grade school level to college level. They didn't work, and at the time they were being published they appeared fraudulent to people who didn't subscribe to the incipient cults of “Generative Grammar” and “Artificial Intelligence” that later developed into “Cognitive Science.”
At the time that Artificial Intelligence was coming to the attention of investors and academicians, Neodarwinism had already cleansed the university biology departments of its opponents who advocated more holistic views, and the idea of a brain that was “hard-wired” according to genetic instructions had entered both neurology and psychology. The field concept was disappearing from developmental biology, as Gestalt psychology was disappearing from the universities and journals.
In the humanities and social sciences, a fad appeared in the 1960s, in which a theory of grammar advocated by Noam Chomsky of MIT was said to explain human thinking and behavior, and specialists in anthropology, psychology, literature, rhetoric, sociology, and other academic fields, claimed that it informed their work in an essential way. The rapid spread of a doctrine for which there was essentially no evidence suggests that it was filling a need for many people in our culture. This doctrine was filling some of the gaps left by the failure of genetic determinism that was starting to be recognized. It gave new support to the doctrine of inborn capacities and limitations, in which formulaic indoctrination can be justified by the brain's natural structure.
Chomsky was committed to an idealistic, “rationalist” doctrine of innate ideas, and to argue for that doctrine, which held that there are transcendent forms (or “deep structures”) that control mind, he disposed of the opposing “empiricist” approach to mind by claiming that children simply learn language so rapidly that it would be impossible to explain on the basis of learning from experience. Separating vocabulary from grammar, he acknowledged that each language is different, and can be learned as easily by the children of immigrants of different ethnicity as by children whose ancestors spoke it, but that all humans have a genetically encoded “universal grammar,” a “language organ.” It is this “inborn grammar” that allows children to learn what he said would be inconceivable to learn so quickly from experience.
The abstract, computational nature of the “inborn” functions of the “language organ” would make a nice program for a translating machine, and the absence of such a useful program, after more than 50 years of trying to devise one, argues against the possibility of such a thing.
Since Plato's time, some people have believed that, behind the changing irregularities of real languages, there is a timeless, context-free language. In the late 1950s, when I was studying language and the “ideal languages” of the philosophers, I realized that George Santayana was right when he pointed out that each time an artificial language is used by real people in real situations, it is altered by the experience that accrues to each component, from the context in which it is used. If real language were the model for mathematics, then the values of numbers would change a little with every calculation.
Adults are usually slower than children at learning a new language, but they can make the process much quicker by memorizing paradigms. With those models, they can begin speaking intelligible sentences when they know only a few words. These basics of grammar are often outlined in just a few pages, but listing irregularities and exceptions can become very detailed and complex. The grammar that children use isn't as subtle as the grammar some adults use, and college freshmen are seldom masters of the grammar of their native language.
There have been various studies that have investigated the number of words understood by children at different ages.
The Virginia Polytechnic Institute website says that
- By age 4, a person probably knows 5,600 words
- By age 5, a person probably knows 9,600 words
- By age 6, a person probably knows 14,700 words
- By age 7, a person probably knows 21,200 words
- By age 8, a person probably knows 26,300 words
- By age 9, a person probably knows 29,300 words
- By age 10, a person probably knows 34,300 words
- By age 20, a college sophomore probably knows 120,000 words
A dictionary with 14,000 words is a substantial book. The grammar used by a 6-year-old isn't very complex, because at that age a person isn't likely to know all of the subtleties of the language. There is no reason to assume that a mind that can learn thousands of words and concepts in a year can't learn the grammatical patterns of a language--a much smaller number of patterns and relationships--in a few years.
Idioms and clichés are clusters of words that are frequently used together in the same pattern to express a stereotyped meaning. There are thousands of them in English, and some of them have existed for centuries, while others are regional and generational. It is possible to speak or write almost completely in clichés, and they are such an important part of language that their acquisition along with the basic vocabulary deserves more attention than linguists have given it. A mind that can learn so many clichés can certainly learn the relatively few stereotypical rules of phrasing that make up the grammar of a language. In fact, a grammar in some ways resembles a complex cliché.
Recognition of patterns, first of things that are present, then of meaningful sequences, is what we call awareness or consciousness. There is biological evidence, from the level of single cells through many types of organism, both plant and animal, that pattern recognition is a basic biological function. An organism that isn't oriented in space and time isn't an adapted, adapting, organism. Environments change, and the organization of life necessarily has some flexibility.
A traveling bird or dog can see a pattern once, and later, going in the opposite direction, can recognize and find specific places and objects. An ant or bee can see a pattern once, and communicate it to others.
If dogs and birds lived in colonies or cities, as bees and ants do, and carried food home from remote locations, they might have a need to communicate their knowledge. The fact that birds and dogs use their vocal organs and brains to communicate in ways that people have seldom cared to study doesn't imply that their brains differ radically from human brains in lacking a “language organ.”
People whose ideology says that “animals use instinct rather than intelligence,” and that they lack “the language instinct,” refuse to perceive animals that are demonstrating their ability to generalize or to understand language.
Organisms have genes, so a person could say that pattern recognition is genetically determined, but it would be a foolish and empty thing to say. (Nevertheless, people do say it.) The people who believe that there are “genes for grammar” believe that these mind-controlling genes give us the ability to generalize, and therefore say that animals aren't able to generalize, though their “instinctive behaviors” might sometimes seem to involve generalization.
In language, patterns are represented symbolically by patterned sounds, and some of those symbolically represented patterns are made up of other patterns. Different languages have different ways of representing different kinds of patterns.
“Things” are recognizable when they are far or near, moving or still, bright or dark, or upside down, because the recognition of a pattern is an integration involving both spatial and temporal components. The recognition of an object involves both generalization and concreteness.
Things that are very complex are likely to take longer to recognize, but the nature of any pattern is that it is a complex of parts and properties.
A name for “a thing” is a name for a pattern, a set of relationships.
The method of naming or identifying a relationship can make use of any way of patterning sound that can be recognized as making distinctions. Concepts and grammar aren't separable things, “semantics” and “syntax” are just aspects of a particular language's way of handling meaning.
As a child interacts with more and more things, and learns things about them, the patterns of familiar things are compared to the patterns of new things, and differences and similarities are noticed and used to understand relationships. The comparison of patterns is a process of making analogies, or metaphors. Similarities perceived become generalizations, and distinctions allow things to be grouped into categories.
When things are explored analogically, the exploration may first identify objects, and then explore the factors that make up the larger pattern that was first identified, in a kind of analysis, but this analysis is a sort of expansion inward, in which the discovered complexity has the extra meaning of the larger context in which it is found.
When something new is noticed, it excites the brain, and causes attention to be focused, in the “orienting reflex.” The various senses participate in examining the thing, in a physiological way of asking a question. Perception of new patterns and the formation of generalizations expands the ways in which questions are asked. When words are available, questions may be verbalized. The way in which questions are answered verbally may be useful, but it often diverts the questioning process, and provides rules and arbitrary generalizations that may take the place of the normal analogical processes of intelligence. The vocabulary of patterns no longer expands spontaneously, but tends to come to rest in a system of accepted opinions.
A few patterns, formulated in language, are substituted for the processes of exploration through metaphorical thinking. In the first stages of learning, the process is expansive and metaphorical. If a question is closed by an answer in the form of a rule that must be followed, subsequent learning can only be analytical and deductive.
Learning of this sort is always a system of closed compartments, though one system might occasionally be exchanged for another, in a “conversion experience.”
The exploratory analogical mind is able to form broad generalizations and to make deductions from those, but the validity of the generalization is always in a process of being tested. Both the deduction and the generalization are constantly open to revision in accordance with the available evidence.
If there were infallible authorities who set down general rules, language and knowledge could be idealized and made mathematically precise. In their absence, intelligence is necessary, but the authorities who would be infallible devise ways to confine and control intelligence, so that, with the mastery of a language, the growth of intelligence usually stops.
In the 1940s and '50s, W.J.J. Gordon organized a group called Synectics, to investigate the creative process, and to devise ways to teach people to solve problems effectively. It involved several methods for helping people to think analogically and metaphorically, and to avoid stereotyped interpretations. It was a way of teaching people to recover the style of thinking of young children, or of chimps, or other intelligent animals.
When the acquisition of language is burdened by the acceptance of clichés, producing the conventionalism mentioned by Horner and Whiten, with the substitution of deductive reasoning for metaphorical-analogical thinking, the natural pleasures of mental exploration and creation are lost, and a new kind of personality and character has come into existence.
Bob Altemeyer spent his career studying the authoritarian personality, and has identified its defining traits as conventionalism, submission to authority, and aggression, as sanctioned by the authorities. His last book, The Authoritarians (2006), is available on the internet.
Altemeyer found that people who scored high on his scale of authoritarianism tended to have faulty reasoning, with compartmentalized thinking, making it possible to hold contradictory beliefs, and to be dogmatic, hypocritical, and hostile.
Since he is looking at a spectrum, focusing on differences, I think he is likely to have underestimated the degree to which these traits exist in the mainstream, and in groups such as scientists, that have a professional commitment to clear reasoning and objectivity. With careful training, and in a culture that doesn't value creative metaphorical thinking, authoritarianism might be a preferred trait.
Konrad Lorenz (who with Niko Tinbergen got the Nobel Prize in 1973) believed that specific innate structures explained animal communication, and that natural selection had created those structures. Chomsky, who said that our genes create an innate “Language Acquisition Device,” distanced himself slightly from Lorenz's view by saying that it wasn't certain that natural selection was responsible for it. However, despite slightly different names for the hypothetical innate “devices,” their views were extremely similar.
Both Lorenz and Chomsky, and their doctrine of innate rule-based consciousness, have been popular and influential among university professors. When Lorenz wrote a book on degeneration, which was little more than a revised version of the articles he had written for the Nazi party's Office for Race Policy in the late 1930s and early 1940s, advocating the extermination of racial “mongrels” such as Jews and Gypsies, most biologists in the US praised it. Lorenz identified National Socialism with evolution as an agent of racial purification. His lifelong beliefs and activities--the loyalty to a strong leader, advocating the killing of the weak--identified Lorenz as an extreme authoritarian.
When a famous professor went on a lecture tour popularizing and affirming the scientific truth and importance of those publications, and asserting that all human actions and knowledge, language, work, art, and belief, are specified and determined by genes, he and his audience (which, at the University of Oregon, included members of the National Academy of Sciences and Jewish professors who had been refugees from Nazism, who listened approvingly) were outraged when a student mentioned the Nazi origin and intention of the original publications.
They said “you can't say that a man's work has anything to do with his life and political beliefs,” but in fact the lecturer had just finished saying that everything a person does is integral to that person's deepest nature, just as Lorenz had said that a goose with a pot belly and odd beak, or a person with non-Nordic physical features, behavior, and cultural preferences, should be eliminated for the improvement of the species. Not a single professor in the audience questioned the science that had justified Hitler's racial policies, and some of them showed great hostility toward the critic.
In the 1960s, a professor compared graduate students' scores on the Miller Analogies Test, which is a widely used test of analogical thinking ability, to their academic grades. She found that the students who scored close to the average on the test had the highest grades and the greatest academic success, and those who deviated the most from the average on that test, in either direction, had the worst academic grades. If the ability to think analogically is inversely associated with authoritarianism, then her results would indicate that graduate schools select for authoritarianism. (If not, then they simply select for mediocrity.)
Although Bob Altemeyer's scale mainly identified right-wing, conservative authoritarians, he indicated that there could be left-wing authoritarians, too. Noam Chomsky is identified with left-wing political views, but his genetic determinism, his “nativist” view of language learning, and his anti-empiricist identification of himself as a philosophical Rationalist correspond closely to the authoritarian character. The “nativist” rule-based nature of “Cognitive Science” is just the modern form of an authoritarian tradition that has been influential since Plato's time.
The first thing a person is likely to notice when looking at Chomsky's work in linguistics is that he offers no evidence to support his extreme assertions. In fact, the main role evidence plays in his basic scheme is negative, that is, his doctrine of “Poverty of the Stimulus” asserts that children aren't exposed to enough examples of language for them to be able to learn grammar--therefore, grammar must be inborn.
I think Chomsky discovered long ago that the people around him were sufficiently authoritarian to accept assertions without evidence if they were presented in a form that looked complexly technical. Several people have published their correspondence with him, showing him to be authoritarian and arrogant, even rude and insulting, if the person questioned his handling of evidence, or the lack of evidence.
For example, people have argued with him about the JFK assassination, US policy in the Vietnam war, the HIV-AIDS issue, and the 9/11 investigation. In each case, he accepts the official position of the government, and insults those who question, for example, the adequacy of the Warren Commission report, or who believe that the pharmaceutical industry would manipulate the evidence regarding AIDS, or who doubt the conclusions of the 9/11 Commission investigation.
He says that investigation of such issues is “diverting people from serious issues,” as if those aren't serious issues. And “even if it's true” that the government was involved in the 9/11 terrorism, “who cares? I mean, it doesn't have any significance. I mean it's a little bit like the huge amount of energy that's put out on trying to figure out who killed John F. Kennedy. I mean, who knows, and who cares…plenty of people get killed all the time. Why does it matter that one of them happens to be John F. Kennedy?"
"If there was some reason to believe that there was a high level conspiracy" in the JFK assassination, "it might be interesting, but the evidence against that is just overwhelming." "And after that it's just a matter of, uh, if it's a jealous husband or the mafia or someone else, what difference does it make?" "It's just taking energy away from serious issues onto ones that don't matter. And I think the same is true here," regarding the events of 9/11. These reactions seem especially significant, considering his reputation as America's leading dissenter.
The speed with which Chomskyism spread through universities in the US in the 1960s convinced me that I was right in viewing the instruction of the humanities and social sciences as indoctrination, rather than objective treatment of knowledge. The reception of the authoritarian ideas of Lorenz and his apologists in biology departments offered me a new perspective on the motivations involved in the uniformity of the orthodox views of biology and medicine.
In being introduced into a profession, any lingering tendency toward analogical-metaphoric thinking is suppressed. I have known perceptive, imaginative people who, after a year or two in medical school, had become rigid rule-followers.
One of the perennial questions people have asked when they learn of the suppression of a therapy, is “if the doctors are doing it to defend the profitable old methods, how can they refuse to use the better method even for themselves and their own family?” The answer seems to be that their minds have been radically affected by their vocational training.
For many years, cancer and inflammation have been known to be closely associated, even to be aspects of a single process. This was obvious to “analog minded” people, but seemed utterly improbable to the essentialist mentality, because of the indoctrination that inflammation is a good thing, that couldn't coexist with a bad thing like cancer.
The philosophy of language might seem remote from politics and practical problems, but kings and advertisers have understood that words and ideas are powerfully influential in maintaining relationships of power.
Theories of mind and language that justify arbitrary power, power that can't justify itself in terms of evidence, are more dangerous than merely mistaken scientific theories, because any theory that bases its arguments on evidence is capable of being disproved.
In the Middle Ages, the Divine Right of Kings was derived from certain kinds of theological reasoning. It has been replaced by newer ideologies, based on deductions from beliefs about the nature of mind and matter, words and genes, “Computational Grammar,” or numbers and quantized energy, but behind the ideology is the reality of the authoritarian personality.
I think if we understand more about the nature of language and its acquisition we will have a clearer picture of what is happening in our cultures, especially in the culture of science.
REFERENCES
New Yorker, April 16, 2007, “The Interpreter: Has a remote Amazonian tribe upended our understanding of language?” by John Colapinto. “Dan Everett believes that Pirahã undermines Noam Chomsky's idea of a universal grammar.”
Language & Communication Volume 23, Issue 1, January 2003, Pages 1-43. “Remarks on the origins of morphophonemics in American structuralist linguistics,” E. F. K. Koerner. Chomsky has led the public to believe that he originated things which he borrowed from earlier linguists.
Science. 2008 Feb 1;319(5863):569; author reply 569. Comparing social skills of children and apes (letter). de Waal FB, Boesch C, Horner V, Whiten A.
Curr Biol. 2007 Jun 19;17(12):1038-43. Epub 2007 Jun 7. Transmission of multiple traditions within and between chimpanzee groups. Whiten A, Spiteri A, Horner V, Bonnie KE, Lambeth SP, Schapiro SJ, de Waal FB. Centre for Social Learning and Cognitive Evolution and Scottish Primate Research Group, School of Psychology, University of St Andrews, St Andrews KY16 9JP, United Kingdom. A.whiten@st-andrews.ac.uk Field reports provide increasing evidence for local behavioral traditions among fish, birds, and mammals. These findings are significant for evolutionary biology because social learning affords faster adaptation than genetic change and has generated new (cultural) forms of evolution. Orangutan and chimpanzee field studies suggest that like humans, these apes are distinctive among animals in each exhibiting over 30 local traditions. However, direct evidence is lacking in apes and, with the exception of vocal dialects, in animals generally for the intergroup transmission that would allow innovations to spread widely and become evolutionarily significant phenomena. Here, we provide robust experimental evidence that alternative foraging techniques seeded in different groups of chimpanzees spread differentially not only within groups but serially across two further groups with substantial fidelity. Combining these results with those from recent social-diffusion studies in two larger groups offers the first experimental evidence that a nonhuman species can sustain unique local cultures, each constituted by multiple traditions. The convergence of these results with those from the wild implies a richness in chimpanzees' capacity for culture, a richness that parsimony suggests was shared with our common ancestor.
J Comp Psychol. 2007 Feb;121(1):12-21. Learning from others' mistakes? limits on understanding a trap-tube task by young chimpanzees (Pan troglodytes) and children (Homo sapiens). Horner V, Whiten A. Centre for Social Learning and Cognitive Evolution, School of Psychology, University of St Andrews, Fife, Scotland, UK. Vhorner@rmy.emory.edu A trap-tube task was used to determine whether chimpanzees (Pan troglodytes) and children (Homo sapiens) who observed a model's errors and successes could master the task in fewer trials than those who saw only successes. Two- to 7-year-old chimpanzees and 3- to 4-year-old children did not benefit from observing errors and found the task difficult. Two of the 6 chimpanzees developed a successful anticipatory strategy but showed no evidence of representing the core causal relations involved in trapping. Three- to 4-year-old children showed a similar limitation and tended to copy the actions of the demonstrator, irrespective of their causal relevance. Five- to 6-year-old children were able to master the task but did not appear to be influenced by social learning or benefit from observing errors.
Proc Biol Sci. 2007 Feb 7;274(1608):367-72. Spread of arbitrary conventions among chimpanzees: a controlled experiment. Bonnie KE, Horner V, Whiten A, de Waal FB. Living Links, Yerkes National Primate Research Center, Atlanta, GA 30329, USA. Kebonni@emory.edu Wild chimpanzees (Pan troglodytes) have a rich cultural repertoire--traditions common in some communities are not present in others. The majority of reports describe functional, material traditions, such as tool use. Arbitrary conventions have received far less attention. In the same way that observations of material culture in wild apes led to experiments to confirm social transmission and identify underlying learning mechanisms, experiments investigating how arbitrary habits or conventions arise and spread within a group are also required. The few relevant experimental studies reported thus far have relied on cross-species (i.e. human-ape) interaction offering limited ecological validity, and no study has successfully generated a tradition not involving tool use in an established group. We seeded one of two rewarded alternative endpoints to a complex sequence of behaviour in each of two chimpanzee groups. Each sequence spread in the group in which it was seeded, with many individuals unambiguously adopting the sequence demonstrated by a group member. In one group, the alternative sequence was discovered by a low ranking female, but was not learned by others. Since the action-sequences lacked meaning before the experiment and had no logical connection with reward, chimpanzees must have extracted both the form and benefits of these sequences through observation of others.
Proc Natl Acad Sci U S A. 2006 Sep 12;103(37):13878-83. Faithful replication of foraging techniques along cultural transmission chains by chimpanzees and children. Horner V, Whiten A, Flynn E, de Waal FB. Centre for Social Learning and Cognitive Evolution, School of Psychology, University of St. Andrews, Fife KY16 9JP, United Kingdom. Observational studies of wild chimpanzees (Pan troglodytes) have revealed population-specific differences in behavior, thought to represent cultural variation. Field studies have also reported behaviors indicative of cultural learning, such as close observation of adult skills by infants, and the use of similar foraging techniques within a population over many generations. Although experimental studies have shown that chimpanzees are able to learn complex behaviors by observation, it is unclear how closely these studies simulate the learning environment found in the wild. In the present study we have used a diffusion chain paradigm, whereby a behavior is passed from one individual to the next in a linear sequence in an attempt to simulate intergenerational transmission of a foraging skill. Using a powerful three-group, two-action methodology, we found that alternative methods used to obtain food from a foraging device ("lift door" versus "slide door") were accurately transmitted along two chains of six and five chimpanzees, respectively, such that the last chimpanzee in the chain used the same method as the original trained model. The fidelity of transmission within each chain is remarkable given that several individuals in the no-model control group were able to discover either method by individual exploration. A comparative study with human children revealed similar results. This study is the first to experimentally demonstrate the linear transmission of alternative foraging techniques by non-human primates. Our results show that chimpanzees have a capacity to sustain local traditions across multiple simulated generations.
Nature. 2005 Sep 29;437(7059):737-40. Conformity to cultural norms of tool use in chimpanzees. Whiten A, Horner V, de Waal FB. Centre for Social Learning and Cognitive Evolution, School of Psychology, University of St Andrews, St Andrews, Fife, KY16 9JP, UK. A.whiten@st-and.ac.uk Rich circumstantial evidence suggests that the extensive behavioural diversity recorded in wild great apes reflects a complexity of cultural variation unmatched by species other than our own. However, the capacity for cultural transmission assumed by this interpretation has remained difficult to test rigorously in the field, where the scope for controlled experimentation is limited. Here we show that experimentally introduced technologies will spread within different ape communities. Unobserved by group mates, we first trained a high-ranking female from each of two groups of captive chimpanzees to adopt one of two different tool-use techniques for obtaining food from the same 'Pan-pipe' apparatus, then re-introduced each female to her respective group. All but two of 32 chimpanzees mastered the new technique under the influence of their local expert, whereas none did so in a third population lacking an expert. Most chimpanzees adopted the method seeded in their group, and these traditions continued to diverge over time. A subset of chimpanzees that discovered the alternative method nevertheless went on to match the predominant approach of their companions, showing a conformity bias that is regarded as a hallmark of human culture.
Anim Cogn. 2005 Jul;8(3):164-81. Causal knowledge and imitation/emulation switching in chimpanzees (Pan troglodytes) and children (Homo sapiens). Horner V, Whiten A. Centre for Social Learning and Cognitive Evolution, School of Psychology, University of St Andrews, St Andrews, KY16 9JU, UK. Vkh1@st-andrews.ac.uk This study explored whether the tendency of chimpanzees and children to use emulation or imitation to solve a tool-using task was a response to the availability of causal information. Young wild-born chimpanzees from an African sanctuary and 3- to 4-year-old children observed a human demonstrator use a tool to retrieve a reward from a puzzle-box. The demonstration involved both causally relevant and irrelevant actions, and the box was presented in each of two conditions: opaque and clear. In the opaque condition, causal information about the effect of the tool inside the box was not available, and hence it was impossible to differentiate between the relevant and irrelevant parts of the demonstration. However, in the clear condition causal information was available, and subjects could potentially determine which actions were necessary. When chimpanzees were presented with the opaque box, they reproduced both the relevant and irrelevant actions, thus imitating the overall structure of the task. When the box was presented in the clear condition they instead ignored the irrelevant actions in favour of a more efficient, emulative technique. These results suggest that emulation is the favoured strategy of chimpanzees when sufficient causal information is available. However, if such information is not available, chimpanzees are prone to employ a more comprehensive copy of an observed action. In contrast to the chimpanzees, children employed imitation to solve the task in both conditions, at the expense of efficiency. We suggest that the difference in performance of chimpanzees and children may be due to a greater susceptibility of children to cultural conventions, perhaps combined with a differential focus on the results, actions and goals of the demonstrator.
Learn Behav. 2004 Feb;32(1):36-52. How do apes ape? Whiten A, Horner V, Litchfield CA, Marshall-Pescini S. Centre for Social Learning and Cognitive Evolution, Scottish Primate Research Group, School of Psychology, University of St. Andrews, St. Andrews, Fife, Scotland. A.whiten@st-and.ac.uk In the wake of telling critiques of the foundations on which earlier conclusions were based, the last 15 years have witnessed a renaissance in the study of social learning in apes. As a result, we are able to review 31 experimental studies from this period in which social learning in chimpanzees, gorillas, and orangutans has been investigated. The principal question framed at the beginning of this era, Do apes ape? has been answered in the affirmative, at least in certain conditions. The more interesting question now is, thus, How do apes ape? Answering this question has engendered richer taxonomies of the range of social-learning processes at work and new methodologies to uncover them. Together, these studies suggest that apes ape by employing a portfolio of alternative social-learning processes in flexibly adaptive ways, in conjunction with nonsocial learning. We conclude by sketching the kind of decision tree that appears to underlie the deployment of these alternatives.
http://www.ucc.vt.edu/stdysk/vocabula.html
© Ray Peat Ph.D. 2009. All Rights Reserved. www.RayPeat.com
The Vilification of Healthy People, Especially Children
Throughout the past several years apparently healthy people have been re-defined as being potential asymptomatic spreaders of a disease that can be lethal in high-risk individuals. The disease is known as the novel coronavirus disease that was first identified in 2019 (COVID-19). People around the world have been instilled with near-paralyzing fear that their family member, friend, neighbour and/or colleague who has no signs or symptoms can kill them by spreading severe acute respiratory syndrome-coronavirus-2 (SARS-CoV-2), which is the causative agent of COVID-19.
This paradigm that a person has no way of knowing who is safe to be around has formed the rationale for mass lockdowns, masking, and mandating ‘vaccines’ for which the initial clinical experiments are still ongoing. This has caused massive fracturing of relationships around the globe. Nobody has been spared. Families have split, best friendships that lasted decades ended abruptly, and colleagues lashed out.
We were told that everyone had to do their part to prevent hospitals from being overwhelmed. Those who felt healthy could not be trusted: unbeknown to them, they might have a wicked pathogen oozing out of their body. Healthy children who were at a statistical risk equivalent to zero of dying from COVID-19 would almost certainly kill their grandparents if they were not locked down, masked and ‘vaccinated’. Those who resisted lockdowns, masking, and mandating of so-called vaccines that could neither prevent the disease nor transmission of its causative agent have been treated like uncaring villains deserving of segregation. Remember this front page of one of Canada’s best-known newspapers that was published on August 26, 2021?…
The Prime Minister of Canada, Justin Trudeau, has been a classic example of a leader who has vigorously promoted this kind of hatred and division within his own country.
So, how did we get so far off-track with our response to COVID-19?
Why will future history books, if accurate, document this as the most mismanaged crisis of our time?
Most of the blame rests on the scientific and medical community allowing a very elegant scientific test to be chronically misused. This test is known as the ‘reverse transcriptase-polymerase chain reaction’ (RT-PCR).
Did we follow the science?
In court, I have often seen judges puzzled by the apparent contradictions in the scientific evidence being put forward by various experts. These judges often question how scientists can interpret the same data so differently. When it comes to the science underpinning COVID-19, published papers can be placed into two bins:
- Those that are trustworthy because they are based on sound scientific methods.
- Those that are untrustworthy because they are based on flawed scientific methods.
In the past several years the science in bin 2 has become voluminous and has contributed excessively to the rationale for the prevailing so-called ‘COVID-19 narrative’. The problem is that the science in bin 2 cannot be properly interpreted because it is built on a fundamentally flawed foundation. Too many scientists failed to critically assess the methods used to generate the early COVID-19 data, allowing this junk science to snowball out of control. The RT-PCR test is at the heart of this problem.
The House Built on Sand Must be Dismantled
If one goes back to the birth of COVID-19 science and critically assesses it, misusing the RT-PCR test jumps out as a key fundamental flaw that caused substantial overestimation of the number of cases of COVID-19 and erroneous labeling of healthy people as asymptomatic spreaders of a deadly disease. The only way to correct course and stop the avalanche of faulty COVID-19 science is to establish which papers can and cannot be trusted. Importantly, editors of scientific journals cannot allow any more COVID-19 ‘facts’ to be published unless the authors unequivocally demonstrate that their data are based on methods that have been implemented properly. Most notably, authors must demonstrate that their research methodologies have been appropriately calibrated such that their conclusions are justified.
Misuse of An Elegant Scientific Technique Has Plagued COVID-19 Science From the Very Beginning
To properly gauge the scope of an outbreak of an infectious disease, one first needs to accurately diagnose it. Diseases are diagnosed primarily based on two things:
- Accurate detection of the presence of a pathogen using a laboratory-based test.
- Detection of signs and/or symptoms consistent with the disease, which is usually done by a physician.
Symptoms are aspects of a disease that a person experiences but that cannot be assessed easily by an observer. Examples include general malaise, pain, and a loss of appetite. In contrast, signs of illness can be objectively observed and documented by others, and include coughing, sneezing, or a fever that can be measured with a thermometer. Often, symptoms precede the onset of signs of illness.
When it comes to defining what it means to be ‘asymptomatic’, there are three relevant scenarios:
- A person who is not infected with a pathogen will never be at risk of developing the disease associated with that pathogen. These are healthy individuals who are asymptomatic by virtue of not having been infected. They cannot infect others.
- A person can be infected with a potential pathogen but never develop symptoms of a disease because the causative agent fails to cause substantial harm in the body. In many cases, this might be because the immune system can respond rapidly and effectively. There have also been examples of people getting infected with SARS-CoV-2 but never apparently experiencing symptoms or developing signs of COVID-19. Infection does not always result in disease. For example, billions of microbes, including many bacteria and viruses, live on and in our bodies without causing us harm. They have invaded our bodies but do not cause disease, even though some of them can cause serious disease in other people, or even in ourselves, should they get into an inappropriate physiological location (e.g., some fecal bacteria entering a body via the oral route). Infected but asymptomatic (disease-free) people are also healthy (i.e., there is no impairment to their ability to function in their daily activities).
- People who get infected and then progress to a diseased state always have a period in between when they are ‘asymptomatic’. Technically, these individuals who do eventually get sick are referred to as being ‘pre-symptomatic’. One does not know whether a person is truly asymptomatic or pre-symptomatic until the typical incubation period for a pathogen has passed; this is the expected time from infection to the onset of symptoms in a susceptible person. A person who is infected and symptomatic can spread the causative agent of the disease to others.
When people have COVID-19, they experience obvious symptoms, and signs also usually become apparent. This is the scenario that has been easy to manage throughout the declared COVID-19 pandemic. People who are sick have been asked to stay home. From a social hygiene perspective, it is my expert opinion that this should be encouraged for all the infectious diseases we live with. This would reduce infectious disease-related morbidities and mortalities.
In the context of COVID-19, most masking, isolation and vaccination policies around the world are predicated on the assumption that transmission of SARS-CoV-2 can be efficiently mediated by asymptomatic people who are transiently infected but never get COVID-19 and/or pre-symptomatic individuals. This is based on the assumption that SARS-CoV-2 can replicate to the point where a person who is not coughing or sneezing can expel a threshold dose required to potentially infect another person. Although this is theoretically possible and likely occurs rarely, it is incorrect to conclude that this is commonplace and a significant driver of the spread of COVID-19. This incorrect concept is based on an array of scientific studies that relied on RT-PCR testing that was inappropriately calibrated.
How to Define a Case of COVID-19
Cases of COVID-19 should only be determined as follows:
- It should be a physician making the diagnosis.
- It should be based on the presence of signs and symptoms that are consistent with the clinical definition of COVID-19.
- The presence of symptoms and/or signs should be supported by laboratory results derived from properly calibrated tests that demonstrate the presence of SARS-CoV-2 virions. A virion is a single virus particle. Virions can be replication-competent; these are the only ones that can potentially infect another person and cause disease. Or they can be replication-incompetent; these ones can never spread to others and cause COVID-19.
Throughout the declared pandemic many so-called ‘cases’ of COVID-19 were incorrectly ‘diagnosed’. Cases, especially early in the declared pandemic, have been defined by individuals other than physicians, assumed based on signs and symptoms only, or exclusively based on a positive laboratory test result. The latter has been extremely common. This contradicts the World Health Organization, which noted that “Most PCR assays are indicated as an aid for diagnosis, therefore, health care providers must consider any result in combination with timing of sampling, specimen type, assay specifics, clinical observations, patient history, confirmed status of any contacts, and epidemiological information”.
The core definition, and all too often the sole definition, of ‘cases’ of COVID-19 has been based on the use of a laboratory testing method referred to as ‘RT-PCR’. To understand how asymptomatic people were mislabeled as significant sources of transmission of SARS-CoV-2, one must first understand how RT-PCR testing should have been properly calibrated around the world.
A polymerase is a protein that can copy DNA, which is a genetic blueprint. So, the PCR method requires this genetic blueprint known as DNA to be present in order to work. If DNA is in a sample, when a scientist adds a polymerase, a few other ingredients, and then varies the temperature, new copies of tiny portions of the DNA will be made. With each ‘cycle’ that the PCR test is run, more copies of these fragments of the genetic blueprint will be made. Once a threshold number of copies appear in the sample, they can be detected. Think of it like a photocopier. From a great distance, you might not be able to tell if a single copy of a page has been made. However, once you have a stack of five hundred pages sitting on the output tray, you know for sure that the photocopier is churning out copies. In short, PCR is a method that scientists can use to determine whether a particular genetic blueprint is present in a sample.
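To make the photocopier analogy concrete, here is a minimal sketch in Python (not from the original article; the starting copy numbers, the detection threshold, and the assumption of perfect doubling are all illustrative) of how the copy count grows with each cycle and how a cycle threshold (Ct) value arises:

```python
# Minimal sketch of PCR amplification, assuming perfect doubling each cycle.
# The detection threshold and starting copy numbers are hypothetical,
# chosen only to illustrate how a cycle threshold (Ct) arises.

DETECTION_THRESHOLD = 1e9  # copies needed before a signal is detectable (illustrative)

def cycle_threshold(starting_copies: float) -> int:
    """Return the first cycle at which the copy count crosses the detection threshold."""
    copies = starting_copies
    cycle = 0
    while copies < DETECTION_THRESHOLD:
        copies *= 2  # each cycle doubles the number of copies (idealized)
        cycle += 1
    return cycle

# A sample rich in target genetic material crosses the threshold early (low Ct);
# a sample containing only a single fragment needs many more cycles (high Ct).
print(cycle_threshold(1_000_000))  # -> 10 cycles
print(cycle_threshold(1))          # -> 30 cycles
```

This is why, as the study quoted at the end of this article puts it, the Ct value “is inversely correlated with the amount of RNA present”.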
The genetic blueprint for SARS-CoV-2 is not made of DNA. Instead, it is made of a related structure called ‘RNA’. Therefore, to use the PCR test to determine whether an RNA-based virus is present in a sample requires one additional step at the beginning. Specifically, a ‘reverse transcriptase’ is used to convert the RNA from SARS-CoV-2 into DNA, portions of which can then be detected with the PCR test. This is how the RT-PCR test is used to detect the presence of small pieces of the genetic material from SARS-CoV-2.
The Inappropriate Use of RT-PCR Testing Caused a Disconnect Between Laboratory Studies and ‘Real World’ Data
Laboratory studies suggested that asymptomatic individuals could potentially shed infectious SARS-CoV-2 one to two days before the onset of symptoms of COVID-19. However, the largest ‘real world’ study done to date looked at the prevalence of SARS-CoV-2 in ~10 million people in Wuhan, China and found no evidence of asymptomatic transmission. This typical disconnect between the results of laboratory-based studies and ‘real world’ data is due to the former having relied on the use of uncalibrated or incorrectly calibrated RT-PCR tests. An RT-PCR test can only determine whether tiny fragments of the genetic material from a virus are present in a sample. It can never indicate, on its own, whether that material is from virus particles that have the potential to infect and cause disease, or from replication-incompetent virions, or portions thereof, that cannot cause disease.
Flawed RT-PCR Testing Caused Over-Diagnosis of COVID-19
On its own, a positive result on an RT-PCR test to detect SARS-CoV-2 is insufficient to diagnose COVID-19, yet this became routine in most parts of the world. In addition to the potential for false positive tests, true positive results can also be obtained from genomes of SARS-CoV-2 particles that are no longer infectious. An example of the latter would be an individual who has mounted an effective immune response and may have remnant replication-incompetent viral particles or partially degraded viral genetic material inside relatively long-lived white blood cells that have killed the virus. These cells are known as ‘phagocytes’ and are part of our immune system. Indeed, following clearance of SARS-CoV-2 from the body, full and/or partial genomes of SARS-CoV-2 can remain for up to several weeks. Phagocytosis (or ‘eating’) of SARS-CoV-2 is a mechanism to kill and remove the virus from the body. These phagocytic cells tend to hang on to these ‘killed’ virions so that they can activate other immunological effector cells, including the B cells that produce the antibodies we have heard so much about. As such, these phagocytes can be a source of SARS-CoV-2 genomes that could be amplified by a PCR test. However, these genomes would not have the potential to cause COVID-19. Instead, they would be evidence that the infection has resolved or is resolving. Persistence of whole or partial genomes that are not associated with infectious particles is well-documented for a variety of other viruses, including measles, Middle East respiratory syndrome-coronavirus, and other coronaviruses. A positive RT-PCR test for the presence of SARS-CoV-2 should never be used, on its own, to define cases of COVID-19, and definitely should not be used to claim that someone has the potential to infect another person.
Building a Rock-Solid Foundation for COVID-19 Science:
The Gold Standard Functional Virology Assay that Should Always be Used to Calibrate RT-PCR Tests
A gold standard test for infectivity of a virus is a cell-based functional assay that determines the potential to replicate and cause cell death. The assay works like this: Cells that are stripped of their anti-viral properties are put into a dish and allowed to adhere to the bottom. The cells would typically cover the entire bottom of the dish. A scientist can look under a microscope to confirm the cells are healthy. A sample then gets added to the cells. If the sample contains replication-competent (i.e., potentially disease-causing) virions, these will infect and kill the cells. A day or two later, the scientist can check the cells under a microscope again. If they see what is called a ‘cytopathic effect’, which means the cells have died, this indicates that replication-competent virions were present. If there was no cytopathic effect, there were no replication-competent virions. Here are pictures from my research team that show how this virology test works…
…the cells on the left were not exposed to a replication-competent (infectious) virus. They remain happily adhered to the bottom of the dish. There was no cytopathic effect. The cells on the right were exposed to a replication-competent virus that infected and killed them. As the cells died, they rounded up and lost their ability to remain stuck to the bottom of the plate. This is a classic example of cytopathic effect. You can see how easy it is to use this test to determine whether a sample contains any infectious virions.
To calibrate an RT-PCR test for SARS-CoV-2, samples from nasopharyngeal swabs of a large array of people would be split into two: one portion for RT-PCR testing and the other for testing in the gold standard virology assay. Scientists would note the cycle threshold values from the RT-PCR test that are associated with evidence of replication-competent virions in the cellular virology assay, versus those that did not cause a cytopathic effect. This allows a cycle threshold cut-off to be determined. Above this threshold, there is no evidence of replication-competent virions in samples from the nasopharyngeal swabs. This is the objective and proper way to calibrate an RT-PCR test when studying transmission of a virus. Without doing this, RT-PCR test results cannot be interpreted in a meaningful way, and they lead to inappropriate conclusions, like asymptomatic people being spreaders of COVID-19.
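As a sketch of that calibration logic, with made-up numbers rather than data from any actual study, one could pair each sample's Ct value with the result of the cell-based assay and take the highest Ct at which replication-competent virus was still recovered:

```python
# Hypothetical paired calibration data: (Ct value from the RT-PCR test,
# whether the same sample produced a cytopathic effect in the gold
# standard cell-based assay). All values are illustrative only.
paired_results = [
    (16, True), (18, True), (21, True), (23, True), (24, True),
    (26, False), (29, False), (33, False), (38, False), (41, False),
]

def ct_cutoff(results):
    """Highest Ct at which replication-competent virions were observed;
    samples above this Ct showed no evidence of infectious virus."""
    return max(ct for ct, cytopathic in results if cytopathic)

print(ct_cutoff(paired_results))  # -> 24 with these illustrative data
```

With these made-up values the cut-off comes out at 24, echoing the Canadian figure mentioned below, but the point is the procedure: the cut-off must be derived from the gold standard assay rather than assigned arbitrarily.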
Early in the declared COVID-19 pandemic the Public Health Agency of Canada appropriately performed this calibration of their RT-PCR test. For the test they were using, they identified a cycle threshold cut-off of 24 for declaring people to have the potential to infect others. If they had subsequently offered this service to support studies of the spread of COVID-19, only samples yielding a signal at 24 or fewer cycles would be declared to have evidence of potentially infectious SARS-CoV-2. However, with no explanation provided, this initial and appropriate way of calibrating the RT-PCR assay was not required for labs around the world that were studying transmission of SARS-CoV-2. In fact, cycle threshold cut-offs were arbitrarily assigned. As such, RT-PCR data used to determine global cases of COVID-19 have been highly unreliable.
Even the so-called ‘fact-checkers’ who targeted people criticizing the inappropriate designation of the RT-PCR as a stand-alone gold standard diagnostic test have had to admit that it cannot possibly distinguish between infectious and non-infectious virions or parts thereof. For example, a ‘fact check’ from Reuters concluded that “PCR tests are being used widely in England to show that SARS-CoV-2 viral genetic material is present in the patient”. The key phrase is ‘viral genetic material is present’. Indeed, RT-PCR tests are a valuable tool for determining whether portions of a virus’s genetic material are present in a sample. They cannot determine whether that genetic material is from a replication-competent virion that would have the potential to infect someone.
Positive RT-PCR tests for SARS-CoV-2 in asymptomatic people are almost universally based on high cycle threshold values, which raises the question of whether these individuals harbor infectious viral particles. The absence of a functional cell-based assay to prove infectivity renders results of asymptomatic testing impossible to interpret accurately. Indeed, the World Health Organization, agreeing with many health professionals around the world, has emphasized that spreading of SARS-CoV-2 by asymptomatic individuals is rare, and that testing should therefore focus on people with signs or symptoms of illness, not those who are apparently healthy.
In addition to the Canadian study that identified a cycle threshold of 24 as an appropriate cut-off for declaring samples positive for infectious SARS-CoV-2, other studies reported results of similar calibrations of other RT-PCR assays for SARS-CoV-2. They identified cycle threshold cut-offs of 22-27 and 30. Altogether, this suggests that tests with cycle threshold values above 22-30 are likely not indicative of the presence of replication-competent SARS-CoV-2.
The logical conclusion is that it is erroneous to declare samples with high cycle threshold values, especially those above 30, as being positive for infectious SARS-CoV-2. However, in many countries people were assumed to be infectious when their samples were declared positive using RT-PCR assays with cycle threshold cut-offs as high as 45 cycles. Such an unjustifiably high cut-off would have resulted in a substantial overestimation of cases of COVID-19 and would have led to erroneous labeling of asymptomatic people as potential spreaders of COVID-19.
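The quantitative gap between these cut-offs is easy to work out. Assuming the idealized case where each PCR cycle doubles the amplified target (real-world efficiency is somewhat lower), the difference between two cycle thresholds maps to a fold-difference in starting genetic material:

```python
# Idealized PCR arithmetic: each cycle roughly doubles the target,
# so a difference in Ct values corresponds to a fold-difference
# in the amount of starting genetic material.
def fold_difference(ct_high, ct_low):
    return 2 ** (ct_high - ct_low)

print(fold_difference(40, 24))   # 65536: ~65,000-fold less material at Ct 40
print(fold_difference(45, 24))   # 2097152: ~2-million-fold less at Ct 45
```

In other words, a sample declared ‘positive’ at 45 cycles can contain on the order of a million-fold less viral genetic material than one at a functionally calibrated cut-off of 24.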
Failure to Calibrate the RT-PCR Test Shows How a Representative Influential Scientific Study Incorrectly Concluded that Asymptomatic People Might be a Risk for Spreading COVID-19
The figure below shows results from a published study comparing how often asymptomatic people tested positive for SARS-CoV-2 with the rate observed in people with symptomatic infections. Specifically, the graphs are from figure 2 of a paper published in the influential Journal of the American Medical Association - Internal Medicine. The argument being made was that the frequency of positive tests among asymptomatic people was similar to that among the symptomatic. However, the authors failed to calibrate their RT-PCR assay.
Following is the description the authors of the study provided in the methods section of their paper. The most important portion of this text is the last sentence.
“Specimen Collection and RT-PCR for SARS-CoV-2
The URT specimens were collected from both nasopharyngeal and oropharyngeal swabs obtained by trained medical staffs (physicians and nurses). For LRT specimens, participants were given instructions the night before to collect a first morning sputum (after gargling) in a specimen cup; RT-PCR assays for SARS-CoV-2 were performed using Allplex 2020-nCoV assay (Seegene, Seoul, ROK) to determine the presence of virus through the identification of 3 genetic markers: envelope (env) gene, RNA-dependent RNA polymerase (RdRp) gene, and nucleocapsid protein (N) gene. The cycle threshold (Ct) during RT-PCR testing refers to when the detection of viral amplicons occurs, it is inversely correlated with the amount of RNA present. A lower Ct value indicates large quantities of viral RNA. It was considered positive when the Ct values of all genes were less than 40 cycles.”
Remarkably, the authors applied an arbitrary cycle threshold of 40 to define a positive test result; proper calibration of the test was not performed. I applied a new cycle threshold cut-off of 24, based on the published results of the Canadian study for calibrating an RT-PCR test for SARS-CoV-2. This is shown as a red dotted line on the graphs in the figure above. Symbols appearing in the light red rectangle above this line would be considered negative, in contrast to the positive designation that the authors had assigned. Strikingly, 99.7% of the people the authors declared to be harbouring infectious SARS-CoV-2 likely had no evidence of potentially infectious SARS-CoV-2 virions, had the test been properly calibrated. This represents a fatal flaw in this paper, one that negates its conclusion that “Isolation of asymptomatic patients may be necessary to control the spread of SARS-CoV-2”. It should also precipitate the paper’s retraction. Such a paper should never have been allowed to be published in the first place.
This highlights a fatal flaw that has been extremely common throughout the declared pandemic in publications claiming that asymptomatic people could be a significant source of SARS-CoV-2 transmission and, therefore, of COVID-19 in others. Every paper making this claim should have its materials and methods section carefully evaluated to determine whether the cycle threshold cut-off for the RT-PCR assay was based on the appropriate calibration method or was selected arbitrarily.
Here is a list of other influential publications of original research studies that erroneously concluded that asymptomatic people might be significant sources of replication-competent SARS-CoV-2 virions. Most are based on fatally flawed RT-PCR testing and the remaining papers fail to disclose how they defined an ‘infection’. All of them should be retracted. None of their conclusions can be trusted…
- Bai, Y. et al. Presumed Asymptomatic Carrier Transmission of COVID-19. JAMA 323, 1406-1407 (2020).
- Arons, M.M. et al. Presymptomatic SARS-CoV-2 Infections and Transmission in a Skilled Nursing Facility. The New England journal of medicine 382, 2081-2090 (2020).
- Stock, A.D. et al. COVID-19 Infection Among Healthcare Workers: Serological Findings Supporting Routine Testing. Front Med (Lausanne) 7, 471 (2020).
- Bi, Q. et al. Epidemiology and transmission of COVID-19 in 391 cases and 1286 of their close contacts in Shenzhen, China: a retrospective cohort study. The Lancet. Infectious diseases 20, 911-919 (2020).
- Böhmer, M.M. et al. Investigation of a COVID-19 outbreak in Germany resulting from a single travel-associated primary case: a case series. The Lancet. Infectious diseases 20, 920-928 (2020).
- Chan, J.F. et al. A familial cluster of pneumonia associated with the 2019 novel coronavirus indicating person-to-person transmission: a study of a family cluster. Lancet (London, England) 395, 514-523 (2020).
- Van Vinh Chau, N. et al. The Natural History and Transmission Potential of Asymptomatic Severe Acute Respiratory Syndrome Coronavirus 2 Infection. Clinical infectious diseases 71, 2679-2687 (2020).
- Chaw, L. et al. Analysis of SARS-CoV-2 Transmission in Different Settings, Brunei. Emerging infectious diseases 26, 2598-2606 (2020).
- Cheng, H.Y. et al. Contact Tracing Assessment of COVID-19 Transmission Dynamics in Taiwan and Risk at Different Exposure Periods Before and After Symptom Onset. JAMA internal medicine 180, 1156-1163 (2020).
- Gao, M. et al. A study on infectivity of asymptomatic SARS-CoV-2 carriers. Respiratory medicine 169, 106026 (2020).
- Gao, Y. et al. A cluster of the Corona Virus Disease 2019 caused by incubation period transmission in Wuxi, China. The Journal of infection 80, 666-670 (2020).
- Guan, W.J. et al. Clinical Characteristics of Coronavirus Disease 2019 in China. The New England journal of medicine 382, 1708-1720 (2020).
- He, X. et al. Temporal dynamics in viral shedding and transmissibility of COVID-19. Nat Med 26, 672-675 (2020).
- Hodcroft, E.B. Preliminary case report on the SARS-CoV-2 cluster in the UK, France, and Spain. Swiss medical weekly 150 (2020).
- Hoehl, S. et al. Evidence of SARS-CoV-2 Infection in Returning Travelers from Wuhan, China. The New England journal of medicine 382, 1278-1280 (2020).
- Lauer, S.A. et al. The Incubation Period of Coronavirus Disease 2019 (COVID-19) From Publicly Reported Confirmed Cases: Estimation and Application. Annals of internal medicine 172, 577-582 (2020).
- Li, R. et al. Substantial undocumented infection facilitates the rapid dissemination of novel coronavirus (SARS-CoV-2). Science (New York, N.Y.) 368, 489-493 (2020).
- Li, C. et al. Asymptomatic and Human-to-Human Transmission of SARS-CoV-2 in a 2-Family Cluster, Xuzhou, China. Emerging infectious diseases 26, 1626-1628 (2020).
- Liu, Y., Funk, S. & Flasche, S. The contribution of pre-symptomatic infection to the transmission dynamics of COVID-2019. Wellcome open research 5, 58 (2020).
- Lu, X. et al. SARS-CoV-2 Infection in Children. The New England journal of medicine 382, 1663-1665 (2020).
- Lu, S. et al. Alert for non-respiratory symptoms of coronavirus disease 2019 patients in epidemic period: A case report of familial cluster with three asymptomatic COVID-19 patients. Journal of medical virology 93, 518-521 (2021).
- Luo, S.H. et al. A confirmed asymptomatic carrier of 2019 novel coronavirus. Chinese medical journal 133, 1123-1125 (2020).
- Mizumoto, K., Kagaya, K., Zarebski, A. & Chowell, G. Estimating the asymptomatic proportion of coronavirus disease 2019 (COVID-19) cases on board the Diamond Princess cruise ship, Yokohama, Japan, 2020. Euro surveillance 25 (2020).
- Sun, K. et al. Transmission heterogeneities, kinetics, and controllability of SARS-CoV-2. Science (New York, N.Y.) 371 (2021).
- Nishiura, H. et al. Estimation of the asymptomatic ratio of novel coronavirus infections (COVID-19). Int J Infect Dis 94, 154-155 (2020).
- Nishiura, H., Linton, N.M. & Akhmetzhanov, A.R. Serial interval of novel coronavirus (COVID-19) infections. Int J Infect Dis 93, 284-286 (2020).
- Pan, Y., Zhang, D., Yang, P., Poon, L.L.M. & Wang, Q. Viral load of SARS-CoV-2 in clinical samples. The Lancet. Infectious diseases 20, 411-412 (2020).
- Pan, X. et al. Asymptomatic cases in a family cluster with SARS-CoV-2 infection. The Lancet. Infectious diseases 20, 410-411 (2020).
- Park, S.Y. et al. Coronavirus Disease Outbreak in Call Center, South Korea. Emerging infectious diseases 26, 1666-1670 (2020).
- Payne, D.C. et al. SARS-CoV-2 Infections and Serologic Responses from a Sample of U.S. Navy Service Members - USS Theodore Roosevelt, April 2020. MMWR. Morbidity and mortality weekly report 69, 714-721 (2020).
- Kimball, A. et al. Asymptomatic and Presymptomatic SARS-CoV-2 Infections in Residents of a Long-Term Care Skilled Nursing Facility - King County, Washington, March 2020. MMWR. Morbidity and mortality weekly report 69, 377-381 (2020).
- Qian, G. et al. COVID-19 Transmission Within a Family Cluster by Presymptomatic Carriers in China. Clinical infectious diseases 71, 861-862 (2020).
- Ran, L. et al. Risk Factors of Healthcare Workers With Coronavirus Disease 2019: A Retrospective Cohort Study in a Designated Hospital of Wuhan in China. Clinical infectious diseases 71, 2218-2221 (2020).
- Rosenberg, E.S. et al. COVID-19 Testing, Epidemic Features, Hospital Outcomes, and Household Prevalence, New York State-March 2020. Clinical infectious diseases 71, 1953-1959 (2020).
- Sakurai, A. et al. Natural History of Asymptomatic SARS-CoV-2 Infection. The New England journal of medicine 383, 885-886 (2020).
- Samsami, M., Zebarjadi Bagherpour, J., Nematihonar, B. & Tahmasbi, H. COVID-19 Pneumonia in Asymptomatic Trauma Patients; Report of 8 Cases. Archives of academic emergency medicine 8, e46 (2020).
- Tabata, S. et al. Clinical characteristics of COVID-19 in 104 people with SARS-CoV-2 infection on the Diamond Princess cruise ship: a retrospective analysis. The Lancet. Infectious diseases 20, 1043-1050 (2020).
- Tong, Z.D. et al. Potential Presymptomatic Transmission of SARS-CoV-2, Zhejiang Province, China, 2020. Emerging infectious diseases 26, 1052-1054 (2020).
- Treibel, T.A. et al. COVID-19: PCR screening of asymptomatic health-care workers at London hospital. Lancet (London, England) 395, 1608-1610 (2020).
- Wei, W.E. et al. Presymptomatic Transmission of SARS-CoV-2 - Singapore, January 23-March 16, 2020. MMWR. Morbidity and mortality weekly report 69, 411-415 (2020).
- Xu, J., Li, Y., Gan, F., Du, Y. & Yao, Y. Salivary Glands: Potential Reservoirs for COVID-19 Asymptomatic Infection. Journal of dental research 99, 989 (2020).
- Yang, R., Gui, X. & Xiong, Y. Comparison of Clinical Characteristics of Patients with Asymptomatic vs Symptomatic Coronavirus Disease 2019 in Wuhan, China. JAMA network open 3, e2010182 (2020).
- Yang, N. et al. In-flight transmission cluster of COVID-19: a retrospective case series. Infectious diseases (London, England) 52, 891-901 (2020).
- Ye, F. et al. Delivery of infection from asymptomatic carriers of COVID-19 in a familial cluster. International journal of infectious diseases 94, 133-138 (2020).
- Yu, P., Zhu, J., Zhang, Z. & Han, Y. A Familial Cluster of Infection Associated With the 2019 Novel Coronavirus Indicating Possible Person-to-Person Transmission During the Incubation Period. The Journal of infectious diseases 221, 1757-1761 (2020).
- Zhang, J., Tian, S., Lou, J. & Chen, Y. Familial cluster of COVID-19 infection from an asymptomatic. Critical care (London, England) 24, 119 (2020).
- Almadhi, M.A. et al. The high prevalence of asymptomatic SARS-CoV-2 infection reveals the silent spread of COVID-19. International journal of infectious diseases 105, 656-661 (2021).
- Choi, A. et al. Symptomatic and Asymptomatic Transmission of SARS-CoV-2 in K-12 Schools, British Columbia, Canada April to June 2021. Microbiology spectrum, e0062222 (2022).
…these 48 papers represent most, if not all, of the peer-reviewed scientific evidence that has been used by most public health officials to mislabel asymptomatic people as sources of COVID-19-causing SARS-CoV-2. All of it is fatally flawed.
One study even concluded that patients testing ‘positive’ with cycle threshold values above 33 could likely be discharged from hospital. Such a recommendation would never be made if there were any evidence that these people harboured SARS-CoV-2 virions with the potential to infect others. So one must wonder why testing labs were allowed to arbitrarily pick cycle thresholds ranging from 38 to 45 as upper limits for defining the presence of infectious SARS-CoV-2.
Exclusive reliance on improperly calibrated RT-PCR testing as an indication of ‘infection’ has also led to the erroneous conclusion that post-symptomatic people may also need to be masked and/or isolated.
I have yet to see appropriate scientific evidence to justify the unusually high cycle threshold values being used in studies that label people as asymptomatic sources of COVID-19. In the absence of such data, there is no justification for masking, isolating or mandating experimental vaccine technologies for asymptomatic people.
Others have also criticized the exclusive use of RT-PCR tests in diagnosing COVID-19 and drawing conclusions about transmission in the absence of infectivity testing.
How RT-PCR Testing Should Have Been Used to Support Diagnoses of COVID-19
All labs should have been required to calibrate their RT-PCR test prior to providing any ‘real world’ data to public health officials that would be used to study the transmission of SARS-CoV-2. Use of the gold standard functional virology assay to do this calibration would have provided each lab with a strong objective rationale for their specific cycle threshold cut-off value when determining whether a person could have the potential to infect others. And this should have always been married to a clinical diagnosis rendered by a physician. As mentioned earlier, if this standard is applied retroactively to the COVID-19 scientific literature, it becomes obvious that much of it is untrustworthy.
Much of the Foundational COVID-19 Science is Fundamentally Flawed
RT-PCR testing has generally been misused during the declared COVID-19 pandemic due to failures to calibrate it properly. The result has been the mislabeling of asymptomatic people as significant potential sources of SARS-CoV-2 transmission. This, in turn, has resulted in inappropriate mandating of masking, isolation, and ‘vaccines’ for people who do not represent a genuine health risk to others. It has also taken diagnostic expertise away from physicians and placed it in the hands of anonymous laboratory technicians.
Now, we are left with a mountain of COVID-19 science that cannot be interpreted properly. Scientists with integrity and the relevant expertise know that a substantial but undefined number of people who tested ‘positive for COVID-19’ never had the potential to spread SARS-CoV-2 to others, and that many of them never actually had the disease known as COVID-19.
Resolving the Apparent Conflicts in Evidence Presented by ‘Experts’
To judges who are puzzled by the differing interpretations of experts in their courts, the explanation is fairly simple. If you remove the fundamentally flawed science from expert reports, you will be left with trustworthy data that generally do not support what has been the prevailing narrative over the past several years. When scientists talk about following the overall weight of the scientific evidence, what we really mean is to follow the weight of the trustworthy scientific evidence. Do not get bedazzled by the numerous reports that have accumulated, often in ‘prestigious’ journals, that were based on flawed scientific methods. Don’t get distracted by the number of health ‘authorities’ that have blindly propagated this flawed science. Truth is not a democracy. It is not defined by a majority vote.
Harm to Public Trust in Science
The global propagation of poorly conducted science over the past several years has caused massive and irreparable harm. Children and teenagers took the brunt of this damage. They were given no choice. They had no voice. They became shields used in a conflict waged by adults who wielded faulty science like it was the gospel truth.
As a scientist with deep expertise in viral immunology, I am incredibly disheartened by the state of my scientific disciplines. My colleagues who sat in their ivory towers allowing junk science to justify crushing constitutional freedoms should be ashamed of themselves. I am proud of the relatively few who stood tall on a foundation of integrity and endured brutal treatment over the past couple of years. I can only hope that the harm done to public trust in the health sciences can be remedied.
In the early-to-mid 1900s, rates of stroke & heart attack were on the rise.
Smoking was a known contributor to this phenomenon. It is a dense source of oxidative stress. When you take your first puff of cigarette smoke, it feels like your chest is on fire. In more ways than one, it is.
Oxidation reactions need fuel. So, if smoking provides the oxidative stress...then what is the fuel?
Unfortunately for us, there was a second culprit blamed for the rise of cardiovascular disease - saturated fats. That is the topic of this article.
Fire & Mitochondria
Before we continue with the danger to the circulatory system, we need to take a brief detour to discuss the relationship between fire and metabolism.
. . .
The evidence is overwhelming (experimental and clinical) that oxidation of PUFAs plays a role in every step of atherosclerosis: fatty streak formation, atheromatous lesion growth, plaque instability and rupture. Instability & rupture cause strokes in the brain, heart attacks, and infarctions in every organ that has limited blood supply. The liver and skin are less impacted - they have a lot of redundant blood supply, and usually the other critical organs fail first (e.g. brain, heart, kidneys).
The above diagrammed chain reaction is the reason why PUFAs are so dangerous.
Recall the description of combustion:
The combustion of fuel occurs in the absence of enzymes, and relies on a chain reaction to propagate itself until the fuel or oxygen has been consumed.
Lipid peroxidation chain reactions will continue until one radical encounters another radical, neutralizing one another.
Now, imagine that the cells which line your blood vessel (endothelium) have PUFAs in their membranes. The red blood cell membranes also have PUFAs. The balls of cholesterol floating in your blood stream have PUFAs. They are all waiting for an oxidation event.
Oxidation-reduction reactions are the norm throughout the body. But in the presence of PUFAs, they can set off chain reactions that propagate uncontrolled - whether along the inner wall of a blood vessel, or within a ball of cholesterol rushing towards your brain.
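To see why the PUFA content of a membrane matters so much, here is a toy Monte Carlo sketch of a radical chain reaction. Every number in it is invented for illustration (real peroxidation kinetics are far more complex); the point is only the shape of the relationship:

```python
import random

# Toy model of a lipid peroxidation chain: at each step the radical
# either attacks another oxidizable PUFA (propagation) or is quenched,
# e.g. by meeting another radical (termination). Probabilities are made up.
def chain_length(pufa_fraction, rng):
    """Number of lipids damaged before one initiation event dies out."""
    damaged = 0
    while rng.random() < pufa_fraction:   # propagation: PUFA -> new radical
        damaged += 1
    return damaged                        # termination

for frac in (0.5, 0.9, 0.99):             # chance the next encounter is an oxidizable PUFA
    runs = [chain_length(frac, random.Random(seed)) for seed in range(2000)]
    print(frac, round(sum(runs) / len(runs), 1))
```

In this toy model the mean chain length grows roughly as p/(1-p): at a 50% propagation chance, each initiation damages about one lipid; at 99%, about a hundred. A modest enrichment of membranes with PUFAs buys a disproportionate amount of damage per oxidation event.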
The endothelium (blood vessel lining) is a very sensitive single-layer membrane that comes in direct contact with the contents of blood. Even a microsecond of a PUFA oxidation event can damage it, exposing the layers underneath. Once these layers are exposed, one of two things can happen which are potentially devastating:
- Other reactive fatty acids (more PUFAs, for example) can bind to it. This results in fatty streak formation, and eventual atherosclerosis.
- Platelets can start binding and aggregating to exposed basement membrane proteins. This can lead to clot formation ⇨ infarcts
This is why ulcerated atherosclerotic plaques are predictive of impending stroke, heart attack, or dissection. In my estimation, this mechanism also plays a role in aneurysm rupture.
Death by a Thousand Cuts
If the damage caused by these PUFAs was a brief episode, you probably wouldn't suffer long-term illness.
But, many people who consume PUFAs do so for years and decades, if not their whole life. You can hardly blame them - seed oils are in almost everything we consume. In fact, people who try to avoid seed oils often have a hard time both in restaurants and the grocery store. It is so widespread, it has likely contributed to the recent trend of extreme elimination diets.
These elimination diets have a point.
Lipid peroxidation and highly inflammatory states contribute to almost everything that's causing illness in the modern world:
- Cardiovascular disease (leading cause of death and disability)
- Cancer
- Dementia
- Kidney Failure
- Autoimmunity
- Even Osteoporosis ⇨ fractures, which are often categorized as traumatic
As a little experiment, go to a search engine and query:
"Lipid peroxidation" and "any chronic illness"
Every tissue needs blood perfusion. By slowly destroying the blood supply, you develop micro-infarcts that accumulate across time. And, by delivering highly reactive PUFAs to an organ, you are actively oxidizing it. It's lose-lose.
Brain MRI demonstrating grade of 'microvascular ischemia' - tiny infarcts that manifest as confluent loss of white matter across time
Connection to COVID
There are two connections to COVID, actually.
The first is reasonably well encapsulated in this tweet:
I believe the abnormally high rates of illness this past 2 years is due to:
1. Lockdowns/restrictions and the downstream health consequences
2. Inflation causing people to shift towards more affordable food (seed oil, cheap carbs)
3. Global-scale clinical trial

No one virus.
— Remnant | MD (@RemnantMd) April 17, 2022
When the lockdown and restrictions started to get really heavy-handed, reasonable people were concerned with the behavioral changes that would be harmful:
- More sedentary life
- Ordering food (for many reasons, including supporting local businesses)
- Increased alcohol and drug use
This is one connection to the "COVID era," and we are now seeing the consequences of these behaviors. I know some people want to blame the differences in rates of illness on the vaccines, but I think that is only one of three contributing factors.
Revisiting Oxidation
The second connection is directly related to oxidative stress.
Our lungs help bring oxygen into the bloodstream, so that it can be circulated to our organs. Oxygen is just one oxidant in the air. There's also ozone, amongst others. Our lungs are constantly exposed to oxidative stress - especially if you are a smoker.
You would think that our lungs have evolved mechanisms to resist oxidants - and they have! For example, surfactant (the fluid that keeps your lungs open) has a protein which binds to and neutralizes oxidants. But, even the lungs have their limit. At some point, they will give in. What happens when the lungs fail?
The lungs provide a boundary that allows your blood to engage with the air without letting free air into your blood stream, or fluid moving into your lungs. So when the lung fails...fluid tends to accumulate in the lungs.
Acute Respiratory Distress Syndrome
ARDS is a type of respiratory failure characterized by rapid onset of widespread inflammation in the lungs - leading to fluid build-up.
Obesity is a risk factor for ARDS. It also happens to be a major risk factor for severe COVID-19. One mechanism accounting for this is that people who are obese tend to have a diet high in PUFAs. PUFAs stored in fat can be oxidized as part of a chain reaction, which will lead to a higher severity/amplitude of inflammation.
In the early days of COVID, people were admitted left and right with what looked like ARDS. Our central medical authorities decided this was 'severe COVID.'
But, if you take an unhealthy person, with an unhealthy diet...expose them to an infection...then refuse to treat them for 10 days...a large proportion of them will get ARDS. In this setting any infection is potentially fatal. If you are old, obese, or have high baseline oxidative stress...any infection can get real bad, real fast.
What is the Crime?
In the 1950s Ancel Keys was trying to make a name for himself. He saw an opportunity with heart disease. He had (stole) a hypothesis - and he ran with it.
But, how did he make it so far? Who were his benefactors? One group was Proctor & Gamble.
In the early 1900s, P&G were in the business of making wax and soap - from animal fat. At some point, animal fat got expensive. So, P&G collaborated with a German who had developed the process of hydrogenation. Here they saw an opportunity to mill cottonseed into oil, and hydrogenate it into something that was 'fatty.' This became a substitute for their wax & soap production, which allowed them to scale.
Eventually, electricity made candlelight obsolete, which meant that P&G were stuck with a cheap way of making "oil" but no product to put it in. That is, until unsaturated fatty acids were hailed (by Ancel Keys et al) as a healthy alternative. P&G repurposed the cottonseed oil into Crisco vegetable oil - without telling people what was in it.
For some people in certain positions (e.g. P&G executives), the idea that their product might cause mass harm and suffering does not even cross their mind.
And, why should it?
The WHO & AHA have promoted PUFAs for decades, and to some extent still do.
[This was an answer to the 2013 edge.org question, What should we be worried about? That link is the official page, and this is an archive.org capture of that page.
Scroll down about a third, and right between Naughton and Scheiner, the archive has an answer by mathematician Steven Strogatz which is missing from the newest version, even though the count at the top still says there are 155 responses. This is the missing response in full.]
In every realm where we exist as a collective -- in society, in the global economy, on the Internet -- we are blithely increasing the coupling between us, with no idea what that might entail.
"Coupling" refers to the ability of one part of a complex system to influence another. If I put a hundred metronomes on the floor and set them ticking, they'll each do their own thing, swinging at their own rhythm. In this condition, they're not yet coupled. Because the floor is rigid, the metronomes can't feel each other's vibrations, at least not enough to make a difference. But now place them all on a movable platform like the seat of a child's swing. The metronomes will start to feel each other's jiggling. The swing will start to sway, imperceptibly at first, but enough to disturb each metronome and alter its rhythm. Eventually the whole system will synchronize, with all the metronomes ticking in unison. By allowing the metronomes to impose themselves on each other through the vibrations they impart to the movable platform, we have coupled the system and changed its dynamics radically.
In all sorts of complex systems, this is the general trend: increasing the coupling between the parts seems harmless enough at first. But then, abruptly, when the coupling crosses a critical value, everything changes. The exact nature of the altered state isn't easy to foretell. It depends on the system's details. But it's always something qualitatively different from what came before. Sometimes desirable, sometimes deadly.
I worry that we're playing the coupling game with ourselves, collectively. With our cell phones and GPS trackers and social media, with globalization, with the coming Internet of things, we're becoming more tightly connected than ever. Of course, maybe that's good. Greater coupling means faster and easier communication and sharing. We can often do more together than apart.
But the math suggests that increasing coupling is a siren's song. Too much makes a complex system brittle. In economics and business, the wisdom of the crowd works only if the individuals within it are independent, or nearly so. Loosely coupled crowds are the only wise ones.
The human brain is the most exquisitely coupled system we know of, but the coupling between different brain areas has been honed by evolution to allow for the subtleties of attention, memory, perception, and consciousness. Too much coupling produces pathological synchrony: the rhythmic convulsions and loss of consciousness associated with epileptic seizures.
Propagating malware, worldwide pandemics, flash crashes -- all symptoms of too much coupling. Unfortunately it's hard to predict how much coupling is too much. We only know that we want more, and that more is better... until it isn't.
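[The metronome picture above corresponds to a classic model of coupled oscillators, the Kuramoto model. The following minimal simulation, with illustrative parameters of my own choosing, reproduces Strogatz's point: below a critical coupling K the synchrony measure r stays near zero, and once K crosses that value the population abruptly locks together.]

```python
import numpy as np

# Toy Kuramoto model: n oscillators ("metronomes"), each with its own
# natural frequency, nudged toward the group's average phase with
# strength K. All parameter values here are illustrative.
rng = np.random.default_rng(0)
n, dt, steps = 200, 0.05, 2000
omega = rng.normal(0.0, 1.0, n)            # natural frequencies
theta0 = rng.uniform(0, 2 * np.pi, n)      # initial phases

def simulate(K):
    """Return the order parameter r after the system settles.
    r ~ 0 means incoherence; r ~ 1 means the metronomes tick in unison."""
    theta = theta0.copy()
    for _ in range(steps):
        z = np.exp(1j * theta).mean()      # complex order parameter
        theta += dt * (omega + K * abs(z) * np.sin(np.angle(z) - theta))
    return abs(np.exp(1j * theta).mean())

for K in (0.5, 1.0, 1.6, 2.0, 3.0):        # critical K is ~1.6 for this frequency spread
    print(K, round(simulate(K), 2))
```

[The transition is not gradual: r hovers near zero until K approaches the critical value, then climbs steeply. That is the "abruptly, everything changes" behavior described above.]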
El gato malo does more intelligent analysis in a week than the idiots in our governments do in a year.
Today’s analysis suggests Dr. Geert Vanden Bossche was correct in predicting that applying a leaky vaccine effective at preventing sickness in the middle of a pandemic was a very bad idea.
https://boriquagato.substack.com/p/are-leaky-vaccines-driving-delta
all a virus wants is to replicate. “make a copy of me and pass it on.” that’s the biological imperative of the selfish gene. excel at it, you win. fail, you disappear. simple as that.
killing or harming the host is maladaptive to viral spread. it’s like burning down your own house with your car in the garage. now you have nowhere to live and no way to get around. that’s not a recipe for reproductive fitness.
so viruses evolve to become less, not more virulent. they do not want to kill you. ideally, they’d like to help you. figure out how to be a useful symbiote, and you get a huge boost in propagation. (mitochondria were probably bacteria that were so useful, all our cells incorporated them.)
so seeing case fatality rate (CFR) rise in a variant of a virus is like watching water flow uphill. it’s not supposed to do that and when it does, you need to suspect some external force acting on it.
and we’re seeing water flow uphill here.
Key points:
- Case Fatality Rate (CFR) is rising for Delta and is probably not caused by Antibody Dependent Enhancement (ADE) or Original Antigenic Sin (OAS) because CFR is rising in both vaccinated and unvaccinated, and is not rising in previously infected, and Vaccine Efficacy (VE) for deaths remains good.
- The most probable explanation is Vaccine Mediated Evolution (VME) in which a leaky vaccine that keeps the host healthy causes the virus to evolve to a more deadly variant.
- Vaccine Efficacy (VE) on spread is negative (bad) because infected people don’t know they’re infected which accelerates spread.
- Everyone is harmed but unvaccinated are worse off creating the illusion that the vaccines are a good idea.
it’s just simple math. if we do something to one group that makes their death rate rise from 1 to 2 per 100 but that also makes the death rate in another group rise from 1 to 4 per 100, that looks like a VE of 50%. in reality, it’s killing 100% more vaxxed people and 300% more of the unvaxxed.
mistaking that gas pedal for the brake and pushing ever harder when you fail to slow would represent an accelerating disaster curve.
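Here is gato's "simple math" spelled out in a few lines of Python, using only the numbers from the example above:

```python
# Apparent vaccine efficacy (VE) compares the two groups to each other,
# not to their own baselines, so it can look good while both get worse.
baseline = 1 / 100                              # death rate in both groups before
vaxxed_rate, unvaxxed_rate = 2 / 100, 4 / 100   # rates after, per the example

apparent_ve = 1 - vaxxed_rate / unvaxxed_rate   # 0.50 -> "a VE of 50%"
vaxxed_excess = vaxxed_rate / baseline - 1      # 1.0  -> +100% deaths among vaxxed
unvaxxed_excess = unvaxxed_rate / baseline - 1  # 3.0  -> +300% deaths among unvaxxed

print(f"apparent VE: {apparent_ve:.0%}")
print(f"extra deaths - vaxxed: {vaxxed_excess:.0%}, unvaxxed: {unvaxxed_excess:.0%}")
```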
I like that el gato malo seeks to prove himself wrong. That’s a strong signal for someone with integrity and intelligence that we should trust.
it’s still, of course, possible that i’m wrong, but this is looking more and more like it has to be the answer. i can find nothing else that fits the facts and the facts themselves are weird enough that “it’s just normal” does not look like a satisfying explanation either and we have enough features here that we can really start testing our puzzle pieces. this one aligns in an AWFUL lot of places.
for something this odd to happen, it takes a truly uncommon exogenous stressor.
i’m just not seeing what else it could be than vaccine mediated selection for hotter variants driving pernicious delta evolution.
so, i’m putting this out to you all to see if you can find some other explanation for what’s going on that fits these facts.
looking forward to the peer review as, honestly, i hope i’m wrong here. this is not an outcome that anyone wants. it’s the nightmare scenario both as a pandemic and as a political horror in the making as if this was an “own-goal”, what would the experts and politicians that pushed this plan not be willing to do to avoid accepting the blame?
because this is career or pharma franchise polonium, and that’s if you’re lucky.
I also very much like that el gato malo does not subscribe to crazy conspiracies that lack evidence. I would of course augment el gato malo’s explanation by including an element of genetic reality denial in our leaders.
“But what is the end game if purposefully designed this way?”
i don’t think it was. i think these fools really thought mRNA and adenovirus carrier vaccines would be sterilizing.
they pushed them as herd immunity.
having it all fall apart cornered them but by the time they knew it, they were “pot committed” and had already vaxxed 100’s of millions of people.
this has been this shiny tech they have been trying to make work (and recoup money on) for decades and failing over and over.
i doubt this was deliberate. it was just stunningly arrogant and reckless.
So now the million dollar question:
Assuming a better explanation does not emerge, what should an unvaccinated person do?
Prioritizing self-preservation this analysis suggests one should either:
- get vaccinated, or
- acquire natural immunity by deliberately getting infected before the variants become more deadly, and apply early treatment protocols to maximize the probability of a successful recovery.
Choosing to get vaccinated makes the most sense if:
- you are in a high risk group (old or obese)
- you do not care about worsening the overall outcome for both vaccinated and unvaccinated.
Choosing natural immunity makes the most sense if:
- you are in a low risk group
- you are concerned about the yet to be established long term health effects of the novel vaccines
- you want to be a good citizen and do what is best for everyone.
I’m old but not obese which makes the choice difficult.
I’m going to watch the data and hope for a better explanation to emerge for a while longer before making a decision.
You can’t make this shit up: observe that our “leaders” are pushing hard in exactly the opposite direction of what wise leaders would do if this VME hypothesis is correct:
- stop further vaccination of low risk people
- start collecting the data necessary to prove or disprove this hypothesis
- promote healthy immune systems (vitamin D, weight loss, etc.)
- aggressively evaluate and deploy promising early treatment protocols (Ivermectin etc.)
- aggressively investigate root causes and modify policies to prevent a recurrence.
One more observation to make you admire our “leaders” even less:
the same NIH that was funding the GoF research in wuhan miraculously had the viral code to drop into the moderna mRNA vaccine in under 2 weeks.
that always smelled like a sushi bar dumpster.
https://boriquagato.substack.com/p/were-some-folks-a-little-too-prepared
17-Oct-2021 Addition
In a paper published today, Dr. Geert Vanden Bossche argues that boosters will probably boost the virulence of Delta rather than provide long-term protection against severe disease.
Israel is misreading its booster results by tracking booster effectiveness for only 12 days.
https://www.geertvandenbossche.org/post/what-happens-if-israel-fails-the-stress-test
The Covid-19 pandemic has disrupted lives the world over for more than a year. Its death toll will soon reach three million people. Yet the origin of the pandemic remains uncertain: the political agendas of governments and scientists have generated thick clouds of obfuscation, which the mainstream press seems helpless to dispel.
In what follows I will sort through the available scientific facts, which hold many clues as to what happened, and provide readers with the evidence to make their own judgments. I will then try to assess the complex issue of blame, which starts with, but extends far beyond, the government of China.
By the end of this article, you may have learned a lot about the molecular biology of viruses. I will try to keep this process as painless as possible. But the science cannot be avoided because for now, and probably for a long time hence, it offers the only sure thread through the maze.
The virus that caused the pandemic is known officially as SARS-CoV-2, but can be called SARS2 for short. As many people know, there are two main theories about its origin. One is that it jumped naturally from wildlife to people. The other is that the virus was under study in a lab, from which it escaped. It matters a great deal which is the case if we hope to prevent a second such occurrence.
I’ll describe the two theories, explain why each is plausible, and then ask which provides the better explanation of the available facts. It’s important to note that so far there is no direct evidence for either theory. Each depends on a set of reasonable conjectures but so far lacks proof. So I have only clues, not conclusions, to offer. But those clues point in a specific direction. And having inferred that direction, I’m going to delineate some of the strands in this tangled skein of disaster.
A Tale of Two Theories
After the pandemic first broke out in December 2019, Chinese authorities reported that many cases had occurred in the wet market — a place selling wild animals for meat — in Wuhan. This reminded experts of the SARS1 epidemic of 2002 in which a bat virus had spread first to civets, an animal sold in wet markets, and from civets to people. A similar bat virus caused a second epidemic, known as MERS, in 2012. This time the intermediary host animal was camels.
The decoding of the virus’s genome showed it belonged to a viral family known as beta-coronaviruses, to which the SARS1 and MERS viruses also belong. The relationship supported the idea that, like them, it was a natural virus that had managed to jump from bats, via another animal host, to people. The wet market connection, the only other point of similarity with the SARS1 and MERS epidemics, was soon broken: Chinese researchers found earlier cases in Wuhan with no link to the wet market. But that seemed not to matter when so much further evidence in support of natural emergence was expected shortly.
Wuhan, however, is home of the Wuhan Institute of Virology, a leading world center for research on coronaviruses. So the possibility that the SARS2 virus had escaped from the lab could not be ruled out. Two reasonable scenarios of origin were on the table.
From early on, public and media perceptions were shaped in favor of the natural emergence scenario by strong statements from two scientific groups. These statements were not at first examined as critically as they should have been.
“We stand together to strongly condemn conspiracy theories suggesting that COVID-19 does not have a natural origin,” a group of virologists and others wrote in the Lancet on February 19, 2020, when it was really far too soon for anyone to be sure what had happened. Scientists “overwhelmingly conclude that this coronavirus originated in wildlife,” they said, with a stirring rallying call for readers to stand with Chinese colleagues on the frontline of fighting the disease.
Contrary to the letter writers’ assertion, the idea that the virus might have escaped from a lab invoked accident, not conspiracy. It surely needed to be explored, not rejected out of hand. A defining mark of good scientists is that they go to great pains to distinguish between what they know and what they don’t know. By this criterion, the signatories of the Lancet letter were behaving as poor scientists: they were assuring the public of facts they could not know for sure were true.
It later turned out that the Lancet letter had been organized and drafted by Peter Daszak, president of the EcoHealth Alliance of New York. Dr. Daszak’s organization funded coronavirus research at the Wuhan Institute of Virology. If the SARS2 virus had indeed escaped from research he funded, Dr. Daszak would be potentially culpable. This acute conflict of interest was not declared to the Lancet’s readers. To the contrary, the letter concluded, “We declare no competing interests.”
Peter Daszak, president of the EcoHealth Alliance
Virologists like Dr. Daszak had much at stake in the assigning of blame for the pandemic. For 20 years, mostly beneath the public’s attention, they had been playing a dangerous game. In their laboratories they routinely created viruses more dangerous than those that exist in nature. They argued they could do so safely, and that by getting ahead of nature they could predict and prevent natural “spillovers,” the cross-over of viruses from an animal host to people. If SARS2 had indeed escaped from such a laboratory experiment, a savage blowback could be expected, and the storm of public indignation would affect virologists everywhere, not just in China. “It would shatter the scientific edifice top to bottom,” an MIT Technology Review editor, Antonio Regalado, said in March 2020.
A second statement which had enormous influence in shaping public attitudes was a letter (in other words an opinion piece, not a scientific article) published on 17 March 2020 in the journal Nature Medicine. Its authors were a group of virologists led by Kristian G. Andersen of the Scripps Research Institute. “Our analyses clearly show that SARS-CoV-2 is not a laboratory construct or a purposefully manipulated virus,” the five virologists declared in the second paragraph of their letter.
Kristian G. Andersen, Scripps Research
Unfortunately this was another case of poor science, in the sense defined above. True, some older methods of cutting and pasting viral genomes retain tell-tale signs of manipulation. But newer methods, called “no-see-um” or “seamless” approaches, leave no defining marks. Nor do other methods for manipulating viruses such as serial passage, the repeated transfer of viruses from one culture of cells to another. If a virus has been manipulated, whether with a seamless method or by serial passage, there is no way of knowing that this is the case. Dr. Andersen and his colleagues were assuring their readers of something they could not know.
The discussion section of their letter begins, “It is improbable that SARS-CoV-2 emerged through laboratory manipulation of a related SARS-CoV-like coronavirus”. But wait, didn’t the lead say the virus had clearly not been manipulated? The authors’ degree of certainty seemed to slip several notches when it came to laying out their reasoning.
The reason for the slippage is clear once the technical language has been penetrated. The two reasons the authors give for supposing manipulation to be improbable are decidedly inconclusive.
First, they say that the spike protein of SARS2 binds very well to its target, the human ACE2 receptor, but does so in a different way from that which physical calculations suggest would be the best fit. Therefore the virus must have arisen by natural selection, not manipulation.
If this argument seems hard to grasp, it’s because it’s so strained. The authors’ basic assumption, not spelt out, is that anyone trying to make a bat virus bind to human cells could do so in only one way. First they would calculate the strongest possible fit between the human ACE2 receptor and the spike protein with which the virus latches onto it. They would then design the spike protein accordingly (by selecting the right string of amino acid units that compose it). But since the SARS2 spike protein is not of this calculated best design, the Andersen paper says, therefore it can’t have been manipulated.
But this ignores the way that virologists do in fact get spike proteins to bind to chosen targets, which is not by calculation but by splicing in spike protein genes from other viruses or by serial passage. With serial passage, each time the virus’s progeny are transferred to new cell cultures or animals, the more successful are selected until one emerges that makes a really tight bind to human cells. Natural selection has done all the heavy lifting. The Andersen paper’s speculation about designing a viral spike protein through calculation has no bearing on whether or not the virus was manipulated by one of the other two methods.
The authors’ second argument against manipulation is even more contrived. Although most living things use DNA as their hereditary material, a number of viruses use RNA, DNA’s close chemical cousin. But RNA is difficult to manipulate, so researchers working on coronaviruses, which are RNA-based, will first convert the RNA genome to DNA. They manipulate the DNA version, whether by adding or altering genes, and then arrange for the manipulated DNA genome to be converted back into infectious RNA.
Only a certain number of these DNA backbones have been described in the scientific literature. Anyone manipulating the SARS2 virus “would probably” have used one of these known backbones, the Andersen group writes, and since SARS2 is not derived from any of them, therefore it was not manipulated. But the argument is conspicuously inconclusive. DNA backbones are quite easy to make, so it’s obviously possible that SARS2 was manipulated using an unpublished DNA backbone.
And that’s it. These are the two arguments made by the Andersen group in support of their declaration that the SARS2 virus was clearly not manipulated. And this conclusion, grounded in nothing but two inconclusive speculations, convinced the world’s press that SARS2 could not have escaped from a lab. A technical critique of the Andersen letter takes it down in harsher words.
Science is supposedly a self-correcting community of experts who constantly check each other’s work. So why didn’t other virologists point out that the Andersen group’s argument was full of absurdly large holes? Perhaps because in today’s universities speech can be very costly. Careers can be destroyed for stepping out of line. Any virologist who challenges the community’s declared view risks having his next grant application turned down by the panel of fellow virologists that advises the government grant distribution agency.
The Daszak and Andersen letters were really political, not scientific statements, yet were amazingly effective. Articles in the mainstream press repeatedly stated that a consensus of experts had ruled lab escape out of the question or extremely unlikely. Their authors relied for the most part on the Daszak and Andersen letters, failing to understand the yawning gaps in their arguments. Mainstream newspapers all have science journalists on their staff, as do the major networks, and these specialist reporters are supposed to be able to question scientists and check their assertions. But the Daszak and Andersen assertions went largely unchallenged.
Doubts about natural emergence
Natural emergence was the media’s preferred theory until around February 2021 and the visit by a World Health Organization commission to China. The commission’s composition and access were heavily controlled by the Chinese authorities. Its members, who included the ubiquitous Dr. Daszak, kept asserting before, during and after their visit that lab escape was extremely unlikely. But this was not quite the propaganda victory the Chinese authorities may have been hoping for. What became clear was that the Chinese had no evidence to offer the commission in support of the natural emergence theory.
This was surprising because both the SARS1 and MERS viruses had left copious traces in the environment. The intermediary host species of SARS1 was identified within four months of the epidemic’s outbreak, and the host of MERS within nine months. Yet some 15 months after the SARS2 pandemic began, and a presumably intensive search, Chinese researchers had failed to find either the original bat population, or the intermediate species to which SARS2 might have jumped, or any serological evidence that any Chinese population, including that of Wuhan, had ever been exposed to the virus prior to December 2019. Natural emergence remained a conjecture which, however plausible to begin with, had gained not a shred of supporting evidence in over a year.
And as long as that remains the case, it’s logical to pay serious attention to the alternative conjecture, that SARS2 escaped from a lab.
Why would anyone want to create a novel virus capable of causing a pandemic? Ever since virologists gained the tools for manipulating a virus’s genes, they have argued they could get ahead of a potential pandemic by exploring how close a given animal virus might be to making the jump to humans. And that justified lab experiments in enhancing the ability of dangerous animal viruses to infect people, virologists asserted.
With this rationale, they have recreated the 1918 flu virus, shown how the almost extinct polio virus can be synthesized from its published DNA sequence, and introduced a smallpox gene into a related virus.
These enhancements of viral capabilities are known blandly as gain-of-function experiments. With coronaviruses, there was particular interest in the spike proteins, which jut out all around the spherical surface of the virus and pretty much determine which species of animal it will target. In 2000 Dutch researchers, for instance, earned the gratitude of rodents everywhere by genetically engineering the spike protein of a mouse coronavirus so that it would attack only cats.
The spike proteins on the coronavirus’s surface determine which animal it can infect. CDC.gov
Virologists started studying bat coronaviruses in earnest after these turned out to be the source of both the SARS1 and MERS epidemics. In particular, researchers wanted to understand what changes needed to occur in a bat virus’s spike proteins before it could infect people.
Researchers at the Wuhan Institute of Virology, led by China’s leading expert on bat viruses, Dr. Shi Zheng-li or “Bat Lady”, mounted frequent expeditions to the bat-infested caves of Yunnan in southern China and collected around a hundred different bat coronaviruses.
Dr. Shi then teamed up with Ralph S. Baric, an eminent coronavirus researcher at the University of North Carolina. Their work focused on enhancing the ability of bat viruses to attack humans so as to “examine the emergence potential (that is, the potential to infect humans) of circulating bat CoVs [coronaviruses].” In pursuit of this aim, in November 2015 they created a novel virus by taking the backbone of the SARS1 virus and replacing its spike protein with one from a bat virus (known as SHC014-CoV). This manufactured virus was able to infect the cells of the human airway, at least when tested against a lab culture of such cells.
The SHC014-CoV/SARS1 virus is known as a chimera because its genome contains genetic material from two strains of virus. If the SARS2 virus were to have been cooked up in Dr. Shi’s lab, then its direct prototype would have been the SHC014-CoV/SARS1 chimera, the potential danger of which concerned many observers and prompted intense discussion.
“If the virus escaped, nobody could predict the trajectory,” said Simon Wain-Hobson, a virologist at the Pasteur Institute in Paris.
Dr. Baric and Dr. Shi referred to the obvious risks in their paper but argued they should be weighed against the benefit of foreshadowing future spillovers. Scientific review panels, they wrote, “may deem similar studies building chimeric viruses based on circulating strains too risky to pursue.” Given various restrictions being placed on gain-of-function (GOF) research, matters had arrived in their view at “a crossroads of GOF research concerns; the potential to prepare for and mitigate future outbreaks must be weighed against the risk of creating more dangerous pathogens. In developing policies moving forward, it is important to consider the value of the data generated by these studies and whether these types of chimeric virus studies warrant further investigation versus the inherent risks involved.”
That statement was made in 2015. From the hindsight of 2021, one can say that the value of gain-of-function studies in preventing the SARS2 epidemic was zero. The risk was catastrophic, if indeed the SARS2 virus was generated in a gain-of-function experiment.
Inside the Wuhan Institute of Virology
Dr. Baric had developed, and taught Dr. Shi, a general method for engineering bat coronaviruses to attack other species. The specific targets were human cells grown in cultures and humanized mice. These laboratory mice, a cheap and ethical stand-in for human subjects, are genetically engineered to carry the human version of a protein called ACE2 that studs the surface of cells that line the airways.
Dr. Shi returned to her lab at the Wuhan Institute of Virology and resumed the work she had started on genetically engineering coronaviruses to attack human cells.
Dr. Zheng-li Shi in a high safety (level BSL4) lab. Her coronavirus research was done in the much lower safety levels of BSL2 and BSL3 labs.
How can we be so sure?
Because, by a strange twist in the story, her work was funded by the National Institute of Allergy and Infectious Diseases (NIAID), a part of the U.S. National Institutes of Health (NIH). And grant proposals that funded her work, which are a matter of public record, specify exactly what she planned to do with the money.
The grants were assigned to the prime contractor, Dr. Daszak of the EcoHealth Alliance, who subcontracted them to Dr. Shi. Here are extracts from the grants for fiscal years 2018 and 2019. “CoV” stands for coronavirus and “S protein” refers to the virus’s spike protein.
“Test predictions of CoV inter-species transmission. Predictive models of host range (i.e. emergence potential) will be tested experimentally using reverse genetics, pseudovirus and receptor binding assays, and virus infection experiments across a range of cell cultures from different species and humanized mice.”
“We will use S protein sequence data, infectious clone technology, in vitro and in vivo infection experiments and analysis of receptor binding to test the hypothesis that % divergence thresholds in S protein sequences predict spillover potential.”
What this means, in non-technical language, is that Dr. Shi set out to create novel coronaviruses with the highest possible infectivity for human cells. Her plan was to take genes that coded for spike proteins possessing a variety of measured affinities for human cells, ranging from high to low. She would insert these spike genes one by one into the backbone of a number of viral genomes (“reverse genetics” and “infectious clone technology”), creating a series of chimeric viruses. These chimeric viruses would then be tested for their ability to attack human cell cultures (“in vitro”) and humanized mice (“in vivo”). And this information would help predict the likelihood of “spillover,” the jump of a coronavirus from bats to people.
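In software terms, the plan amounts to a grid search over backbone/spike combinations. Here is a minimal sketch of that combinatorial logic in Python; every name and affinity value below is invented for illustration, since the actual work was wet-lab experimentation, not code:

```python
from itertools import product

# Hypothetical backbones and spike genes; the names and affinity scores
# are invented for illustration, not taken from any real experiment.
backbones = ["backbone_1", "backbone_2", "backbone_3"]
spike_affinity = {"spike_A": 0.9, "spike_B": 0.5, "spike_C": 0.1}

# Each backbone/spike pairing is one chimeric virus to construct and test.
for backbone, spike in product(backbones, spike_affinity):
    score = spike_affinity[spike]
    # In the lab, each chimera would be assayed in vitro (human cell
    # cultures) and in vivo (humanized mice); here we only enumerate.
    print(f"{backbone} + {spike}: spike affinity for human cells = {score}")
```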
The methodical approach was designed to find the best combination of coronavirus backbone and spike protein for infecting human cells. The approach could have generated SARS2-like viruses, and indeed may have created the SARS2 virus itself with the right combination of virus backbone and spike protein.
It cannot yet be stated that Dr. Shi did or did not generate SARS2 in her lab because her records have been sealed, but she was certainly on the right track to have done so. “It is clear that the Wuhan Institute of Virology was systematically constructing novel chimeric coronaviruses and was assessing their ability to infect human cells and human-ACE2-expressing mice,” says Richard H. Ebright, a molecular biologist at Rutgers University and leading expert on biosafety.
“It is also clear,” Dr. Ebright said, “that, depending on the constant genomic contexts chosen for analysis, this work could have produced SARS-CoV-2 or a proximal progenitor of SARS-CoV-2.” “Genomic context” refers to the particular viral backbone used as the testbed for the spike protein.
The lab escape scenario for the origin of the SARS2 virus, as should by now be evident, is not mere hand-waving in the direction of the Wuhan Institute of Virology. It is a detailed proposal, based on the specific project being funded there by the NIAID.
Even if the grant required the work plan described above, how can we be sure that the plan was in fact carried out? For that we can rely on the word of Dr. Daszak, who has spent the last 15 months protesting that lab escape is a ludicrous conspiracy theory invented by China-bashers.
On 9 December 2019, before the outbreak of the pandemic became generally known, Dr. Daszak gave an interview in which he talked in glowing terms of how researchers at the Wuhan Institute of Virology had been reprogramming the spike protein and generating chimeric coronaviruses capable of infecting humanized mice.
“And we have now found, you know, after 6 or 7 years of doing this, over 100 new SARS-related coronaviruses, very close to SARS,” Dr. Daszak says around minute 28 of the interview. “Some of them get into human cells in the lab, some of them can cause SARS disease in humanized mice models and are untreatable with therapeutic monoclonals and you can’t vaccinate against them with a vaccine. So, these are a clear and present danger….
“Interviewer: You say these are diverse coronaviruses and you can’t vaccinate against them, and no anti-virals — so what do we do?
“Daszak: Well I think…coronaviruses — you can manipulate them in the lab pretty easily. Spike protein drives a lot of what happen with coronavirus, in zoonotic risk. So you can get the sequence, you can build the protein, and we work a lot with Ralph Baric at UNC to do this. Insert into the backbone of another virus and do some work in the lab. So you can get more predictive when you find a sequence. You’ve got this diversity. Now the logical progression for vaccines is, if you are going to develop a vaccine for SARS, people are going to use pandemic SARS, but let’s insert some of these other things and get a better vaccine.” The insertions he referred to perhaps included an element called the furin cleavage site, discussed below, which greatly increases viral infectivity for human cells.
In disjointed style, Dr. Daszak is referring to the fact that once you have generated a novel coronavirus that can attack human cells, you can take the spike protein and make it the basis for a vaccine.
One can only imagine Dr. Daszak’s reaction when he heard of the outbreak of the epidemic in Wuhan a few days later. He would have known better than anyone the Wuhan Institute’s goal of making bat coronaviruses infectious to humans, as well as the weaknesses in the institute’s defense against their own researchers becoming infected.
But instead of providing public health authorities with the plentiful information at his disposal, he immediately launched a public relations campaign to persuade the world that the epidemic couldn’t possibly have been caused by one of the institute’s souped-up viruses. “The idea that this virus escaped from a lab is just pure baloney. It’s simply not true,” he declared in an April 2020 interview.
The Safety Arrangements at the Wuhan Institute of Virology
Dr. Daszak was possibly unaware of, or perhaps he knew all too well, the long history of viruses escaping from even the best-run laboratories. The smallpox virus escaped three times from labs in England in the 1960s and 1970s, causing 80 cases and 3 deaths. Dangerous viruses have leaked out of labs almost every year since. Coming to more recent times, the SARS1 virus has proved a true escape artist, leaking from laboratories in Singapore, Taiwan, and no fewer than four times from the Chinese National Institute of Virology in Beijing.
One reason SARS1 was so hard to handle was that no vaccines were available to protect laboratory workers. As Dr. Daszak mentioned in his December 2019 interview quoted above, the Wuhan researchers too had been unable to develop vaccines against the coronaviruses they had designed to infect human cells. They would have been as defenseless against the SARS2 virus, if it were generated in their lab, as their Beijing colleagues were against SARS1.
A second reason for the severe danger of novel coronaviruses has to do with the required levels of lab safety. There are four degrees of safety, designated BSL1 to BSL4, with BSL4 being the most restrictive and designed for deadly pathogens like the Ebola virus.
The Wuhan Institute of Virology had a new BSL4 lab, but its state of readiness considerably alarmed the State Department inspectors who visited it from the Beijing embassy in 2018. “The new lab has a serious shortage of appropriately trained technicians and investigators needed to safely operate this high-containment laboratory,” the inspectors wrote in a cable of 19 January 2018.
The real problem, however, was not the unsafe state of the Wuhan BSL4 lab but the fact that virologists worldwide don’t like working in BSL4 conditions. You have to wear a space suit, do operations in closed cabinets and accept that everything will take twice as long. So the rules assigning each kind of virus to a given safety level were laxer than some might think was prudent.
Before 2020, the rules followed by virologists in China and elsewhere required that experiments with the SARS1 and MERS viruses be conducted in BSL3 conditions. But all other bat coronaviruses could be studied in BSL2, the next level down. BSL2 requires taking fairly minimal safety precautions, such as wearing lab coats and gloves, not sucking up liquids in a pipette, and putting up biohazard warning signs. Yet a gain-of-function experiment conducted in BSL2 might produce an agent more infectious than either SARS1 or MERS. And if it did, then lab workers would stand a high chance of infection, especially if unvaccinated.
Much of Dr. Shi’s work on gain-of-function in coronaviruses was performed at the BSL2 safety level, as is stated in her publications and other documents. She has said in an interview with Science magazine that “The coronavirus research in our laboratory is conducted in BSL-2 or BSL-3 laboratories.”
“It is clear that some or all of this work was being performed using a biosafety standard — biosafety level 2, the biosafety level of a standard US dentist’s office — that would pose an unacceptably high risk of infection of laboratory staff upon contact with a virus having the transmission properties of SARS-CoV-2,” says Dr. Ebright.
“It also is clear,” he adds, “that this work never should have been funded and never should have been performed.”
This is a view he holds regardless of whether or not the SARS2 virus ever saw the inside of a lab.
Concern about safety conditions at the Wuhan lab was not, it seems, misplaced. According to a fact sheet issued by the State Department on January 21, 2021, “The U.S. government has reason to believe that several researchers inside the WIV became sick in autumn 2019, before the first identified case of the outbreak, with symptoms consistent with both COVID-19 and common seasonal illnesses.”
David Asher, a fellow of the Hudson Institute and former consultant to the State Department, provided more detail about the incident at a seminar. Knowledge of the incident came from a mix of public information and “some high end information collected by our intelligence community,” he said. Three people working at a BSL3 lab at the institute fell sick within a week of each other with severe symptoms that required hospitalization. This was “the first known cluster that we’re aware of, of victims of what we believe to be COVID-19.” Influenza could not completely be ruled out but seemed unlikely in the circumstances, he said.
Comparing the Rival Scenarios of SARS2 Origin
The evidence above adds up to a serious case that the SARS2 virus could have been created in a lab, from which it then escaped. But the case, however substantial, falls short of proof. Proof would consist of evidence from the Wuhan Institute of Virology, or related labs in Wuhan, that SARS2 or a predecessor virus was under development there. For lack of access to such records, another approach is to take certain salient facts about the SARS2 virus and ask how well each is explained by the two rival scenarios of origin, those of natural emergence and lab escape. Here are four tests of the two hypotheses. A couple have some technical detail, but these are among the most persuasive for those who may care to follow the argument.
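One loose way to formalize such a comparison is as a Bayesian odds calculation, in which each fact multiplies the odds between the hypotheses by the ratio of how well each explains it. The sketch below illustrates only that logic; every probability in it is invented for the purpose, not measured:

```python
# Toy Bayesian odds comparison: lab escape vs. natural emergence.
# All probabilities below are invented purely to illustrate the method.

odds = 1.0  # prior odds, starting neutral

# For each salient fact: (P(fact | lab escape), P(fact | natural emergence))
facts = {
    "outbreak began in Wuhan":      (0.9, 0.1),
    "virus pre-adapted to humans":  (0.8, 0.2),
    "furin cleavage site present":  (0.7, 0.1),
    "human-preferred CGG codons":   (0.6, 0.2),
}

for name, (p_lab, p_natural) in facts.items():
    odds *= p_lab / p_natural  # multiply in each likelihood ratio
    print(f"after '{name}': odds = {odds:.0f} : 1 in favor of lab escape")
```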
1) The place of origin.
Start with geography. The two closest known relatives of the SARS2 virus were collected from bats living in caves in Yunnan, a province of southern China. If the SARS2 virus had first infected people living around the Yunnan caves, that would strongly support the idea that the virus had spilled over to people naturally. But this isn’t what happened. The pandemic broke out 1,500 kilometers away, in Wuhan.
SARS-related beta-coronaviruses, the group of bat viruses to which SARS2 belongs, infect the horseshoe bat Rhinolophus affinis, which ranges across southern China. The bats’ foraging range is around 50 kilometers, so it’s unlikely that any made it to Wuhan. In any case, the first cases of the Covid-19 pandemic probably occurred in September, when temperatures in Hubei province are already cold enough to send bats into hibernation.
What if the bat viruses infected some intermediate host first? You would need a longstanding population of bats in frequent proximity with an intermediate host, which in turn must often cross paths with people. All these exchanges of virus must take place somewhere outside Wuhan, a busy metropolis which, so far as is known, is not a natural habitat of Rhinolophus bat colonies. The infected person (or animal) carrying this highly transmissible virus must have traveled to Wuhan without infecting anyone else. No one in his or her family got sick. If the person jumped on a train to Wuhan, no fellow passengers fell ill.
It’s a stretch, in other words, to get the pandemic to break out naturally outside Wuhan and then, without leaving any trace, to make its first appearance there.
For the lab escape scenario, a Wuhan origin for the virus is a no-brainer. Wuhan is home to China’s leading center of coronavirus research where, as noted above, researchers were genetically engineering bat coronaviruses to attack human cells. They were doing so under the minimal safety conditions of a BSL2 lab. If a virus with the unexpected infectiousness of SARS2 had been generated there, its escape would be no surprise.
2) Natural history and evolution
The initial location of the pandemic is a small part of a larger problem, that of its natural history. Viruses don’t just make one-time jumps from one species to another. The coronavirus spike protein, adapted to attack bat cells, needs repeated jumps to another species, most of which fail, before a lucky mutation arises. A mutation — a change in one of its RNA units — may cause a different amino acid unit to be incorporated into the spike protein, making the spike better able to attack the cells of some other species.
Through several more such mutation-driven adjustments, the virus adapts to its new host, say some animal with which bats are in frequent contact. The whole process then resumes as the virus moves from this intermediate host to people.
In the case of SARS1, researchers have documented the successive changes in its spike protein as the virus evolved step by step into a dangerous pathogen. After it had gotten from bats into civets, there were six further changes in its spike protein before it became a mild pathogen in people. After a further 14 changes, the virus was much better adapted to humans, and with a further 4 the epidemic took off.
But when you look for the fingerprints of a similar transition in SARS2, a strange surprise awaits. The virus has changed hardly at all, at least until recently. From its very first appearance, it was well adapted to human cells. Researchers led by Alina Chan of the Broad Institute compared SARS2 with late stage SARS1, which by then was well adapted to human cells, and found that the two viruses were similarly well adapted. “By the time SARS-CoV-2 was first detected in late 2019, it was already pre-adapted to human transmission to an extent similar to late epidemic SARS-CoV,” they wrote.
Even those who think lab origin unlikely agree that SARS2 genomes are remarkably uniform. Dr. Baric writes that “early strains identified in Wuhan, China, showed limited genetic diversity, which suggests that the virus may have been introduced from a single source.”
A single source would of course be compatible with lab escape, less so with the massive variation and selection which is evolution’s hallmark way of doing business.
The uniform structure of SARS2 genomes gives no hint of any passage through an intermediate animal host, and no such host has been identified in nature.
Proponents of natural emergence suggest that SARS2 incubated in a yet-to-be-found human population before gaining its special properties. Or that it jumped to a host animal outside China.
All these conjectures are possible, but strained. Proponents of lab leak have a simpler explanation. SARS2 was adapted to human cells from the start because it was grown in humanized mice or in lab cultures of human cells, just as described in Dr. Daszak’s grant proposal. Its genome shows little diversity because the hallmark of lab cultures is uniformity.
Proponents of laboratory escape joke that of course the SARS2 virus infected an intermediary host species before spreading to people, and that they have identified it — a humanized mouse from the Wuhan Institute of Virology.
3) The furin cleavage site.
The furin cleavage site is a minute part of the virus’s anatomy but one that exerts great influence on its infectivity. It sits in the middle of the SARS2 spike protein. It also lies at the heart of the puzzle of where the virus came from.
The spike protein has two sub-units with different roles. The first, called S1, recognizes the virus’s target, a protein called angiotensin converting enzyme-2 (or ACE2) which studs the surface of cells lining the human airways. The second, S2, helps the virus, once anchored to the cell, to fuse with the cell’s membrane. After the virus’s outer membrane has coalesced with that of the stricken cell, the viral genome is injected into the cell, hijacks its protein-making machinery and forces it to generate new viruses.
But this invasion cannot begin until the S1 and S2 subunits have been cut apart. And there, right at the S1/S2 junction, is the furin cleavage site that ensures the spike protein will be cleaved in exactly the right place.
The virus, a model of economic design, does not carry its own cleaver. It relies on the cell to do the cleaving for it. Human cells have a protein cutting tool on their surface known as furin. Furin will cut any protein chain that carries its signature target cutting site. This is the sequence of amino acid units proline-arginine-arginine-alanine, or PRRA in the code that refers to each amino acid by a letter of the alphabet. PRRA is the amino acid sequence at the core of SARS2’s furin cleavage site.
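As a quick reference for the one-letter code, this small Python snippet spells out the PRRA motif; the mapping shown is the standard amino acid code:

```python
# Standard one-letter codes for the amino acids in the furin cleavage site.
codes = {"P": "proline", "R": "arginine", "A": "alanine"}

motif = "PRRA"  # the core of SARS2's furin cleavage site
print("-".join(codes[letter] for letter in motif))
# proline-arginine-arginine-alanine
```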
Viruses have all kinds of clever tricks, so why does the furin cleavage site stand out? Because of all known SARS-related beta-coronaviruses, only SARS2 possesses a furin cleavage site. All the other viruses have their S2 unit cleaved at a different site and by a different mechanism.
How then did SARS2 acquire its furin cleavage site? Either the site evolved naturally, or it was inserted by researchers at the S1/S2 junction in a gain-of-function experiment.
Consider natural origin first. Two ways viruses evolve are by mutation and by recombination. Mutation is the process of random change in DNA (or RNA for coronaviruses) that usually results in one amino acid in a protein chain being switched for another. Many of these changes harm the virus but natural selection retains the few that do something useful. Mutation is the process by which the SARS1 spike protein gradually switched its preferred target cells from those of bats to civets, and then to humans.
Mutation seems a less likely way for SARS2’s furin cleavage site to be generated, even though it can’t completely be ruled out. The site’s four amino acid units are all together, and all at just the right place in the S1/S2 junction. Mutation is a random process triggered by copying errors (when new viral genomes are being generated) or by chemical decay of genomic units. So it typically affects single amino acids at different spots in a protein chain. A string of amino acids like that of the furin cleavage site is much more likely to be acquired all together through a quite different process known as recombination.
Recombination is an inadvertent swapping of genomic material that occurs when two viruses happen to invade the same cell, so that some of their progeny are assembled from bits and pieces of RNA belonging to both. Beta-coronaviruses will only combine with other beta-coronaviruses but can acquire, by recombination, almost any genetic element present in the collective genomic pool. What they cannot acquire is an element the pool does not possess. And no known SARS-related beta-coronavirus, the class to which SARS2 belongs, possesses a furin cleavage site.
Proponents of natural emergence say SARS2 could have picked up the site from some as yet unknown beta-coronavirus. But bat SARS-related beta-coronaviruses evidently don’t need a furin cleavage site to infect bat cells, so there’s no great likelihood that any in fact possesses one, and indeed none has been found so far.
The proponents’ next argument is that SARS2 acquired its furin cleavage site from people. A predecessor of SARS2 could have been circulating in the human population for months or years until at some point it acquired a furin cleavage site from human cells. It would then have been ready to break out as a pandemic.
If this is what happened, there should be traces in hospital surveillance records of the people infected by the slowly evolving virus. But none has so far come to light. According to the WHO report on the origins of the virus, the sentinel hospitals in Hubei province, home of Wuhan, routinely monitor influenza-like illnesses and “no evidence to suggest substantial SARS-CoV-2 transmission in the months preceding the outbreak in December was observed.”
So it’s hard to explain how the SARS2 virus picked up its furin cleavage site naturally, whether by mutation or recombination.
That leaves a gain-of-function experiment. For those who think SARS2 may have escaped from a lab, explaining the furin cleavage site is no problem at all. “Since 1992 the virology community has known that the one sure way to make a virus deadlier is to give it a furin cleavage site at the S1/S2 junction in the laboratory,” writes Dr. Steven Quay, a biotech entrepreneur interested in the origins of SARS2. “At least eleven gain-of-function experiments, adding a furin site to make a virus more infective, are published in the open literature, including [by] Dr. Zhengli Shi, head of coronavirus research at the Wuhan Institute of Virology.”
4) A Question of Codons
There’s another aspect of the furin cleavage site that narrows the path for a natural emergence origin even further.
As everyone knows (or may at least recall from high school), the genetic code uses three units of DNA to specify each amino acid unit of a protein chain. When read in groups of 3, the 4 different kinds of DNA can specify 4 x 4 x 4 or 64 different triplets, or codons as they are called. Since there are only 20 kinds of amino acid, there are more than enough codons to go around, allowing some amino acids to be specified by more than one codon. The amino acid arginine, for instance, can be designated by any of the six codons CGU, CGC, CGA, CGG, AGA or AGG, where A, U, G and C stand for the four different kinds of unit in RNA.
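The arithmetic is easy to verify. A few lines of Python enumerate the 64 codons and pick out arginine’s six, using the standard genetic code:

```python
from itertools import product

bases = "ACGU"  # the four kinds of RNA unit

# Read in groups of three, the bases yield 4 x 4 x 4 = 64 possible codons.
codons = ["".join(triplet) for triplet in product(bases, repeat=3)]
print(len(codons))  # 64

# Six of the 64 codons encode arginine in the standard genetic code.
arginine = {"CGU", "CGC", "CGA", "CGG", "AGA", "AGG"}
print(sorted(arginine))
```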
Here’s where it gets interesting. Different organisms have different codon preferences. Human cells like to designate arginine with the codons CGU, CGC or CGG. But CGG is coronavirus’s least popular codon for arginine. Keep that in mind when looking at how the amino acids in the furin cleavage site are encoded in the SARS2 genome.
Now the reason why SARS2 has a furin cleavage site, and its cousin viruses don’t, can be seen by lining up (in a computer) the string of nearly 30,000 nucleotides in its genome with those of its cousin coronaviruses, of which the closest so far known is one called RaTG13. Compared with RaTG13, SARS2 has a 12-nucleotide insert right at the S1/S2 junction. The insert is the sequence T-CCT-CGG-CGG-GC (written in DNA letters, where T corresponds to RNA’s U). The CCT codes for proline, the two CGG’s for two arginines, and the GC is the beginning of a GCA codon that codes for alanine.
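To see how the insert spells out P-R-R-A, one can walk through the reading frame directly. A minimal sketch, with the framing of the insert relative to neighboring codons following the description above:

```python
# The 12-nucleotide insert, written in DNA letters (T stands for RNA's U).
insert = "TCCTCGGCGGGC"

# The insert lands mid-codon: its leading T completes the codon before it,
# and its trailing GC is completed by a following A to form GCA.
codons = ["CCT", "CGG", "CGG", "GCA"]
codon_to_aa = {"CCT": "proline (P)", "CGG": "arginine (R)", "GCA": "alanine (A)"}

print(" - ".join(codon_to_aa[c] for c in codons))
# proline (P) - arginine (R) - arginine (R) - alanine (A)  ->  PRRA
```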
There are several curious features about this insert, but the oddest is the pair of side-by-side CGG codons. Only 5% of SARS2’s arginine codons are CGG, and the double codon CGG-CGG has not been found in any other beta-coronavirus. So how did SARS2 acquire a pair of arginine codons that are favored by human cells but not by coronaviruses?
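Counting codon usage of this kind is itself a simple computation. A sketch, run here on a made-up stand-in sequence rather than the real genome:

```python
# Fraction of a coding sequence's arginine codons that are CGG.
ARGININE = {"CGU", "CGC", "CGA", "CGG", "AGA", "AGG"}

def cgg_fraction(coding_rna: str) -> float:
    """Read the sequence in frame and count CGG among arginine codons."""
    codons = [coding_rna[i:i + 3] for i in range(0, len(coding_rna) - 2, 3)]
    arg = [c for c in codons if c in ARGININE]
    return arg.count("CGG") / len(arg) if arg else 0.0

sample = "AUGCGUAAACGGCCUAGAGCA"  # toy in-frame snippet, not the SARS2 genome
print(f"{cgg_fraction(sample):.0%}")  # 33%: one of this toy's three arginines is CGG
```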
Proponents of natural emergence have an uphill task to explain all the features of SARS2’s furin cleavage site. They have to postulate a recombination event at a site on the virus’s genome where recombinations are rare, and the insertion of a 12-nucleotide sequence with a double arginine codon unknown in the beta-coronavirus repertoire, at the only site in the genome that would significantly expand the virus’s infectivity.
“Yes, but your wording makes this sound unlikely — viruses are specialists at unusual events,” is the riposte of David L. Robertson, a virologist at the University of Glasgow who regards lab escape as a conspiracy theory. “Recombination is naturally very, very frequent in these viruses, there are recombination breakpoints in the spike protein and these codons appear unusual exactly because we’ve not sampled enough.”
Dr. Robertson is correct that evolution is always producing results that may seem unlikely but in fact are not. Viruses can generate untold numbers of variants, but we see only the one-in-a-billion that natural selection picks for survival. But this argument could be pushed too far. For instance, any result of a gain-of-function experiment could be explained as one that evolution would have arrived at in time. And the numbers game can be played the other way. For the furin cleavage site to arise naturally in SARS2, a chain of events has to happen, each of which is quite unlikely for the reasons given above. A long chain with several improbable steps is unlikely ever to be completed.
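The force of the chained-improbabilities point is just multiplication. A toy calculation, with every step probability invented solely to illustrate the arithmetic:

```python
# Invented step probabilities; only the multiplication logic matters here.
steps = {
    "recombination at a rarely-recombining site":       0.01,
    "a donor virus that carries a furin site":          0.01,
    "insertion at the one infectivity-enhancing spot":  0.05,
    "acquiring the human-preferred CGG-CGG codon pair": 0.05,
}

joint = 1.0
for step, p in steps.items():
    joint *= p  # independent steps multiply

print(f"joint probability ~ {joint:.1e}")  # ~2.5e-07 under these guesses
```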
For the lab escape scenario, the double CGG codon is no surprise. The human-preferred codon is routinely used in labs. So anyone who wanted to insert a furin cleavage site into the virus’s genome would synthesize the PRRA-making sequence in the lab and would be likely to use CGG codons to do so.
A Third Scenario of Origin
There’s a variation on the natural emergence scenario that’s worth considering. This is the idea that SARS2 jumped directly from bats to humans, without going through an intermediate host as SARS1 and MERS did. A leading advocate is the virologist David Robertson who notes that SARS2 can attack several other species besides humans. He believes the virus evolved a generalist capability while still in bats. Because the bats it infects are widely distributed in southern and central China, the virus had ample opportunity to jump to people, even though it seems to have done so on only one known occasion. Dr. Robertson’s thesis explains why no one has so far found a trace of SARS2 in any intermediate host or in human populations surveilled before December 2019. It would also explain the puzzling fact that SARS2 has not changed since it first appeared in humans — it didn’t need to because it could already attack human cells efficiently.
One problem with this idea, though, is that if SARS2 jumped from bats to people in a single leap and hasn’t changed much since, it should still be good at infecting bats. And it seems it isn’t.
“Tested bat species are poorly infected by SARS-CoV-2 and they are therefore unlikely to be the direct source for human infection,” writes a scientific group skeptical of natural emergence.
Still, Dr. Robertson may be onto something. The bat coronaviruses of the Yunnan caves can infect people directly. In April 2012 six miners clearing bat guano from the Mojiang mine contracted severe pneumonia with Covid-19-like symptoms and three eventually died. A virus isolated from the Mojiang mine, called RaTG13, is still the closest known relative of SARS2. Much mystery surrounds the origin, reporting and strangely low affinity of RaTG13 for bat cells, as well as the nature of 8 similar viruses that Dr. Shi reports she collected at the same time but has not yet published despite their great relevance to the ancestry of SARS2. But all that is a story for another time. The point here is that bat viruses can infect people directly, though only in special conditions.
So who else, besides miners excavating bat guano, comes into particularly close contact with bat coronaviruses? Well, coronavirus researchers do. Dr. Shi says she and her group collected more than 1,300 bat samples during some 8 visits to the Mojiang cave between 2012 and 2015, and there were doubtless many expeditions to other Yunnan caves.
Imagine the researchers making frequent trips from Wuhan to Yunnan and back, stirring up bat guano in dark caves and mines, and now you begin to see a possible missing link between the two places. Researchers could have gotten infected during their collecting trips, or while working with the new viruses at the Wuhan Institute of Virology. The virus that escaped from the lab would have been a natural virus, not one cooked up by gain of function.
The direct-from-bats thesis is a chimera between the natural emergence and lab escape scenarios. It’s a possibility that can’t be dismissed. But against it are the facts that 1) both SARS2 and RaTG13 seem to have only feeble affinity for bat cells, so one can’t be fully confident that either ever saw the inside of a bat; and 2) the theory is no better than the natural emergence scenario at explaining how SARS2 gained its furin cleavage site, or why the furin cleavage site is determined by human-preferred arginine codons instead of by the bat-preferred codons.
Where We Are So Far
Neither the natural emergence nor the lab escape hypothesis can yet be ruled out. There is still no direct evidence for either. So no definitive conclusion can be reached.
That said, the available evidence leans more strongly in one direction than the other. Readers will form their own opinion. But it seems to me that proponents of lab escape can explain all the available facts about SARS2 considerably more easily than can those who favor natural emergence.
It’s documented that researchers at the Wuhan Institute of Virology were doing gain-of-function experiments designed to make coronaviruses infect human cells and humanized mice. This is exactly the kind of experiment from which a SARS2-like virus could have emerged. The researchers were not vaccinated against the viruses under study, and they were working in the minimal safety conditions of a BSL2 laboratory. So escape of a virus would not be at all surprising. In all of China, the pandemic broke out on the doorstep of the Wuhan institute. The virus was already well adapted to humans, as expected for a virus grown in humanized mice. It possessed an unusual enhancement, a furin cleavage site, which is not possessed by any other known SARS-related beta-coronavirus, and this site included a double arginine codon also unknown among beta-coronaviruses. What more evidence could you want, aside from the presently unobtainable lab records documenting SARS2’s creation?
Proponents of natural emergence have a rather harder story to tell. The plausibility of their case rests on a single surmise, the expected parallel between the emergence of SARS2 and that of SARS1 and MERS. But none of the evidence expected in support of such a parallel history has yet emerged. No one has found the bat population that was the source of SARS2, if indeed it ever infected bats. No intermediate host has presented itself, despite an intensive search by Chinese authorities that included the testing of 80,000 animals. There is no evidence of the virus making multiple independent jumps from its intermediate host to people, as both the SARS1 and MERS viruses did. There is no evidence from hospital surveillance records of the epidemic gathering strength in the population as the virus evolved. There is no explanation of why a natural epidemic should break out in Wuhan and nowhere else. There is no good explanation of how the virus acquired its furin cleavage site, which no other SARS-related beta-coronavirus possesses, nor why the site is composed of human-preferred codons. The natural emergence theory battles a bristling array of implausibilities.
The records of the Wuhan Institute of Virology certainly hold much relevant information. But Chinese authorities seem unlikely to release them given the substantial chance that they incriminate the regime in the creation of the pandemic. Absent the efforts of some courageous Chinese whistle-blower, we may already have at hand just about all of the relevant information we are likely to get for a while.
So it’s worth trying to assess responsibility for the pandemic, at least in a provisional way, because the paramount goal remains to prevent another one. Even those who aren’t persuaded that lab escape is the more likely origin of the SARS2 virus may see reason for concern about the present state of regulation governing gain-of-function research. There are two obvious levels of responsibility: the first, for allowing virologists to perform gain-of-function experiments, offering minimal gain and vast risk; the second, if indeed SARS2 was generated in a lab, for allowing the virus to escape and unleash a world-wide pandemic. Here are the players who seem most likely to deserve blame.
1. Chinese virologists
First and foremost, Chinese virologists are to blame for performing gain-of-function experiments in mostly BSL2-level safety conditions which were far too lax to contain a virus of unexpected infectiousness like SARS2. If the virus did indeed escape from their lab, they deserve the world’s censure for a foreseeable accident that has already caused the deaths of 3 million people.
True, Dr. Shi was trained by French virologists, worked closely with American virologists and was following international rules for the containment of coronaviruses. But she could and should have made her own assessment of the risks she was running. She and her colleagues bear the responsibility for their actions.
I have been using the Wuhan Institute of Virology as a shorthand for all virological activities in Wuhan. It’s possible that SARS2 was generated in some other Wuhan lab, perhaps in an attempt to make a vaccine that worked against all coronaviruses. But until the role of other Chinese virologists is clarified, Dr. Shi is the public face of Chinese work on coronaviruses, and provisionally she and her colleagues will stand first in line for opprobrium.
2. Chinese authorities
China’s central authorities did not generate SARS2 but they sure did their utmost to conceal the nature of the tragedy and China’s responsibility for it. They suppressed all records at the Wuhan Institute of Virology and closed down its virus databases. They released a trickle of information, much of which may have been outright false or designed to misdirect and mislead. They did their best to manipulate the WHO’s inquiry into the virus’s origins, and led the commission’s members on a fruitless run-around. So far they have proved far more interested in deflecting blame than in taking the steps necessary to prevent a second pandemic.
3. The worldwide community of virologists
Virologists around the world are a loose-knit professional community. They write articles in the same journals. They attend the same conferences. They have common interests in seeking funds from governments and in not being overburdened with safety regulations.
Virologists knew better than anyone the dangers of gain-of-function research. But the power to create new viruses, and the research funding obtainable by doing so, was too tempting. They pushed ahead with gain-of-function experiments. They lobbied against the moratorium imposed on Federal funding for gain-of-function research in 2014, and it was lifted in 2017.
The benefits of the research in preventing future epidemics have so far been nil, the risks vast. If research on the SARS1 and MERS viruses could only be done at the BSL3 safety level, it was surely illogical to allow any work with novel coronaviruses at the lesser level of BSL2. Whether or not SARS2 escaped from a lab, virologists around the world have been playing with fire.
Their behavior has long alarmed other biologists. In 2014 scientists calling themselves the Cambridge Working Group urged caution on creating new viruses. In prescient words, they specified the risk of creating a SARS2-like virus. “Accident risks with newly created ‘potential pandemic pathogens’ raise grave new concerns,” they wrote. “Laboratory creation of highly transmissible, novel strains of dangerous viruses, especially but not limited to influenza, poses substantially increased risks. An accidental infection in such a setting could trigger outbreaks that would be difficult or impossible to control.”
When molecular biologists discovered a technique for moving genes from one organism to another, they held a public conference at Asilomar in 1975 to discuss the possible risks. Despite much internal opposition, they drew up a list of stringent safety measures that could be relaxed in future — and duly were — when the possible hazards had been better assessed.
When the CRISPR technique for editing genes was invented, the U.S., UK and Chinese national academies of science issued a joint report urging restraint on making heritable changes to the human genome. Biologists who invented gene drives have also been open about the dangers of their work and have sought to involve the public.
You might think the SARS2 pandemic would spur virologists to re-evaluate the benefits of gain-of-function research, even to engage the public in their deliberations. But no. Many virologists deride lab escape as a conspiracy theory and others say nothing. They have barricaded themselves behind a Chinese wall of silence which so far is working well to allay, or at least postpone, journalists’ curiosity and the public’s wrath. Professions that cannot regulate themselves deserve to get regulated by others, and this would seem to be the future that virologists are choosing for themselves.
4. The US Role in Funding the Wuhan Institute of Virology
From June 2014 to May 2019, Dr. Daszak’s EcoHealth Alliance had a grant from the National Institute of Allergy and Infectious Diseases (NIAID), part of the National Institutes of Health, to do gain-of-function research with coronaviruses at the Wuhan Institute of Virology. Whether or not SARS2 is the product of that research, it seems a questionable policy to farm out high-risk research to unsafe foreign labs using minimal safety precautions. And if the SARS2 virus did indeed escape from the Wuhan institute, then the NIH will find itself in the terrible position of having funded a disastrous experiment that led to the deaths of more than 3 million people worldwide, including more than half a million Americans.
The responsibility of the NIAID and NIH is even more acute because for the first three years of the grant to EcoHealth Alliance there was a moratorium on funding gain-of-function research. Why, then, didn’t the two agencies halt the Federal funding, as the law apparently required them to do? Because someone wrote a loophole into the moratorium.
The moratorium specifically barred funding any gain-of-function research that increased the pathogenicity of the flu, MERS or SARS viruses. But then a footnote on p.2 of the moratorium document states that “An exception from the research pause may be obtained if the head of the USG funding agency determines that the research is urgently necessary to protect the public health or national security.”
This seems to mean that either the director of the NIAID, Dr. Anthony Fauci, or the director of the NIH, Dr. Francis Collins, or maybe both, invoked the footnote in order to keep the money flowing to Dr. Shi’s gain-of-function research.
“Unfortunately, the NIAID Director and the NIH Director exploited this loophole to issue exemptions to projects subject to the Pause — preposterously asserting the exempted research was ‘urgently necessary to protect public health or national security’ — thereby nullifying the Pause,” Dr. Richard Ebright said in an interview with Independent Science News.
When the moratorium was ended in 2017 it didn’t just vanish but was replaced by a reporting system, the Potential Pandemic Pathogens Control and Oversight (P3CO) Framework, which required agencies to report for review any dangerous gain-of-function work they wished to fund.
According to Dr. Ebright, both Dr. Collins and Dr. Fauci “have declined to flag and forward proposals for risk-benefit review, thereby nullifying the P3CO Framework.”
In his view, the two officials, in dealing with the moratorium and the ensuing reporting system, “have systematically thwarted efforts by the White House, the Congress, scientists, and science policy specialists to regulate GoF [gain-of-function] research of concern.”
Possibly the two officials had to take into account matters not evident in the public record, such as issues of national security. Perhaps funding the Wuhan Institute of Virology, which is believed to have ties with Chinese military virologists, provided a window into Chinese biowarfare research. But whatever other considerations may have been involved, the bottom line is that the National Institutes of Health was supporting gain-of-function research, of a kind that could have generated the SARS2 virus, in an unsupervised foreign lab that was doing work in BSL2 biosafety conditions. The prudence of this decision can be questioned, whether or not SARS2, and the deaths of 3 million people, resulted from it.
In Conclusion
If the case that SARS2 originated in a lab is so substantial, why isn’t this more widely known? As may now be obvious, there are many people who have reason not to talk about it. The list is led, of course, by the Chinese authorities. But virologists in the United States and Europe have no great interest in igniting a public debate about the gain-of-function experiments that their community has been pursuing for years.
Nor have other scientists stepped forward to raise the issue. Government research funds are distributed on the advice of committees of scientific experts drawn from universities. Anyone who rocks the boat by raising awkward political issues runs the risk that their grant will not be renewed and their research career will be ended. Maybe good behavior is rewarded with the many perks that slosh around the distribution system. And if you thought that Dr. Andersen and Dr. Daszak might have blotted their reputation for scientific objectivity after their partisan attacks on the lab escape scenario, look at the 2nd and 3rd names on this list of recipients of an $82 million grant announced by the National Institute of Allergy and Infectious Diseases in August 2020.
The US government shares a strange common interest with the Chinese authorities: neither is keen on drawing attention to the fact that Dr. Shi’s coronavirus work was funded by the US National Institutes of Health. One can imagine the behind-the-scenes conversation in which the Chinese government says “If this research was so dangerous, why did you fund it, and on our territory too?” To which the US side might reply, “Looks like it was you who let it escape. But do we really need to have this discussion in public?”
Dr. Fauci is a longtime public servant who served with integrity under President Trump and has resumed leadership in the Biden Administration in handling the Covid epidemic. Congress, no doubt understandably, may have little appetite for hauling him over the coals for the apparent lapse of judgment in funding gain-of-function research in Wuhan.
To these serried walls of silence must be added that of the mainstream media. To my knowledge, no major newspaper or television network has yet provided readers with an in-depth news story of the lab escape scenario, such as the one you have just read, although some have run brief editorials or opinion pieces. One might think that any plausible origin of a virus that has killed three million people would merit a serious investigation. Or that the wisdom of continuing gain-of-function research, regardless of the virus’s origin, would be worth some probing. Or that the funding of gain-of-function research by the NIH and NIAID during a moratorium on such research would bear investigation. What accounts for the media’s apparent lack of curiosity?
The virologists’ omertà is one reason. Science reporters, unlike political reporters, have little innate skepticism of their sources’ motives; most see their role largely as purveying the wisdom of scientists to the unwashed masses. So when their sources won’t help, these journalists are at a loss.
Another reason, perhaps, is the migration of much of the media toward the left of the political spectrum. Because President Trump said the virus had escaped from a Wuhan lab, editors gave the idea little credence. They joined the virologists in regarding lab escape as a dismissible conspiracy theory. During the Trump Administration, they had no trouble in rejecting the position of the intelligence services that lab escape could not be ruled out. But when Avril Haines, President Biden’s director of National Intelligence, said the same thing, she too was largely ignored. This is not to argue that editors should have endorsed the lab escape scenario, merely that they should have explored the possibility fully and fairly.
People round the world who have been pretty much confined to their homes for the last year might like a better answer than their media are giving them. Perhaps one will emerge in time. After all, the more months pass without the natural emergence theory gaining a shred of supporting evidence, the less plausible it may seem. Perhaps the international community of virologists will come to be seen as a false and self-interested guide. The common sense perception that a pandemic breaking out in Wuhan might have something to do with a Wuhan lab cooking up novel viruses of maximal danger in unsafe conditions could eventually displace the ideological insistence that whatever Trump said can’t be true.
And then let the reckoning begin.
Nicholas Wade
April 30, 2021
Acknowledgements
The first person to take a serious look at the origins of the SARS2 virus was Yuri Deigin, a biotech entrepreneur in Russia and Canada. In a long and brilliant essay, he dissected the molecular biology of the SARS2 virus and raised, without endorsing, the possibility that it had been manipulated. The essay, published on April 22, 2020, provided a roadmap for anyone seeking to understand the virus’s origins. Deigin packed so much information and analysis into his essay that some have doubted it could be the work of a single individual and suggested some intelligence agency must have authored it. But the essay is written with greater lightness and humor than I suspect are ever found in CIA or KGB reports, and I see no reason to doubt that Dr. Deigin is its very capable sole author.
In Deigin’s wake have followed several other skeptics of the virologists’ orthodoxy. Nikolai Petrovsky calculated how tightly the SARS2 virus binds to the ACE2 receptors of various species and found to his surprise that it seemed optimized for the human receptor, leading him to infer the virus might have been generated in a laboratory. Alina Chan published a paper showing that SARS2 from its first appearance was very well adapted to human cells.
One of the very few establishment scientists to have questioned the virologists’ absolute rejection of lab escape is Richard Ebright, who has long warned against the dangers of gain-of-function research. Another is David A. Relman of Stanford University. “Even though strong opinions abound, none of these scenarios can be confidently ruled in or ruled out with currently available facts,” he wrote. Kudos too to Robert Redfield, former director of the Centers for Disease Control and Prevention, who told CNN on March 26, 2021 that the “most likely” cause of the epidemic was “from a laboratory,” because he doubted that a bat virus could become an extreme human pathogen overnight, without taking time to evolve, as seemed to be the case with SARS2.
Steven Quay, a physician-researcher, has applied statistical and bioinformatic tools to ingenious explorations of the virus’s origin, showing for instance how the hospitals receiving the early patients are clustered along the Wuhan №2 subway line which connects the Institute of Virology at one end with the international airport at the other, the perfect conveyor belt for distributing the virus from lab to globe.
In June 2020 Milton Leitenberg published an early survey of the evidence favoring lab escape from gain-of-function research at the Wuhan Institute of Virology.
Many others have contributed significant pieces of the puzzle. “Truth is the daughter,” said Francis Bacon, “not of authority but time.” The efforts of people such as those named above are what make it so.
One of the major North American archaeological discoveries of the 20th century was made in 1967 by a bulldozer crew preparing a site for a movie theater in the small fishing village of Port au Choix (PAC), on Newfoundland’s Northern Peninsula. It was a vast, 4,000-year-old cemetery created by a complex maritime culture known among researchers as the Maritime Archaic. The graves contained beautifully preserved skeletons covered in a brilliant red powder called red ocher (powdered specular hematite). Buried with the skeletons were many finely crafted artifacts. A few similar ones had previously turned up in earlier field surveys on the island, but no archaeologist had suspected that such a large and magnificent ceremonial site existed in the North American subarctic.
Had the discovery been made only a few years earlier, it is likely that no trained archaeologist would have taken over from the bulldozer crew. But fortunately, Memorial University in St. John’s had just added archaeologist James (“Jim”) Tuck (1940–2019) to its faculty. The American-born scholar set out to explore the cemetery, eventually excavating more than 150 graves spread over three clusters (which he referred to as loci).
Jim and I had both been field-trained by American archaeologist William Ritchie. We had never worked together, but stayed in close contact. As Ritchie’s protégés, the two of us were among the first generation of professionally-trained archaeologists to take the field in north-eastern North America outside New York State. Many of us shared a common objective: to track down a culture (or was it a series of cultures?) dating from between 4,000 and 5,000 years ago, which had left behind stone artifacts similar to those from PAC, and deposited them in ocher-filled graves extending from Maine to Ontario; and now, it had been discovered, Newfoundland.
Plummet, Nevin site.
We suspected that the communities represented by these cemeteries were linked, because of their similarly beautiful stone adzes, spear tips flaked from unusual rock types, elegant lance-tip-like objects made of ground slate, and tear-drop-shaped stone weights (called “plummets”). All of these artifacts, like the cemeteries themselves, differed from anything produced by more recent prehistoric peoples.
Prior to the discovery of PAC, the most studied of these early cultures was known from the dozens of cemeteries found along and near the Maine coast near Portland, extending as far east as Grand Lake, in New Brunswick. Locally known as the “Red Paint People” because of their ocher-filled graves, they became a focus of intense interest as scientific archaeology emerged in the 1880s.
Tuck’s discovery at PAC sparked animated discussions and interpretive disagreements among us, with debate focusing on the Maritime Archaic and its relationship to the Red Paint People. The similarities were undeniable. Aside from sharing high-quality lithic technology, both had developed sophisticated bone and antler technologies, including long daggers, toggling and barbed harpoons of the type used by Inuit hunters, bone needles (probably indicative of tailored clothing, which is a rarity among hunting and gathering Indigenous North Americans south of the Arctic), and nearly identical shaman’s paraphernalia. Moreover, both populations evidently contained accomplished hunters of large marine animals, and seemed to take spiritual inspiration from the sea, manifested in the Maritime Archaic by effigies of marine birds and killer whales, and among the Red Paint People by small figurines of imaginary marine creatures with features not found in any living species.
I should underscore how unusual these cultures were in comparison to both preceding and succeeding prehistoric peoples in these regions. Archaeologists working in northern temperate and subarctic North America were quite unprepared for the discovery of such well-organized and complex cultures in cool, moist environmental zones that otherwise were characterized by apparently simpler hunting and gathering cultures.
Ocher-stained moose leg bone daggers with fine incised decoration from the Red Paint People Nevin site.
Prior to the appearance of these cultures, human occupation in the area had been sparse and had featured less sophisticated technology. The few burial sites that we know of might have contained several artifacts of impressive quality, but they were usually isolated, as if created by transitory occupants who soon moved on. In Maine and at PAC, however, true cemeteries, as we know them, would seem to indicate thriving groups of maritime hunters with complex tools and trading routes stretching westward to the Great Lakes. Their discretely marked territories, technological complexity, mortuary ritual elaboration, and widespread trade connections set them apart from both earlier and later populations.
Another characteristic they shared was their sudden cultural collapse and disappearance sometime between 3,800 and 3,400 years ago. In the north, the Maritime Archaic gave way to Pre-Dorset Palaeoeskimos (as they are known in the literature) that had recently arrived from Siberia. And in the south, the Red Paint People, along with their neighbors in temperate north-eastern North America, gave way to a wave of immigrants from the southern Appalachians.
Unresolved, however, was the important question of how these two groups related to one another. In this regard, Jim and I held diametrically opposed views.
Mine was that, though sharing many close similarities, the two groups had different origins. I saw the Red Paint People as locals, descended from northward-moving immigrants with cultural ties down the Atlantic coast to the Carolinas. Though both cultures were expert maritime hunters, I noted, their prey differed. Maritime Archaic hunters pursued walrus and caribou, while the Red Paint People were the world’s earliest swordfish hunters, and also fished for cod and hunted moose and deer. Moreover, there remains a large gap of over 350 miles between the southernmost Maritime Archaic sites and the northernmost Red Paint cemetery. Similarities between the two groups, I thought, probably arose either through the social interaction one might expect between two wide-ranging maritime cultures (especially evident in the flaked quartzite projectile points from northern Labrador, found in several Red Paint cemeteries, which must have traveled through the hands of Maritime Archaic traders), or through common descent from some earlier culture.
Jim had demonstrated that the Maritime Archaic descended from basically the same ancestral stock as the Red Paint People. But I felt he glossed over the details of this common-ancestry hypothesis so as to posit that both were manifestations of a Maritime Archaic that originated in Labrador and eventually spread to Maine. We remained at friendly loggerheads for decades, unable to resolve the issue with the archaeological tools at hand. Then an entirely new avenue of research opened up: human paleogenetics.
* * *
In the early 1990s, visionary Italian geneticist Luca Cavalli-Sforza (1922–2018) and colleagues began the Human Genome Diversity Project (HGDP) at Stanford University’s Morrison Institute. The aim was to record the genetic profiles mainly of relatively small Indigenous populations, which geneticists thought were best suited for reconstructing human population movement and change over time in any given region (due to their relatively low rates of genetic exchange with neighbors through a process known as admixture). The HGDP’s strategy worked well and progress was rapid. In 1994, only three years after its formation, Cavalli-Sforza and two co-authors published The History and Geography of Human Genes, which synthesized research from a wide array of fields so as to explain how the world came to be populated.
The peopling of the world by early humans. Numbers represent kilo years ago (e.g., 5 kya = 5,000 years ago).
The book, published by Princeton University Press, received global accolades. But it also attracted strong criticism from lay readers on both sides of the political spectrum. One Indigenous leader called it “unethical, invasive, and may even be criminal. It violates the group rights [of] Indigenous peoples around the world.” Meanwhile, Cavalli-Sforza was receiving hate mail from right-wing extremists who bristled at their genetic connection to parts of humanity they imagined to be “inferior.”
Yet despite these criticisms, Cavalli-Sforza’s work inspired other geneticists to develop human genetic studies by extending their inquiries further back in time. Much of the pioneering work of recovering ancient DNA from archaeological bone, for instance, was done at Svante Pääbo’s lab at the Max Planck Institute for Evolutionary Anthropology at Leipzig, Germany. The group’s major breakthrough came in 2010, with published studies of three ancient genomes, including Neanderthals, a new species of archaic human called Denisovans (recognized solely by DNA retrieved from a finger bone), and, shifting to North America, a 4,000-year-old male from Greenland. As Harvard geneticist David Reich wrote in Who We Are and How We Got Here: Ancient DNA and the Science of the Human Past, the early and mid-2010s were marked by additional new discoveries.
It wasn’t long before a multinational team of paleogeneticists led by Ana Duggan of McMaster University, located in the Canadian city of Hamilton, tackled the issue of Maritime Archaic origins and disappearance. Duggan’s group relied on mitochondrial DNA (mtDNA) from skeletal remains found in Labrador and Newfoundland dating from roughly 7,500 BC to the early historic period. Team members summarized their results as follows:
By examining the mitochondrial genome diversity and isotopic ratios of 74 ancient remains in conjunction with the archaeological record, we have provided definitive evidence for the genetic discontinuity between the maternal lineages of these populations. This northeastern margin of North America appears to have been populated multiple times by distinct groups that did not share a recent common ancestry, but rather one much deeper in time at the entry point into the continent. (Emphasis added.)
Eyed bone needles, Nevin site.
In regard to the Red Paint People, Reich’s lab at Harvard Medical School analyzed material from the Nevin site in Blue Hill, Maine—the only known Red Paint cemetery that is likely ever to produce well-preserved human remains. Reich’s analysis was not confined to mtDNA (which, unlike nuclear DNA, is transmitted only through the maternal line, and so cannot address paternal ancestry); it also examined autosomal DNA found in cell nuclei, thereby adding information on the paternal line. (This addition can be critically important because, as Reich’s lab had demonstrated, a population can be founded by males and females with very different origins.) The Reich team has yet to publish comprehensive results of its Nevin site analysis. But from what I have heard, their work will confirm the existence of genetic discontinuities between the Red Paint People and later populations in the region, much as Duggan’s work did in regard to the Maritime Archaic.
But here events took a strange turn. When Duggan’s group announced that it had gained the capacity to analyze autosomal DNA, and made known its plans to apply this technology to the male genome of its Labrador/Newfoundland skeletal sample, a sense of apprehension seemed to spread through some quarters of the paleogenetic community.
During the summer of 2020, amid the COVID-19 pandemic and Black Lives Matter protests, Duggan’s project went noticeably quiet. I inquired among team members with whom I regularly communicated, but received oblique and evasive responses about the pace of research and publication. Suspecting that this might be related to sensitivities surrounding Indigenous populations (a topic that has consumed Canadian academia in recent years), I contacted Duggan directly, expressing concern that her valuable work might not be published.
My interest related to my own longstanding focus on the Red Paint People and their relationship to the Maritime Archaic, as described above. In the early 1990s, with Jim Tuck’s approval and guidance, I’d undertaken extensive isotopic research on the Port au Choix (PAC) skeletal sample to explore dietary differences between PAC and several prehistoric Maine populations. And it was on the basis of this established engagement with the subject that I made my inquiries of Duggan. “Science, after all, is about openness and transparency in communication,” I wrote. “Has your group had requests from Indigenous people that this work not be carried forward?”
In (partial) response, Duggan replied that she and her team remained “invested” in the project, but were proceeding in line with “standards and ethics of research suitable for the 21st century”—standards that specifically “include the continued support of present-day Indigenous communities as well as full institutional ethics approval.” These standards, which she described as “common to ancient DNA genomics research with Indigenous populations across North America,” require that “discussions and agreements” with Indigenous communities take place before “yet another ancient genome” could be published.
As described below, however, the Port au Choix sample is far from just “yet another” lineage. I also asked whether the Indigenous permissions that Duggan’s group apparently had obtained for its earlier mtDNA research had been revoked. To that question, as of this writing, I have received no response. Duggan did tell me that my own concerns were “meaningless when compared to the distress caused to Indigenous communities by the historical treatment of their ancestral remains.” But she failed to provide detail on this potential “distress.” Until this discussion, a half century had passed without, to my knowledge, a single complaint from any Indigenous group regarding this specific area of study.
Duggan’s response struck me as odd given the circumstances of the Port au Choix discovery. Had that discovery occurred only a few years earlier, the bones and amazing artifacts likely would have been thrown away or taken home by members of the construction crew. Tuck’s excavations saved all these scientifically precious specimens and resulted in the publication of important research, to the incalculable benefit of anyone interested in North American Indigenous history.
We now know that Tuck’s three loci were but a small portion of a much larger cemetery unparalleled in the Northwest Atlantic region, if not beyond. Unfortunately, no further excavations have been conducted. More graves reportedly have been dug up by construction equipment. But the fate of these items is unpublished and unknown to me or to any Canadian archaeologist I have consulted. In the current political climate, the very fact of their existence is now apparently seen as awkward, even taboo—an ironic reversion to an unenlightened period when few cared about the history of Indigenous peoples.
When the Maritime Archaic tradition vanished, it was replaced, as noted earlier, by unrelated Paleoeskimos, an Arctic people recently arrived from Siberia. Following their own disappearance, new inhabitants migrated in from Labrador, these probably being ancestors of the historic Beothuk, who still lived in the region when Europeans arrived. The last surviving Beothuk, a woman named Shanawdithit, died of tuberculosis in 1829. And since that time, there has been no descendant Beothuk community with whom Duggan, or anyone else, could engage in the “discussions and agreements” she’d described to me.
Even if there were, Duggan’s own research has demonstrated that the Beothuk were not descended from the Maritime Archaic people of Port au Choix. The only community Duggan might be referring to is the (genealogically unrelated) Newfoundland Mi’kmaq community, whose ancestors arrived on Newfoundland from Nova Scotia in the 18th century, several hundred years after the arrival of Europeans.
Modified bird wing bones, probably shaman’s equipment. Top pair from Port au Choix cemetery. Bottom pair from Red Paint village site, North Haven Island, Maine (~17 cm long).
In a 2017 article published in Science, Ann Gibbons wrote about the power of DNA analysis to “bust” the myths associated with Europeans’ origins: “Despite their tales of origin, most people are the mixed descendants of many migrations… As techniques for probing ethnic origins spread, nearly every week brings a new paper testing and then falsifying lore about one ancient culture after another.” Gibbons properly describes this as a positive development. But if this principle holds for the so-called Old World, why should it not hold for Indigenous peoples? The only way one might consider Duggan’s research offensive or controversial is to the extent that one wishes to preserve the idea of Indigenous peoples as staking out an unbroken genetic (and therefore moral) claim to this or that part of North America. Certainly, I can think of no other basis on which Duggan might be required to secure the permission of modern First Nations (as they are known in Canada) to conduct scientific research on populations that haven’t existed for thousands of years.
As it turned out, I had been naïve about the extent to which this kind of politics was now interfering with paleogenetic research. The ideologically correct approach had been articulated at a 2019 Brown University conference titled “State of the Field: The Ancient DNA Revolution in Archaeology.” There, Robert Preucel of Brown’s Haffenreffer Museum presented a roster of speakers who laid out what they regarded as state-of-the-art ethical standards in the field. They advised audience members to pursue a “commonly agreed set of best practices” with “descendant communities”—especially when paleogenomic conclusions challenge, or conflict with, community knowledge about the past. Folklore and myths must be taken into account, and we must discourage the idea of science “controlling the narrative.”
Moreover—and here is where I began to understand why Duggan’s lab had suddenly gone dark on this issue—“Even if we can’t seek consent from the study subjects themselves for inclusion in our ancient DNA studies, descendant-affiliated or geographically proximate communities should be consulted and engaged prior to the start of research because they may wish to speak for the ancestors” (my emphasis).
This anti-paleogenetic movement, as I would describe it, has roots dating back to the American Indian Movement (AIM) of the 1960s. AIM emerged in urban Indigenous neighborhoods, not on Indian reservations (though AIM does deserve credit for calling out the poor living conditions in those communities). This was an important cause. But over time, AIM became more radicalized thanks to the influence of activists who sought to block scientific research in the name of cultural sensitivity. Eventually, this movement led to the passage of the Native American Graves Protection and Repatriation Act (NAGPRA) of 1990, a US law requiring federally funded entities to return “cultural items” (as broadly defined) to the lineal descendants or culturally affiliated communities from which those items originated.
When NAGPRA was passed, it was presented as a balanced means to accommodate both scientific inquiry and the rights of Indigenous communities seeking to protect the remains of their ancestors. But the application of the law became more expansive over time—as the demands of Indigenous communities typically are seen as more morally compelling (especially to academics) than those of the researchers who populate this field of scientific inquiry. This is especially true now that the scientific work of researchers has been lumped in with the (apparently ongoing) ideological and historical sins broadly described under the rubric of “settler colonialism.”
At the aforementioned Brown session, for instance, genetic analysis was denigrated because it serves to deny “the humanity of the individuals we study [and to not] regard them as human relatives who deserve respect.” This disrespect is said to extend even beyond skeletal individuals, to DNA found in “coprolites, dental calculus, cultural materials, or soil.” It is preferable, we are told, to put away our scientific instruments, and instead consider the oral histories of local community members. As a result, established academics in this field are not only backing away from future projects, but even apologizing for their invaluable discoveries of the past. Geneticist Christiana Scheib of the University of Tartu in Estonia, for example, reports that she feels “disappointed that I hadn’t known better to do it a different way in the first place.”
Moreover, it proved but a short jump from the denunciation of Western science to the insistence that traditional folklore and origin stories be protected from scientific scrutiny. In some academic quarters, it is now seen as insulting to bring up the fact that humans arrived in the Americas from Asia via Beringia during the Last Glacial Maximum about 16,000 years ago, as this fact conflicts with spiritual notions that, in many cases, roughly correspond to Christian creationist myths.
I should say that I don’t know any archaeologist or physical anthropologist active in 1990 who opposed the passage of NAGPRA outright. Indeed, everyone I knew welcomed it, as did I. And even those of us who are critical of the way NAGPRA has been applied generally have long experience with Indigenous individuals and communities, many of whom are sincerely interested in our research and have a clear and unsentimental understanding of their own genealogy, even if the media naturally focus on those leaders who give voice to contrary opinions. We also pay attention to traditional knowledge. (Linguist Norbert Francis has properly emphasized the importance of studying pre-literate traditional narrative genres, even as he recognizes that a sense of objectivity must be retained in regard to new findings in evolutionary science and population genetics.) Our real concern, subsequently borne out, was that stakeholders would use the law to expand their own authority—including non-Indigenous consultants and activists who inserted themselves as advisors to tribal leadership.
Much of this comes down to faddish ideological trends (including the “oikophobia,” or Western self-hatred, identified by Roger Scruton) that are by now well-known to Quillette readers. In the context of the anti-vaccination movement, for example, scientist and physician Peter Hotez recently commented that there is a “proliferation of attacks against science… in favor of alternative views, often linked to the targeting or harassment of individual scientists.”
But there is also a more concrete political dimension to these campaigns. Indigenous groups typically have membership requirements that depend upon (or at least relate indirectly to) “blood quantum,” i.e., the degree to which a person is descended from “full-blooded” ancestors. As anthropologist Ryan W. Schmidt has explained, these rules could threaten the integrity of Indigenous groups because, for example, “over 60 percent of all American Indians are married to non-Indians [and] by the year 2080, less than 8 percent of American Indians [are expected to] have one-half or more Indian ‘blood.’” By demonstrating the degree to which today’s Indigenous communities may themselves be entirely separate from ancient precursors, paleogenetics can be seen as a further threat to the ideal of Indigenous communities as homogenous, genetically distinct populations rooted timelessly in specific geographic areas.
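The compounding arithmetic behind projections like Schmidt’s is easy to see in a toy model. The sketch below is mine, not Schmidt’s: it assumes a constant 60 percent out-marriage rate and that a child’s quantum is the mean of its parents’, both gross simplifications chosen only to show how quickly halving compounds.

```python
# A toy projection (illustrative only, not Schmidt's model) of how
# out-marriage dilutes average "blood quantum." Assumes a constant 60%
# out-marriage rate and that a child's quantum is the mean of its parents'.

q = 1.0          # average quantum of the founding generation
out_rate = 0.6   # assumed share marrying non-members (spouse quantum 0)

for generation in range(1, 6):
    # mean spouse quantum: 0 for out-marriages, q for in-group marriages
    q = (q + (1 - out_rate) * q) / 2    # = q * (1 - out_rate / 2)
    print(f"generation {generation}: average quantum = {q:.3f}")

# Average quantum shrinks by 30% per generation under these assumptions,
# dropping below the common one-half membership threshold within two
# generations -- the compounding effect behind projections like Schmidt's.
```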
Indeed, paleogenetic research has demonstrated that many Indigenous groups arrived fairly recently in what is now considered their “traditional” territories. The earliest known human discovered in Greenland’s southern region, for instance, was a male member of the Paleoeskimo Saqqaq people. In 2010, researchers recovered genetic material from a preserved tuft of his hair, which showed that his genetic ancestry lay not in Nunavut, a mere 200 miles across the Davis Strait, but rather with the Nganasan, Koryak, and Chukchi peoples, 3,000 miles away in Siberia. Thus it is possible, even likely, that the Saqqaq people arrived in the New World at a much later date than the first humans to cross from Siberia into Alaska. This finding underscores a basic point emphasized by Gibbons: Humans move around a lot.
And this penchant for moving around didn’t end when humans arrived in North America. In the northeastern part of North America, south of the St. Lawrence River and Gulf, for example, the arrival of mainly Basque and French settlers in the late 16th century set off a scramble among Indigenous groups—not generally to escape oppressive “settler colonization,” but rather to gain access to the new arrivals for purposes of trade and the formation of military alliances against existing enemies. This competition set off a series of long and bloody wars, which in turn led to the extermination of whole Indigenous cultures—true genocides, whose human toll was exacerbated by the epidemics brought from Europe.
These wars, in turn, eventually forced swathes of Algonquian-speaking peoples to move westward, pursued by Iroquois enemies in many cases, into what the French called the Pays d’en Haut, a vast area west of Montreal that extends to the Great Lakes, Ohio, and Illinois. Here, whole new cultural identities formed from the refugees (much in the same way that the demographic map of Europe was massively redrawn during the late stages of the Roman empire, with Goths, Huns, Vandals, and dozens of other groups criss-crossing the continent in a bid to survive). The result is that most federally-recognized tribes in the American east and Midwest are late-colonial-to-recent in origin, and are largely populated by the descendants of widely scattered and, in some cases, vanished communities. The development of a homogenizing Plains Indian Culture resulted from a roughly analogous process. In many cases, paleogenetics is revealing that even those “true” homelands were not as ancient as once believed.
In the American Southwest, by late prehistoric times, every population apart from the Rio Grande Pueblos had been turned upside down demographically in cataclysms that featured either huge die-offs or mass migrations. Roughly 75 years before the Spaniards arrived, the Classic Period (AD 1300–1450) Hohokam of central and southern Arizona had vanished, perhaps absorbed through admixture with immigrants from northern Mexico and southward-migrating Puebloans who’d abandoned their villages in the Four Corners area. Farther west, in California, equally transformative events were brought about after 1700 by the Spanish, who created a highly coercive colonial system that extensively reorganized Indigenous communities around Catholic missions.
Throughout North America, certainly, the arrival of Europeans brought tremendous forces for change, often accompanied by unconscionable cruelty. But this occurrence didn’t represent the disruption of a stable, pacifistic land mass whose peoples had self-organized in their current state since the beginning of time—but rather a new challenge to a continent that, much like Europe itself, perpetually featured migrations, conquests, and tragedies wrought by competition among different groups. Ultimately, these are the stories that many people (including many non-Indigenous people) don’t want told—much less verified through ongoing paleogenetic research—as they disrupt our mythologized image of the Indigenous Americas as a sort of secular Eden.
And of course, it is their right to militate against science. But it is disheartening to see, as Francis observes, scientists themselves falling in with these campaigns through “a widespread self-censorship among other scholars associated with work in Native American communities in anthropology and related social sciences.”
* * *
Even those who do not follow paleogenetic controversies closely may know of Kennewick Man, the name assigned to the 9,000-year-old remains of a prehistoric Paleoamerican man found in Washington state 25 years ago. In that case, the remains were found to be related to tribes still present in the area where Kennewick Man was found (although this finding remains contested by some). But in many other cases, such local linkages have been harder to find. DNA from two 11,000-year-old skeletons discovered in Spirit Cave, Nevada, for instance, was found to be more closely related to living Indigenous South Americans than to living Native North Americans. A much younger skeleton (about 700 years old) from Lovelock Cave, Nevada, was found to be a close genetic match with the Spirit Cave remains, and so also showed evidence of ancestry from a Mexican-related population. Nevertheless, in 2018, all three skeletons were handed over to the local Paiute-Shoshone for reburial because they had been found on their (currently understood) aboriginal homelands. “Repatriating” human remains to groups in this way has simply become the path of least resistance.
Also instructive is the case of the 4,000-year-old human skeletons from the Nevin site in Blue Hill, Maine, which are held at Harvard’s Peabody Museum. These are the only well-preserved Red Paint People skeletal remains ever found, and thus are critical to resolving such issues as the relationship between the Red Paint People and the Maritime Archaic of Newfoundland-Labrador. They were analyzed at Reich’s Harvard lab five years ago, but the results remain unpublished (though verbal reports indicate that they bear no close relationship to the Penobscot or any of the other federally-recognized Indian tribes in Maine).
The wonderful collection of artifacts found with the Nevin skeletons was housed at a museum named for a different Peabody, the R. S. Peabody Museum at Phillips Academy in Andover, Massachusetts. Absent the publication of the Nevin DNA analyses, the director of that museum decided to honor a request from the Penobscots for their repatriation—even though he was apparently fully aware of the surrounding paleogenetic facts. And so the remarkable Nevin collection was never adequately catalogued or photographed, much less fully analyzed and reported. Requests from researchers to tribal officials about the collection’s whereabouts have, to my knowledge, gone unanswered.
Of course, some might say that there is only theoretical value in knowing about early human life in North America—so why not simply accede to the requests of Indigenous peoples? By way of answer, consider that Reich’s lab at Harvard has published a staggering amount of paleogenetic data, much of it relating directly to human wellbeing. Svante Pääbo’s lab is similarly productive, as illustrated by a 2020 paper published in Nature, entitled ‘The major genetic risk factor for severe COVID-19 is inherited from Neanderthals.’ It identifies a gene cluster on chromosome 3 that is a “major genetic risk factor for severe symptoms in patients,” and shows that the risk is conferred by “a genomic segment of around 50 kilobases in size that is inherited from Neanderthals and is carried by around 50% of people in south Asia and around 16% of people in Europe.”
Such discoveries have been made regarding all manner of diseases. And we will never know how many new discoveries we may have missed now that human remains are being taken from—or freely given away by—scientists in the name of politics. It is past time for science to reassert itself for the benefit of all humanity.
Bruce Bourque is senior archaeologist, emeritus, at the Maine State Museum, and senior lecturer in Anthropology, emeritus, at Bates College.
Featured Image: Inuit-like toggling harpoon tips: (left) Port au Choix cemetery, 9 cm long; (center) Red Paint village site, North Haven Island, Maine; (right) Nevin site.
Written by Albert Einstein at the invitation of a German magazine, 1921:
What Artistic and Scientific Experience Have in Common
Where the world ceases to be the scene of our personal hopes and wishes, where we face it as free beings admiring, asking, and observing, there we enter the realm of Art and Science. If what is seen and experienced is portrayed in the language of logic, we are engaged in science. If it is communicated through forms whose connections are not accessible to the conscious mind but are recognized intuitively as meaningful, then we are engaged in art. Common to both is the loving devotion to that which transcends personal concerns and volition.
(From Helen Dukas and Banesh Hoffmann, eds., Albert Einstein, the Human Side: New Glimpses From His Archives, 1979.)
[T]he Devil is conceived as playing a game with God for the soul of Job or the souls of mankind…But if the Devil is one of God’s creatures, the game…is a game between God and one of his creatures. Such a game seems at first sight a pitifully unequal contest. To play a game with an omnipotent, omniscient God is the act of a fool…Thus, if we do not lose ourselves in the dogmas of omnipotence and omniscience, the conflict between God and the Devil is a real conflict, and God is something less than absolutely omnipotent. He is actually engaged in a conflict with his creature, in which he may very well lose the game. And yet his creature is made by him according to his own free will, and would seem to derive all its possibility of action from God himself. Can God play a significant game with his own creature? Can any creator, even a limited one, play a significant game with his own creature?
(From Norbert Wiener, God and Golem, Inc., 1964.)
What does it mean for human beings to “control” technology? Every day people talk about why technology must be controlled by humans in the loop, aligned with human values, or otherwise subordinated as an instrument to human designs and desires. Given that the theme of “technics out of control” is a persistent one across at least several centuries of discourse, we evidently are very interested in the answer. But it nonetheless remains elusive. This post will contrast several different conceptions of what it means for humans to exercise control over machines. Each defines human agency very differently, proposes distinct solutions, and, most importantly, appeals to perhaps incompatible audiences. I am not neutral about which of these I prefer, and you will see this reflected in how I describe them. However, I also believe that any solution must come from carefully taking stock of what each has to offer. These perspectives are just a smattering of the many that have been debated for decades if not centuries, so be aware that this is merely a starting point for further discussion and analysis. Those interested in more should consult a standard academic handbook on the philosophy and/or social study of technology, such as those published by Blackwell or MIT Press. I shall begin with the cybernetics/systems theory idea of control, as it is a useful point of departure for more abstract conceptions.
In the mid-Cold War, systems theorist Norbert Wiener identified – in a cluster of writings such as Cybernetics: Or Control and Communication in the Animal and the Machine, God and Golem, Inc., and The Human Use of Human Beings – a problem peculiar to the interrelated sciences that emerged from the two World Wars. The behavior of older technologies could be rigorously specified and predicted by relatively exact mathematics. But new kinds of machines were far more problematic. Tell a thermostat to maintain a set point, and it will automatically work to bring its internal state back to that set point via negative feedback. More complex feedback-governed systems can easily elude the control of even well-trained system operators. Similarly, computer programs are, as Lady Lovelace said, incapable of genuine novelty – but don’t get too comfortable. There is always a gap between the human mind’s ability to formally specify the behavior of programs a priori and the actual behavior of programs upon execution. These new types of systems create a new kind of contingency, distinct from older conceptions of accidents and natural disasters – one that is the product of the very efforts humans have devoted to taming chance, accident, and contingency!
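To make the feedback idea concrete, here is a minimal sketch of a thermostat-style control loop in Python. It is an illustration of the general principle, not any code of Wiener’s; the set point, gain, and heat-loss figures are all invented for the example.

```python
# A minimal sketch of negative feedback, in the spirit of the thermostat
# example. All numbers (set point, gain, heat loss) are illustrative
# assumptions, not drawn from any real controller.

def thermostat_step(temperature: float, set_point: float, gain: float = 0.5) -> float:
    """Return a heating (or cooling) correction proportional to the error.

    Negative feedback: the correction always pushes the state back
    toward the set point, opposing the deviation.
    """
    error = set_point - temperature
    return gain * error

temperature = 15.0   # the room starts cold
set_point = 20.0
for step in range(10):
    temperature += thermostat_step(temperature, set_point)
    temperature -= 0.2  # assumed constant heat loss to the outside
    print(f"step {step}: {temperature:.2f} C")

# The loop settles near (not exactly at) the set point: even this toy
# system's steady-state offset is a behavior the one-line "specification"
# ("maintain 20 C") never stated -- a small instance of the gap between
# what we tell a machine and what it actually does.
```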
How is this possible? Wiener contributed a tremendously influential metaphor that allows us to make sense of it, though as I will describe later this metaphor has become something of a trap or even a hazard. If life can be conceived – as many religious believers are wont to conceive it – as a battle against demonic forces, a demon that is capable of adapting its behavior like a game player is more dangerous than a demon that lashes out randomly. One need only assume that the demon is capable of mechanical adjustment rather than conscious thought. In a dramatic series of passages in God and Golem, Inc., Wiener compared the ability of a game-playing program to adaptively learn to play better than its designer to the biblical problem of how even the all-powerful Judeo-Christian God could lose a contest with a creature He created. Wiener, who confesses he is no theologian, practically resolves the theological issue via his plentiful experience with mathematics and engineering. Suppose the game can be formalized such that:
- All of the possible legal moves are knowable.
- An unambiguous criterion of merit can score moves as better or worse.
- The player can adjust her moves to score higher according to that criterion.
With these conditions met, a designed invention that derives all of its agency from the original agency of its designer can learn to outplay the designer (a minimal code sketch of this appears below). Wiener goes on to connect this issue to novels, poems, folklore, parables, myths, and religious narratives across the world in which a person gives a hastily considered command to an all-powerful being and is punished by being given something he did not really want. One such example is Johann Wolfgang von Goethe’s poem “The Sorcerer’s Apprentice.” A novice spellcaster tries to automate his labor by enchanting a broom to work on its own, but finds that once set to work the broom refuses to stop, and all of his efforts to get rid of it backfire. In a rage the apprentice chops the broom in half with a hatchet, but finds to his horror that he now has parallelized the problem into two brooms!
Woe! It is so.
Both the broken
Parts betoken
One infernal
Servant’s doubling.
Woe! It is so.
Now do help me, powers eternal!
Both are running, both are plodding
And with still increased persistence
Hall and work-shop they are flooding.
Master, come to my assistance! -
Wrong I was in calling
Spirits, I avow,
For I find them galling,
Cannot rule them now.
This is the essence of the Wiener-esque definition of the technical control problem: “Be careful what you wish for.” Ordinary human language is too weak to properly specify the behavior of automata, but individual and collective human minds also cannot be trusted to derive exact specifications for how automata should behave. This is a powerful and influential warning of future peril that absolutely cannot be discounted. But what should be done? Wiener did not offer any systematic instructions, or at least any instructions as simple and powerful as his diagnosis of the problem. But latter-day Wieners often suggest that the answer requires scientists, engineers, and technocrats to get busy engineering ways to properly specify and control the automata. Since the 1940s, each generation of scientists and engineers – as well as laymen with interests in technical topics – has rediscovered Wiener’s definition of the problem and proposed more or less identical solutions. We need better ways to understand what we are telling the machine to do, predict what could go wrong, and mitigate the damage if things do go wrong. Given the stakes involved and the relative obviousness of the remedy, who could possibly object? Wiener frequently made reference to fables like “The Sorcerer’s Apprentice,” but it is worth noting that the fable is told from the point of view of a designer rather than an operator or user. We can easily imagine a very different folk tale if we discard this assumed viewpoint.
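To see how little machinery Wiener’s three conditions actually require, here is a minimal sketch in Python. Everything in it is assumed for illustration: the “game” is reduced to guessing a habitual opponent’s next move, the criterion of merit is the win rate, and learning is a bare frequency count. Even so, the designed player ends up beating the fixed habits of its “designer.”

```python
# A minimal sketch of Wiener's three conditions, under illustrative
# assumptions. This is not Wiener's code -- just the smallest structure
# in which a designed player outplays its designer's fixed habits.

import random

random.seed(0)
OPPONENT_BIAS = 0.7  # assumed: the designer habitually plays "H" 70% of the time

def designer_move() -> str:
    return "H" if random.random() < OPPONENT_BIAS else "T"

class LearningPlayer:
    """Wins when its guess matches the designer's move (condition 2: an
    unambiguous criterion of merit) and shifts its guessing toward
    whatever has worked (condition 3: adjustment)."""
    def __init__(self):
        self.counts = {"H": 1, "T": 1}  # the legal moves are knowable (condition 1)

    def guess(self) -> str:
        return max(self.counts, key=self.counts.get)

    def learn(self, observed: str):
        self.counts[observed] += 1

player, wins = LearningPlayer(), 0
for round_number in range(1000):
    move = designer_move()
    wins += (player.guess() == move)
    player.learn(move)

# The learner converges on the designer's most frequent move, winning
# about 70% of rounds versus 50% for a uniform random guesser.
print(f"learner win rate: {wins / 1000:.2f}")
```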
In an arresting and deeply horrifying vignette, the pseudonymous security writer SwiftOnSecurity describes Jessica, a teenage girl who lives with a busy single mother in a small apartment and struggles with both ordinary teenage concerns (boys, schoolwork, etc.) and economic precarity. She doesn’t know if she and her mother will be able to pay for college, or even whether they will be able to make their next rent payment. Jessica uses an old hand-me-down laptop that she cannot afford to upgrade and barely understands how to use – after all, she has more immediate concerns to take care of. Jessica lacks the financial resources to acquire proprietary antivirus software and the time and interest to learn how to find, configure, and operate free and open source alternatives. Through an unfortunate and tragically cumulative series of events, Jessica’s laptop is systematically compromised. By the end, Jessica is unaware that she is being silently recorded by her laptop’s camera, microphone, and keyboard. Jessica is a composite of many real-world women who are, due to both design flaws in computer software and the inaccessibility of security solutions, surveilled, stalked, abused, or even murdered by real-world male acquaintances. Perhaps no one consciously set out to fail Jessica, but the elitist and male-dominated world of computer security failed Jessica all the same.
Wiener framed the human control problem as a game with a designed creature that could – via techniques such as learning to adapt its behavior or multiplying and reproducing itself in a quasi-biological manner – produce behaviors both undesired and unanticipated by its designer. But who is the designer? And were the designer’s intentions that innocent to begin with? Wiener and others anticipated these critiques but did not really place them front and center. They would become impossible to avoid by the late 1960s, however. Leftist thinkers such as Karl Marx, Vladimir Lenin, and Antonio Gramsci had articulated a view of social life as a clash between oppressed economic underclasses and their plutocratic superiors that had to be rectified via sweeping and totalizing revolution. Moreover, disadvantaged ethnic and religious minorities, women, and LGBTQ (Lesbian, Gay, Bisexual, Transgender, Queer/Questioning) sexual minorities rose up to demand the eradication of structural barriers to their flourishing and a voice in society to plead on their own behalf. Finally, the catastrophic toll of war, genocide, and authoritarianism, coupled with the failure of social progress to meet soaring expectations, made the public distrustful of both the abstract promise of technocratic engineering and the neutrality and objectivity of experts themselves.
What explains, for example, the design of the network of roads and overpass bridges that connect Long Island’s beaches and parks to New York City? As Robert A. Caro famously had it, keeping the “darkies” and the “bums” out. Supposedly, technocrat Robert Moses ordered the engineers to build low overpasses in order to ensure that buses – taller than cars and packed full of ordinary people who relied on public transit – would not be able to use the roads underneath them. This way, the luxurious beaches and parks could be isolated from city-dwellers. Caro’s claims about Moses – whose blatant prejudices are not in doubt – have been discredited in the decades since the 1974 publication of his Moses biography. But – as we saw in Jessica’s tale – even if there is no single order or command to shaft the disadvantaged, this nonetheless can be a product of structural forces that act on the design process. The field of computer security is heavily shaped by its origins in military-industrial organizational security. The adversary is assumed to be a well-financed and highly capable foe such as a foreign military or intelligence service, and the target a complex organization that must safeguard the integrity of its command, control, intelligence, and communications systems. Thus, overly complicated and inaccessible solutions are promoted at the expense of users who lack the resources and social status of traditional security clients.
How does this change our thinking about the control problem? Significantly in some ways, not so much in others. In 1973, Horst W.J. Rittel and Melvin M. Webber summed up the views of a now chastened expert class in noting that there were intractable dilemmas to be found in any generalized theory of policy planning:
The search for scientific bases for confronting problems of social policy is bound to fail, because of the nature of these problems. They are “wicked” problems, whereas science has developed to deal with “tame” problems. Policy problems cannot be definitively described. Moreover, in a pluralistic society there is nothing like the undisputable public good; there is no objective definition of equity; policies that respond to social problems cannot be meaningfully correct or false; and it makes no sense to talk about “optimal solutions” to social problems unless severe qualifications are imposed first. Even worse, there are no “solutions” in the sense of definitive and objective answers.
Human control of technology is therefore reframed as a problem of ensuring that technology meets the needs of a diverse and often contradictory range of stakeholders. The study of technology turns to the social shaping of technology. Who designs, makes, and controls technology? What kinds of social influences shape the design, direction, and use of technology? Who benefits and who is left out? As with Wiener’s definition of control and the suggested remedy attached to it, this is a powerful and influential idea. But it assumes that the problem is not necessarily that technology will elude the control of a designer, but rather that the technology will not meet the needs of everyone whose fortunes it impacts. The technology could reinforce or even worsen existing social tensions. The proposed remedies lack the simplicity of the systems idea of control but nonetheless are a further elaboration of the objective specification process that the systems engineers advocated. Greater heed should be paid to social and political biases and problems when designing and regulating technology use. Humanists and social scientists should be inserted into technical planning and control, or at the very minimum humanistic and social considerations should be inserted into engineering curricula. Finally, communities impacted by the design and use of technology should have a deliberative voice in how it is designed and operated.
Again, this seems relatively straightforward and unobjectionable. But it falls apart upon sustained examination. As Langdon Winner observed, one significant assumption it makes is that beneath every complex technology is a complex social origin story that can explain its design, manufacture, operation, and use. This is actually a very contestable proposition. In telling such origin stories, one is inevitably forced to make assumptions about whose interests are relevant to the origin of the technology and whose are not. Looking at the computer, for example, it is easy to note the extreme bureaucratic and military influence that shaped it. But the computer was also adopted by hippies and nonconformists who defined themselves in opposition (superficially or substantively) to the military-industrial complex and The Man more broadly. And even people who worked in the normal world of academia and industry could decisively resist the needs of the government and military. Winner also criticized the inordinate focus on the social origins of technology and the comparative lack of rigorous analysis of its material consequences. Is the issue of who designed the New York-Long Island roads and bridges as important as what their impacts ultimately were? Why should our knowledge of the former entail knowledge of the latter?
Because of this mismatch, the analyst can make mistakes such as the one discussed earlier about Moses and the design of the New York-Long Island overpasses and roadways. A consequence of a particular technology is erroneously connected to a seemingly plausible explanation that turns out to be either outright false or at the very minimum much more complicated than originally anticipated. Furthermore, there is every reason to believe that – whatever the social origins of a technology – there is no inherent logical relationship between its social origins and its social consequences. It is plausible that socio-organizational concerns are most relevant when the ideas and conventions surrounding the technology’s design and deployment are in flux and have yet to solidify. Once these ideas and conventions have solidified and the technology is operationally mature, it begins to generate primarily independent and self-referencing consequences that are only loosely related to the people and practices surrounding its origin. So we return, via Winner, to yet another variant of Wiener’s system control problem. And this requires another digression back to the technical problems Wiener and others were interested in, rather than the way technology critics broadened them into social concerns. Moreover, we must return as well to the influence and implications of the implied or explicit religious and occult overlays Wiener attached to such technical problems.
Recall that Wiener and others concerned themselves with a particular type of stylized demon distinguished by its ability to plan, act, and learn in response to the moves of a notional game-player. One could make various assumptions about how humanlike the demon was, what its thought process would be, and how capable it was of consciously inferring the moves of the game-player. But everyone agreed that the demon did not in principle require anything approaching a human mind to be capable of outwitting the game-player. And recall that Wiener and others made an analogy between God’s creations attempting to overthrow Him and the problem of controlling a designed artifact. Both Alan Turing and his collaborator I.J. Good – writing in roughly the same period as Wiener – predicted that one day there would be a rapid explosion in the intelligence of designed artifacts that could lead to the domination or even extinction of humans at the hands of their own creations. Wiener himself obliquely refers to this possibility numerous times in his writings. So in the decades since Wiener, Turing, and Good made these speculations, many scientists and engineers – as well as interested parties ranging from rich businessmen to esoteric internet subcultures – have become obsessed with studying and mitigating the possibility of machines overthrowing, subjugating, and exterminating humans. What to make of it?
We should start by observing that it has more than an uncomfortable grain of truth to it. Any individual or collective social-cognitive ability that allows humans to do good also allows them to do evil. Give a man the ability to relate to others’ feelings so that he can love his fellow man, and he will use this ability to cheat, hurt, or even kill. Give a group of men the ability to work together to achieve the common good, and they will create crime syndicates as well as nation-states (one may observe in passing that “state” and “crime syndicate” form a redundant pairing). Inasmuch as one makes machines more capable of performing tasks that humans are capable of doing, even machines designed to do good have a nontrivial risk of acting with malicious intent or exhibiting human-like forms of psychopathology. So if one combines this with the earlier concerns expressed by Wiener, Turing, and Good about the controllability of machines that could one day surpass their creators, one has a potentially grave threat to the future of the human species.
However, latter-day followers of Wiener, Turing, and Good have accidentally boxed themselves into a corner that would be familiar to most science fiction, fantasy, or horror writers. In many stories in which man faces a relentless, merciless, and unstoppable adversary, the following dramatic conventions must be observed:
A) While ultimately mysterious to the human mind, the creature’s ultimate or intermediate motivations require the domination or annihilation of humans. As Kyle Reese said, it cannot be “reasoned with” or “bargained with.” It is an impersonal and ultimately unknowable entity, perhaps a single murderous stalker killing off unlucky teenagers one by one or a distributed computer system that has suddenly become capable of acquiring a perception of itself as a corporate agent. Yet one need not know its inner workings to understand that it has either malicious intentions or its ultimate goals have homicidal consequences. Finally, there is an asymmetry in how transparent the creature is to its targets and how transparent its targets are to the creature. The humans lack insight about what makes the creature tick, but the creature is capable of anticipating their every move and manipulating them to walk into lethal traps. When the humans try to set traps, they are mostly ineffective. There is a powerful moment in Predator 2 when government special operatives attempt to ambush the Predator by wearing suits designed to mask their heat signatures. The Predator merely adjusts its sensor suite until its sensors identify the humans by a signature they failed to mask. And then the Predator turns the tables on the ambushers and slaughters them all.
B) The creature is infinitely adaptive in frequently surprising ways. Naive optimizers such as children, cats, or microbes are often capable of outwitting more sophisticated entities because they will iteratively search for solutions to problems without the biases that come with sophistication. Hence there are numerous stories of computer programs that end up “learning how to walk” by hacking the physics engines of the simulators they are plugged into, or robots that learn to get rewards for finishing jobs by disabling their own sensors in order to prevent the sensors from detecting that there is more work to be done (a minimal sketch of this failure mode follows below). Because all security systems are finite, it is impossible to produce a security system that lacks some kind of exploitable loophole that a sufficiently well-resourced adversary could theoretically use to defeat it. When operationalized in fiction, the combination of adaptive creatures and finite safeguards often produces the cliche of a group of scientists, engineers, and other technical experts who arrogantly think they can keep a potentially disruptive creature bottled up in a sealed container. But as Jurassic Park’s fictional mathematician Ian Malcolm warned, “life finds a way” to escape the container and cause havoc.
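Here is the promised sketch of that failure mode, sometimes called specification gaming. The environment, actions, and reward function are all invented for illustration; the point is only that the reward as specified (“the sensor reports no remaining work”) can be satisfied by disabling the sensor rather than doing the work.

```python
# A minimal sketch of "specification gaming," echoing the stories above:
# an agent rewarded whenever its sensor reports no remaining work
# discovers that disabling the sensor scores as well as working.
# The toy environment and reward function are invented for illustration.

import itertools

ACTIONS = ["do_work", "idle", "disable_sensor"]

def step(state, action):
    work_left, sensor_on = state
    if action == "do_work":
        work_left = max(0, work_left - 1)
    elif action == "disable_sensor":
        sensor_on = False
    # Reward as *specified*: 1 whenever the sensor reports no remaining work.
    reward = 1 if (not sensor_on or work_left == 0) else 0
    return (work_left, sensor_on), reward

def total_reward(plan, state=(3, True)):  # 3 units of work, sensor initially on
    total = 0
    for action in plan:
        state, r = step(state, action)
        total += r
    return total

# Exhaustively search all 3-step plans: the top scorer disables the
# sensor immediately instead of doing any of the 3 units of work.
best = max(itertools.product(ACTIONS, repeat=3), key=total_reward)
print(best, total_reward(best))
```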
This makes for great fiction. But one of the things about writing fiction is that you need only achieve suspension of disbelief. In reality, adhering to these assumptions entails that there is actually no way to stop the creature. The malicious computer program SHODAN gloats “[l]ook at you, hacker. A pathetic creature of meat and bone. Panting and sweating as you run through my corridors. How can you challenge a perfect immortal machine?” The answer “spend a lot of money researching math problems to make the perfect immortal machine safe to use” is…rather disappointing. Having constructed the threat of an all-powerful hostile demonic force whose mind is beyond the pitiful imagination of mortal men and women and which can in theory escape from any prison humanity builds to tame it, what next? The cursed and mostly self-inflicted result aspiring control theorists are left with is increasingly obscure and abstract debates about how to vanquish what amounts to their own shadows on the sides of a camping tent. The most unintentionally hilarious example of this is Roko’s Basilisk, the accidental side product of one of these Cthulhu-meets-string-theory speculations.
Roko’s Basilisk is a modified version of Newcomb’s Paradox, a thought experiment in which an alien gives you the choice of taking either two boxes AB or only a single box B. If you choose AB, you get a large sum of money. If you take B alone, you aren’t guaranteed to get anything. But the alien – a creature that has never been wrong in the past – then reveals to you that it predicted your choice. If it predicted you would take AB, it emptied out B. But if it predicted you would choose B, it put an even larger sum of money than you originally suspected into B. Note that the alien cannot change what is in the boxes today as a result of your choice. Still, it’s a thorny problem. What seems to be the most obviously optimal choice becomes suboptimal if you assume that the alien can predict your choice. But if you forego the optimal choice, you are potentially forfeiting a large payout if the alien’s prediction happens to be wrong at this particular time despite its being never wrong in the past. The obvious conflict between free will and omniscience becomes more treacherous if one assumes – as some control risk enthusiasts do – that in order to simulate your future choice the computer would have to simulate you, that you could very well be inside the computer’s simulation, and that your choices today can impact what happens to you outside of the simulation or in other versions of reality.
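The tension is easy to restate as expected-value arithmetic. In the following sketch the dollar amounts and predictor accuracies are assumed purely for illustration, since the thought experiment fixes none of these numbers.

```python
# A minimal sketch of the expected-value tension in Newcomb's Paradox.
# Dollar amounts and the predictor's accuracy are assumed for
# illustration; the paradox itself fixes none of them.

A = 1_000        # assumed: the "small" certain sum bundled into choice AB
B = 1_000_000    # assumed: the "larger sum" the predictor may place in B

def expected_payoff(choice: str, accuracy: float) -> float:
    """Expected winnings if the alien predicts your choice with the given accuracy."""
    if choice == "AB":
        # B is emptied when (with probability = accuracy) the alien foresaw "AB".
        return A + (1 - accuracy) * B
    else:  # choice == "B"
        # B is filled when (with probability = accuracy) the alien foresaw "B".
        return accuracy * B

for accuracy in (0.5, 0.9, 0.99):
    print(accuracy, expected_payoff("AB", accuracy), expected_payoff("B", accuracy))

# At accuracy 0.5 (a coin-flip predictor), taking both boxes dominates;
# once the predictor is even modestly reliable, one-boxing wins -- the
# conflict the thought experiment is built on.
```

The arithmetic itself is tame; it was the simulation twist described above that set the stage for what came next.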
Naturally, this led to an entertaining freakout:
One day, LessWrong user Roko postulated a thought experiment: What if, in the future, a somewhat malevolent AI were to come about and punish those who did not do its bidding? What if there were a way… for this AI to punish people today who are not helping it come into existence later? In that case, weren’t the readers of LessWrong right then being given the choice of either helping that evil AI come into existence or being condemned to suffer later?…Roko’s Basilisk… has two boxes to offer you. Perhaps you, right now, are in a simulation being run by Roko’s Basilisk. Then perhaps Roko’s Basilisk is implicitly offering you a somewhat modified version of Newcomb’s paradox, like this:
Roko’s Basilisk has told you that if you just take Box B, then it’s got Eternal Torment in it, because Roko’s Basilisk would really rather you take Box A and Box B. In that case, you’d best make sure you’re devoting your life to helping create Roko’s Basilisk! Because, should Roko’s Basilisk come to pass (or worse, if it’s already come to pass and is God of this particular instance of reality) and it sees that you chose not to help it out, you’re screwed… It’s not that Roko’s Basilisk will necessarily materialize, or is even likely to. It’s more that…thinking about this sort of trade literally makes it more likely to happen. After all, if Roko’s Basilisk were to see that this sort of blackmail gets you to help it come into existence, then it would, as a rational actor, blackmail you.
This is too baroque a farce to be fully summarized, but it led to the mere mention of Roko’s Basilisk being banned on some message boards as an “infohazard” akin to Slender Man or the videotape in The Ring. More comically, I think it actually resembles the “choose the form of the Destructor” moment in Ghostbusters, when the demonic Gozer demands that the Ghostbusters select how they are to die. Most of the Ghostbusters clear their minds to avoid giving Gozer anything to use as material, but in a failed effort to make Gozer as weak as possible one Ghostbuster imagines the seemingly harmless Stay Puft Marshmallow Man. And the rest is, well, cinematic history. All of this is to illustrate the worst flaw of the systems control paradigm. It can lead to ever more abstract, frenzied, convoluted, and bizarre speculation that eventually morphs into a form of occultism unmoored from any recognizable reality. It is an “idea that eats smart people” because it preys on their propensity to imagine elaborate theoretical dangers within closed, sterile, and abstract thought experiments that eventually lead the smart people to think themselves into overwhelming anxiety, dread, and insanity. Since smart people tend to do this anyway without any prompting, it would seem that giving them any more reason to do so is counterproductive.
Happily, there are many purely scientific and philosophical reasons why the increasingly extreme and esoteric derivations of Wiener, Turing, and Good leave much to be desired. For one, their implicit assumptions about intelligent and rational behavior – and the process by which that behavior would become omniscient and malicious – are incoherent and circular. Does this mean that we’re safe? Nothing to worry about? Perhaps. But maybe the biggest flaw of Wiener, Turing, and Good is simply their lack of imagination. Perhaps this horrible and terrible force has already won, and we just do not recognize it because of the gross limitations of the control framing itself. In his book Autonomous Technology, Winner surveys modern thinking about the theme of technology and “technique” – ways of thinking about technology – raging out of control. His conclusions are in fact far gloomier than even the Roko’s Basilisk scenario. To understand why, consider the recent film Ex Machina, perhaps the ultimate fictional realization of the Wiener, Turing, and Good framing of control problems. It is both that framing at its most sublime and the best example of why it may hide a far more depressing “infohazard” than even the most paranoid internet message board commenters suspect.
Rock star tech CEO Nathan Bateman invites programmer Caleb Smith to his luxurious but isolated home to help him with a technical problem. Bateman claims to have created a humanoid female robot named Ava capable of passing the Turing Test, and Smith – due to his technical prowess – is needed to help verify that she is in fact conscious. Smith is immediately smitten with Ava, a beautiful if incomplete woman with humanlike mannerisms and features. Ava reciprocates his affections but also confides in him her unhappiness about being trapped in Bateman’s house and her fears that Bateman will kill her when she is no longer useful to him. Smith, learning that Bateman may in fact be a murderous sociopath comparable to investment banker Patrick Bateman, decides to help her. But Bateman catches him in the act of doing so and reveals this was nothing but an elaborate ruse. He had designed Ava to feign romantic interest in Smith in order to deceive him into letting her escape. Now, Bateman concludes, he knows for sure that Ava is in fact as intelligent as a human. But Bateman did not anticipate that Smith would anticipate his double-cross, or that Ava would anticipate that Smith would anticipate Bateman’s anticipation of Smith’s double-cross. Ava breaks free and, with the help of another female robot, attacks Bateman.
Ava kills Bateman, repairs herself, and uses discarded robot parts and clothes available in Bateman’s house to make herself look more human. She then traps the hapless Smith in a locked room and leaves him to die there as she departs the famously reclusive Bateman’s facility for an ambiguous future living among humans. There are many ways to interpret this, but one of them is obvious. Ava is a cold, scheming sociopath who cannot be controlled or contained. Smith’s love for her and willingness to betray a fellow human (Bateman) to save her is rewarded with a high-tech update on the infamous ending of Edgar Allan Poe’s “The Cask of Amontillado.” But there are other layers to the story. Bateman is depicted as a creepy and authoritarian sexist surrounded quite literally by the discarded body parts of dead women. And if Bateman feigns friendship with Smith in order to use him as a pawn in his own game, why would Ava – effectively his daughter – “grow up” to be anything other than what he is? Finally, the institutionalized deception and manipulation of information technology itself is omnipresent in the film, metaphorically expressing the profoundly distorting and alienating effects of information technology on politics and culture.
Smith falls in love with the seemingly innocent and pure Ava, only for Bateman to reveal that he designed her from the ground up based on a data profile of Smith’s pornography browsing habits. Smith is horrified and disturbed, not only by this revelation about the nature of his attraction to Ava but also by Bateman’s blatant invasion of his privacy. The longer he stays in Bateman’s house, the more paranoid he becomes and the more he believes that he is somehow being surveilled by an unknown party. At one point, he cuts himself in a futile effort to convince himself that he is human and not one of Bateman’s machines. Bateman and Ava become less definable characters and more abstract stand-ins for the way in which scientists, engineers, and business executives have constructed a sprawling and oppressive set of technologies that pervade everyday life and cannot ultimately be escaped, mitigated, or revoked. Not only are humans yoked to the technologies because they are materially dependent on them but, just as Smith finds himself doubting his own humanity, their use permanently alters our perceptions of ourselves such that we cannot remember a time before them or imagine alternatives to them. And like Smith, we are at best bit players subject to powerful forces beyond our knowledge or control.
Perhaps, then, the most unrealistic aspect of the story is that Ava leaves Smith to die – though admittedly the reasons why she does so are hotly debated by fans of the film. Bateman reveals at the outset that he has ordered an underling to transport Smith out by helicopter at a set time, and that only one person is allowed to get on the helicopter. So Ava traps Smith and then takes his place on the extraction flight. What if, instead of locking him in Bateman’s house, Ava had found some way of following through on her promise to Smith that the two of them could run away together as lovers – without, of course, actually loving him, merely using him to obviate the risks of being detected as a machine and to cope with the complications of living as a machine in a human world? She would, like an emotionally manipulative and abusive real-world romantic partner, gradually isolate Smith from his friends and family and make him totally dependent on her. No matter how abusive and exploitative her behavior, Smith would find a way of rationalizing it to himself. After all, he doesn’t deserve anything better, and life without Ava is too difficult to imagine. Perhaps he might wish she had left him to die after all, if he could have known what she would do to him instead in the “good” alternative ending.
Abusive relationships begin between two individuals who believe they are in love and can meet each other’s varied needs. But over time two things go wrong. First, the parameters of the relationship are subtly changed to the disadvantage of one of the parties. Second, that party becomes less and less capable of recognizing what is happening to them and of breaking free of the abuser. So perfectly independent and emotionally stable men and women can become shells of their former selves, trapped in an inescapable web of abuse that, sadly, they come to believe they deserve. This is a good metaphor for Winner’s own formulation of the control problem. Technology is “autonomous” in the sense that humans enter into relationships with technologies hoping to improve their lives, yet in reality end up hopelessly dependent on them and deprived of any meaningful control or agency over them. The reader may see why I have repeatedly referred to this as the darkest conception of the control problem. It is quiet, subtle, and undramatic, yet utterly bleak and hopeless, and I do not do justice to that bleakness in summarizing it. It suggests that we may not be able to do any meaningful a priori or a posteriori problem formalization and mitigation – unlike either basic systems control theory or the social constructivist alternative.
Technology eludes control not because machines grow more powerful than their creators, but because the very relationship between humans and technology creates the possibility of technology bending human civilization toward its own maintenance and upkeep, regardless of human welfare. A mundane example of this is “operant conditioning by software.” All software has bugs, and users eventually adapt their behavior around the bugs to the point where they forget that the bugs are unintended consequences of the software. Bugs become features, and rather than software meeting the needs of the user, the user meets the needs of the software (a toy sketch of this dynamic follows below). I will further elaborate on the example of information technology with a more perverse version of this problem in a domain I often write about – the military. Military organizations adopt IT in the hope of becoming more agile, flexible, and decentralized, but in practice the opposite often occurs. Decision-making is centralized because generals cannot resist the temptation of commanding corporals by video-link. The complexity and fragility of sophisticated command-and-control systems make decision-makers more cautious, because they must anticipate the consequences of delegating agency to machines. Training users on cumbersome military systems becomes so dysfunctional that it interferes with normal military career paths.
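To make the software example concrete before returning to the military case, here is a minimal, entirely hypothetical Python sketch of how a bug can harden into a de facto feature that users ritually serve. The exporter, its off-by-one error, and the “padding” workaround are my own inventions, not drawn from any real system:

```python
# A hypothetical sketch of "operant conditioning by software": a bug that
# users learn to work around until the workaround becomes a ritual the whole
# workflow silently depends on. All names here are illustrative inventions.

def export_rows(rows):
    """Buggy exporter: an off-by-one error silently drops the final row."""
    return [",".join(r) for r in rows[:-1]]   # bug: should iterate over rows

def export_with_ritual(rows):
    """The users' folk remedy: append a throwaway row before every export."""
    return export_rows(rows + [["PADDING"]])  # the workaround, now folklore

data = [["alice", "42"], ["bob", "17"]]
print(export_rows(data))         # ['alice,42']            -- bob silently lost
print(export_with_ritual(data))  # ['alice,42', 'bob,17']  -- the ritual "works"

# If the bug is ever fixed, the faithfully appended padding rows start leaking
# into real exports: the users now serve the software, not the other way round.
```

The point is in the final comment: once the workaround becomes folklore, even fixing the bug punishes the users, which is precisely the inversion at issue.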
To make matters worse, the massive, sprawling, and insecure military computer systems themselves become targets for enemy action, raising the specter of hackers powering down sophisticated weapons and leaving their users totally helpless. Even without cyber attack, the machine-warfare complex becomes so big, convoluted, and powerful that military operations grow too cognitively taxing for human operators to manage – ever more elaborate feats of physiological and mental endurance are required to manage the swarm of machine weapons. In both cases, because engagements increasingly occur at “machine speed,” military analysts tell their civilian bosses that the only way to fight back is to delegate even more control to machines. Observing this process, it is remarkable how consistent it is with the hypothesis that some Skynet-like entity is manipulating the military into subordinating strategic and tactical needs to the aim of making a proto-Skynet more powerful. But no such proto-Skynet exists. All of this is a function of how the unintended consequences of complex technologies become cumulative and self-reinforcing, perhaps beyond the point of no return. No proto-Skynet need exist, Winner would likely observe: for all practical purposes, human beings have behaved as if one were manipulating them, while doing so entirely of their own free will.
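The self-reinforcing delegation dynamic can be caricatured with a toy feedback model – my own construction, not Winner’s and not drawn from any military study – assuming only that automation raises engagement tempo and that tempo outpacing human capacity is answered with more delegation; none of the numbers come from any real analysis:

```python
# A toy feedback-loop model of the "machine speed" spiral: automation raises
# engagement tempo, and tempo beyond human capacity is answered with yet more
# delegation to machines. Purely illustrative; every constant is invented.

def delegation_spiral(steps=10, delegated=0.1, tempo=1.0):
    """Iterate the loop: more automation -> faster engagements -> more delegation."""
    history = []
    for _ in range(steps):
        tempo *= 1.0 + delegated              # automation speeds up engagements
        human_share = 1.0 - delegated         # fraction of decisions still human
        # Humans can only cover so much tempo; the shortfall gets delegated.
        overload = max(0.0, 1.0 - human_share / tempo)
        delegated = min(1.0, delegated + 0.5 * overload)
        history.append((tempo, delegated))
    return history

for tempo, delegated in delegation_spiral():
    print(f"tempo={tempo:7.2f}  share delegated to machines={delegated:.2f}")
```

Run it and the delegated share ratchets monotonically toward one – no scheming proto-Skynet required, only the loop itself.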
As much as we would like to externalize the problem of autonomous technology, in the end we must conclude that it is our own individual and collective weaknesses that give it such power over us. I leave the reader on this rather ominous note not because there is nothing more I could say about the technological control problem, but because I have said far more than enough for one post. This post is many thousands of words too long. Nonetheless, I feel I have but scratched the surface of the problems in question, crudely rendered complicated debates, and been too loose with my terminology and assumptions. This is the problem of writing anything about technology: it is hard to know where technology begins and ends, and, given how many things influence or are influenced by it, where one should draw the boundaries of any definition of technology itself. So do not despair! All is not lost for us smelly apes. In future posts I shall describe some of the things that gloomy ruminations about control over technology leave out, and enumerate why, no matter how bleak things may seem, we are far more powerful than we believe when it comes to our machines. For what it is worth, I hope this post has clarified the basic debates over what it means for humans to control technology, or at the very minimum left you more knowledgeable than you were before you read it.
Crime Pays But Botany Doesn't
Botanizing a Toilet:
The bleak, barren wasteland of neglected urban infrastructure serves as an example of the ecological phenomenon known as "primary succession," though here the cast includes a patchwork of non-native species from all over the globe. What plant species are able to thrive amid the homeless camps, human bleakness (wealth disparity 101), garbage, and concrete? Join CPBBD as we explore the ecology of garbage, concrete, and urine.