Mike Powers, for many years a close friend of the author Jan Myrdal, who has just passed away, writes an obituary for our English-speaking friends.
"Even as many cultural figures in Sweden after his death were quick to witness to his importance, they pissed on his memory by pointing out the many issues they disagreed with him. Their problem was often that they never read what he said correctly".
The Swedish writer Jan Myrdal, a giant among European intellectuals, has passed away after a 75-year career as a political writer, producing thousands of articles and more than 100 books. Above all he was a political activist who, in the spirit of Karl Marx, believed it was more important to change the world than merely to understand it. But he was best known for his encyclopedic knowledge of social history and discourse and for his constant encouragement to readers to “Go to the sources!” so as to be able to speak truth to those in power. In his youth he was an organized communist but later chose to describe himself as an independent socialist thinker who could contribute more to popular struggles in his role as a disciplined and factually unchallengeable political writer.
An avid reader from a young age, he credited a school year spent in New York as a child with his early exposure to the debates over class struggle and mass politics in the era of the New Deal, and with his appreciation of freedom of speech as well as of American writers.
One of his first books, The Careerist, exposed the corruption and degeneration afflicting the reformist social democratic movement whose ideals had begun to transform Sweden into a modern welfare state. In the sixties he began producing his travel books, which examined societies where he stayed for long periods, such as Turkmenistan, Afghanistan and Angkor Wat, as well as the first of a series on life in a Chinese village during the Cultural Revolution, which was followed by many return visits.
But it was his book Confessions of a Disloyal European, with its third-world perspective, that was to have an enormous impact on anti-imperialists and leftists in many countries. His wife of more than 50 years, the painter and photographer Gun Kessle, worked with him in documenting these travels. Back in Sweden he became a leading Vietnam activist and supporter of national liberation struggles.
During these years he also finished his monumental work India Waits. His anti-imperialism led him to visit the Khmer Rouge-held areas during the Vietnamese invasion and, in his 80s, the liberated Naxalite-controlled areas of India. He participated in the World Tribunal on Iraq in Istanbul in 2005, where he compared the war propaganda used by the imperialists in the two world wars, which he had explored in his Selling War Like Margarine, with the false language used by the US in the Iraq invasion and occupation. Al Intiqad, the magazine of Hezbollah, described him as one of Europe's leading intellectuals and published a long interview with him in which he discussed the historical role of religion and war.
But he was also an important figure in Swedish literature, like the Swedish writer August Strindberg, whom he considered his inspiration. His writing had a distinct, direct style, often using plain, spoken language and moving comfortably between tenses so that the past became the present for the reader. His texts were often long chains of logical reasoning with countless references or asides offering further insights. He was as challenging to read as a Chomsky who did not use commas.
Myrdal was famous for his texts on Balzac, Twain and Dickens, all of whose work he explored and cited in his own writings. He became famous as a prose writer with his I-stories. The first three, the Childhood trilogy, used the perspective of a child's growing awareness as he grows into young adulthood. This resulted in his victory in a libel case brought by his Nobel Prize-winning parents.
The series continued with Tomorrows, in which, as a young teenage radical, he commits himself to the political and personal discipline demanded of a revolutionary committed to making a better world. His last work in this genre used the perspective of old age to look back at the recollections of his past. Even that became a bestseller in Sweden.
A true historical materialist, he gave all his writings an underclass perspective set against the superstructure of capitalist politics and culture. He was one of the founders of Folket i Bild/Kulturfront (People in Pictures/the Cultural Front), a broad-based movement and magazine defending people's culture, freedom of speech and anti-imperialism, in which he wrote a regular column for almost half a century.
Myrdal was a controversial figure, greatly admired by his followers and hated by his detractors. He had no regrets for the positions he took. Even as many cultural figures in Sweden were quick after his death to attest to his importance, they pissed on his memory by pointing out the many issues on which they disagreed with him. Their problem was often that they had never properly read what he said. For those of us who followed his writings for half a century, he provided inspiration and pointed out a way to understand our world and to act to change it.
Mike Powers
The ritual was to arrive at work half an hour early, so I could gradually wake up in the car listening to the radio, drinking coffee, and eating doughnuts. I’d park my Honda Fit beside the site foreman’s pick-up truck. His morning pre-shift was like mine, except that his breakfast was vodka-soda with painkillers. Another two labourers usually arrived after I did: an irritable six-foot-three indigenous guy called by everyone, including himself, the “BFI,” which stood for “Big F-cking Indian”; and a cocaine-addled Italian who split “a gram or two with the wife” nightly, pairing it with a three-litre bottle of red wine. He claimed to sleep only two hours, which I never doubted, since he had to commute an hour to get on site at 6am. Of my colleagues, only the BFI always worked sober, having survived years of alcoholism (not to mention some prison time).
At age 20, I’d started my first week in construction, excavating a commercial space for a liquor store. The dark pits of freshly-dug soil gave the air a musty sweetness that stuck in the back of the throat. We’d spend 12-hour days digging trenches in the subterranean dark, and then fill them up with concrete. The ready mix splashed onto my skin and made my eyes burn, while men yelled monosyllabic instructions over the din of engines. The air smelled of diesel, with notes of liquid metal thanks to the welders. On break, we made our way outside, the only time we saw the sun, to immediately contaminate the fresh air with a round of cigarettes. True to stereotype, not one woman escaped our gaze. They were something to look at that wasn’t steel, dirt, dust, or rock.
This is how some men spend the majority of their lives.
I say “men” because, in my chosen subspecialty of concrete (whose ranks include those formally designated in the United States under the category of “cement masons, concrete finishers, and terrazzo workers”), the work force is 98.9 percent male. According to 2018 data (collected well before the COVID-19 pandemic), the average annual salary is about $42,000, significantly less than the national average of $54,000. In this industry, 50 is considered old. And working past 60 is almost unheard of. Most of the men I worked with had little formal education. Many had a criminal record. Men working in construction and extraction have the highest suicide rate of any industry, as well as the highest rate of opioid addiction, and (predictably) [overdoses](https://www.sciencedaily.com/releases/2019/10/191030082825.htm). Alcoholism rates are second only to the mining industry. It’s a rough crowd doing hard work. So you can see why employers might have difficulty addressing their gender imbalance.
Men work construction jobs because they need the money. But they also take pride in their daily work product, and the more general fact that they build and fix the concrete world that we all need. There’s usually a strong work ethic on display, too, even if it doesn’t always manifest itself as what many of us would describe as professionalism per se.
To the extent construction workers are discussed at all in the media or popular culture, it’s usually by reference to stereotypically negative attributes, such as sexist leering, foul language, and substance abuse. Unless you are embedded in this world, you’ll miss the offsetting positive aspects, including the unspoken code that exists among most crews: (1) Do the best work you can, without creating more work for others; (2) don’t shirk the dirtiest or hardest task; (3) obey your direct boss, but remain suspicious of authority more generally, especially when it walks on to the site with clean hands and nice shoes (young engineers tend to be particular objects of scorn); (4) never rat. If someone’s alcohol or drug problem is out of hand, let the supervisor address it. If your colleague gets fired because you blew the whistle, you may lose something more precious than a job.
While I was doing interviews for this article, two unionized municipal construction workers told me, off the record, “There are only two rules with Percocet: One, never talk about perkies. Two, do you have any?” The high level of opioid use among construction workers arises from the need to alleviate pain. Many workers freely offer stories about past accidents and the ensuing surgeries. In other cases, it’s a matter of repetitive stress and bodily wear and tear, including slipped disks and rotator-cuff issues. Opioids are especially helpful for contract labourers who don’t have union protection or job benefits. Without work, they have no money, so they rely on pills to stay on site.
Eventually, of course, avoidance of withdrawal symptoms becomes the dominant priority. And one friend of mine fell off the workforce when he could no longer find a steady supply of pills. The symptoms of sudden abstinence, which often start with vomiting and diarrhoea, can sometimes be life-threatening. To save a colleague from unemployment, and possibly from falling into a deadly spiral, a few men relinquished some of their own pills as an act of charity, knowing the roles could be reversed one day.
On the sites I worked, Percocet went for between $3 and $5 per 5mg dose. The more potent 80 mg OxyContins went for $80. (The active ingredient in both is oxycodone.) Labourers are rarely prescribed enough by their doctors to feed their addictions, and so they buy or trade amongst one another. Some spend upward of $500 per week, and have to enter into informal buy-and-sell agreements, somewhat comparable to stock options. One supervisor habitually secured a high volume of Percocets through his monthly prescription, and would sell a portion at the beginning of the month, with the understanding that he’d buy some back at an agreed upon price when his supply ran out. He did this to prevent himself from doing all his pills at once.
During my first four years of occasional construction work, from 2014 to 2018, almost 5,000 workers in this field died on the job in the United States. But those figures include only private-sector construction work, and exclude associated suicides, accidental overdoses, as well as traffic accidents while commuting to and from work. Even according to the lowest figures, the on-the-job fatal injury count for hardhats is higher than for any other occupation, mostly because of what’s sometimes referred to by the US Labor Department as the “fatal four”: falling, getting struck by an object, electrocutions, and “getting caught in (or in between) things.”
America’s most revered professions include emergency responders: police officers, firefighters, and paramedics. During the COVID-19 pandemic, health workers of all kinds have been properly showered with gratitude as well. In 2019, however, workers in these areas lost a combined 150 lives, about one-seventh the total deaths in construction in a typical year. Even America’s soldiers have suffered fewer absolute losses than construction workers in recent years.
Soldiers and first responders enjoy an elevated status because they work to protect us from obvious threats—foreign attack, terrorism, disease. If construction is done successfully, on the other hand, there is no threat (unless you count nature, which few do, since most of us now simply take protection from the elements for granted). In addition, it is assumed that soldiers and first responders choose their jobs, whereas labourers have merely accepted theirs with resignation, because they couldn’t find anything better.
Construction workers lack the aesthetic of heroes. George Orwell observed in The Road to Wigan Pier (1937) that the main reason the working classes, coal miners in particular, were looked down upon was not because of some abstract quality such as mind or character, but because of the way they struck the senses of more refined observers:
It may not greatly matter if the average middle-class person is brought up to believe that the working classes are ignorant, lazy, drunken, boorish, and dishonest; it is when he is brought up to believe that they are dirty that the harm is done… And in my childhood, we were brought up to believe that they were dirty. Very early in life, you acquired the idea that there was something subtly repulsive about a working-class body; you would not get nearer to it than you could help.
Orwell’s view is somewhat dated, of course. Mines and other industrial facilities now require fewer workers, and are more dependent on highly skilled technicians to operate the machines that do most of the work. In our post-industrial world, moreover, hipsters now have become enamored with certain kinds of blue-collar work. But these infatuations tend to focus on artisanal subcultures, such as fine woodworking, custom-made bicycles, or craft cideries. Day-to-day construction work doesn’t qualify: I’ve yet to encounter an ambitious student who dreams of tying rebar or pouring concrete. In fact, the lifestyle is sometimes similar to one that Orwell might recognize. There were some weeks when, after dawn-to-dusk shifts, I would climb into bed without showering, in my dirty and smelly workwear, from sheer exhaustion, and for the convenience of not having to change in the morning.
Scene from Total Recall
In fiction, labourers have featured prominently in niches such as communist proletarian literature and gay romance, though neither presents any kind of realistic image of working life. On television, arguably the most popular show with a blue-collar construction theme is the children’s cartoon, Bob the Builder (which portrays the life of a building contractor with the same level of accuracy as an anthropologist might find in a plotline from Dora the Explorer). On the silver screen, similarly, we got Emmet from The Lego Movie in 2014. One of the few memorable construction-worker heroes in a Hollywood movie aimed at adults was Douglas Quaid, played by Arnold Schwarzenegger in Total Recall. The 1990 movie had been adapted from Philip K. Dick’s short story, We Can Remember It for You Wholesale (1966), whose protagonist was an office clerk. It was only because of Schwarzenegger’s physique that a blue-collar back story was substituted.
In music, there was once a fashion for socialist propaganda songs, including those produced by the Wobblies, the nickname for members of the Industrial Workers of the World (IWW). Perhaps the most famous was Joe Hill’s The Preacher and The Slave (1911). As consolation for their meager rations and impoverished lives, a preacher assures workers, they’ll get food in heaven—which is how we got the expression “pie in the sky”:
Long-haired preachers come out every night
Try to tell you what’s wrong and what’s right
But when asked about something to eat
They will answer in voices so sweet
You will eat, you will eat, by and by
In that glorious land in the sky, way up high
Work and pray, live on hay
You’ll get pie in the sky when you die, that’s a lie
(In Animal Farm, Orwell borrowed a similar concept, with Moses the crow encouraging farm animals to ignore their agonies on Earth and instead imagine the pleasures to be enjoyed in the great beyond, a land that he promised was real, and which he called “Sugarcandy Mountain.”)
The revival of Zionism in the 19th century provides one notable cultural genre in which the common labourer received heroic treatment in a way that transcended merely socialist tropes. This included the so-called Muscular Judaism movement, presented by Zionist leader Max Nordau as an answer to the caricature of Jews as meek and cowardly parasites who got by on guile instead of effort. Long before the rise of the Nazi menace, he argued in a 1903 article (Muskeljudentum, “Jewry of Muscle”) that physical strength was essential to enable Jews both to combat anti-Semitic prejudice and to develop a revived national identity. He called on the diaspora: “Let us once more become deep-chested, sturdy, sharp-eyed men.”
A second influence was Avoda Ivrit, or “Hebrew Labour,” which had a mystical element. Championed by Zionist ideologue A. D. Gordon, this movement held that Jewish immersion in the holy land could be properly achieved only through manual work. Of course, these labourers were not like the modern wage-earners I’ve written about here. These were self-sustaining agrarians who channeled their efforts into an explicitly nationalistic, collective project. However, Israel’s founding fathers did at least give workers their proper due alongside other callings.
A few years ago, while walking with a friend who had worked alongside me at the same construction company, we saw a car veer off the road and smash through the wall of a convenience store. Working a few blocks away, members of a construction crew heard the accident and sprinted over. They pulled the female driver out of her smoking car and laid her out on the sidewalk. These were the only people to help before the ambulance arrived, while the other, more “respectable,” bystanders held up their phones to record everything. My friend reflected, “It makes me proud to have worked construction.”
This sentimental bit of camaraderie stuck with me, but only until the next day. Back on site, a co-worker noticed a girl across the street while we were on break. He shouted my way, “I’d f–ck the hole off of her.” If anyone had been filming him, it’s a scene that would have gone viral among those with clean hands and nice shoes. In Orwell’s day, the privileged set had to get up close and personal to develop their disdain for the working class. These days, thanks to Twitter, they can do it without getting out of bed.
Michael Humeniuk is a former construction worker.
A considerable amount of baggage has become attached to the word “European” over the half-millennium that Europe has dominated the world. There’s the geographical meaning – from the Atlantic to the Urals – but, because Europe is a peninsula on the western end of Asia, the frontier is subject to debate. Diplomats sometimes use the word to mean members of the European Union. But the most important meaning is the value-laden one – to be “European” is to be modern, civilised, rational, to hold “values”, to be successful. To be powerful. Not to be “European” is to be none of these things, perhaps even their opposites. Europeans are rulers and exemplars; the others are subjects and inferiors. Throughout the period of European domination, to be considered “European” was favoured and to adopt European habits, dress styles, education and appearance was desirable. Not to be “European”, on the other hand, was an insult: your culture didn’t make the grade. This meaning is commonly found today, especially in the smug phrase “European values”.
I have been considering writing this essay for some years but have put off doing so because I know that for many readers “Europe” means “best” and to say Russia is not European is to say that it’s not good enough. But at last President Putin has given me the opening: “Россия – это не просто страна, это действительно отдельная цивилизация”. “Russia is not simply a country, it is really a separate civilisation”. And who would dare disagree with him?
I have always regarded Russia as, to use Macron’s term, a civilisation-state. It is its own thing – not European, not Asiatic; it is Russian. If we use Toynbee’s nomenclature, it, like Western Christendom, is a daughter society of the Hellenic society.
To make my argument I will use Toynbee’s methodology in his Study of History to determine what he calls a “society” – a distinct, self-contained entity about which history on the largest scale can be studied. Is Britain one of these? Is it, as many Britons thought in his day, a stand-alone culture? His approach was to imagine a history of Britain in a series of chapters. Let us start the book with a first chapter: Celtic Britain. Immediately there is a problem, because a huge footnote has to be inserted to explain who the Celts were and where they came from, because they didn’t originate in Britain; they arrived there fully-formed, so to speak. Then Chapter 2 might be Roman Britain. Again a huge footnote to explain the Romans’ non-British origins and history. Then Chapter 3 about the Saxons, and again a big footnote. Chapter 4, the Normans, and so on. In short, each chapter of British history leads one to huge digressions outside of Britain; therefore, Toynbee argued, Britain must be a part of some other society which has a more-or-less self-contained story – Celts, Romans, Saxons and Normans all originate in Europe; no footnotes are needed. This seems to me to be a powerful argument.
Let us apply it to Russia and Europe. We’ll start our European history – you have to start somewhere – with Chapter 1, The Roman Empire. We’d speak about its origin, its conquests, its decay, its legacy. There’s no similar chapter in our Russian book: Russia wasn’t part of the Roman Empire and, in fact, there isn’t much history of Russia up until the 800s. Chapter 2 of our European history book would probably be Christianity; Russia and Europe share that, but again there’s a big difference. The Roman Empire became officially Christian in the early 300s and the religion spread throughout the Empire. Missionaries from Europe spread the word out to and past the limits of the Empire, to Germany and Ireland. The Russian experience is both later and different: Grand Duke Vladimir made a conscious, top-down decision to Christianise and adopted the Christianity of Byzantium; European Christianity was Rome-centred from the start. Chapter 3 of our European history book would cover Charlemagne and the re-creation – independently of Constantinople – of a Christian Roman Empire centred on the formerly pagan and barbarian invaders; nothing like that in Russia, which still has two centuries to go before it is Christianised. Chapter 4 might be the Empire-Papacy struggle – nothing like that in Russia. Chapter 5 is the Renaissance, and again there is no equivalent in Russia. In fact, you could write most of the European history book without ever mentioning the word “Russia” up until the 1700s.
What of the Russian history book? Its Chapter 1 would probably be about the Varangians and the creation of a region of loosely connected city states at least nominally Orthodox; much of this story would be somewhat mythical or archaeological. Chapter 2 would cover the development of what is now called Kievan Rus, the trade with Byzantium and the many contacts with Europe – a Russian became Queen of France. At this point one could argue (leaving aside the growing importance of the difference of religion particularly after the Great Schism of 1054) that Russia and Europe might have become so entwined as to become one. But our Russian Chapter 3 brings the difference that is all the difference: The Mongols. In a series of lightning campaigns the Mongol forces overran the Russias, destroyed Kiev and forced all the Russian principalities to submit to Mongol rule and to give tribute. Nothing like this happened in Europe, although it might have: the Mongol forces retreated from Hungary in 1242 and never returned. This is another Great If of history; had the Mongols continued to the Atlantic, a second possible entwining of Russia and Europe might have happened. But they departed Europe and remained in Russia.
Much has been written about the effect of Mongol rule on Russia’s development but all agree that it shaped its future very strongly. The two and a half centuries of what the Russians call the “Tatar yoke” cover a time in contemporary Europe that begins when Thomas Aquinas is a boy and ends when Columbus is a young man – a period of enormous change in European civilisation. But in Russia they are years of compliance, endurance and resistance. The recovery of the “Russian Lands” was led by Muscovy, formerly a not very important part of Russia. The textbook date for the end of the “Tatar yoke” was the withdrawal of Mongol forces in the face of a Russian army at the Ugra River in 1480 but it was actually only with Catherine’s regathering of Crimea and “New Russia” in the late 1700s that the very last Mongol ruler of Russian Lands was displaced.
So, our hypothetical European and Russian history books have quite different chapters and that means that they have quite different histories; we’re talking about two things, not one thing.
Europe became immensely powerful in the 1500s and conquered the rest of the world; even minor European players like Belgium snatched a piece for themselves. Even mighty China was subjugated – its “century of humiliation”. Russia was one of the very few exceptions; despite several tries, Europe never conquered it. Peter the Great Europeanised Russia, built a navy, founded the gun factories at Tula, shaved beards, eliminated caftans and required the upper classes to dress like French dancing masters. He did it in order to better prepare Russia to fight Sweden, at that time the dominant power in the area. When Charles XII was defeated by Peter at Poltava in 1709, Russia arrived on the European scene as a great power that had to be taken into account. A century later, Emperor Alexander was one of the five people who redesigned Europe.
Europeans underestimate the importance of their skill at war, preferring to think that it was their values or their political skills or their modernity or their science that made them pre-eminent for five centuries. But their killing power (and their killing diseases) were mighty allies: “Whatever happens, we have got The Maxim Gun, and they have not”. Peter, facing attack from Europe, learned European killing ways and so Russia remained independent. Many resisted Western aggression and failed – Tecumseh, Túpac Amaru, Cetshwayo, the Rani of Jhansi – but Peter succeeded. In short, Russia’s (and Japan’s) voluntary Europeanisation was motivated by the desire to learn the European way of war so as to keep independence. At Poltava in 1709, at Vienna in 1814, at Berlin in 1945, an independent Russia became a major force in Europe.
The realities that Europe was never able to conquer it, that Russians look and sound like Europeans on the surface, that in the European constellation Russia is a Great Power have caused no little confusion. Many people have come to believe that Russia is a part of European civilisation but a defective part: a European country, but a bad one. But, once one realises that Russia is not a European country and has a quite different history that moved in parallel with little contact for centuries, one can see past these illusions. Different forces shaped it and different results happened.
Not inferior, not “Asiatic”, not uncivilised, not uncultured; different. A “civilisation state”. As is China.
Restoration of the automatons of Pierre and Henri-Louis Jaquet-Droz, the Writer, the Draughtsman and the Musician, by Thierry and Grégory Amstutz, Auvernier, Switzerland
Crime Pays But Botany Doesn't
WARNING ⚠ : THIS EPISODE SHOWS IMAGES THAT ARE AN UTTER BUMMER. Thin-skinned, easily-upset viewers will want to pass.
Botanizing a Toilet:
The bleak barren wasteland of neglected urban infrastructure serves as an example of an ecological phenomenon known as "primary succession"; here, however, the cast includes a patchwork of non-native species from all over the globe. What plant species are able to thrive amidst the homeless camps, human bleakness (wealth disparity 101), garbage and concrete? Join CPBBD as we explore the ecology of garbage, concrete and urine.
Are your tender and wholesome sensibilities offended by something said in this video (pointing directly at climate deniers here, among others) ? Rather than leaving a shit-flavored comment that will be removed, consider sending a hateful email to me instead. Nothing is easier or more fulfilling than deleting and blocking a shit-headed comment made by some semi-conscious, small-minded mouth-breather with minimal travel and life experience who thinks his shitty, rude parroted opinion based on info he gleaned from YouTube videos or angry opinion pieces is worth chiming in with. I do however enjoy argumentative conflict, so please indulge me @ crimepaysbutbotanydoesnt at geemale dot com
Walter Pitts was used to being bullied. He’d been born into a tough family in Prohibition-era Detroit, where his father, a boiler-maker, had no trouble raising his fists to get his way. The neighborhood boys weren’t much better. One afternoon in 1935, they chased him through the streets until he ducked into the local library to hide. The library was familiar ground, where he had taught himself Greek, Latin, logic, and mathematics—better than home, where his father insisted he drop out of school and go to work. Outside, the world was messy. Inside, it all made sense.
Not wanting to risk another run-in that night, Pitts stayed hidden until the library closed for the evening. Alone, he wandered through the stacks of books until he came across Principia Mathematica, a three-volume tome written by Bertrand Russell and Alfred Whitehead between 1910 and 1913, which attempted to reduce all of mathematics to pure logic. Pitts sat down and began to read. For three days he remained in the library until he had read each volume cover to cover—nearly 2,000 pages in all—and had identified several mistakes. Deciding that Bertrand Russell himself needed to know about these, the boy drafted a letter to Russell detailing the errors. Not only did Russell write back, he was so impressed that he invited Pitts to study with him as a graduate student at Cambridge University in England. Pitts couldn’t oblige him, though—he was only 12 years old. But three years later, when he heard that Russell would be visiting the University of Chicago, the 15-year-old ran away from home and headed for Illinois. He never saw his family again.
McCulloch was a confident, gray-eyed, wild-bearded, chain-smoking philosopher-poet who lived on whiskey and ice cream and never went to bed before 4 a.m.
In 1923, the year that Walter Pitts was born, a 25-year-old Warren McCulloch was also digesting the Principia. But that is where the similarities ended—McCulloch could not have come from a more different world. Born into a well-to-do East Coast family of lawyers, doctors, theologians, and engineers, McCulloch attended a private boys academy in New Jersey, then studied mathematics at Haverford College in Pennsylvania, then philosophy and psychology at Yale. In 1923 he was at Columbia, where he was studying “experimental aesthetics” and was about to earn his medical degree in neurophysiology. But McCulloch was a philosopher at heart. He wanted to know what it means to know. Freud had just published The Ego and the Id, and psychoanalysis was all the rage. McCulloch didn’t buy it—he felt certain that somehow the mysterious workings and failings of the mind were rooted in the purely mechanical firings of neurons in the brain.
Though they started at opposite ends of the socioeconomic spectrum, McCulloch and Pitts were destined to live, work, and die together. Along the way, they would create the first mechanistic theory of the mind, the first computational approach to neuroscience, the logical design of modern computers, and the pillars of artificial intelligence. But this is more than a story about a fruitful research collaboration. It is also about the bonds of friendship, the fragility of the mind, and the limits of logic’s ability to redeem a messy and imperfect world.
Walter Pitts Photo by: Estate of Francis Bello / Science Source
Standing face to face, they were an unlikely pair. McCulloch, 42 years old when he met Pitts, was a confident, gray-eyed, wild-bearded, chain-smoking philosopher-poet who lived on whiskey and ice cream and never went to bed before 4 a.m. Pitts, 18, was small and shy, with a long forehead that prematurely aged him, and a squat, duck-like, bespectacled face. McCulloch was a respected scientist. Pitts was a homeless runaway. He’d been hanging around the University of Chicago, working a menial job and sneaking into Russell’s lectures, where he met a young medical student named Jerome Lettvin. It was Lettvin who introduced the two men. The moment they spoke, they realized they shared a hero in common: Gottfried Leibniz. The 17th-century philosopher had attempted to create an alphabet of human thought, each letter of which represented a concept and could be combined and manipulated according to a set of logical rules to compute all knowledge—a vision that promised to transform the imperfect outside world into the rational sanctuary of a library.
McCulloch explained to Pitts that he was trying to model the brain with a Leibnizian logical calculus. He had been inspired by the Principia, in which Russell and Whitehead tried to show that all of mathematics could be built from the ground up using basic, indisputable logic. Their building block was the proposition—the simplest possible statement, either true or false. From there, they employed the fundamental operations of logic, like the conjunction (“and”), disjunction (“or”), and negation (“not”), to link propositions into increasingly complicated networks. From these simple propositions, they derived the full complexity of modern mathematics.
Which got McCulloch thinking about neurons. He knew that each of the brain’s nerve cells only fires after a minimum threshold has been reached: Enough of its neighboring nerve cells must send signals across the neuron’s synapses before it will fire off its own electrical spike. It occurred to McCulloch that this set-up was binary—either the neuron fires or it doesn’t. A neuron’s signal, he realized, is a proposition, and neurons seemed to work like logic gates, taking in multiple inputs and producing a single output. By varying a neuron’s firing threshold, it could be made to perform “and,” “or,” and “not” functions.
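To make that idea concrete, here is a minimal sketch of such a threshold unit in modern Python. This is my illustration, not McCulloch and Pitts' 1943 notation (which handled inhibition differently): inputs and outputs are 0 or 1, the unit fires when the weighted sum of its inputs reaches its threshold, and different weights and thresholds turn the same unit into an "and," "or," or "not" gate.

```python
# Illustrative sketch of a threshold ("McCulloch-Pitts-style") unit.
# Inputs and outputs are 0 or 1; the unit fires when the weighted sum
# of its inputs meets or exceeds its threshold.

def neuron(inputs, weights, threshold):
    return 1 if sum(i * w for i, w in zip(inputs, weights)) >= threshold else 0

# "and": fires only when both inputs fire (threshold 2)
AND = lambda a, b: neuron([a, b], [1, 1], threshold=2)
# "or": fires when at least one input fires (threshold 1)
OR = lambda a, b: neuron([a, b], [1, 1], threshold=1)
# "not": modelled here with a negative (inhibitory) weight and zero threshold,
# a modern simplification of the original inhibitory-input formalism
NOT = lambda a: neuron([a], [-1], threshold=0)

assert [AND(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]] == [0, 0, 0, 1]
assert [OR(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]] == [0, 1, 1, 1]
assert [NOT(a) for a in (0, 1)] == [1, 0]
```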
Late at night, McCulloch and Pitts alone would pour the whiskey, hunker down, and attempt to build a computational brain from the neuron up.
Fresh from reading a new paper by a British mathematician named Alan Turing which proved the possibility of a machine that could compute any function (so long as it was possible to do so in a finite number of steps), McCulloch became convinced that the brain was just such a machine—one which uses logic encoded in neural networks to compute. Neurons, he thought, could be linked together by the rules of logic to build more complex chains of thought, in the same way that the Principia linked chains of propositions to build complex mathematics.
As McCulloch explained his project, Pitts understood it immediately, and knew exactly which mathematical tools could be used. McCulloch, enchanted, invited the teen to live with him and his family in Hinsdale, a rural suburb on the outskirts of Chicago. The Hinsdale household was a bustling, free-spirited bohemia. Chicago intellectuals and literary types constantly dropped by the house to discuss poetry, psychology, and radical politics while Spanish Civil War and union songs blared from the phonograph. But late at night, when McCulloch’s wife Rook and the three children went to bed, McCulloch and Pitts alone would pour the whiskey, hunker down, and attempt to build a computational brain from the neuron up.
Before Pitts’ arrival, McCulloch had hit a wall: There was nothing stopping chains of neurons from twisting themselves into loops, so that the output of the last neuron in a chain became the input of the first—a neural network chasing its tail. McCulloch had no idea how to model that mathematically. From the point of view of logic, a loop smells a lot like paradox: the consequent becomes the antecedent, the effect becomes the cause. McCulloch had been labeling each link in the chain with a time stamp, so that if the first neuron fired at time t, the next one fired at t+1, and so on. But when the chains circled back, t+1 suddenly came before t.
Pitts knew how to tackle the problem. He used modulo mathematics, which deals with numbers that circle back around on themselves like the hours of a clock. He showed McCulloch that the paradox of time t+1 coming before time t wasn’t a paradox at all, because in his calculations “before” and “after” lost their meaning. Time was removed from the equation altogether. If one were to see a lightning bolt flash in the sky, the eyes would send a signal to the brain, shuffling it through a chain of neurons. Starting with any given neuron in the chain, you could retrace the signal’s steps and figure out just how long ago lightning struck. Unless, that is, the chain is a loop. In that case, the information encoding the lightning bolt just spins in circles, endlessly. It bears no connection to the time at which the lightning actually occurred. It becomes, as McCulloch put it, “an idea wrenched out of time.” In other words, a memory.
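A toy simulation (my own illustration, not Pitts' actual modulo calculus) shows why a loop behaves like a memory: feed a unit's output back into its own input, and a single flash of stimulus keeps it firing at every later time step, long after the stimulus itself is gone.

```python
# Toy illustration of a "loop as memory": a single unit whose output is fed
# back as one of its inputs. Once the external stimulus fires it, the feedback
# keeps it firing at every subsequent step, so the network "remembers"
# that the stimulus occurred, without recording when.

def step(state, stimulus, threshold=1):
    # fires if the external stimulus or its own previous output reaches threshold
    return 1 if (stimulus + state) >= threshold else 0

state = 0
stimulus_train = [0, 0, 1, 0, 0, 0]   # a single flash at time t = 2
history = []
for s in stimulus_train:
    state = step(state, s)
    history.append(state)

print(history)  # [0, 0, 1, 1, 1, 1] -- the flash is retained after it ends
```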
By the time Pitts finished calculating, he and McCulloch had on their hands a mechanistic model of the mind, the first application of computation to the brain, and the first argument that the brain, at bottom, is an information processor. By stringing simple binary neurons into chains and loops, they had shown that the brain could implement every possible logical operation and compute anything that could be computed by one of Turing’s hypothetical machines. Thanks to those ouroboric loops, they had also found a way for the brain to abstract a piece of information, hang on to it, and abstract it yet again, creating rich, elaborate hierarchies of lingering ideas in a process we call “thinking.”
McCulloch and Pitts wrote up their findings in a now-seminal paper, “A Logical Calculus of Ideas Immanent in Nervous Activity,” published in the Bulletin of Mathematical Biophysics. Their model was vastly oversimplified for a biological brain, but it succeeded at showing a proof of principle. Thought, they said, need not be shrouded in Freudian mysticism or engaged in struggles between ego and id. “For the first time in the history of science,” McCulloch announced to a group of philosophy students, “we know how we know.”
Pitts had found in McCulloch everything he had needed—acceptance, friendship, his intellectual other half, the father he never had. Although he had only lived in Hinsdale for a short time, the runaway would refer to McCulloch’s house as home for the rest of his life. For his part, McCulloch was just as enamored. In Pitts he had found a kindred spirit, his “bootlegged collaborator,” and a mind with the technical prowess to bring McCulloch’s half-formed notions to life. As he put it in a letter of reference about Pitts, “Would I had him with me always.”1
Pitts was soon to make a similar impression on one of the towering intellectual figures of the 20th century, the mathematician, philosopher, and founder of cybernetics, Norbert Wiener. In 1943, Lettvin brought Pitts into Wiener’s office at the Massachusetts Institute of Technology (MIT). Wiener didn’t introduce himself or make small talk. He simply walked Pitts over to a blackboard where he was working out a mathematical proof. As Wiener worked, Pitts chimed in with questions and suggestions. According to Lettvin, by the time they reached the second blackboard, it was clear that Wiener had found his new right-hand man. Wiener would later write that Pitts was “without question the strongest young scientist whom I have ever met … I should be extremely astonished if he does not prove to be one of the two or three most important scientists of his generation, not merely in America but in the world at large.”
So impressed was Wiener that he promised Pitts a Ph.D. in mathematics at MIT, despite the fact that he had never graduated from high school—something that the strict rules at the University of Chicago prohibited. It was an offer Pitts couldn’t refuse. By the fall of 1943, Pitts had moved into a Cambridge apartment, was enrolled as a special student at MIT, and was studying under one of the most influential scientists in the world. It was quite a long way from blue-collar Detroit.
Wiener wanted Pitts to make his model of the brain more realistic. Despite the leaps Pitts and McCulloch had made, their work had made barely a ripple among brain scientists—in part because the symbolic logic they’d employed was hard to decipher, but also because their stark and oversimplified model didn’t capture the full messiness of the biological brain. Wiener, however, understood the implications of what they’d done, and knew that a more realistic model would be game-changing. He also realized that it ought to be possible for Pitts’ neural networks to be implemented in man-made machines, ushering in his dream of a cybernetic revolution. Wiener figured that if Pitts was going to make a realistic model of the brain’s 100 billion interconnected neurons, he was going to need statistics on his side. And statistics and probability theory were Wiener’s area of expertise. After all, it had been Wiener who discovered a precise mathematical definition of information: The higher the probability, the higher the entropy and the lower the information content.
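In modern Shannon-style notation, which captures the idea even though it is not a quotation of Wiener's own formulas, the information conveyed by an outcome $x$ with probability $p(x)$, and the entropy of a source $X$, are

$$ I(x) = -\log_2 p(x), \qquad H(X) = -\sum_x p(x)\,\log_2 p(x). $$

Read for a single outcome, a more probable event conveys less information; read for whole states, as Wiener and the statistical mechanicians did, highly probable (disordered) configurations are the high-entropy, low-information ones, which is the sense of the sentence above.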
The scientists in the room were floored. And yet, everyone who knew Pitts was sure that he could do it.
As Pitts began his work at MIT, he realized that although genetics must encode for gross neural features, there was no way our genes could pre-determine the trillions of synaptic connections in the brain—the amount of information it would require was untenable. It must be the case, he figured, that we all start out with essentially random neural networks—highly probable states containing negligible information (a thesis that continues to be debated to the present day). He suspected that by altering the thresholds of neurons over time, randomness could give way to order and information could emerge. He set out to model the process using statistical mechanics. Wiener excitedly cheered him on, because he knew if such a model were embodied in a machine, that machine could learn.
“I now understand at once some seven-eighths of what Wiener says, which I am told is something of an achievement,” Pitts wrote in a letter to McCulloch in December of 1943, some three months after he’d arrived. His work with Wiener was “to constitute the first adequate discussion of statistical mechanics, understood in the most general possible sense, so that it includes for example the problem of deriving the psychological, or statistical, laws of behavior from the microscopic laws of neurophysiology … Doesn’t it sound fine?”
That winter, Wiener brought Pitts to a conference he organized in Princeton with the mathematician and physicist John von Neumann, who was equally impressed with Pitts’ mind. Thus formed the beginnings of the group who would become known as the cyberneticians, with Wiener, Pitts, McCulloch, Lettvin, and von Neumann its core. And among this rarified group, the formerly homeless runaway stood out. “None of us would think of publishing a paper without his corrections and approval,” McCulloch wrote. “[Pitts] was in no uncertain terms the genius of our group,” said Lettvin. “He was absolutely incomparable in the scholarship of chemistry, physics, of everything you could talk about history, botany, etc. When you asked him a question, you would get back a whole textbook … To him, the world was connected in a very complex and wonderful fashion.”2
The following June, 1945, von Neumann penned what would become a historic document entitled “First Draft of a Report on the EDVAC,” the first published description of a stored-program binary computing machine—the modern computer. The EDVAC’s predecessor, the ENIAC, which took up 1,800 square feet of space in Philadelphia, was more like a giant electronic calculator than a computer. It was possible to reprogram the thing, but it took several operators several weeks to reroute all the wires and switches to do it. Von Neumann realized that it might not be necessary to rewire the machine every time you wanted it to perform a new function. If you could take each configuration of the switches and wires, abstract them, and encode them symbolically as pure information, you could feed them into the computer the same way you’d feed it data, only now the data would include the very programs that manipulate the data. Without having to rewire a thing, you’d have a universal Turing machine.
To accomplish this, von Neumann suggested modeling the computer after Pitts and McCulloch’s neural networks. In place of neurons, he suggested vacuum tubes, which would serve as logic gates, and by stringing them together exactly as Pitts and McCulloch had discovered, you could carry out any computation. To store the programs as data, the computer would need something new: a memory. That’s where Pitts’ loops came into play. “An element which stimulates itself will hold a stimulus indefinitely,” von Neumann wrote in his report, echoing Pitts and employing his modulo mathematics. He detailed every aspect of this new computational architecture. In the entire report, he cited only a single paper: “A Logical Calculus” by McCulloch and Pitts.
By 1946, Pitts was living on Beacon Street in Boston with Oliver Selfridge, an MIT student who would become “the father of machine perception”; Hyman Minsky, the future economist; and Lettvin. He was teaching mathematical logic at MIT and working with Wiener on the statistical mechanics of the brain. The following year, at the Second Cybernetic Conference, Pitts announced that he was writing his doctoral dissertation on probabilistic three-dimensional neural networks. The scientists in the room were floored. “Ambitious” was hardly the word to describe the mathematical skill that it would take to pull off such a feat. And yet, everyone who knew Pitts was sure that he could do it. They would be waiting with bated breath.
In a letter to the philosopher Rudolf Carnap, McCulloch catalogued Pitts’ achievements. “He is the most omniverous of scientists and scholars. He has become an excellent dye chemist, a good mammalogist, he knows the sedges, mushrooms and the birds of New England. He knows neuroanatomy and neurophysiology from their original sources in Greek, Latin, Italian, Spanish, Portuguese, and German for he learns any language he needs as soon as he needs it. Things like electrical circuit theory and the practical soldering in of power, lighting, and radio circuits he does himself. In my long life, I have never seen a man so erudite or so really practical.” Even the media took notice. In June 1954, Fortune magazine ran an article featuring the 20 most talented scientists under 40; Pitts was featured, next to Claude Shannon and James Watson. Against all odds, Walter Pitts had skyrocketed into scientific stardom.
Some years earlier, in a letter to McCulloch, Pitts wrote “About once a week now I become violently homesick to talk all evening and all night to you.” Despite his success, Pitts had become homesick—and home meant McCulloch. He was coming to believe that if he could work with McCulloch again, he would be happier, more productive, and more likely to break new ground. McCulloch, too, seemed to be floundering without his bootlegged collaborator.
Suddenly, the clouds broke. In 1952, Jerry Wiesner, associate director of MIT’s Research Laboratory of Electronics, invited McCulloch to head a new project on brain science at MIT. McCulloch jumped at the opportunity—because it meant he would be working with Pitts again. He traded his full professorship and his large Hinsdale home for a research associate title and a crappy apartment in Cambridge, and couldn’t have been happier about it. The plan for the project was to use the full arsenal of information theory, neurophysiology, statistical mechanics, and computing machines to understand how the brain gives rise to the mind. Lettvin, along with the young neuroscientist Patrick Wall, joined McCulloch and Pitts at their new headquarters in Building 20 on Vassar Street. They posted a sign on the door: Experimental Epistemology.
With Pitts and McCulloch together again, and with Wiener and Lettvin in the mix, everything seemed poised for progress and revolution. Neuroscience, cybernetics, artificial intelligence, computer science—it was all on the brink of an intellectual explosion. The sky—or the mind—was the limit.
He began drinking heavily and pulled away from his friends. He set fire to his dissertation along with all of his notes and his papers.
There was just one person who wasn’t happy about the reunion: Wiener’s wife. Margaret Wiener was, by all accounts, a controlling, conservative prude—and she despised McCulloch’s influence on her husband. McCulloch hosted wild get-togethers at his family farm in Old Lyme, Connecticut, where ideas roamed free and everyone went skinny-dipping. It had been one thing when McCulloch was in Chicago, but now he was coming to Cambridge and Margaret wouldn’t have it. And so she invented a story. She sat Wiener down and informed him that when their daughter, Barbara, had stayed at McCulloch’s house in Chicago, several of “his boys” had seduced her. Wiener immediately sent an angry telegram to Wiesner: “Please inform [Pitts and Lettvin] that all connection between me and your projects is permanently abolished. They are your problem. Wiener.” He never spoke to Pitts again. And he never told him why.3
For Pitts, this marked the beginning of the end. Wiener, who had taken on a fatherly role in his life, now abandoned him inexplicably. For Pitts, it wasn’t merely a loss. It was something far worse than that: It defied logic.
And then there were the frogs. In the basement of Building 20 at MIT, along with a garbage can full of crickets, Lettvin kept a group of them. At the time, biologists believed that the eye was like a photographic plate that passively recorded dots of light and sent them, dot for dot, to the brain, which did the heavy lifting of interpretation. Lettvin decided to put the idea to the test, opening up the frogs’ skulls and attaching electrodes to single fibers in their optic nerves.
Pitts with Jerome Lettvin and one subject of their experiments on visual perception. Photo: Wikipedia
Together with Pitts, McCulloch and the Chilean biologist and philosopher Humberto Maturana, he subjected the frogs to various visual experiences—brightening and dimming the lights, showing them color photographs of their natural habitat, magnetically dangling artificial flies—and recorded what the eye measured before it sent the information off to the brain. To everyone’s surprise, it didn’t merely record what it saw, but filtered and analyzed information about visual features like contrast, curvature, and movement. “The eye speaks to the brain in a language already highly organized and interpreted,” they reported in the now-seminal paper “What the Frog’s Eye Tells the Frog’s Brain,” published in 1959.
The results shook Pitts’ worldview to its core. Instead of the brain computing information digital neuron by digital neuron using the exacting implement of mathematical logic, messy, analog processes in the eye were doing at least part of the interpretive work. “It was apparent to him after we had done the frog’s eye that even if logic played a part, it didn’t play the important or central part that one would have expected,” Lettvin said. “It disappointed him. He would never admit it, but it seemed to add to his despair at the loss of Wiener’s friendship.”
Once everything had been reduced to information governed by logic, the actual mechanics ceased to matter—the tradeoff for universal computation was ontology.
The spate of bad news aggravated a depressive streak that Pitts had been struggling with for years. “I have a kind of personal woe I should like your advice on,” Pitts had written to McCulloch in one of his letters. “I have noticed in the last two or three years a growing tendency to a kind of melancholy apathy or depression. [Its] effect is to make the positive value seem to disappear from the world, so that nothing seems worth the effort of doing it, and whatever I do or what happens to me ceases to matter very greatly …”
In other words, Pitts was struggling with the very logic he had sought in life. Pitts wrote that his depression might be “common to all people with an excessively logical education who work in applied mathematics: It is a kind of pessimism resulting from an inability to believe in what people call the Principle of Induction, or the principle of the Uniformity of Nature. Since one cannot prove, or even render probable a priori, that the sun should rise tomorrow, we cannot really believe it shall.”
Now, alienated from Wiener, Pitts’ despair turned lethal. He began drinking heavily and pulled away from his friends. When he was offered his Ph.D., he refused to sign the paperwork. He set fire to his dissertation along with all of his notes and his papers. Years of work—important work that everyone in the community was eagerly awaiting—he burnt it all, priceless information reduced to entropy and ash. Wiesner offered Lettvin increased support for the lab if he could recover any bits of the dissertation. But it was all gone.
Pitts remained employed by MIT, but this was little more than a technicality; he hardly spoke to anyone and would frequently disappear. “We’d go hunting for him night after night,” Lettvin said. “Watching him destroy himself was a dreadful experience.” In a way Pitts was still 12 years old. He was still beaten, still a runaway, still hiding from the world in musty libraries. Only now his books took the shape of a bottle.
With McCulloch, Pitts had laid the foundations for cybernetics and artificial intelligence. They had steered psychiatry away from Freudian analysis and toward a mechanistic understanding of thought. They had shown that the brain computes and that mentation is the processing of information. In doing so, they had also shown how a machine could compute, providing the key inspiration for the architecture of modern computers. Thanks to their work, there was a moment in history when neuroscience, psychiatry, computer science, mathematical logic, and artificial intelligence were all one thing, following an idea first glimpsed by Leibniz—that man, machine, number, and mind all use information as a universal currency. What appeared on the surface to be very different ingredients of the world—hunks of metal, lumps of gray matter, scratches of ink on a page—were profoundly interchangeable.
There was a catch, though: This symbolic abstraction made the world transparent but the brain opaque. Once everything had been reduced to information governed by logic, the actual mechanics ceased to matter—the tradeoff for universal computation was ontology. Von Neumann was the first to see the problem. He expressed his concern to Wiener in a letter that anticipated the coming split between artificial intelligence on one side and neuroscience on the other. “After the great positive contribution of Turing-cum-Pitts-and-McCulloch is assimilated,” he wrote, “the situation is rather worse than better than before. Indeed these authors have demonstrated in absolute and hopeless generality that anything and everything … can be done by an appropriate mechanism, and specifically by a neural mechanism—and that even one, definite mechanism can be ‘universal.’ Inverting the argument: Nothing that we may know or learn about the functioning of the organism can give, without ‘microscopic,’ cytological work any clues regarding the further details of the neural mechanism.”
This universality made it impossible for Pitts to provide a model of the brain that was practical, and so his work was dismissed and more or less forgotten by the community of scientists working on the brain. What’s more, the experiment with the frogs had shown that a purely logical, purely brain-centered vision of thought had its limits. Nature had chosen the messiness of life over the austerity of logic, a choice Pitts likely could not comprehend. He had no way of knowing that while his ideas about the biological brain were not panning out, they were setting in motion the age of digital computing, the neural network approach to machine learning, and the so-called connectionist philosophy of mind. In his own mind, he had been defeated.
On Saturday, April 21, 1969, his hand shaking with an alcoholic’s delirium tremens, Pitts sent a letter from his room at Beth Israel Hospital in Boston to McCulloch’s room down the road at the Cardiac Intensive Care Ward at Peter Bent Brigham Hospital. “I understand you had a light coronary; … that you are attached to many sensors connected to panels and alarms continuously monitored by a nurse, and cannot in consequence turn over in bed. No doubt this is cybernetical. But it all makes me most abominably sad.” Pitts himself had been in the hospital for three weeks, having been admitted with liver problems and jaundice. On May 14, 1969 Walter Pitts died alone in a boarding house in Cambridge, of bleeding esophageal varices, a condition associated with cirrhosis of the liver. Four months later, McCulloch passed away, as if the existence of one without the other were simply illogical, a reverberating loop wrenched open.
Amanda Gefter is a physics writer and author of Trespassing on Einstein’s Lawn: A Father, a Daughter, the Meaning of Nothing, and the Beginning of Everything. She lives in Cambridge, Massachusetts.
References
1. All letters retrieved from the McCulloch Papers, BM139, Series I: Correspondence 1931–1968, Folder “Pitts, Walter.”
2. All Jerome Lettvin quotes taken from: Anderson, J.A. & Rosenfield, E. Talking Nets: An Oral History of Neural Networks. MIT Press (2000).
3. Conway, F. & Siegelman, J. Dark Hero of the Information Age: In Search of Norbert Wiener, the Father of Cybernetics. Basic Books, New York, NY (2006).
Access to historical letters was provided by the American Philosophical Society.
Introduction
At the New Year’s public opening of the Imperial Palace on January 2, 1969, a Japanese war veteran by the name of Okuzaki Kenzō (1920–2005) fired three pachinko pinballs from a slingshot aimed at Emperor Hirohito, who was standing 26.5 meters away on the veranda greeting about 15,000 visitors. All three hit the bottom of the veranda, missing Hirohito. Not many people seemed to notice that it was Okuzaki who fired them. Okuzaki then shot off one more, calling to the ghost of his war comrade and shouting, “Yamazaki, Shoot the Emperor (Hirohito) with a pistol!” Again he missed. Policemen on guard duty searched frantically for the perpetrator but could not identify him in the crowd. It was not certain whether Hirohito himself noticed the pinballs hitting the bottom of the veranda. Together with Hirohito, his wife Empress Ryōko, his two sons – Princes Akihito and Masahito – as well as their respective wives were also standing on the veranda, but it remains unclear whether any of them were aware of this incident.
Okuzaki approached one of the policemen frantically moving around the crowd and grabbed his arm, telling him, “It is me who shot the pinballs. Let’s go to the police station.” Obviously he did this intentionally, hoping to be arrested on the spot. Later he confessed that yelling “Yamazaki, Shoot the Emperor with a pistol!” was his tactic to attract police attention. He expected that the word “pistol” would immediately alert the police to the possibility of danger and that he would be arrested forthwith. Yet, disappointingly, this did not happen and therefore he had to ask a policeman to arrest him.1
Scene of the Pachinko Ball Attack, Emperor Hirohito at the Imperial Palace, New Year 1969
Okuzaki took this bizarre action in order to be arrested so that he could pursue Hirohito’s war responsibility in the Japanese court system. In his trial, Okuzaki argued that Chapter 1 of Japan’s Constitution (The Emperor), in particular Article 1, is unconstitutional.2 Yet the judges of the Tokyo High Court, and subsequently the Supreme Court, ignored Okuzaki’s argument. As far as I know, Okuzaki is the only person in Japan’s modern history to legally challenge the constitutionality of the emperor system, and indeed to provide a compelling analysis.
This paper investigates how Okuzaki Kenzō, a survivor of the New Guinea Campaign of the Japanese Imperial Army, legally challenged Emperor Hirohito and his constitutional authority by pursuing his war responsibility in court. It particularly examines Okuzaki’s legal claim that Chapter 1 (The Emperor) in Japan’s postwar Constitution is incompatible with the fundamental principle of the Constitution elaborated in the Preamble.
Okuzaki’s Personal Background Prior to the New Guinea Campaign
In order to understand the above-mentioned bizarre incident, it is necessary to look into Okuzaki’s personal background and war experience, as well as his immediate post-war life.
Okuzaki was born on February 1, 1920, in Akashi City, Hyogo Prefecture. In 1930, when he was 10 years old, Japan was hit by a severe economic slump triggered by the Great Depression, which began in the U.S. in October 1929. Consequently, in the first half of the 1930s, 2.5 million workers in Japan lost their jobs.3 One of them was Okuzaki’s father. Because of the acute poverty of Kenzō’s family, he had to start work as soon as he finished his six years of elementary schooling. Unable to find a permanent job, he did odd jobs, mainly as a shop-boy at different shops in Kobe, Ashiya, and Nishinomiya. He also worked as a trainee seaman for two years.
It seems that he had a strong appetite for knowledge, and when he had some spare time he read many books, including the Bible. He also attended church services for a short period.4 His interest in Christianity seems to have contributed to his strong sense of justice and to the unique idea of “god” that he developed in the latter part of his life.
In March 1941, he was drafted into the Engineering Corps in Okayama, and as a member of a group of 60 newly conscripted soldiers he was sent to the Engineering Division in Jiujiang in Central China. Here they received training for three months, after which they engaged in the construction of bridges and roads in the occupied territories as well as occasional combat against Chinese troops.
At the end of January 1943, twenty soldiers including Okuzaki were transferred to the 36th Independent Engineering Regiment (hereafter the 36th IER), and Okuzaki became one of 350 members of the 2nd Company of this Regiment. In late February of the same year, the 36th IER left China for Hansa on the north coast of East New Guinea on a convoy of transport ships, via Takao (Kao-hsiung) in Taiwan, Manila in the Philippines and Palau.5
Historical Background of the New Guinea Campaign
It is necessary to briefly look at the historical background of the Japanese Imperial Forces’ campaign in New Guinea in order to understand Okuzaki’s long and agonizing struggle for survival in this campaign.
Japanese war leaders, feeling exhilarated by an unexpected series of victorious battles in the first four months of the Pacific War after the Pearl Harbor Attack in December 1941, became overconfident. They swiftly expanded their war operation zone far beyond their capability to dominate it, leading eventually to the complete self-destruction of Japanese Imperial Forces.
As soon as Japan seized the entire southwest Pacific, the Navy leaders, who were initially cautious about expanding the war zone, began seriously contemplating invading Australia, believing that occupation of Australia was essential for defending the Pacific war zone. The Army leaders, preferring to save manpower and reinforce their operational capability for a future war against Soviet forces, strongly objected. As a compromise, the Navy and Army agreed to jointly carry out Operations MO and FS in order to cut off the transportation line between the U.S. and Australia. Operation MO was designed to capture Port Moresby on the southeast coast of New Guinea by May 1942, and Operation FS was intended to seize Fiji, Samoa and New Caledonia by July that year.6
However, as a result of Japan’s successive defeats in the battles of the Coral Sea, Midway, Guadalcanal, the Solomon Sea and the southwest Pacific between May and the end of 1942, the Imperial Headquarters called off Operation FS as well as the land attack on Port Moresby.
Yet, the Imperial Headquarters still had not given up on capturing Port Moresby. It drew up a new plan to send a large contingent of troops to Buna on the northeast coast of East New Guinea, and force them to march 360 kilometers to Port Moresby through dense jungle and over the Owen Stanley Range, which rises to some 4,000 meters above sea level. For this plan, in mid-August 1942, about 15,000 soldiers from the South Seas Army Force (SSAF) and the 41st Infantry Regiment were sent to Buna. The food supply ran out within one month and the plan was a complete failure. When it was finally decided to withdraw the forces in January 1943, only 3,000 were rescued: more than 70 percent of the men had perished from starvation and tropical disease.7
Despite this series of colossal strategic failures by the Japanese military leaders, and the heavy casualties that resulted less than a year after the opening of the war, no one assumed responsibility. In fact neither the Army nor the Navy ever conducted serious studies designed to find reasons for those failures, and they made no effort to learn from them. On the contrary, Imperial Headquarters continued to provide false information to the nation regarding the state of the war. This total lack of a sense of responsibility on the part of the Japanese Imperial Forces was closely intertwined with the emperor system. Under the Imperial Constitution, the emperor, grand marshal of the Imperial Forces, was completely free from mundane responsibilities, being “sacred and inviolable.” Because the head of both the state and the military was free from any war responsibility, no one else accepted responsibility either.8
Despite the disastrous failure of the plan to capture Port Moresby, the Imperial Headquarters came up with a new plan, this time to recapture Buna and seize Lae and Salamaua (Salus). Taking these three places would allow the Japanese to advance to Kerema on the south coast of New Guinea, 200 kilometers northeast of Port Moresby. After surrounding and occupying Kerema, the Japanese would then proceed to their final destination, Port Moresby. However, in order to complete even the first half of this expedition, the troops would have to march several hundred kilometers from the northeast coast to Kerema through dense jungle and mountains.
The plan was prepared by staff officers of the Imperial Headquarters in Tokyo who had no knowledge of the topography of New Guinea. They drew it up based on their own experience of warfare conducted in China, i.e., on battlefields of flat, wide and open plains. Many soldiers mobilized for this operation were also sent from Manchuria. They were utterly unfamiliar with combat in the tropical jungle environment.9
To carry out this inept and futile plan, from March 1943, many troops of the 18th Army landed on the northeast coast of New Guinea. Eventually as many as 148,000 soldiers were mobilized for the campaign including 1,200 soldiers of the 36th IER to which Okuzaki belonged. Most of these men wandered aimlessly about in the jungle, constantly pursued by Australian and American troops, while hovering between life and death due to lack of food, water, medicine and ammunition. Many of them even turned to cannibalism in order to survive. Eventually 135,000 men perished, mainly due to starvation and tropical diseases such as malaria and dysentery, and only 13,000 survived – the death rate was 91 percent.10
On the other hand, the Australian and U.S. forces had conducted a close study of the geographical features of New Guinea and decided to fully utilize aircraft and battleships to counterattack the Japanese troops. They avoided as much as possible sending their own troops into dense jungle, being clearly aware of the dangers of jungle fighting. Instead they adopted the strategy called “leapfrog” or “stepping-stones,” by which they captured and occupied only the vital strategic places on the north coast of New Guinea such as Madang, Wewak, Aitape and Hollandia. By doing so, the Australian and U.S. forces chased the Japanese troops towards the northwest coast of New Guinea by continuously conducting aerial bombing and naval bombardment. Many Japanese troops were caught between the Allied troops stationed at these places, and, while hiding in the jungle they starved to death.11
Okuzaki’s Desperate Struggle for Survival in New Guinea
The 36th IER, which landed at Hansa in early April, moved east down to Alexishafen, where they were assigned to build an airfield. As it was the rainy season, it took three months to transport all the heavy construction gear on wagons 200 kms along the trackless seacoast. By the time they arrived at Alexishafen, many soldiers were suffering from malaria and could not work. Although they managed to complete the construction of the airfield within the following few months, the Allied forces gained command of the air in this area before the end of 1943 and started bombing the airfield. The 18th Army headquarters’ base on the mountain called Nagata, located between Alexishafen and Madang, also became a target of Allied bombing. In December 1943, the Japanese forces therefore decided to retreat to the base in Wewak, 400 kms west of Alexishafen.12
A long and desperate struggle for survival by Okuzaki and his fellow soldiers of the 36th IER and other troops of the 18th Army began at this point. When they reached Wewak in January 1944, they were ordered to retreat further west to Hollandia in West (Dutch) New Guinea, 400 kms from Wewak. There was a Japanese base in Aitape, which was located almost half way to Hollandia. Yet, as mentioned above, the Japanese bases in Aitape and Hollandia were attacked and taken over by the Allied forces well before the Japanese troops even reached Aitape.13
A picture drawn by one of the surviving soldiers in New Guinea
Walking in the jungle and taking the long way around the Allied bases, it took Okuzaki 10 months to reach Hollandia, while most of his fellow soldiers perished in the jungle. Out of 350 members of the 2nd Company of the 36th IER, only Okuzaki and one other man survived – a survival rate of less than 0.6 percent. Of the 1,200 men of the entire 36th IER, a mere six survived – a survival rate of 0.5 percent.14
While walking in the bush near Hollandia, Okuzaki was shot by a small group of Allied soldiers. His right thigh was wounded and the little finger of his right hand was severely injured. Yet he managed to escape and kept wandering around Hollandia for a few more days, searching for a passage towards Sarmi, a further 400 kms west of Hollandia. Eventually he realized that he did not have the strength to keep walking any longer and thus chose to be killed by enemy bullets. He boldly walked into Hollandia and surrendered; instead of being shot, he was taken prisoner and, unexpectedly, was treated well. From there he was sent to a POW camp in Australia, where he remained until the end of the war.15
It seems that there are at least a couple of important reasons why Okuzaki survived. First, he was selected as one of about 20 men on the Regiment’s reconnaissance patrol – i.e., four or five men from each Company. Their primary mission was to locate the Japanese food deposits, most of which were in territories already occupied by the Allied forces, and to retrieve as many provisions as possible from them.16 It was quite a dangerous assignment, but by undertaking this task Okuzaki was able to obtain sufficient food for himself from time to time. As time passed, Okuzaki and the other members of this reconnaissance patrol became gradually separated from the rest of the troops as the patrol walked well ahead of them.
Eventually they were completely isolated from the many sick and starving soldiers left behind. As time passed, friction also developed between the members of the reconnaissance patrol from different Companies, and eventually the patrols ceased to act in any coordinated fashion. For this reason, Okuzaki was not clearly aware at the time that cannibalism had become a widespread problem among the Japanese soldiers left behind in the jungle. It was not until 1982-1983, during the production of the documentary film “Yuki Yuki te Shingun (Onward Holy Army),” that he learned what had really happened among those starving fellow soldiers he had left behind.17
Another important factor in his survival was his personal character – a strong sense of justice and deep anger at unfairness. It is well known that, in the Japanese Imperial Forces, ill-treatment of soldiers by their officers and NCOs was endemic. Bentatsu (routine striking and bashing) was regarded by officers as a form of “spiritual training” for the soldiers. Defiance or mutiny by soldiers against their officers was severely punished, often brutally. Yet Okuzaki frequently resisted orders given by his superiors if he found them “unreasonable” or “unfair,” and he did so even by resorting to violence. Surprisingly, his officers and NCOs did not punish Okuzaki for his behavior. It seems that, because officers and NCOs felt ashamed to publicize the fact that they had been beaten by a rank-and-file soldier like Okuzaki, they remained silent. Whatever the reason, Okuzaki soon came to be regarded as an eccentric and his “temperamental behavior” went unpunished within his own unit. Okuzaki’s ability to distance himself from ironclad military rules and to maintain his independence was an important factor in his survival in the horrendous conditions of jungle warfare.
In 1969, while waiting for the trial of his “pinball incident” crime, Okuzaki wrote a long statement in preparation for the trial. He was at the time locked up in detention for many months. This statement can be called an “autobiography”; Part I is predominantly the detailed description of his horrific experience in New Guinea, and Part II is about his post-war life up to the “pinball incident” and the reasons for his action against Emperor Hirohito.18
Okuzaki’s depiction of the one-and-a-half-year struggle for survival in New Guinea is strikingly graphic. Despite a 24-year time lag, his memories of what happened in New Guinea were so vivid that he could describe them as if they had happened yesterday. In other words, those memories were so powerful that it was impossible to eliminate them from his mind. He wrote of incidents such as a wild pig biting a sick soldier who could no longer stand up; a fellow soldier who had lost his mind after being attacked by a local villager with a poisoned arrow and could not stop calling Okuzaki’s name for help because of acute pain and deep fear of death; a soldier suffering from malaria and starvation begging Okuzaki to shoot and kill him (Okuzaki had walked away and left him behind); his own sense of shame for having blackmailed members of the reconnaissance patrol from a different Company in order to secure food provisions for the soldiers of his own Company; and one of his comrades, Yamazaki, perishing in the jungle despite his strong desire to go home and his humane concern for his fellow soldiers’ fate.
In short, this statement is not a simple historical or intellectual account of the Japanese military campaign in New Guinea. Rather it is an intense and compelling accusation of the victimization of Japanese soldiers by their own military leaders led by the Grand Marshal, Emperor Hirohito. Although Okuzaki did not clearly express it in this statement, he was in fact suggesting that the prosecutors and judges who would examine the “pinball incident” would themselves have to take “responsibility for this war victimization” if they chose to condemn Okuzaki’s conduct against Hirohito. In other words, the heart of his argument was the absurdity of the war imposed upon millions of Japanese men by the nation, and the ultimate liability that Hirohito bore as the head of the state and its military forces.
The Postwar Life of Okuzaki
In Part II of the statement, Okuzaki explains how hard he worked in order to survive in the immediate post-war economy and society. Initially he worked as a coal miner but nearly died because of an accident in the mine. Then he worked as a factory worker, and married a young widow, who was working as a caretaker of the factory’s dormitory. He gradually set up a business selling car batteries.19 Undoubtedly he was diligent, yet it seems that his long and harsh war experience made him deeply distrustful of Japanese society, in particular people who abuse their power and exploit others.
In 1951, he opened a business selling car batteries and second-hand cars in a small shop in Kobe. The business prospered, benefiting from the Korean War special procurement boom of the 1950s. As he needed larger premises for the shop, in early 1956 he decided to buy a house where he and his wife could live and run the business in the same building. He tried to secure a property through a real estate broker by the name of Nobuhara. However, Nobuhara was an infamous broker closely linked with yakuza gangsters. He made off with Okuzaki’s money and Okuzaki was unable to secure the property.20
Infuriated, Okuzaki decided to attack Nobuhara. As he told his wife, he was prepared to go to jail for a short period but he had no intention of killing Nobuhara. In fact he gave his wife some money and asked her to pay for Nobuhara’s medical treatment if necessary. One day, Okuzaki went to see Nobuhara and stabbed him with a knife. Then he immediately took a taxi to a police station and confessed to the crime. About one hour later, while being questioned at the police station, Okuzaki was shocked to learn that Nobuhara had died in hospital. Naturally he was immediately arrested.21
It was clearly a case of “bodily injury resulting in death,” in other words “manslaughter,” and according to Article 205 of the Criminal Law of Japan at the time, it was punishable by “more than two years’ imprisonment.” Considering the fact that Okuzaki voluntarily surrendered to the police, the prosecutors and the judge should have been lenient with him. Yet, the prosecutors accused Okuzaki of attacking Nobuhara with a clear intention of homicide. Okuzaki’s lawyer advised him to accept the prosecution’s charges and express his “remorse and repentance.” The solicitor said that such a humble attitude would bring a lenient judgment. In Japanese criminal trials, offenders’ sincere expression of “remorse and repentance” is deemed an important factor, often leading to a lenient judgment.
Yet, Okuzaki not only refused to compromise, but sent a statement to the prosecutors and the judge claiming that “this trial is a farce or a burlesque. Prosecutors should see the real world more clearly.” He was sentenced to 10 years’ imprisonment, the maximum punishment for such a crime. It is obvious that his anti-authority behavior did not help him to receive a fair trial. Over the following ten years, Okuzaki submitted a request for a retrial numerous times from prison, but to no avail.22
Development of Okuzaki’s Ideas on Japanese Society and the Emperor System While in Prison
As a result of his defiance of the prison authorities, he spent ten years in solitary confinement in Osaka Prison. He used this state of forced isolation to read numerous books and reflect on his life, as well as to think about various social and political issues. Soon he realized that there are many “enemies of the people” like Nobuhara, yet punishing such people or eliminating them would not solve the problem. He concluded that the real “enemy of the people” is the social structure, which keeps reproducing bad people and social problems including war.23
That social structure is, he believed, hierarchical, with the emperor residing at the top and every corner tainted with emperor ideology. In his eyes, this fundamental nature of Japanese society had not changed after Japan’s defeat in the war. Law, politics and religion still played vital roles in maintaining the inhumane social structure of the nation-state. He concluded that lawyers, politicians and religious leaders were obedient servants of the state and did not protect common people like Okuzaki. Thus he became deeply skeptical of the existing legal and political system.
Not long after he was imprisoned, he tried to gain permission from the head of the prison to send a telegram to the Minister of Justice, asking him to suspend the executions of all prisoners on death row. Following the telegram, he sought to send a statement to the Minister to explain his argument against capital punishment. He received no response to this request. Instead, he was examined by a psychiatrist, who diagnosed Okuzaki with “paranoia.”24
He was convinced that the problem lay with the vicious structure of the Japanese nation-state that mobilized tens of thousands of men for war and sent them to their death. Yet Hirohito, the person most responsible for this national tragedy, was not only free but admired by many Japanese. For Okuzaki, the same Japanese social structure constantly produced social problems including crime, industrial pollution and unhappiness. He believed that it was imperative to destroy this venomous social structure based on the emperor system in order to create a new world, in which all could live happily and humanely. The new world should be constructed in accordance with “god’s will,” based on the principle of universality, equality and absolute truth. It is not clear what he really meant by “god,” as he did not elaborate upon this. He rejected state power, represented by the emperor, and claimed that people should be ruled by the principle of universality, equality and absolute truthfulness, not by the state. He did not elaborate upon “the principle of universality, equality and absolute truthfulness” either. However, here we can see a unique mixture of utopian anarchism and a vaguely Christian religious idea.25
It is interesting to note that Okuzaki tried to show continuity between the wartime “Emperor Fascism” and Japan’s so-called “post-war democracy.” He denounced post-war Japanese society, saying that it was not democratic at all. His claim was that “democracy” by nature was not compatible with “the emperor system.” He did not understand why Japanese people failed to realize this fact, which seemed self-evident to him.
The problem was that he could not articulate it lucidly, analytically or theoretically. It must have been extremely difficult to live for ten years without communicating with other people, except for a few prison guards. Although he had ample time to read, think, and write,26 solitary confinement prevented him from discussing his ideas with other people and re-examining his own thoughts from the perspective of others. He had no one to guide his private research towards more intellectual and constructive thinking.
It is therefore not surprising that, as time passed, he became more and more firmly convinced that his own ideas on such issues as society, politics, law, religion and the emperor system were absolutely correct. Based on his uncompromising belief that Japanese society had to be completely changed, he tried to take legal action 93 times from his prison cell, petitioning the Japanese government on numerous issues over ten years. Among those issues were cases involving the abolition of capital punishment, the unconstitutionality of the Self Defense Forces and the abolition of the emperor system. Indeed he submitted six petitions against the emperor system during his prison term.27 Although it was ironic that an anarchist like Okuzaki, who refused to recognize state power, submitted so many petitions to the government, it clearly demonstrates how deeply he felt about “injustice” in society. Sadly, however, the more obstinate his self-belief became, the more eccentric he was seen to be. This became a vicious cycle, particularly in his later life.
After his release from prison in August 1966, Okuzaki quickly re-established his business selling car batteries, working hard together with his wife. However, he was determined to disseminate as widely as possible the ideas that he had developed in prison. In pursuit of this aim, he attached banners to his business truck criticizing Hirohito as a war criminal and carrying political statements such as “The real nature of the military and police force is violence! Nothing can be protected by violence!” Surprisingly, many people expressed moral support for Okuzaki’s action. Encouraged by this, he contemplated taking some kind of “non-violent” action against Hirohito in order to publicize his idea of abolishing the emperor system and establishing a new society. At the end of December 1968, two years and four months after his release from prison, he told only his wife of his plan, saying that there was no need to worry, as he had no intention of harming Hirohito.28
Okuzaki’s Solitary Battle Against Hirohito and the Emperor System
Okuzaki reasoned, however, that “because the emperor is the symbol of evil in modern society . . . , killing Hirohito per se would not solve the problem unless the current form of society, which keeps producing new emperors as well as imperial features in various places in society, would be fundamentally reformed.” Therefore he was not prepared to sacrifice his own life for such a futile act as killing Hirohito. His goal was to be arrested and have a chance to let the Japanese people know about his “idea of a new world without the emperor system.”29 He knew that the pinballs aimed at Hirohito from a distance were unlikely to hit him. Even if they did hit him, he thought it unlikely that Hirohito would be seriously injured. He nevertheless believed that “Hirohito deserves capital punishment for his crime of driving hundreds of thousands of Japanese men to their death in war.” He wrote that he would not mind killing Hirohito and even consequently receiving capital punishment himself “if that would bring truly eternal peace, freedom, and happiness to us.”
He succeeded in being arrested, yet, as mentioned above, few people around him noticed what Okuzaki had done while they were happily greeting the emperor and his family. Moreover, few even among the 2,000 policemen standing guard were aware of what was happening at the time. The following day, all major national newspapers reported the incident, but all claimed that it was an act committed by a man suffering from paranoid personality disorder and amnesia, who had a criminal record of murdering a real estate broker. The Mainichi Newspaper was the only one to mention that Okuzaki was a survivor of the New Guinea Campaign in the Asia-Pacific War and that he had submitted six petitions holding the emperor responsible for the deaths of Japanese soldiers and calling for the abolition of the emperor system.30 Okuzaki’s intention of politicizing his action against Hirohito and propagating his idea of establishing a new society therefore failed miserably. In this sense, it was not Okuzaki but the majority of the Japanese population who were suffering from amnesia – i.e., remaining oblivious to the wartime suffering and the responsibility for it.
Shortly after Okuzaki’s arrest, he was sent to a psychiatric hospital for about two months. It seems that the prosecutors were trying to dismiss the case by handling it as “an act committed by a person suffering from paranoia and amnesia” in order to avoid a trial. The prosecutors may have realized that the trial of Okuzaki could become politically sensitive because it directly involved the person of Emperor Hirohito. Yet, since medical specialists did not diagnose Okuzaki as “psychopathic,” he was deemed capable of standing trial and so the trial had to be conducted.
The trial began in mid-January 1970. In a minor assault case in which no injury occurs, the accused is usually released on bail prior to the trial. In fact, the Tokyo District Court accepted Okuzaki’s application for bail on January 24, 1970, more than a year after he was arrested. However, the Tokyo High Court overruled it and therefore Okuzaki was not released until his second trial was completed on October 7, 1970. He was therefore detained for one year and ten months, including two months in a psychiatric hospital.
Okuzaki’s Court Battle Against Hirohito
It seems that such harsh treatment of Okuzaki was due to the fact that the target of Okuzaki’s act of violence was not a common citizen but the emperor. If so, this was a violation of Article 14 of the Constitution of Japan, which guarantees the equality of all Japanese citizens under the law and forbids discrimination in political, economic or social relations because of race, creed, sex, social status or family origin. In other words, even the emperor must be treated equally as a Japanese citizen, otherwise Japanese citizens would be discriminated against on the grounds of “social status and family origin.” It seems that prosecutors and judges in the late 1960s were still under the influence of the old-fashioned concept of lese majesty (fukeizai, the crime of violating majesty, an offence against the dignity of a reigning sovereign) of the former Meiji Constitution.31 It can be said that, even before the trial actually started, the Okuzaki case clearly reflected the idea expressed by George Orwell’s phrase in Animal Farm, “All animals are equal but some animals are more equal than others.”
This trial is apparently the first and thus far the only case to involve the emperor personally under the new Constitution and after the abolition of lese majesty in 1946. The prosecutors accused Okuzaki of committing a crime of assault against the emperor. A crime of assault is legally defined as a crime committed against “a natural person,” and therefore the emperor should also be regarded as a natural person, i.e., an individual, equal to all other Japanese citizens. Otherwise, as mentioned above, it would be a violation of Article 14 of Japan’s Constitution. Yet, in the indictment, the emperor’s personal name “Hirohito” was never mentioned, and only the term “Emperor” was used. In other words, the emperor was treated as some kind of “divine creature” and not as “a natural person.” It was and still is a custom in Japan not to mention the emperor’s personal name in public, including in newspapers and magazines, because calling the emperor by his personal name is regarded as discourteous.
Okuzaki and his lawyer strongly argued that “the crime victim” must be clearly identified as an individual in order to clarify the nature of the crime. Judge Nishimura Nori was sympathetic to this argument and advised the prosecutors to use the emperor’s individual name. The prosecutors refused to accept the judge’s advice. They appeared to regard it as taboo to use the emperor’s personal name. Indeed, they claimed that the crime victim was “the emperor as a natural person, who is in the position of emperor,” and “there is no need to clarify his name as everyone knows who he is.”32 If that logic were followed, it would mean that the personal names of public figures such as the Prime Minister, the Governor of Tokyo, and the Vice Chancellor of Tokyo University would not be required in court cases. Clearly, the intention of the prosecutors was to preserve the special position of the emperor as opposed to Japanese citizens.
It was also extraordinary that the prosecutors presented no testimony of the crime victim, indeed, no evidence at all. It was and still is unimaginable to conduct the trial of a crime of assault without the victim’s testimony concerning the crime as well as his/her personal feeling as a victim. A crime of assault may provoke in a victim fear, anxiety or anger toward the perpetrator, even when no injury occurs. Thus it is essential to examine the experience and the feelings of the target of the assault. Without examining such essential matters, it cannot be proved that a crime of assault actually took place, and the court cannot assess the seriousness of the crime or the appropriate penalty. Nevertheless, the court heard only a limited number of eyewitnesses – several people from the crowd and a policeman whom Okuzaki approached after he shot the pinballs. No one testified against Okuzaki identifying him as the perpetrator of the assault and not a single affidavit was submitted. Indeed, the prosecutors did not even try to obtain Hirohito’s affidavit.33
Okuzaki requested that Hirohito appear as a witness, claiming that he had a right to a fair trial and to summon all the witnesses he required. He also submitted the following questions he wished to ask Hirohito during cross-examination.34
- Name, Position and Career of the witness.
- Do you know the accused Okuzaki Kenzo?
- Did you notice that the accused shot pinballs towards the right-hand side of the veranda of the Imperial Palace, where you and your family members were standing at the New Year’s public opening of the Imperial Palace on January 2, 1969?
- Did you know who on the veranda the pinballs were aimed at? Did you think that they were aimed at you?
- After this incident, did you read any press reports or watch TV news concerning the accused’s action? Have you received any account of this incident from your chamberlains? Have you ever discussed this incident with your family members? Have you ever seriously thought over this incident?
- Do you know that the accused is one of the surviving rank-and-file soldiers of the Imperial Army who were drafted into the Pacific War conducted in the name of “the Holy War,” and that he fought in New Guinea, was wounded, and narrowly escaped death?
- As a fellow human being, how do you explain the fact that you were the Supreme Commander of the Imperial Forces (the so-called “Holy Army”) in which the accused was drafted, and that the above-mentioned war was conducted under your authority, and that the accused was one of the victims of the above-mentioned war?
- How do you respond to the fact that the action by the accused was carried out to console the spirits of tens of thousands of his comrades who died as a result of starvation and injuries in New Guinea, and as a memorial service for them?
- You are regarded as the victim of this incident. How do you assess the action carried out by the accused? Do you wish for clemency for the accused or punishment of the accused? How do your family members, who were together with you on the veranda, feel about this incident?
- Other relevant questions.
The prosecutors opposed Okuzaki’s request to cross-examine Hirohito, offering no explanation. Judge Nishimura also rejected the request, simply claiming “there is no necessity to do that.” When his request for summoning Hirohito was rejected, Okuzaki dismissed his lawyer and from this point the trial continued without a lawyer for the defense. By dismissing his lawyer, Okuzaki probably wanted to show his strong disapproval of the exercise of state power and the legal authority of the state. He might have thought that even his lawyer was part of the legal authority and thus of the state apparatus.35
As a result of this unexpected action by Okuzaki, cross-examinations of the requested witnesses – Hirohito as well as scholars, writers, war veterans, and relatives of soldiers killed in action – were not conducted at all. It seems that Okuzaki could not organize those witnesses without his lawyer’s assistance. Thus the trial was concluded without identifying the name of the crime victim, without presenting the testimony of the victim, and without cross-examining the witnesses the accused had requested. In other words, this was an extraordinary trial, which can be called a quasi-trial for lese majesty. Strictly speaking, it appears to have been an unconstitutional trial, in which prosecutors tried to punish Okuzaki by applying lese majesty, despite the fact that the crime had been abolished in 1946.
It was also extraordinary that the prosecutors demanded three years’ imprisonment for the accused when the maximum punishment for a crime of assault at that time was two years’ imprisonment. Although, in the judgment handed down on June 8, 1970, Judge Nishimura acknowledged that Okuzaki’s motivation for his action against Hirohito was to condemn Hirohito’s war responsibility, he did not discuss whether Hirohito himself was partly accountable for inducing Okuzaki’s crime. Judge Nishimura claimed “considering relevant matters directly related to the case in question such as the motivation of the accused, circumstances, behavior as well as the purpose of Article 14 of the Constitution, it is improper to impose a sentence that exceeds the punishment stipulated by Article 208 of the Criminal Law, which the prosecutors demand ……”36
Thus Okuzaki was sentenced to one and a half years’ imprisonment with credit for the 180 days spent in detention, although he had already spent more than one year in detention by then. This gave the impression that Judge Nishimura paid attention to Article 14 of the Constitution and thus treated Okuzaki and Hirohito equally as “natural persons.” Yet in the same judgment he discriminated against Okuzaki by treating Hirohito preferentially, stating that it was “a well prepared and planned crime carried out against the Emperor, ….. therefore the criminal liability of the accused is serious.”37 Moreover, as already noted, the way that the trial was conducted as a whole appears to have been unconstitutional.
On June 8, 1970, i.e., the same day the judgment was handed down, both Okuzaki and the prosecutors’ office appealed to the Tokyo High Court. This second trial, which was conducted by three judges – Chief Justice Kurimoto Kazuo, Judge Ogawa Izumi and Judge Fujii Kazuo – concluded on October 7, 1970. Okuzaki was found guilty again, and in the final judgment the judges strongly agreed with the prosecutors’ opinion that “the case in question is a crime committed against the Emperor, who is the symbol of the state and of the unity of the people as defined in the Constitution of Japan, and therefore it is a crime of a vicious nature with serious impact on society.”38
In other words, the judges condemned Okuzaki’s act as a crime violating Article 1 of the Constitution. Yet, there is no such “crime against the symbol of the state and of the unity of the people” defined by present Japanese criminal law. As lese majesty was abolished in 1946, Okuzaki’s act could not be regarded as a criminal act except as a “crime of assault” under the current law. Therefore, as noted above, it is undoubtedly a violation of Article 14 of the Constitution to regard an act of assault against the emperor as particularly grave and serious in comparison with the same act committed against an ordinary Japanese citizen. Indeed, in this final judgment, unlike the judgment of the first trial, there was no reference at all to Article 14 of the Constitution. This judgment therefore appeared an even stronger application of lese majesty than the judgment of the first trial, and thus unconstitutional.
Nevertheless, as far as the actual penalty imposed upon Okuzaki was concerned, the final judgment upheld the judgment of the first trial, i.e., one and a half years’ imprisonment, and rejected the prosecutors’ demand for three years’ imprisonment as exceeding the legally specified maximum punishment. Furthermore, it gave credit for one and a half years spent in detention instead of 180 days. That allowed Okuzaki to be released immediately.39 In this way, the reaction of the judges of the Tokyo High Court to this first criminal case committed against the emperor after the war was a strange mixture of the old-fashioned idea of lese majesty and respect for the Criminal Law formulated under the new post-war Constitution promulgated in 1946.
Okuzaki’s Denunciation of Article 1 of the Constitution of Japan Defining the Position of the Emperor
Interestingly, Okuzaki’s struggle against Hirohito and the emperor system did not stop here. Soon he appealed to the Supreme Court. In his appeal, he stated:
Both the prosecutors and judges, who indicted or sentenced me at the first and second trials, respect the person, whom they regard as a victim of the case in question, as the Emperor. However, according to the Preamble of the Constitution, “we reject and revoke all constitutions, laws, ordinances, and rescripts in conflict” with the “universal principle of mankind.” It is our clear common understanding that the existence of the emperor is in conflict with the “universal principle of mankind.” The emperor’s authority, value, legitimacy and life are only temporary, partial, relative and subjective. Therefore, the fundamental nature of the emperor is absolutely, objectively, entirely and permanently depraved. Hence, Articles 1 to 8 of the current Constitution, which endorse the existence of the emperor, are definitely invalid. For a person with normal discernment and mind, those Articles are nonsensical, obsolete and foolish …… (emphasis added)40
This is an extremely powerful and logical argument, and as far as I know no one else has ever formulated such a compelling denunciation of Chapter 1 (Articles 1–8), “The Emperor,” of the Constitution of Japan. In the same appeal, Okuzaki also stated that both of his previous trials were violations of Article 14 and Article 37. Article 37 guarantees Japanese citizens’ right to a fair trial.41
On April 1, 1971, the Supreme Court (Chief Judge Ōsumi Kenichirō, Judge Iwata Makoto, Judge Fujibayashi Masuzō, and Judge Shimoda Takezō) dismissed Okuzaki’s appeal in a very short statement (five lines). It claimed that Okuzaki’s argument on the invalidity of Articles 1 to 8 of the Constitution was “irrelevant to his case pertaining to Article 405 of the Criminal Law,” and that his condemnation of the violation of Articles 14 and 37 was simply due to his “misunderstanding of fact.” It gave no explanation whatsoever as to why Okuzaki’s argument was irrelevant, or what he had misunderstood.42
Such an abrupt statement by the judges gives the impression that they did not regard Okuzaki’s case as a serious legal challenge to the Constitution. Or it could be speculated that Okuzaki’s argument was so forceful and compelling that they were incapable of refuting it. In fact, during the second trial, Okuzaki presented a similar argument on the denunciation of Article 1 of the Constitution of Japan, but the judges of the Tokyo High Court claimed that Article 1 explains that the emperor’s position derives “from the will of the people with whom resides sovereign power” and therefore it does not contradict the Preamble.43 It is obvious, however, that the judges of the Tokyo High Court also avoided discussing the crucial issue, i.e., the contradiction between the universal principle of mankind and the emperor system that Okuzaki had sharply pointed out.
In order to truly understand Okuzaki’s discussion of the relationship between the universal principle of mankind and the fundamental nature of the emperor system, we need to read the entire first paragraph of the Preamble including the part Okuzaki used in his appeal to the Supreme Court.
“We, the Japanese people, acting through our duly elected representatives in the National Diet, determined that we shall secure for ourselves and our posterity the fruits of peaceful cooperation with all nations and the blessings of liberty throughout this land, and resolved that never again shall we be visited with the horrors of war through the action of government, do proclaim that sovereign power resides with the people and do firmly establish this Constitution. Government is a sacred trust of the people, the authority for which is derived from the people, the powers of which are exercised by the representatives of the people, and the benefits of which are enjoyed by the people. This is a universal principle of mankind upon which this Constitution is founded. We reject and revoke all constitutions, laws, ordinances, and rescripts in conflict herewith”. (emphasis added)
As elaborated in the second paragraph of the Preamble, this universal principle of mankind also includes “the preservation of peace; the banishment of tyranny and slavery, oppression and intolerance for all time from the earth; and the right to live in peace, free from fear and want.” In Okuzaki’s mind, if we have truly “resolved that never again shall we be visited with the horrors of war through the action of government,” why is the person most responsible for causing “horrors of war” still free from punishment for his role as commander in chief in the Asia-Pacific War? If we have decided that we abide by the principle of mankind, why does such an irresponsible person, who destroyed peace, created tyranny, slavery, oppression and intolerance, and violated the right of many Japanese and Asian people to live in peace, free from fear and want, still enjoy the prestige defined by the Constitution supposedly established upon the universal principle of mankind?
In other words, Okuzaki was clearly pointing out the inherent contradiction between the basic philosophy of the Constitution and its Article 1, “The Emperor.” Although Okuzaki did not discuss the source of this contradiction, it was undeniably created by the GHQ of the US Occupation Forces in Japan, which decided to absolve Hirohito of his war crimes in order to politically exploit his prestige for the smooth control of post-war Japan, and to present him as a symbol of “peaceful post-war Japan” for American benefit.44
As mentioned above, no one except Okuzaki has challenged the constitutionality of the emperor so vigorously and persistently. Post-war Japan produced many eminent writers who wrote novels and semi-autobiographies based on their own experiences as Imperial soldiers. Among them are Ōoka Shōhei, Noma Hiroshi, Gomikawa Jumpei, and Shiroyama Saburō, who conveyed strong anti-war sentiment through their moving stories. Some former soldiers, in particular those who returned from China several years after the war, having been re-educated by the Chinese communist government, published honest and critical accounts of atrocities they themselves committed.45 Yet, they hardly discussed Emperor Hirohito’s war responsibility, and virtually none of them questioned the post-war constitutionality of the emperor.
Watanabe Kiyoshi, a survivor of the battleship Musashi, destroyed and sunk by the U.S. forces in the Battle of Leyte Gulf in October 1944, wrote excellent essays for many years after the war, criticizing Hirohito’s conduct during and after the war. As far as I know he was the only former soldier who sent a long open letter to Hirohito, in 1961, harshly questioning him about his involvement in decision-making at various stages of the Asia-Pacific War. It is an excellent historical analysis of Hirohito’s war responsibility based on Watanabe’s thorough research of military and other official records. At the end of this open letter, however, Watanabe demanded that Hirohito abdicate the throne in order to demonstrate sincere acceptance of his war responsibility, but did not question the constitutionality of Hirohito’s status as emperor.46
Furthermore, as far as I know, no constitutional scholar in Japan has ever discussed the issue of the constitutionality of the emperor. We Japanese need to ask ourselves why we have failed to question such a crucial matter. It surely has to be faced if we are to establish a democratic society based upon a genuinely democratic constitution.
Conclusion: Who is Responsible for Creating an “Eccentric Person” like Okuzaki?
It is sad to see that Okuzaki, who had such a sharp mind, strong will and fervent sense of justice, broke down as a human being after the failure of this legal battle, which he fought for the purpose of promoting his bold idea of establishing “a happy and peaceful society without the emperor system.” The more people viewed Okuzaki as an eccentric with extreme ideas, the more self-righteous and anti-authority he became, in particular toward lawyers and politicians. Okuzaki not only verbally condemned all who disagreed with him, but often resorted to violence in order to compel others to accept his ideas.
This is clear when we view his performance in the documentary film “Yuki Yuki te Shingun,” produced between 1982 and 1983. The dilemma and irony for viewers of this documentary film is that, without using violence to make former officers confess, Okuzaki probably could not have revealed the fact that two of his comrades were executed by their officers 23 days after Japan officially surrendered to the Allied forces. Although they were executed on the pretext of “desertion in the face of the enemy,” the real reason was that they had refused to participate in group cannibalism in New Guinea during the war. Officers wanted to silence them to cover up this dreadful fact.
Okuzaki believed that his idea was absolutely and always right, and eventually he saw himself as a martyr, who had a duty to follow a sacred calling from his “god” – a calling to establish a free, egalitarian and happy society like the utopia that Thomas More described, in which no one is controlled or exploited by anyone else. In December 1983, he committed manslaughter again – killing a son of former officer Muramoto Masao, who gave an order to execute the above-mentioned two soldiers. For this crime he was imprisoned again for 12 years.
Undoubtedly war, in particular, war of aggression, is an act of madness. Regardless of the official reason for the war, one cannot kill so many people without deadening the conscience of society. At the same time, one cannot be prepared to be killed unless one is prepared to kill others. For Okuzaki, who was forced to experience the madness of war and saw many people dying in front of his eyes, it was unimaginable that the person, who was most responsible for creating such madness and driving hundreds of thousands of fellow human beings to their deaths, seemed to have no conscience and no sense of accountability at all. Equally unimaginable to Okuzaki was the fact that society shielded the emperor from responsibility for the deaths of millions.
Indeed, while Okuzaki’s acts of violence against a few individuals drew opprobrium in postwar Japan, the person who created the madness of war that took the lives of millions continued to be venerated by the people as the symbol of a peaceful nation. For Okuzaki, this situation itself was mad. It must have been extremely difficult for him to encounter this madness, particularly to accept the fact that the large majority of his fellow citizens, including many former soldiers who had experienced that madness and seen their comrades die in vast numbers even as they barely survived themselves, saw this as neither “mad” nor “absurd.”
For Okuzaki, people who considered him eccentric and appalling had failed to understand the madness of war. “How could you forget this madness?” We can vividly feel Okuzaki’s intense anger when we read his appeal to the Supreme Court, or see the documentary film “Yuki Yuki te Shingun.”
The problem was, however, that he was so engrossed in pursuing the war responsibility of Hirohito and others that he lost sight of his own responsibility to respect the lives and basic human rights of others. Indeed, he paid little attention to the fact that the war victims were not only Japanese soldiers but also many civilians, in particular those killed by the indiscriminate bombing conducted by the US forces in the final stage of the war. Similarly, he hardly commented on the deaths of millions of Asian people, i.e., the victims of Japan’s atrocious war conduct. In other words, he was not really capable of internalizing the pain of war victims other than his own fellow soldiers.
Therefore, we need to remember that we Japanese, including Hirohito, who have failed to internalize the pain of war victims as our own and to carefully pursue Japan’s war responsibility, are indeed responsible for creating a contradictory, complex and difficult person like Okuzaki Kenzō.
We may need to learn from Okuzaki’s life not to forget that the madness of war paralyzes our very capacity to understand how mad and absurd all wars are.
Limping on crutches, his broken leg encased in plaster following a jogging accident, the distinguished biologist Edward O. Wilson made his way slowly toward the stage at a convention of the American Association for the Advancement of Science in 1978. As he climbed the stairs, took his seat, and shuffled his notes, a sudden burst of activity punctuated the silence: the entire front row of the audience leapt onto the stage hurling insults. They jostled Wilson and then poured iced water over his head. The protesters would turn out to be Marxists, incensed by the publication of Wilson’s book Sociobiology.
This story has become a familiar feature of the nature/nurture debate, used to illustrate the vitriolic hostility expressed by ideological groups scrambling to silence what most people already take to be an incontrovertible fact: that humans, just like every other species on earth, have a nature. As crowds abandoned Wilson to evacuate the auditorium that day, one man at the back of the room tried to push his way forward against the multitude heading towards the exits. “It was the most hateful, frightening, and disgusting behavior I’ve ever witnessed at an academic assembly,” the famed anthropologist Napoleon Chagnon would later recall. He didn’t know it then, but the events of that day were an omen of things to come for Chagnon himself, whose Wilsonian worldview would help to bring about one of anthropology’s greatest controversies.
Napoleon Chagnon, 1938 – 2019. Photo courtesy of the Chagnon family
Chagnon, who passed away last week, has been remembered as one of the last titans of anthropology, and perhaps the last ethnographer in the vein of Mead and Malinowski to go deep into a remote part of the world and live among a relatively un-acculturated and unstudied people. The research he conducted there would inspire millions to take an interest in the world’s traditional cultures and the field of cultural anthropology. Dripping in perspiration, his hands and face swollen by stinging insects, Chagnon disembarked at a remote Venezuelan village along a piranha-infested river deep in the Amazonian interior in 1964. Stepping out of his aluminum rowboat, he was immediately nauseated by the smell of decaying vegetable matter and feces. He pushed his way through a wall of leaves and stepped into the open: “I saw a dozen burly, naked, sweaty, hideous men nervously staring at us down the shaft of their drawn arrows!” The Yanomamö people lived with the ever-present threat of violence from raiding villages, and from his very first encounter with them, Chagnon understood the paranoia that consequently pervaded their everyday life.
Napoleon Chagnon with members of the Yanomami in the late 1960s or early 1970s. Photo courtesy of the Chagnon family
It took a long time for Chagnon to acclimatize to the deep interior of the Amazon Rainforest and its unique threats. The insects continued to plague him—not just the flying stinging ones, but termites that claimed unguarded shoes as nests, and spiders and scorpions drawn to warm clothes in the middle of the night. He would later find himself face to face with an anaconda. “I laid my double-barrel twelve-gauge shotgun on the bank next to me,” he recalled. Moments later, “the water exploded in front of me: a very large anaconda head shot out of the water and whizzed just inches from my face. I immediately went into a rage: this son-of-a-bitch of a snake was trying to kill me!” Chagnon began firing rounds into the snake, which violently twisted and turned as he reloaded, fired, and reloaded. But Chagnon was more worried about jaguars, which were known to kill groups of men in a single attack. From time to time, Chagnon and his companions would find themselves stalked by these predators, sometimes for hours on end. They could be heard at night, prowling around the makeshift camps he slept in as he travelled between villages. One night, he awoke to find a jaguar baring its teeth at him as he lay in his hammock. But the mosquito net and the yelling of villagers confused the animal, which darted back into the bush.
In 1966, Chagnon began working with the geneticist James Neel. Neel had managed to convince the Atomic Energy Commission to fund a genetic study of an isolated population and was able to pay Chagnon a salary to assist his research there. Neel’s team took blood samples from the Yanomamö, and began administering the Edmonston B vaccine when they discovered that the Yanomamö had no antibodies to measles. In some ways, the Yanomamö sounded like something out of an anthropology textbook—they were patrilineal and polygynous; like other cultures around the world, they practised the levirate, whereby a man married his dead brother’s widow; they had ceremonial roles and practised ritual confinement with taboos on food and sex. But sometimes this exotic veneer would be punctured by their shared humanity, particularly their mischievous sense of humour. Early in Chagnon’s research, the Yanomamö pranked the anthropologist by providing him with vulgarities when he asked their names. He did not realise this until he began bragging to a group of Yanomamö about how well he now understood their genealogies. As he began, the Yanomamö erupted into laughter, tears streaming down their faces. They begged him to continue and, oblivious, Chagnon went on: “Hairy Cunt was married to the headman, Long Dong, their youngest son was Asshole, and so on.” When he discovered he’d been tricked, Chagnon was embarrassed and furious that five months of patient name gathering had yielded nothing but a litany of insults. From that day forward, he would cross-check all information between individual Yanomamö informants and villages.
But for all their jocularity, Chagnon found that up to 30 percent of all Yanomamö males died a violent death. Warfare and violence were common, and duelling was a ritual practice, in which two men would take turns flogging each other over the head with a club, until one of the combatants succumbed. Chagnon was adamant that the primary causes of violence among the Yanomamö were revenge killings and women. The latter may not seem surprising to anyone aware of the ubiquity of ruthless male sexual competition in the animal kingdom, but anthropologists generally believed that human violence found its genesis in more immediate matters, such as disputes over resources. When Chagnon asked the Yanomamö shaman Dedeheiwa to explain the cause of violence, he replied, “Don’t ask such stupid questions! Women! Women! Women! Women! Women!” Such fights erupted over sexual jealousy, sexual impropriety, rape, and attempts at seduction, kidnap and failure to deliver a promised girl.
Internecine raids and attacks often involved attempts by a man or group to abduct another’s women. “The victim is grabbed by her abductors by one arm, and her protectors grab the other arm. Then both groups pull in opposite directions,” Chagnon learned. In one instance, a woman’s arms were reportedly pulled out of their sockets: “The victim invariably screams in agony, and the struggle can last several long minutes until one group takes control of her.” Although one in five Yanomamö women Chagnon interviewed had been kidnapped from another village, some of these women were grateful to find that their new husbands were less cruel than their former ones. The treatment of Yanomamö women could be particularly gruesome, and Chagnon had to wrestle with the ethical dilemmas that confront anthropologists under such circumstances—should he intervene or remain an observer? Men frequently beat their wives, mainly out of sexual jealousy, shot arrows into them, or even held burning sticks between their legs to discourage the possibility of infidelity. On one occasion, a man bludgeoned his wife in the head with firewood in front of an impassive audience. “Her head bounced off the ground with each ruthless blow, as if he were pounding a soccer ball with a baseball bat. The headman and I intervened at that point—he was killing her.” Chagnon stitched her head back up. The woman recovered, but she subsequently dropped her infant into a fire as she slept, and was later killed by a venomous snake. Life in the Amazon could be nasty, brutish, and short.
Chagnon would make more than 20 fieldwork visits to the Amazon, and in 1968 he published Yanomamö: The Fierce People, which became an instant international bestseller. The book immediately ignited controversy within the field of anthropology. Although it commanded immense respect and became the most commonly taught book in introductory anthropology courses, the very subtitle of the book annoyed those anthropologists who preferred to give their monographs titles like The Gentle Tasaday, The Gentle People, The Harmless People, The Peaceful People, Never in Anger, and The Semai: A Nonviolent People of Malaya. The stubborn tendency within the discipline was to paint an unrealistic façade over such cultures—although 61 percent of Waorani men met a violent death, an anthropologist nevertheless described this Amazonian people as a “tribe where harmony rules,” on account of an “ethos that emphasized peacefulness.”1 Anthropologists who considered such a society harmonious were unlikely to be impressed by Chagnon’s description of the Yanomamö as “The Fierce People,” among whom “only” 30 percent of males died by violence. The same anthropologist who had ascribed a prevailing ethos of peace to the Waorani later accused Chagnon, in the gobbledygook of anthropological jargon, of the “projection of traditional preconceptions of the Western construction of Otherness.”2
...
The quest for knowledge of mankind has in many respects become unrecognizable in the field that now calls itself anthropology. According to Chagnon, we’ve entered a period of “darkness in cultural anthropology.” With his passing, anthropology has become darker still.
Matthew Blackwell is an Australian writer and graduate of the University of Queensland where he studied economics and anthropology.
What has the ‘international style of architecture’ – now going out of fashion, but which arose as a universalist aesthetic intended as a weapon to counter the nationalist upheavals at the turn of the 20th century – got to do with today’s geo-politics?
Well – more than might be imagined.
We are all only too aware of the so-called ‘culture wars’ which are tearing Britain, the US and Europe apart. We can see plainly this fracture, around which are arranged the two warring armies: On one side fly the banners of the Enlightenment ideal of ‘incontrovertible’ reason, from which leap the idols of technology, of cosmopolitan homogeneity – and too, the ‘progressive agenda’: i.e. the embrace of human rights, rights of immigration, diversity, ecology and gender politics. And on the other front stand those who, like the philosopher Johann Gottfried Herder, consider the great imperialists such as Charlemagne the “villains of history” who “stomped out native cultures.” Herder believed every culture possessed a unique Volksgeist, or way of life, incommensurate with others.
In the end, however, the internationalist values have been pursued overwhelmingly (and purposefully) at the cost of ‘belonging’.
The lesson of the ongoing backlash against globalization is that political and cultural logic – rooted in an emotional attachment to our own roots, and to a distinctive cultural way of life, cultivated among one’s own kind – belongs to a wholly different pole (and dimension) from that of a ‘rational’ and universalizing ethos of economics and technology.
Far from moving forward in lock-step progress, these two ‘poles’ of consciousness clash when they meet. And clash bitterly (as recent events in Britain’s parliament exemplify). Is there a possibility for synthesis, for compromise? Possibly not. It is an ancient rift between global utopia and local sovereignty. The strength of the globalists lately has been waning, and the other pole notably strengthening.
Philosopher Roger Scruton explains this shift towards the ‘sovereignty-ists’: “We are, as the Germans put it, heimatlich creatures — we have an inherent need to belong, and to belong somewhere, in a place to which we commit ourselves as we commit ourselves to the others who also belong there. This thought is disparaged by those who see only its negative side — the side that leads to belligerent nationalism and xenophobia. But those are the negative by-products of something positive, just as the international style was the negative by-product of a laudable desire to soften the barriers and smooth out the suspicions that had been brought into prominence by World War I.”
A European or global ‘melting pot of identities’, in other words, is possible only at the cost of shedding community roots and particularities. But Scruton’s point about the internationalist style of architecture (those “glass boxes and concrete plazas” to which nobody could belong – a “nowhere” style), goes further.
His architectural metaphor extends to the globalist zeitgeist as a whole: “The evidence is overwhelming that ugly and impersonal environments lead to depression, anxiety and a sense of isolation, and that these are not cured, but only amplified, by joining some global network in cyberspace. We have a need for friends, family and physical contact; we have a need to pass people peacefully in the street, to greet each other and to sense the safety of a cared-for environment that is also ours. A sense of beauty is rooted in these feelings.”
Here is the point: Isaiah Berlin argued that cosmopolitanism was an empty vessel. “[I]f the streams dried up … where men and women are not products of a culture, where they don’t have kith and kin and feel closer to some people than to others, where there is no native language — that would lead to a tremendous desiccation of everything that is human.”
That ‘other’ political and cultural logic, rooted in an emotional attachment to our own roots, and to distinctive ways of living life cultivated among one’s own kind, is of course the very rootstock of empathy – of being capable of embracing ‘otherness’. Having a sense of one’s own roots brings recognition that every culture possesses a unique Volksgeist, or way of life, incommensurate with others.
Washington today does not ‘understand’ otherness. It does not even try very hard. It cannot fathom Iran (or China, or Russia). These latter states seem to DC to reject the ‘incontrovertible rationality’ that the European Enlightenment bequeathed to the world. They too, are seemingly ‘irrationally opposed’ to the ‘progressive moral outlook’ that has, in past years, informed European and American foreign policy.
This lack of empathy precisely defines the multiple policy failures. ‘Internationalist foreign policy’, like its namesake architecture, is a style detached from any empathy with place or people. It is also a nowhere style (one size of policy fits all), demanding global homogeneity and compliance.
Its root in an abstract, ‘irrefutable rationality’ clashes utterly with President Trump’s mercantilist foreign policy style. As a consequence, no one sees any point in negotiating with such a conflicted entity as the US now is, oscillating uncertainly between these two oppositional poles. No one knows, from day to day, where the policy stands.
Let us illustrate with an example: President Trump – the mercantilist – wants to exit Syria. His Syria Envoy, James Jeffrey, however, is explicitly ‘internationalist’. These two approaches do not march together in any complementary way — where they meet, they clash, and trip each other up.
Jeffrey, on Trump’s compromise of a drawdown in Syria rather than a full withdrawal:
“There is some reduction in forces in Syria. [But] we are making up for that [the President’s drawdown order], by keeping a very strong presence in Iraq. We’re making up for that with very strong air components. We’re making up for that with more Coalition forces on the ground. So we’re finding ways to compensate for it.”
Interviewer: “I want to move on to the U.S. presence at al-Tanf. There are some 10,000 Syrians living in a remote settlement, living in squalor [on US militarily occupied Syrian territory]. There are reports of some having starved. And yet there’s a U.S. military base some 10 miles away. Why hasn’t the U.S. simply stepped in, and helped provide food?”
“Well, first of all because we’re not actually responsible for these people. The government of Syria is responsible for them. International agencies are responsible…”
Interviewer: I think critics of the U.S. approach to Rukban would say that in exercising military control over the area, the U.S. has certain responsibilities, certain legal ones, as laid out under the Fourth Geneva Convention. But I take it you don’t see it that way?
“First of all, on the Fourth Geneva Convention, I would check with that. I do not believe the Pentagon would claim that the Fourth Geneva Convention applies to the refugees in al-Tanf. That’s the first thing …”.
Interviewer: [US pressures on Assad?]
“… we’re doing a great deal. We’ve got a very broad sanctions program that Treasury runs. We have very close coordination with the E.U., which runs its own sanctions program. We have blocked all reconstruction assistance from anywhere including UNDP [U.N. Development Programme], World Bank, any place, anywhere, inside Assad’s part of Syria. We are pursuing aggressively a ‘no diplomatic recognition’ policy throughout the world. For example, the Syrians were not invited back into the Arab League. So we’re putting as much pressure on the regime, and on its supporters, Russia and Iran, as possible.
“But also, although it’s not our purpose in being in northeastern Syria, we are in northeastern Syria. And that, by its nature, keeps the regime out. The Turks are in northwest Syria for their own reasons, but that keeps the regime out. The Israelis are going after Syria’s ally Iran for its long-range systems that it’s introduced into Syria. So there’s a great pressure we’re putting the regime under.”
Interviewer: [Is the US fighting ISIS in Syria?]
“I’m concerned about, first of all, are they [ISIS] setting up another caliphate? Are they holding more territory? No. Are the incidents extraordinarily low by every measure we’ve made in Afghanistan and Iraq? Absolutely, yes. Do we have areas where they seem to be persistent, pervasive, resilient, especially in Iraq, yes? In certain areas. And that’s the thing that has concern.
“This, (an USAF attack on a base on the Tigris), is the only case I can think of in either country where we’ve actually had a little tiny military operation, or several military operations, to clean these guys up. Most of the time, they’re on the move. I know in the Badia desert, south of the Euphrates, and we’re very worried about that. We’ve taken certain actions that I can’t get into, against them. They float around like desert nomads. They strike the Russians. They strike the regime. They strike the Iranians. They stay away from us because they know what’s going to happen.”
A ‘progressive’ Israeli commentator, in a separate article, The progressive case for staying in Syria, for now, lauds how the “Pentagon and State Department have since been able to slow the pace of withdrawal of U.S. troops [that Trump wanted], and are looking for replacements from Coalition nations … Most importantly, perhaps, to progressives, this protection would prevent the grave human rights abuses that would otherwise await the millions of Syrians … [and additionally] Retreating from the northeast would mean forfeiting U.S.-backed forces’ control of a third of Syria’s territory and 80% of its natural resources, eliminating what little leverage America has left in shaping the post-war landscape. The Assad regime, even at its weakest point, was unwilling to seriously negotiate with the opposition. Now that it feels confident and victorious, it is much less likely to accede to any Western demands to reform, or to step down.” [Emphasis throughout has been added].
Scruton’s points about the ‘internationalists’ and their attendant loss of any sense of empathy and beauty – amidst the ugliness and the de-cultured drabness of our physical (and intellectual) environment – are evident in these excerpts on US policy: We are indeed living in a strange, de-humanised time, when it is thought desirable, according to Enlightenment rationality, to discard any attempt at (‘irrational’) empathy, or understanding, for the Syrian circumstance, and to consider the policy solution to be simply technical (more, or different, firepower) or mechanical: how, and where, to move the levers of pressure.
And, secondly, to consider it ‘progressive’ to deny a stricken people (ordinary Syrians) the ability to go home, or to re-build their lives – and to deprive them of the chance still to think that there is something left to live for (unless they concede to submit to the Washington Consensus). And yet, still to regard this abstract ideological approach as somehow representing Europe holding the moral high ground? No wonder ‘otherness’ is tired of the Enlightenment’s ‘rational’ ‘order’.
To compensate for these lacunae associated with its attenuated style of consciousness, the US is resorting to technology and Artificial Intelligence to make good the gaps. It imagines that mining ‘big data’ – as is done in western elections, where 25 ‘likes’ on Facebook are said to be sufficient to strip an individual politically ‘naked’ – might somehow compensate for the absence of empathy, providing the answers that this style of ‘reasoning’ can’t.
It is wishful thinking. Empathy is not machine-generated. As Scruton points out, it derives from the aggregate course of individual lives pursued within the matrix of archetypal moral narratives that form the ancient skeleton of a community – narratives which both bind that community and give it its ethos, and which are precisely incommensurate with others.
As an undergraduate studying English at the University of Utah, I was required to take Introduction to the Theory of Literature. The course was a disaster. I was an awful student of critical theory. Like most burgeoning English majors I knew at the time (the early 1990s), I wanted to read and write literature, not to study what people had decided it meant to read and write literature. And then there was the professor who headed the class. He had a pretentious fondness for the French deconstructionist Derrida that I did not understand, partly because I did not understand Derrida himself, and partly because as a teacher this fellow was so single-minded that he could not reach any but the most earnest students. After class, I would often see him in the cafeteria, where he would practice his French with a colleague who also taught theory for the department. I guessed they were talking about Derrida, but who could say? Together, these elements would constitute my introduction to the baffling world of postmodern theory.
I wore the “D” I earned in that class like a badge of shame, but we shun what shames us, and so even though postmodernism of one form or another dominated the department, I managed to earn my degree while still avoiding this man, his ilk, and their “floating head” theories, the relevance of which would cease the instant I closed the book or left the classroom. Perhaps I was lucky. My orientation to literature was, if anything, romantic, and I would continue to entertain this view of language, literature, and life, ultimately, as an MFA student in poetry at Arizona State University. The creative writing program there did not proffer any alternatives. In fact, the only challenge came when I, along with several other teachers-in-training, attended a weekly seminar taught by seasoned rhetoricians. Not surprisingly, the poets resisted the rhetorical approach to language. Indeed, they were suspicious of—if not hostile to—any approach that sought to demystify the medium with which they rendered their personal afflatus. And I am sure that the rhetoric students in the classroom experienced their own unique dismay at what they must have seen as our beautiful but ineffectual and anachronistic conception of language. At the time, both approaches had their own appeal. But that was before I had perceived the world from outer space. Now I see that no subjective view of life is sufficient for addressing the nature and crisis of living in the Anthropocene.
When I started teaching in 1995, writing texts that emphasized multiculturalism were in vogue. More recently, social justice has emerged as the preferred context within which to teach writing, which may help to explain the continued interest in the work of social constructionists like Derrida and Michel Foucault, including the latter’s concept of the “discursive formation,” defined as the total set of relations that unite, at a given period, the discursive practices that give rise to epistemological figures, sciences, and possibly formalized systems. With the phrase “total set of relations,” Foucault was likely trying to evoke ecology and thereby imbue his ideas with objective rigor, but I am not convinced that he succeeded.
Foucault was concerned with how different groups of people construct knowledge and, eventually, truth. In his view, what is true for one is not true for all: “Truth is always dependent on a particular discursive formation; that is, there is no underlying meaning or truth within or imposed on the things of our world, and the truth or knowledge of something rests entirely within the relations of statements inside a discursive formation.” To the extent that individuals and groups of people generate particular perspectives of the truth, Foucault was right. But the postmodern idea that there is “no underlying meaning” in the world apart from what people may produce is nonsense. That a certain perspective is exclusive and hinders access to other ideas is a comment on the limitations of the perspective, not on the degree to which truth can be known and shared. And yet Foucault’s oversight persists, as evidenced by the popularity of writing texts that privilege subjectivity over objectivity; lived experience over scientific fact; difference over similarity; orthodoxy over exploration.
Although, in principle, challenges to the mainstream tradition are much needed, they reveal a bias toward culture as a vehicle for determining truth: Competing narratives attempt to revise the dominant narrative according to their own particular ideological, racial, ethnic, or cultural experience. The voicing of these perspectives is, of course, long overdue, and because of it we have begun to appreciate the uniqueness and complexity of human experience. But as the work of Foucault and other social constructionists shows, if truth is subjective, and one truth is as plausible as another, as long as there is a dominant group, particular subjective truths will prevail. And where certain subjective truths do not prevail, violence of one form or another will likely ensue. According to this culturally relative view of the world, then, truth is arbitrary and exclusive, rather than evidentiary and shared. The consequence is divisiveness. Thus, the importance of hearing for the first time the distinct voices of silenced, marginalized, oppressed, and “invisible” peoples is coupled with an equally important need for uniting in order to address natural and social ills, a task that exclusive views of the self and world are not equipped to handle.
In his book Everybody’s Story: Wising Up to the Epic of Evolution, Loyal Rue, Professor of Religion and Philosophy at Luther College, explains the challenge of multiculturalism: “A particular story may be mine, and it may be worthwhile, and I may be diminished without it, but it is not a story that speaks for everyone’s experience. And as I discover the limitations of my own story there is born within me a longing to hear the larger story of which my own is a part—the universal story, everybody’s story.” Rue’s remarks speak to the core of the problem: despite having the goal of understanding and appreciating diversity, the various stories of a pluralistic society do not add up to anything we can all share. Similarly, postmodern theories may be useful insofar as they help us understand group dynamics, but as subjective accounts themselves they fail to honor the biological world of which all stories are part, on which all stories depend, and from which all stories ultimately arise.
Thus, Rue asks, “Where do we go for a universal narrative account of how things are and which things matter?” His answer: the paradigm of Darwinian evolution. In contrast to the majority of cultural and religious narratives, which are anthropocentric, the paradigm of evolution is ecocentric. It rests on the fact that we originate from, live in, and depend on a physical world of interrelated systems, and that, as the place where all stories happen and indeed what makes all stories possible, the natural world must be cared for above all else. This means that we must begin the difficult work of talking about things as they really are, and not just as we think they should be.
E.O. Wilson explains why this will not be easy: “Culture conforms to an important principle of evolutionary biology: most change occurs to maintain the organism in its steady state.” Understanding this underlying biological uniformity is the key to valuing our individual stories while still acknowledging and repairing their limitations. The world and what we know about it has changed and deepened, but many of our stories do not reflect this awareness. These accounts are therefore largely irrelevant to all except the individuals and groups who support them, which raises the question of what constitutes useful information. If I believe what social constructionists tell me, useful information is relative to the group that generates and shares it. Fine. But information that ignores biological fact, while meaningful, isn’t useful beyond the parameters of one’s subjective worldview. Only when this subjective information is examined in the context of the evolutionary paradigm does its objective relevance become clear: However unique our stories may seem, they are all expressions of a shared human nature.
Equally important is that this new definition of usefulness extends to nonhuman nature as well. Thus, when seen from an ecocentric or planetary perspective, the limitations of subjective accounts of existence are both clarified and dissolved. What remains is a sense of biotic responsibility and relatedness. Rue writes: “As we discipline ourselves to take a wider view we begin to appreciate that the overlaps among species are much more profound and important than the differences. From the outer space of a Darwinian perspective life is a unity, a community of shared interest in the conditions of viability, apart from which there is no enduring promise. The driving theme of everybody’s story is to understand these ultimate conditions and to value them ultimately.” As Rue suggests, by emphasizing how organisms overlap, and by valuing what all organisms share—i.e., conditions of viability—the evolutionary perspective provides and encourages a foundation for biotic equanimity.
...
I will begin my counter-arguments to the author’s overarching thesis with the last named, always basing myself on what I see around me in Orlino and not on abstract considerations. The author is ignorant of an irrefutable trend among the Russian middle and upper classes: namely, a concern to live in ecologically pure environments and to eat organically grown food produced without pesticides or artificial fertilizers, food which is not only GMO-free but comes from traditional seed pools as opposed to the seeds merchandised globally by several (Western) multinational corporations. The bio-food trend largely explains the latest fad observable everywhere in the Russian countryside: high-technology greenhouses.
I noted the appearance of these greenhouses around me on Orlino properties last year. This year the trend has continued so that many homeowners, including those who otherwise do not have the land or inclination to maintain potato fields, now own two or more such greenhouses in a compact area next to their houses. In these greenhouses they grow a profusion of fruits and vegetables which by their nature or rarity are not sold by supermarket chains. Russian supermarkets, like supermarkets everywhere, depend on large scale and regular supplies of given produce, so that variety is always relatively limited.
The dachniki share what they grow with family; they tin the surplus, as applicable. As one neighbor deeply involved in this process replied when I asked what he buys when he goes to the supermarket: “bread.” The rest he provides for himself. In this respect, growing produce is one more dimension of self-reliance, alongside having one’s own artesian well, own septic system, own log-fed heating system and “own” bottled gas for the stove. The only regular bills to arrive from the outside world are for electricity: the Russian countryside has yet to discover the merits of solar panels for house roofs, though one day it may well do so, more for reasons of pride than for economy.
As regards the less affluent, particularly the older generation of pensioners, I have often wondered why, year after year, they put in 600 square meters of potatoes, beets and onions when these commodity products are so cheap at supermarkets and when their own produce in these categories is indistinguishable in taste from what is commercially available. After consulting with neighbors and friends, I conclude the reasons are love of tradition in what is undeniably a conservative society, and the creation of a pastime that gives life purpose. As my regular taxi driver says about his mother living in the countryside, if she did not look after her extensive garden and process the harvest, to gift to relatives and to consume herself during the winters, she would spend the day watching soap operas on television and would likely lose her mental acuity.
Now turning to the question of travel abroad as a competing attraction to minding the dacha, I believe that Le Monde journalist Christophe Trontin is out of step with the times. To be sure, foreign travel is a significant factor when Russians choose how to spend their vacation time. After all, more than 10 million Russians, or about 7% of the general population, now go abroad every year, 6 million of them having chosen Turkey in the last year. However, judging by the behavior of our St Petersburg friends from the intelligentsia and the economic middle classes, I believe that trend has peaked. Over the past decade, they have “seen it all,” traveled to all their dream destinations and returned home in the knowledge that there is no Eden abroad. Moreover, the Russophobia of Western Europe has turned our friends against return travel there. Instead, they are traveling around Russia, pursuing their interests in cultural or religious travel in ever more remote places.
...
A four-square-metre box with a screen and computer.
This is what Japanese cyber-cafes offer, around the clock.
Most customers just spend an hour or two here.
But there are thousands who spend their lives in them.
The Manboo in Tokyo has its own permanent residents: Masata and Hitomi.
It is a home for them, even though they sleep on the floor.
The banal 1967 hit song “San Francisco (Be Sure to Wear Flowers in Your Hair)”, which was influential in enticing young people to come to San Francisco for the Summer of Love, was written by “Papa” John Phillips, who attended the US Naval Academy at Annapolis and whose father was a Marine Corps Captain.
“Papa” John’s wife had worked at the Pentagon and her father was involved in covert intelligence work in Vietnam.
His neighbor and Laurel Canyon (Los Angeles) buddy was Jim Morrison of Doors fame, whose father, US Navy Admiral George Morrison, commanded U.S. warships in Vietnam’s Tonkin Gulf during the “Tonkin Gulf Incident.”
Frank Zappa, the father figure of Laurel Canyon’s many musicians who just happened to converge in one place at the same time where a covert military film studio operated, had a father who was a chemical warfare specialist at Edgewood Arsenal.
Stephen Stills, David Crosby and many other soon to be famous musicians all came from military and intelligence backgrounds and frolicked in Laurel Canyon. Although they were draft age, none of them was drafted as they played music, dropped acid, and created the folk-rock movement whose music was catchy but innocuous and posed no threat to the establishment.
But “shit happens.”
In his disturbing book, Weird Scenes Inside the Canyon, David McGowan raises the question:
…what if the musicians themselves (and various other leaders and founders of the ‘movement’) were every bit as much a part of the intelligence community as the people who were supposedly harassing them? What if, in other words, the entire youth culture of the 1960s was created not as a grass-roots challenge to the status quo, but as a cynical exercise in discrediting and marginalizing the budding anti-war movement and creating a fake opposition that could be easily controlled and led astray….What if, in reality, they were pretty much all playing on the same team?
An amazing documentary film about two Zanskar girls. Written and directed by Jean-Michel Corillion.
A visit to a new city that has emerged near the border between Kyrgyzstan and China. The city is growing rapidly thanks to the huge influx of Chinese goods. Heading west, we see the cotton fields spread out across a huge oasis in Uzbekistan. We also explore the beautiful, white-bricked city of Bukhara, which still maintains an atmosphere of the Middle Ages. In one of the city's narrow streets, Uzbeks, Russians, Jews, and others live together side by side in harmony. We visit these people, who live life to the fullest amidst the uncertainties that followed the collapse of the former Soviet Union.
A visit to the Mosuo people of Lugu Lake in South West China. Made by Alan Macfarlane and Xiaoxiao Yan in 2003. The absence of even the idea of marriage is an anthropological puzzle, but one related to the trading system.
Ivy was meant to be a woman. She has felt this way since she was a child and did everything in her power to become one. But why is it so hard for Ivy to accept her new body? It’s a question she struggles with in this intimate film about her journey from man to woman.
Since coming out at 24, Ivy was always sure about her decisions until a medical trip to Thailand triggered unexpected doubt. She talks about love, loss and reluctant acceptance in this beautiful narrative about her courageous and life-altering decisions to transform into who she wanted to be.
A documentary from a 2008 thesis project at New York University's Interactive Telecommunications Program (ITP).
I'd love to do a much higher resolution version with more subjects. If you're a child of hippies from the 1960s and 1970s, and you are interested in interviewing yourself on camera, I'm ready to do another version of this video! Contact: www.calebjc.org, calebjohnclark@gmail.com
Here's the background of this project.
I interviewed several children of hippies in 2008 and made an experimental interactive web-based interface to watch the interviews by time, question, or subject. Later I edited this video together as a documentary.
The kids of hippies are a fascinating subject to me, and not just because I am one. I believe they are a rare window into the results of an informal experiment in alternative child rearing.
Here's my thesis intro.
The hippies of the 60s and 70s created a social revolution that impacted everything from recycling to relationships, from tofu to what was taboo. They made their mark on everything from the halls of justice to the Halloween costumes you can, ironically, buy for your kid at your local drug store. Many hippies were weekend hippies who wore the costume and went back to work on Monday.
A few hippies however totally changed their lives and dropped out of normal society altogether.
Some of these hippies then had kids.
These kids were not hippies like their parents. They did not decide to drop out. They were born dropped out.
In 2008 six of these children of hippies, now in their 30s and 40s, were tracked down and asked to answer the same set of questions while recording themselves using a webcam. Several were also recorded by Caleb Clark in New York City.
The interviews are from an experiment in new interfaces for documentary video as part of a 2008 Masters Thesis at the Interactive Telecommunications Program at New York University's Tisch School of the Arts, also by Caleb Clark.
Sunni General Hassan Turkmani had imagined defending Syria by counting on its inhabitants. According to him, it was possible to take care of one another and to involve each community, with its own particular cultural relations, in the defence of the country.
This was just a theory, but we have been able to verify that it was correct. Syria has survived the assaults of the most massive coalition in human history, just as, during the Roman era, it survived the Punic Wars.
“Carthago delenda” (Carthage must be destroyed), said Cato. “Bashar must go!” echoed Hillary Clinton.
Those who still hope to destroy Syria have now understood that they will first have to crush its religious mosaic. So they vilify the minorities and encourage certain elements of the majority community to impose their cult on others.
It so happens that Syria has a long history of collaboration between religions. In the 3rd century, Queen Septimia Zenobia, who revolted against the Western tyranny of the Roman Empire and took over the leadership of the Arabs of Arabia, Egypt and all of the Levant, made Palmyra her capital. She took care to develop the arts, but also to protect all the religious communities.
In France, during the 16th century, we experienced the terrible wars of religion between two branches of Christianity - Catholicism and Protestantism. This situation ended when philosopher Montaigne managed to imagine interpersonal relations, which allowed everyone to live in peace.
The Syrian project, as described by Hassan Turkmani, goes even further. It is not a question of simply tolerating that others, who believe in the same God, choose to celebrate Him in a different way. It’s about praying with them. So, every day, the head of John the Baptist was venerated in the Umayyad mosque by Jews, Christians and Muslims [4]. It is the only mosque where Muslims have prayed together with a Pope - Jean-Paul II - around their common relics.
In Europe, after the suffering of the two World Wars, priests of the different religions preached that we should fear God here on Earth, and that we would be rewarded in the Beyond. Religious practices have evolved, but the hearts of mankind have weakened. In fact, God did not send his prophets to threaten us. Thirty years later, the young generation, who wanted to free themselves from these restrictions, suddenly rejected the very idea of religion. Secularism [6], which was a method of government designed to allow us to live together in respect of our differences, became a weapon against these differences.
Let’s not commit the same mistake.
The role of religion is not to impose the dictatorship of a way of life, which is what Daesh tried, nor to terrorise our consciences, as the Europeans had tried in the past.
The role of the state is not to arbitrate theological disputes, and even less to choose between religions. As in the West, political parties have aged badly in the Arab world, but as soon as they were created, the Syrian Social Nationalist Party (SSNP) and the Ba’ath Party intended to found a secular state, in other words, one which guarantees, to everyone equally, the freedom celebrate his own cult without fear.
That is Syria.
Anyone who visits this once ‘hermit Sultanate’ can easily testify: Oman is ‘different’ from the other countries of the Gulf region. Its people are warm, talkative and proud. Although Oman is poorer than Bahrain or Saudi Arabia, it actually feels richer, because there is no extreme misery there; the citizens are clearly well taken care of.
While in Saudi Arabia, during Ramadhan, outrageous orgies of food waste and vulgar wealth-flashing are performed on a daily basis, Oman is quietly trying to save children in neighboring Yemen instead.
An airport employee, Muhammad, explained to me:
“My country is habitually sending two flights per week to neighboring Yemen. During Ramadhan, the frequency increases. Our airplanes bring gravely injured and very sick men, women and children to Oman. Here, they get first rate and free medical treatment. Our doctors are trying to save their lives, as if they were our own people. Yemeni people are our brothers.”
This is quite shocking, considering that the militantly anti-Shia regime of Saudi Arabia (KSA) is actually bombing big parts of Yemen back to the Stone Age, while Oman’s neighbor – the United Arab Emirates (UAE) – is occupying the coastal area of Yemen, including its most important port, Aden.
The Syrians also have plenty of good things to say about Oman. I have heard praise all over the country.
In turn, the Syrian government is generally admired by Omani people; not by all, but definitely by the majority. Oman has always maintained diplomatic relations with Damascus, and never joined any coalition that has been trying to destabilize or to overthrow the legitimate Syrian administration. All this is in sharp contrast to Qatar and Saudi Arabia – countries that have been, for years, on behalf of the West and Israel, injecting and then supporting various terrorist organizations that have been brutalizing millions of Syrian citizens.
Oman does not have any US or EU military bases on its territory. It does not need them. It is not at war with anybody, and it is not trying to overthrow any regional governments. Hosting strategic bombers, US Navy ships, and ‘Central Commands’ is not how Oman’s rulers want to guarantee their country’s prosperity.
Instead, there is a magnificent opera house near the coast in Muscat, and right next to it, a lavish public palace dedicated to the arts. Despite the proximity of some luxury 5-star hotels, the beach remains public. The Ruler of Oman apparently loves music and the arts. A shocking contrast to places such as Saudi Arabia, where the arts and music are discouraged or outright banned, considered ‘haram’.
I spoke to Omani people, and they appear to be satisfied with their lives, and with the direction in which their country is evolving.
I stopped a group of men (Sunni Muslims), leaving a mosque, and asked them about the Sunni and Shia divide, as well as their feelings towards Iran, which is presently facing an imminent threat from the United States.
The Shia, they replied, “are our brothers”:
“Here, it is nothing like in Saudi Arabia, where they kill Shia Muslims. Nothing like in Bahrain, where most of the people are Shia but are treated with horrible spite, often having to live in total misery. We don’t differentiate and do not discriminate against the Shia. In Oman, we inter-marry, and it is not a big deal. Sometimes we break the fast together, and we bring gifts to each other. We help our neighbors when they are in trouble, and it matters nothing whether they are Sunni or Shia.”
Almost everybody here feels great sympathy for Iran and its people.
My driver has travelled to Teheran and Shiraz on nine occasions. He admires Iran’s culture, as well as the kindness and determination of the Iranian people. He strongly believes that they have the full right to live their own lives, free from the illegal sanctions imposed on them by Washington.
A group of worshippers, also expressed great admiration for Syria and its government, and then of the two countries that are now, apparently, reshaping the world:
“Without Russia and China, the United States and its allies would have already swallowed us all.”
Their support for the Palestinians, and their outrage over Israeli actions and apartheid, appear to be genuine, not hypocritical or ‘theoretical’ as in the rest of the Gulf.
I have always felt comfortable here, even during my previous visits, but this time, in the era of global madness that is being spread by the West, I felt greatly impressed by the wisdom, kindness and civility of this ‘forgotten Sultanate’, which possesses a big heart and an impressive understanding of the global situation.
...