Five Remarkable Facts about D-Day

Most of us are familiar with the basic facts of D-Day from watching films and series such as Saving Private Ryan and Band of Brothers.

So, today, I thought that I would share a few of the facts relating to the fascinating backstory of D-Day — a few of those factors and events of which most people are unaware but that account at least partly for why the Normandy Invasion, also known as Operation Overlord, is remembered as one of the most seminal events in history.

And the first remarkable fact about D-Day is that it almost never happened.

When a recently appointed Major General named Dwight Eisenhower was dispatched to England in 1942, part of his mission was to discuss the buildup of American forces in England so that an invasion of France could occur, preferably sometime that year.

But Britain regarded this idea with a measure of reluctance, if not a significant amount of dread.  It was locked in a desperate struggle to keep Rommel out of Egypt. And the thought of driving the scourge of Nazism out of the heart of western Europe seemed like a distant pipedream in early 1942.

Aside from that, the British had long, very painful memories of the costs of fighting the Germans on the continent. For that matter, they regarded amphibious invasions in general with dread.  The ill-fated Gallipoli invasion, which was conceived by Churchill and aimed at knocking Turkey out of World War I, nearly put an end to the future prime minister’s political career.  And it is still remembered today as one of the bitterest disappointments in British military history.

And let’s not forget that the memories of World War I were still fresh in 1942, as was the evacuation of Dunkirk, in which most of the British Expeditionary Force and many French troops were miraculously snatched from the maw of the German Wehrmacht.

Churchill had always regarded the invasion of France as a cosmic roll of the dice.

And what if it failed?  A defeat would not only kill any prospect for a return to the continent but might also lead the Soviets to work out a negotiated peace with the Germans.

Remember that the Soviets, to preserve their regime, sued for a similar peace in 1918, renouncing their claims on Poland, Finland, Belarus, the Baltic states, and Ukraine.  And considering that the Soviets were in a far better position, territorially speaking, in 1944, there was the distinct possibility that they would do so again.

The British held to a strategy that had served this maritime power reasonably well over the course of its 200-year imperial history.  They preferred a kind of Sun Tzu approach, applying pressure at the choke points of German power, ideally in the vast and vulnerable underbelly of Southern Europe, and employing their most formidable weapon: the Royal Navy.

For a time, they had their way with the Americans, which is a long and fascinating story in its own right.  The British unwillingness to undertake a full frontal assault on northern France in 1942 led the Allies instead into the Mediterranean and northern Africa, ultimately driving the Nazis off the continent and destroying Italian naval power in the Mediterranean — a crucial concern for the British, because this vast sea was not only considered their personal lake but a primary artery in the supply line to India and the dominions of South Africa, Australia and New Zealand.

The North African victory also positioned the Allies to invade Sicily and Italy — another key choke point that not only had the potential of taking Italy out of the war but also of forcing the Germans to invest additional manpower that was sorely needed in what was fast becoming a desperate fight for survival in the Soviet Union.

Churchill also eyed the Dodecanese islands in the hopes of bringing Turkey in on the side of the Allies, getting a foothold in Greece, and ultimately taking part alongside the Soviets in the liberation of Eastern Europe.

But the Americans also saw this ploy for what it was: a veiled attempt to preserve British power in the Mediterranean by guarding Greece and other parts of the eastern Mediterranean against the encroachment of Soviet power. The Americans eventually bought into that argument, but only after the war, as the Cold War heated up.

In the end, we went some distance with the British. We collaborated in the invasions of Sicily and Italy, but after that, we insisted that the next strategic target would be France.

But we remained in a bitter and sometimes recriminatory war of words with the British.  The British were determined to soldier on in Southern Europe, probing for weak spots.  To be sure, some British military leaders, notably Gen. Alan Brooke, chief of the Imperial General Staff, were open to an all-out assault on France, but only after it appeared that Germany had been mortally weakened elsewhere.

But we held firm. We owe much of that resolve to the Army Chief of Staff, George C. Marshall.  He remained the most unwavering proponent of the invasion of France.   In one notable respect, he knew better than the British.  He knew that democratic peoples lose interest in protracted wars. He knew that the fight would have to be taken to the plains of northern Europe where America’s industrial might, expressed in armor and air power, offered the best prospect for ending the war as quickly as possible.

In fact, a time or two, he even told the British pointedly that we would take our ships and tanks and troops and focus on the Pacific War with the Japanese.

And he was right.  The route through northern France offered the most direct path to the heart of Nazi military power: the Ruhr industrial region, which supplied the German war machine, and beyond that, Berlin.

Marshall was right — history has confirmed that.  But, in several notable respects, so were the British.  There was no way that the Allies could have staged an invasion of northern France in 1943.  The comprehensive strategic bombing campaign over Germany, which ultimately badly disrupted the ability of the Germans to move troops and supplies from one part of the continent to another, had not yet begun.

And the Germans had only just begun sustaining what would ultimately become the mortal blows of stiffened Soviet resistance.

And that brings us to remarkable fact No. 2: that we have Hitler partly to thank for the success of D-Day.

Chief among these, aside from Hitler’s decision not to invade Britain in 1940 — or, barring an invasion, to allocate sufficient naval forces to destroy her merchant marine and starve her out of the war — was his decision to invade the Soviet Union in June 1941.

Through the rest of that year, German victories were seemingly swift and effortless, but things began to change as the Germans drew closer to the defenses of Moscow.  Soviet resistance grew increasingly desperate and matched the Germans in its ferocity and brutality.

To Hitler’s credit, his stand-and-fight order before the gates of Moscow in December 1941 arguably was the one factor that prevented a complete German rout.

But following the disastrous battles of Stalingrad and Kursk, the most realistic German generals had concluded that the effort in the East was unsustainable.  It was no longer a war of conquest but a life-or-death struggle against Bolshevism, one that, they feared, would be carried to the very gates of Berlin.

And the toll that this war in the East was taking on Germany’s military machine was evident in France.

French civilians in the town of Montebourg noticed that the troops marching behind mounted officers down the main street were not singing the usual “Heidi-Heidi-Hos,” which had been such a familiar strain since 1940, but something strangely different.

They were so-called Osttruppen — eastern troops recruited into the German armed forces largely to avoid death from starvation and disease in German POW camps.

Their deployment in the East had met with little success, so many of them were consigned to Andrei Vlasov’s Russian Liberation Army.  Many of these had been sent to France and organized into battalions, though the German attitude toward Slavs had changed little.

German attempts to stiffen their resolve with propaganda about the plutocratic Americans and British didn’t help.

Many of these troops were used primarily in anti-partisan activities, and, predictably, many ultimately escaped to join the Resistance.  After the invasion, many others surrendered to the Allies at the first chance.

Only a couple of these battalions actually fought with resolve.

One of the most remarkable of these Osttruppen in Normandy was a Korean named Yang Kyongjong, who had originally served as a conscript in the Japanese Imperial Army — Korea at the time was a Japanese colony.  He was captured by the Soviets during the undeclared border war with the Japanese known as the Battles of Khalkhyn Gol in 1939.  He was subsequently consigned to a labor camp but was released and impressed into the Soviet Army, unwittingly becoming embroiled in one of the most desperate land wars in history.

He was later captured by the Germans at the Battle of Kharkov and then impressed into the German Army.  He was finally captured by the American Army near Omaha Beach.  Think about that for a moment: This hapless 24-year-old soldier had travelled the entire breadth of the Eurasian landmass at the hands of three different captors.

Initially misidentified as a Japanese soldier in a German uniform, he eventually was released from a POW camp and lived out his life in the United States.

The coastal defenses were also manned by a large number of German hard-luck cases from the Eastern Front.  These army units consisted of some 850,000 men of very mixed quality.  Many of the units were known as “ear and stomach” battalions, composed of soldiers from the East who had suffered combat-related stomach wounds or significant hearing loss — a rather sobering reality for the German High Command, considering that these troops were expected to follow oral commands.

Of the 36 Infantry divisions that comprised this group, more than half lacked transport and mobile artillery.

Curiously, in spite of these deficiencies, Hitler actually looked forward to the invasion. He believed the Allies would quickly be flung into the Channel.  And the Germans could then go about the desperate business of pushing the Bolshevik hordes back across the Russian steppes.

Remarkable Fact No. 3: that Dwight D. Eisenhower was far from the natural pick to command D-Day.

We are often inclined to think of Ike as the inevitable pick for D-Day.

He wasn’t.  Ike had proven himself as a capable commander in North Africa, Sicily and Italy, though he had made his share of mistakes.

Aside from that, the then-four-star general was a relative newcomer to high command.

Scarcely a year before the invasion, he still held the permanent rank of Lt. Colonel.

And to his credit, Ike had accumulated a significant amount of staff experience, working as a military liaison to Congress and, later, as MacArthur’s right-hand man in the Philippines in the 1930s.

During his brief tenure as a war planner, he drafted the initial plans for provisioning the southwest Pacific, which essentially involved writing off the Philippines as a loss and shifting the bulk of Allied efforts to Australia.

But despite all this experience, Ike had never commanded a battalion in battle.

And there were two older, considerably more experienced men who enjoyed support in unusually high places: General Alan Brooke, chief of the Imperial General Staff, Churchill’s choice for Supreme Commander; and General George Catlett Marshall, Chief of Staff of the U.S. Army, whom Roosevelt considered to be the natural choice for this position.

In Brooke’s case, he had been promised the position by Churchill.  To underscore his determination to award Brooke the command, Churchill had even mentioned it to Brooke’s wife.

And to be sure, Brooke was an impressive candidate.

Brooke had significant combat experience in World War I, notably introducing the French creeping-barrage technique to the British during the Battle of the Somme.

He also had significant experience in northern France in 1940, helping extricate the battered British Army and direct it to Dunkirk, where it was miraculously rescued.  His organizational skills were so impressive that Churchill later sent him back to France to oversee the evacuation of the remaining British troops.

He also possessed an unusually keen intellect and extensive strategic and logistical knowledge.

He had previously served as a lecturer at the Imperial Defense College and possessed a very strong knowledge of the men who would ascend to key leadership positions in the British Army in World War II.

As chief of the Imperial General Staff, he also commanded the strategic effort of the British imperial forces, largely through the force of his intellect and personality, even though Churchill served as his own defense minister.

But at staff meetings, he talked down to the Americans, regarding most of them as lacking strategic sense. This did not bode well for his prospects, considering that the Americans would have the chief input into the appointment.

George Catlett Marshall was Roosevelt’s choice.

His command experience was very similar to Brooke’s.  He was considered the country’s leading authority on logistical and strategic planning.

Marshall’s personal discipline was truly on the order of George Washington’s.  A mediocre student in school, he determined to follow older family members to Virginia Military Institute and to excel — a determination that only grew stronger when he overheard his older brother predicting to his mother that his attending VMI would end up bringing reproach to the entire family.

He exceeded all expectations.

Churchill described him as the organizational architect of victory in World War II.  He served as director of training and planning for the First Infantry Division in World War I and was responsible for planning the engagement that led to the first American victory at Cantigny in 1918.

Marshall was later transferred to the Headquarters Staff of General Pershing and became a key planner of military operations. He was instrumental in the planning and coordination of the Meuse-Argonne Offensive, which contributed to the defeat of the German Army on the Western Front.

After the war he was a key planner and writer in the War Department and later served with the 15th Infantry Regiment.

In 1939, he was appointed Chief of Staff of the U.S. Army by President Roosevelt and thereafter invested his formidable logistical knowledge in the reorganization of the Army, laying the foundations that would transform this rather provincial institution into the global army it is today.

Command of the D-Day landings was his for the asking.  And there would be an American, rather than a British, commander.  Roosevelt would see to that, because beginning with D-Day, the American contribution to the war effort would increasingly dwarf that of Britain.

But in keeping with his deep respect for democratic values and civilian control of the military, Marshall left this decision to President Roosevelt, who desperately wanted to keep him in Washington as Army chief of staff.

Even Roosevelt expressed his profound regrets.  As he acknowledged to Eisenhower shortly before his decision, every Civil War buff could name the principal field commanders in the conflict, though very few could name the Army chiefs of staff.

And for this reason, Marshall remains a relatively obscure American historical figure — one of the greatest tragedies of American history. He is arguably among the ten greatest Americans who ever lived, embodying a character and a commitment to American principles on par with that of Washington and Lincoln and serving a role every bit as indispensable to this nation’s fortunes.

We were fortunate to have Dwight Eisenhower at the helm on June 6.  But the man who conceived, planned, and lobbied for D-Day — typically against stiff opposition from the British — is the one who should have commanded it.

Remarkable Fact No. 4:

Another remarkable but largely unknown fact about D-Day is the operation that concluded the Allied campaign in Normandy some two months after the initial assault on the beachhead: the battle of the Falaise Pocket.

It’s an important operation not only for the role it played in ending the Normandy Campaign but also for the way it illustrates how Allied technology was brought to bear on the Germans with devastating and horrific effect.

In August, 1944, British General Bernard Montgomery had taken Caen, while Patton’s Third Army had wheeled around to the south of Montgomery’s operational area, leaving a large German-controlled remnant known as the Falaise Pocket.

The pocket was subjected to unremitting ground, artillery and aerial attacks.  German sources recall the conditions within the Kessel, or cauldron, as they typically described such pockets, as closely resembling conditions in Stalingrad.

I have little time to recount these horrific events, but the complete account is available via a simple Google search.

Unfortunately, Montgomery, who had primary responsibility for closing the Falaise Gap, allowed tens of thousands of Germans to flee.

But it did mark the beginning of the end of German resistance in France, and it drove home to growing numbers within the German military leadership that the continued resistance was futile.

As Dwight Eisenhower recalled after touring the pocket:

The battlefield at Falaise was unquestionably one of the greatest “killing fields” of any of the war areas.  Forty-eight hours after the closing of the gap, I was conducted through it on foot, to encounter scenes that could be described only by Dante.  It was literally possible to walk for hundreds of yards at a time, stepping on nothing but dead and decaying flesh.

Fifth and finally, do you know the most remarkable thing of all?

I’m standing here this afternoon in an air-conditioned conference room in the most materially prosperous nation in the world, speaking about one of the most seminal events in history, because tens of thousands of 18-, 19- and 20-year-old kids summoned the fortitude to storm a shore to rid the world of history’s greatest threat to freedom and human decency.

While you celebrate your weekend, which will mark the 71st anniversary of the D-Day landing, reflect for a moment on those 18-year-old kids in the landing craft approaching the shores of Utah Beach – shaking from fright, vomiting their guts out, thinking about how desperately they wanted to be home – on that farm in Kansas, on that factory floor in Detroit, in that college lecture hall in Auburn – anywhere but there.

D-Day and the military successes that followed were achieved by the sacrifices of young men, primarily between the ages of 18 and 22, who not only safeguarded but affirmed all of the values that we take wantonly for granted today in the 21st century: an America that, despite enduring some serious setbacks within the last 70 years, continues to inspire millions throughout the world.  And through the peace secured by these young men, America stands alongside a prosperous, unified Europe, with a democratic, unified Germany at the center of it all.

The preceding was a series of reflections on D-Day delivered to the local Opelika Kiwanis Club commemorating the 70th anniversary of the landings. 


Curators of Our Lifetime Journeys: A New Form of Spirituality?


An example of curating art at the Museum of Contemporary Art in Zagreb.  Photo: Courtesy of Myrian Thyes.

The last few years I’ve been acquainting myself with the writings of a British religious philosopher named Don Cupitt.  An ordained Anglican priest, Cupitt was once known as a theologian, but following his drift away from conventional Christianity, he now regards himself as more of a religious philosopher.

I’ve formulated views about God and faith that conform somewhat closely with his. After many years of struggling with faith, I have reached a few conclusions.  First, I believe that God represents the deepest yearnings of many human beings for something beyond themselves — what is commonly described as transcendence.  As I see it, God also expresses the human yearning for certitude, rootedness and permanence.

The Embedded God

But I’ve also concluded that most of us, actually, the vast majority of us, fail to see what God truly is: a human projection, one that is both transactional and intersectional.  By this I mean that God was formed by and lives within the day-to-day transaction of language and also through the cultural intersections forged across five millennia.

God is also embedded within a vast ecosystem — an immense network that encompasses not only language and culture but also science and technology.  All of these facets are closely interconnected and, consequently, undergo constant change.  I have come to describe this vast network as the Noncorporeal Human Exoskeleton.  In a very real sense, this network, this vast interconnectedness,  has sustained and protected us through the ages, much as corporeal exoskeletons — shells — have protected a number of species, such as turtles and crustaceans.

Virtually all of us on earth are connected to this network to one degree or another, although we in the West are products of an unusually deep, vast and highly generative expression of this network that extends back thousands of years. And particularly within the West, we are afforded the added advantage of being tied to the rapidly expanding digital facet of this network, which affords us virtually instantaneous access to acquired knowledge associated with 5,000 years of human learning and striving.

As this network changes over time, so does our understanding of God.

Our Evolving Understanding of God

Indeed, I’m struck with how rapidly our perceptions of God have changed even within my lifetime.

As a boy growing up in northwest Alabama, I recall that many older adults still subscribed to a vengeful, wrathful Old Testament and Calvinistic God. Now, to an increasing degree, God is perceived as a benign, loving parent.  Many no longer consider him a father figure at all but a genderless entity who, far from imposing a kingdom of God on humanity, encourages humans to build a secular kingdom of their own.

Many adults who grew up in faith traditions very similar to my own childhood tradition have rejected conventional conceptions of God entirely.

I relate all of this with no intention of minimizing or denigrating the importance of God to the fortunes of the West and to humanity in general.  But the fact remains that in recent decades, quantum leaps in technological advancement, particularly digital technology, have contributed to an immense expansion of our network, our human exoskeleton.  And one effect of this has been reflected in the profound changes in our understanding of God.  Among many of us, our views of God and transcendence have grown too large and complex to be contained within and expressed by the traditional creeds and liturgies of the church.

Our Mutating Exoskeleton — and Its Implications

But this is really nothing new.  Over the last 500 years of history, beginning with the Reformation and the advent of the printing press, our mutating network — our exoskeleton — has afforded an enhanced number of ways to interpret God.  The advent of the Digital Revolution has only accelerated these processes.

Moreover, advances in textual criticism of the Bible and evolutionary science — two advances that left many of the world’s leading theologians and philosophers deeply troubled beginning roughly 200 years ago — are no longer the exclusive domain of academic specialists.  These insights are no longer confined to musty library books but are now available literally at the fingertips of ordinary seekers.

And as these discoveries have increasingly become accessible and popularized, the matter of interpreting and worshipping God becomes only more challenging and problematic. Perhaps even the Quakers’ traditionally open, informal approach to worship is no longer big enough.

We Are All Nietzscheans Now

In a sense, we are all Nietzscheans now. We have been liberated, however unwittingly, through our own technological advances.  Ordinary spiritual travelers, not just professionally trained theologians and clergy, are now being challenged as never before to look beyond the historically prescribed notions of good and evil, to search for grains of truth within an immense, radically flattened information landscape.

To be sure, there will always be spiritual versions of those whom the Objectivist Ayn Rand disparagingly and rather unjustly characterized as second-handers — people who are afraid to take a leap into the dark and who are content to follow prefabricated expressions of faith: Catholicism or one of the numerous iterations of Protestantism.

They will likely represent the overwhelming majority of seekers for the foreseeable future.

Curating and Creating Our Own Sacred Spaces

However, for the rest of us, an embattled but growing minority, it would seem that the only alternative is to become as imbued as possible with the attributes our civilization has historically associated with God and, over the course of time, to serve as a beacon to others — in other words, to curate our faith — to become curators on behalf of others.

In carving out our own niches within this vast world in the course of our own solitary walks, we create our own sacred spaces.  And as we reflect on these experiences, these personal acts of creation, many of us feel compelled to share them with others in the hope that this sharing will enhance their own walks.  We will learn to curate our life experiences, much as a museum curator arranges art or historical relics to educate visitors.

I was reminded of this just today while reading a splendid book by Lloyd Geering, who anticipated Cupitt’s views on God by a full decade.  As Geering observes, each of us is born into and cultivates over the course of a lifetime his or her own womb of culture.  He appropriately cites a memorable quote by the Austrian scholar, writer and inventor Josef Popper-Lynkeus: “Every time a man dies, a whole universe is destroyed” – all the more reason why we should feel compelled to curate the experiences of our lifetime journeys on behalf of others.

By curating what we learn and discover on our own, we not only conceive and create our own sacred spaces but also open these up to other spiritual wayfarers.

To view it another way, the advances in digital media and the tidal wave of information that has followed have transformed us not only into unwitting Nietzscheans but also into Martin Luthers.  Many of us, secularist and believer alike, have undergone our own transformational Tower Experiences, some spiritual, others entirely secular in nature.  These have not only placed us on a moral and ethical path but have also, in many instances, proven so compelling that we are driven to share them with others.

Yet, in the midst of this spiritual or secular transformation, many of us have been confronted with a measure of moral and ethical disruption wrought by the Digital Revolution, one that has sown far more disorder and anomie than any previous social and cultural revolution.

Cupitt has observed this disruption in the course of his own efforts to impart his radical understanding of faith. Many of his admirers — followers would be too strong a word in this context — have improvised a series of loose networks in Britain and the Antipodes, known as the Sea of Faith Networks, to discuss and debate the implications of his writings.

Yet, from the beginning, these networks have evinced highly innovative and even fissiparous tendencies. They are yet another testimony to the difficulty of creating a faith community in the midst of a technological revolution, one in which knowledge is expanding at such volumes and speeds that human beings lack the ability to improvise new modes of thinking and social structures fast enough to accommodate these rapid changes.

Improvising Our Own Solitary Treks

We are being challenged — perhaps forced would be a better choice of words in this context — to improvise our own solitary treks through life. We are being forced to undertake Kierkegaardian leaps into our own tailormade faiths.

And to be fair, what we are talking about isn’t so much a leap of faith as a solitary walk through the vicissitudes of life.

All we really have in the end is our own personal fortitude and courage to travel through this mortal existence and to follow the bread crumbs dropped by earlier generations of wanderers — that and the freedom to reflect on our own experiences and to leave behind bread crumbs for future generations of wayfarers.


A Strange Symbiosis in North Korea


Kim Jong-un. Photo: Courtesy of Blue House (Republic of Korea) 

A few years ago, National Geographic dispatched a documentary camera crew to North Korea to highlight the efforts of Nepalese ophthalmologist and surgeon Dr. Sanduk Ruit, who performed small-incision cataract surgeries on roughly a thousand North Koreans.

Ruit has completed some 100,000 of these surgeries across the world as part of a charitable endeavor.

I was struck by what followed.  Sometime after these surgeries were completed, the patients assembled in a large, shabby auditorium for the post-operative checkup.  As soon as their bandages were removed, a remarkable thing followed:  All of them, apparently without exception and with no prompting, marched to the front of the crowd, bowed before large portraits of Kim Il-sung and his son, Kim Jong-il, effusively praising and thanking them for the surgery, seemingly indifferent to the indispensable role Ruit had played in restoring their sight.

Perhaps none of these patients ever entertained the thought that the appalling material circumstances and nutritional deficiencies that contributed to their cataracts stemmed from the despotic exploitation of their rulers, the Kims.

This bizarre incident prompted some thoughts about the strange symbiosis that has prevailed over the last 70 years between ruler and ruled in the Hermit Kingdom.

The simple fact that there are so many wretched, ignorant masses in North Korea largely accounts for why this strange dynastic symbiosis exists in the first place.   I think a strong case can be made that despotisms survive only where the majority of people are ignorant of the real factors behind their suffering.  And this partly accounts for why the Kims so wantonly indulge their opulence amid their country’s appalling squalor.

Only a few days ago, the current Kim dynast, Kim Jong-un, undertook his first official international trip, traveling in his late father’s lavishly equipped train. The train functions essentially as an American Air Force One on rails — part mobile state palace, part command center, lavishly equipped with conference rooms, an audience chamber, bedrooms and, of course, satellite phone connections and flat-screen televisions.

Reflecting on a trip to Russia in 2011 by the previous dynastic despot, Kim Jong-il, a Russian official recalled that his heavily armored 90-car train, which was preceded by a reconnaissance train and followed by a security train, was staffed by beautiful women and sumptuously stocked with Russian, Chinese, Korean and French cuisine.

American and Western European heads of state have never traveled so opulently.

Whenever I read about all of this conspicuous consumption associated with the Kims, I’m reminded of ancient Egypt where a handful of royals lived in unimaginable splendor amid the toiling masses who struggled to eke out a meager subsistence or starved trying.

And bear in mind that we are talking of a dynasty that governs a country with an economy roughly the size of the central African nation of Gabon, albeit a Gabon with an incipient nuclear arsenal and a cyber-terrorist sector.

From my Western perspective, it seems remarkable that a dynasty ruling a country as desperately poor as North Korea would deign to travel in a train more lavishly equipped than, say, Adolf Hitler’s Führer train.  But then, this glaring anomaly explains everything about North Korea.

Kim’s opulence is intimately bound up in his country’s abject poverty: Without the Kim family’s obstinate pomposity, North Korea would be seen for what it truly is: a squalid Third World hellhole soldiering on amid one of the world’s most prosperous, economically vibrant regions.

In their own perverse way, the Kims, by employing a large army, nuclear saber rattling and cyber warfare and, of course, lavish lifestyles, provide the bedraggled masses with their own sense of majesty and omnipotence.  In the midst of all their suffering and squalor they are comforted by the myth that they inhabit one of the world’s most significant nations, a nuclear armed behemoth governed by geniuses with divine attributes who routinely defy the most evil power in the world and the greatest existential threat to their country: the United States.

At least, that seems to be the theory, one that, until recently, appears to have worked reasonably well.

Western monarchies have developed their own symbiotic relationships with their people.  But compared with North Korea’s, these have been relatively benign, if not largely beneficial to both ruler and ruled.

Yet, as symbioses go, the North Korean monarchy — and that, in effect, is what the Kim dynasty is, a de facto monarchy — is a strange one indeed, the strangest in modern history, fraught with all manner of instability and risk, not only to the Kims and to the people of North Korea but also to humanity at large.

Yet, somehow they — the Kims and their subjects —  manage to soldier on.


Richard Dawkins and the Decline of Christianity


Wilmersdorfer Mosque, Berlin. Picture: Wikimedia Commons.

I read with great interest this morning that the great evolutionary biologist and atheist Richard Dawkins has warned Europe, the cradle of both Christendom and Western Civilization, not to abandon Christianity wholesale.

As Dawkins sees it, Christianity, compared with Islam, is a relatively more benign faith. And the fact that Dawkins quotes Hilaire Belloc, who, along with Belloc’s friend and collaborator, G.K. Chesterton, was one of the greatest apologists for the Christian faith, lends some remarkable insight into his thinking.

“Before we rejoice at the death throes of the relatively benign Christian religion, let’s not forget Hilaire Belloc’s menacing rhyme: ‘Always keep a-hold of nurse for fear of finding something worse,’” Dawkins warned in a recent tweet.

Dawkins obviously perceives the effects that likely will unfold in Europe as the birthrates of European Muslims increase relative to non-Muslims. Increasingly within Europe, there appears to be nothing in place to provide a counterweight to the rapid demographic onslaught of Islam throughout the continent — a faith that, broadly speaking, at least, holds no truck with Dawkins’ tradition of atheism and free thought or with traditional Western notions of liberalism and the open society.

The Coming of Sharia Law in Europe

It seems that former Archbishop of Canterbury Rowan Williams, who predicted that Britain and, for that matter, other European countries ultimately will adopt elements of Sharia law, will be proven right in a few more decades.

Demographic trends in Europe paint a disturbing picture of what’s to come.

Based on a recent survey of European 16- to 19-year-olds, the Czech Republic is the least religious country in Europe, with some 91 percent of people within this age group professing no religious affiliation. Between 70 and 80 percent of young adults in Estonia, Sweden and the Netherlands also categorize themselves as non-religious.

In Poland, the most religious country in Europe, some 17 percent of young adults claim to be non-religious, while a fourth of young adults in Lithuania profess no belief.

I have my problems with Christianity, particularly evangelical Christianity. I think both the great strength and misfortune of evangelical Christianity stems from a set of historical realities that forced it to embrace an unusually radical form of Sola Scriptura – something that earlier Reformers, Luther and even Calvin, were unwilling to do.

Essential Scaffolding

If my study of networking and what I’ve come to call the “human exoskeleton” has underscored anything, it’s that we’ve got to look at all human civilization, but particularly its crowning achievement, Western Civilization, as a highly nuanced network, as an intricate system of interrelated parts.

All that we know of our civilization has been supplied by this interconnected scaffolding — everything that has been enlisted over eons in the construction of it: language, culture, faith and technology, for example. And Christianity has provided the basis of much of the scaffolding of Western civilization. Equally important, it has provided Westerners not only a means to sort good from bad on an individual basis but also a standard for measuring the value of civilization as a whole — a way to appraise the merits of their civilization vis-à-vis others.

I’m convinced that the propagation of the culture and values of the West on a global scale would not have been possible without the edifying and sustaining attributes of Christianity, both in its Catholic and Protestant forms. I seriously doubt that Westerners could have otherwise summoned the courage and resolve to undertake such a difficult task.

Christianity has contributed in a multiplicity of ways to the formation and advancement of Western civilization. We now take much of this legacy woefully for granted.

The Advent of Secular Liberalism

And now that so much of the civilization of the West has been supplanted by liberal secularism, itself an outgrowth of Christianity, we lack even the standard with which to identify the factors that threaten us, particularly the mass introduction of Islam into the heart of Europe, the cradle of both Western civilization and Christendom.

Simply put, Christianity no longer serves most of Europe as a moral and ethical backstop or as the means of assessing and affirming the quality of Western Civilization.

Stephen Bullivant, professor of theology and the sociology of religion at St. Mary’s University, London, is the researcher who compiled and reported the survey of religious beliefs of young Europeans in a report titled “Europe’s Young Adults and Religion.”

Bullivant has concluded that religion is now moribund. “With some notable exceptions,” he says, “young adults increasingly are not identifying with or practicing religion.”

And he argues that this trajectory is likely to become even more marked in the future.

“Christianity as a default, as a norm is gone, and probably gone for good — at least, for the next 100 years,” he says.

The Mixed Blessings of Sola Scriptura

This brings me back to the Sola Scriptura doctrine, or rather to the irony of it. A set of circumstances confronting Martin Luther forced him and subsequent generations of Protestants to put most of their eggs into a single basket, namely the emphasis on scripture as the primary basis of church authority.

Remarkably, Luther’s and subsequent reformers’ emphasis on scripture sparked an unanticipated effect: an explosion in literacy among the masses that placed the West on a trajectory toward scientific and technological transformation, one that, in strictly material terms, has changed the West and the rest of the world decidedly for the better.

Yet, this had the entirely unintended effect of rendering Christianity susceptible to two remarkable achievements that grew out of these advances: textual criticism of ancient scripture and the advances in evolutionary science sparked by the research of Charles Darwin.

As I’ve mentioned previously, University of North Carolina at Chapel Hill Professor Molly Worthen explores this great unintended effect, at least, as it has unfolded within the American religious and cultural context, in her excellent book, “Apostles of Reason: The Crisis of Authority in American Evangelicalism.”

Catholicism, the world’s preeminent Christian faith tradition, though one less reliant on the authority of scripture, grapples with the same crisis, of course. For both the Catholic and Protestant faith traditions, the dramatic advance of secularism has undermined their ability to carry the very best and enduring attributes of the Christian faith — the parts that have served us so well over two millennia — into the future.

One example of this impending loss: the longstanding Catholic opposition to abortion, one that has been embraced by evangelical co-religionists within the last couple of generations. This traditional Catholic resistance to abortion provides a very important check on a civilization that seems on the verge of turning away from the West’s historically high standard for the value of life.

Consider how quickly Nazi Germany embraced mass extermination as the Christian influences within Germany receded in the 1930s and ’40s, and one gets an idea of the spiritual malaise that ultimately may take hold of our civilization.

Simply put, in the two centuries since the advent of David Strauss, the father of textual criticism of scripture, and Charles Darwin, the father of evolutionary science, the efforts of Western religious thinkers to supply a compelling moral response to the effects of material progress and the advent of secular liberalism have been seriously undermined.

Granted, post-theism is the reality that scientific discovery and progress have handed Westerners over the last two centuries. But this should not detract from our paramount need to reconstruct the most valuable aspects of the Christian scaffolding into a modern, post-theistic faith. We must find some way to build new scaffolding that not only acknowledges but also borrows significantly from Western civilization’s historic Christian heritage.


The Creaky Scaffolding of the West


Photo: Courtesy of Condrinb

Evangelical Christians have labored for decades under the assumption that they can reclaim, piece by piece, the secular culture of the West through active engagement with it.

Like Sisyphus and his stone, millions of Christians, despite setback after setback, remain undaunted and determined in this onerous task. Apologetics classes and other related evangelical efforts to engage with secular culture are exploding. As one recent Patheos column observed, a Google Search of the phrase “engage with culture” yields almost a half million results.

Yet, as a growing number of evangelical thought leaders are realizing, this longstanding effort has proven futile. With each passing year, the secular culture drifts further left, away from the Christian moorings that once defined and secured Western culture.

As the Patheos column argues, every attempt to reverse this drift has yielded pitifully small results, if any at all.

The column concludes that evangelical Christianity stands little chance of making headway in the Culture Wars, not only because the vast majority of ordinary Americans now identify with the predominant culture but also because the elite, culturally hegemonic segments of American society regard evangelical Christianity as low status.

This reminds me of something I heard recently that was shared a few years ago by the now-deceased Christian scholar and theologian Marcus Borg. He related that in the religion classes he taught at Oregon State University, his students’ body language underwent a discernible change from engagement to disengagement and even hostility whenever the topic switched from, say, Hinduism or Buddhism to Christianity.

This speaks volumes about the intractable challenges Christianity faces today.

Throughout secular culture, Christianity evokes not only an ambivalent response among Americans but even an antagonistic one in some quarters.

All of this arguably can be traced back to the opening of two Pandora’s boxes roughly two centuries ago when German scholars first began laying the foundations of textual criticism of New Testament scripture and when Charles Darwin began articulating what ultimately became known as evolutionary science.

These discoveries have wrought major changes in Western society, many of which, to be sure, have conferred numerous benefits on our species. This especially applies to evolutionary science, which has provided the basis for advances in medicine and other scientific disciplines. Many of these strides would have been scarcely imaginable only a few decades ago.

But they, along with many of the insights garnered from textual criticism, have also sparked a profound existential crisis in the West. These have dethroned humanity from what was perceived for centuries as its central place in Creation, one ordained by God, who established humanity as the crowning achievement of this divine undertaking.

In existential terms, these intellectual advances have been regarded by many thinkers as catastrophic for the future of the West. And, incidentally, in the full interests of disclosure, I should stress that I’m neither an evangelical Christian nor a theist.

I count myself a nontheist, but I won’t go to the trouble here of explaining all the rather subtle differences between nontheism and atheism. Suffice it to say that I believe that everything that we have achieved has been the result of a network that has developed over eons and that has grown primarily out of language and written script. Religion has historically been bound up in this network and has afforded humanity all manner of advantage in terms of providing a sense of purpose and keeping all of the psychological furies and common human fears at bay.

In a very real sense, religion is both cognitive software and technology, much as language is.

This networking can also be viewed as scaffolding – in fact, within this context scaffolding can be used interchangeably with networking. Like networking, it underscores how everything that humans have developed across time is essentially interrelated, contingent on everything else. It’s all connected, bound up in a vast network that extends across eons and that undergoes constant change and refinement.

The Christian faith amounted to the most valuable of this scaffolding. And with the erosion of this scaffolding – at least, a significant part of it – it remains doubtful whether humans will manage to construct anything of equal and enduring value in its place.

So much scaffolding of the West two centuries ago was bound up in orthodox Christianity. The promise of an afterlife, coupled with the fear of eternal damnation for egregious offenders, provided an integral, if not essential facet of this scaffolding. These facets of Christianity breathed life into the faith and provided the West with some of its strongest and most enduring scaffolding, at least, until the mid-19th century.

But textual criticism and evolutionary science have compromised this scaffolding. In the minds of the most culturally influential members of our society, these advances put the lie to Christianity.

Friedrich Nietzsche, as memory serves, believed that the destruction of this old order ultimately would clear space for the emergence of a new breed of enlightened, well-integrated humans who would put aside the old slave morality of Christianity and construct a new ethos better aligned with the true nature of our species and better equipped to maximize human potential.

Perhaps advances in Artificial Intelligence will finally enable us to construct a viable alternative to this old scaffolding over the course of time, but, frankly, I harbor serious doubts.

Someone once said that evolution is parsimonious, working only with the stuff that is available. Perhaps evolution will never settle on new scaffolding that supports humanity and civilization in the way the previous scaffolding did – that fills the deep existential void that characterizes, to one degree or another, the lives of most of us.

Perhaps in this respect humanity has reached an evolutionary dead end. Perhaps we are destined never to recover.


East Germany and the Strange Case of Symbolic Fusion


Officials celebrating 40th anniversary of the establishment of the German Democratic Republic (East Germany).  Note the display of red flags and the distinctive East German Hammer and Compass with the liberal democratic German tricolor, first adopted in 1848.

People wonder why I’m fascinated by flags.  It’s because flags tell a story, sometimes good, sometimes bad, about the nations that they represent.

I’m especially fascinated by the flags and symbolism of East Germany.  They represent a protracted, bitter struggle on the part of East Germany’s ill-fated leaders to secure legitimacy, not only among the nations of the world but also among their own people.

This struggle was bound up in the legacy of the East German state, one as remarkable as it was tragic.  East Germany – the Deutsche Demokratische Republik (German Democratic Republic), as it was officially known – was the product of one of the most brutal conquests and occupations in human history. An estimated one million German women suffered rape at the hands of the Soviet conquerors, and following this conquest, the Soviets removed a third of the region’s industrial capacity and extracted an additional $10 billion from its industrial and agricultural sectors as war reparations.

This was only the beginning: Following this brutal conquest, the Soviets were reluctant to let go of this zone of occupation and to allow eastern Germany to be reunited with the rest of what remained of post-war Germany, although they actively considered it from time to time.  They viewed their sector as a vital geopolitical asset – the crown jewel of the Soviet Empire, much as the British had regarded Imperial India.  Eastern Germany served the Soviets as a vital strategic asset, safeguarding Soviet interests in the very heart of Europe.  For the Soviets, controlling this part of Germany, historically known as Prussia, greatly reduced the chances that a unified Germany, Russia’s historic foe, would rise again to threaten it.

This Soviet legacy of brutal conquest, occupation and exploitation presented the leaders of East Germany with a daunting set of challenges. Many ordinary East German citizens understandably reviled the Soviet Union and regarded the East German government as simply an illegitimate extension of Soviet power. The loss of some 3 million Germans who fled the sector from 1945 until the construction of the Berlin Wall in 1961 attested to the enmity that millions of East Germans felt for their Soviet occupiers and their German lackeys.

East German True Believers

Yes, there had always been socialist and communist sentiment in eastern Germany, and, yes, there were some Germans who were genuinely receptive to a socialist or even a communist regime, even a Soviet-imposed one. These sympathies were reinforced by Germany’s defeat in World War II and by what was widely perceived as the deep betrayal of the German people by Adolf Hitler and the Nazi regime.


Hans Modrow, a post-war German convert to communism and the last Communist premier of East Germany.

The utter defeat and dispossession of Nazi Germany profoundly altered the thinking of many Germans, particularly young post-war Germans. Hans Modrow was among them. Captured as a teenaged German soldier by the Soviets in 1945, Modrow felt grateful for being fed and assigned work as a POW instead of being killed, which was more than could be said for the millions of Soviet soldiers who fell into German hands during the war. That made a deep impression on Modrow and other POWs.  He ultimately joined the East German communist party, though he remained throughout his adult life a maverick and rather dissident member, later sympathizing with Mikhail Gorbachev’s calls for glasnost.  Modrow was selected in 1989 as the last Communist premier of East Germany, serving until the election of the first fully democratic East German government.

But Modrow, at best, represented only a minority of Germans in the Soviet sector.

For many other Germans in the Soviet zone of occupation, communist loyalties didn’t come this easily.  For their part, the Soviets were aware of the daunting challenges involved in forming a separate socialist German state.

Socialism in One Occupation Zone

For a time, the Soviets collaborated with the Western Allies – Britain, France and the United States – with the express goal of securing a neutral Germany at the heart of Europe, one patterned after Austria.  But after repeated disagreements with the Soviets, the Western Allies in 1949 established their own government in the western zones, formally named the Federal Republic of Germany and more commonly known as West Germany.

The Soviets then undertook their own state building in the eastern sector, but they knew that to succeed they would have to create the illusion of a state with many of the attributes of  a Western-style democracy.  The Potsdam Agreement, negotiated among the Soviet Union and the Western Allies, called for building democracy in Germany.  And the Soviets were content to do so, as long as the democracy that emerged in their sector placed East Germany squarely on the path toward a full-fledged Marxist-Leninist state.

While they had every intention of ultimately creating a communist state, the Soviets dressed up their state-making in liberal democratic clothing.  And these efforts were reflected in the symbols the communists developed for the East German state, including flags.

Virtually from the beginning of their occupation of their sector, the Soviets staged elections, inviting several “anti-fascist” parties — the Christian Democrats, Liberal Democrats and Peasants — to participate.  At Stalin’s request, the East German leaders even organized the National Democratic Party to accommodate former Nazi Party members, former Wehrmacht personnel and members of the displaced middle class.

The Clasped Hands of the Socialist Unity Party

Moreover, the Soviets, having concluded that the Communist Party was incapable of winning a democratic majority on its own, forced a merger with the far more popular and democratic socialist party, known in Germany as the Social Democratic Party.  They staged a joint conference of the two parties, and at its culmination the party leaders — Communist leader Wilhelm Pieck and Social Democrat Otto Grotewohl — rose to shake hands to affirm this unity.   This merged party became known as the Socialist Unity Party, and while it purported to be a unity party, it functioned as a party undergoing a transition into a full-fledged Marxist-Leninist party patterned after its Soviet counterpart.


Erich Honecker (right) and other leaders pictured in front of the clasped-hands Socialist Unity Party emblem at a party meeting in Karl-Marx-Stadt (Chemnitz). Photo: Courtesy of the Deutsche Fotothek of the Saxon State Library.

The image of two clasped hands became the symbol of the party, rather than the iconic Hammer and Sickle, which was far more commonly associated with Marxist-Leninist parties.

This marked the beginning of a long-term East German practice: the use of ambivalent symbolism to project to the rest of the world that East Germany was a People’s Democracy rather than a full-blown Marxist-Leninist state in the tradition of the Soviet Union.  East Germany purported to function as a people’s democracy, governed principally by the Socialist Unity Party but supposedly in dialogue with other, so-called Bloc Parties, such as the Christian Democratic and Liberal parties.  All of these Bloc parties were afforded a limited number of seats in the Volkskammer, the East German parliament.

The same method was employed in the East German youth movement.  The Soviet and East German communist authorities invited young people from other anti-fascist backgrounds to participate, but over time the democratic elements were rooted out and the movement was transformed completely into a communist-led youth organization.


A pennant of the National Democratic Party of Germany, which was composed largely of former Nazi Party members.

In time, virtually all the symbolism of the East German state reflected this intentional obfuscation.

The communist regime settled on the Weimar Republic flag, bearing three horizontal stripes of black, red and gold, a design that dated all the way back to the liberal Revolution of 1848 and that was also chosen by West Germany.  There had been some support for adopting the black-white-and-red imperial flag, which had been used by the National Committee for a Free Germany, an anti-fascist group organized among captured German officers during the war.

Both East and West Germany flew the plain Weimar Republic flag until 1959, when East Germany affixed its national emblem to the flag: a wreath of grain wrapped in black, red and gold bunting and enclosing a hammer and compass.  The wreath of grain represented farmers; the hammer, workers; and the compass, intellectuals and technocrats.


The Hammer and Compass, East Germany’s coat of arms, which was affixed to the German tricolor flag in 1959.

The national emblem departed significantly not only from previous German symbols but from older heraldic symbolism in general.  Most Western countries, particularly western European countries, followed the time-honored rules of heraldry, which governed how shields, wreaths and other symbolic devices related to one another.

East Germany and other communist countries, largely to underscore that they represented a new departure in human history, broke with these old heraldic practices.  They designed new symbols that drew from a variety of graphic elements, some rather mundane, such as the Romanian coat of arms, which featured an oil derrick set amid a forested mountainside with a sunburst in the background.

Yet aside from the inclusion of this somewhat communist-looking emblem on its national flag, most East German flags carried few references to communism.


Young people of the Free German Youth celebrating under a banner featuring the organization’s sunburst emblem.

The flag of the Free German Youth, which had effectively expelled its non-communist elements, was entirely bereft of communist symbolism.  The same was true of the East German Pioneer organization, roughly the equivalent of our Cub Scouts.

Inspired by their Soviet patrons, East German leaders were convinced that they could create an entirely new socialist man, much as the Soviets believed they had.

Yet all of these attempts at fusing liberal democratic forms with Marxist-Leninist practices and symbols to create the appearance of constitutional democracy did little to engender loyalty among ordinary East Germans.

A Failed Socialist State

What the East German leadership created instead was a sham: a state in which millions of discontented people were forced to eat rationed, substandard food, to use shoddy products, and to subscribe to discredited ideas, even as their fellow Germans across the Elbe River in the West created one of the most successful countries in the world, an economically vibrant constitutional democracy.

The East German people eventually rose in righteous indignation.

The flag of the Revolution of 1848, embodying the values and ideals of 18th-century liberalism, flies over a unified, capitalist and liberal democratic Germany today.  And the vast majority of Germans living east of the Elbe River are happy that this flag is bereft of the hammer, compass and grain wreath that once symbolized the peculiar Marxist-Leninist interlude of German history known as the German Democratic Republic.


The Limits of Autodidacticism, Part II


Churchill practices with a Sten gun during a visit to a Royal Artillery experimental station in June 1941.

I’ve fallen into referring to New York Times columnists David Brooks and Ross Douthat as the dancing conservative bears of the Mainstream Media.  Their presence on the Times’ editorial page essentially serves to assure the Times’ erudite readers that they are tolerant and enlightened enough to withstand at least a mild daily exposure to conservative opinion.  But like all dancing bears, these two columnists remain relevant to their readers only so long as they stay within the circle and don’t wander off.

Even so, I must agree with Brooks’ recent appraisal of Donald Trump’s rowdy, unconventional management style.

Trump essentially has sold himself to millions of heartland Americans as a gifted autodidact — a novice who not only brings a fresh, take-no-prisoners approach to the presidency but is also capable of learning quickly on the job.

Heartland voters elected him to cut through the Gordian knot and, if you will excuse this mixing of analogies, to drain the swamp.  But as I mentioned in a previous post, there are limits to autodidacticism – self-taught learning.

We can draw some lessons here from another leader widely regarded as a gifted autodidact, arguably modern history’s greatest one: Winston Churchill. To a significant degree, Trump is the populist right’s Churchill. Like Churchill, he is perceived by his supporters as an erratic, albeit talented and strong-willed outsider – an anti-establishment scrapper and self-taught learner ready and willing to employ unorthodox methods.

But like Churchill, Trump is confronted with the reality of governing a nation still run on conventional rules – complex, technocratic ones.  And because of his outsider, maverick status, he’s been forced to forgo establishment insiders and to run his government with second- and, increasingly, third-string players.

And as all maverick leaders, at least the ones ultimately proven great, finally realize, sooner or later they must broker some compromise with hard reality and realpolitik.  Learning on the job carries one only so far.  One eventually must invest a measure of trust in the specialists, those who have acquired a strong working knowledge of all the nuances of statecraft.

Interestingly, I was reminded of this only a few days ago while watching a recent film exploring Churchill’s wartime premiership, starring the unusually gifted and versatile Scottish actor Brian Cox as Churchill.  As it turned out, the film dealt with the limits of Churchill’s autodidacticism and maverick views.  The very traits Churchill summoned to steer the British elite away from a negotiated peace with Hitler and to inspire the British people during the dark days of the Blitz arguably did not serve him as well as the war dragged on into 1944 and as American and British war planners struggled to settle on a strategy for mounting a cross-Channel assault on Normandy.

As a former soldier and policymaker, Churchill had amassed an encyclopedic knowledge of geopolitics and war-making strategy.  But he had spent a lifetime struggling to master military tactics and strategy while juggling many other demands: serving in Parliament and building a successful column- and book-writing career, to name a few.

Compared with his military planners, Churchill remained a strategic novice.  He lacked immersion in military planning – the general staff training and, in Eisenhower’s case, the Army War College instruction that distinguished the backgrounds of most top British and American brass by the mid-twentieth century.

Small wonder, then, that Churchill’s insistence on micromanaging grand strategy in the later war years pushed the Imperial General Staff and their American counterparts to the limits of their tolerance as D-Day approached.

Churchill, still haunted by the fiasco at Gallipoli, held strong but antiquated views on the use of military power.  He anticipated that the assault on Normandy would turn out to be a bloodbath.  He had failed to see how the mass production of fighters and bombers, armor and artillery, and the refinement of strategy associated with those technologies had tilted the balance of military power heavily in favor of the Western Allies.  He did not fully grasp how these advances not only afforded the Allies enormous strategic advantages but also offered the real prospect of reducing battlefield casualties.

Churchill had reached the limits of his autodidacticism.  There came a point at which Churchill, for the sake of prosecuting the war against the Axis, had to step aside and leave grand military strategy to the professionals.

Trump, sooner or later, will reach the limits of his own autodidacticism.  Most self-learners do at some point: they are forced to delegate critical tasks to professionals – trained specialists – who have acquired a far more nuanced understanding of their fields.

The fact remains that for as long as governments operate on highly complex, technocratic rules, we will need highly trained and accomplished technocrats.
