Wednesday, December 25, 2013

Swan Song?

  Can a blog sing a swan song?  If so, this last posting of 2013 may be The Berkeley Write's.

  As I noted recently, I have been posting on this blog weekly for almost two years, managing in nearly 100 postings—I hope coherently—to string together close to 100,000 words in the process.  I suddenly feel that I have run out of two vital ingredients: interesting new ideas to write about and the ardor to keep up the pace.  Deeper down, I have an all-consuming fear of becoming more repetitive, more trivial and triter than I may already have been.  So, at least for the nonce, and maybe forever, I am taking down The Berkeley Write's masthead.

  If this really is The Berkeley Write's swan song, what could be more appropriate than to include in its lyrics a thank-you to all those readers who have praised its contents.  I would have stopped long ago had I not been so encouraged.

  I wish you Happy Holidays and a great 2014!

George Turin

Wednesday, December 18, 2013

Will We Ever Learn?

  The scene: The wake of a major recession, caused by excesses by wealthy individuals and corporations.  The nation, particularly the middle class and small businesses, has been severely damaged.  The Democratic Party presses for reform and more regulation.  The Republican Party splits between its mainstream conservatives, who support the status quo, and a fringe group demanding a sweeping change in the party's direction.  The time: 2013, after the Great Recession of 2008? 

  Nope—I am writing about the turn of the 20th century, after the devastating Panic of 1893.  Ironically, the Republican Party's fringe were then left-wing progressives calling for legislation that would limit the power of the great trusts and monopolies, and provide more protection for unions and the average citizen.  (Remember: the Republican Party was at that time truly the party of Lincoln.)  Fortunately, the president was Theodore Roosevelt, one of the Republican fringe who—after being relegated to the powerless office of Vice President by the mainstream of the party—had succeeded to the presidency on the assassination of President McKinley.  By virtue of that accident of history, Roosevelt almost single-handedly launched the Progressive Era of the first two decades of the 20th century.

  That era is the subject of an excellent new book by Doris Kearns Goodwin: The Bully Pulpit: Theodore Roosevelt, William Howard Taft, and the Golden Age of Journalism.  As the title suggests, it is a twin biography of Roosevelt and Taft, the 26th and 27th presidents of the United States.  It also contains a series of mini-biographies of the so-called muckraking journalists, notably S. S. McClure, publisher of the influential McClure's magazine, and those who wrote lengthy exposés for it—among them Ida Tarbell, Ray Stannard Baker, Lincoln Steffens and William Allen White.  It is a massive book (over 900 pages, about a third being copious end notes documenting its many quotations) but for all its length a very worthwhile read.

  I need not summarize Bully Pulpit here.  That has been done in a splendid review in the New York Times by Bill Keller, formerly executive editor of the paper.  Suffice it to say that the biographies of Roosevelt and Taft are fine pointillist paintings of the men, their families, their philosophies and their long-term interdependence—gripping even for those who have read about them previously.  The biographies of the muckrakers are much shorter, but explain how each became a stentorian voice for the people and against conglomerations of corporations and corrupt politicians.  The book illuminates how Roosevelt, Taft and the muckrakers interacted to change the course of the nation.

  Of course, Teddy Roosevelt dominates the book, as he did the era.  Scion of a rich and famous family, he was brash, hyperactive, and multi-talented: a prolific writer, a consummate politician, a warrior, a rancher and a big-game hunter.   He made his mark as a politician by being marvelously open to others' insights, especially the muckrakers' investigations of excesses in the country, always probing to get an understanding of the thinking of the masses.  Taft plays a secondary, supportive role—a counterbalance to Roosevelt's sometimes-immoderate forays.  They both used the Congress, the courts and executive power to break up trusts that had monopolized the oil, steel, railroad and financial industries through bribery, corruption and intimidation; and to empower the middle class and labor unions in their opposition to those trusts.

  Goodwin's book cannot help but focus one's attention on the repetitive folly of the business cycle, as new generations forget the lessons of the past.  We have had three such major cycles in the United States in the past 150 years, each starting with a period of increasing laissez faire that led to an extraordinary disparity of wealth, income and power between the moneyed classes and the rest of the population.  Major economic crises ensued, followed by periods of reform:
 
In the aftermath of the Civil War, the period of laissez faire was accompanied by the ascent of the robber barons in railroads, steel, oil, finance and other industries.  The Panic of 1893 called forth the reformers and regulation of the Progressive Era described above. 
Subsequent laissez faire excesses during the Roaring Twenties led to the Great Depression of the 1930s.  Franklin Roosevelt's New Deal then added massive new regulation to the economy, reining in the free-wheeling financial and industrial sectors and empowering unions.
In the 1970s and 1980s, laissez faire came back in favor, with many of the New Deal's regulations being reversed.  Subsequent excesses caused the Great Recession starting in 2008.  There is a still-ongoing period of regulatory reforms, against those who purblindly again want the market to "do its magic" unimpeded, and therefore are busy chipping away at those reforms.

  Will we ever learn?  I doubt it, for such madcap cycles have been occurring with regularity since at least the Dutch tulip mania of the early 17th century.  In this opinion, I am in the good company of John Kenneth Galbraith, who in his little gem of a book, A Short History of Financial Euphoria, concludes that "there is probably not a great deal that can be done.  Regulation outlawing financial incredulity or mass euphoria is not a practical possibility."

  When will the next disaster hit?  No one can tell, not even Nobel Prize-winning economists, for all their expertise.  As Galbraith once famously said, "The only function of economic forecasting is to make astrology look respectable."

Wednesday, December 11, 2013

A Career in Uniform

  After writing about my maiden ocean voyage to Europe [1], I got an email from my cousin G [2], "swapping stories" by describing his own first trans-Atlantic trip.  It was courtesy of the U. S. Army on his way to a tour of duty in Germany: nine days in 1953 as an inductee, bunked in a "stateroom" with fifty others on cots stacked three high.  My own year-earlier trip now seems quite effete, but the comparison reminds me how lucky I was never to have been drafted.  In a strange way, as it turned out, the country was even luckier.

  I did have a career in uniform of a sort, which by default ended in my making my particular contribution to the nation's military.  That career started modestly when I was just 12.  I had joined the Boy Scouts immediately after America was drawn into World War II by the attack on Pearl Harbor, doing so because most of my friends did.  I was also attracted by the idea of wearing a uniform in those military-dominated times—although that brought an uneasy reminder to my mother of another uniform I would have to don if the war lasted long enough.  (It didn't.)

  There was also the excitement of automatically becoming a member of the Civil Defense Corps, a group being trained to respond to an enemy air raid.  When the sirens sounded, I was not to shelter in the central hallway of my apartment with my family, but to put on my uniform with a special lightning-bolt armband signifying that I was a messenger, and report to my assigned command post in New York City's streets.  Despite my mother's immediate anxiety about that role, the likelihood of an air raid was near zero, since one could have been launched only from Germany's sole aircraft carrier, which would have been detected long before it got within range of our shores.  Enemy U-boats were of course always present offshore, but they were more concerned with sinking ships than lobbing a few shells at cities.  Each "air raid" was only a test of the system.

  What could be more thrilling to a 12-year-old boy than being outside in pitch-black streets, delivering messages from one command post to another, notifying wardens of violations of the blackout, learning to distinguish between incendiary and other types of bombs and what to do about each, and generally participating in a war "game"?  

  Some five years later, the war over, my love affair with uniforms had vanished.  By then, I was in the Reserve Officers Training Corps in college, compulsory for two years, and had to wear a uniform on the three days a week when drills were held.  I could not have been a worse student in ROTC, getting in it the only C grades of my college career.  Faultlessly participating in lockstep drills on the parade ground was beyond me;  indeed, with my mind on physics or chemistry, I twice to my embarrassment dropped my rifle during a review of the regiment held in the armory for a visiting general from Washington.  (Do you have any idea how a dropped rifle echoes in an armory?  I can still hear the reverberations.)  Once, when disassembling a sidearm—supposedly I was to be able to do it with my eyes closed—I let go of some doohickey I shouldn't have, releasing a spring that, shooting across the room, almost permanently closed my sergeant's eye.  My marksmanship with a rifle was so poor that my allotment of bullets was sequestered for use by the rifle team.  And I cannot imagine how I did it, but I even failed an open-book exam on strategy and tactics! 

  So it was very fortunate for the security of the nation that I was able to opt out of ROTC at the end of two years, ending my career in uniform.  If I had been forced to stay in for four years, I would have graduated from college with a second-lieutenant's commission in the Army reserve.  My God!—who knows what military disasters might have been precipitated if I had been placed in the front lines in that capacity?  The country was much more secure for having had me in a support role in defense industries, as described in [3] and [4].

Wednesday, December 4, 2013

The Internet Revisited

  This year I have been curiously mute about the impact on our lives of the Internet and information technology more generally.  Last year, on the other hand, I was obsessed with the subject, publishing no fewer than seven postings about it.  I tended to be mostly negative, despite all the undisputed benefits the web places at our fingertips. 

  For example, I joined MIT professor Sherry Turkle in being disturbed by the "alone together" syndrome, epitomized by groups of people staring at their smartphones and tablets rather than engaging face to face with each other—a modern preference to communicate by texting or tweeting rather than by physical presence [1].  With writer Nicholas Carr, I wondered what the Internet is doing to the wiring of our brains—whether our constant multitasking is making it harder for us to think linearly, as we do when we concentrate on reading a book [2].   With New York Times columnist David Brooks, I worried that online learning will diminish the passionate, interactive experience that education should be [3].  And with Columbia professor Tim Wu, I tried to parse the Internet forces that could constrain rather than hugely multiply the availability of information [4].

  Now comes a book that is almost unequivocally positive: Clive Thompson's Smarter Than You Think: How Technology Is Changing Our Minds for the Better.  His motif is stated plainly at the outset: "If this book accentuates the positive, that's in part because we've been so flooded with apocalyptic warnings of late."  Indeed, the very apocalypses of others are boons to him.  Thompson has morphed the original Apocalypse's nefarious four horsemen into three very unapocalyptic Internet virtues:
Memory Augmentation: We supplement our brain's memory using personal electronic devices and the Internet's huge data banks, thus freeing us from the chore of memorization so we can do more "human" things: intuit, invent, conduct relationships. 
Focusing Our Thoughts:  We all are now writing far more than most of our forebears by incessantly blogging (mea culpa!), emailing, texting, IM'ing, tweeting, etc.  In thus writing down our ideas, we feel forced to hone them more precisely than we would if merely thinking or mouthing them.
Networking:  Through our constant online interaction, particularly in social networking, we have developed an "ESP-like 'ambient awareness' … of what others are doing and thinking," which expands our ability to understand people we care about and to dispel "pluralistic ignorance" of people at a distance from us.  Networking also makes us more collaborative.

  Thompson's initial chapter, "The Rise of the Centaurs," illustrates the first of these virtues—memory augmentation—by using the example of chess.  Technology makes it possible for the best computers to beat the best chess masters, because computers can quickly explore every possible chain of moves seven or more deep in the light of a huge memorized archive of classical strategies, and choose the best next move, which can beat a mere human's intuitive understanding of the state of play.  But Thompson points out that even moderately good chess players augmented by modest computers can beat either the best chess master or the most powerful computer playing alone.  That is, a "centaur" combining the human brain with a computer's memory and speed trumps all.

  That's the main theme of the book: humans will progress by delegating functions like memory and calculation to machines, while reserving human, un-machinelike capabilities to themselves—a complementary hybrid of organic and silicon chemistry.  My son likes to think of this in sci-fi terms: as a stage of evolution of Homo sapiens.  Many foresee the next step as direct electrical connections between the brain and the machine.  Rudimentary links of this type have already been fashioned to control artificial limbs—cyborgs rather than centaurs, so to speak—but so far not to memory augmentation.

  I myself can attest to the second virtue—focusing our thoughts.  In the past twenty months I have written almost 100,000 words on this blog, a rate of composition that astonishes even a retired academic like me.  The process has forced me to turn inchoate thoughts on a large variety of subjects into what I hope are well-structured arguments and sentiments worthy of being read.  Every paragraph—indeed, every word—I write is examined and re-examined until it expresses the meaning and nuance I intend.  Thompson claims that, despite the Internet Age's sometimes regrettable outpouring of badly composed screeds, my experience is the more common; people in general are taking added care in polishing their writings to a fine sheen.  Further, he contends that increased literacy with the written word has led to adroitness in other media when words alone won't do, e.g., video commentaries on such sites as YouTube.  Activity like this by ordinary people, when broadcast, has broken the stranglehold the powerful have traditionally had on public speech.

  The impact of the third virtue—networking—is even more dramatic, says Thompson.  First, people are constantly electronically telling each other of their doings, however trivial (to the point where I wonder how they have any time for other daily pursuits). Thompson asserts that this activity serves to create an invaluable "ambient awareness" that binds society together and replaces the less spontaneous and less efficient water-cooler and coffee-klatsch chit-chat of yesteryear.  Second, online collaboration in projects such as Wikipedia has shown that placing our knowledge on the web has a more personal quality than just storing cold facts.  The so-called wisdom of crowds comes into play, in which many small, independent and interactive contributions to a project can lead to faster and more accurate solutions of problems.

  A brave new world in the offing?  We'll see.  Even Thompson worries that outsourcing memory to machines might impair the Eureka! moments that arise unbidden from the brain's obsessive and subconscious searching for relationships among the myriad items stored in its own memory (see [5]).  He is concerned about the falling away of privacy as people sometimes rashly share thoughts that then permanently become embedded in the web's memory.  He notes too that a collaborative network structured by sharing many people's ideas can fall prey to a dominating personality who can transform the group into lemmings—independent brainwork by each of its many members is essential.  On the other hand, Thompson is less worried about the disruption multitasking imposes on our ability to concentrate for long periods on a single task, feeling that each of us will somehow figure out how to suppress multitask interruptions when we need to, while taking advantage of their value at other times.

  The pell-mell advance of the Information Age is of course unstoppable.  None of us can predict where it will lead in ten years, much less a century, any more than anyone in the early 19th century could have predicted the impact of industrialization one hundred years later.  Will Homo sapiens evolve by the 22nd century into a new cyborg/centaur species—call it Homo sapiens artificialis—with implanted electronics fully integrated into it?   Will it be a self-perpetuating species of the wealthier among us who can afford the implantations for themselves and their offspring?  Will Homo sapiens itself become a subordinated species?

  Maybe we are headed for an apocalypse after all.

Monday, November 25, 2013

An Old-Fashioned Thanksgiving

  Thanksgiving has always, hands down, been my favorite holiday, probably because it combines three things: an exhilarating snap in the air as winter approaches, even in California, which Easterners imagine has no seasons; a communal feeling, since it is celebrated equally by all Americans; and a lack of sectarian religiosity and commercialism.  (Well, not quite the last—commercialism is creeping in with the recently invented atrocity of Black Friday, the day after Thanksgiving, when consumerism for Christmas kicks into high gear.  This year, many stores are not even waiting for 12:01 a.m. Friday to open for their pre-Christmas sales, but have invaded Thanksgiving afternoon.  Cartoons show buyers shopping while still gnawing on a turkey leg!)

  When I was young, the holiday meant to me a gathering of family and hours of playing with cousins, unencumbered with religious observances and replete with a luscious feast.  In college, it meant driving with friends from Boston to New York through the still-remaining glories of New England Fall foliage, to spend a few cozy days with family.  As I progressed through life, raising my own family, it became a joyful occasion for thankfulness that we had gotten through another year and were about to embark together on a new one.

  For all that, I always wanted to get a sense of how the holiday was celebrated centuries ago in New England—maybe not as far back as the first Thanksgiving in 1621, which took place under brutal conditions, but say a century or so later.  It was a very romantic idea, I knew, not likely to be fulfilled amidst the creature comforts of the twentieth century.  Nonetheless, in 1984, I booked the Thanksgiving weekend for my family in a small colonial-era inn in central Massachusetts.  After picking my son up from his college near Boston, the four of us drove to the inn through light flurries of snow.  Not enough snow to inhibit driving, but enough to imbue the trip with a dreamy aura and make me twice pass through the hamlet in which the inn stood—a hamlet so small that I must have been blinking my eyes briefly each time we passed it.

  The inn was authentically colonial, from its many-centuries-old stout construction, its multiple fireplaces, and its antique decor, down to the furniture, beds, rugs and quilts.  We quickly got into the mood of those older days.  Then, for the entire weekend we ate nothing but game with wild vegetables, all cooked according to old-fashioned recipes.  What could be more scrumptious than wild turkey on Thanksgiving, together with wild cranberries and such!  Our walks were through the woods surrounding the inn, their gorgeous Fall foliage whitened by a dusting of snow. 

  That throw-back Thanksgiving still stands out in my mind almost thirty years later as very special.  Although it wasn't at all like the Pilgrims' first Thanksgiving in Massachusetts in 1621, which celebrated survival in the face of enormous adversity, it was an almost surreal confirmation of an old American tradition started then.

  This year, I'm pleased that—despite Black Friday—the quality of the holiday I love so much remains pretty much intact.  I hope you have a good one!

Wednesday, November 20, 2013

Fifteen Objects

  In January, I wrote about having met Richard Kurin, undersecretary for history, art and culture of the Smithsonian Institution, at a reception in San Francisco.  He gave a talk about the Smithsonian's amazing complex of museums and research centers and its collection of almost 140 million artifacts.  He also mentioned that he was writing a book to be called The Smithsonian's History of America in 101 Objects.

  I was abashed at my lack of knowledge of the Smithsonian's sweep, which I tried to repair by visiting its website, particularly its Collections section, spending many pleasurable hours rummaging through what I called America's attic.  As I recounted in my January posting, I chose, in serendipitous order, fifteen objects from the attic that I thought should be included in Kurin's book.  Note that I magnanimously left him 86 additional objects with which to fill in the rest of America's history—as it turned out, from the Cambrian era until today.

  Kurin sent me an email a couple of months ago, letting me know that the book was soon to be published, and astounding me by saying "of the objects you named on your blog," which the hostess of the reception had sent him, "just about everyone is in the book.  For those that are not—there is another item that is included that gets at the same topic, theme or event. So you were spot on!"  That burnishing of my ego instantly impelled me to pre-order the book.  And lo! as if by magic it appeared on my iPad while I slept in the early morning of the publication date.  I spent the next two weeks engrossed in it.

  The book is in itself an objet d'art: lavishly illustrated, beautifully written, meticulously researched and intensively end-noted.  For those who like history per se, it can be read linearly as a chronological narrative.  For those who favor artifacts, it can be sampled at random, object by object.  Either way, the depth of description of both the objects and the history surrounding them will surely captivate you.  This is a must-read for history buffs, lovers of artifacts, and aficionados of the Smithsonian.

  Kurin's 101 objects run several gamuts—in time, from a collection of half-billion-year-old Burgess Shale fossils, to the Giant Magellan Telescope currently being built in Chile by a consortium led by the Smithsonian Astrophysical Observatory; in size, from a small postal date stamp retrieved from the wreckage of the U.S.S. Oklahoma at Pearl Harbor, forever frozen at December 6, 1941, the day before the attack that brought the U.S. into World War II, to the huge Enola Gay bomber that dropped an atomic bomb on Hiroshima, effectively ending the war; in culture, from esoterica such as Thomas Jefferson's cut-and-paste New Testament, purged of material he thought contrary to reason, such as miracles and references to Jesus' divinity, to the pop-culture of Mickey Mouse cartoons.  Each is accompanied by a lovingly written, fact-laden essay chronicling its importance from its creation throughout the rest of American history.  It's an 800-page tour de force, providing a unique insight into the story of America.

  So how did I do in my recommendations of fifteen objects for inclusion in the 101?  Here's my original unchronological list, on which I've inserted a √ mark against the twelve objects that made it into the book, and added an italicized note to the other three, indicating the closest matching object in the book.

√ • A piece of Plymouth Rock, representing the migration of Europeans to settle in America.
√ • Eli Whitney's cotton gin, which made slavery an economically viable and indispensable institution for the South.
√ • Any one of Thomas Edison's many inventions—say the light bulb or the phonograph—representing one of the pre-eminent inventors in American history and the vast impact of such inventors and inventions on our civilization.
√ • One of the early personal computers—a Commodore or an Apple—which augured the stunning shift to our now-webcentric lives.
√ • A Model-T Ford, the car that almost alone made Americans mobile.
√ • The Woolworth's "Whites Only" lunch counter in Greensboro, North Carolina, where in 1960 four African-American college students staged a sit-in, an event that helped ignite that decade's civil rights movement.
√ • The chairs and table from the Appomattox Court House that Generals Grant and Lee used when signing the documents ending the Civil War.
  • The pen used by President Lincoln to sign the Emancipation Proclamation.  (Emancipation Proclamation Pamphlet.)
√ • George Washington's Revolutionary War uniform.
√ • The Wright Flyer, which made the first heavier-than-air flight, an invention that further increased our mobility.
√ • The shuttle Discovery, representing the advent of the Space Age.
√ • The ruby slippers worn by Judy Garland in "The Wizard of Oz," symbolizing both the power of the movies in our national culture and the advent of Technicolor.
√ • The original Star-Spangled Banner from Fort McHenry in 1814—an emblem of the fight to defend the young America from invasion and the inspiration for our National Anthem.
  • A poster from the Longest Walk, a 1978 American Indian civil rights march from California to Washington, D.C., protesting the continuing devastation of reservations and violation of treaties and tribal rights that have characterized the fate of Native Americans.  (Gay Civil Rights Picket Signs.)
  • A barracks sign from one of the relocation centers in which Japanese-Americans were interned during World War II.  (A piece of art painted by a detainee in one of the centers.)

  My ego, already burnished by Kurin's kind words in his email to me, fluoresced when I compared my list with his full list of 101 objects.  I felt as if I had aced an important final exam.  But the fluorescence dimmed considerably when I thought of Kurin's Herculean effort in choosing 101 objects from 140 million candidates, elaborating the provenance of each, and writing at length of its intimate connection with American history.  My few hours of poking about through the Smithsonian's website suddenly seemed very dilettantish compared to the years of effort, the thousands upon thousands of hours of exhausting labor, that I know he expended.

  But, hey, ego boosts are rare enough for me these days.  My ego is thankful for any it gets.

Wednesday, November 13, 2013

Young and Solo in Europe

  More reminiscing—I can't stop the flood of memories of my youth from overwhelming me.  Today I am in a reverie about my first journey abroad, in 1952, when I was an oh-so-young 22.  A memoir of most of that trip, spent in England at a summer job, is part of a previous posting [1].  But the summer also included two brief stays on the Continent, which were not accompanied by the self-confidence I had when I revisited the Continent ten years later [2].

  Flying across the Atlantic was uncommon in 1952—it was a long and uncomfortable flight on a DC-6 propeller plane, which we would now call small, with refueling stops in Newfoundland, Greenland and Iceland.  Almost everyone took a boat then, as I did.  My boat sailed from Hoboken, NJ, to which my mother and I drove in her car.  She saw me off with a mixture of hugs and anxiety on both our parts.  As I sailed away, still waving to her on the dock, I realized to my consternation that I had the keys to her car in my pocket!  I later found out that she—always one who planned for the unforeseen—had an extra set in her purse.

  I learned an important lesson from that incident: there are situations when you cannot uncast the dice, so obsessing about their roll is futile.  I would be enisled for five days, unable to turn back the clock or the boat to return the keys.  Those who rush to and fro nowadays, accustomed to instantaneous action and response, cannot know the sense of total relaxation such a suspended state imparts: one is unalterably in the hands of Fate.  In this case, Fate was accompanied by the camaraderie of a boatful of boisterous youngsters like me, almost all on their first trips abroad, together with a seemingly limitless number of cases of Heineken beer.  (It was a Dutch ship.)

  We docked in Le Havre, most of us then taking the boat-train to Paris.  On it, reality brought us down to earth with a thump: the still-omnipresent, depressing reminders of World War II.  The landscape was quasi-lunar, with craters pockmarking it everywhere, and the ruins of as-yet unrebuilt villages standing as signs of the carnage less than a decade before.  It was a blessing to arrive in undamaged Paris, which had been declared an open city by both sides.

  On this leg of my trip to Europe, I stayed in Paris only overnight, the following day taking a boat-train to Calais, then progressing by ferry across the Channel to Dover and on to London by train, and the next day to my summer job in Essex.  As I reported in [1], London, unlike Paris, had not at all been spared destruction; desolation from the Blitz was everywhere.  As a sheltered American now amidst it, I could not fathom what it must have been like to live there through the War.

  At the end of the summer, my exhilarating job in England completed, I returned to the Continent.  I flew to Brussels (my first airplane flight), spent a few days of sightseeing there, and then went by train to Munich.  Again the landscape showed the War's devastation, as did Munich itself.  What idiocy, I thought, that supposedly civilized people descend over and over again into such ruination!  I must confess, though, that I couldn't sympathize with Munich as I had with London.  I felt just a schoolyard indignation—"You started it!"  More sober reflection would have reminded me that children and others who had nothing to do with starting the War had nonetheless died horrible deaths in Munich as well as in London.  None of us, alas, is much removed from Paleolithic feelings of vengeance.

  After the spartan food of England, which was still rationed and lean after the War, my stomach was unprepared for the richer fare and heavy sauces of the Continent.  That, and my angst at traveling alone in countries whose languages I didn't understand, precipitated a debilitating round of stomach ailments that impeded my enthusiastic tourism.  By the time I reached Munich, I needed medical help, so I took a series of trams to an American military hospital, showed my passport, and pled for assistance.  (It was thankfully forthcoming in the form of an examination and medications.)   In Zurich, my next stop, I checked myself for two days into a clinic, whose nuns put me right, even with no common language between us; I could dredge up only a few words of Yiddish, which I hoped would have close cognates in German. 

  It was only when I went on to Paris for a week, joining a friend who had worked at the same company in England as I had earlier in the summer, that my health and equanimity returned to normal.  I stayed in a student dormitory at the University of Paris, and found I was able to eat its cafeteria fare, even to imbibe from the carafe of wine that magically appeared on each tray.  "A meal without wine is like a day without sunshine"—a new experience for me.

  As I said at the outset, my roving on the Continent in 1952 was not underpinned by the aplomb that my more mature self had on a lengthier junket there ten years later.  It would have been so nice for this naïf in 1952 to have had the ability of today's 22-year-olds: press a few buttons on a Skype-enhanced cellphone and seek succor from friends and family at home.  In 1952, when the only means of conversation with someone in the States was a public-telephone call, prepaid at $5-10 per three minutes ($50-100 in today's currency), I was forced to remain incommunicado—a frightening experience for the then-me, daunted to be so isolated for the first time in my life.  I felt a surge of relief as I returned to Le Havre, there to embark for home and familiarity.

Wednesday, November 6, 2013

Eve, Lilith and Astarte

  In 1945, my junior year at New York City's Bronx High School of Science, the school became coed.  The transition wasn't an epiphany for me about girls' equal rights.  It just puzzled me—why would girls want to study science or engineering?  "Female scientist" had minimal credibility after Mme. Curie's work earlier in the century; "female engineer" was an oxymoron.  One simply didn't think of women in connection with these and most other professions.

  Captive as I was to the mores of the society in which I'd been raised, my attitude wasn't unusual.  Although I blush to say this now, I actually asked one of the coeds, while riding on the subway with her one day after school, "What do you plan to be: a secretary or a teacher?"  I plead nolo contendere to the charge of having been a dunderhead, claiming extenuation: I was merely parroting my elders.

  My topic today, however, is not sexism in the professions per se.  Rather, I want to explore the source that underlies sexism throughout society.  The book I discuss below has convinced me that, at least in the West, it is religion—the fundament upon which our civilization is built.  The hierarchies of the so-called Abrahamic religions (Judaism, Christianity and Islam), and probably most others, are even at this late date almost exclusively male, and the few women in them are largely confined to the lower echelons.  Gender inequality in religion has invariably led to its flourishing in the larger society.

  I was propelled into this line of thought by email correspondence with my son's sister-in-law, Ruth.  Raised Catholic, in her twenties she searched for a religion where women were equal.  (Interestingly, she is the namesake of possibly the sole woman in the Old Testament who made her own decision about which god to follow.)  Finding no religious equality anywhere, Ruth decided that "religion was a confidence trick of the highest order."  She mentioned a 1970s book by Merlin Stone that she'd read at the time, When God was a Woman.  It's still available digitally; I found it an illumination.

  Stone tells a fascinating story.  She starts by pointing out that male domination of the Abrahamic religions began with the male-fabricated myth of the Garden of Eden, where God created Adam in His image, and as an afterthought created Eve to serve as Adam's helpmate.  Eve immediately showed her inferiority by defying God and precipitating the Fall.  Women have paid dearly since.  Listen to the much later New Testament (I Timothy 2:11-14):

"Let the woman learn in silence with all subjection.  But I suffer not a woman to teach, nor to usurp authority over the man, but to be in silence.  For Adam was first formed and then Eve, and Adam was not deceived, but the woman being deceived was in the transgression."

  Christianity, of course, went on to be dominated by a celibate male priesthood preaching the concept of Original Sin stemming from Eve's infraction.  Mystical Jewish writings like the Zohar, probably influenced by a passage in Isaiah cited below, postulated that Adam had had a first wife, Lilith, formed from the same dust as Adam, possibly even before him.  She—alas for patriarchy!—became the first feminist, asserting that she had been created equal and refusing to be subservient to Adam.  She fled Eden, later to be tormented by angels and turned into a she-demon, eternally surviving among us to tempt men into sin.  The Lilith experiment having failed, God tried another, creating Eve from one of Adam's ribs as his more passive but still sinful helpmate. 

  Is it any wonder that Western civilization was brainwashed about the relative standing of the sexes by the two Testaments and the generations of males who interpreted them?  Women never had a chance.

  It wasn't always this way.  Stone goes on to document that, before Abraham (who putatively lived some 4000 years ago) and his male God Yahweh, the Goddess Astarte (Athtart, Ashtoret, Ishtar, Ate, Asherah, Attoret, Anath, Elat, Hathor, et al., in various ancient languages) was almost universally recognized as the principal deity around most of the Mediterranean and even farther afield.  Contrary to the pernicious image of woman that was to be attached to Eve, Astarte was revered as creator, law-maker, healer, wise counselor and prophet.  Correspondingly, societies having Astarte as the principal deity tended to be matrilineal and matriarchal—property and inheritance ran through women, as did the management of affairs of home and state. 

  The evidence Stone presents for the ancient dominion of the Goddess is compelling. Wherever excavations of upper Paleolithic, Neolithic and early historical sites have found evidence of religion, it has usually been accompanied by idols of full-breasted goddesses, often surviving emplaced in wall niches.  The oldest Sumerian tablets tell of a principal Goddess, mother of all other gods, and this myth propagates to subsequent cuneiform records of early antiquity.  Those and other writings—particularly in ancient Egypt and by later classical Greek and Roman historians—testify in addition to widespread matrilinealism and matriarchy in those times. 

  It was only during the third and second millennia BCE that the tide turned, as waves of invasions by Aryans from the north (later called Indo-Europeans) descended on the Near East, bearing with them a male supreme God, patrilinealism and patriarchy.  Stone posits that these incursions brought to Abraham the seeds of his religious tenets, since he was born in an invaded area.  As the Old Testament progresses, Yahweh commands the destruction of all images and worship of Ashtoret (Astarte's Hebrew name) wherever found, which was done with a pitiless wrath, especially as the Israelites conquered Canaan.  In the passage from Isaiah mentioned above, Lilith (who in myth was one of Ashtoret's priestesses) is represented as a night monster.  Even as late as the Koran, Stone finds ongoing evidence of the influence of the Goddess and God's animosity toward her: "Allah will not tolerate idolatry … the pagans pray to females."  Soon God had completely supplanted Goddess in the Western world, and patriarchy correspondingly replaced matriarchy.

  It was one of the West's many catastrophes.  In a posting last year, when asserting that the fairer sex is fairer in both senses of the word, I looked to a future where women could lead us with a feminine sensibility.  "Maybe in another generation or two," I wrote, "when it hopefully will not be so difficult to climb the ladder while carrying along a feminine worldview, we will have a world whose ethos is completely different, and better."  I wasn't talking about the Margaret Thatchers and Angela Merkels of the world, who got to the top by beating men at their own game.  My model is Eleanor Roosevelt, who didn't have to become masculine in order to reach the heights, and therefore was able when there to maintain her feminine discernment and responsiveness in effectively addressing society's needs. 

  What kind of society could have been built over the millennia if God had been a woman all along?  Or was it inevitable that the collective testosterone of men would have overcome Her ministrations anyway?

Wednesday, October 30, 2013

W2QKU

  Those were the call letters of my amateur radio station almost seventy years ago.  It was a modest affair, operated only in Morse code—I couldn't afford to build a voice transmitter, and in any event my mother and sister wouldn't have appreciated hearing my voice into the wee hours, when I usually operated the station.  I don't know how many thousands of times I tapped out those call letters over the two years that I was an active ham:  . - -    . . - - -   - - . -    - . -   . . -

  My fascination with radio technology started when I was about 10 years old.  It was the equivalent then, I think, of falling in love with computer technology as a youngster today.  I started studying radio theory and building radio after radio as I learned new details.  I soon found out about ham radio, but by that time, late 1941, the U.S. had entered World War II, and amateur radio was shut down for the duration as a precaution against its use for espionage.

  By the end of the war, I was fully prepared both to take the test for a ham license and to build my own station—actually a series of stations whose transmitters had ever more power.  I remember climbing to the top of the water tower on the roof of my ten-story apartment building to install one end of the best-sited antenna I could manage—no small feat considering my quaking knees and the length of the antenna, some 60 feet.

  Then came many late nights—after I had done my homework, but more importantly when transmission at the frequencies I used would be best.  At first, with my initial low-power transmitters, I was able to contact other hams only in surrounding areas.  Later—what excitement!—I was able to contact stations throughout all of the then 48 states and, mirabile dictu!, amateurs throughout the world.  In these days of the Internet, making international one-on-one contacts is so commonplace that it may be hard for young people to understand that then each new one was an accomplishment of some magnitude.

  It was a tedious procedure, especially given the slowness of Morse code: twenty words per minute was an excellent speed.  I would start by repeatedly tapping out "CQ de W2QKU" (CQ being international code derived from the English "seek you").  Then I would tune my receiver through neighboring frequencies to try to find a response—someone sending my call letters back to me followed by his or her own call letters—amidst the chatter of other stations and omnipresent static.  Or, of course, I would start by listening for others seeking a contact and respond to them.  After making contact, a Morse-code conversation would ensue, full of abbreviations like those used today in texting except that they were established by international agreement (the so-called "Q" codes).  Then both parties would confirm the contact by sending to the other their own custom-designed postcards; I soon had postcards from all over the world with resplendent stamps on them.
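  For readers who have never keyed code, the translation described above can be sketched in a few lines of Python.  This is only a toy illustration, with a Morse table covering just the characters needed for the "CQ" call and my call sign:

```python
# A minimal sketch of encoding a ham-radio call into Morse code.
# The table covers only the characters used in "CQ de W2QKU".
MORSE = {
    "C": "-.-.", "D": "-..", "E": ".", "K": "-.-",
    "Q": "--.-", "U": "..-", "W": ".--", "2": "..---",
}

def to_morse(text):
    """Encode text as Morse: letters separated by spaces, words by ' / '."""
    words = text.upper().split()
    return " / ".join(" ".join(MORSE[ch] for ch in word) for word in words)

print(to_morse("CQ de W2QKU"))
# -.-. --.- / -.. . / .-- ..--- --.- -.- ..-
```

  The last group of the output is exactly the W2QKU pattern tapped out thousands of times above.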

  At times I would deliver messages to neighbors—a free, custom radio-telegraph service in those days when long-distance telephone calls and telegrams, particularly international ones, were very expensive.  In one case, I repeatedly relayed messages back and forth between a father in the pre-Castro Cuba of that time and his two daughters who lived but a few blocks from me in New York City.

  During the seven years of my radio hobby, until I went off to college, my cousins often made fun of me for having my head stuck in radio equipment all the time—the complete 1940s nerd.  But immersing myself in radios turned out to have been a worthwhile effort, for it was an introduction to a broad field of engineering that occupied all of my professional career. 

  Viva il nerdismo!

Wednesday, October 23, 2013

A Bum Rap

  Richard III may have gotten a bum rap.  Since his death in the Battle of Bosworth Field in 1485, he has been almost universally vilified as a monstrous king—murderous, Machiavellian, deformed in mind and body.  Most of us have absorbed this judgment from Shakespeare's eponymous play, itself largely based on an earlier book attributed to Sir Thomas More.  The trouble is that history is written by the victors.  Richard was the last Plantagenet king, while More and Shakespeare wrote their politically correct accounts under Henry VIII and Elizabeth I of the usurping Tudor line.

  So what?  Why in the world would I be interested in the rivalries of English royalty more than five centuries ago?  Two reasons.  First, the recent discovery and exhumation of Richard's remains created a splash in the newspapers (see New York Times article) and a fresh discussion of history's verdict on him.  Second, in a desperate search to find a good mystery novel to read, I found citations of Josephine Tey's 1951 book The Daughter of Time as first and fourth, respectively, on lists of the best 100 mystery books of all time by the British Crime Writers' Association and the Mystery Writers of America.  Tey was a writer of British police procedurals, not my favorite mystery genre, but such acclaim was hard to ignore.  So I bought the book after reading its synopsis on Amazon.com and finding to my surprise that the mystery it investigates is Richard III's bum rap.

  A little dynastic history is needed here.  When King Edward IV died in 1483, he left his two sons under the Protectorship of his brother, Richard.  The elder prince, 12-year-old Edward, ascended the throne as Edward V, but that was soon challenged.  Edward IV's marriage to Prince Edward's mother was declared by an act of Parliament to be invalid, as records showed that he was already married at the time to another woman.  The two princes were therefore illegitimate and not eligible for the throne.  Richard III—next in line—was anointed king.  He reigned for only two years until dying at Bosworth at the hands of forces under the Earl of Richmond.

  Richmond assumed the throne as Henry VII, the first Tudor king.  He had everything to gain from defaming the defeated Richard in order to shore up his own legitimacy as king, which was weak from the viewpoint of bloodlines.  (The closest he came was as the great-grandson of an illegitimate son of a younger son of a king.)  Aspersions were therefore retroactively cast at Richard, the most odious being the alleged murder of the two princes.  That is the crime indelibly etched in all our minds by Shakespeare's play.

  Tey's protagonist is Alan Grant, the Scotland Yard detective of a number of her mysteries, who here tries to unveil the truth about Richard.  I can do no better than quote from the Amazon.com synopsis:

"Inspector Alan Grant of Scotland Yard, recuperating from a broken leg, becomes fascinated with a contemporary portrait of Richard III that bears no resemblance to the Wicked Uncle of history.  Could such a sensitive, noble face actually belong to one of the world's most heinous villains—a venomous hunchback who may have killed his brother's children to make his crown secure?  Or could Richard have been the victim, turned into a monster by the usurpers of England's throne?  Grant determines to find out once and for all, with the help of the British Museum and an American scholar, what kind of man Richard Plantagenet really was and who killed the Little Princes ..."

  In mulling over the evidence he acquires, Grant asks a bevy of trenchant questions.  Among them:

- If Richard was the fiend he was later portrayed to be, why do contemporary, pre-Tudor accounts paint him as a gentle noble, in touch with the people, a good administrator and a "good lord" with a "great heart," who often forgave his enemies?  Why was his great villainy discovered only after Henry VII's accession?
- Why did the most vicious attacks on Richard occur even later, by those who had no first-hand knowledge of him—e.g., Thomas More, who was just five when Richard ascended the throne and seven when he died?
- The Little Princes disappeared from view only some time after they had been delegitimized and Richard III crowned.  Since they were by then no threat to his hold on the throne, what motive could Richard have had for having them murdered, especially since he forgave so many actual enemies and maintained an ongoing friendly association with their mother?
- Why would Henry VII, in drawing up a Bill of Attainder against Richard III immediately after being crowned in 1485, list in it any number of Richard's purported crimes, yet not mention the most heinous, the murder of the Little Princes, which he alleged only later?  Does this mean that he knew they were still alive at the time he started impugning Richard's reputation?
- Soon afterward, Henry VII married the Little Princes' sister.  He had the delegitimizing act of Parliament rescinded, presumably to re-legitimize her and strengthen his own claim to the throne.  But that also re-legitimized the Princes and restored their place in the succession, challenging Henry's own hold on the throne.  Did Henry therefore know that by that later date they were dead and posed no threat?  In fact, did he have a hand in their deaths?
- Why was Sir James Tyrrel—who was said to have confessed in 1502 to the Princes' murder—given a general pardon by Henry VII in June 1486 and then an unheard-of second general pardon a month later?  For what crimes?  Had he been Henry VII's agent in the Princes' deaths?

  Grant, using the police procedures of Scotland Yard, makes a case that (1) Richard III was scapegoated by the Tudors and their supporters for all sorts of malefactions in order to strengthen the weak Tudor title to the throne, and (2) the disappearance and presumed murder of the Princes was likely Henry VII's doing.  In my mind, it is a strong case.  But the pall over Richard III was heavily laid by More and Shakespeare, and only after the Stuarts replaced the Tudors as monarchs of England in the early 17th century did some historians dare try to remove the stain, not very successfully.  The controversy continues to this day.  Tey, although not a historian, did much to tilt the balance more in Richard III's favor.  

  Interesting history.  I myself wouldn't list The Daughter of Time among the top 100 mystery novels, but it was certainly a captivating read.

Wednesday, October 16, 2013

Pan's Pipes

  For some weeks, writing this blog has steered me into remembrances of times long past.  I guess that's what happens when old geezers get even older—they frequently fall into sepia-toned memories, if they are lucky enough to have memories at all. 

  For example, while writing in [1] about heirloom fruits and vegetables, I found myself in a sentimental reverie about picking wild berries in a summer camp I went to in the 1930s—and wrote about that in [2].  Again, when pondering last week the transition from mythology to historicity in [3], I was nostalgically transported back to the summer of 1962, leading me to describe my stunning experience when I came upon the palace of Agamemnon, where mythology and history intersect.  In turn, describing that brush with antiquity reminded me of a mystical contact I had the same summer with Pan, the ancient Greek god of the wild, of shepherds and of rustic music.  Here's how it happened. 

  For the first time, that summer over fifty years ago, I had both the opportunity and money to travel widely, without any immediate objective.  I was at the beginning of my professorial career at UC Berkeley and still a bachelor—I hadn't yet met Helen.  Since I was to attend a technical conference in Brussels at the end of the summer, I decided to roam Europe and the Near East for two months before it, with no particular itinerary in mind.

  I started in Paris, wanting to relive my two brief stops there ten years previously, en route to and from a summer job in Britain [4].  From Paris, I wandered by car south-eastward through France almost at random, stopping where the spirit took me.  One of those places was Annecy, where I won about $500 at a casino.  Flush with that windfall, I headed to the Côte d'Azur, where I parted with much of my loot by staying at the Hotel Negresco in Nice, a Belle Époque watering spot that was then still singular in its luxury.

  After that touch of indolence, I resumed wandering, crossing northern Italy to Venice, then taking a boat down the Adriatic and through the Corinth Canal to Piraeus, the port of Athens.  The smog and bustle of Athens offended me, so I struck out by car for the Peloponnesus, where I had the startling encounter with the ghost of Agamemnon mentioned above.  Now besotted with antiquity, I decided to go to Rhodes, an island in the Aegean just off the coast of Turkey where most of the cultures of the ancient world intersected.  It is filled with relics of successive invaders, and once was the site of the Colossus, a 100-foot-tall bronze statue that was one of the seven wonders of the ancient world until it was destroyed by an earthquake in the third century BCE.

  The pull of antiquity led me to drive to the ancient acropolis at Lindos, taking me across a good part of the island.  It was a very hot day, so I stopped at an isolated, rustic taverna to have a bite to eat and a carafe of wine.  That's when it happened.  After eating, I lay down under a tree for a brief doze before driving on.  The combination of the wine and the sun flashing on the fluttering leaves above me must have been hypnotic, for I went into an other-worldly state.  I will swear to this day that I heard the pipes of Pan; I could almost see him.  For the only time in my life I completely knew—at the level of my soul, not in some intellectual rationalization—what it felt like to be possessed by a god.  I understood why the ancients invented so many gods to enrich their existence.  It was mind-bending.  

  That all seems so silly and romantic now, a half century later, that I hesitate to write about it.  Yet it happened, and it brought me infinitely closer to the antiquity on which I was feasting.  I might even say that it is the one truly religious experience I've ever had, although in retrospect I suppose that it was merely psychedelic.  The rest of my summer—further eastward to Israel and then a return to Western Europe and my conference—was anticlimactic.