Wednesday, December 26, 2012

Limericking the Muse

  I didn't publish anything in this blog during most of August.  New ideas simply weren't coming to mind.  You must know the feeling: you think until your head aches, and come up empty—the Muse has disappeared.  Despite being a rationalist, I cannot help believing in her existence; haven't I indeed dedicated my blog to her in its subtitle, Musings?  I wait impatiently to feel her presence, and panic when I don't.

  That rationalist in me keeps on saying, "Nonsense! Don't be so romantic," insisting that my "Muse" is just a small fold of brain tissue, the anterior superior temporal gyrus (aSTG), which I wrote about in May.  It functions subconsciously, apparently by obsessively searching for important but hidden relationships among the myriad fragments of data stored in the brain.  Images from fMRI machines show that it becomes especially active (finding a relationship?) a few seconds before a conscious insight, an Aha! moment.  It seems to be most active when one is relaxed, possibly daydreaming, as was Archimedes in his bathtub. 

  I do in fact find my musing most lively during that daydreaming half-hour or so before I am fully awake in the morning.  That is when my brain has started to return control to my prefrontal cortex, the center of analytical thinking, which acts as a sanity check, an inhibitor of our most outrageous thoughts, a site of our conscience—a Jiminy Cricket who goes to sleep when we do, allowing the rest of our brain to run amok with phantasmagoria.  The half-awake arousal period is when Jiminy is drowsy too, so our thoughts flit back and forth with no apparent sense of order, yet not at the level of phantasms.  Inspiration is then in attendance, apparently guided by the aSTG.  "Sleeping on a problem" has yielded fruit.

  These are fascinating scientific perceptions, but insufficient to capture the sheer marvel of inspiration.  The romantic in me responds derisively to the rationalist: "You would expound on the scattering of sunlight in the earth's atmosphere to explain the awe I feel on viewing a gorgeous sunset!"  I throw my lot in with my romantic self and with the ancients, who saw the Muse's presence as a link to the wisdom of the gods, and who thanked Helios for sunsets.  Communing with the goddess is what musing is truly about, not the random firings of neurons.  As Hesiod said almost three millennia ago, "Happy is the man whom the Muses love: sweet speech flows from his mouth."

  Still, trysting with a goddess can be a love-hate relationship.  A goddess, yes, but also an unfaithful tormentress.  Here's what others have said:

"But the fact is, she won't be summoned. She alights when it damn well pleases her.  She falls in love with one artist, then deserts him for another. She's a real bitch!"  Erica Jong

"I would especially like to recourt the Muse of poetry, who ran off with the mailman four years ago, and drops me only a scribbled postcard from time to time."  John Updike

I too have harbored such resentment.  When she vanished last August, I angrily wrote to her—she who traditionally listens more to poets than to writers of prose—in the only sort of rhyme I can handle, especially in her absence:

Oh! Muse, why didst thou desert me?
I so need thee to alert me
            In my waking dreams
            To splendid new themes.
How wouldst thou feel if thou wert me?

Art thou Erato or Clio?
I know not.  O Sole Mio!
            Whichever thou art
            Please mend my sore heart,
And revitalize my brio!

Wait! I have to end with a threat.
Know thou, if my plea is not met,
            Thine ears I'll assail
            With more doggerel
Than thou, my Divine, hast heard yet.

An offer thou canst not refuse!
Be quite sure, Olympian Muse:
            I can deftly kick
            Out more limerick
Should I decide thee to abuse!

  Fearing, I suppose, my spewing out more such claptrap, the Muse returned.  Just the same, I realized I'd offended her, and became more anxious than before.  I'm increasingly at her mercy, more tormented by her infidelities.  She drives me so crazy that today, when she has distanced herself again, I am trumpeting my anger to the world at large. 

  Whatever am I doing?  Publicly offending a mortal woman is more than any man should dare, but a goddess?  Maybe I've really done it this time!

  No!  Spurned lover that I am, I'll stand my ground.  I only wish I'd had the courage to make my limerick bawdy, as this anonymous one advises:

The limerick packs laughs anatomical
In space that is quite economical.
            But the good ones I've seen
            So seldom are clean
And the clean ones so seldom are comical.

I guess I'm insane enough to insult the Muse, but not so insane as to insult her obscenely.

  Last August, I conjectured that the Muse was merely on summer holiday.  That seemed to be the case, for she returned in September.  Maybe she is on winter holiday this time, and will return in January despite my impudence.  Zeus! If ever a father has influenced a daughter, intercede for me with yours!

Wednesday, December 19, 2012

Getting from A to Z

  Some of us, very methodical in solving problems, get from A to Z in 25 diligently organized steps.  Others get there in intuitive leaps, from A to K, then to Q, then to Z, perhaps backtracking a bit along the way when confused.  Neither process is definitively better, for methodical and intuitive people can be equally creative and successful in working things through.

  I am reminded of the two sons of good friends of mine—call the older S1 and the younger S2.  S1 is systematic in the way he tackles tasks; S2 is instinctive.  Many years ago, when S1 was about 10 or 11, he got a new computer for Christmas.  As he unpacked it, he immediately went to the instruction manual, not only to check whether all the components were there, but to prepare to assemble them.  S2, three years younger, simply started to plug the parts together and had the computer operational before S1 was even a few pages into the manual.  The two are equally smart.  I used to fault S1 when he wrote an essay because he would assemble more information than he needed to sustain his theme.  I used to fault S2 because he left out convincing arguments that were obvious to him, although maybe not to his audience.

  I am one of those who like eventually to visit every letter in the alphabet.  If I make an intuitive leap over some of them to enhance my understanding and project my path to Z, I then force myself to go back and systematically fill in the gaps to verify that my intuition hasn't led me astray.  That used to drive my late wife Helen crazy.  If I'd gotten an answer intuitively, she would wonder, why waste time shoring up the result analytically?  For her, intuitive leaps sufficed.

  I most assuredly endorse intuition as a method of attacking a problem, especially if it is preliminary to deeper investigation.  One of my postings in March, Intuition and Expertise, was indeed an essay in praise of it.  However, as I said then, I believe that intuition must be informed by expertise, and expertise comes only through arduous practice.  A chessmaster can in a flash intuit the course of a game many moves in advance by just glancing at the board, yet only because of years of hard effort.  An amateur's intuitive move is likely a mere stab in the dark.  Even my friends' son S2 assembled that computer so quickly because he had spent years of his earlier youth plugging electrical circuits together just for the fun of it, so he knew what made sense.

  Helen would also complain when I carefully plotted a route through a strange country, usually planning a fast, minimum-distance drive along autoroutes: A to Z in a trice!  "Why not wander along byways and get lost?" she would say.  "Maybe we'll see some lovely, unexpected sights. We are, after all, tourists!"  Impatient as I habitually was to get to Z, I still had to admit that she had a point, at least for tourism.  When she prevailed over my sense of efficiency, we often did come across beautiful vistas, towns, churches and the like.  It's what I call discovery by meandering.

  Transposed to problem solving, discovery by meandering—in a usually vain hope for serendipity—is an ineffective heuristic.  I liken it to jumping through the alphabet at random, guided neither by method nor intuition, placing one's faith in a stroke of luck.  Even then, one must have a well-honed ability to recognize luck when stumbling upon it, an ability that is itself part of intuition.  Without it, an amateur chess player may miss a lucky opportunity in front of his eyes.  Being truly lucky is largely a matter of preparing one's intuition to take advantage of Fortune when she smiles, not of simply praying that she does.

  In my earlier posting on intuition, I described a grand old professor of mine at MIT, Professor Ernst Guillemin, a master of the intuitive method of teaching and learning.  I remember his presenting a paper at a symposium at a time when younger members of his field had turned toward proving results through methodical sequences of theorems.  He said, "I'm not going to try to present a series of theorems and lemmas to get to my results—I wouldn't know how.  But I'm pretty sure that I will be able to convince you of my results by showing you how intuitively reasonable they are."  And he did.  Because he was a grandmaster with decades of experience, he likely knew that his results were correct because his brain had flashed through all the intermediate steps subconsciously.  And even if he stumbled on a result by sheer luck, he immediately intuitively recognized its worth.

  Few of us are purely methodical or purely intuitive; most are a combination of the two.  Although some at the intuitive extreme, like Professor Guillemin, will be inclined to leave it to others to fill in the gaps to their satisfaction, most will themselves backfill with careful analysis.  Those toward the methodical extreme will usually be unsatisfied with a result unless it also appeals to their intuition.  I like to think that I fall about halfway along the spectrum, mixing intuition and method in equal measures. 

  Were Helen alive today, I imagine she would disagree with my self-assessment, instead placing me squarely among those who address life systematically at every step.  Luckily for me, that's probably one of the reasons I was able to catch her in the first place—I guess she intuitively wanted someone she could count on to methodically conduct our affairs.  If so, it was a good trade-off, for I much needed the spontaneity she brought me, which I now sorely miss.

Wednesday, December 12, 2012

Another Saint of Education

  The best of educators are by their nature dreamers, because they focus not on the here and now but on shaping generations to come.  That engagement with the future might explain why they accept payment for their work that is often inadequate for their own needs in the present.  I recently wrote about Chris Bischof, the founder of Eastside College Preparatory School in East Palo Alto, California, who is a paradigm of this commitment.  I've lately come across another: Anne Crowden, founder of the Crowden School in Berkeley, California, a school for young musicians.

  Last month, I attended a fund-raising reception for the Crowden School and Music Center.  I went mainly because of my fondness for the reception's hosts, not at all knowing what to expect.  I got a beautiful earful: a string quartet of eighth-graders and recent alums from the school playing a work by a seventeen-year-old who graduated from it three years ago.  I don't know whether it was the acoustics of the hosts' living room, the beauty of the composition, or the sheer virtuosity of those very young players—likely a combination of the three—but I have never before been so immersed in and moved by a chamber-music performance, literally to tears.  I decided that I had to find out more about Crowden.

  I found a gem of education, figuratively sitting at my doorstep; I am sure that I have driven by it a thousand times without noticing it in plain sight.  On my visit, I encountered several dozen fourth- through eighth-grade pupils in their morning music lessons, practicing in ensembles.  Even in practice sessions, those nine- to fourteen-year-olds were playing as young professionals. 


The Crowden School

A sextet practicing with a teacher
  
  On a later visit, I saw the entire student body watching a performance by the San Francisco-based Alexander String Quartet, one of the frequent guest appearances by professional musicians at the school.  It was exhilarating to watch: those youngsters were leaning forward in their seats, intensely absorbing every bow-stroke and fingering of the Quartet as it played a piece by Schubert.

  The school was Anne Crowden's dream and accomplishment.  A Scotswoman and concert violinist, she first played a stint with the Edinburgh String Quartet before joining the famed Netherlands Chamber Orchestra in Amsterdam.  The separations from her young daughter while she was on tour were too much for her to bear, however, so when a group of friends urged her to move to the Bay Area and offered to sponsor her for permanent residency in the United States, she felt in her bones that some good spirit was telling her something.  On that hunch, she moved to Berkeley in 1965, and was soon playing with the Oakland Symphony Orchestra, the San Francisco Opera/Ballet Orchestra, and chamber music groups at universities around the Bay Area.  The move sat well with her, and a bonus was that she was able to be at home for her daughter.

  Teaching music was always in Crowden's blood, a yen that was only partly satisfied by having a roster of private students and teaching at summer music schools.  Her dream was really to start a school for young musicians like those she had seen in London.  Such a dream, if it is to come to fruition, is the point at which dreamers must descend from the ether and face the constraints of reality—in this case, no money, no site, no students and no faculty, all wrapped into a giant chicken-and-egg conundrum.  It was an arduous multiyear task, juggling all those elements until they cohered, but with the help of many local parents and the endorsement of international musical luminaries, Crowden was able to complete it.  In 1983 she started Crowden School in a church basement with 13 students in grades six and seven.
 
  The rest is history.  Crowden School is now in its 30th year, since 1998 occupying its own spacious building, bought from the Berkeley School District and lovingly renovated.  I believe it is unique in the United States: a private middle school spanning five grades, in which pupils spend each morning on instruction in music technique, ensembles and practice, and each afternoon on academic subjects—English, math, science, history, foreign languages, etc., combined with music theory and history, and chorus.  Its  graduates go on to high school fully prepared for academic challenges as well as deeply educated in music.  Many later become professional musicians.

  The school is part of a subsequently formed umbrella organization, the Crowden Music Center, which also provides year-round music instruction and summer music courses for all ages.  Its Outreach program offers music classes at elementary schools in Berkeley and Oakland.  The Center has thus become a substantial multipurpose resource for the community.

  Anne Crowden's original vision for the school was for it to be tuition-free, so that talent alone would be the criterion for entrance.  Alas! That has not come to pass.  Of the annual budget of $1.4 million, about 19%—the totality of charitable donations received—is applied to tuition assistance for most of its pupils, who otherwise would not be able to attend.  The school currently cannot afford to assist all qualified applicants.  Were charitable donations to increase enough, enrollment could expand from the present 55 to the school's capacity of 75-80.

  At the reception that introduced me to Crowden, the rising American composer Sam Adams, who graduated from the school 12 years ago, gave a stirring appreciation of his five formative years there.  He also played a stereo recording of one of his works—a stunning piece of electronic music.  His comments, the recording he played, and a New York Times review of the recent San Francisco Symphony premiere of his orchestral work Drift and Providence tell a compelling story: that musical education at Crowden, although concentrating on chamber music, is a reflection of its founder's core belief in music's power to transform the soul, opening it to new vistas.

  I am sad that I will never get to meet Anne Crowden; she died in 2004 at the age of 76.  But I look forward to engaging with her legacy.  And I have added her to my personal pantheon of saints of education, where she joins Chris Bischof. 

Wednesday, December 5, 2012

Eusociality

  Certain invertebrate species such as ants, in which individuals subordinate their own needs to those of the group, are said to be eusocial.  The term has also been applied more loosely to some vertebrate species, particularly Homo sapiens.  Unlike ants, though, humans act selfishly for their own survival and reproductive benefit as well as altruistically for the benefit of groups to which they belong.

  Eusociality is the thought-provoking subject of my book club's latest selection, The Social Conquest of Earth by E. O. Wilson of Harvard University.  Wilson is one of the world's leading biologists and probably its leading entomologist.  His lifelong study of eusociality in insect colonies led him to study it in humans, in order to explain how we came to dominate the biosphere.  In describing the difference between eusociality in ants and humans, Wilson wryly observed that "Karl Marx was right, socialism works, it is just that he had the wrong species."

  A warning before I proceed: Wilson is one of the originators and a strong proponent of a controversial sociobiological theory which asserts that eusociality depends on the evolution of groups as groups, a level of evolution above that of individuals.  As Darwin did with ant and bee colonies, he sees the group as a macro-organism, subject to the forces of natural selection.  It is a top-down theory, in which the group selectively chooses as members those who express a set of innate altruistic traits favoring the group's success in competition with other groups.  Wilson is an equally strong opponent of an alternative evolutionary theory called kin selection, a bottom-up theory, which posits that altruistic behavior in groups evolves only because individuals in the groups already share family genes, and accordingly at times act selflessly to assure the perpetuation of their genes in others if not in themselves.

  The argument rages on.  Evolutionists Jerry Coyne of the University of Chicago and Richard Dawkins of Oxford have unsparingly attacked Wilson [1,2].  In the remainder of this posting, however, I'll summarize some key points of Wilson's book as well as I can—it is not always easy for a layperson to follow.

****

  The development of eusociality is quite rare.  Of the many millions of invertebrate species, only a score or so have independently achieved it, the remaining species competing largely at the level of individuals rather than groups.  (Think solitary flies versus colonial ants.)  Among vertebrates, eusociality is even rarer, having occurred in just a handful of species, including Homo sapiens.  Its appearance in ants has led to their dominance over the insect world, in humans to their dominance over the world at large.

  What is amazing, Wilson maintains, is that eusociality has occurred at all, for the evolutionary maze that leads to it is so extensive and full of twists and dead ends that for any species to wend its way through the maze is something of a miracle.  After all, eusociality must get its rudimentary start in a species where the rule is selfish natural selection at the individual level, and selflessness/altruism the exception.  Still, ants and their precursors negotiated the maze successfully over many millions of years, as did the line of hominins leading to Homo sapiens in a somewhat shorter time.  Wilson describes the process as involving a long sequence of small evolutionary pre-adaptations over hundreds of thousands to millions of years, leading to a tipping point where the random mutation of as little as a single gene in an individual could start the final conversion to full eusociality.

  Eusociality in humans is of course very far from the extreme of the genetically homogeneous, fixed-caste behavior of ants, yet goes much farther than the social behavior of other vertebrates.  A critical competitive advantage of humans vis-à-vis other vertebrates is that they build defensible, multigenerational, task-allocating cooperatives to which members may have no genetic kinship.  Each community, Wilson says, selects for inclusion individuals who express a desirable subset of altruistic behaviors, most of which were already deeply ensconced in the human genome at the time Homo sapiens emigrated from Africa some 60,000 years ago.  The chosen subset reflects the evolved culture of the group.

  The tension in mankind between the opposing forces of individual/selfish natural selection and group/selfless selection is complex.  Genetic evolution in the former arises from competition between members of a group, in the latter from competition between groups. The two types of competition point evolutionarily in opposite directions because one requires selfishness and the other altruism.  Homo sapiens maintains a tenuous balance between the two because of what Wilson calls the "iron rule": selfish individuals beat altruistic ones, yet groups of altruists beat groups of selfish individuals. 
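
  Wilson states the iron rule verbally, but a small calculation makes the tension vivid.  Here is a minimal numerical sketch of my own devising—not a model from Wilson's book, and the benefit and cost figures are arbitrary—in which altruists lose ground inside each of two groups, yet gain ground in the population as a whole because the altruist-rich group outgrows the altruist-poor one.

```python
# Toy illustration of the "iron rule" (my own sketch, not Wilson's model).
# Everyone in a group gains benefit b times the group's altruist fraction;
# altruists additionally pay a personal cost c.  The values of b and c are
# arbitrary, chosen only to make the effect visible.

def reproduce(altruists, selfish, b=5.0, c=1.0):
    """One generation of fitness-proportional reproduction in one group."""
    p = altruists / (altruists + selfish)   # the group's altruist fraction
    return altruists * (1 + b * p - c), selfish * (1 + b * p)

g1 = reproduce(80, 20)   # altruist-rich group
g2 = reproduce(20, 80)   # altruist-poor group

for label, (a, s) in (("group 1", g1), ("group 2", g2)):
    print(f"{label}: altruist share now {a / (a + s):.3f}")   # falls in BOTH groups

a_all, s_all = g1[0] + g2[0], g1[1] + g2[1]
print(f"overall: altruist share now {a_all / (a_all + s_all):.3f}")  # yet rises
```

Within each group the selfish do better (the altruist share falls from 0.800 to 0.762 in one group and from 0.200 to 0.111 in the other), but the altruist-rich group grows from 100 to 420 while the other grows only to 180, so the overall altruist share climbs from 0.500 to 0.567—Simpson's paradox doing evolutionary work.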

  The relationship between genes and culture in eusocial groups is also a critical element in group evolution, Wilson asserts, and equally complex.  In building a culture, genes make available not just one trait as opposed to another, but patterns of traits that together define the culture.  The expression of multiple traits is plastic, allowing a society to choose an ensemble from many available combinations, the choices differing among societies.  Wilson contends that the degree of plasticity itself is subject to evolution by natural selection.  These concepts add another perspective to the nature/nurture question, very different from, for example, the one presented in a book by psychologist Bruce Hood that I discussed in my posting The Self.

  A final, quintessential point.  For Wilson, the indispensable requirement for the development of human eusociality—without which group selection could not have proceeded—is the primitively evolved instinct to form tribes.  He says that people feel they must belong to tribes for their existential sense of identity, security and well-being.  Eons ago they were small bands that hunted and gathered together, built encampments, and defended each other and their young.  Today, individuals join many interlocking tribes—city, country, religion, even sports teams—each commanding loyalty, communal effort, and competition with other tribes of the same genre.  In this sense group evolution is bilateral: groups select individuals based on group needs, and individuals choose groups based on their own needs.

  As evidence that the tribal instinct is extremely deeply embedded in the species, Wilson cites a recent experiment showing that, when pictures of out-group people are flashed in front of experimental subjects, their amygdalas—their brains' centers of fear and anger—activate so quickly and subtly that the conscious centers of their brains are unaware of the response.  Other experiments revealed that even when experimenters created groups at random, inter-group prejudice quickly established itself, subjects always ranking the out-group below the in-group.

****

  Many of the ideas in Wilson's book are indeed controversial, and at times I found them alternately too detailed and too sketchy.  Still, the book is a feast for thought, one that has forced me to reconsider some previous musings in this blog.

  First, the deep-seated need for tribal association undermines my conjecture in a recent posting, where I discussed tribalism/clannishness and what I called their antithesis, "anticlans."  I suggested that tribalism might be ready for obsolescence in our modern, globally hyper-connected society.  I was of course out of my depth, and therefore took to wild speculation.  I asked whether the modern obsessive inclusiveness of social-networking activities could lead to a widespread "we're all in it together" anticlan behavior, which over the long run could evolutionarily trump the exclusionary, "we vs. them" behavior of clans. 

  I think Wilson would answer my question with a definitive "No."  He spends just one paragraph of his 350-page book noting that the increasing interconnection of people worldwide through the internet and globalization weakens the relevance of ethnicity, locality, and nationhood as sources of identification.  Tribes may wax and wane and sources of tribal identification may shift, but Wilson would hold that tribalism itself will survive, overwhelming anticlan behavior just as it has always defeated less-tribal species.  Bummer.

  Second, the almost-impossibility of negotiating the evolutionary labyrinth to yield a species like Homo sapiens sheds further light on my posting on the search for extra-terrestrial intelligence (SETI).  In a catalog of obstacles to the success of SETI, I listed the rarity of a star having a planet capable of hosting life at all, much less an advanced civilization existing at a time when electromagnetic radiation from it might be detected by us now.  If Wilson is correct about the very remote possibility that Homo sapiens could have evolved on Earth, the odds against there being a similar species extraterrestrially are even longer than I thought.  Could we therefore be alone in the Galaxy, dare I say in the universe?  We'll probably never know.  Bummer again.
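
  To see why the odds lengthen so sharply, consider a Drake-style back-of-the-envelope product.  Every factor below is a placeholder of my own invention, not a measured quantity; the point is purely structural—multiply in one more tiny, Wilson-style "evolutionary maze" factor and an apparently crowded galaxy empties out.

```python
# Drake-style back-of-the-envelope (all factors are invented placeholders;
# only the multiplicative structure matters).

stars_in_galaxy = 2e11
factors = [
    0.5,     # star has planets
    0.1,     # such a star has a habitable planet
    0.1,     # life actually arises there
    0.01,    # intelligence evolves
    0.001,   # a detectable technological civilization exists right now
]

n = stars_in_galaxy
for f in factors:
    n *= f
print(f"before Wilson: ~{n:,.0f} detectable civilizations")   # ~10,000

# Wilson's nearly impassable eusociality maze, as one more small factor:
print(f"after Wilson:  ~{n * 1e-6:.2f}")                      # ~0.01 -- likely alone
```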

  I sometimes wish that newly developing scientific theories didn't get in the way of one's fondest hopes.

Wednesday, November 28, 2012

The Other Side of the Coin

  Several of my friends, knowing my aversion to things military, were aghast at the revelation in my most recent posting that I'd been involved in military work in the summer of 1948.  Despite their shock, they had the kindness to ask for other tales of my early professional days.  I'll oblige, but probably will upset them again by disclosing still more of my early defense work.  It's the reverse side of the coin whose obverse shows my present, more pacific image.

  In that last posting, I surveyed the geopolitical context of the summer of 1948: the Soviet blockade of Berlin and the Berlin Airlift.  Things got worse after that, becoming a period of heightened fear and paranoia in the US.  Many were predicting a nuclear Armageddon; others were calling for a first strike on the USSR while it was behind in the nuclear arms race.  Anti-communist hysteria raged.  (I remember my mother hiding away books by Russian Bolsheviks that she had bought in 1920 as a college student, and a colleague at one of my defense jobs being discharged and blacklisted because his father had been a radical during the Great Depression.)  The Korean War, which started in 1950, was seen as an augury of things to come, part of an ongoing communist conspiracy to take over the world.  Bomb shelters were a hot item.  To begin to understand the zeitgeist of the era, it's worth watching the superb satirical movie Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb.

  Most engineering jobs at the time were defense oriented, rather than civilian.  So, after earning my Master's degree from MIT in 1952, I joined MIT's Lincoln Laboratory, a just-started undertaking for the Air Force aimed at upgrading the country's air-defense system.  One element of the system was to be the Distant Early Warning (DEW) Line—a network of radar stations in far-northern Canada and Alaska, built to detect Soviet bombers coming over the polar region.   I first worked on a classified radio system, NOMAC, which would provide a minimally detectable and jam-proof connection between the DEW Line and the Air Force's central command. Later, still at Lincoln Laboratory, I did my MIT doctoral thesis on theoretical models of the radio-propagation distortion that could degrade the operation of NOMAC.   

  The work was at once technically exhilarating and dismaying in its Cold War context, yet strangely normal for one who had been weaned on the upheavals of the Great Depression and World War II.  From my present-day viewpoint, the main good to come from it was that the signaling technique we used in NOMAC became a predecessor of code-division multiple access (CDMA), which is now used by many cellphone carriers such as Verizon; and my radio-propagation studies led to much of my later academic work, which in turn was used in the development of the GSM cellular-signaling system employed by other carriers such as AT&T.
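
  For readers who want to see the family resemblance: NOMAC hid its signal in a noise-like carrier and recovered it by correlation.  The sketch below is a modern direct-sequence rendering with made-up parameters—emphatically not the actual NOMAC design, which was analog and classified—but it shows the shared principle: spread each bit across many pseudo-random chips, and correlation at the receiver lifts the data out of noise that buries any individual chip.

```python
import numpy as np

# Direct-sequence spread spectrum in miniature (illustrative parameters only;
# NOMAC itself was an analog noise-carrier system, not this digital form).
rng = np.random.default_rng(0)
chips = 64                                      # spreading factor
code = rng.choice([-1.0, 1.0], size=chips)      # pseudo-random spreading code
bits = rng.choice([-1.0, 1.0], size=20)         # data bits to send

tx = np.concatenate([b * code for b in bits])   # each bit spread over 64 chips
rx = tx + 2.0 * rng.standard_normal(tx.size)    # noise at ~4x the signal power

# The receiver correlates each 64-chip block against the shared code.  The
# processing gain (a factor of 64) pulls the bits back out of the noise.
decoded = np.sign(rx.reshape(-1, chips) @ code)
print(f"fraction of bits recovered: {(decoded == bits).mean():.2f}")  # ~1.00
```

A jammer fares no better than the noise does here: its energy, being uncorrelated with the code, is spread out by the very correlation that concentrates the signal—which is what made the scheme both hard to detect and hard to jam.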

  Since I'm in a revelatory mode, here's more grist for the mill.  While working on my doctorate, I also consulted for EG&G, a company with origins at MIT that developed instrumentation for the atomic-bomb test range in Nevada.  My task was to analyze how such instruments distorted the measurements they made, with an eye to correcting the distortion.  In the course of that work, I visited the test range.  Testing was then still done above ground, on towers; I actually saw such a test from 13 miles away, wearing goggles with a 10,000-fold attenuation.  It was overwhelming and sobering.  But, once more, I must admit that it just seemed part of the era's reality, and in that context was by force of habit unexceptional to me.

  And there's yet more.  After getting my doctorate in 1956, I worked for four years at the Hughes Aircraft Company in Los Angeles, doing theoretical work that in part underpinned the development of radar systems for fighter aircraft.  An unclassified presentation of some of that work at a conference led to my being recruited to teach at the University of California at Berkeley in 1960.  What's amazing is that, except for a brief stint as an office boy before college and three work-study semesters at the radio and  television manufacturer Philco as an undergraduate, UCB was my first non-defense job.  I was already thirty.

  Even after having just again watched Dr. Strangelove, it's hard for me to re-create in my mind both the temper of those times and how natural it seemed to work under its influence.  Those who grew up later, I believe, cannot understand the miasma in which we were engulfed.  The lifting of the cloud was slow, starting with the downfall of Senator McCarthy in 1954 and the subsequent abating of anti-communist hysteria.  It received a major impetus from President Eisenhower's farewell address in January 1961, when he warned of the unprecedented power of what he called the "military-industrial complex," which he saw as threatening to change the country's very core principles—a grave statement from a lifelong military man.  Nonetheless, echoes of the 1940s and 1950s reverberated for decades, through the Vietnam era and beyond.

  Eisenhower's speech was a clairvoyant precursor of the remainder of the 1960s, yet in an ironic way.  Student protestors, also fearful of the powerful, conformity-inducing "establishment" (which included the military-industrial complex), ultimately did change the very nature of our society.  But the transformation was in the opposite direction from the one that Eisenhower worried about: rather than a change to more military fire power, it was ultimately to the hippie generation's flower power.  As I've written elsewhere in this blog, I had a front-row seat at that upheaval too.
  
  The reverse side of my coin, before it flipped to its obverse in 1960, shows a side of my career that is scarcely recognizable to me now.   As the ancients understood, Tempora mutantur, nos et mutamur in illis.  Times change, and we change with them.

Tuesday, November 20, 2012

Changing Gears

  During the summer of 1948, after my freshman year at MIT, I got a job at the ARMA Corporation in Brooklyn.  It was by sheer luck.  I had visited the company where I'd worked as an office boy before college, just to say hello.  One of the secretaries, hearing that I was looking for a summer job, referred me to a friend at ARMA, who was an MIT alumnus.  At 18, I was apparently already part of the MIT old boys' network (there were very few women then at MIT), for after a brief interview I was hired.

  ARMA, a military contractor, was then developing an electromechanical fire-control computer for calculating the settings needed by a submarine's torpedo so as to make it hit a ship whose coordinates, speed and bearing had been entered into the computer along with other data. Electromechanical computers are analog devices that do computations using a mass of motors, shafts, gears and other components.  Each such computer was built for a special purpose, unlike later general-purpose electronic digital computers.  (There were then only a handful of digital computers in the world, each taking a roomful of equipment.)  The torpedo-firing problem would be the only one ARMA's computer would be able to solve.

  My job at first seemed colossally boring.  It was to manually calculate the answers to problems the computer would be asked to solve after it was built, to make sure that it was functioning properly.  Sitting eight hours a day doing such calculations didn't seem like it would be much fun.  But the job paid $27 per week, more than my college-graduate sister was making in the nascent TV industry, so who was I to complain?

  For those who know my current-day aversion to things military, I should give some background for this decidedly war-oriented work.  By the end of World War II in 1945, Eastern Europe was under the control of the USSR.  Defeated Germany had been divided into four zones, administered respectively by the US, the UK, France and the USSR.  Berlin, an enclave deep within the USSR's zone, was similarly divided into four sectors.  The US, UK and France accessed and supplied their Berlin sectors through specified railway lines, roads and canals crossing the USSR's zone of Germany, as well as by air.

  In June 1948, the Soviets suddenly blockaded all surface connections to the western allies' sectors in Berlin, thereby trying to make the western part of the city fully dependent on the USSR for provisioning.  That precipitated the first Cold War crisis.  The US and UK, unwilling to give the Soviets such a stranglehold, pledged to supply the western sectors by air.  During the ensuing 11 months, the Berlin Airlift flew an amazing 200,000 flights to the city, each day providing West Berliners up to 4700 tons of necessities such as fuel and food.  As I daily rode the subway to Brooklyn, I read the New York Times' dispatches on the blockade, along with analyses that pondered whether a hot war with the Soviet Union would erupt.  I felt I was doing a minimal but meaningful job in this internationally tense context.

  As it turned out, the Soviets backed down the following May and lifted the blockade.  A few months later, Germany was formally divided into the Federal Republic of Germany (West Germany) and the German Democratic Republic (East Germany), with Berlin now divided between the two new countries.  Both remained formally occupied until 1955.  I was struck by the irony that, right before my eyes, the western part of a despised former enemy was emerging as an ally of the West, an indispensable "bastion of freedom" against the communist threat.

  Back to my job at ARMA.  In those days, manual calculations involved parsing a set of equations into a number of steps, calculating each step, recording its intermediate answer on paper, and slowly working up through these steps until getting a final answer.  Each step was carried out by referencing printed tables of mathematical functions and using a desktop Marchant mechanical calculator to do the arithmetic.  Compared to modern electronic calculators, the Marchant was molasses: using a complicated system of rotating gears, it took seconds for an addition or subtraction and ten seconds or more for multiplication and division.  Grinding through even a simple set of equations could take hours, and was very prone to errors.

  I had never used a Marchant, but once I familiarized myself with it, I set about the calculations that I was asked to do.  The trouble was that the answers seemed crazy, for they weren't directing the torpedoes toward the target's coordinates that I had started with.  I was pretty sure that I was making mistakes, and was terrified, spending many days doing fruitless recalculations, and sleepless nights wondering how I could be so in error.

  After what seemed like an eternity, I understood the problem.  The new computer for submarines was based on equations similar to those designed into an existing ARMA computer that controlled the firing of torpedoes from destroyers.  But the two situations had a critical difference.  A destroyer launched torpedoes from port and starboard, a submarine from bow and stern.  The equations for the two cases therefore should have reflected the very different launching symmetries the two types of ship had with respect to their forward-to-aft axes, but they didn't.  On probing, I discovered that the difference involved a single plus sign that should have been a minus sign in the equations I had been given; a change of sign could be implemented in the fire-control computer for submarines by adding a single gear to its design, changing the rotation of a single shaft.  I redid my calculations with the new sign, and they suddenly gave sensible answers.

  At first, no one believed me.  But I persisted in working up through my boss to his boss, and convinced him.  The new gear was added to the computer.  I was ecstatic—my very first contribution to a real-world engineering design!
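
  For the technically curious, here is a toy version of the fire-control triangle—my reconstruction for illustration only; ARMA's actual equations were far more elaborate.  The deflection, or lead, angle follows from the law of sines, and the point of my summer's drama is in the last two lines: mirroring the launch geometry across the ship's fore-and-aft axis simply negates the deflection, which on an electromechanical computer is one extra gear reversing one shaft.

```python
import math

# Toy fire-control triangle (my illustrative reconstruction, not ARMA's
# equations).  If the torpedo leads the line of sight by deflection angle d,
# and the target's course crosses that line at track angle a, then equal
# travel times plus the law of sines give:
#     sin(d) = (target_speed / torpedo_speed) * sin(a)

def deflection_deg(target_speed, torpedo_speed, track_angle_deg):
    a = math.radians(track_angle_deg)
    return math.degrees(math.asin(target_speed / torpedo_speed * math.sin(a)))

print(f"{deflection_deg(20, 45, 70):+.1f} degrees")   # +24.7: lead to one side
# Mirror the geometry across the fore-and-aft axis (a -> -a) and the
# deflection changes sign.  With the sign convention wrong -- as it was in
# the equations I was handed -- every answer steers away from the target.
print(f"{deflection_deg(20, 45, -70):+.1f} degrees")  # -24.7: lead to the other
```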

  As you might imagine, I was fixated on gears that summer, so they became a metaphorical theme for my thinking.   I thought of myself as up-shifting from student engineer to professional engineer.  I  thought of the country's policy toward Germany as slamming into reverse.  Change was exciting, and vivid in my imagination.  

  That exuberant youth was six decades away from being jaded with change and subscribing to the proverb Plus ça change, plus c'est la même chose.  For him it was still Plus ça change, plus c'est une bonne chose.

Wednesday, November 14, 2012

Pluto and Me

  When Pluto was recently demoted by the International Astronomical Union (IAU) from being a full-fledged planet to dwarf-planet status, many people the world over were upset by the seeming arrogance of the action.  They felt that the IAU had arbitrarily over-ruled a scientific fact they knew to be true: the Sun has nine planets.  They mourned the loss of one of them.

  I guess I had even more reason to be upset, for Pluto was discovered during the week of my birth in 1930.  In a sense, it is my birth sign.  So, when my book club decided to read The Hunt for Planet X: New Worlds and the Fate of Pluto by Dutch astronomer Govert Schilling, I plunged into it to see why and how my birthright had been diminished.  I finally decided it was all the fault of modern electronics.

  A brief history: The first five planets other than Earth—Mercury, Venus, Mars, Jupiter and Saturn—were known to the ancient Babylonians.  Although the telescope was invented in 1608, and some telescopic sightings of Uranus were reported in 1690, it wasn't until 1781 that Uranus was confirmed as the seventh planet.  Likewise, Galileo's drawings show that he had seen Neptune as early as 1612, but it wasn't established as the eighth planet until 1846, and that was mostly because Uranus' orbit deviated from the one dictated by Newtonian physics.  A trans-Uranian planet was conjectured as the cause, leading to a successful telescopic search for Neptune in a calculated location.

  An earlier "eighth planet," Ceres, had been detected in 1801, orbiting between Mars and Jupiter.  It is very small, less than 1000 km (600 miles) in diameter.  But soon many other, smaller objects were discovered between Mars and Jupiter, so astronomers decided not to call them all planets, re-categorizing them as asteroids.  Adding an eighth planet to the known solar system thus had to await Neptune's discovery in 1846.  Finding Pluto took almost another century.  Its orbit is mostly beyond Neptune's; its diameter—about 2300 km—is about half the width of the U.S. and half the diameter of Mercury.

  Finding a new body in the solar system by telescopic observation had thus traditionally been a tortuous endeavor, involving hand-written records or, later, huge libraries of photographic plates, together with mind-bending manual calculations.  Then came a late-20th-century breakthrough: mounting electronic CCD cameras on telescopes—cameras like those in cellphones, but with orders of magnitude more sensitivity and resolution.  Using digitally stored photographs from them, a computer can quickly detect a new solar-system object and calculate its orbit.  Combined with increasingly large terrestrial and space telescopes, such cameras have within the past several decades found a cornucopia of objects revolving around the sun.  The largest, Eris, found in 2005, is roughly the size of Pluto.  Should Eris therefore be called the tenth planet, and other new bodies also be added to the list?

  Astronomers were now in the same position as when they defined the asteroid belt, having to revisit the question of which objects would be classified as planets.  Like Congress drafting a bill to benefit certain companies without naming them, the IAU—amid similarly acrimonious debates—set about drafting a definition of "planet" that would include the classical eight but exclude Pluto, Eris and lesser bodies, without naming any of them.  It finally resolved that, to be called a planet in our solar system, an object must satisfy three criteria:

It must circle the sun, but not be the satellite of a planet.

It must be massive enough to be in hydrostatic equilibrium under its own gravitational force, normally meaning that it is of spherical or ellipsoidal shape.

It must have cleared the neighborhood around its orbit, i.e., be gravitationally dominant in it, so no objects of comparable size are present there other than its own moons or other bodies under its gravitational influence.

Several of the eight classical planets have objects in their orbits that have not been cleared (so-called Trojans), but they are synchronous with the planet, locked in by its gravitational field so as to revolve around the Sun at fixed distances ahead of or behind it.  Those planets therefore satisfy all three conditions, which is why they still qualify as planets under the new prescription.

  Pluto, Eris and Ceres pass the first two tests but fail the third.  Pluto's orbit passes through the Kuiper belt, a region well beyond Neptune's orbit, in which more than 100,000 objects over 100 km in diameter are believed to exist, including many that Pluto hasn't cleared from its orbit but aren't under its gravitational sway.  Eris, which ranges further from the Sun than Pluto—through the Kuiper belt and beyond into the so-called scattered disk—similarly hasn't cleared its orbit.  And Ceres has neither cleared other asteroids from its orbit nor locked them into synchronism.
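
  The three criteria amount to a simple decision procedure.  Here is my own encoding of them (the per-body verdicts just restate the discussion above):

```python
# The IAU's three tests as a decision procedure (my own encoding; the
# verdicts for each body restate the discussion above).

def classify(orbits_sun, hydrostatic_equilibrium, cleared_neighborhood):
    if not orbits_sun:
        return "not a planet (a satellite, per the IAU)"
    if not hydrostatic_equilibrium:
        return "small solar-system body"
    return "planet" if cleared_neighborhood else "dwarf planet"

bodies = {
    "Neptune": (True, True, True),   # its Trojans are gravitationally locked,
                                     #   so its neighborhood counts as cleared
    "Pluto":   (True, True, False),  # orbit runs through the crowded Kuiper belt
    "Eris":    (True, True, False),  # Kuiper belt and scattered disk
    "Ceres":   (True, True, False),  # shares the asteroid belt
    "Charon":  (False, True, False), # circles Pluto, not the sun directly
}
for name, tests in bodies.items():
    print(f"{name}: {classify(*tests)}")
```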

  Bodies like Pluto, Eris and Ceres that satisfy only the first two criteria are now called dwarf planets.  Two others have been recognized by the IAU:  Haumea and Makemake, both having diameters roughly 60% of Pluto's and orbits about the same size.  It is suspected that another 100 now-known bodies may qualify, and the eventual total may be as many as 200.  So poor Pluto, my cherished birth sign, has been legislated out of its former planetary grandeur. 

  I'm delighted, though, that Pluto retains some of its idiosyncrasy.  It has a nearby sister, Charon, half its diameter but large enough that the two jointly revolve around a barycenter lying between their surfaces rather than inside Pluto, like an unbalanced dumbbell—see the illustration below.  (For comparison, the earth-moon barycenter is about 1700 km—1000 miles—below the earth's surface.)  Considering the external position of their barycenter, some astronomers call Pluto and Charon a double dwarf planet, twins so to speak, unique in our solar system; but the IAU persists in classifying Charon as a moon of Pluto.  The former designation appeals to me because of its singular glamor, but I am solipsistically pulled in the other direction: since I don't have a twin, how can Pluto, my birth sign, have one?
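
  The dumbbell picture is simple arithmetic: the barycenter of two bodies lies at distance d·m2/(m1+m2) from the primary's center, where d is their separation.  Here is the computation for both pairs, using round published figures that I am supplying for illustration (they are not from Schilling's book):

```python
# Where a two-body barycenter falls: at distance d * m2 / (m1 + m2) from the
# primary's center, for separation d.  The figures below are round published
# values supplied for illustration, not numbers from Schilling's book.

def barycenter_km(separation_km, m_primary, m_secondary):
    return separation_km * m_secondary / (m_primary + m_secondary)

# Earth-Moon: mass ratio ~81.3 to 1, separation ~384,400 km.
r = barycenter_km(384_400, 81.3, 1.0)
print(f"Earth-Moon barycenter: {r:,.0f} km from Earth's center, "
      f"i.e. {6_371 - r:,.0f} km below the surface")      # ~1,700 km deep

# Pluto-Charon: Charon has ~12% of Pluto's mass, separation ~19,600 km.
r = barycenter_km(19_600, 1.0, 0.12)
print(f"Pluto-Charon barycenter: {r:,.0f} km from Pluto's center, "
      f"well outside Pluto's ~1,150 km radius")           # the dumbbell
```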


Artist's portrayal of Pluto and Charon [source unknown]. The added
white dot is the barycenter around which the two rotate in common. 

  At any rate, I still feel sad for Pluto.  Perhaps never having achieved glory is better than achieving it and then having it fall to others' machinations.

Wednesday, November 7, 2012

On the Other Side of the Highway

  It's heartening to know that saints still walk among us.  One of them is Chris Bischof, who by his lifework has once again shown us that children who likely would become castoffs from our society can be guided to full participation in it and full fruition of their talents.  His devotion to his calling is nothing short of transcendental.

  This was Bischof's grim starting point:  East Palo Alto, adjacent to but "on the other side of the highway" from its better-known and more affluent neighbor, has had no public high school since 1976.  All its teen-agers, predominantly from minority groups, were bused to high schools in more-prosperous neighboring towns.  Because their elementary-school education was usually below par, most were tracked in high school into the lowest-level, non-college-preparatory classes; and because they were bused to unfamiliar communities, they became increasingly alienated.  Sixty-five percent of them ended up dropping out, and of the remainder fewer than 10% went on to a four-year college.  That's shocking: in a Bay Area city right next to Silicon Valley, only one in three youngsters completed high school and only one in thirty went to college!  In effect, they were being consigned by neglect and isolation to society's dust bin.

  That was intolerable to Bischof.  As a Stanford undergraduate in the late '80s and early '90s, he started an after-school program for East Palo Alto elementary-school students, linking basketball with tutoring, hoping that this intervention would improve their chances when they went to high school.  But he soon realized that ever so much more was needed.   So, in 1996, soon after getting his Master's degree in education from Stanford, he recruited fellow graduate Helen Kim to help him start a new high school in the city—not just a garden-variety one, but the incredible Eastside College Preparatory School.

  As Bischof says, they started Eastside before they were ready, but eight youngsters who had been part of the after-school program for five years were about to enter ninth grade.  They had put their faith in him to help them do better than being bused to another town for a second-tier high school education.  "Sometimes," he says, "you just have to take a leap of faith, and trust that either there will be a net to catch you, or you will learn to fly."   So Eastside started with those eight freshmen, two teachers, and an old van to pick up the students.  Until some catch-as-catch-can classroom space was found, their "schoolhouse" was a picnic table under a tree in a park.


The Original "Schoolhouse" 
  
  All eight of those freshmen graduated four years later as Eastside's Class of 2000 and went to four-year colleges.  By that time, Eastside had moved into a donated house, had taken in three new classes of freshmen, and had achieved its goal of full accreditation by the Western Association of Schools and Colleges, an indispensable imprimatur.

  Fast forward to the present, Eastside's 17th school year.  It now includes a junior high school, so it covers grades 6 through 12.  It has a beautiful campus on 5.5 acres, currently with about 325 students, all from minorities—63% Latino, 34% African American and 3% Pacific Islander.  The graduation rate is 85%, and every graduate has gone on to a four-year college, over half to the most selective colleges and universities in the country.  Ninety-eight percent have been the first in their families to go to college.  Eighty percent of graduates finish college (compared to 11% nationally for first-generation college students).


Eastside's Campus Today (Source: Google Earth)



Students with Chris Bischof (left) and Helen Kim (fourth from right)
  
  As a lifelong educator myself, I have participated in minority-outreach programs since the 1960s.  Those programs, usually underfunded and understaffed, have had varying degrees of success, sometimes impressive but nothing like what I heard about when I was first introduced to Eastside six years ago.  In a state of disbelief, I went to the campus to visit classes and speak at length with Bischof, faculty members and students.  I was completely convinced: such a transformative education was really taking place there. 

  And there's more.  Until a few years ago, enrollment consisted only of day students.  Some who scarcely had homes to return to at night were taken in by Eastside's incredibly dedicated faculty to live with them.  Today, Eastside has dormitories—the block of buildings on the right side of the quadrangle in the aerial view above—that will eventually hold half of its students.  (About 90 live there now.)  They have allowed Eastside to draw 18% of its students from other minority communities around the Bay Area, as well as made it possible for students to continue at the school when their families have had to move away.

  Remarkably, all this is done completely with private donations through a combination of annual fund raising and income from endowment—not a cent of public money is involved.  The budget this year is $6.2 million, or $19,000 per student, which is more than twice what California spends per student in its public schools, but much less than other private schools.  And what a difference that expenditure has made over the years: so far over 750 Eastsiders have been lovingly and painstakingly guided away from the fate they might have had as society's castoffs and to lives taking full advantage of their innate talents.

  What are Eastside's "magic" ingredients for taking minority students with huge education and resource gaps and getting them to the level of college graduates?  There are many.  Among them: 

A rigorous, demanding schedule, far beyond what the students have experienced before.  The school day runs from 8 a.m. to 5 p.m.  The curriculum is what one would find in the best college preparatory programs in the country, in no way watered down. 
The most accomplished teachers, as committed as Bischof to the school's mission. 
Personalized instruction: During the school day and extending until 10 p.m., time is scheduled for one-on-one tutoring, especially for students who are falling behind.  No student is allowed to drop through the cracks.  
An esprit de corps that tangibly infects the campus, students and faculty alike.  Despite the onerous schedule, the students draw from it the emotional support to persevere.
Sponsored learning experiences every summer for each student, throughout the country and the world.
Guidance throughout the college years, especially to make sure that the transition to college is successfully navigated.

From my own long experience with outreach programs, I know that every one of these ingredients is essential.  Drop one or two of them, and the chances of overcoming the obstacles these students face would plummet.

  In my eyes, Eastside—Bischof's brainchild, his passion, the whole of his existence—is a full-blown miracle.  That's why I look on him as a modern-day saint.  I apologize if I embarrass him by this accolade, for he is a modest man, but I must say what I feel.