Wednesday, June 26, 2013

Academic Tenure

  Universities have been a frequent subject of this blog [1,2,3,4].  I therefore wasn't surprised when a reader asked for my thoughts on tenure, the sacred cow of academia.  It's a tough topic—one can't help having mixed feelings, because tenure can be not only the fundament of freedom of inquiry and speech, but also a distasteful source of sinecure.
  I was exposed to the academic-freedom aspect of tenure early in my university career. When I arrived at UC Berkeley in 1960, the infamous Loyalty Oath dispute, although by then resolved, was still an active memory.  A decade earlier, during the initial anti-communist hysteria of the Cold War, new state legislation suddenly required faculty members to sign a codicil to an oath already mandated by the California constitution, which had until then required only an affirmation of support for the state and national constitutions.  The codicil said, "I am not a member of the Communist Party, or under any oath, or a party to any agreement, or under any commitment that is in conflict with my obligations under this oath."  Those who did not sign it would be fired, tenured or not.
  The faculty objected to the new oath on many grounds, among which were: it constituted a political test for employment (the Communist Party was and is a legal party in California); it violated the privilege of tenure, i.e., not to be fired without just cause, as adjudicated by a faculty investigation; it contravened the state constitution, which proscribed requiring any additional oath by the faculty beyond that already included in it; and it violated the constitution's requirement that the University be solely under the governance of the Regents, with faculty advice but without legislative interference.
  During the controversy, dozens of faculty members who refused to sign were indeed fired, although court decisions by mid-decade had reinstated them.  By the time I arrived at the University, the situation had been restored to the status quo ante, but wounds still festered.  I believe the open sore was one of the sources of the Free Speech Movement [3] of the early 1960s, when the University's administration, succumbing again to outside political pressure, denied students their long-recognized right to organize on campus for political causes—the immediate target this time being freedom marches to the South.  It was another assault on free speech at the University by the same actors who launched the Oath, and the faculty overwhelmingly supported the students' cause.
  The Loyalty Oath dispute limned the upside of tenure for me: freedoms of speech, association and inquiry are intimately part of it, and any trimming of its prerogatives attacks the core of academic freedom.  That freedom had been hard-gained as tenure grew in the 19th century from an ill-defined de facto principle in a select group of universities to the virtually universally accepted practice it is today, with more or less uniform standards throughout the country.  Weakening tenure's shield could be disastrous, as the Loyalty Oath conflict showed.
  As my career at UC unfolded, however, I also saw the downside of tenure: the accumulation of deadwood within the faculty—people who no longer seemed to be sharing the full load of research and teaching.  As chairman of a department and later dean of a school, I was appalled by the few who I felt were taking advantage of tenure to slide into academic lethargy.  They were a source of faculty and student discontent—even ridicule—and a waste of precious resources.
  At first thought, one might imagine that "just cause" for dismissal could be enlarged from transgressions such as academic dishonesty—e.g., falsifying research—to a requirement that those with tenure live up to the same standards of teaching and research that gained them tenure in the first place.  But this could be a slippery slope with a quagmire at its bottom.  It is one thing to require young faculty members to demonstrate during their initial 6-8 years that they deserve tenure—that itself creates undesirable pressure on them in formative years to adhere to the safe and acceptable, to eschew the risky.  It is a huge step to require senior faculty continually to pass judgment on their peers, risking imposition of a lifelong conformity on academic inquiry and teaching that could stultify the entire academic venture.  The Einsteins among us, whose ideas are far out of the mainstream, probably could not maintain tenure under such a regime nor would they want to make the sacrifice of creativity needed to do so.
  The trick, I think, is to make as sure as humanly possible that new hires have the character and intellect that will assure their attaining tenure and thereafter remaining fully contributing members of the faculty.  I believe that can be done by meticulous, multilevel screening of candidates before their initial appointment as well as at the tenure decision.
  In my ruminations about the differences between UCLA and UC Berkeley [2], I opined that UCLA's initial-appointment system was too lax, resting on only a dean's approval.  That places too much weight on a single judgment and defers the brunt of evaluating qualification for tenure to later events; yet it is much harder to discharge a colleague after 6-8 years than not to have appointed him or her at all.  Some universities avoid this dilemma by setting up "horse races," appointing several assistant professors for each tenured position that will be available, with the understanding by all that at most one will win.  I think such an approach is cruel, for it can tarnish many excellent academics early in their careers.
  I believe that the system at UC Berkeley strikes a happy medium.  The initial appointment is scrutinized very thoroughly at three or four levels before being sent to a disinterested and demanding campus-wide committee of faculty.  That committee has standards constraining it to approve only candidates who will almost certainly achieve tenure and not thereafter become deadwood.  Knowing these standards, lower-level evaluators daren't forward a case that doesn't meet them.  The same campus committee constantly monitors a newbie's progress and has the final word on the tenure decision.  Under this regime, it is a rare assistant professor who doesn't clear the tenure hurdle by some margin, and rarer still one who, once awarded tenure, doesn't remain a fully productive member of the faculty.

  My answer to my reader, then, is that I am four-square in favor of academic tenure in universities as it generally exists today, not only as a safeguard of academic freedom, but also as a nurturer of the unorthodox among us.  If the system of appointment to the faculty is sufficiently finely tuned, then the overhead of the few who later become deadwood is not too high a price to pay.

Wednesday, June 19, 2013

Mogie

  There's an elephant in my family.  Not a real one of course, though neither is he pure fantasy.  He has been with us for almost forty years, first appearing when my daughter Abby was three or four.  I had gone into her room to awaken her, and—tired of her imaginary friends, Blink and Blank, who she insisted were real—I was on all fours as a diversion from them, trumpeting like an elephant and swaying my head.  Abby immediately joined the make-believe. 

  She named her new friend "Mogie," because that's what his call sounded like. They were soon roaming around the house, she leading Mogie with her arm around his head, identifying all the rooms and furnishings to him and making a home for them under the dining table.  As Abby became firm friends with Mogie, he and Daddy became ever more distinct in her mind—one or the other might be present, but never both simultaneously.  Blink and Blank were never heard from again.

  Originally Mogie could say just one word, "Mogie"—yet by modulating its intonation and tempo, he could make his feelings known.  Abby soon taught him a small English vocabulary, which he learned to pronounce within the constraints of his vocal mechanism.  His syntax was never very good, but he and Abby were able to talk in a sort of pidgin.  He would say, "Mogie no like Abby go school leave Mogie 'lone," to which Abby would reply, "Abby have go school, bring Mogie-food for Mogie when come back."

  Mogie was an especially valuable friend when Abby's brother David, six years older than she, teased her.  She would scream "MOGIE!" and Mogie would charge into the room, roaring his own name, knocking David over and trampling him.  The first time this happened, David was shocked; still, he was smart enough to adopt his own alter ego, a little dog Jerry who snapped at Mogie's legs.  The teasing was soon forgotten amid the general uproar of Mogie and Jerry battling while Abby tried to separate them.

  Abby's Mommy also had to contend with Mogie.  If she was being cross with Abby, he would charge at her, pushing her into a corner with his head.  She sometimes had to pour water over him or slap him with a dish towel to defend herself, reducing Abby to tears.  Mommy would say that Mogie would be put out of the house if he didn't behave himself, and Abby would plead with Mogie to calm down.  It isn't easy to pacify a stampeding elephant.

  Mogie was soon traveling with the family on vacations.  Now it was Daddy's turn to complain: he was infuriated at having to buy two adjoining airplane seats for Mogie so that he could fly comfortably, and to rent station wagons with enough room for him in the back.  On some trips there were occasions when Mogie got separated from the family.  Once, in Italy, because he had binged on pasta, he got stuck between the walls of a narrow lane on Capri; his plight went unnoticed by the family in their concern for Abby, who had fallen ill.  When Mogie finally found his way back the next day to their hotel on the mainland, he was furious with Abby, who had by then recovered: "Mogie stuck 'tween walls Capri, shout 'HELP!!'  Abby no come.  Stuck six hours.  Firemen need crowbars pry Mogie out.  Miss last boat, sleep on wharf.  Abby no love Mogie no more, leave him Capri."  Abby rubbed his chafed sides, cooed to him, and shared some of the food that Mommy had ordered from room service.

  Although Mogie started out to be just a way for Abby and Daddy to have fun, he later became an extra way for them to relate to each other.  When Daddy was grumpy or preoccupied or inattentive—which Abby thought was excessively often—she could unfailingly get his attention and affection by yodeling, "Mogie, Mogie, Mo-oh-gie, Mo-oh-oh-oh-gie," until Mogie had to respond, and soon they were giggling and cavorting.  And when Abby was blue or sulky, or when Daddy felt he needed a few big hugs and couldn't rouse Abby to give them, all he needed to do was drop to his hands and knees, wag his head from side to side, and mewl "Mo-oh-gie?" in the softest voice.  Abby would rush to him, hug him, and say, "Oh, what a good elephant!"

  As Abby approached her teens, Mogie started saying that he was really a little girl's companion and, as Abby was no longer a child, Mogie would have to leave.  Abby couldn't stand hearing this.  Not only would Mogie leave, but Daddy would become just plain, unmagical Daddy.  She would get panicky (no make-believe here!) and beg both Daddy and Mogie not to let this happen.  Daddy realized that he too would miss Mogie if he were to leave; Mogie's moments with Abby were really special.  Daddy and Mogie decided, as one of Abby's presents for her thirteenth birthday, to give her Mogie forever.  They felt that they were giving themselves a gift too, one that all three of them would enjoy for a long time.

  I eventually wrote a book about Mogie's adventures with the family.  It was illustrated by the very talented Bryan Johnson-French—actor, musician, and artist in many media, whose latest artistry is masquerading.  His tessellated illustration for the cover of the book tells the whole story:


  Mogie is still part of the family, graying along with me.  My grand-daughters were raised in his presence: when they were younger they frolicked with him as Abby had.  He is not as spry as he was forty years ago, and his bellowing sounds a bit irritable.  Aged or not though, he has been a wonderful family member to have had all this time!

Wednesday, June 12, 2013

A New Prospectus for Science?

  I was recently delighted to come across the works of Professor Philip Kitcher of Columbia University, one of whose books I cited in [1].  An acclaimed philosopher, he blessedly doesn't seem to have a metaphysical bone in his body—unlike those whose philosophy-cum-metaphysics I have decried [2,3].   I was therefore inspired to read another of his books, Science in a Democratic Society. 

  The book argues that science only very imperfectly conforms itself to the role it should play in a democracy, a failing that has resulted in increasing popular suspicion of its results.  It notes that respect for science in the U.S. has fallen far from that in its post-World War II heyday, during which era it was much influenced by Vannevar Bush's famous report to President Roosevelt, Science, the Endless Frontier.   Kitcher thinks a newer prospectus for science is essential for our era.

  Kitcher points to a harsh indicator of the falling esteem for science and scientists:  The new sciences of molecular biology, genetically modified organisms (GMOs), and global warming have been received with mistrust, anger and denial by substantial numbers.  He maintains that the shift is partially because science has not included itself as fully within the conventions of democratic consensus as is currently expected of it.  Scientists remain too much removed from most of the citizenry, sometimes seeming to set themselves up as godheads looking down on the values and wisdom of the folk.  Society is culpable too, for it does not adequately educate all its members about the results, methods and limitations of science.  Kitcher therefore devotes his book to a proposal for returning science to a more-accepted position.

  In a perfect democracy, Kitcher writes, each adult would be fully informed and have equal weight in decision making.  While that may be possible in a democracy with only dozens of people, it of course isn't in one with millions.  Large societies have therefore organized "divisions of epistemic labor":

"Consider the entire range of questions pertinent to public life, all matters about what the society should aspire to and how it might realize whatever aims are set.  These topics are partitioned, divided into nonoverlapping sets, and for each set in the partition except one, a particular group of people is designated as authoritative with respect to that set.  For the remaining set, epistemic equality holds: that is, on these topics each citizen is entitled to make up his/her own mind. …

"[W]hat ideals to adopt, or what goals to pursue … should be assigned to that set of topics on which citizens should make up their own minds.  Moreover some questions about facts might also belong in this set, issues about which each of us can be expected to be competent, or even privileged."

I'll denote as "set A" the topics Kitcher says should reside in the domain of the public at large.

  Kitcher asserts that increasing public wariness of science boils down to this:  Many have concluded that scientists have arrogated to themselves decisions on set-A topics that should be reserved for the body politic as a whole.  In molecular biology, particularly the use of embryonic stem cells and cloning, they believe that scientists have pre-empted fundamental decisions on the value of life, which belong to set A.  The same applies to the contentious issue of GMOs and the fear many have about polluting the environment, another set-A topic.  The perception of science thus "running amok" spills over to the most important issue of all, global warming, where the populace should clearly listen to the experts, but many don't because of accumulated doubts about scientists' integrity and objectivity.  I myself suspect that this increased mistrust has magnified anti-evolution and creationist fervor beyond its levels a half century ago.

  To cope with this erosion of trust, Kitcher proposes a way to align science better with democratically derived values.  He calls it "well-ordered science," a paradigm in which "specification of the problems to be pursued would be endorsed by an ideal conversation, embodying all human points of view, under conditions of mutual engagement."  Such a conversation would be conducted more in the spirit of a town-hall meeting than through today's hierarchical political mechanism.  It is a utopian concept, but one with valuable insight into the necessary interaction of science and the public in a democracy. Kitcher clearly has a Jeffersonian faith that truth will out in reasoned discourse among well-informed people.

  In even imagining the suggested "ideal conversation," a clear tension arises between two conflicting concepts of science: an autonomous effort, largely unconstrained, for discovering fundamental principles of nature (mostly a position of scientists); and a means for solving practical problems, restrained within the context of societal values (mostly a concern of the laity).  Kitcher submits that the tension must be resolved by fully enlightened, give-and-take dialogs equally involving experts who represent the viewpoint of science in the abstract and laypersons who represent the values and needs of society as a whole.

  The book devotes hundreds of pages to elaboration of "well-ordered science."  There is no way to do justice to this difficult discussion in a short essay, but a brief summary of stages of Kitcher's ideal conversation is illuminating:

  Assessment of Options.  Scientists lay out the significance of possible courses science might take, indicating which have intrinsic interest ("pure science") and which have more immediate practical potential ("applied science").  The object is to bring all discussants to a tutored state on science and to try to come to agreement, based on societal values, on options worth pursuing for the good of both abstract science and society at large.

  Certification of Results.  As the debate on global warming illustrates, a coterie of scientists cannot alone persuasively certify the results of science when faced with untutored denial and claims of fraud and misrepresentation.  Instead, certification must involve participation by informed, disinterested and trusted laypersons, whose imprimatur will lend credence to results in a way that scientists by themselves cannot.  Standards of certification must be high and transparent, acceptable to all.

  Application of Results.  Even if scientific results are certified as true, it is not clear that a resource-limited and value-constrained society would want to take action on them, and if so, for whose benefit.  What are the benefits of building another particle accelerator compared to those of conquering cancer?   Should GMOs be further deployed, and if so how should their benefits and disbenefits enter the assessment?  How should global warming be addressed, taking account of both present needs and future menaces? 

  To many, Kitcher's scheme may appear too well-ordered—frighteningly regimented and even Big-Brotherish—undermining the classical conception of science as free-wheeling and independent in pursuing nature's truths without substantial interference.  Others will assert that there can be no agreement in the ideal conversation about the values held in set A, hence no decisions on courses of action laden with these values.   Kitcher takes pains and many pages to assuage such fears, arguing not only that they are exaggerated but that science will flourish more in a democracy using the widespread decision-making participation he proposes.  

  Although I share the concerns just mentioned, I applaud Kitcher for calling attention to an important and escalating problem and examining it rationally.  His prescriptions may seem utopian, but it is often necessary to strive for an otherworldly utopia in order to achieve a worldly harmony.

Wednesday, June 5, 2013

Aging

  "You're slumping again, Dad," says my daughter Abby.  "Yeah, and tilting to the left too," echoes my son David.

  It is their youth redux, when they could win any battle by joining forces against their common enemy, me.  I remember their old tricks and make a valiant attempt to fend them off:  "Well," I say, "I'm pretty old, and old people do tend to slump.  Anyhow, it would be nice if you were to mind your own business.  I am comfortable in my own skin."  I say this as grumpily and peevishly as possible to accentuate my age (not hard to do).

  They are having none of it.  "You're not old," they cry in unison to this octogenarian.  "Anyway," says Abby, "Mom made me promise not to let you become an old man!"  She says this as if she could stop time in its tracks.  As if I were in denial, not she.

  Don't get me wrong.  I love my children and appreciate their concern for me.  Nonetheless, I am reminded of the joke I heard on TV in the 1960s, told by comedian Sam Levenson (1911-1980) as he complained about the changing relationship between parents and children:  "When I was a boy and my father said 'Jump!', I asked 'How high?'  Now when my son says 'Jump!', I ask 'How high?'  When's my turn?"

  Aging is a strange process indeed.  Things that seemed quintessentially important decades ago turn out to have no importance at all.  Activities that were the center of one's life recede into dim memory.  (My doctor once told me of a long-retired patient who, when asked what he had done in his career, furrowed his brow and said, "I think I was a dentist.")  Soul-disrupting emotions—anger, impatience, disapproval—become ever so much less intense.  Time quickens—I do what I would have previously considered nothing and find out that a day has passed quite enjoyably.

  (I have a theory about that:  As one's metabolism slows, so does one's internal clock compared to the external world's.  One internal second, long ago synchronized to an external second, now passes while several external seconds have gone by.  To convince you of my point, I only have to ask you to compare two time spans: how long a summer vacation from school seemed to last when you were young and how fast summer goes by today.)

    There are pluses and minuses in aging.  The greatest benefit for me is a sense of peace: fewer and fewer things rile me, and the equanimity I seek, which I have frequently mentioned in this blog [1, 2, 3], comes easier and easier.  Retirement, which for me started 18 years ago, became—after a year or two of decompression—a blissful way of life in which I can do all those things that I hadn't time for in "real life," including writing the diary that is this blog.  I can control my own calendar, scheduling as many or as few items on it as I want, with only a scattering of "musts" imposed by outside forces.

  So far there are blessedly few minuses: more aches when arising in the morning; slower perambulation; the unseemly effort entailed in getting up off the floor after I've sat down on it with my younger grand-daughter; after half a lifetime playing tennis, hanging up my racquet because slowing reactions have taken the fun out of it.  A heightened awareness compared to younger days that my life after all is not of infinite extent; an almost-detached curiosity about how fate will contrive to end it.  But these latter thoughts aren't maudlin, as equanimity now rules my psyche.

  So too does Ecclesiastes now govern my philosophy:  "To every thing there is a season, and a time to every purpose under the heaven."  It is my favorite book of the Bible, for—although it is usually interpreted as pessimistic—in my age I see it simply as a wise understanding and acceptance of the natural cycle of life and the cosmic insignificance of worldly striving.  "What has been is what will be, and what has been done is what will be done, and there is nothing new under the sun."

  Now, if I could only get my kids off my (slumping) back …