Wednesday, July 31, 2013

Vanishing Jobs

  America has long cherished revolutions in technology.  It is an article of faith that they are always for the good, spawning vital new industries, spurring economic growth, creating more jobs than they destroy.  GDP, productivity, profits and wages increase, all boats rising on a waxing tide.

  Indeed, this rosy scenario has repeatedly been validated.  Although mechanization of agriculture displaced most of the country's farmers in the 19th and 20th centuries (they accounted for 90% of the working population in 1800, only 2% in 2000), they and their children were fully absorbed into a growing urbanized society.  Replacement of the horse and buggy by the automobile created millions of net new jobs in the 20th century, not only in car manufacturing but in the mobile economy created as Americans took to the roads.  Motion pictures, radio and television displaced other forms of entertainment, but spawned megalithic replacements.  Following all these innovations, GDP and the standard of living persistently increased.

  This is not to say that technology revolutions did not cause painful dislocations for workers whose skills were no longer needed, but the pain was temporary, lasting only until workers acquired the new skills the changed economy demanded.  Workers ultimately benefited with shorter hours and higher pay.  When I was a teen-ager, there was frequent talk about a likely 30-hour work week and a nagging concern that Americans wouldn't know how to use all of their leisure time.

  Could that virtuous cycle have come to an end?  In an exquisite irony, MIT—that great bastion of technology—has an article in this summer's issue of its Technology Review declaratively entitled "How Technology Is Destroying Jobs."  Much of it is based on the work of Erik Brynjolfsson and Andrew McAfee of the MIT Sloan School of Management.  The striking graphic below, central to the article, shows how the growth of productivity is now greatly outpacing the growth of employment in the U.S.; innovation has kept productivity skyrocketing, while employment has recently remained almost flat.

[Figure: The increasing gap between productivity and employment.  Source: MIT Technology Review, July-August 2013, p. 31.]

  Part of the flat-lining of employment is surely the effect of the two recessions in the first decade of this century, shown by the two dips in the employment curve during those years.  However, according to the Technology Review article, Brynjolfsson and McAfee "believe that rapid technological change has been destroying jobs faster than it is creating them, contributing to the stagnation of median income and the growth of inequality in the United States."  Robots and computers, they say, are displacing workers with mid-level skills faster than they are creating new demand for such workers.

  As a result, the fraction of jobs requiring mid-level skills—the jobs most easily done by automation—seems to be declining for good.  Many blue-collar jobs, mostly in manufacturing, continue to be permanently replaced by robots; more and more white-collar positions like those in middle management, bookkeeping and paralegal research are being taken over by computers.  The swath of jobs being replaced by automata is therefore steadily widening.  Conversely, the fraction of jobs not easily subject to replacement by automata (mostly low-skilled jobs in service industries like home care and hairdressing, and high-skilled jobs like designing iPads and apps) continues to increase.  Altogether, though, employment as a fraction of the population is falling.

  The first chart below shows the 1980-2005 decline in the share of employment at mid-skill levels.  The second chart shows an associated redistribution of income during that period—a more lopsided curve, with most of the gains going to people at the very top skill levels.  In effect, the great middle class, defined by mid-level skills and incomes, which once embraced the preponderance of the population, is diminishing rapidly.  Not all of these changes can be attributed to automation alone; off-shoring of mid-skill jobs to countries with lower wages has contributed.  However, note that both charts cover a quarter century prior to the Great Recession, which therefore cannot be blamed for the trend, but certainly has accelerated it.

[Figure: Changes by skill level in share of total employment and in hourly wages, 1980-2005.  Source: MIT Technology Review, July-August 2013, p. 33.]

  If Brynjolfsson and McAfee are right, and I believe they are, new norms for unemployment and underemployment, much higher than we have traditionally expected, are here to stay.  The situation will be made worse as the eligibility age for retirement benefits moves upward from 65 to 70.

  It's not a pretty picture.  The bifurcation of the country into those who have substantial wealth and skills and those who have little of either—with fewer and fewer in the middle—will further reflect itself in partisan politics, as if its present polarization were not disastrous enough.  The "normal" future that we have been awaiting will not come.  The future is already here, with depressingly different norms.

  I don't think that all this is just a new Luddite scenario, fated to fade away.  The digital revolution is accelerating so much faster than previous technology revolutions that I don't believe employment will ever be able to catch up.  The future of American society under such a regime is anyone's guess.

  To find out more, you might want to read Brynjolfsson and McAfee's very short and readable book, Race Against the Machine: How the Digital Revolution is Accelerating Innovation, Driving Productivity, and Irreversibly Transforming Employment and the Economy.  Its first three chapters elaborate the statistics given above.  The last two attempt to cast a more optimistic light on the trend by proposing numerous steps that could be taken to reverse it.  Those cures seem to me like bromides, not likely to be implemented and probably ineffective if they were.

  If you weren't already aware of the likely permanence of the new economy, sorry to spoil your day.  In fact, a lot more than a day has been spoiled.

Wednesday, July 24, 2013

La Bella Italia

  Although I've written here about my experiences in Britain, Russia and France [1, 2, 3], I've so far neglected Italy, a favorite.  Despite my surname, I have no ancestors from that country that I know of, although I feel an intense kinship with it.  I cannot be other than rhapsodic: Italy cooks the best cuisine; lilts with the most musical language; sparkles with some of the world's most verdant countryside and beautiful towns and cities; stuns with its gorgeous classical art, churches and monuments; beats the world in modern design; overwhelms with the loveliness of its people.

  I won't prattle about Italy's museums, monuments and other tourist attractions, dazzling though they may be. Because of its foundational place in Roman and Greek antiquity and the Renaissance, Italy has more of these than most, and you've no doubt seen them.  Rather, I'll reminisce about some of the personal experiences that enamored me of the country.

  An earliest memory: Helen and I were visiting Rome in 1967 with David, who was not yet two.  We'd arrived at our hotel at about 7:30 pm, tired and with David becoming cranky.  The concierge recommended a nearby trattoria where he said we could get a quick meal; but it was Friday night and we found a long waiting line.  We joined the line, imagining it was the best we could do.  Soon the maître d' came by and scolded us: "You can't let the bambino wait for food at this hour!  Come with me."  He seated us, brought a roll for David, and said he would immediately bring a plate of pasta for him.  "The two of you can wait for your food, but he must eat."

  A later memory, this one concerning Abby:  We were driving with her through the Veneto in 1983; she was 12 at the time.  It was foggy—I could barely see the road ahead.  As we rounded a curve, the fog broke in patches and Abby shouted, "Stop, Daddy!"  Fearing that I was about to hit something, I brought the car to a quick halt.  But Abby had yelled because she had been staggered by suddenly seeing the 16th-century Villa Barbaro, a Palladian masterpiece, through a gap in the fog.  Luck was with us: the villa was open that day, so we could feast our eyes on its marvelous architecture and its still-vibrant Veronese frescoes.  It was then, I am sure, that Abby decided to become an architect.
Villa Barbaro
    Still later, when visiting Abby in Florence in 1992, where she was studying on a college semester abroad: Abby had absorbed Helen's and my love of Italy and the Italians.  As she took us to the many out-of-the-way sights and restaurants that had become her favorites, I was charmed by how patiently everyone encouraged her to speak in her adequate but still-faltering Italian, urging her with smiles to finish her sentences.  The Italians must be the most child-centric people in the world.

  There are so many more threads in the pastiche of my memory:  The infinite variety of pasta dishes; pasta is surely the primordial tranquilizer, which must partially account for the serenity of the Italian soul.  The delight of participating in a national pastime—watching opera—especially in provincial opera houses like the beautiful one in Trieste, a mixture of La Fenice on the inside and La Scala on the outside.  The surprise that even I—an inveterate hater of shopping, especially in big cities—actually enjoy the experience in Milan, where focused boutiques limn the elegance of Italian clothing design.  The joy of staying in small towns like Asolo, a jewel of the Veneto.

  Our favorite spot in the country?  That has to be Portofino, which we visited time and time again.  It was to be Abby's wedding site until the 9/11 catastrophe struck just a few weeks before, and the wedding plans there had to be canceled.  Helen often said that she wanted her ashes to be strewn in the hills above that lovely town, although I haven't been emotionally able to accede to that wish.  
Portofino
  A final, wrenching memory: a last boat ride with Helen and Abby in Venice in 1998, on the way to the airport just before returning to the U.S., where Helen died a few months later.  I am so glad that she got to enjoy la bella Italia once again in her final days.
Helen's Farewell to Italy

Wednesday, July 17, 2013

A Skeptical Neuroscientist


  An erstwhile member of my book club, Dr. Robert Burton, is an eminent neuroscientist and sometime novelist and columnist.  He has written two lay books on his field, the more recent being A Skeptic's Guide to the Mind: What Neuroscience Can and Cannot Tell Us About Ourselves. 

  The book addresses the capabilities of the brain and mind.  Most importantly, it is careful not to conflate the two.  Through advances in fMRI and other technologies, we are getting to know more and more about the operation of the brain, a delimited entity made up of untold billions of neurons and trillions of their connections.  However, we know very little about the mind, with its much less understood attributes of intention, agency, causation, feelings of morality [1], sense of self and free will [2], etc.  Burton stresses that the mind is a largely uncharted abstraction extending well beyond the brain and the sensory organs attached to it, indeed to the entire outside world: 

"If we wish to understand such phenomena as group behavior, cultural biases, or even mass hysteria, it seems preferable to see the mind in its largest possible context rather than persisting with the arcane notion of the individual mind under our personal control.  The receptors of our conceptual mind reach out to the far corners of the universe even as our experiential mind tells its personal tales and sings its unique songs just behind our eyes."

From this viewpoint, the human mind is diffuse—akin to, yet much more complex than, the mechanisms governing colonies of eusocial insects such as ants and bees [3], synchronized flocks of birds, and swarms of locusts.

  The skepticism that Burton announces in his title stems from assertions by some neuroscientists that they have located attributes of the mind at distinct sites in the brain.  Looking at neurons and their connections, he says, can no more tell us about the behavior of the mind than examining carbon atoms can reveal higher-order features of life.  Confounding an understanding of the mind even further is the self-reference of the effort: the observer, who has a mind, is trying to comprehend the mind's behavior, which cannot be done without the overlay of the observer's solipsistic, perceptual and experimental biases.  Considerations such as these lead Burton to doubt that we can ever fully understand the mind at all.  His argument is persuasive; I am convinced that attempts to fathom the mind are at best in their infancy.

  However, with considerably less warrant, Burton extends his skepticism to fields other than neuroscience, although he admits that they are not as susceptible to the bias of the observer.  For example, after peremptorily dismissing modern cosmological theories by such scientists as Stephen Hawking and Lawrence Krauss, who contend that our universe may have arisen from "nothing" (more precisely, in Krauss' case, from the quantum froth and embedded energy in the so-called vacuum of space [4]), Burton says:

"My interest is in underscoring how an operational conception of the mind is the beginning point for all theories—whether talking about the cosmos, climate change, or the nature of consciousness.  Theories should not begin with assumptions about the subject under inquiry; they should begin with a close look at the tool—the mind—that creates these assumptions.  Otherwise it is a short step to believing in the spontaneous creation of something from nothing."

Even acknowledging our capability for conceptual error, that seems to me a prescription for paralysis.  Despite the dead ends that physical science has encountered, nothing would ever be accomplished if we were to follow such a timid and introspective course.  That would be skepticism raised to the Nth power.
 
  Taken as a whole, A Skeptic's Guide is an interesting read, with many fascinating clinical anecdotes illustrating points about the mind.  Still, I'm disturbed by the skepticism that is the book's hallmark, which at times surfaces as derision for the lifework of those who do not have Burton's worldview.  Perhaps unfairly, I cannot rid myself of the feeling that Burton has set himself up as the Great Mocker: he is too quick to spotlight humanity's built-in biases and limited cognitive powers, belittling its efforts to overcome them so as to make sense of the mysteries of the universe.

  To be sure, we all have our blind spots and predispositions.  Yet we must examine the universe through the imperfect lenses we have, trying as well as we can to overcome their distortions. I don't think those distortions are as pervasive as Burton would have us believe.  And I don't think that addressing them with Burton's deep-seated skepticism is a productive endeavor. 

  At the risk of seeming to mock Burton as he does others, I would refer him to Ralph Waldo Emerson's observation, "Skepticism is slow suicide."
 

Wednesday, July 10, 2013

A Clot of Revolutionaries

  Last week I repeated what has become an annual ritual for me: each Fourth of July I read the Declaration of Independence anew.  As always, I was affected to the verge of tears.  Why?  Because a band of revolutionaries again spoke directly to me across a span of almost 250 years.

  I am forever stunned that colonial America, sparsely populated as it was, could accumulate one of those rare "clots of excess genius" about which I have previously written [1, 2].  To see how rare it was, consider this:  If one were to count only seven key founding fathers—John Adams, Benjamin Franklin, Alexander Hamilton, John Jay, Thomas Jefferson, James Madison and George Washington—and scale their number up from the population of the colonies to America's present population, one would have to find over a thousand today who have such political and philosophical brilliance.  I would be hard put to name half a dozen.
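
  A rough back-of-the-envelope version of that scaling, using approximate round figures (a colonial population of somewhere between 2 and 2.5 million in 1776, and a U.S. population of about 316 million in 2013), runs as follows:

\[
  7 \times \frac{316\ \text{million}}{2.5\ \text{million}} \approx 900,
  \qquad
  7 \times \frac{316\ \text{million}}{2\ \text{million}} \approx 1100
\]

In other words, on the order of a thousand people of comparable political and philosophical brilliance.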

  What brings me to tears is not just the Enlightenment ideals of the revolutionaries—as valid today as they were then—and the fact that they were acted on so valiantly.  I am also carried away by the soaring prose of the Declaration.  As familiar as the words are, I cannot help but quote from them here—not from the litany of grievances against George III, trenchant as it is, but from the timeless response to those grievances:  

We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.—That to secure these rights, Governments are instituted among Men, deriving their just powers from the consent of the governed, —That whenever any Form of Government becomes destructive of these ends, it is the Right of the People to alter or to abolish it, and to institute new Government, laying its foundation on such principles and organizing its powers in such form, as to them shall seem most likely to effect their Safety and Happiness. Prudence, indeed, will dictate that Governments long established should not be changed for light and transient causes; and accordingly all experience hath shewn, that mankind are more disposed to suffer, while evils are sufferable, than to right themselves by abolishing the forms to which they are accustomed.  But when a long train of abuses and usurpations, pursuing invariably the same Object evinces a design to reduce them under absolute Despotism, it is their right, it is their duty, to throw off such Government, and to provide new Guards for their future security. 

We, therefore, … solemnly publish and declare, That these United Colonies are, and of Right ought to be Free and Independent States; that they are Absolved from all Allegiance to the British Crown. … And for the support of this Declaration, with a firm reliance on the protection of divine Providence, we mutually pledge to each other our Lives, our Fortunes and our sacred Honor.

Don't those words still speak to billions of people now living, who need their encouragement?

  The Declaration united what until then the British had belittled as a ragtag, leaderless collection of insurrectionary militias.  Were it not for the Declaration's stirring articulation of colonial aspirations, the rebellion might indeed have fizzled.  Instead, it created a nation, thereby standing as a major turning point in human history.

  Yet here we are, a quarter millennium later, still witnessing insurrectionary movements the world over that are trying, in the words of the Declaration, "to institute new Government, laying its foundation on such principles and organizing its powers in such form, as to them shall seem most likely to effect their Safety and Happiness."  Most are inchoate, unable to create a vision for a prospective nation, and for that reason often cannot establish a coherent polity even if the insurrection itself succeeds.

  How fortunate for America that our clot of revolutionary leaders coalesced so successfully when it did!  I can never complete my celebration of the Fourth without yet again entering their minds by reading the Declaration and thanking them for the heritage they passed on to us.

Wednesday, July 3, 2013

The Power of the Pendulum

  The pendulum exerts a powerful force in American society: whenever an extreme position gains too much sway over one of our institutions, a countervailing trend sets in [1].  That may be happening now in the Supreme Court, which at least since Bush v. Gore in 2000 has moved steadily to the right.

  I am encouraged in my hope by the philosophy expressed by Justice Stephen Breyer in his 2010 book Making Our Democracy Work: A Judge's View.  He emphasizes that the Court has always had to strive to re-earn its reputation whenever it has strayed too far from the mainstream.  Keeping its credibility, he says, depends on three principles:

• Adhering to traditions slowly accumulated over centuries—for example, respect for established precedent and care in deciding cases on the narrow basis of points of law brought to it, without expanding them from the bench.
• Steering a course between too-fettered originalism and unfettered subjectivism.
• Respecting the roles in the system of diverse other players: legislative bodies, the Executive and its agencies, lower courts, and the public.

  The current Court is due for a return to these principles.  Its first seven years (2005-2012) are characterized by Marcia Coyle in The Roberts Court: The Struggle for the Constitution as dominated by "a confident conservative majority with a muscular sense of power, a notable disdain for Congress, and a willingness to act aggressively."  She singles out three high-profile decisions in that period in which the Court distinctly parted from Breyer's dicta. 

  One of those decisions reversed a century of judicial precedent that had held the Second Amendment's right to bear arms is not an individual right but—in the words of that Amendment—is tied to "a well regulated Militia, being necessary to the security of a free State."  Another struck down the very limited school-integration programs of two cities, sounding a death knell for almost a half-century of such programs.  Of the three decisions, however, the one that I believe best illustrates the Court's swerve to extremism was Citizens United v. FEC.

  For over a century, Congress and various state legislatures had tried to limit the effect of large amounts of money on the democratic process.  Most recently, the 2002 McCain-Feingold Act prohibited the use of corporate and labor-union treasury funds to pay for ads supporting or opposing federal candidates within certain time periods before elections.

  In the run-up to the 2008 presidential election, an ultra-conservative organization, Citizens United, produced "Hillary: The Movie."  It brought suit against the Federal Election Commission on very narrow grounds, seeking to establish only that the movie and ads for it were not subject to McCain-Feingold's proscriptions but were merely protected political speech.  For fear of losing, Citizens United carefully avoided challenging the constitutionality of either the Act itself or any of the precedents concerning campaign finance.

  After the case progressed to the Supreme Court and was argued there, the Court unexpectedly ordered a re-argument in its following term, asking the two sides to address whether the Court should look at the validity of the underlying laws and precedents—an unusual request because that broader question had not been addressed or adjudicated in the lower court.  After re-argument, the Court invalidated McCain-Feingold's ban on the use of corporate and union treasury funds for electioneering purposes, declaring that there can be no limits on such expenditures as long as they are made independently of candidates' campaigns.

   In coming to its decision, the Court violated Breyer's principles in at least three ways: it unilaterally expanded the original question that had been brought to it; it paid no deference to a century of legislative efforts at the state and federal levels to limit the influence of money on elections; and it reversed a long history of judicial precedents supporting that legislation. 

  Adverse reaction was immediate and bipartisan.  Senator McCain called the ruling "The most misguided, naïve, uninformed, egregious decision of the United States Supreme Court, I think, in the twenty-first century."  President Obama excoriated the Court during a State of the Union address—with Chief Justice Roberts in attendance—saying, "Last week, the Supreme Court reversed a century of law to open the floodgates for special interests … to spend without limit in our elections."

  The visceral reaction to Citizens United and other Roberts Court decisions might have reminded the Court—particularly Chief Justice Roberts—of Breyer's assertion that it must constantly re-earn its esteem.  That may explain why at the end of the 2011-2012 term, when a 5-4 majority was about to strike down the very core of the Affordable Care Act—the signature piece of legislation sponsored by a popular sitting president—Chief Justice Roberts felt obligated to switch sides.  He thereby not only saved almost all of the Act from a judicial evisceration, but partially redeemed the reputation of the Court from charges of extremism.

  Could the ACA decision signal a slowing, or even a reversal, of the pendulum's swing?  High-profile decisions at the end of the 2012-2013 term were mixed, yet they may indicate as much:

• The Court tilted a bit to the political left when it overturned, as an unconstitutional deprivation of equal rights, the part of the Defense of Marriage Act that denied federal benefits to same-sex couples legally married in a state. 
• It punted on trial- and appeals-court decisions that California's Proposition 8, which forbade same-sex marriage, is unconstitutional, doing so on the narrow ground that those who appealed the trial court's holding had no standing to do so, thereby letting that court's decision stand.
• It also punted on affirmative action at the University of Texas by ordering a lower court to determine whether there was a valid and well-applied state rationale for the criteria used.
• In one voting-rights case it said that Arizona could not add a requirement mandating proof of citizenship to the existing uniform federal voter-qualification standards.
• Another voting-rights case, however, went the other way, gutting almost a half century of legislation requiring federal pre-approval of changes to election procedures in jurisdictions with histories of discrimination.  This, despite four Congressional reauthorizations of the original 1965 act, the last in 2006, when tens of thousands of pages of investigatory material were compiled and the vote was an astonishing 98-0 in the Senate and 390-33 in the House.  The decision thus gave shocking evidence supporting Coyle's charge that the Court has "a notable disdain for Congress."

  Many of these decisions were 5-4, indicating that the Court remains on a cusp.  Nonetheless, comparing them to the three right-leaning decisions of earlier years focused on by Coyle (see above) might encourage one to feel that the Court is becoming more moderate.  A lengthy front-page analysis in the New York Times last week argues otherwise, contending that Chief Justice Roberts continues to pursue a very conservative agenda by guiding the Court through a Machiavellian series of zigs and zags while moving it inexorably to the right.  

  Even if the Court's pendulum is not yet reversing its swing, I cling to my optimism that it will do so sooner rather than later—say within this decade.  Although the Court is not supposed to be a political institution, it does respond to political and societal pressure.  I think that time is near.