Wednesday, December 5, 2012

Eusociality

  Certain invertebrate species such as ants, in which individuals subordinate their own needs to those of the group, are said to be eusocial.  The term has also been applied more loosely to some vertebrate species, particularly Homo sapiens.  Unlike ants, though, humans act selfishly for their own survival and reproductive benefit as well as altruistically for the benefit of groups to which they belong.

  Eusociality is the thought-provoking subject of my book club's latest selection, The Social Conquest of Earth by E. O. Wilson of Harvard University.  Wilson is one of the world's leading biologists and probably its leading entomologist.  His lifelong study of eusociality in insect colonies led him to study it in humans, in order to explain how we came to dominate the biosphere.  In describing the difference between eusociality in ants and humans, Wilson wryly observed that "Karl Marx was right, socialism works, it is just that he had the wrong species."

  A warning before I proceed: Wilson is one of the originators and a strong proponent of a controversial sociobiological theory which asserts that eusociality is dependent on evolution of groups as groups, a level of evolution above that of individuals.  As did Darwin with ant and bee colonies, he sees the group as a macro-organism, subject to forces of natural selection.  It is a top-down theory, in which the group selectively chooses as members those who express a set of innate altruistic traits favoring the group's success in competition with other groups.  Wilson is an equally strong opponent of an alternative evolutionary theory called kin selection, a bottom-up theory, which posits that altruistic behavior in groups evolves only because individuals in the groups already share family genes, and accordingly at times act selflessly to assure the perpetuation of their genes in others if not in themselves. 

  The argument rages.  Evolutionists Jerry Coyne of the University of Chicago and Richard Dawkins of Oxford have unsparingly attacked Wilson [1,2].  In the remainder of this posting, however, I'll summarize some key points of Wilson's book as well as I can—it is not always easy for a layperson to follow.

****

  The development of eusociality is quite rare.  Of the many millions of invertebrate species, only a score or so have independently achieved it, the remaining species competing largely at the level of individuals rather than groups.  (Think flies and ants.)  Among vertebrates, eusociality is even rarer, having occurred in just a handful of species, including Homo sapiens.  Its appearance in ants has led to their dominance over the insect world, in humans to their dominance over the world at large.

  What is amazing, Wilson maintains, is that eusociality has occurred at all, for the evolutionary maze that leads to it is so extensive and so full of twists and dead ends that for any species to wend its way through is something of a miracle.  After all, eusociality must get its rudimentary start in a species where the rule is selfish natural selection at the individual level, and selflessness/altruism the exception.  Still, ants and their precursors negotiated the maze successfully over many millions of years, as did the line of hominins leading to Homo sapiens in a somewhat shorter time.  Wilson describes the process as involving a long sequence of small evolutionary pre-adaptations over hundreds of thousands to millions of years, leading to a tipping point where the random mutation of even a single gene in an individual could start the final conversion to full eusociality.

  Eusociality in humans is of course very far from the extreme of the genetically homogeneous, fixed-caste behavior of ants, yet goes much farther than the social behavior of other vertebrates.  A critical competitive advantage of humans vis-à-vis other vertebrates is that they build defensible, multigenerational, task-allocating cooperatives whose members may share no genetic kinship.  Each community, Wilson says, selects for inclusion individuals who express a desirable subset of altruistic behaviors, most of which were already deeply ensconced in the human genome at the time Homo sapiens emigrated from Africa some 60,000 years ago.  The chosen subset reflects the evolved culture of the group.

  The tension in mankind between the opposing forces of individual/selfish natural selection and group/selfless selection is complex.  Genetic evolution in the former arises from competition between members of a group, in the latter from competition between groups. The two types of competition point evolutionarily in opposite directions because one requires selfishness and the other altruism.  Homo sapiens maintains a tenuous balance between the two because of what Wilson calls the "iron rule": selfish individuals beat altruistic ones, yet groups of altruists beat groups of selfish individuals. 

  The relationship between genes and culture in eusocial groups is also a critical element in group evolution, Wilson asserts, and equally complex.  In building a culture, genes make available not just one trait as opposed to another, but patterns of traits that together define the culture.  The expression of multiple traits is plastic, allowing a society to choose an ensemble from many available combinations, the choices differing among societies.  Wilson contends that the degree of plasticity itself is subject to evolution by natural selection.  These concepts add another perspective to the nature/nurture question, very different, for example, from the one presented in a book by psychologist Bruce Hood that I discussed in my posting The Self.

  A final, quintessential point.  For Wilson, the indispensable requirement for the development of human eusociality—without which group selection could not have proceeded—is the primitively evolved instinct to form tribes.  He says that people feel they must belong to tribes for their existential sense of identity, security and well-being.  Eons ago they were small bands that hunted and gathered together, built encampments, and defended each other and their young.  Today, individuals join many interlocking tribes—city, country, religion, even sports teams—each commanding loyalty, communal effort, and competition with other tribes of the same genre.  In this sense group evolution is bilateral: groups select individuals based on group needs, and individuals choose groups based on their own needs.

  As evidence that the tribal instinct is extremely deeply embedded in the species, Wilson cites a recent experiment showing that, when pictures of out-group people are flashed in front of experimental subjects, their amygdalas—their brains' centers of fear and anger—activate so quickly and subtly that the conscious centers of their brains are unaware of the response.  Other experiments revealed that even when experimenters created groups at random, inter-group prejudice quickly established itself, subjects always ranking the out-group below the in-group.

****

  Many of the ideas in Wilson's book are indeed controversial, and at times I found them either too detailed or too sketchy.  Still, the book is a feast for thought, one that has forced me to reconsider some previous musings in this blog.

  First, the deep-seated need for tribal association undermines my conjecture in a recent posting, where I discussed tribalism/clannishness and what I called their antithesis, "anticlans."  I suggested that tribalism might be ready for obsolescence in our modern, globally hyper-connected society.  I was of course out of my depth, and therefore took to wild speculation.  I asked whether the modern obsessive inclusiveness of social-networking activities could lead to a widespread "we're all in it together" anticlan behavior, which over the long run could evolutionarily trump the exclusionary, "us vs. them" behavior of clans.

  I think Wilson would answer my question with a definitive "No."  He spends just one paragraph of his 350-page book noting that the increasing interconnection of people worldwide through the internet and globalization weakens the relevance of ethnicity, locality, and nationhood as sources of identification.  Tribes may wax and wane and sources of tribal identification may shift, but Wilson would hold that tribalism itself will survive, overwhelming anticlan behavior just as it has always defeated less-tribal species.  Bummer.

  Second, the almost-impossibility of negotiating the evolutionary labyrinth to yield a species like Homo sapiens sheds further light on my posting on the search for extraterrestrial intelligence (SETI).  In a catalog of obstacles to the success of SETI, I listed the rarity of a star having a planet capable of hosting life at all, much less an advanced civilization that existed at a time when electromagnetic radiation from it might be detected by us now.  If Wilson is correct about the very remote possibility that Homo sapiens could have evolved on Earth, the odds against a similar species arising extraterrestrially are even larger than I thought.  Could we therefore be alone in the Galaxy, dare I say in the universe?  We'll probably never know.  Bummer again.

  I sometimes wish that newly developing scientific theories didn't get in the way of one's fondest hopes.

Wednesday, October 17, 2012

The Anticlan

  I wrote last week about the lifelong damage that discrimination inflicts, a result of the clannish, "us vs. them" behavior that is all too normal in human affairs.  Today, I write about the antithesis of such clannishness, a "we, all of us" mindset that is embodied in a group I know, which I'll call The Anticlan.  (I won't identify it for fear of embarrassing its core members.  Many readers will recognize it anyway.)  The Anticlan is surely not unique.  It is no doubt like other outreaching groups that you may know, but let me describe its operation so I can make my final point.

   A traditional clan is exclusionary, with membership obtained on a limited basis, usually through birth or marriage.  It views other clans with suspicion, often hatred.  The Anticlan, by contrast, is obsessively inclusive.  It is as if it were a giant planet like Jupiter, sweeping into itself everyone who comes near.  Originally comprising a nuclear family that still forms its core, The Anticlan has grown for many years as others came into its gravitational field.

  I was drawn into orbit decades ago through a business colleague, himself a member by marriage.  Others were captured by living on the same block, having gone to school or done business with other members, or chance meetings.  Each new member in turn attracts additional members in the same manner.  This growth is not at all by active planning or proselytizing, merely by happenstance and an unspoken credo of inclusion.

  An initial introduction is followed by a warm embrace: one is asked to various Anticlan events.  The invitations are sometimes quirky.  In my case, my business colleague would call up and say, "Sunday's dinner is at 6 pm."  Just that.  I would scramble to my calendar to see if I had forgotten something, but I hadn't.  Yet such invitations were so charming and sincere in their idiosyncrasy that my wife Helen and I would attend if we could.  Others in the extended Anticlan would be there too, likely invited in the same abridged manner.

  Often the invitations are to more significant events than dinner—weddings, graduations, even vacations.  On one occasion, Helen and I were informed that The Anticlan had rented a compound in Hawaii, at which a milestone birthday of The Anticlan's matriarch would be celebrated.  We were told by my colleague that seats had already been reserved for us on such and such flights; we only had to make the final arrangements.  We were again enchanted by the warm but eccentric inclusiveness, and found ourselves going. 

  While we were in Hawaii, a signal event occurred.  One of the matriarch's sons, as a surprise to his parents, flew his new fiancée there from Russia, where he had met her, in order to introduce her to them—and, incidentally, to the assembled horde from The Anticlan.  Poor woman, I thought.  She could barely speak English at the time; Hawaii and The Anticlan must have seemed like another planet!  Yet that planet was the self-same Jupiter I have mentioned, and she too was captured by its strong gravitational pull.  When she married into The Anticlan, her parents, who lived in Moscow, were also swept into its orbit, embraced by it on their own trips to the U.S.

  On another occasion, a year or two after Helen died, I almost magically found myself on a sailing trip in the Caribbean—ten glorious days of hopping between islands, sharing nautical tasks with other Anticlan members I'd not previously met.  The boat charter, every detail, had already been arranged, and I was simply told when and at what gate to appear at the airport.  I, a loner (especially after Helen's death), had still after all those years not quite gotten used to the zaniness of such arrangements, but was again quite literally captivated.

  To this day, I have thus been drawn out of my shell.  Recently, I attended the 90th birthday of The Anticlan's patriarch.  About sixty people from all over the country were there to pay him honor.  Although I knew most of the attendees from previous Anticlan occasions, there were again new faces.  One couple had not long before moved from Australia to settle in the Bay Area.  They had been introduced to the core family by another son, who—although himself living on the East Coast—had thought that the new arrivals would need a network of friends as they adjusted to California.  The woman of the couple told me that they had been adopted by The Anticlan, which made their transition so much easier.  I had heard the same story many times before from migrants to the Bay Area from the world over.

  Why do I tell this collection of anecdotes?  Because I think they well illustrate how The Anticlan's infectious inclusiveness runs counter to the normal human trait of clannishness—a trait I believe must be a genetic legacy of the cave man's need for self-defense, 50,000 or more years ago.  Although recent millennia of human history have led us to ever more inclusiveness, our genes still often reflexively tell us to flock with birds of our own feather, conduct that even now splinters the world.  The Anticlan flocks with birds of many feathers in a lovely symbiosis.

  That leads me to my point.  I cannot help wondering:  Is this Facebook generation, with its obsessive "friending," the germ of a new norm of inclusiveness, an Anticlan on steroids?  Will the age of digital hyperconnectivity be a turning point in human social organization?

  In a previous posting, Neurons and the Internet, I quoted Jesuit priest and media scholar John Culkin as saying, "We shape our tools and thereafter they shape us."  That was in a pessimistic context: the possible deleterious neuronal effects of multitasking.  More optimistically, in the present context, could social networking be rewiring our neuronal networks toward a more inclusive nature?   Could even a genetic adaptation start to take place?

   Of course, I am wildly speculating here.  A connection between multitasking and neuronal rewiring in individuals at least has some basis in observational data, as discussed in the Neurons and the Internet posting.  I'm unaware of any scientifically established connection between social networking and reduced clannishness, not even a hypothesized neuronal rewiring of individuals, much less a predicted evolution of the species.  However, if there were a beneficent evolutionary force at work over generations—if belonging to extensive social networks confers a competitive advantage over belonging to just a narrow clan—one can only hope that its fulfillment won't take another 50,000 years!

Wednesday, June 20, 2012

Neurons and the Internet

  Socrates would have much to say about the Internet if he were alive today—and he would not be complimentary.
 
  In his own time, Socrates was distressed by the increasing use of writing in the place of bardic recitation, oral discourse and philosophical dialog.  That may be why we have no writings directly from him.  We know of his concern only through Plato, in whose Phaedrus we find Socrates recounting objections by mythological Egyptian god-king Thamus to the gift of writing: "[It] will create forgetfulness in the learners' souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves; … they will appear to be omniscient and will generally know nothing." 

  A reincarnated Socrates would no doubt have the same reaction to the Internet.  He might paraphrase Thamus by saying, "The Internet will create forgetfulness in its users, because they will not use their own memories; they will trust external websites and not remember of themselves; they will acquire broad knowledge but it will be shallow." 

  Neurologically speaking, Socrates would be onto something.  Modern research has shown that neuronal circuits in the brain, including its memory, wax and wane in response to demands placed on them.  For example, would-be London cabbies are required to memorize the location of every street and landmark within six miles of Charing Cross and the best routes among them.  As they train to meet this requirement, their brains' posterior hippocampi—which store spatial representations and navigation information—get bigger. After a cabbie's retirement, the hippocampus returns to normal size.

  Persistent use of the Internet might be causing similar brain changes.  Recent studies show that people who heavily multitask and hyperlink find prolonged concentration difficult. They are much more easily distracted by irrelevant environmental stimuli.  They tend to have less control over working memory—the temporary memory involved in ongoing information processing, which is sensitive to interruptions.  They have more difficulty in the formation and retention of both short- and long-term memories.

  It would seem from these observations that neuronal circuits such as those serving long-term memory, which depend on intense and repeated concentration to thrive, are being weakened in chronic Internet users, while circuits involved in rapid scanning, skimming and multitasking are being strengthened.  This is an atavism of a sort, a throwback to the days of cave people, who may not have been deep thinkers but were supremely alert to sudden events, like motion in peripheral vision that might warn of a predator.  Today's sudden events consist of arriving email and texts, hyperlinks, and the like, which deflect us from current tasks onto others.

  These topics and many more are discussed in a fascinating book by Nicholas Carr, The Shallows: What the Internet Is Doing to Our Brains.  Carr's main thesis is that today's Internet-dominated activities are indeed causing our brains to rewire themselves.  He aptly quotes the Jesuit priest and media scholar John Culkin: "We shape our tools and thereafter they shape us."  Carr notes that it has ever been thus, from the writing that so worried Socrates, through the map, the clock, the compass, and so on to modern times.  Each new tool has broadened our ability to understand and control our surroundings, but each has exacted penalties in the form of loss of previous skills and the societal attributes that they underpinned.

  The balance of benefits and costs of a new tool is not always apparent at the start.  It turns out that the original Socrates was wrong to worry that writing would be preponderantly harmful: the forgetfulness he feared did not occur.  Rather, at least until multitasking recently came into play, book-reading forced us to concentrate deeply and linearly, actually enhancing our abilities both to stay with a single task and to commit the knowledge we thus gained to long-term memory.  On the other hand, our modern-day Socrates might turn out to be right about the detriments of the Internet.  It does appear to be driving us in the opposite direction, toward spasmodic, superficial scattering of our attention; outsourcing our memory to the Web; and a possible intellectual destiny in "the shallows" of Carr's title.

  The counterbalancing upside of the Internet is of course substantial: the ease and efficiency with which so many workaday tasks can now be done. The balance between benefit and detriment will take a long time to become clear.  I suggested in the most recent of my several previous postings concerning the Internet—click on "Internet" in the index to the right to see those postings—that a final assessment might require a century or two to achieve. 

  (Did you deflect your attention to the index?  Did you click on the suggested hyperlink and peruse those previous postings?  Even if you did neither, did your working memory fail to maintain the first part of the sentence in mind during the interruption, forcing you to read it again?  Voilà!  The world of online reading.)

  I have two reasons for contending that it will take generations or centuries to fully assess the impact of the Internet.  First, Internet Age technologies are still unfolding in a now unknown way, with unknown consequences.  Second, the brain will likely show itself more malleable in adapting to the Internet's demands than Carr and others credit.  There is already evidence that the working memories of heavy multitaskers are expanding in order to retain temporary information on several tasks simultaneously.  Further, the brains of those raised on the Internet are still engaged in their initial wiring, which won't be finished until they are 25 years old, say 2020 for the earliest Web babies.  That generation and its successors will probably end up wired much more effectively for life in the Internet Age than their elders like Carr, whose brains must be rewired helter-skelter from earlier configurations, which they might not want to surrender.  (There's even an evolutionary implication here, which I am not addressing, for then we would be talking of waiting millennia for a verdict, not merely generations or centuries.)

  Then, will our latter-day Socrates be as wrong about the Internet as the original Socrates was about writing?  We can't know yet.  Patience is the name of the game.

Wednesday, May 23, 2012

Creativity à la Silicon Valley

  A writer friend of mine, whom I'll call X, choked on a comparison I made in an earlier posting.  It took a Heimlich maneuver to resuscitate the poor fellow.

  I had written about communal creativity, the kind that arises when creative people frequently and randomly interact, stimulating each other's individual creativity.  In a book I referred to, Jonah Lehrer says that communal creativity reaches a peak when an assortment of supportive social, civic, economic and demographic conditions align.  What he calls "clots of excess genius" then form.  (Better to call them "clots of excess creativity," since true genius is too rare to come in clots.)  He gives as examples fifth-century BCE Athens, fifteenth-century Florence, late sixteenth- and early seventeenth-century London and today's Silicon Valley, and I used the same examples.

  After recovering sufficiently, X emailed me to ask how Silicon Valley's "techies" and their creations could possibly be compared with the cultural illuminati and creations of Athens, Florence or London.  To put words into his mouth, it was as if he were asking: Can one even begin to compare Steve Jobs with Sophocles or Leonardo or Shakespeare?  Can one even begin to compare the iPad with Oedipus Rex, Mona Lisa, or Hamlet?  Put this way, the juxtaposition is staggering.

  It isn't that Silicon Valley doesn't have a clot of excess creativity.  AnnaLee Saxenian, in her now classic book Regional Advantage, showed how the alignment of the very same social, civic, economic and demographic conditions cited by Lehrer led to an explosion of communal and individual creativity there.  That alone at least invites comparison of Silicon Valley with earlier eras, if not necessarily leading to its inclusion in the pantheon.

  X might accede thus far.  He admits that creators of technology have at least mental originality, but objects on another ground, asserting that their work doesn't emotionally engage the whole being of the creator or the beholder of the creation.  This, he seems to say, excludes Silicon Valley from any grouping with the other eras.  I wish I could imbue him with the sense of beauty and wonder that creators and beholders of engineering masterworks can feel.  It's sad that, more than a half century after C. P. Snow wrote about the disjunction between the two cultures of science/engineering and the humanities, mutual comprehension is still largely lacking.

  Yet X has correctly identified a difference in kind.  Many of the earlier eras' major creations were artistic or literary masterpieces by single creators. Silicon Valley's major creations are technological masterworks, usually agglomerations of efforts by many individuals.  Since responses to these different species of creations are so subjective, side-by-side comparisons of creators and creations from Silicon Valley with those from the earlier eras might shed more light on the comparer than the compared.  I think the four eras should therefore be compared by using a more objective yardstick: their overall impacts on civilization, which is probably what Lehrer had in mind in the first place.

  The golden age of Athens set the course for western civilization for millennia.  Fifteenth-century Florence set the tone for the rest of the Renaissance and still enthralls us with the beauty it created.  Writers in late sixteenth- and early seventeenth-century London set a standard for English literature for centuries.  Yet even as measured against these colossal accomplishments, X isn't justified in being so dismissive of Silicon Valley.  It almost alone created the Information Age, enormously changing the very structure of society in the twentieth and twenty-first centuries and likely for centuries to come.  In terms of the magnitude of the consequences each creative era has had for the world, Silicon Valley clearly belongs to the quartet. 

  I will concede one point to X, though.  It is a big one, concerning the relative merits of the contributions of the four eras, not just their magnitude.  The creative heritages we have received from Athens, Florence and London, seen through the lens of time, are unalloyed boons. We are not yet sufficiently distant from the heritage of Silicon Valley to comprehensively and dispassionately evaluate its ultimate contribution to civilization.  On the plus side, the Information Age has made the cultural heritage of the world easily and freely available to all of its population, not just its elite.  It has begun to redress the imbalance between the powerful and the powerless by giving a stentorian voice to those who had been ignored.  It illuminates the dark recesses of society that would never have been discovered otherwise.  On the negative side of the ledger—as I have argued in the past two weeks—the Information Age has greatly distorted the way we interrelate personally, the "alone together" effect.  And there might be perils of the Information Age (sentient robots?) that are as yet unknown—after all, it took almost two centuries for the global warming arising from the still-ongoing activities of the Industrial Revolution to become evident.

  The jury remains out on the net legacy of Silicon Valley, its balance of good and bad.  The verdict might not be rendered for a century or two.  I wish I could be there to hear it.

Wednesday, May 16, 2012

Online Education

  Sherry Turkle's book Alone Together, which I discussed in last week's posting, has a single sentence that has haunted me ever since I read it: "Today's young people have grown up … not necessarily tak[ing] simulation to be second best."  That mindset underlies the "alone together" syndrome that is afflicting society, especially our youngest generation.  In our drive to become digitally ever more together, we have paradoxically become ever more alone as we simulate traditional and intimate forms of contact with diluted ones online.

  Our best universities now seem to be on their way to exacerbating that syndrome by heavily adopting online learning.  MIT and Harvard recently jointly launched a nonprofit, online-education venture called edX. Almost simultaneously, Stanford, Princeton, and the Universities of Pennsylvania, California (Berkeley) and Michigan (Ann Arbor) joined a commercial online-education venture, Coursera.  The latest online courses have many technological bells and whistles—computer-mediated testing to provide students with feedback on their progress, social networks to enable discussion among them, automated and/or crowd-sourced grading, and so forth—that simulate the conventional learning environment.

  Both new ventures are stepping over the corpses of previous online efforts such as Fathom (Columbia, et al.), which failed in 2003, and AllLearn (Oxford, Yale and Stanford), which failed in 2006. Despite these failures, the universities involved in edX and Coursera seem mesmerized enough by online learning's possibilities to try again. I hope they are sufficiently aware of its dangers. The dangers are nuanced, depending on the objectives of the students.

  As a tool of continuing education in one's later years, online courses seem relatively benign, although even in this context they feed the "alone together" malaise.  Since edX and Coursera currently offer at most only certificates of completion, not college credits, they are well suited to the needs of this continuing-education audience. All the same, the logistics are unnerving: last fall, over 100,000 students around the world took three free, non-credit Stanford computer science classes online and tens of thousands satisfactorily completed them!

  The situation is much different for younger students seeking degrees.  Neither their educational objectives nor their mature selves are fully formed, so interaction only with a screen cannot possibly substitute for the educational and personal maturation a "bricks and mortar" institution offers.  To the extent that complete courses of study leading to degrees are offered online (as the University of Phoenix now does), they cannot help but distort the very nature of education and its impact on such students.

  Many arguments have been made both in favor of and against online education. The most telling one in favor is the democratization of access to learning, making it available to people around the world who could never otherwise have it. This argument alone can outweigh many of the traditional concerns.  Other less substantial but still important advantages are cited: additional students can enroll in overcrowded, much-sought-after courses at their own colleges; they can benefit from an ability to time-shift their attendance to hours that suit their own schedules and to "rewind" a lecture to review difficult parts; their access to the best teachers online may outweigh closer personal interaction with less talented ones available in a traditional setting. Further, there may be cost savings or income enhancements for struggling educational institutions, although the recent failures noted above call this into question.

  Some questions about possible drawbacks inherent in online courses have been asked by David Brooks in a column in the New York Times published just after edX was announced: Will massive online learning diminish the face-to-face community on which the learning experience has traditionally been based?  Will it elevate professional courses over the humanities?  Will online browsing replace deep reading?  Will star online teachers sideline the rest of the faculty?  Will academic standards drop?  Will the lack of lively face-to-face discussions decrease the passionate, interactive experience that education should be, reducing it to passive absorption of information?

  The many pros and cons are continually being parsed by those more expert than I.  However, from my viewpoint a sole desideratum should dominate the discussion: Since adoption of an online option aggravates the already severe "alone together" syndrome, it should be avoided if other options are available.  Using this criterion, an institution's quest for cost savings, productivity increases or income enhancement is insufficient in itself.  Likewise, a student's desire merely to substitute screen time for human interaction as a preferred mode of learning should be discouraged.  Students who must for some reason view a course's lecture component online should be required when possible to attend non-cyberspace support tutorials, seminars or discussion groups.

  The march toward broader adoption of online learning seems unstoppable, supported at both the top and bottom of the academic ladder.  Among its strong supporters in high places is Stanford president John Hennessy (see his article on the subject), who has enthusiastically predicted that this new wave of education will be something of a tsunami—an apt metaphor given the image of destruction it brings to my mind.  At the student level, many students, already always connected to the Internet, don't think of online learning as second best.  They are comfortably at home on Facebook, with its billion users, so they find an online class of 100,000 unremarkable, and discern no additional value in a class of forty in a traditional classroom.

  I am frightened by the likely acceptance of online education as a normal or even preferred mode of learning. Educational institutions may well soon face Hennessy's online tsunami, just as newspapers, book and music publishers, and magazines have.  I hope they, or at least some of them, will seek higher ground rather than scurry lemming-like into the deluge.  

Wednesday, May 9, 2012

Always Connected


  Writing this blog has alerted me to some of my internal contradictions.  For example, I yearn for equanimity, yet crave a faster connection to the Internet.  There's the rub: equanimity and torrents of data don't co-exist well, or at all.

  I guess I was vaguely aware of this contradiction before my blogging highlighted it, for in retrospect I realize I've been trying to sort it out.  I've persistently resisted replacing my antique voice-only cellphone with a smartphone, I suppose fearing that my tether to the Internet would be reinforced by mobile texting, browsing, email and apps.  At home I've come down to opening fewer than 1 in 4 of the emails in my already spam-free inbox.  I'm not active on social networks and I don't tweet.  Tilting the balance the other way, I've started blogging.  I think I've been searching all along for a sustainable equilibrium between the frenzy of technology and the stillness of self.  It's a hard struggle, but I think it essential for sanity in the Internet age.

  Many of us, particularly the young, are feeling the effects of not having such a balance.  Those effects are well analyzed by Sherry Turkle in her recent book Alone Together: Why We Expect More from Technology and Less from Each Other.  Turkle makes a compelling case that our expanding connectivity has had the perverse result of distancing us.  She has spent three decades as an MIT professor studying the issue, so I find no reason to challenge her credibility.

  Here are a few of the symptoms Turkle gives of an "alone together" syndrome: emailing or texting while in a meeting or dining with others; replacing intimate, spontaneous and hard-to-break-off telephone calls with less-demanding texts and emails; sidestepping the complexities of face-to-face friendships in favor of the less-stressful "friending" on Facebook or fantasy relationships with other avatars on Second Life. Today's young people, she says, are the first generation that does not take the simulation of closeness as second best to closeness itself.

  Turkle also has much to say about another symptom: multitasking.  By engaging in it, we enjoy the illusion that we are becoming more efficient, squeezing extra time into our already compressed schedules; we get high on that illusion.  What we have really done, she notes, is learned how to put others on hold as we switch among tasks, for we are actually capable of handling just one task at a time.  We now spend much of our time with family and friends distracted from giving them the full attention they deserve.  And all for naught, because research has shown that when we multitask our efficiency and the quality of our work are degraded. 

  I add two additional concerns to Turkle's, informed by my posting on creativity.  Being "alone together" in a crowd may prevent the random, impromptu interactions needed for communal creativity.  And having little or no down time for daydreaming may frustrate the Aha! part of individual creativity.

  For all the undisputed benefits of having a world of knowledge at our fingertips, this is a disheartening picture.  As a society we suffer from an Internet-driven obsessive-compulsive disorder.  Internal pressure assails us with withdrawal symptoms when our connection is broken, as if we had a substance addiction. (We do, although the "substance" is connectivity.)  External pressure adds to the malady, for employers are increasingly demanding 24x7 connectivity of their employees, even when on vacation.  Teenagers complain about like demands from their parents, who are newly empowered to keep track of them all the time.  I can't see how this can be good for any of us.

  Wait, it gets worse!  So far I've drawn just from the second part of Turkle's book, which is about always-connected networking.  The book's first part is about the growing impact of robots on our lives.  Today's young people, taken as they are with simulation, are more accepting of robots than the rest of us.  They grew up cherishing robotic toys like Furbies, which sold 40 million units in their first generation from 1998 to 2000.  A third generation will be introduced this year.

  Those who are Furby-deprived might appreciate a description of the first Furby generation.  They were programmed to gradually speak English as they interacted with their owners, instead of their native "Furbish."  They demanded attention and responded lovingly when they got it with phrases like "I love you."  They could even communicate with other Furbies.  Like humans, they were always on—no switch—so the sole way to stop their sometimes annoying demands and chatter was to open them up with a screwdriver and remove their batteries, in effect killing them.  Replacing the batteries reset them to their initial state, reincarnating them with no memory of what they came to "know" in their previous life.  Children actually mourned their deaths.  I can't imagine what the third-generation Furbies, fourteen years later, will be able to do, but their pretense of intelligent behavior will certainly be greater.

  Turkle's research shows that our youngest generation, primed by interaction with Furbies and other robotic toys, is quite open to the likely advent in a decade or two of widespread "intelligent" humanoid robots that could be lifelike and caring companions—even lovers and spouses (see a New York Times review of a 2007 book on that frightening reminder of Stepford Wives!).  I won't go there further, although Turkle does.  My 20th-century mind rebels.  I fear that the shards of human togetherness remaining in "alone together" will shatter further as togetherness with robots increases.

  I flee back to the lesser dislocation of society occasioned only by the Internet's connectivity.   An eerily apt poem by Wordsworth from two centuries ago still rings true with its original words.  It becomes all the more pertinent to today's topic by deleting a single letter.  Try replacing "spending" with "sending."

The world is too much with us; late and soon,
Getting and spending, we lay waste our powers;
Little we see in Nature that is ours;
We have given our hearts away, a sordid boon!
This Sea that bares her bosom to the moon,
The winds that will be howling at all hours,
And are up-gathered now like sleeping flowers,
For this, for everything, we are out of tune;

It moves us not. --Great God! I'd rather be
A Pagan suckled in a creed outworn;
So might I, standing on this pleasant lea,
Have glimpses that would make me less forlorn;
Have sight of Proteus rising from the sea;
Or hear old Triton blow his wreathèd horn.

Thus did Wordsworth bemoan the societal costs of the industrial revolution.  We should listen to him as we reckon the costs of the information revolution.

Thursday, April 12, 2012

Content v. Distribution

  Today I'm on home territory, the information technologies, returning to it from ground on which I had no right to tread: cosmology, religion, history, psychology, poetry and philosophy.  That might be a relief to all.

  I entered my home territory as a pre-teenager nearly seventy years ago, starting by building radios and phonographs from vacuum tubes, later building a ham radio station and communicating with other radio amateurs around the world by Morse code at 10-20 words per minute.  That's perhaps 5-10 bits per second, and I was thrilled to achieve that speed.  Now I am discontent with speeds of less than 3 megabits per second.  When a gigabit per second is available, enough to download a full-length, high-definition movie in a few seconds, will that be enough?  Who knows?  I am surprised that I now need more than 3 Mbps.  It is wondrous how much the Internet revolution has changed our expectations.
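
  (For readers who enjoy checking the arithmetic, here is a minimal back-of-envelope sketch in Python.  It assumes roughly five characters per English word and six bits to encode a character; those round figures are my own, not exact Morse timing.)

    # Rough conversion of a Morse-code sending speed to a bit rate.
    # Assumptions (mine, not the post's): ~5 characters per English word,
    # ~6 bits to encode one character.
    CHARS_PER_WORD = 5
    BITS_PER_CHAR = 6

    def wpm_to_bps(words_per_minute):
        """Approximate bits per second for a given sending speed."""
        chars_per_second = words_per_minute * CHARS_PER_WORD / 60.0
        return chars_per_second * BITS_PER_CHAR

    for wpm in (10, 20):
        print(wpm, "wpm is about", round(wpm_to_bps(wpm)), "bits per second")
    # 10 wpm comes to about 5 bits/second and 20 wpm to about 10,
    # so a 3 Mbps connection is roughly half a million times faster.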

  These personal musings about the speed the Internet has injected into our lives might be expanded in another posting.  (Can our brains handle it?)  Today I'd like to comment on an aspect of the information industry that the Internet has so far not changed: the fraught relationship between content and distribution, even on the Internet itself.  Succinctly put, information distributors always try to control the content they distribute; providers of content always try to control its distribution.  One usually wins.  Will the Internet change that too?

  In 2010, Tim Wu wrote an excellent book covering the topic: The Master Switch.  As his subtitle, The Rise and Fall of Information Empires, suggests, Wu describes a persistent cycle that has affected a succession of information industries in America: telegraphy, telephony, movies, radio, television and cable.  A critical part of every cycle has been the establishment of a closed system: a monopoly or oligopoly having control of both content and distribution, to the detriment of the public interest.  Here are three examples:

    • Western Union was by 1870 one of the world's largest corporations, having a monopoly over telegraphy in the U.S.  By limiting access to its lines, it was able to wield unconscionable power over content.  For instance, it allowed access for news reports only to the Associated Press, at the time the main source for non-local news for many newspapers.  Both Western Union and the AP were allied with the Republican Party.  In the 1876 presidential election between the Republican Hayes and the Democrat Tilden, the AP telegraphically distributed only favorable stories about Hayes and unfavorable stories about Tilden. This collusion likely tilted the very close election to Hayes.  Here the public lost the integrity of its political system.

    • By 1913, AT&T had acquired scores of local telephone companies and then its largest competitor, Western Union.  In an anti-trust settlement that year, it agreed to divest itself of Western Union; stay out of the telegraph business; and submit to regulation as a common carrier of telephony, giving equal access to all, including access to its long lines to the remaining independent telcos.  In exchange, with the blessing of the government, AT&T grew to be virtually the sole provider of telephone service in the U.S.; it was widely touted as a benevolent monopoly.  For the next seventy years it indeed provided superb service and technical improvements.  Yet it consistently suppressed "disruptive technologies" that could have provided content other  than voice to the network.  Technologies invented in its own famed Bell Laboratories never saw daylight, like a facsimile machine (which might replace lengthy conversations with shorter faxes) and digital subscriber lines (which might replace voice with faster computer communication).  Similar technologies invented externally could not on "technical grounds" gain access to AT&T's lines.  Here the public significantly lost by delays in the advent of new technologies that would have allowed use of the only national network for content other than voice.

    • By the first decade of the twentieth century, the motion-picture industry contained hundreds of small producers.  This maelstrom had mostly coalesced by 1920 into an oligopoly of five Hollywood studios: Universal, MGM, Twentieth Century Fox, Warner Bros. and Paramount.  The only barriers to a completely closed system were the myriad independent distributors and theater owners who refused to relinquish their ability to screen just the movies and stars they wanted from whatever producer, inside the oligopoly or not.  So the studios in the 1920s conducted a blitzkrieg of pressure tactics and purchases, which ended in the elimination of virtually all these independents.  The film business was now completely vertically integrated; if one wanted to produce, distribute or screen a movie, it was pretty much through one of the studios or not at all.  Here, the public's loss was a substantial curtailment of the type and content of movies it could see.

  These three cycles all came to a close, the first by technical obsolescence, the second by regulatory action, the third by court decision.  But cycles persist in other venues, even if not as egregiously.  Cable TV distributors have a hammerlock on pricey packages of video channels they offer their customers, often in monopoly markets; there are frequent outbreaks of warfare between these distributors and their content providers, with blackouts both threatened and occurring.  Television networks have increasingly combined with Hollywood studios in exclusive content-distribution packages.  Cellular telephony is almost a duopoly of AT&T and Verizon, each trying to limit the amount of information that can be downloaded or uploaded, and proposing "fast lanes" for those willing to pay more.  In all of these venues, consumer choice is constrained.

  On the Internet, however, the user is still largely in control.  The interaction between content and distribution is unresolved.  The first major attempt at their fusion, the merger of AOL (distribution) and Time Warner (content), was a spectacular failure.  Neither company understood the nature of the Internet, whose very design gives the user unprecedented power to choose both distributors and content providers. 

  The failure of this merger by no means ended the story.  Wu points to three very powerful forces still at work in the Internet:

    • The pursuit of a closed-system model, not unlike that of the old AT&T.  It is typified by Apple, which says to the user, "Come to us!  We will provide you with an integrated system of hardware, software, applications, movies, books, music--whatever we think you want, but let us be the judge of what you want and how you get it.  We promise that you will be stunned by the orderly beauty and efficiency of our system."   Apple's control of content is exemplified by its refusal to carry books, video, music and apps except on its own terms, to the chagrin of many content providers. 

   • The pursuit of an open-system model, not unlike that of very early movie-industry days.  It is typified by Google/YouTube, which says, "Come to us!  We will provide you access to the entire world in all its chaos.  You be the judge of the content you want to see.  If you create your own content, we will distribute it. You will be amazed at the independent power we will make available to you."  One need only visit YouTube to see this open system in all of its expansiveness.  This blog, hosted and distributed free by Google, is another example.

    • Lurking beneath both are the owners of the Internet's transmission pipes, the underlying digital network that makes the Internet possible.  These are the old, now born-again conglomerates of telephony (the duopoly of AT&T and Verizon) together with the newer cable networks.  They grin wolfishly and say, "Neither the Apples nor the Googles can operate without using our pipes, and we will make sure that they (and the public) pay the piper."  They control the master switches of Wu's title.

  Wu wonders whether either the Apple or Google model will dominate the Internet, or neither.  I say neither.  There will be some Apples and some Googles.  As now, they will remain co-existing and competing subsystems, not the system, and the user, still in control, will jump from site to site to get what he or she wants.

  Regarding the controllers of the "master switches," I strongly agree with Wu that they should be subject to "network neutrality" regulation, which would ban selective blocking of individual sites or selective limitation of amounts of data that can be distributed.  That issue is currently being intensely fought out in Congress, the FCC and the courts. 

  I noted at the outset that the Internet revolution has totally changed our expectations.  The genie cannot be put back in the bottle.  The younger generation, unused to yesteryear's limitations on information availability, weaned on and empowered by the openness of the Internet, will demand through their clicks, voices, wallets and votes that openness be maintained. They will not countenance constraint on their choice of distributors or content providers, or their ability to create independent content and distribute it themselves.  They will insist that the underlying network-neutrality question be resolved favorably toward this end.  In short I believe that, at least on the Internet, the perennial cycle that Wu describes has finally been broken.