Wednesday, November 18, 2015

Maybe Your College Does Matter


            As a school head, I was always telling parents and students that your choice of college is not a defining moment in your life.  Studies show that the prestige of your college is not a significant determinant of later success, on any measure, from income to fame (e.g. in math and science) to happiness.  Recently Frank Bruni’s Where You Go Is Not Who You’ll Be stated the case thoroughly and reached the Top 20 in Kindle e-books.  (It’s much lower in hardcover results, which may mean a lot more young people are reading it.)
            But I recently wandered into a byway on the Internet and found that in one very narrow professional field, where you go seems to play an exceptionally large role.  If you want to be President, go to a highly selective college.  If you’re okay with being Vice President, go elsewhere.
            The raw numbers are striking.  Of our 44 presidents, 26 went to what we currently consider the 100 most highly selective colleges.  Taking out the four military leaders who became president without college (Washington, Jackson, Harrison, and Taylor), that’s 65% of the presidents.  On the other hand, only 9 of the 31 vice presidents who never made the top spot went to the highly selectives, or 29%.
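            (For the numerically inclined, here's that arithmetic as a quick sketch -- nothing official, just the counts cited above run through Python.)

```python
# Quick check of the percentages cited above.
presidents_selective = 26
presidents_counted = 44 - 4   # minus the four non-college military men
vps_selective = 9
vps_counted = 31              # VPs who never reached the top spot

print(f"Presidents: {presidents_selective / presidents_counted:.0%}")   # 65%
print(f"Vice presidents: {vps_selective / vps_counted:.0%}")            # 29%
```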
            Looking more closely, the pattern gets stronger.  The “highly selective” distinction is a recent one, and in the early days of the Republic, there were relatively few schools besides those that later reached the top tier.  Ten of the 25 earliest colleges are now in the highly selective 100.  In the young country, therefore, the distinction was between going to college or not, and geography was generally the primary influence.  So four early presidents, all Virginians, went to William and Mary, the Adamses went to Harvard, and others also stayed for the most part in-state.
           Back then there was little difference between the presidents and the vice presidents.  Through 1844, three presidents and four vice presidents went to the future Ivy League schools.  Oddly, for the rest of the 19th century no Ivy grad made it to either of the two offices.
            Starting in 1900, however, the pattern changed almost completely.  Eight of the presidents since Teddy Roosevelt went to the Ivies (ten if we add Ivy Law schools).  Only one elected VP (Al Gore) attended the Ivies, and many of us believe he should have been on the president’s list.  Nelson Rockefeller, who was appointed VP, also makes the list.
            The rise of state universities might have changed the picture, but it really didn’t.  Only two presidents have been state university graduates, one each from the highly selective universities of North Carolina and Michigan.  Of the seven state university VPs, only one attended a highly selective school.
            Another odd note: while being a war hero, whether a graduate of the service academies or un-degreed, has been a path to the presidency for seven presidents (eight if we add Teddy Roosevelt), it has never produced a vice president, even counting those who later became president.
            Finally, there seems to be a risk associated with selecting a well-degreed vice president. Two of the best-degreed vice presidents are the most infamous of all those holding the office: Aaron Burr (Princeton) and Spiro Agnew (Johns Hopkins).
            So tell your children: if they really, really, really want to be president, and not settle for #2, then where they go may tell who they’ll be; otherwise, it’s pretty irrelevant.

Monday, November 9, 2015

Left Out by the Left


I never dreamed I would borrow the words of Ronald Reagan, but recently his comment about switching from Democrat to Republican, “I didn’t leave the party, the party left me,” keeps running through my head.  What’s even scarier is that I too feel left by the left.  Let me explain.
            I remember declaring myself a liberal somewhere early in high school, when Barry Goldwater conservatism was briefly flourishing.  While I didn’t march in Selma or burn my draft card, my left credentials are pretty good: writing in the college paper against the Vietnam War; offering the resolution that the school I was working at should close in protest of Kent State; hiring a majority staff of color at the agency I worked at; serving as a founding board member of the Gay, Lesbian, and Straight Education Network; and so on.
            But now I find myself almost as astonished and distressed at the left’s cultural and political screeds as at the right’s.  A few cases in point:
            The Boston Museum of Fine Arts cancels a show in which women can dress in a replica kimono next to Monet’s portrait of his wife in the kimono.  Protesters had called the exhibit “racist,” “cultural appropriation,” and of course “orientalism.” 
            The Wesleyan student newspaper publishes an article expressing concern that the Black Lives Matter movement could also incite extremists to take violent action, citing anti-police chants and other specifics.  (The article is here if you want to read it for yourself:  http://wesleyanargus.com/2015/09/14/of-race-and-sex/.)  The student government responds by unanimously (!) voting to consider shifting $17,000 in funding away from the paper.
            Yale makes headlines when one affinity group says that certain costumes, including turbans, headdresses, and the like, are to be avoided at Halloween because they are culturally insensitive, and a white early childhood educator responds negatively.  According to the NY Times, students confront the educator’s husband, himself a professor, demanding that he apologize for her and saying he should lose his job.
            This is in addition to many other examples of extreme sensitivity, like the law students who object to discussion of rape law because it may trigger sensitivities in people who have been sexually assaulted.  “Trigger warnings” now seem even to include such literary works as Alexander Pope’s “The Rape of the Lock,” which, for those of you who didn’t have to wade through masses of eighteenth-century literature, is about the snipping of a lock of (head) hair.
            There are so many things wrong with these incidents that it’s hard to know where to begin.  Let’s start with the other end of the censorship spectrum.
            For decades -- no, centuries -- the left has fought, and still fights, against censorship from the right of “offensive material,” particularly in schools and other institutions.  This has included everything from the portrayal of nudity by Michelangelo and in movies, to the banning of hundreds of books like Of Mice and Men and The Color Purple.  Hitler burned books that portrayed Jews positively.  Stalin banned biology that offended him.  Students used to protest when colleges and universities censored or defunded publications for political or sexual content.
            But what rational position can maintain that we shouldn’t ban works that upset some groups, for example Christians, yet should ban works that upset others, for example Muslims?  Furthermore, the very notion of banning the upsetting seems linked with antagonism to free speech and free thought.  The Charlie Hebdo assassinations, the fatwa against Salman Rushdie, and the deadly violence over the Danish cartoons were all extremes of “cultural sensitivity.”  During his years in hiding, Rushdie wrote an essay entitled “Is Nothing Sacred?” in which he hoped the answer was “No,” since sacredness only allows some ideas to be buried, and others to go unchallenged.
            Let me be clear.  I am not a free speech advocate even to the extent that this notion is espoused in America.  I agree with Germany that Holocaust denial, especially there, should be illegal, and that hate speech should be more strictly defined and controlled.  But none of the examples above comes close to a deliberate or even, I would argue, a plausible attack on one culture by another.
            For years, America has drifted toward the right, so that moderate positions of the past are now seen as liberal, and liberal positions as radical.  Remember, Reagan was for a path to legalized status for many immigrants, and Nelson Rockefeller and Jacob Javits proposed universal Medicare in 1970.
            But now the Young Left, particularly at schools, is making itself a mirror image of the worst of the right, and the sensitivities of any individual, right or left, apparently take precedence over what we used to call common sense.

Thursday, September 24, 2015

What a Difference a Week Makes (Air Force One or Vatican One?)



Anyone notice that it was exactly a week between the second Republican debate and the Pope’s arrival in Washington?  The ways in which the two events contrasted are innumerable, but a few struck me as particularly telling.  The first is, in that misused word of political spin, the “optics.”  The debate at the Reagan Library featured the bizarre backdrop of Reagan’s Air Force One.  My first thought was that the Dozen Dwarfs and the Ice Princess were simultaneously paying homage to St. Ronald and showing symbolically how tiny they are in relation to Reagan’s hyped legacy.
            Then came the Pope, in his Vatican Fiat, or Vatican One, as its license plate says.  Setting aside the absurdity of preserving a whole plane just because a president rested his butt in it, the symbolism is obvious: political power is just that -- power, dependent on size, muscle, weapons, and so on.  Spiritual power is entirely different: from the original Francis to Mother Teresa, Gandhi, this Pope, Dorothy Day, the Dalai Lama, and an endless chain of exemplars in every faith I have heard of, and many I haven’t, spiritual leaders send the message that their kingdom is not of this world, or in the words of the Hebrew Bible, that God came not in the wind, nor the earthquake, nor the fire, but in the still, small voice. (Perhaps this should be a so-called “litmus test” in religion: the bigger your personal stature, the greater your accouterments, especially those garnered by you for yourself, the less likely you are to be a genuine “saint,” however we define that term.)
            A second contrast is between the canonized figures symbolically present at each event.  Ronald Reagan’s ascent to the pantheon of Republicanism, replacing even Abraham Lincoln, depends in large part on ignoring who he was and what he did.  Reagan’s tax increases, Medicare expansion, immigrant amnesty, and so forth are now anathema to Republicans, who have created a false idol if ever there was one.
            On the other hand, Junipero Serra, the so-called “controversial saint,” was canonized by a church that took full account of all his actions, and elevated a complete human being, who lived in his time and place.  If we go by the standards of our time, who should “‘scape whipping?” as Hamlet says.  Not violent Joan of Arc, participant in a dynastic war that no one today would label just, heretic-burner Thomas More, maybe not even Patrick, who also imposed a religion on a native people. 
            For all the claims that religious individuals are deluded believers in a comforting unreality, the past week suggests that the Pope in particular has a more clear-eyed view of who he is, of what the world needs, and of what really matters, than any of the politicians who sought to be sainted last week.

           

Friday, August 28, 2015

Guns, Words, Meanings -- an Originalist Proposal




Isn’t it strange that the Supreme Court “originalists” can ignore perfectly clear statements when it suits their political agenda?  There is no question that the Founders wrote this preamble to the Second Amendment, the only such explanatory statement in the Bill of Rights: “A well regulated Militia, being necessary to the security of a free State...”  It is also clear that they did not write “a free Country.”  So any state that wishes to regulate arms should do so in the language of the amendment.  The law might read something like this: “We the state of _________, being desirous of having a well regulated Militia, do hereby enact the following legislation:

All persons wishing to keep and bear arms shall register as members of this state’s militia.   Any person denied membership in the state militia shall also be denied the keeping and bearing of arms.  Reasons for being denied such membership may include criminal record, mental or emotional disability, membership in an organization whose aims and purposes are inimical to the laws and statutes of this state or of the national government, or such other reasons as would normally disqualify an individual from being a member of the state militia or the federal military.  Every person so registering shall be deemed to consent to a full review of their qualifications to serve under this legislation.

Be it further enacted that the arms described in this legislation shall be limited to those currently or in the future to be issued to individual members of the state militia, or such arms as may be useful for developing appropriate skills for eventual service, including such lesser arms as shotguns and smaller caliber weapons, and that no arms of greater firepower than those mentioned above shall be allowed to persons who have so registered, unless said persons shall be actively serving in a local, state, or federal entity that issues them such arms.

This law shall include the following exceptions:
1.     Persons whose physical condition may disqualify them from active duty, but who otherwise meet the standards of the militia.
2.     Persons under the age of military service who indicate their willingness to register for the militia when they come of age, and who take appropriate pre-militia training in the responsibilities of bearing arms.
3.     Persons who registered for the militia and have since passed the age limit for service, or who had passed the age limit for service prior to the enactment of this legislation, unless and until they no longer qualify, except by age or physical disability, for such possession.

Be it noted that registering for the militia in no way implies that a person is liable for militia duty except under such circumstances as shall seem to the legislature and the executive to require a general call-up of all such persons.  Nor does it imply that any person who does not seek the right to bear arms need register under the law.

Persons who do not wish to meet these requirements shall have 90 days following the passage of this law to turn in arms currently in their possession, and shall be compensated for those arms according to their current value.  Alternatively, they may choose to render these arms permanently incapable of use, and submit them to inspection after their disabling, after which they may retain these items.



Saturday, August 15, 2015

Time Present and Time Past


            Rolling Stone has just announced its “100 Greatest Songwriters of All Time.”  Oddly, these all-time greats were born between 1911 and 1989.  So Rolling Stone, like those pre-Darwin clergymen, evidently has a peculiarly unscientific notion of time.  (Even those old guys saw “all time” as 75 times longer than RS.)  Just for fun, let’s re-do the all-time list, in the spirit of Richard Thompson, the British singer who replied to a similar list of “greatest songs of all time” by recording an album that started with “Sumer is icumen in,” which dates from before 1260 CE.

The oldest known song, with words and notation, is around 3400 years old.  Some hymns over 1000 years old are still being sung.  But for Rolling Stone, “all time” began whenever Robert Johnson (b. 1911) wrote his first number.  That means, among other things, that my own mother was a teenager by the time the first of the all-time great songs was written.  In fact, except for Hank Williams, all the top 25 started writing in my lifetime, and only Chuck Berry and Leiber and Stoller were born more than 5 years before me.  So by the time I got my first transistor radio, about 1957, I was able to listen to each of these composers hot off the 78- or 45-press.  Very flattering, but then I can listen to almost all the known writers before them just as easily.  

Oddly, lists of the 100 greatest scientists of all time, which you might think would skew more heavily to the modern era, usually begin over 2400 years ago.  Of the two such lists I found, 82-88 of the 100 scientists were born before the oldest of the RS musicians.  Various top-ten scientist lists never mention anyone born in the twentieth century, except for one credit to Alan Turing.
           
What’s missing? First, what almost everyone calls “the great age of American popular music,” from, say, 1910 to 1950.  Look up the term “Great American Songbook” and you won’t find a single person from it on the RS list.  Nor will you find a single jazz musician.  Then, of course, anything before 1925, of any sort.  And even more recently, any songwriter from other genres, such as musical theater, cabaret, etc.
           
So here’s my quick list of genres and songwriters who belong ahead of at least 2/3 of the RS list:

1.     Maybe the greatest of all – Anonymous. Wrote all the ballads, folk tunes, etc. from “Greensleeves” on up.  No one will ever come close.
2.     The great hymn writers: Isaac Watts, Charles Wesley, William Billings, etc.  People still sing their songs every week throughout the world, a hundred and more years after their deaths.  Take that, “Imagine.”
3.     The true classics: Franz Schubert, Hugo Wolf, Gustav Mahler, and others.
4.     The nineteenth century’s greats, especially Stephen Foster.
5.     Musical theater and operetta: Gilbert and Sullivan (the first rappers), Victor Herbert, Rodgers and Hart (and Hammerstein), Lerner and Loewe, Andrew Lloyd Webber, Stephen Sondheim, and more.
6.     Tin Pan Alley: Irving Berlin, Cole Porter, the Gershwins, Yip Harburg, Jerome Kern, Harold Arlen, Sammy Cahn, Johnny Mercer -- Wikipedia lists 50 well-known names in this category alone.
7.     Jazz and blues: Leadbelly, Ellington, Hancock, Coltrane, Basie, Mingus, Waller, Jelly Roll Morton, all the way back to Buddy Bolden.
8.     The Europeans: Kurt Weill, Jacques Brel, Charles Aznavour, and others less well-known here.

Last but not least, the true “one-hit wonders”: John Newton of “Amazing Grace” (actually a two-hitter with “Zion, City of Our God”), Mel Tormé (“Chestnuts Roasting”), Julia Ward Howe (“The Battle Hymn of the Republic”), Katharine Lee Bates (“America the Beautiful”).  Let’s end with the greatest one-hit wonder of all time, the man whose one song is sung pretty much every day of every year to large audiences, and has been for at least as long as anything by the writers in Rolling Stone.  Every time you go to a baseball, football, basketball, or hockey game, professional or college, you’re likely to hear his number.  A rough estimate suggests that, limiting ourselves to just those venues, more people have heard his song in person over the past half-century or so than are alive today.  So, far behind Anonymous in compositions, but perhaps his/her only competitor for audience: Francis Scott Key and “The Star-Spangled Banner.”
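(How rough is that estimate? Here’s the back-of-envelope version.  Every attendance figure below is my own ballpark guess, not a sourced statistic.)

```python
# Rough annual U.S. attendance (person-visits) at venues where the
# national anthem is routinely performed. All figures are ballpark
# assumptions for the sake of the estimate.
annual_attendance = {
    "MLB": 73_000_000,
    "NFL": 17_000_000,
    "NBA": 22_000_000,
    "NHL": 21_000_000,
    "college football": 50_000_000,
    "college basketball": 30_000_000,
}

per_year = sum(annual_attendance.values())   # ~213 million hearings a year
over_50_years = per_year * 50                # ~10.7 billion over 50 years

print(f"{over_50_years / 1e9:.1f} billion anthem hearings in 50 years")
# World population in 2015 is about 7.3 billion, so the claim holds --
# provided we count repeat hearings (person-visits, not unique people).
```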

Tuesday, August 4, 2015

To Prepare a Face to Meet the Faces That You Meet


            There’s a new acronym making the rounds, joining the internet’s WTF, LOL, IMHO, and such.  But this one is about real life, and it’s caught the attention of the New York Times, with an essay followed the next day by a “Readers Discuss” item.  Since such reader weigh-ins are quite rare (I could find only one other in the past three months), evidently we’re on to a new social issue of gender discrimination.  Except we’re not.
            The ugly term is “Resting Bitch Face,” described thus by the first writer to bring it up: “RBF is a face that, when at ease, is perceived as angry, irritated or simply … expressionless.”  Obviously the noun before “face” suggests that this is a problem for women rather than men.  Numerous readers replied, explaining that it’s just their face, that they’re always being told to smile more, that their father has the same expression and no one ever comments on it to him.
            Why do I believe that this is almost entirely a non-issue? Well, for one thing, the case is largely bolstered by photos of actresses and other female celebrities, who are caught in hundreds of poses annually, some of which must be non-smiling.  Nor are non-smiling looks merely the result of gotcha shots.  I just looked at 40 Vogue covers.  Of the 40, only 4 showed a full smile, and one of those was of Audrey Hepburn.  The standard Vogue photo is serious, intense, often slightly aggressive.  Look at full-page ads in magazines and you see the same thing: more than half the women – and men – look bored or irritated.  My own emotional reaction to these pictures is that these beautiful people are so superior to us that they needn’t deign to give us the benefit of a smile.
            As for ordinary life, here’s where we need to do something I usually don’t – go all evolutionary on the issue.  Why do we smile?  Because we want to send one of a very few messages: that we are harmless, that we are friendly, or that we hope you are harmless, friendly, and perhaps helpful.  Of course we can smile to ourselves, but most of those smiles come when we are having a mental or non-present-moment parallel to the others: reading or recalling something positive, seeing something non-human that we find funny, attractive, uplifting, etc.
            Start with babies.  We smile at them, get them to smile at us, and thereby make the most basic of human connections, after providing them with food.  Babies who never smile are of great concern, as they seem to lack normal affect.  As we grow older, we learn to smile when approaching people, when we want to ask them for something, when we recognize them, admire them, and so on.  Taking it back to adult evolution, smiles are one of the most obvious ways to say “I come in peace.”  Remember Mr. Spock?  Not only did he greet people with a weird hand signal, he maintained a stolid face more than 99% of the time.  (I found a fan site that catalogued Spock’s smiles: as a child before he learned Vulcan ways, when he was drugged, and once – a very important moment – when he finds out he has not, as he feared, killed Kirk.)  We smile, androids don’t.
            So smiling has always been one of the clearest signs that we are not an opponent and that we hope you aren’t either.  We have also become extremely adept at reading facial expressions, and again, see it as a sign of abnormalcy when someone can’t describe facial expressions accurately.  We know the Duchenne smile, where the involuntary movement of muscles near the eyes confirms that the mouth’s smile is real.  Evil smiles, cruel smiles, arrogant smiles, hopeful smiles, casual smiles, ecstatic smiles – we know them all.
            We also know the opposite of smiles.  We know when a parent is angry at us as soon as we see their face, sometimes even when we only see their eyes.  Even dogs react to frowns, especially when they connect the look to something they’ve just done. 
            So we have been wired from birth, and to some extent from a time when we had not yet split off from other large hominids, to know what facial expressions mean in social context.  In fact our reactions to facial expressions are so fundamental that they occur in the parts of the brain we share with many ancestors, even non-mammals.  Just as we jump even before we know what that snake-like thing on the path really is, we react to the emotional message we’re receiving from a face before we can process and review the reaction. 
            Then, when we process, we usually go for the most commonly experienced explanation. (“When you hear hoofbeats, think horses, not zebras.”)  Why would this person be angry?  Have I done something?  Are they dangerous? Why do they look distressed?  Has something bad happened? What a pleasant half-smile; I imagine they’re a nice person.  Some of those who commented on the article complained that they were told to be more cheerful, others that they were asked if there’s anything wrong. So it's not just a case of demanding a behavior, but sometimes of offering help.
            In fact, we not only decide about others on the basis of their expressions, we often use their expressions to persuade others to share our view.  Look at any newspaper doing an article on a politician.  If the accompanying photo was shot at a neutral time, not at a moment of pleasure or dismay (e.g. winning an election or receiving news of a terrorist attack), it’s easy to tell where the paper lies on the political spectrum: if John Boehner is frowning or Chris Christie is raging, it’s left-leaning; if they’re smiling, it’s right-leaning.  Just flip the left-right for photos of Obama or Hillary Clinton.  (If there’s balance, that says something about the publication’s integrity.)
            Imagine an analogous situation.  Someone you know says, “People coming toward me often look away abruptly or even veer from my path.”  You reply, “Well, I’ve noticed that when you walk, you often clench your fists.”  “But that doesn’t mean anything.  It’s just the way I walk when I’m thinking.”  Our hands and our faces need to be open in greeting if we want our fellow homo sapiens to feel comfortable around us.
            It may be true that some faces are structurally more friendly-seeming than others.  But do you know of anyone without a birth defect or an injury to the face who cannot smile?  The studies of Paul Ekman and others show that when people practice positive faces they experience positive moods, and vice versa, even when they are acting as test subjects, without any emotional stimulation.  I was once speaking to a Buddhist (Buddhists are mentioned in the article as among the great smilers) and quoted the proverb, perhaps first stated by Abraham Lincoln: “Every man over 40 is responsible for his own face.”  The Buddhist agreed.  Those whose faces, in repose, are nearer to smiling than to frowning suggest to everyone they meet “I am enjoying life; I hope you are too.”  Try it; you’ll like it and so will they.

Saturday, July 18, 2015

A Southern Heritage, Impressive and Unstained


In the debate over the Confederate flag, its defenders commonly make two claims:
1.     The flag is central to southern heritage.
2.     The flag has no racist connotations, because the Civil War was not about slavery, but about states’ rights.
I’m not arguing #2, because I think it’s evident from innumerable statements by southern individuals and legislatures that the only “right” the South was defending was the right to own slaves.
            But there are two other more interesting points.  As James Loewen has pointed out (http://www.washingtonpost.com/outlook/five-myths-about-why-the-south-seceded/2011/01/03/ABHr6jD_story.html), the South was energetically resisting the rights of the North by demanding the return of fugitive slaves, and seeking federal help, both judicially and militarily, to abrogate the rights of states like Massachusetts to enforce their own laws.  (There’s an interesting parallel: an English court decreed in 1772 that a slave who stepped onto English soil was thereby free.)
            But on to #1, where the story gets more complicated and leaves room for a different discussion.  Here’s another side: yes, that flag flew for four years, and meant a great deal to some Southerners.  But without it, the South would still have 240 years of pre-1861 heritage and 150 years of post-1865 heritage -- more than enough history for any region or even country.
Quick, make a list of the greatest Southerners in American history before 1861.  Then erase those who lived under the Confederate flag.  Aside from Jefferson Davis and the Civil War generals, you’d still have the vast majority of the list.  For example, you’d have 17 signers of the Declaration of Independence, including its author.  Also 25 delegates to the Constitutional Convention, 8 presidents, the consensus greatest Supreme Court Chief Justice, Senators Calhoun and Clay, Dolley Madison, Davy Crockett, Jim Bowie, Stephen F. Austin, and on and on.
Odd notes: as governor of Texas, Sam Houston refused to support secession and was removed from office; and Andrew Johnson, the last Southern president before LBJ (okay, Woodrow Wilson if we’re going by birth state), was a diehard Unionist and the only senator from the South to remain in the Congress during the war.  Only one president, John Tyler, saw and supported secession, in the last nine months of his life (April 1861 to January 1862).
Then there’s the 150 years after the flag came down, during which a president from a former slave state integrated the U.S. military, four Supreme Court justices from former slave states voted to end school segregation in 1954, a Texan pushed through the Civil Rights and Voting Rights Acts, and southern presidents have appointed more than 2/3 of all African-American cabinet members.  Oh, and the Nobel Peace Prize has gone to Carter, Gore, and the Virginia-born Wilson.
Finally, without the Confederate flag “Southern heritage” would include Frederick Douglass, George Washington Carver, Harriet Tubman, Booker T. Washington, MLK Jr., Rosa Parks, Andrew Young, Ralph Abernathy, etc. etc., not to mention at least half of America’s greatest musicians.
So let’s agree with Southerners that the Confederate flag represents 1% of its history, just as the Nazi flag represents 8% of the history of a unified Germany.  Then let the South join the rest of us in celebrating the other 99%.
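(A quick check of those percentages, using the year spans from this post and dating unified Germany from 1871:)

```python
# The Confederate flag's share of Southern history, per the spans above.
pre_war, flag_years, post_war = 240, 4, 150
south_total = pre_war + flag_years + post_war            # 394 years
print(f"Confederate share: {flag_years / south_total:.1%}")   # ~1.0%

# The Nazi flag's share of unified Germany's history (1871-2015).
nazi_years = 1945 - 1933                                 # 12 years
germany_total = 2015 - 1871                              # 144 years
print(f"Nazi share: {nazi_years / germany_total:.1%}")        # ~8.3%
```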

Monday, July 13, 2015

It’s a Book, People!


Reading the initial comments about Harper Lee’s Go Set A Watchman, I'm wondering if our obsession with the virtual world has caused even professional journalists to confuse reality and fiction.  Commentators are shocked that Atticus Finch at age 72 is (was?) prejudiced against African Americans.  They're talking as if he's a real person who changed views as he got older.  Please remember this: he's a name in a book, given certain qualities by a writer who used the same name for someone in another book.  Apparently he's not even the lawyer who lost the case of a black man accused of rape: that trial is supposedly peripheral to this book, and in it the defendant is acquitted.  (I admit I haven't read the book, but I'm talking about the reactions. You don't have to have seen a play to know that the member of the audience who leapt onto the stage to slug the villain was also confused.)

A few simple facts: Watchman was written before Mockingbird. So Atticus was 72 before he was 52 to 55, as he was in Mockingbird.  In, say, 1955 he was an elderly racist, but by 1960 he was a middle-aged idealist.  So he is evidently another Benjamin Button.  Or, he's just a name, so is Scout, and, by the way, Jem is not really dead, because he never lived.  Harper Lee wrote two books about the same characters, or rather about characters whose names she didn't change.  The books are about two alternate realities, and Lee made no attempt to connect them, nor should we. Now if Watchman were a sequel, we could ask why Lee never cleared up how Atticus had changed.  But it's a prequel in the real world, as Lee tested out what her characters were like, then went, as they say, in another direction.

Let's put it this way: authors create alternate realities.  Sometimes they create an alternate reality and carry it through several books.  Then we can look at changes: How does Bilbo change as he ages in Tolkien’s writing?  Does Snape change or does he hide his true character until the last Harry Potter?  But when an author sits down to write a book, he or she can start fresh every time and build a new world. Shakespeare's Falstaff in Henry IV Parts 1 and 2 is not the Falstaff of The Merry Wives of Windsor, and it would be foolish to debate how he changed from one to the other.  (For one thing, he's dead in the histories before he's wooing in the comedy, at least as Shakespeare’s authorship went.) 

The confusion seems to be caused by the simple fact that Harper Lee wrote about a man named Atticus twice.  Take another character written about by several people: is Shakespeare's Brutus, whom even Marc Antony extols for his principles, the same Brutus condemned to the lowest circle of hell in Dante?  Is Dante saying Roman morality was inadequate?  No, because again, Dante puts Brutus in hell centuries before Antony praises him.  Again, Odysseus is variously a tired soldier trying to get home (Homer), an arrogant defier of the limits God has put on humans (Dante), a sneaky, cynical pragmatist (Shakespeare), or a courageous explorer showing the indomitability of the human spirit (Tennyson).  But Harper Lee writing in the 1950s and Harper Lee writing a few years later are two different people, at least as creative minds.  As T.S. Eliot put it, “every moment is a new and shocking valuation of all we have been.”

Trying, as some are already doing, to use Atticus’s “change” to analyze American racism is as inappropriate as asking gerontologists to decide whether Alzheimer’s or some other brain condition is behind his alleged alteration.  Let’s spend our time trying to figure out how living racists can be led to change, whether by reading Mockingbird, studying history, or hearing the stories of actual victims of racism.

Saturday, July 11, 2015

Moving On


Dear Friends,

Welcome to my new blog.  If you saw my farewell at readerweeper, you know some of what's below:

readerweeper.blogspot.com is now a Diary on the Daily Kos (dailykos.com), where it joins a world of political commentary.  This new site, "Old Artificer," will provide a broader commentary on literature and society, with more accent on solutions, celebrations, and general observations.  Not that there won't be an occasional screed, but with politics over on Daily Kos, they will be fewer and perhaps more subtle -- or not.

So if you want to stick around, you can choose the bitter, the sweeter, or both.

Hope to hear from you.

DrReader45