Sunday, April 22, 2018

Redemption: Martin Luther King, Jr.'s Last 31 Hours by Joseph Rosenbloom


    In the summer of 1968, the city of Memphis was still reeling from the shock and horror of the assassination of the Reverend Dr. Martin Luther King, Jr. As a young reporter interning at the Commercial Appeal, Joseph Rosenbloom vowed to one day write the story of the last hours leading up to King's death. Returning to Memphis between 2006 and 2014, Rosenbloom interviewed two dozen people connected to the events of that fateful day. The result is Redemption, which Charles Blow of the New York Times hails as "immersive, humanizing, and demystifying."
    Rosenbloom, a former Frontline investigative reporter, examines in detail the last thirty-one hours and twenty-eight minutes of King's life, from the bomb threat against him that delayed his flight from Atlanta on the morning of April 3, 1968, to his murder the following night. Drawing on fresh material from the recently opened King archives, Rosenbloom elucidates other facets of the story, including the lack of progress toward the launch of the Poor People's Campaign and King's state of mind in the spring of 1968. By digging into the hearing transcripts of the House Select Committee on Assassinations and the Memphis police records, Rosenbloom highlights how a previously unreported lapse in police security left King vulnerable. Police director Frank Holloman received warning calls that something was liable to happen to King in Memphis—yet he neglected to notify King of the threats and provided security for King and his team for only six hours after their arrival in town. All these years later, reports Rosenbloom, African-American security officer Ed Redditt, who was assigned to a surveillance team monitoring King from a distance, still feels anguish that, had he been adequately guarding King, he might have been able to prevent the assassination.
    Revealing the accumulating toll the movement was taking on King, who was emotionally and physically exhausted, Rosenbloom examines the extraordinary pressure he was under to recruit volunteers and raise funds for the Poor People's Campaign, his ambitious project to eliminate poverty in the United States. He had spoken out against the Vietnam War and was facing blowback from the Johnson administration and dissension within his staff, while rioting had spread throughout the nation's cities and there was a growing perception that King was "old news" compared to the new Black Power movement.
    Rosenbloom underscores how dangerous it was for King and his associates to venture into Memphis. King knew little of the city’s political and racial environment, yet he was determined to organize a peaceful march. At the same time, he was trying to broker talks with a local Black Power group, the Invaders, in the hope that they would not only keep the peace but also provide security as parade marshals during a violence-free march. Rosenbloom details how King, his mood increasingly darkening, was unable to convince the Invaders to commit unequivocally to nonviolence. "He was trying to redeem his reputation as a nonviolent leader by leading a nonviolent march in Memphis," reflects Rosenbloom. "He was drawing deeply on his faith in the redemptive power of sacrifice for a noble cause, as he risked his life—a faith rooted in the biblical example of Jesus."
    In vivid detail, Rosenbloom recreates the city of Memphis, the Lorraine Motel, and the cast of characters surrounding King. There is Mayor Henry Loeb, who was unwilling to negotiate a settlement to end the garbage workers' strike; King's legendary local lawyer, Lucius Burch, who worked with King to fight the federal injunction against his proposed march; his brother, A.D.; his wife, Coretta Scott King, with whom he regularly talked on the phone; and his staff and fellow activists, including Dorothy Cotton, Ralph Abernathy, Bernard Lee, Andrew Young, and Jesse Jackson. Rosenbloom also writes of then-Senator Georgia Davis, who arrived in the pre-dawn hours of April 4 and spent the night with King, momentarily lifting his spirits.
    Rosenbloom also traces the story of James Earl Ray, a petty criminal and drifter who tracked King through news reports and eerily made his way to Memphis. There, writes Rosenbloom, Ray set up the perfect sniper's nest in the bathroom of a derelict rooming house across from the Lorraine Motel. Crouching near the window, Ray propped a Gamemaster 760 rifle on the ledge and waited to kill the nation's foremost civil rights leader.
    The book culminates with King's now-famous and prophetic "I've Been to the Mountaintop" speech at Mason Temple, during which he was often overcome with emotion as he exhorted his audience to understand that the choice today was between nonviolence and "nonexistence." Then his speech took a highly personal turn, notes Rosenbloom. "It was unusual for King to dwell openly on the depth of his despair as he pondered his own death. This night in Memphis, however, he seemed near panic, anxious that he might be the target of an assassin's bullet at any moment."

About the Author: Joseph Rosenbloom is an award-winning journalist who has been a staff reporter for the Boston Globe, an investigative reporter for Frontline, and a senior editor for Inc. magazine. He has written for the Wall Street Journal, the New York Times, and the American Prospect, among other publications, and lives in Newton, Massachusetts.

Friday, March 30, 2018

Studying Text - "Learning is the process of turning information into knowledge"


“Learning is the process of turning information into knowledge.  That is, establishing information in your memory that can be recalled and used.  The two main strategies are repetition and elaboration.”[1]  “While the Common Core State Standards share many features and concepts with existing standards, the new standards also represent a substantial departure from current practice in a number of respects.”[2]

The Common Core Standards set consistent and clear expectations for what students must know at the completion of each grade from kindergarten through high school.  The standards establish expectations in three academic areas: mathematics, English language arts, and literacy.  “The literacy standards establish reading and writing expectations for students in social studies, science, and technology. These standards provide few specifics on what students need to read or write, focusing instead on how students should read and write in these courses and how to evaluate what qualifies as good writing.”[3]  There are a number of fundamental literacy components in the Common Core Standards that educators, students, and parents should focus on adopting.  They include: improving reading comprehension, honing writing skills, and cultivating speaking and listening skills.  “Instructional materials will need to challenge students to read and understand more complex texts, build vocabulary, and extract details from texts to use as supporting material in essays and other written work.”[4]

“What I find troubling is the lack of concern of what a colleague of mine calls unwarranted self-regard.”[5]

Marlene Zuk is a professor of ecology, evolution and behavior at the University of Minnesota and has published several books on evolution and science.  A reviewer of her book “Paleofantasy – What Evolution Really Tells Us About Sex, Diet, and How We Live” wrote, “As Zuk compellingly argues, such beliefs incorrectly assume that we’re stuck—finished evolving—and have been for tens of thousands of years. She draws on fascinating evidence that examines everything from adults’ ability to drink milk to the texture of our ear wax to show that we’ve actually never stopped evolving. Our nostalgic visions of an ideal evolutionary past in which we ate, lived, and reproduced as we were “meant to” fail to recognize that we were never perfectly suited to our environment. Evolution is about change, and every organism is full of trade-offs.”[6]  Zuk’s article on student self-esteem runs along similar lines of thought: that students de-couple their own level of intelligence and reasoning from test scores or exam results.  Zuk states, “Maybe it’s all that self-esteem this generation of students was inculcated with as youngsters, or maybe it’s the emphasis on respecting everyone else’s opinion, to the point where no answer, even a mathematical one, can be truly wrong because that might offend the one who gave it.”[7]  While her opinion is forceful and direct, it is opinion and not fact.  Students may or may not, on the whole, have a high sense of self-esteem.  Equally, students may have a low sense of self-esteem and be in a critical phase of self-identity and self-confidence, which undermines their ability to learn or properly engage in the classroom.  Issues of high self-esteem are not nearly as important as educational resources and a supportive instructional curriculum.


[1] Londe. The Biology of Learning. 1
[2] Rothman. Nine Ways the Common Core Will Change Classroom Practice.
[3] California Common Core State Standards, http://www.cde.ca.gov/be/st/ss/documents/finalelaccssstandards.pdf
[4] Ibid.
[5] Zuk. Right, Wrong..What’s the Dif?
[6] http://books.wwnorton.com/books/detail.aspx?ID=24733
[7] Ibid.

Sunday, February 25, 2018

Writing Across the Curriculum



“… Reading and writing are two sides of the same coin. Whereas the writer works to make a text sensible, the reader works to make sense from a text. As a result, the two processes, rooted in language, are intertwined and share common cognitive and sociocultural characteristics. Both reading and writing, for example, involve purpose, commitment, schema activation, planning, working with ideas, revision and rethinking, and monitoring.”[1]

“The relationships between reading and writing have been a source of inquiry by language researchers since the mid-1970s.”[2]  My parents were strong advocates of reading, especially early reading, because they believed that good readers make good writers.  As Richard T. Vacca and Jo Anne L. Vacca, authors of Content Area Reading: Literacy and Learning Across the Curriculum, point out, “students who are good readers and writers perceive themselves as such and are more likely to engage in reading and writing on their own.”[3]  There is no mystery to that observation.  The challenge is how to encourage a student who is a weak reader and a weak writer to become stronger in either area.  Both ends of the spectrum are self-reinforcing: as a student becomes increasingly frustrated with their writing, they are equally frustrated by difficult or challenging reading material.  How does a teacher break the cycle?  One suggestion is short and informal writing tasks: Writing to Learn (WTL).  Using ‘quick-writes’, which focus on capturing a student’s ideas or concepts quickly and spontaneously, students spend less time frustrated with composition and more time focused on expression and understanding.  “WTL should not be confused with learning to write”[4]—rather, students concentrate on “summarizing and extending their thoughts about a subject.”[5]

Vacca explains another writing technique that is creative, nonthreatening, and fun: POVGs (point-of-view guides), which are designed to "trigger thoughtful reading and writing by having students 'get inside the skin' of a character or a subject under study.”[6]  The exercise is simple and short, asking the student to think outside of themselves in a creative fashion.  Students involved in a POVG can ask questions in an interview format, role-play, or use first-person inquiry to help overcome the fear or frustration of writing.  Once a student is comfortable writing nearly anything, the instructor can introduce additional structure to their writing.  “Using a variety of instructional activities, including microthemes, POVGs, unsent letters, biopoems, dialogues, and admit/exit slips”[7]—all do the same thing: they allow the student to write without fear of failure.

Vacca uses Anne Frank, the world’s most famous child diarist, as an example of the merits of keeping a personal journal.  Does he expect our students’ experiences to equal hiding from the Nazis and the death camps as the motivation to write?  I think journaling is an excellent activity to encourage a student to write without fear of evaluation or judgment.  Great writing can be developed if the student is allowed to “freely express their feelings and thoughts in response to what they are reading and learning.”[8]


[1] Richard T. Vacca and Jo Anne L. Vacca, Content Area Reading: Literacy and Learning Across the Curriculum, citing Close, Hull, & Langer (2005), p. 282.
[2] Ibid., p. 283.
[3] Ibid., p. 283.
[4] Ibid., p. 285.
[5] Ibid., p. 285.
[6] Ibid., p. 286.
[7] Ibid., p. 285.
[8] Ibid., p. 294.

Nonprofit Board Liabilities - Are There Legal Responsibilities?

I am aware of nonprofit organizations with 501(c) designations that are unable to meet their primary obligation and duty: to operate with honesty and transparency.  In an excellent article, attorney Paul D'Alessandro outlines how every member of a nonprofit board has an obligation to the charitable organization, and also to the community, to operate with a "duty of care".  "Duty of care" means acting as any reasonable person would: proceeding in a way that is not reckless.  Especially when it relates to an organization's finances and expenses, board members must operate in a way that is ethical and honest.  "If, for example, a charity had an excellent fundraising quarter, they cannot then decide to pay a bonus to the board members, because this would go against the trust society and the nonprofit has placed in its board members."
I would highly recommend reading and reflecting on Paul's article:
http://www.nonprofitpro.com/article/nonprofit-board-liabilities-legal-responsibilities-board/

Thursday, February 08, 2018

Use A Powerful Tool To Immediately Stop Paying Interest On Your Credit Card Balance

Think of someone carrying a credit card balance like a patient who enters an emergency room bleeding badly. The first thing a doctor will do is stop the bleeding. It's no different when attacking a credit card balance; the first thing you do is stop the interest charges.
There is a simple way to do this, and its brilliance is that it actually uses the banks' marketing offers to your advantage: find a card offering a long "0% intro APR balance transfer" promotional offer, and transfer your balance to it. These are cards which offer new customers a long period of time (often as much as 18 months) during which the card charges no interest on all balances transferred to it.
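
To make the arithmetic concrete, here is a minimal sketch in Python comparing the interest a balance would accrue at a typical APR against the one-time cost of moving it to a 0% intro card. The $5,000 balance, the 18% APR, and the 3% transfer fee are illustrative assumptions of mine, not figures from the offers described above; many transfer promotions do charge a fee along these lines, so check a card's actual terms.

```python
# Rough, illustrative comparison of carrying a balance at a standard APR
# versus moving it to a 0% intro-APR card. All numbers below are assumed
# for the sake of the sketch, not figures from any particular card offer.

balance = 5_000.00        # assumed outstanding card balance
apr = 0.18                # assumed 18% annual rate on the old card
months = 18               # length of a typical 0% intro period
transfer_fee_rate = 0.03  # hypothetical one-time balance-transfer fee

# Interest that would accrue if the balance sat untouched, compounding monthly.
monthly_rate = apr / 12
interest_if_kept = balance * ((1 + monthly_rate) ** months - 1)

# One-time cost of moving the balance to the 0% card.
transfer_fee = balance * transfer_fee_rate

print(f"Interest over {months} months at {apr:.0%} APR: ${interest_if_kept:,.2f}")
print(f"One-time transfer fee at {transfer_fee_rate:.0%}: ${transfer_fee:,.2f}")
print(f"Approximate savings: ${interest_if_kept - transfer_fee:,.2f}")
```

With these assumed numbers, roughly $1,537 of avoided interest stands against a $150 one-time fee—which is why stopping the interest first, like stopping the bleeding, matters so much.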

More info can be found at:
http://www.comparecards.com/guide/heres-a-brilliant-way-to-pay-credit-cards?utm_source=msnhp&utm_medium=native&pla=www.msn.com&tar=cms&acqs=prospecting&bdst=cpc&utm_content=fastest&adt=handingdollarfan&utm_campaign=20180208&mpch=ads

Sunday, October 08, 2017

The Limits of “Diversity”

Where affirmative action was about compensatory justice, diversity is meant to be a shared benefit. But does the rationale carry weight?


By Kelefa Sanneh
The New Yorker, October 9, 2017

https://www.newyorker.com/magazine/2017/10/09/the-limits-of-diversity

As universities learned to reframe their affirmative-action programs as diversity programs, students learned to believe them. This notion of diversity has proved remarkably flexible, and therefore popular, especially in comparison with terms like “integration” or “racial parity,” which connote conflict, rather than harmony. Liberals have been particularly enthusiastic exponents of Powell’s diversity doctrine, but the ideal of diversity is generally nonpartisan. Last year, Reince Priebus, who was then the chair of the Republican National Committee, voiced his hope that Trump’s running mate would add “a degree of diversity.” Later, when it was reported that the final list of choices consisted entirely of white men, Priebus said that he was not disappointed. “There’s also something called ‘diversity of experience,’ too, that’s necessary,” he said. And at the Republican Convention a student leader rallied the crowd with a Powellian affirmation: “We are the party of youth and diversity. Not the Democrats!”

Even as the idea of diversity was conquering the country, some on the left were having second thoughts. In 2006, the literature professor Walter Benn Michaels published a brusque polemic, “The Trouble with Diversity,” which depicted the whole concept as profoundly conservative. Because diversity meant “the appreciation (rather than the elimination) of difference,” he argued, it was the ideology of “bosses and owners,” who could celebrate their own increasing “cultural diversity” while ignoring the economic inequality with which they were complicit. Ellen Berrey, the sociologist, found Michaels’s argument “simplistic,” and, in “The Enigma of Diversity,” she sets out to discover how this ideology functions, by spending time in the field. Three fields, in fact: a large (and, by agreement, anonymous) Fortune 500 corporation, a mixed-income neighborhood in Chicago, and a selective public university, the University of Michigan. All three realms were proudly and self-consciously diverse, although carefully so—Michigan had been sued over its affirmative-action program. Berrey’s smart and subtle book aims to show exactly how differently people and institutions use this malleable concept.

Sunday, June 18, 2017

The History Test - How Should the Courts use History?

by Jill Lepore

Constitutional interpretation reaches back to the dawn of time.


On the night of April 9, 1931, James M. Kiley, thirty-nine, was shot with a .32-calibre pistol at a gas station in Somerville, Massachusetts, during a botched holdup. Kiley, the night manager, had twenty-four dollars in his pocket; the cash in the register was untouched. Herman Snyder, nineteen, was found guilty of first-degree murder and sentenced to death. “Well, that’s that,” Snyder said, when the jury delivered the verdict. But that wasn’t that. Snyder filed an appeal arguing that his constitutional rights had been violated: during his trial, when the judge, the jury, lawyers for both sides, and a court stenographer visited the gas station, the judge refused to allow Snyder to go along. Even Lizzie Borden had been offered a chance to go with the jury to the crime scene, Snyder’s lawyers pointed out, and so had Sacco and Vanzetti.

In the summer of 1933, Snyder’s lawyers went to see Louis Brandeis, the Supreme Court Justice, at his summer home, on Cape Cod; Brandeis, in an extraordinary gesture from the highest court, issued a stay of execution. The Court agreed to hear the appeal, and, in January, 1934, upheld Snyder’s conviction in a 5–4 opinion that proposed a standard for measuring the weight of tradition in fundamental-rights cases, a standard sometimes known as the history test.

Some rights, like freedom of religion, are written down, which doesn’t always make them easier to secure; and some, like the right to marry, aren’t, which doesn’t mean that they’re less fundamental. The Constitution, as originally drafted, did not include a bill of rights. At the time, a lot of people thought that listing rights was a bad idea because, in a republic, the people retain all the rights not specifically granted to the government and because anything written down is both limited and open to interpretation. “What is the liberty of the press?” Alexander Hamilton asked. “Who can give it any definition which would not leave the utmost latitude for evasion?” These were excellent questions, but Hamilton lost the argument. The Bill of Rights was ratified in 1791. Past the question of which rights there remained the question of whose rights. In 1857, in Dred Scott, the Supreme Court asked whether any “negro whose ancestors were imported into this country and sold as slaves” is “entitled to all the rights, and privileges, and immunities” guaranteed in the Constitution. Relying on “historical facts,” the Court answered no, arguing that, at the time of the framing, black people “had for more than a century before been regarded as beings of an inferior order, and altogether unfit to associate with the white race either in social or political relations, and so far inferior that they had no rights which the white man was bound to respect.” After Emancipation, the Fourteenth Amendment, ratified in 1868, cast off the shackles of history with this guarantee: “No state shall make or enforce any law which shall abridge the privileges or immunities of citizens of the United States; nor shall any state deprive any person of life, liberty, or property, without due process of law; nor deny to any person within its jurisdiction the equal protection of the laws.” Then, in a series of cases in the early twentieth century, the courts began applying parts of the Bill of Rights to the states, mainly by way of the Fourteenth Amendment.

Yet how would judges decide what rights fall under the definition of due process and equal protection? There seemed to be two possibilities: precedent and reasonable judgment. In Snyder v. Massachusetts, Snyder’s attorneys argued that Snyder had a fundamental right to go on the trip to the gas station, under the due-process clause. But Justice Benjamin Cardozo, writing for the majority, said that the question turned not only on a reasonable reading of the Fourteenth Amendment or on precedent but also on whether refusing to bring a defendant with the jury to the crime scene “offends some principle of justice so rooted in the traditions and conscience of our people as to be ranked as fundamental.” He then recited instances, going back to 1747, to show that what Snyder had been denied did not meet this standard.

History, in one fashion or another, has a place in most constitutional arguments, as it does in most arguments of any kind, even those about whose turn it is to wash the dishes. Generally, appeals to tradition provide little relief for people who, historically, have been treated unfairly by the law. You can’t fight segregation, say, by an appeal to tradition; segregation was an entrenched American tradition. In 1896, Plessy v. Ferguson, essentially reprising Dred, cited the “established usages, customs, and traditions of the people” in affirming the constitutionality of Jim Crow laws. In 1954, to challenge such laws, Brown v. Board of Education disavowed historical analysis and cited, instead, social science: empirical data. Meanwhile, Snyder was chiefly cited in appeals of murder convictions involving defendants who claimed that their rights had been violated. In 1945, Justice William O. Douglas cited Snyder in a 5–4 decision reversing the conviction of a Georgia sheriff who had arrested a young black man for stealing a tire and then beaten him to death. The killing was “shocking and revolting,” Douglas wrote, but it was impossible to know whether the victim’s civil rights had been violated. In a fierce dissent, Francis Murphy argued that the reversal was absurd: “Knowledge of a comprehensive law library is unnecessary for officers of the law to know that the right to murder individuals in the course of their duties is unrecognized in this nation.”

But, in recent decades, the history test applied in cases like Snyder has quietly taken a special place; it has been used to help determine the constitutionality of everything from assisted suicide to deportation, by the unlikely route of judicial decisions about sex. History’s place in American jurisprudence took a turn in 1973, in Roe v. Wade, when the Court dusted off its incunabula and looked into what “history reveals about man’s attitudes toward the abortion procedure over the centuries,” as Justice Harry Blackmun explained. Abortion had not been a crime in Britain’s North American colonies, nor was it a crime in most parts of the United States until after the Civil War. “It perhaps is not generally appreciated that the restrictive criminal abortion laws in effect in a majority of States today are of relatively recent vintage,” Blackmun wrote. In turning back the hands of time, he didn’t stop there. “We are told that, at the time of the Persian Empire, abortifacients were known, and that criminal abortions were severely punished. We are also told, however, that abortion was practiced in Greek times as well as in the Roman Era, and that ‘it was resorted to without scruple.’ ” Roe overturned laws passed by state legislatures by appealing to ancient history. William Rehnquist, in his dissent, cited Snyder: “The fact that a majority of the States reflecting, after all, the majority sentiment in those States, have had restrictions on abortions for at least a century is a strong indication, it seems to me, that the asserted right to an abortion is not ‘so rooted in the traditions and conscience of our people as to be ranked as fundamental.’ ”

Not coincidentally, liberals began applying the history test to fundamental-rights cases at the very moment that women and minorities were entering the historical profession and writing history that liberal-minded judges might be able to cite. Conservatives, meanwhile, defined a new historical method: originalism, a method with roots in the kind of analysis made in Dred Scott. Originalism is essentially a very tightly defined history test. Snyder’s invocation of “the traditions and conscience of our people” is like a reader’s pass to the library stacks. There is virtually no end of places in the historical record to look for the traditions and conscience of our people, especially when “our people” is everyone. Originalism, a term coined in 1980, asks judges to read only the books on a single shelf in the library: the writings of delegates to the Constitutional Convention and the ratifying conventions, the Federalist Papers, and a handful of other newspapers and pamphlets published between 1787 and 1791 (and, occasionally, public records relating to debates over subsequent amendments, especially the Fourteenth). Even more narrowly, some originalists insist on consulting only documents that convey the “public understanding” of the writings of these great men. “If someone found a letter from George Washington to Martha telling her that what he meant by the power to lay taxes was not what other people meant,” Robert Bork once wrote, “that would not change our reading of the Constitution in the slightest.”

Roe, along with a series of civil-rights decisions made by the Warren Court, fuelled the growth of a conservative legal movement. The Federalist Society, founded in a number of law schools in 1982, developed an intellectual tradition, promoted scholarship, and sought to place its members on the courts. (Justices Samuel Alito and Clarence Thomas, along with Neil Gorsuch, who has been nominated to join them, are affiliated with the Federalist Society.) Within five years of its founding, the society had chapters at more than seventy law schools.

In 1985, in a speech to the Federalist Society, Ronald Reagan’s Attorney General, Edwin Meese, announced that “the Administration’s approach to constitutional interpretation” was to be “rooted in the text of the Constitution as illuminated by those who drafted, proposed, and ratified it.” He called this a “jurisprudence of original intention,” and contrasted it with the “misuse of history” by jurists who saw, in the Constitution’s “spirit,” things like “concepts of human dignity,” with which they had turned the Constitution into a “charter for judicial activism.” Meese’s statement met with a reply from Justice William Brennan, who said that anyone who had ever studied in the archives knew better than to believe that the records of the Constitutional Convention and the ratifying conventions offered so certain, exact, and singular a verdict as that which Meese expected to find there. (Obama’s Supreme Court nominee Merrick B. Garland clerked for Brennan.) Brennan called the idea that modern judges could discern the framers’ original intention “little more than arrogance cloaked as humility.”

In opposing fundamental-rights arguments, though, the Reagan-era Court used not only originalist arguments but also the history test. In June, 1986, the Court ruled, 5–4, in Bowers v. Hardwick, that the right to engage in homosexual sex was not rooted in tradition; instead, prohibitions on homosexual sex were rooted in tradition. Justice Byron White, writing for the majority, said that these prohibitions had “ancient roots.” In a concurring opinion, Justice Lewis Powell wrote, “I cannot say that conduct condemned for hundreds of years has now become a fundamental right.” Blackmun, in his dissent, argued against this use of history: “I cannot agree that either the length of time a majority has held its convictions or the passions with which it defends them can withdraw legislation from this Court’s scrutiny.”

Antonin Scalia joined the Court in the next term. And, soon afterward, in 1987, Reagan had the opportunity to appoint another Justice, and named Robert Bork. Less than an hour after the nomination was announced, Senator Edward M. Kennedy called for Democrats to resist what he described as Reagan’s attempt to “impose his reactionary vision of the Constitution on the Supreme Court and on the next generation of Americans.” Laurence Tribe, the Harvard law professor, testified in opposition to Bork’s nomination. But concerns about Bork’s vantage on history were not limited to liberal legal scholars. His most determined critics included the federal judge Richard Posner, who wrote of Bork’s views, “There are other reasons for obeying a judicial decision besides the Court’s ability to display, like the owner of a champion airedale, an impeccable pedigree for the decision, connecting it to its remote eighteenth-century ancestor.” In retrospect, the way this debate reached the public was mostly a distraction. The press generally reduced the disagreement to a stubbornly partisan battle in which conservatives and the past squared off against liberals and the future, and missed most of what was at stake: the relationship between history and the law.

Scalia was the Court’s most determined and eloquent originalist, but he also frequently invoked tradition. In 1989, writing for the majority in Michael H. v. Gerald D., a case involving the assertion of parental visitation rights, he argued that finding rights “rooted in history and tradition” required identifying the “most specific” tradition; Brennan, in his dissent, questioned Scalia’s method, writing that the opinion’s “exclusively historical analysis portends a significant and unfortunate departure from our prior cases and from sound constitutional decisionmaking.” As he had in his debate with Meese, Brennan charged Scalia with something between ignorance and duplicity. “It would be comforting to believe that a search for ‘tradition’ involves nothing more idiosyncratic or complicated than poring through dusty volumes on American history,” Brennan wrote, but history is more complicated than that, “because reasonable people can disagree about the content of particular traditions, and because they can disagree even about which traditions are relevant.” Even more fundamentally, Brennan argued that the appeal to tradition essentially nullifies the Fourteenth Amendment, whose whole point was to guarantee constitutional protections to those Americans who had not been protected by the traditions and consciences of other Americans.

If less carefully observed than the debate over originalism, the debate over the history test has influenced judicial nominations for decades. “A core question is whether, in examining this nation’s history and tradition, the Court will protect only those interests supported by a specific and longlasting tradition, or whether the Court will not so constrict its analysis,” Senator Joseph Biden said during hearings on David Souter’s nomination, in 1990. (Biden had been coached by Tribe.) Souter’s answer—“It has got to be a quest for reliable evidence, and there may be reliable evidence of great generality”—satisfied Democrats. Liberal legal scholars, meanwhile, had grown increasingly alarmed by Scalia’s use of history: in a 1990 case, for example, he cited a book written in 1482 in a narrowing definition of due process, and in a 1991 case he cited punishments imposed during the reign of James II to uphold a mandatory life sentence without the possibility of parole for the possession of six hundred and fifty grams of cocaine. The legal scholar Erwin Chemerinsky argued that conservatives on the Court had turned to history-test historicism because originalism is so patently flawed as a mode of constitutional interpretation. (The framers weren’t originalists; Brown v. Board can’t be squared with originalism; originalism can’t be reconciled with democratic self-government.) “The constant use of history to justify conservative results leads to the cynical conclusion that the country has a seventeenth century Court as it enters the twenty-first century,” Chemerinsky wrote in 1993. “It is not enough to make one want to take all the history books out of the Supreme Court’s library, but it makes one come close.”

Or you could write new history books. Geoffrey R. Stone, a distinguished professor and a former dean of the University of Chicago Law School, is a past chairman of the American Constitution Society, which was founded, in 2001, as an answer to the Federalist Society. His new book, “Sex and the Constitution: Sex, Religion, and Law from America’s Origins to the Twenty-first Century” (Liveright), locates “America’s origins” in antiquity. Applying the history test to the regulation of sex, Stone begins his inquiry in the sixth century B.C.E., and expands into a learned, illuminating, and analytical compendium that brings together the extraordinary research of a generation of historians in service of a constitutional call to arms.

Stone started working on the book about a decade ago, not long after the Court reversed Bowers. In Lawrence v. Texas, in 2003, the majority opinion overturned state sodomy laws by rejecting the history presented as evidence in Bowers. Colonial anti-sodomy laws did exist, Justice Anthony Kennedy wrote in Lawrence, but they applied to everyone, not just to men; also, they were hardly ever enforced and “it was not until the 1970’s that any State singled out same-sex relations for criminal prosecution, and only nine States have done so.” In short, Kennedy wrote, “the historical grounds relied upon in Bowers are more complex than the majority opinion and the concurring opinion by Chief Justice Burger indicate.”

The tables had turned. Between Bowers and Lawrence, academic historians had produced a considerable body of scholarship about the regulation of sexuality, on which the Court was able to draw. Scalia, in an uncharacteristically incoherent dissent, mainly fumed about this, arguing that “whether homosexual sodomy was prohibited by a law targeted at same-sex sexual relations or by a more general law prohibiting both homosexual and heterosexual sodomy, the only relevant point is that it was criminalized—which suffices to establish that homosexual sodomy is not a right ‘deeply rooted in our Nation’s history and tradition.’ ” Scalia, in effect, accused the majority of doing too much historical research.

The inconsistency is perhaps best explained by the Court’s wish to pretend that it is not exercising judicial discretion. One legal scholar has suggested that the history test is like Dumbo’s feather. Dumbo can fly because he’s got big ears, but he doesn’t like having big ears, so he decides he can fly because he’s got a magic feather. The Court has got big, activist ears; it would rather believe it’s got a magical history feather.

Lately, the field of argument, if not always of battle, in many fundamental-rights cases has moved from the parchment pages of the Constitution to the clay of Mesopotamia. In Obergefell v. Hodges, the 2015 Supreme Court decision that overturned state bans on same-sex marriage, Justice Kennedy, writing for the majority, reached back almost to the earliest written records of human societies. “From their beginning to their most recent page, the annals of human history reveal the transcendent importance of marriage,” he said. “Since the dawn of history, marriage has transformed strangers into relatives, binding families and societies together.” He cited Confucius. He quoted Cicero. The states that wanted to ban same-sex marriage described its practice as a betrayal of that history, but Kennedy saw it as a continuation, a testament to “the enduring importance of marriage.” Marriage is an institution with “ancient origins,” Kennedy said, but that doesn’t mean it’s changeless. Scalia, in a heated dissent, called Kennedy’s opinion “silly” and “pretentious.” As a matter of historical analysis, Scalia mostly confined himself to the past century and a half. “When the Fourteenth Amendment was ratified in 1868, every State limited marriage to one man and one woman, and no one doubted the constitutionality of doing so,” he said. “That resolves these cases.”

Liberal legal scholars disagree, and Stone’s “Sex and the Constitution” is an attempt to pull together all their evidence, for the sake of court battles to come. Ancient Greeks, Romans, and Jews believed that sex was natural and didn’t have a lot of rules about it, Stone argues. Early Christians, influenced by Augustine of Hippo, who in the fifth century decided that Adam and Eve had been thrown out of the Garden of Eden because of lust, decided that sex was a sin, and condemned all sorts of things, including masturbation. Stone speculates that the medieval church’s condemnation of same-sex sex, a concern that emerged in the eleventh century and that became pronounced in the writings of Thomas Aquinas, was a consequence of a new requirement: clerical celibacy. According to Stone, Aquinas argued that the sins of mutual masturbation, oral sex, and anal sex were worse if they involved two members of the same sex, a position that became church dogma in the sixteenth century.

During the Reformation, Protestants redeemed one kind of sex: intercourse between a married man and woman. (Martin Luther argued that sex was as “necessary to the nature of man as eating and drinking.”) Protestants also rejected the Catholic Church’s condemnation of contraception. But they believed that governments ought to regulate sexual behavior for the sake of public order. In the seventeenth century, most of England’s American colonies had an established religion, an arrangement that, a revolution later, they abdicated.

Enlightenment philosophers rejected Christian teachings about sex, and, believing in the pursuit of happiness, they believed, too, in the pursuit of pleasure. The Constitution and the Bill of Rights say nothing about sex, of any kind, with anyone, under any circumstances. Nor do any of the original state constitutions. Nor did any laws in any of the states, at the time of the founding, forbid sexual expression, or abortion before quickening, and sodomy laws were seldom enforced. That changed in the first half of the nineteenth century, when a religious revival led states to pass new laws, including the first law against obscenity. A campaign against the long-standing practice of abortion began, followed by a crusade against contraception and, at the turn of the twentieth century, the persecution of homosexuals. The cases from Roe to Lawrence to Obergefell, Stone suggests, constitute a revolution, not a turning away but a turning back, toward the Enlightenment.

History written to win a legal argument has a different claim to authority than history written to find out what happened. In a study of sex, Stone might have been interested in any number of practices, but he has confined his investigation to matters that are sources of ongoing constitutional and political debate in the United States today: abortion, contraception, obscenity, and sodomy or homosexuality. Practices that were once crimes, like fornication and adultery, or that are still crimes, like incest, infanticide, and rape, generally lie outside the scope of his concern. This has the effect of obscuring the relationship between things he’s interested in and things he’s not interested in, and it introduces a circularity: he has defined the scope of his study by drawing a line between what’s criminal and what’s not, when how that line came to be drawn is the subject of his study.

The history of the regulation of sexuality, especially the parts he’s chosen to gloss over—which happen to be parts that particularly concern the vulnerability of women and children—is a chronicle of a staggeringly long reign of sanctioned brutality. That reign rests on a claim on the bodies of women and children, as a right of property, made by men. “The page of history teems with woman’s wrongs,” Sarah Grimké wrote in 1837. Stone only skimmed that page. Or consider this page, from the Congressional Record in 1866, during the debate over the Fourteenth Amendment. Jacob Howard, a Republican senator from Michigan, explained that the amendment “protects the black man in his fundamental rights as a citizen with the same shield which it throws over the white man.” Howard assured his audience that the amendment did not guarantee black men the right to vote, even though he wished that it did, and here he quoted James Madison, who’d written that “those who are to be bound by laws, ought to have a voice in making them,” at which point Reverdy Johnson, a Democrat from Maryland, wondered how far such a proposition could be extended, especially given the amendment’s use of the word “person”:
Mr. Johnson: Females as well as males?
Mr. Howard: Mr. Madison does not say anything about females.
Mr. Johnson: “Persons.”
Mr. Howard: I believe Mr. Madison was old enough and wise enough to take it for granted that there was such a thing as the law of nature which has a certain influence even in political affairs, and that by that law women and children are not regarded as the equals of men.

History isn’t a feather. It’s an albatross.

Last year, Neil Gorsuch delivered a memorial tribute to Scalia, in which he said that the Justice’s greatest contribution to jurisprudence was his commitment to historical inquiry. Gorsuch said that Scalia had reminded legal scholars that, rather than contemplating the future, “judges should instead strive (if humanly and so imperfectly) to apply the law as it is, focusing backward, not forward.”

Scalia spent much of his career arguing for the importance of history in the interpretation of the law. “If ideological judging is the malady,” Scalia said in 2010, “the avowed application of such personal preferences will surely hasten the patient’s demise, and the use of history is far closer to being the cure than being the disease.”

Gorsuch’s account of this debate is more measured. Whose history? How far back? “In due process cases, the Supreme Court has frequently looked not only to this nation’s history, but also to English common law,” Gorsuch has written. “But why stop there? Why not examine Roman or Greek or some other ancient precedent as, say, Justice Blackmun did in his opinion for the Court in Roe v. Wade? And what about contemporary experience in other Western countries?” His book on assisted suicide contains a chapter, called “The Debate Over History,” that applies the history test to the question of the right to die. He began his survey with Plato, hopscotched across the centuries, and decided that, while a consensus had grown “that suicide is essentially a medical problem,” the historical record offers, at best, limited support for the idea of a right to assisted suicide and euthanasia. Gorsuch, an eloquent and candid writer, has his doubts about the history test. He writes, “The history test, for all its promise of constraining judicial discretion, carries with it a host of unanswered methodological questions and does not always guarantee the sort of certainty one might perhaps hope for.”

Gorsuch may be dubious about the history test, but he happens to be a particularly subtle scholar of precedent. (He’s a co-author of a new book, “The Law of Judicial Precedent”; Scalia had been meant to write the foreword.) And he’s written powerfully about the relationship between history and the law. In 2015, Gorsuch wrote an opinion in a case that concerned Alfonzo Deniz Robles. Deniz, a Mexican citizen, twice entered the United States illegally. He married an American citizen, and had four children. In 2005, the Tenth Circuit court ruled that an immigrant in Deniz’s position was grandfathered into a lapsed program that allowed him to pay a fine and apply for residency, so Deniz applied for a visa. The government held up his application for years, and by the time it was reviewed the Board of Immigration Appeals, an executive agency, overruled the court, requiring him to leave the country for ten years before applying for residency. (“It was, like, Today you can wear a purple hat but tomorrow you can’t,” Deniz’s wife, Teresa, told me. “It was mind-boggling.”) Deniz appealed, on the ground that his rights to due process had been violated.

The appeal reached Gorsuch’s court in 2014, at which point immigration services told Deniz, as Gorsuch explained, “that he’d have to start the decade-long clock now even though if he’d known back in 2005 that this was his only option, his wait would be almost over.” Writing for the court, Gorsuch explained that judicial reasoning is always backward-looking, while legislation is forward-looking; he cited a thirteenth-century English jurist to establish that the presumption against retroactive legislation is nearly as old as common law, and the retrospective effect of judicial decisions, he said, has been established for almost a thousand years. But what about acts of the executive branch? Gorsuch said that if an executive agency is acting like a judge its rulings are retroactive, but if it’s acting like a legislature its rulings are prospective. That is, if the Board of Immigration Appeals makes a new policy, it can’t apply it to people who made choices under the old policy. The Tenth Circuit ruled in favor of Deniz. He still doesn’t have a green card. That will likely take years.

The chain of cases that are of interest to Stone in “Sex and the Constitution” will be revisited by a newly constituted Supreme Court, once Scalia’s replacement finally takes a seat. More immediately, though, the Court will be asked to rule on the due-process and equal-protection-violation claims made in opposition to President Trump’s early executive orders, as a matter of federal law. “A temporary absence from the country does not deprive longtime residents of their right to due process,” eighteen state attorneys general and others argued in a brief challenging the Trump Administration’s travel ban. Gorsuch’s several rulings urging restraint of the executive branch carry a particular weight in this new political moment, in which the history test is already being applied to those orders. “The framers worried that placing the power to legislate, prosecute, and jail in the hands of the Executive would invite the sort of tyranny they experienced at the hands of a whimsical king,” Gorsuch wrote in a dissent from 2015. A lot of people are still worried about that.

Alfonzo and Teresa Deniz, who live in Wyoming with their kids, have so far spent more than forty thousand dollars on legal fees. They’ve got another court date, on March 21st, the day after the Senate Judiciary Committee begins hearings on Gorsuch’s nomination. The law keeps changing. “You hear a lot of things,” Teresa told me. “It’s scary.” She’s terrified that her children will lose their father. I asked Teresa if she and her husband had ever met Neil Gorsuch. She said no. She didn’t know that he’d been nominated to the Supreme Court. I asked her if she had a message for the Court. “Look at the families,” she said. She began to cry. She said, “I just hope that they can come up with something that is justice.”

Jill Lepore is a staff writer with The New Yorker and a professor of history at Harvard. “The Secret History of Wonder Woman” is her latest book.

Is Socially Responsible Capitalism Losing?

No More Mr. Nice Guy by Sheelah Kolhatkar


In December, 2015, a new startup called Juno entered the ride-hailing market in New York City with a simple proposition: it was going to treat its drivers better than its competitors, notably Uber, did theirs—and do “something that was socially responsible,” as one of Juno’s co-founders, Talmon Marco, told me last fall. In practice, that meant drivers would keep a bigger part of their fares and be eligible for a form of stock ownership in the company. But, on April 26th, when an Israeli company named Gett announced that it was buying Juno for two hundred million dollars, that changed. The merged company is dropping the restricted stock plan for drivers, and those who already hold stock are being offered small cash payments, reportedly in the hundred-dollar range, in exchange.

Juno’s founders had adopted the language of a doing-well-by-doing-good philosophy that has spread in the business world in recent years. Some call it conscious or socially responsible capitalism, but the basic idea is that any business has multiple stakeholders—not just owners but employees, consumers, and also the community—and each of their interests should be taken into account. The idea arose in response to an even more powerful principle: the primacy of investor rights. In a new book, “The Golden Passport,” the journalist Duff McDonald lays much of the blame for that thinking at the feet of a Harvard Business School professor named Michael Jensen, whose “agency theory,” developed in the nineteen-eighties, sought to align the interests of managers with those of the company’s investors. (Gordon Gekko spoke eloquently on its behalf in the movie “Wall Street.”) This alignment led to huge stock-option pay packages for top corporate managers and, McDonald argues, provided an intellectual framework that justifies doing anything (within the law) to increase a company’s stock price, whether that be firing workers or polluting the environment.

In this philosophical tension, the investors-above-all doctrine seems to have triumphed over the more inclusive approach. “I think what’s recent is maybe being so completely blatant about it,” Peter Cappelli, a professor and labor economist at Wharton, said. When American Airlines agreed to give raises to its pilots and flight attendants in April, analysts at a handful of investment banks reacted bitterly. “This is frustrating,” a Citigroup analyst named Kevin Crissey wrote in a note that was sent to the bank’s clients. “Labor is being paid first again. Shareholders get leftovers.” Jamie Baker, of JPMorgan, also chimed in: “We are troubled by AAL’s wealth transfer of nearly $1 billion to its labor groups.”

Those comments were mocked online, but similar sentiments are everywhere in the financial establishment. Both Costco and Whole Foods—whose C.E.O., John Mackey, wrote the book “Conscious Capitalism”—have been criticized by Wall Street investors and analysts for years for, among other things, their habit of paying workers above the bare minimum. Paul Polman, who, as C.E.O. of the Anglo-Dutch conglomerate Unilever, has made reducing the company’s carbon footprint a priority, recently fought off a takeover bid from Kraft Heinz, which is known for its ruthless cost-cutting.

Newer platform companies have also encountered the phenomenon. An app called Maple, which made the nearly unheard-of decision to offer health benefits and employee status to its food-delivery people, folded in recent months. Etsy, which allows craftspeople to sell their goods online, and which became known for its employee perks, has lost most of its stock-market value since it went public, in 2015; hedge-fund investors have been pushing the company to reduce its costs and to lay off employees. In the case of Juno, according to a person familiar with its operations, the founders sold the company and agreed to cut its driver stock awards because they couldn’t find new investors to finance its growth. “They were stuck from an expansion perspective, and this was what had to give,” I was told. “It came with some huge compromises.”

Many factors contributed to the troubles of these companies, but Cappelli notes how “vociferously the investment community seems to object to being nice to employees. It’s a reminder that, in the corporate world, things are constantly yielding to the finance guys—whether they know what they’re doing or not.”

This fixation on short-term stock gains is inherently unstable, Cappelli said. “The interesting thing is always to ask them, ‘What’s the value proposition for employees? Why should these people work only for the interest of the shareholders? How are you going to get people to work hard?’ ” He went on, “I don’t think they have an answer.”

When I called a Juno driver named Salin Sarder to ask about the latest developments, he was surprised to learn that the Juno stock-grant program had been cancelled, and blamed his ignorance on the fact that he hadn’t checked his e-mail. (The company has not made a public statement and did not respond to my inquiries.) He was, on the other hand, pleased to learn that the new Juno-Gett would be honoring the favorable commission rate Juno had been offering, at least for a few months. He also had a few thoughts about the app-economy business model favored by Silicon Valley investors. “If you are a millionaire and all around you is poor, you have no safety,” Sarder, who comes from Bangladesh, said. “Happiness is there when everyone has happiness.” 

This article appears in other versions of the June 5 & 12, 2017, issue of The New Yorker, with the headline “No More Mr. Nice Guy.”

The Work You Do, The Person You Are by Toni Morrison


All I had to do for the two dollars was clean Her house for a few hours after school. It was a beautiful house, too, with a plastic-covered sofa and chairs, wall-to-wall blue-and-white carpeting, a white enamel stove, a washing machine and a dryer—things that were common in Her neighborhood, absent in mine. In the middle of the war, She had butter, sugar, steaks, and seam-up-the-back stockings.

I knew how to scrub floors on my knees and how to wash clothes in our zinc tub, but I had never seen a Hoover vacuum cleaner or an iron that wasn’t heated by fire.

Part of my pride in working for Her was earning money I could squander: on movies, candy, paddleballs, jacks, ice-cream cones. But a larger part of my pride was based on the fact that I gave half my wages to my mother, which meant that some of my earnings were used for real things—an insurance-policy payment or what was owed to the milkman or the iceman. The pleasure of being necessary to my parents was profound. I was not like the children in folktales: burdensome mouths to feed, nuisances to be corrected, problems so severe that they were abandoned to the forest. I had a status that doing routine chores in my house did not provide—and it earned me a slow smile, an approving nod from an adult. Confirmations that I was adultlike, not childlike.

In those days, the forties, children were not just loved or liked; they were needed. They could earn money; they could care for children younger than themselves; they could work the farm, take care of the herd, run errands, and much more. I suspect that children aren’t needed in that way now. They are loved, doted on, protected, and helped. Fine, and yet . . .

Little by little, I got better at cleaning Her house—good enough to be given more to do, much more. I was ordered to carry bookcases upstairs and, once, to move a piano from one side of a room to the other. I fell carrying the bookcases. And after pushing the piano my arms and legs hurt so badly. I wanted to refuse, or at least to complain, but I was afraid She would fire me, and I would lose the freedom the dollar gave me, as well as the standing I had at home—although both were slowly being eroded. She began to offer me her clothes, for a price. Impressed by these worn things, which looked simply gorgeous to a little girl who had only two dresses to wear to school, I bought a few. Until my mother asked me if I really wanted to work for castoffs. So I learned to say “No, thank you” to a faded sweater offered for a quarter of a week’s pay.

Still, I had trouble summoning the courage to discuss or object to the increasing demands She made. And I knew that if I told my mother how unhappy I was she would tell me to quit. Then one day, alone in the kitchen with my father, I let drop a few whines about the job. I gave him details, examples of what troubled me, yet although he listened intently, I saw no sympathy in his eyes. No “Oh, you poor little thing.” Perhaps he understood that what I wanted was a solution to the job, not an escape from it. In any case, he put down his cup of coffee and said, “Listen. You don’t live there. You live here. With your people. Go to work. Get your money. And come on home.”

That was what he said. This was what I heard:

1. Whatever the work is, do it well—not for the boss but for yourself.

2. You make the job; it doesn’t make you.

3. Your real life is with us, your family.

4. You are not the work you do; you are the person you are.

I have worked for all sorts of people since then, geniuses and morons, quick-witted and dull, bighearted and narrow. I’ve had many kinds of jobs, but since that conversation with my father I have never considered the level of labor to be the measure of myself, and I have never placed the security of a job above the value of home.