Saturday, November 28, 2009

November 15, 2009

In House, Many Spoke With One Voice: Lobbyists’

WASHINGTON — In the official record of the historic House debate on overhauling health care, the speeches of many lawmakers echo with similarities. Often, that was no accident.

Statements by more than a dozen lawmakers were ghostwritten, in whole or in part, by Washington lobbyists working for Genentech, one of the world’s largest biotechnology companies.

E-mail messages obtained by The New York Times show that the lobbyists drafted one statement for Democrats and another for Republicans.

The lobbyists, employed by Genentech and by two Washington law firms, were remarkably successful in getting the statements printed in the Congressional Record under the names of different members of Congress.

Genentech, a subsidiary of the Swiss drug giant Roche, estimates that 42 House members picked up some of its talking points — 22 Republicans and 20 Democrats, an unusual bipartisan coup for lobbyists.

In an interview, Representative Bill Pascrell Jr., Democrat of New Jersey, said: “I regret that the language was the same. I did not know it was.” He said he got his statement from his staff and “did not know where they got the information from.”

Members of Congress submit statements for publication in the Congressional Record all the time, often with a decorous request to “revise and extend my remarks.” It is unusual for so many revisions and extensions to match up word for word. It is even more unusual to find clear evidence that the statements originated with lobbyists.

The e-mail messages and their attached documents indicate that the statements were based on information supplied by Genentech employees to one of its lobbyists, Matthew L. Berzok, a lawyer at Ryan, MacKinnon, Vasapoli & Berzok who is identified as the “author” of the documents. The statements were disseminated by lobbyists at a big law firm, Sonnenschein Nath & Rosenthal.

In an e-mail message to fellow lobbyists on Nov. 5, two days before the House vote, Todd M. Weiss, senior managing director of Sonnenschein, said, “We are trying to secure as many House R’s and D’s to offer this/these statements for the record as humanly possible.”

He told the lobbyists to “conduct aggressive outreach to your contacts on the Hill to see if their bosses would offer the attached statements (or an edited version) for the record.”

In recent years, Genentech’s political action committee and lobbyists for Roche and Genentech have made campaign contributions to many House members, including some who filed statements in the Congressional Record. And company employees have been among the hosts at fund-raisers for some of those lawmakers. But Evan L. Morris, head of Genentech’s Washington office, said, “There was no connection between the contributions and the statements.”

Mr. Morris said Republicans and Democrats, concerned about the unemployment rate, were receptive to the company’s arguments about the need to keep research jobs in the United States.

The statements were not intended to change the bill, which was not open for much amendment during the debate. They were meant to show bipartisan support for certain provisions, even though the vote on passage generally followed party lines.

Democrats emphasized the bill’s potential to create jobs in health care, health information technology and clinical research on new drugs.

Republicans opposed the bill, but praised a provision that would give the Food and Drug Administration the authority to approve generic versions of expensive biotechnology drugs, along the lines favored by brand-name companies like Genentech.

Lawmakers from both parties said it was important to conduct research on such “biosimilar” products in the United States. Several took a swipe at aggressive Indian competitors.

Asked about the Congressional statements, a lobbyist close to Genentech said: “This happens all the time. There was nothing nefarious about it.”

In separate statements using language suggested by the lobbyists, Representatives Blaine Luetkemeyer of Missouri and Joe Wilson of South Carolina, both Republicans, said: “One of the reasons I have long supported the U.S. biotechnology industry is that it is a homegrown success story that has been an engine of job creation in this country. Unfortunately, many of the largest companies that would seek to enter the biosimilar market have made their money by outsourcing their research to foreign countries like India.”

In remarks on the House floor, Representative Phil Hare, Democrat of Illinois, recalled that his family had faced eviction when his father was sick and could not make payments on their home. He said the House bill would save others from such hardship.

In a written addendum in the Congressional Record, Mr. Hare said the bill would also create high-paying jobs. Timothy Schlittner, a spokesman for Mr. Hare, said: “That part of his statement was drafted for us by Roche pharmaceutical company. It is something he agrees with.”

The boilerplate in the Congressional Record included some conversational touches, as if actually delivered on the House floor.

In the standard Democratic statement, Representative Robert A. Brady of Pennsylvania said: “Let me repeat that for some of my friends on the other side of the aisle. This bill will create high-paying, high-quality jobs in health care delivery, technology and research in the United States.”

Mr. Brady’s chief of staff, Stanley V. White, said he had received the draft statement from a lobbyist for Genentech’s parent company, Roche.

“We were approached by the lobbyist, who asked if we would be willing to enter a statement in the Congressional Record,” Mr. White said. “I asked him for a draft. I tweaked a couple of words. There’s not much reason to reinvent the wheel on a Congressional Record entry.”

Some differences were just a matter of style. Representative Yvette D. Clarke, Democrat of New York, said, “I see this bill as an exciting opportunity to create the kind of jobs we so desperately need in this country, while at the same time improving the lives of all Americans.”

Representative Donald M. Payne, Democrat of New Jersey, used the same words, but said the bill would improve the lives of “ALL Americans.”

Mr. Payne and Mr. Brady said the bill would “create new opportunities and markets for our brightest technology minds.” Mr. Pascrell said the bill would “create new opportunities and markets for our brightest minds in technology.”

In nearly identical words, three Republicans — Representatives K. Michael Conaway of Texas, Lynn Jenkins of Kansas and Lee Terry of Nebraska — said they had criticized many provisions of the bill, and “rightfully so.”

But, each said, “I do believe the sections relating to the creation of a market for biosimilar products is one area of the bill that strikes the appropriate balance in providing lower cost options.”

http://www.nytimes.com/2009/11/15/us/politics/15health.html?partner=rss&emc=rss

Thursday, November 26, 2009

Mistrial Declared in Murder Trial

Judge declares mistrial in 2007 drive-by murder case
By TONY RIZZO, The Kansas City Star
“I knew him prior.”
Those four words uttered recently by a police detective in front of a jury prompted a Jackson County judge Wednesday to dismiss the murder case against a Kansas City man.
The dismissal “with prejudice” means that prosecutors cannot re-try Markus D. Lee for the 2007 drive-by killing of Eliseo Thomas. Assistant Public Defender Molly Hastings requested, and the judge ordered, that Lee be released Wednesday.
It was the third time this decade that Lee, 25, was charged with committing a murder and the third time he has avoided conviction.
Jackson County Prosecutor Jim Kanatzar could not be reached Wednesday about whether he will seek an appeal.
Kansas City police officials said they wanted to see the judge’s written order before commenting. The judge said he planned to have the written order available Monday.
In Wednesday’s oral ruling, Circuit Judge Robert M. Schieber said he believed that the detective’s comment during Lee’s trial earlier this month was an intentional effort to “goad” Lee’s attorneys into seeking a mistrial because the case wasn’t going well.
Because he considered the mistrial to be the result of governmental misconduct, Schieber ruled that trying Lee a second time would violate his constitutional protection against double jeopardy.
“It is with a great deal of angst that I do this,” Schieber said.
But the judge said that he had to hold law enforcement officers to the same rules and standards that attorneys must follow to ensure a “level playing field” in the courtroom.
“For me to not do that would render those rules meaningless,” he said.
Schieber said that if there was no sanction against such intentional misconduct then anytime a law enforcement officer felt a case “was going south” he could say something inappropriate and prompt a mistrial.
“I can’t allow that to happen,” Schieber said.
He noted that after he declared a mistrial in Lee’s case, the jurors and alternates told him that they would have voted unanimously for acquittal. They also told him that the detective’s statement about knowing Lee implied to them that he had been arrested previously.
That demonstrated that the comment was prejudicial, the judge said.
Lee, who has been in custody since shortly after the March 2007 incident, was charged along with two other men with killing Thomas and wounding three others during a drive-by shooting near 30th Street and Agnes Avenue. The shooting sparked a high-speed chase in which suspects fired shots at pursuing police officers.
The two other defendants are in custody pending their trials.
In 2006, a jury acquitted Lee on charges that he killed a man during a 2002 block party and later gunned down a witness to that crime. At trial, witnesses who had initially identified Lee changed their stories and said they didn’t witness the shootings.
His trial for allegedly killing Thomas began Nov. 9 and was close to wrapping up Nov. 12 when Detective Danny Phillips testified about collecting shell casings and obtaining a DNA sample from Lee after his arrest.
Phillips was being cross-examined by Hastings about when he collected the DNA sample when he added “I knew him prior.” Hastings moved for a mistrial, which Schieber granted.
She later filed the motion that Schieber ruled on Wednesday. Phillips could not be reached for comment after Wednesday’s ruling.
After Wednesday’s hearing, Hastings said she appreciated the judge holding police accountable.
“They are not exempt from following the rules,” she said.

Thursday, September 3, 2009

Obama vs. Education

Obama speech to students draws conservative ire
By LIBBY QUAID and LINDA STEWART BALL, Associated Press Writers
DALLAS – President Barack Obama's back-to-school address next week was supposed to be a feel-good story for an administration battered over its health care agenda. Now Republican critics are calling it an effort to foist a political agenda on children, creating yet another confrontation with the White House.
Obama plans to speak directly to students Tuesday about the need to work hard and stay in school. His address will be shown live on the White House Web site and on C-SPAN at noon EDT, a time when classrooms across the country will be able to tune in.
Schools don't have to show it. But districts across the country have been inundated with phone calls from parents and are struggling to address the controversy that broke out after Education Secretary Arne Duncan sent a letter to principals urging schools to watch.
Districts in states including Texas, Illinois, Minnesota, Missouri, Virginia and Wisconsin have decided not to show the speech to students. Others are still thinking it over or are letting parents have their kids opt out.
Some conservatives, driven by radio pundits and bloggers, are urging schools and parents to boycott the address. They say Obama is using the opportunity to promote a political agenda and is overstepping the boundaries of federal involvement in schools.
"As far as I am concerned, this is not civics education — it gives the appearance of creating a cult of personality," said Oklahoma state Sen. Steve Russell. "This is something you'd expect to see in North Korea or in Saddam Hussein's Iraq."
Arizona state schools superintendent Tom Horne, a Republican, said lesson plans for teachers created by Obama's Education Department "call for a worshipful rather than critical approach."
The White House plans to release the speech online Monday so parents can read it. Obama will deliver the speech at Wakefield High School in Arlington, Va.
"I think it's really unfortunate that politics has been brought into this," White House deputy policy director Heather Higginbottom said in an interview with The Associated Press.
"It's simply a plea to students to really take their learning seriously. Find out what they're good at. Set goals. And take the school year seriously."
She noted that President George H.W. Bush made a similar address to schools in 1991. Like Obama, Bush drew criticism, with Democrats accusing the Republican president of making the event into a campaign commercial.
Critics are particularly upset about lesson plans the administration created to accompany the speech. The lesson plans, available online, originally recommended having students "write letters to themselves about what they can do to help the president."
The White House revised the plans Wednesday to say students could "write letters to themselves about how they can achieve their short-term and long-term education goals."
"That was inartfully worded, and we corrected it," Higginbottom said.
In the Dallas suburb of Plano, Texas, the 54,000-student school district is not showing the 15- to 20-minute address but will make the video available later.
PTA council president Cara Mendelsohn said Obama is "cutting out the parent" by speaking to kids during school hours.
"Why can't a parent be watching this with their kid in the evening?" Mendelsohn said. "Because that's what makes a powerful statement, when a parent is sitting there saying, 'This is what I dream for you. This is what I want you to achieve.'"
Texas Gov. Rick Perry, a Republican, said in an interview with the AP that he's "certainly not going to advise anybody not to send their kids to school that day."
"Hearing the president speak is always a memorable moment," he said.
But he also said he understood where the criticism was coming from.
"Nobody seems to know what he's going to be talking about," Perry said. "Why didn't he spend more time talking to the local districts and superintendents, at least give them a heads-up about it?"
Several other Texas districts have decided not to show the speech, although the district in Houston is leaving the decision up to individual school principals. In suburban Houston, the Cypress-Fairbanks district planned to show the address and has had its social studies teachers assemble a curriculum and activities for students.
"If someone objected, we would not force them to listen to the speech," spokeswoman Kelli Durham said.
In Wisconsin, the Green Bay school district decided not to show the speech live and to let teachers decide individually whether to show it later.
In Florida, GOP chairman Jim Greer released a statement that he was "absolutely appalled that taxpayer dollars are being used to spread President Obama's socialist ideology."
Despite his rhetoric, two of the larger Florida districts, Miami-Dade and Hillsborough, plan to have classes watch the speech. Students whose parents object will not have to watch.
"We're extending the same courtesy to the president as we do with any elected official that wants to enter our schools," said Linda Cobbe, a Hillsborough schools spokeswoman. Cobbe said the district, which includes Tampa, has gotten calls from upset parents but said officials don't think the White House is trying to force politics on kids.
The Minnesota Association of School Administrators is recommending against disrupting the first day of school to show the speech, but Minnesota's biggest teachers' union is urging schools to show it.
Quincy, Ill., schools decided Thursday not to show the speech. Superintendent Lonny Lemon said phone calls "hit like a load of bricks" on Wednesday.
One Idaho school superintendent, Murray Dalgleish of Council, urged people not to rush to judgment.
"Is the president dictating to these kids? I don't think so," Dalgleish said. "He's trying to get out the same message we're trying to get out, which is, `You are in charge of your education.'"

Thursday, August 27, 2009

Madonna booed in Eastern Europe.

BUCHAREST, Romania - At first, fans politely applauded the Roma performers sharing a stage with Madonna. Then the pop star condemned widespread discrimination against Roma, or Gypsies — and the cheers gave way to jeers.

The sharp mood change that swept the crowd of 60,000, who had packed a park for Wednesday night's concert, underscores how prejudice against Gypsies remains deeply entrenched across Eastern Europe.

Despite long-standing efforts to stamp out rampant bias, human rights advocates say Roma probably suffer more humiliation and endure more discrimination than any other group on the continent.

Sometimes, it can be deadly: In neighboring Hungary, six Roma have been killed and several wounded in a recent series of apparently racially motivated attacks targeting small countryside villages predominantly settled by Gypsies.

"There is generally widespread resentment against Gypsies in Eastern Europe. They have historically been the underdog," Radu Motoc, an official with the Soros Foundation Romania, said Thursday.

Roma, or Gypsies, are a nomadic ethnic group believed to have their roots in the Indian subcontinent. They live mostly in southern and eastern Europe, but hundreds of thousands have migrated west over the past few decades in search of jobs and better living conditions.

Romania has the largest number of Roma in the region. Some say the population could be as high as 2 million, although official data put it at 500,000.

Until the 19th century, Romanian Gypsies were slaves, and they've gotten a mixed response ever since: While discrimination is widespread, many East Europeans are enthusiastic about Gypsy music and dance, which they embrace as part of the region's cultural heritage.

That explains why the Roma musicians and a dancer who had briefly joined Madonna onstage got enthusiastic applause. And it also may explain why some in the crowd turned on Madonna when she paused during the two-hour show — a stop on her worldwide "Sticky and Sweet" tour — to touch on their plight.

"It has been brought to my attention ... that there is a lot of discrimination against Romanies and Gypsies in general in Eastern Europe," she said. "It made me feel very sad."

Thousands booed and jeered her.

A few cheered when she added: "We don't believe in discrimination ... we believe in freedom and equal rights for everyone." But she got more boos when she mentioned discrimination against homosexuals and others.

"I jeered her because it seemed false what she was telling us. What business does she have telling us these things?" said Ionut Dinu, 23.

Madonna did not react and carried on with her concert, held near the hulking palace of the late communist dictator Nicolae Ceausescu.

Her publicist, Liz Rosenberg, said Madonna and others had told her there were cheers as well as jeers.

"Madonna has been touring with a phenomenal troupe of Roma musicians who made her aware of the discrimination toward them in several countries so she felt compelled to make a brief statement," Rosenberg said in an e-mail. "She will not be issuing a further statement."

One Roma musician said the attitude toward Gypsies is contradictory.

"Romanians watch Gypsy soap operas, they like Gypsy music and go to Gypsy concerts," said Damian Draghici, a Grammy Award-winner who has performed with James Brown and Joe Cocker.

"But there has been a wave of aggression against Roma people in Italy, Hungary and Romania, which shows me something is not OK," he told the AP in an interview. "The politicians have to do something about it. People have to be educated not to be prejudiced. All people are equal, and that is the message politicians must give."

Nearly one in two of Europe's estimated 12 million Roma claimed to have suffered an act of discrimination over the past 12 months, according to a recent report by the Vienna-based EU Fundamental Rights Agency. The group says Roma face "overt discrimination" in housing, health care and education.

Many do not have official identification, which means they cannot get social benefits, are undereducated and struggle to find decent jobs.

Roma children are more likely to drop out of school than their peers from other ethnic groups. Many Romanians label Gypsies as thieves, and many are outraged by those who beg or commit petty crimes in Western Europe, believing they spoil Romania's image abroad.

In May 2007, Romanian President Traian Basescu was heard to call a Romanian journalist a "stinky Gypsy" during a conversation with his wife. Romania's anti-discrimination board criticized Basescu, who later apologized.

Human rights activists say the attacks in Hungary, which began in July 2008, may be tied to that country's economic crisis and the rising popularity of far-right vigilantes angered by a rash of petty thefts and other so-called "Gypsy crime." Last week, police arrested four suspects in a nightclub in the eastern city of Debrecen.

Bulgaria, the Czech Republic and Slovakia also have been criticized for widespread bias against Roma.

Madonna's outrage touched a nerve in Romania, but it seems doubtful it will change anything, said the Soros Foundation's Motoc.

"Madonna is a pop star. She is not an expert on interethnic relations," he said.

___

AP Writers Alison Mutler in Bucharest, William J. Kole in Vienna and Nekesa Mumbi Moody in New York contributed to this report.

Wednesday, August 26, 2009

Difficulty in justifying detention
from SCOTUSblog by Lyle Denniston

A federal judge’s lengthy but heavily censored opinion released on Friday demonstrated anew the difficulty that the Pentagon and U.S. intelligence agencies are having in trying to justify in court the continued holding of some of the prisoners at Guantanamo Bay, Cuba. A prisoner with family links to terrorist leader Osama Bin Laden, with personal encounters with Bin Laden, with at least a brief round of training in an Al-Qaeda military camp, with close knowledge of some of bin Laden’s bodyguards, and with other alleged links to Al-Qaeda soldiers — all of that was not enough, singly or together, to justify the detention of a Yemeni national, Mohammed Al-Adahi, Senior U.S. District Judge Gladys Kessler ruled.

Last Monday, the judge released a one-page order finding his detention unlawful and ordering the government to take steps to arrange for his release, and to report back on Sept. 18 on what had been done to bring it about. On Friday, after clearance by a court intelligence-reviewing officer, her 42-page, redacted opinion explaining that ruling was made public. It can be downloaded here.

The Al-Adahi case is both typical and somewhat novel. It is like many other cases in which the government relies on intelligence reports based on what people said or said they observed, rather than on hard, physical facts: evidence that is sometimes second-hand statements of what others had said or claimed they saw; coincidences that seem to add up to a “mosaic” suggesting involvement with terrorist “jihad” or enemy soldiers; and information from other detainees, as well as interviews with the detainee himself.

The case is somewhat unusual because the evidence offered against Al-Adahi goes considerably beyond the detail that government intelligence has produced in other detainee cases. Judge Kessler, in fact, described the evidence assembled against Al-Adahi as appearing to be “sensational and compelling.” In the end, however, she found it did not add up to enough to meet the legal standard she adopted for U.S. military detention of terrorist suspects.

While her definition of who may be detained is not novel (she borrowed it from another District judge), it does demand more proof of terrorist links than the standard used by some of Kessler’s other colleagues. She requires proof, for example, that a detainee have been a part of a terrorist “organized armed forces,” not just an individual who may have provided some support for Al-Qaeda or other terrorist organizations. In that sense, the Pentagon’s burden of justifying detention is higher.

Between the lines of the Kessler opinion, it was clear that government officials believed they had a very strong case for keeping Al-Adahi imprisoned. While the kind of evidence was typical, the scope and detail of it, they clearly assumed, would be more convincing.

But that was not how this judge found it. The end result was that information that showed enough linkages to satisfy the standards for an intelligence report, or series of reports, was found to be wanting as legal evidence to support further detention.

Here is the way Judge Kessler summed up: “When all is said and done, this is the evidence we have in this case. Al-Adahi probably had several relatives who served as bodyguards for Usama Bin Laden and were deeply involved with and supportive of al-Qaida and its activities. One of those relatives became his brother-in-law by virtue of marriage to his sister. … Al-Adahi accompanied his sister to Afghanistan so that she could be with her husband. … The wedding celebration was held in Bin Laden’s compound and many of his associates attended.

“At that celebration, [Al-Adahi] was introduced to Bin Laden, with whom he had a very brief conversation. Several days later, [he] had a five-to-ten-minute conversation with Bin Laden. Thereafter, [he] stayed at an al-Qaida guesthouse for one night and attended the Al Farouq [Al-Qaeda] training camp for seven to ten days. He was expelled from Al Farouq for failure to obey the rules. This training represents the strongest basis that the government has for detaining Al-Adahi.”

But, the judge said at that point, those assertions “simply do not bring him within the ambit of the Executive’s power to detain.”

She then went on: “After his expulsion [from Al Farouq], Al-Adahi returned to the home of his sister and brother-in-law for several weeks and then traveled to other places in Afghanistan because he had no other obligations. Like many thousands of people, he sought to flee Afghanistan when it was bombed shortly after September 11, 2001.”

On the other side of the case, Kessler concluded, “There is no reliable evidence in the record that [he] was a trainer at Al Farouq, that he ever fought for al-Qaida and/or the Taliban, or that he affirmatively provided any actual support to al-Qaida and/or the Taliban. There is no reliable evidence in the record that [he] was a member of al-Qaida and/or the Taliban.”

Thus, she concluded, “while it is tempting to be swayed” by Al-Adahi’s admitted meetings with Bin Laden and his link by marriage to Bin Laden bodyguards who were “enthusiastic followers of Bin Laden,” that evidence “does not constitute actual, reliable evidence that would justify the Government’s detention of this man.”

Kessler ordered government officials to use diplomatic measures to find a way to release him to another country. She indicated, as other judges have, that she could not order his outright release, because the D.C. Circuit Court — in a ruling facing an early challenge in the Supreme Court (Kiyemba v. Obama) — has denied that authority to federal judges handling Guantanamo cases.

Supreme Court facing the issue of Torture

Tracking new cases: Torture case returns
from SCOTUSblog by Lyle Denniston

NOTE: From time to time, the blog will examine significant new cases as they are filed at the Supreme Court. This post is one in that series. Some of these cases very likely will appear later in the blog’s Petitions to Watch feature when the Court is ready to consider them.

———————-

UPDATE: The case has been docketed as 09-227.

Lawyers for four Britons who formerly were held at Guantanamo Bay returned to the Supreme Court on Monday, seeking the first ruling by the Justices on claims of torture of terrorism suspects by U.S. agents. The new petition asked the Court to rule that the Constitution protects those held by the U.S. military or intelligence agencies from being tortured or abused, and to declare that a federal law protects them from discrimination based on their Muslim religion. They are challenging an April ruling by the D.C. Circuit Court, rejecting for a second time their constitutional and legal challenges. The case is Rasul, et al., v. Myers, et al., not yet assigned a docket number.

The Circuit Court, after once ruling that detainees had no constitutional rights, avoided any new ruling on the Britons’ constitutional claims after the case had been sent back to it by the Supreme Court last December. The Justices told the lower court to reconsider its prior decision in the wake of the Court’s 2008 ruling in Boumediene v. Bush, establishing a constitutional right for Guantanamo prisoners to challenge their captivity.

Instead, the Circuit Court panel in April responded by throwing out the case this time based on a finding of qualified immunity for former Defense Secretary Donald Rumsfeld and ten senior military officers sued in the case. The panel said it was taking an option created by the Supreme Court in another case last January to bypass a constitutional ruling and instead focus on officials’ immunity claim.

Whatever rights detainees might have as a result of the Boumediene ruling, that ruling came four years after the Britons had been released from Guantanamo, the Circuit Court concluded. It also renewed its earlier ruling that the Britons could not claim religious bias under the federal Religious Freedom Restoration Act, because they were not “persons” within the Act’s meaning.

The new Circuit Court decision, the Britons claimed on Monday, was a “manifest refusal to abide by this Court’s mandate and give due effect to Boumediene on the constitutional issues raised in this case.” The Justices, they argued, must “affirm the Court’s authority and compel an inferior court to abide by its mandate.”

The petition, though, added that there were “even more compelling issues which demand this Court’s attention.” Those, it said, were “whether detainees imprisoned by the United States at Guantanamo have a right to be free from abuse and humiliation in the practice of their religion, whether Guantanamo detainees have a constitutional right to be free from torture, and whether public officials who knowingly violate these rights can escape accountability for their conduct by raising the shield of qualified immunity when they cannot assert this defense in good faith.”

“Torture and religious humiliation of Muslim detainees at Guantanamo stands as a uniquely shameful episode in our history,” the Britons said. “This petition enables the Court to remedy that stain on the moral authority of our nation and its laws, to overrule an obdurately insupportable exercise in statutory construction that effectively renders these [detainees], and other detainees at Guantanamo, non-persons, and to facilitate accountability for these terrible acts.”

The petition argued that the federal government had chosen the detention site at Guantanamo Bay “in a cynical attempt to avoid accountability for conduct that had long been held unconstitutional when it occurred in U.S. prisons. But Guantanamo is not a Hobbesian enclave where [officials] could violate clear prohibitions on their conduct imposed by statute and regulations and then point to a purported constitutional void as a basis for immunity.”

The government officials sued in the case will have an opportunity to respond before the Justices act on the new appeal. Previously, the Obama Administration had opposed the Britons’ lawsuit when the case was being considered anew by the Circuit Court following the case’s return from the Supreme Court.

By coincidence, the new torture case reached the Supreme Court on the same day that Attorney General Eric Holder, Jr., announced that he was ordering a preliminary inquiry “into whether federal laws were violated in connection with the interrogation of specific detainees at overseas locations.” He said it was too soon to say whether there would actually be any prosecutions resulting from the probe. It is unclear whether that investigation will cover claims of abuse at Guantanamo, as opposed to Central Intelligence Agency “black sites” in other countries.

Tuesday, August 18, 2009

Gender differences in the workplace...

He Said, She Said: Communicating between Genders at Work

By Beth Banks Cohn, Ph.D. and Roz Usheroff, co-authors of "Taking the Leap: Managing Your Career in Turbulent Times ... and Beyond"


If you think it's difficult to discern what your significant other is really trying to say to you, consider how complicated communication between genders gets when you throw office politics, power struggles, and work challenges into the mix.

At work, men and women use strategies in communicating with each other that the opposite sex may view negatively. Often, misunderstandings can be avoided when co-workers look beyond personalities and consider the different ways men and women communicate.

Let's look at a few examples from both sides.

Men's Behavior
Trash talk
Men use negative banter, joking, teasing and playful putdowns as a way to subtly keep themselves at the top of the power hierarchy. Such "trash talking" is a common component of male relating.

What women think: Making others feel small is decidedly not a female trait. Women tend to see putdowns as arrogant or hostile.

The middle ground: Trash talk is usually harmless, as long as both parties "play." When both parties engage in it, it can even be a way to bond around a problem, such as a trying work assignment or demanding sales quotas.

Prideful self-sufficiency
You've heard the jokes about men not asking for directions? In work settings, males sometimes ask few questions, fearing that doing so will communicate to others that they don't know something. Males tend to equate knowledge with power and don't want to diminish their image by showing they lack the necessary know-how.

What women think: Women see this behavior as childish and even arrogant. They also look at it as a giant waste of time, figuring it is more time-effective to ask a question, get the answer and move on.

The middle ground: Some workplace cultures discourage questions, and indeed make people feel self-conscious about asking too many. In meetings or other settings where everyone needs to be on the same page in order to develop the best strategy, both genders need to find ways to get and give clarification.

Not giving feedback
Because men don't solicit feedback, good or bad, they also don't give feedback in return. Males don't want to be criticized, feel that compliments make someone less effective, and think women who seek feedback are "needy" and "high maintenance."

What women think: Women think men don't value their contributions and are overly critical. They may even feel that men withhold positive feedback in order to avoid giving women promotions or good projects.

The middle ground: Constructive feedback should be built into the workplace culture. Both genders need to find a way to make it a tool for improving performance and productivity.

Women's behavior
Equality-minded
Women try to maintain an appearance of equality amongst everyone. They are concerned with the effect of the exchange on the other person and want to make sure everyone feels like a worthy contributor.

What men think: Men tend to see this as a sign that women lack confidence and competence as leaders. They feel it makes women look weak.

The middle ground: Females can wield an enormous amount of power by orchestrating collaboration and enlisting cooperation between many parties. Men can learn from this. Nevertheless, women in leadership positions need to maintain a clear boundary between their authority and that of others.

Outside-in negotiating
Females want to see the full picture and make sure everyone's on the same page with the same level of understanding before making a decision.

What men think: Since this is the exact opposite of what men typically do, men think this tactic means women don't have a clear position or aren't decisive enough.

The middle ground: In negotiations, it's imperative to know all the factors involved before making a decision. On the other hand, trying to make everyone happy is not how leaders make good decisions. A balanced blend of female thoroughness and male decisiveness is ideal.

Likely to downplay certainty
Women don't want to appear pushy or uncaring of others' positions or ideas.

What men think: Men think, therefore, that women aren't certain and need someone to take charge.

The middle ground: Moderate self-deprecation and humility are good qualities in leaders. But always deferring to others' opinions and perspectives will be perceived as a sign of weakness. Find a middle way.

When it comes to communicating between genders in the workplace, the cardinal rule is this: Don't judge. Instead, try to look carefully at your co-worker's behavior, consider that some of it may be gender based, and try to gain insight on how this behavior serves or does not serve his or her objectives. If you want to step in and give support, do it from a position of understanding.

Thursday, January 29, 2009

Should the Filibuster be put to Rest?

The filibuster is obstructive, anachronistic, and undemocratic. It's time to kill it off for good.

by Matthew Yglesias
The Silenced Majority

In March 2005, Senator Harry Reid, the leader of the Democratic Party’s then-minority in the Senate, engaged in some legislative brinkmanship. If the Republicans went through with a dastardly plan they had devised, he warned, “the majority should not expect to receive cooperation from the minority in the conduct of Senate business … even on routine matters.” Senator Ted Kennedy hailed Reid’s stand and called on Republicans to “obey the rule of law and abandon their reckless threat to use the ‘nuclear option.’”

What was the outrageous threat that Democrats were so eager to block? Some nefarious Patriot Act provision? A bill authorizing torture, or secret surveillance? No. The Republicans, as you may recall, wanted to change the Senate rules to prevent Democrats from blocking judicial nominees by using the filibuster, a parliamentary procedure in which a minority of senators can endlessly extend debate to prevent an issue from being voted on. Eventually, a group of legislators known as the “Gang of 14”—seven Democrats and seven Republicans—struck a deal on the nominations, thus saving the filibuster and forestalling any changes to the Senate rules, and the dispute ended.

But Democrats were right to look on the nuclear option skeptically, and not because the proposed change was “reckless.” Rather, it didn’t go far enough. Every word the Republicans said about the nominees’ deserving an up-or-down vote was perfectly true—and their argument applies not just to judicial nominees, but to every other case in which the filibuster subverts the will of the majority.

Democrats no doubt see that more clearly today. Since 2006, when they won majorities in both the House and the Senate, their approval ratings have plummeted, in large part because moderates and liberals have noticed their inability to get much of anything done. House Speaker Nancy Pelosi tried to blame “the obstructionism of the Republicans,” but realistically, one can hardly blame Senate Republicans for obstructing legislation they oppose. The fault lies not with the obstructionists, but with the procedural rule that facilitates obstruction. In short, with the filibuster—a dubious tradition that encourages senators to act as spoilers rather than legislators, and that has locked the political system into semipermanent paralysis by ensuring that important decisions are endlessly deferred. It should be done away with.

Back in 2005, Senate Democrats seeking to block the GOP majority portrayed the filibuster as a pillar of America’s democratic tradition. In fact, it’s no such thing. The original rules of the Senate allowed a simple majority of legislators to make a motion to end debate. In 1806, at the recommendation of Aaron Burr, those rules were amended to allow for unlimited argument—not to create a countermajoritarian check on legislation, but because the motion had been so rarely invoked that it “could not be necessary.” This decision paved the way for the modern filibuster. But no one actually attempted to use it until 1837, when a minority bloc of Whig senators prolonged debate to prevent Andrew Jackson’s allies from expunging a resolution of censure against him. The unlimited-debate rule eventually became so cumbersome that senators made attempts at reform in 1850, 1873, 1883, and 1890, all unsuccessful. Finally, in 1917, the Senate adopted a rule allowing a two-thirds supermajority to cut off debate.

Under this rule, in the years that followed, segregationists mounted a series of filibusters meant to block civil-rights legislation. In 1922, the mere threat of the procedure was enough to torpedo a bill to prevent lynchings. In 1946, a filibuster undermined a bill by Senator Dennis Chavez of New Mexico intended to block workplace discrimination. Strom Thurmond set the record for longest individual filibuster—at more than 24 hours—in an ultimately unsuccessful attempt to block the relatively mild Civil Rights Act of 1957. And the landmark Civil Rights Act of 1964 secured a filibuster-proof majority only after 57 days of debate and substantial watering down.

By 1975, the Senate was finally prepared for reform. But rather than eliminate the filibuster entirely and return to majority rule, the members merely diluted it, reducing the number of votes required to end debate from 67 to 60.

Since then, filibustering has only grown more frequent. In the 1960s, no Congress had more than seven filibusters. In the early 1990s, the 102nd Congress witnessed 47, more than had occurred throughout the entire 19th century. And that was not an especially filibuster-prone Congress—each subsequent one has seen progressively more. The 110th Congress, which just ended, featured 137.

The minority party of the day will inevitably defend such obstruction as a crucial bulwark of liberty. During the judicial-confirmations fight, the liberal Interfaith Alliance warned that a filibuster-free Senate “would leave the majority with the power to reign with absolute tyranny.” But the risk of one-party rule shouldn’t be exaggerated. Majority voting works fine for democracies around the world, and the need for legislation to pass through two separately elected houses of Congress and be signed into law by the president still gives our government more chances to veto objectionable bills than most other countries allow for.

In recent decades, periods of one-party rule have been rare and brief. The only circumstances under which party-line legislation is even a theoretical possibility for any length of time would be when the country feels that the party in power is doing a decent job. And that, one would think, is exactly the sort of situation in which an extended period of one-party rule might be deemed unobjectionable. The filibuster is hardly the only impediment to legislative change, but it’s the one least justified by our Constitution and least supported by our values. And eliminating it would drastically reduce excuses for inaction—the one thing Congress has produced in abundance in recent years.
http://www.theatlantic.com/doc/200812u/filibuster

Stimulus Package

House OKs $819B stimulus bill with GOP opposition
By LIZ SIDOTI, Associated Press Writer, Thu Jan 29, 2:35 am ET

WASHINGTON – In a swift victory for President Barack Obama, the Democratic-controlled House approved a historically huge $819 billion stimulus bill Wednesday night with spending increases and tax cuts at the heart of the young administration's plan to revive a badly ailing economy. The vote was 244-188, with Republicans unanimous in opposition despite Obama's frequent pleas for bipartisan support.

"This recovery plan will save or create more than three million new jobs over the next few years," the president said in a written statement released moments after the House voted. Still later, he welcomed congressional leaders of both parties to the White House for drinks as he continued to lobby for the legislation.

Earlier, Obama declared, "We don't have a moment to spare" as congressional allies hastened to do his bidding in the face of the worst economic crisis since the Great Depression.

The vote sent the bill to the Senate, where debate could begin as early as Monday on a companion measure already taking shape. Democratic leaders have pledged to have legislation ready for Obama's signature by mid-February.

A mere eight days after Inauguration Day, Speaker Nancy Pelosi said the events heralded a new era. "The ship of state is difficult to turn," said the California Democrat. "But that is what we must do. That is what President Obama called us to do in his inaugural address."

With unemployment at its highest level in a quarter-century, the banking industry wobbling despite the infusion of staggering sums of bailout money and states struggling with budget crises, Democrats said the legislation was desperately needed.

"Another week that we delay is another 100,000 or more people unemployed. I don't think we want that on our consciences," said Rep. David Obey, D-Wis., chairman of the House Appropriations Committee and one of the leading architects of the legislation.

Republicans said the bill was short on tax cuts and contained too much spending, much of it wasteful, and would fall far short of the administration's predictions of job creation.

The party's leader, Rep. John Boehner of Ohio, said the measure "won't create many jobs, but it will create plenty of programs and projects through slow-moving government spending." A GOP alternative, made up almost entirely of tax cuts, was defeated, 266-170.

On the final vote, the legislation drew the support of all but 11 Democrats, while all Republicans opposed it.

The White House-backed legislation includes an estimated $544 billion in federal spending and $275 billion in tax cuts for individuals and businesses. The totals remained in flux nearly until the final vote, due to official re-estimates and a last-minute addition of $3 billion for mass transit.

Included is money for traditional job-creating programs such as highway construction and mass transit projects. But the measure tickets far more for unemployment benefits, health care and food stamp increases designed to aid victims of the worst economic downturn since the Great Depression of the 1930s.

Tens of billions of additional dollars would go to the states, which confront the prospect of deep budget cuts of their own. That money marks an attempt to ease the recession's impact on schools and law enforcement. With funding for housing weatherization and other provisions, the bill also makes a down payment on Obama's campaign promise of creating jobs that can reduce the nation's dependence on foreign oil.

The centerpiece tax cut calls for a $500 break for single workers and $1,000 for couples, including those who don't earn enough to owe federal income taxes.

The House vote marked merely the first of several major milestones for the legislation, which Democratic leaders have pledged to deliver to the White House for Obama's signature by mid-February.

Already a more bipartisan — and costlier — measure is taking shape in the Senate, and Obama personally pledged to House and Senate Republicans in closed-door meetings on Tuesday that he is ready to accept modifications as the legislation advances.

Rahm Emanuel, a former Illinois congressman who is Obama's chief of staff, invited nearly a dozen House Republicans to the White House late Tuesday for what one participant said was a soft sales job.

This lawmaker quoted Emanuel as telling the group that polling shows roughly 80 percent support for the legislation, and that Republicans oppose it at their political peril. The lawmaker spoke on condition of anonymity, saying there was no agreement to speak publicly about the session.

In fact, though, many Republicans in the House are virtually immune from Democratic challenges because of the makeup of their districts, and have more to fear from GOP primary challenges in 2010. As a result, they have relatively little political incentive to break with conservative orthodoxy and support hundreds of billions in new federal spending.

Also, some Republican lawmakers have said in recent days they know they will have a second chance to support a bill when the final House-Senate compromise emerges in a few weeks.

Rep. Randy Neugebauer, R-Texas, sought to strip out all the spending from the legislation before final passage, arguing that the entire cost of the bill would merely add to soaring federal deficits. "Where are we going to get the money?" he asked, but his attempt failed overwhelmingly, 302-134.

Obey had a ready retort. "They don't look like Herbert Hoover, I guess, but there are an awful lot of people in this chamber who think like Herbert Hoover," he said, referring to the president whose term is forever linked in history with the Great Depression.

___

Associated Press writers Andrew Taylor, Liz Sidoti and Ben Feller contributed to this story.

http://news.yahoo.com/s/ap/20090129/ap_on_go_co/obama_economy_208/print

In the Valley of Elah? A clash of norms



January 24, 2009
Associated Press
FORT BRAGG, N.C. - A Soldier found dead last summer complained about the price of beer and got in a fight at a bar before seven members of his own unit punched, choked and restrained him, a paratrooper testified at a hearing Friday.
Sgt. Mitchell Lafortune testified during an Article 32 hearing, similar to a civilian grand jury proceeding, for five of seven Soldiers charged with involuntary manslaughter in Pfc. Luke Brown's July death. The other two are scheduled to appear Feb. 27. The division commander will decide whether to convene a formal trial, or court-martial.
Defense attorney Todd Connormon, who represents 24-year-old Spc. Charles B. DeLong, one of those charged, called the situation "a tragedy," and said the Soldiers were trying to take care of a friend.
"I'm hoping this doesn't go to court," Connormon said. "I don't think it should."
Lafortune's testimony was the first public account of the night Brown died.
He said he saw the Soldiers "aggressively assault" Brown in a patch of woods after the group left a Fayetteville bar called the Ugly Stick early July 20. When the men drove him back to the barracks on Fort Bragg, Lafortune said he thought Brown was dead because he was pale and his eyes were closed.
"I should have done something to make sure he was OK," said Lafortune, who has not been charged and testified that he did not participate in choking Brown. "I should have been smart enough to walk out of the woods and at least call Fayetteville (police). It's something I regret to this day."
Lafortune said Brown, 27, an intelligence officer from Fredericksburg, Va., was drinking and socializing at the bar but seemed in a bad mood, complaining about the price of beer. Brown got into an argument with a Soldier from another unit, grabbed the man's beer and drank it.
When the group left, a Soldier found Brown in a patch of woods behind the bar. Lafortune said he heard a commotion and saw Brown being choked and punched. He said the Soldiers were trying to get Brown, who weighed 250 pounds, to pass out so they could move him.
The group carried Brown to the edge of the woods and bound his hands with a zip tie when he began to wake up. Then they put Brown in a vehicle and drove back to the barracks. Lafortune said he heard one of the other Soldiers say, "You've got to breathe, Brown, breathe."
They cut the zip ties off of his wrists and started CPR. Shortly after, an ambulance and military police arrived.
Chief Warrant Officer James Lyonais, called as a character witness for 28-year-old Sgt. Justin A. Boyle, discussed guidance he'd received in the past on safely getting a drunk Soldier home.
He said it was common to be told "it doesn't matter how you get them home. You knock them out, you bring them home and we'll deal with it later."
A prosecutor then asked if it was appropriate to "kick, punch, choke to unconsciousness and zip tie" a paratrooper if Soldiers needed to get him back to the base.
"If you were trying to save a Soldier from trouble with the law downtown ... it is acceptable," Lyonais said. "I don't think the answer is to physically harm people."
Navy Cmdr. Carol Solomon, a pathologist at the Washington-based Armed Forces Institute of Pathology, testified later Friday that choking a person to unconsciousness can cause a fatal brain injury. She said injuries on Brown's neck were consistent with choking.
"I believe their actions were involved in causing Pfc. Brown's death," she said of the accused.
Solomon had originally ruled the cause of Brown's death undetermined because she was concerned he may have had an enlarged heart. She said she changed her opinion after determining his heart was normal.
The Soldiers charged are DeLong, of Dade City, Fla.; Boyle, of Rocky Point, N.Y.; Sgt. Christopher Mignocchi, 22, of Hollywood, Fla.; Sgt. Kyle G. Saltz, 25, of Richland, Wash.; Spc. Ryan Sullivan, 23, of Mount Laurel, N.J.; Spc. Joseph A. Misuraca, 22, of Harper Woods, Mich.; and Pfc. Andrey Udalov, 21, of Brooklyn, N.Y.
The seven men are assigned to the 82nd Airborne Division's Headquarters and Headquarters Company, which was Brown's unit. The involuntary manslaughter charges carry a maximum 10-year prison sentence. Some of the Soldiers also face other charges.
http://www.military.com/news/article/dead-soldier-was-punched-choked.html?col=1186032325324&ESRC=army-a.nl

End of White America? Threatening?

State of the Union January/February 2009

The Election of Barack Obama is just the most startling manifestation of a larger trend: the gradual erosion of “whiteness” as the touchstone of what it means to be American. If the end of white America is a cultural and demographic inevitability, what will the new mainstream look like—and how will white Americans fit into it? What will it mean to be white when whiteness is no longer the norm? And will a post-white America be less racially divided—or more so?

by Hua Hsu

The End of White America?

Illustrations By Felix Sockwell

“Civilization’s going to pieces,” he remarks. He is in polite company, gathered with friends around a bottle of wine in the late-afternoon sun, chatting and gossiping. “I’ve gotten to be a terrible pessimist about things. Have you read The Rise of the Colored Empires by this man Goddard?” They hadn’t. “Well, it’s a fine book, and everybody ought to read it. The idea is if we don’t look out the white race will be—will be utterly submerged. It’s all scientific stuff; it’s been proved.”

He is Tom Buchanan, a character in F. Scott Fitzgerald’s The Great Gatsby, a book that nearly everyone who passes through the American education system is compelled to read at least once. Although Gatsby doesn’t gloss as a book on racial anxiety—it’s too busy exploring a different set of anxieties entirely—Buchanan was hardly alone in feeling besieged. The book by “this man Goddard” had a real-world analogue: Lothrop Stoddard’s The Rising Tide of Color Against White World-Supremacy, published in 1920, five years before Gatsby. Nine decades later, Stoddard’s polemic remains oddly engrossing. He refers to World War I as the “White Civil War” and laments the “cycle of ruin” that may result if the “white world” continues its infighting. The book features a series of foldout maps depicting the distribution of “color” throughout the world and warns, “Colored migration is a universal peril, menacing every part of the white world.”

As briefs for racial supremacy go, The Rising Tide of Color is eerily serene. Its tone is scholarly and gentlemanly, its hatred rationalized and, in Buchanan’s term, “scientific.” And the book was hardly a fringe phenomenon. It was published by Scribner, also Fitzgerald’s publisher, and Stoddard, who received a doctorate in history from Harvard, was a member of many professional academic associations. It was precisely the kind of book that a 1920s man of Buchanan’s profile—wealthy, Ivy League–educated, at once pretentious and intellectually insecure—might have been expected to bring up in casual conversation.

As white men of comfort and privilege living in an age of limited social mobility, of course, Stoddard and the Buchanans in his audience had nothing literal to fear. Their sense of dread hovered somewhere above the concerns of everyday life. It was linked less to any immediate danger to their class’s political and cultural power than to the perceived fraying of the fixed, monolithic identity of whiteness that sewed together the fortunes of the fair-skinned.

From the hysteria over Eastern European immigration to the vibrant cultural miscegenation of the Harlem Renaissance, it is easy to see how this imagined worldwide white kinship might have seemed imperiled in the 1920s. There’s no better example of the era’s insecurities than the 1923 Supreme Court case United States v. Bhagat Singh Thind, in which an Indian American veteran of World War I sought to become a naturalized citizen by proving that he was Caucasian. The Court considered new anthropological studies that expanded the definition of the Caucasian race to include Indians, and the justices even agreed that traces of “Aryan blood” coursed through Thind’s body. But these technicalities availed him little. The Court determined that Thind was not white “in accordance with the understanding of the common man” and therefore could be excluded from the “statutory category” of whiteness. Put another way: Thind was white, in that he was Caucasian and even Aryan. But he was not white in the way Stoddard or Buchanan were white.

The ’20s debate over the definition of whiteness—a legal category? a commonsense understanding? a worldwide civilization?—took place in a society gripped by an acute sense of racial paranoia, and it is easy to regard these episodes as evidence of how far we have come. But consider that these anxieties surfaced when whiteness was synonymous with the American mainstream, when threats to its status were largely imaginary. What happens once this is no longer the case—when the fears of Lothrop Stoddard and Tom Buchanan are realized, and white people actually become an American minority?

Whether you describe it as the dawning of a post-racial age or just the end of white America, we’re approaching a profound demographic tipping point. According to an August 2008 report by the U.S. Census Bureau, those groups currently categorized as racial minorities—blacks and Hispanics, East Asians and South Asians—will account for a majority of the U.S. population by the year 2042. Among Americans under the age of 18, this shift is projected to take place in 2023, which means that every child born in the United States from here on out will belong to the first post-white generation.

Obviously, steadily ascending rates of interracial marriage complicate this picture, pointing toward what Michael Lind has described as the “beiging” of America. And it’s possible that “beige Americans” will self-identify as “white” in sufficient numbers to push the tipping point further into the future than the Census Bureau projects. But even if they do, whiteness will be a label adopted out of convenience and even indifference, rather than aspiration and necessity. For an earlier generation of minorities and immigrants, to be recognized as a “white American,” whether you were an Italian or a Pole or a Hungarian, was to enter the mainstream of American life; to be recognized as something else, as the Thind case suggests, was to be permanently excluded. As Bill Imada, head of the IW Group, a prominent Asian American communications and marketing company, puts it: “I think in the 1920s, 1930s, and 1940s, [for] anyone who immigrated, the aspiration was to blend in and be as American as possible so that white America wouldn’t be intimidated by them. They wanted to imitate white America as much as possible: learn English, go to church, go to the same schools.”

Today, the picture is far more complex. To take the most obvious example, whiteness is no longer a precondition for entry into the highest levels of public office. The son of Indian immigrants doesn’t have to become “white” in order to be elected governor of Louisiana. A half-Kenyan, half-Kansan politician can self-identify as black and be elected president of the United States.

As a purely demographic matter, then, the “white America” that Lothrop Stoddard believed in so fervently may cease to exist in 2040, 2050, or 2060, or later still. But where the culture is concerned, it’s already all but finished. Instead of the long-standing model of assimilation toward a common center, the culture is being remade in the image of white America’s multiethnic, multicolored heirs.

For some, the disappearance of this centrifugal core heralds a future rich with promise. In 1998, President Bill Clinton, in a now-famous address to students at Portland State University, remarked:

Today, largely because of immigration, there is no majority race in Hawaii or Houston or New York City. Within five years, there will be no majority race in our largest state, California. In a little more than 50 years, there will be no majority race in the United States. No other nation in history has gone through demographic change of this magnitude in so short a time ... [These immigrants] are energizing our culture and broadening our vision of the world. They are renewing our most basic values and reminding us all of what it truly means to be American.

Not everyone was so enthused. Clinton’s remarks caught the attention of another anxious Buchanan—Pat Buchanan, the conservative thinker. Revisiting the president’s speech in his 2001 book, The Death of the West, Buchanan wrote: “Mr. Clinton assured us that it will be a better America when we are all minorities and realize true ‘diversity.’ Well, those students [at Portland State] are going to find out, for they will spend their golden years in a Third World America.”

Today, the arrival of what Buchanan derided as “Third World America” is all but inevitable. What will the new mainstream of America look like, and what ideas or values might it rally around? What will it mean to be white after “whiteness” no longer defines the mainstream? Will anyone mourn the end of white America? Will anyone try to preserve it?


Another moment from The Great Gatsby: as Fitzgerald’s narrator and Gatsby drive across the Queensboro Bridge into Manhattan, a car passes them, and Nick Carraway notices that it is a limousine “driven by a white chauffeur, in which sat three modish negroes, two bucks and a girl.” The novelty of this topsy-turvy arrangement inspires Carraway to laugh aloud and think to himself, “Anything can happen now that we’ve slid over this bridge, anything at all …”

For a contemporary embodiment of the upheaval that this scene portended, consider Sean Combs, a hip-hop mogul and one of the most famous African Americans on the planet. Combs grew up during hip-hop’s late-1970s rise, and he belongs to the first generation that could safely make a living working in the industry—as a plucky young promoter and record-label intern in the late 1980s and early 1990s, and as a fashion designer, artist, and music executive worth hundreds of millions of dollars a brief decade later.

In the late 1990s, Combs made a fascinating gesture toward New York’s high society. He announced his arrival into the circles of the rich and powerful not by crashing their parties, but by inviting them into his own spectacularly over-the-top world. Combs began to stage elaborate annual parties in the Hamptons, not far from where Fitzgerald’s novel takes place. These “white parties”—attendees are required to wear white—quickly became legendary for their opulence (in 2004, Combs showcased a 1776 copy of the Declaration of Independence) as well as for the cultures-colliding quality of Hamptons elites paying their respects to someone so comfortably nouveau riche. Prospective business partners angled to get close to him and praised him as a guru of the lucrative “urban” market, while grateful partygoers hailed him as a modern-day Gatsby.

“Have I read The Great Gatsby?” Combs said to a London newspaper in 2001. “I am the Great Gatsby.”

Yet whereas Gatsby felt pressure to hide his status as an arriviste, Combs celebrated his position as an outsider-insider—someone who appropriates elements of the culture he seeks to join without attempting to assimilate outright. In a sense, Combs was imitating the old WASP establishment; in another sense, he was subtly provoking it, by over-enunciating its formality and never letting his guests forget that there was something slightly off about his presence. There’s a silent power to throwing parties where the best-dressed man in the room is also the one whose public profile once consisted primarily of dancing in the background of Biggie Smalls videos. (“No one would ever expect a young black man to be coming to a party with the Declaration of Independence, but I got it, and it’s coming with me,” Combs joked at his 2004 party, as he made the rounds with the document, promising not to spill champagne on it.)

In this regard, Combs is both a product and a hero of the new cultural mainstream, which prizes diversity above all else, and whose ultimate goal is some vague notion of racial transcendence, rather than subversion or assimilation. Although Combs’s vision is far from representative—not many hip-hop stars vacation in St. Tropez with a parasol-toting manservant shading their every step—his industry lies at the heart of this new mainstream. Over the past 30 years, few changes in American culture have been as significant as the rise of hip-hop. The genre has radically reshaped the way we listen to and consume music, first by opposing the pop mainstream and then by becoming it. From its constant sampling of past styles and eras—old records, fashions, slang, anything—to its mythologization of the self-made black antihero, hip-hop is more than a musical genre: it’s a philosophy, a political statement, a way of approaching and remaking culture. It’s a lingua franca not just among kids in America, but also among young people worldwide. And its economic impact extends beyond the music industry, to fashion, advertising, and film. (Consider the producer Russell Simmons—the ur-Combs and a music, fashion, and television mogul—or the rapper 50 Cent, who has parlayed his rags-to-riches story line into extracurricular successes that include a clothing line; book, video-game, and film deals; and a startlingly lucrative partnership with the makers of Vitamin Water.)

But hip-hop’s deepest impact is symbolic. During popular music’s rise in the 20th century, white artists and producers consistently “mainstreamed” African American innovations. Hip-hop’s ascension has been different. Eminem notwithstanding, hip-hop never suffered through anything like an Elvis Presley moment, in which a white artist made a musical form safe for white America. This is no dig at Elvis—the constrictive racial logic of the 1950s demanded the erasure of rock and roll’s black roots, and if it hadn’t been him, it would have been someone else. But hip-hop—the sound of the post–civil-rights, post-soul generation—found a global audience on its own terms.

Today, hip-hop’s colonization of the global imagination, from fashion runways in Europe to dance competitions in Asia, is Disney-esque. This transformation has bred an unprecedented cultural confidence in its black originators. Whiteness is no longer a threat, or an ideal: it’s kitsch to be appropriated, whether with gestures like Combs’s “white parties” or the trickle-down epidemic of collared shirts and cuff links currently afflicting rappers. And an expansive multiculturalism is replacing the us-against-the-world bunker mentality that lent a thrilling edge to hip-hop’s mid-1990s rise.

Peter Rosenberg, a self-proclaimed “nerdy Jewish kid” and radio personality on New York’s Hot 97 FM—and a living example of how hip-hop has created new identities for its listeners that don’t fall neatly along lines of black and white—shares another example: “I interviewed [the St. Louis rapper] Nelly this morning, and he said it’s now very cool and in to have multicultural friends. Like you’re not really considered hip or ‘you’ve made it’ if you’re rolling with all the same people.”

Just as Tiger Woods forever changed the country-club culture of golf, and Will Smith confounded stereotypes about the ideal Hollywood leading man, hip-hop’s rise is helping redefine the American mainstream, which no longer aspires toward a single iconic image of style or class. Successful network-television shows like Lost, Heroes, and Grey’s Anatomy feature wildly diverse casts, and an entire genre of half-hour comedy, from The Colbert Report to The Office, seems dedicated to having fun with the persona of the clueless white male. The youth market is following the same pattern: consider the Cheetah Girls, a multicultural, multiplatinum, multiplatform trio of teenyboppers who recently starred in their third movie, or Dora the Explorer, the precocious bilingual 7-year-old Latina adventurer who is arguably the most successful animated character on children’s television today. In a recent address to the Association of Hispanic Advertising Agencies, Brown Johnson, the Nickelodeon executive who has overseen Dora’s rise, explained the importance of creating a character who does not conform to “the white, middle-class mold.” When Johnson pointed out that Dora’s wares were outselling Barbie’s in France, the crowd hooted in delight.

Pop culture today rallies around an ethic of multicultural inclusion that seems to value every identity—except whiteness. “It’s become harder for the blond-haired, blue-eyed commercial actor,” remarks Rochelle Newman-Carrasco, of the Hispanic marketing firm Enlace. “You read casting notices, and they like to cast people with brown hair because they could be Hispanic. The language of casting notices is pretty shocking because it’s so specific: ‘Brown hair, brown eyes, could look Hispanic.’ Or, as one notice put it: ‘Ethnically ambiguous.’”

“I think white people feel like they’re under siege right now—like it’s not okay to be white right now, especially if you’re a white male,” laughs Bill Imada, of the IW Group. Imada and Newman-Carrasco are part of a movement within advertising, marketing, and communications firms to reimagine the profile of the typical American consumer. (Tellingly, every person I spoke with from these industries knew the Census Bureau’s projections by heart.)

“There’s a lot of fear and a lot of resentment,” Newman-Carrasco observes, describing the flak she caught after writing an article for a trade publication on the need for more-diverse hiring practices. “I got a response from a friend—he’s, like, a 60-something white male, and he’s been involved with multicultural recruiting,” she recalls. “And he said, ‘I really feel like the hunted. It’s a hard time to be a white man in America right now, because I feel like I’m being lumped in with all white males in America, and I’ve tried to do stuff, but it’s a tough time.’”

“I always tell the white men in the room, ‘We need you,’” Imada says. “We cannot talk about diversity and inclusion and engagement without you at the table. It’s okay to be white!

“But people are stressed out about it. ‘We used to be in control! We’re losing control!’”

If they’re right—if white America is indeed “losing control,” and if the future will belong to people who can successfully navigate a post-racial, multicultural landscape—then it’s no surprise that many white Americans are eager to divest themselves of their whiteness entirely.

For some, this renunciation can take a radical form. In 1994, a young graffiti artist and activist named William “Upski” Wimsatt, the son of a university professor, published Bomb the Suburbs, the spiritual heir to Norman Mailer’s celebratory 1957 essay, “The White Negro.” Wimsatt was deeply committed to hip-hop’s transformative powers, going so far as to embrace the status of the lowly “wigger,” a pejorative term popularized in the early 1990s to describe white kids who steep themselves in black culture. Wimsatt viewed the wigger’s immersion in two cultures as an engine for change. “If channeled in the right way,” he wrote, “the wigger can go a long way toward repairing the sickness of race in America.”

Wimsatt’s painfully earnest attempts to put his own relationship with whiteness under the microscope coincided with the emergence of an academic discipline known as “whiteness studies.” In colleges and universities across the country, scholars began examining the history of “whiteness” and unpacking its contradictions. Why, for example, had the Irish and the Italians fallen beyond the pale at different moments in our history? Were Jewish Americans white? And, as the historian Matthew Frye Jacobson asked, “Why is it that in the United States, a white woman can have black children but a black woman cannot have white children?”

Much like Wimsatt, the whiteness-studies academics—figures such as Jacobson, David Roediger, Eric Lott, and Noel Ignatiev—were attempting to come to terms with their own relationships with whiteness, in its past and present forms. In the early 1990s, Ignatiev, a former labor activist and the author of How the Irish Became White, set out to “abolish” the idea of the white race by starting the New Abolitionist Movement and founding a journal titled Race Traitor. “There is nothing positive about white identity,” he wrote in 1998. “As James Baldwin said, ‘As long as you think you’re white, there’s no hope for you.’”

Although most white Americans haven’t read Bomb the Suburbs or Race Traitor, this view of whiteness as something to be interrogated, if not shrugged off completely, has migrated to less academic spheres. The perspective of the whiteness-studies academics is commonplace now, even if the language used to express it is different.

“I get it: as a straight white male, I’m the worst thing on Earth,” Christian Lander says. Lander is a Canadian-born, Los Angeles–based satirist who in January 2008 started a blog called Stuff White People Like (stuffwhitepeoplelike.com), which pokes fun at the manners and mores of a specific species of young, hip, upwardly mobile whites. (He has written more than 100 entries about whites’ passion for things like bottled water, “the idea of soccer,” and “being the only white person around.”) At its best, Lander’s site—which formed the basis for a recently published book of the same name (reviewed in the October 2008 Atlantic)—is a cunningly precise distillation of the identity crisis plaguing well-meaning, well-off white kids in a post-white world.

“Like, I’m aware of all the horrible crimes that my demographic has done in the world,” Lander says. “And there’s a bunch of white people who are desperate—desperate—to say, ‘You know what? My skin’s white, but I’m not one of the white people who’s destroying the world.’”

For Lander, whiteness has become a vacuum. The “white identity” he limns on his blog is predicated on the quest for authenticity—usually other people’s authenticity. “As a white person, you’re just desperate to find something else to grab onto. You’re jealous! Pretty much every white person I grew up with wished they’d grown up in, you know, an ethnic home that gave them a second language. White culture is Family Ties and Led Zeppelin and Guns N’ Roses—like, this is white culture. This is all we have.”

Lander’s “white people” are products of a very specific historical moment, raised by well-meaning Baby Boomers to reject the old ideal of white American gentility and to embrace diversity and fluidity instead. (“It’s strange that we are the kids of Baby Boomers, right? How the hell do you rebel against that? Like, your parents will march against the World Trade Organization next to you. They’ll have bigger white dreadlocks than you. What do you do?”) But his lighthearted anthropology suggests that the multicultural harmony they were raised to worship has bred a kind of self-denial.

Matt Wray, a sociologist at Temple University who is a fan of Lander’s humor, has observed that many of his white students are plagued by a racial-identity crisis: “They don’t care about socioeconomics; they care about culture. And to be white is to be culturally broke. The classic thing white students say when you ask them to talk about who they are is, ‘I don’t have a culture.’ They might be privileged, they might be loaded socioeconomically, but they feel bankrupt when it comes to culture … They feel disadvantaged, and they feel marginalized. They don’t have a culture that’s cool or oppositional.” Wray says that this feeling of being culturally bereft often prevents students from recognizing what it means to be a child of privilege—a strange irony that the first wave of whiteness-studies scholars, in the 1990s, failed to anticipate.

Of course, the obvious material advantages that come with being born white—lower infant-mortality rates and easier-to-acquire bank loans, for example—tend to undercut any sympathy that this sense of marginalization might generate. And in the right context, cultural-identity crises can turn well-meaning whites into instant punch lines. Consider ego trip’s The (White) Rapper Show, a brilliant and critically acclaimed reality show that VH1 debuted in 2007. It depicted 10 (mostly hapless) white rappers living together in a dilapidated house—dubbed “Tha White House”—in the South Bronx. Despite the contestants’ best intentions, each one seemed like a profoundly confused caricature, whether it was the solemn graduate student committed to fighting racism or the ghetto-obsessed suburbanite who had, seemingly by accident, named himself after the abolitionist John Brown.

Similarly, Smirnoff struck marketing gold in 2006 with a viral music video titled “Tea Partay,” featuring a trio of strikingly bad, V-neck-sweater-clad white rappers called the Prep Unit. “Haters like to clown our Ivy League educations / But they’re just jealous ’cause our families run the nation,” the trio brayed, as a pair of bottle-blond women in spiffy tennis whites shimmied behind them. There was no nonironic way to enjoy the video; its entire appeal was in its self-aware lampooning of WASP culture: verdant country clubs, “old money,” croquet, popped collars, and the like.

“The best defense is to be constantly pulling the rug out from underneath yourself,” Wray remarks, describing the way self-aware whites contend with their complicated identity. “Beat people to the punch. You’re forced as a white person into a sense of ironic detachment. Irony is what fuels a lot of white subcultures. You also see things like Burning Man, when a lot of white people are going into the desert and trying to invent something that is entirely new and not a form of racial mimicry. That’s its own kind of flight from whiteness. We’re going through a period where whites are really trying to figure out: Who are we?”

The “flight from whiteness” of urban, college-educated, liberal whites isn’t the only attempt to answer this question. You can flee into whiteness as well. This can mean pursuing the authenticity of an imagined past: think of the deliberately white-bread world of Mormon America, where the ’50s never ended, or the anachronistic WASP entitlement flaunted in books like last year’s A Privileged Life: Celebrating WASP Style, a handsome coffee-table book compiled by Susanna Salk, depicting a world of seersucker blazers, whale pants, and deck shoes. (What the book celebrates is the “inability to be outdone,” and the “self-confidence and security that comes with it,” Salk tells me. “That’s why I call it ‘privilege.’ It’s this privilege of time, of heritage, of being in a place longer than anybody else.”) But these enclaves of preserved-in-amber whiteness are likely to be less important to the American future than the construction of whiteness as a somewhat pissed-off minority culture.

This notion of a self-consciously white expression of minority empowerment will be familiar to anyone who has come across the comedian Larry the Cable Guy—he of “Farting Jingle Bells”—or witnessed the transformation of Detroit-born-and-bred Kid Rock from teenage rapper into “American Bad Ass” southern-style rocker. The 1990s may have been a decade when multiculturalism advanced dramatically—when American culture became “colorized,” as the critic Jeff Chang put it—but it was also an era when a very different form of identity politics crystallized. Hip-hop may have provided the decade’s soundtrack, but the highest-selling artist of the ’90s was Garth Brooks. Michael Jordan and Tiger Woods may have been the faces of athletic superstardom, but it was NASCAR that emerged as professional sports’ fastest-growing institution, with ratings second only to the NFL’s.

As with the unexpected success of the apocalyptic Left Behind novels, or the Jeff Foxworthy–organized Blue Collar Comedy Tour, the rise of country music and auto racing took place well off the American elite’s radar screen. (None of Christian Lander’s white people would be caught dead at a NASCAR race.) These phenomena reflected a growing sense of cultural solidarity among lower-middle-class whites—a solidarity defined by a yearning for American “authenticity,” a folksy realness that rejects the global, the urban, and the effete in favor of nostalgia for “the way things used to be.”

Like other forms of identity politics, white solidarity comes complete with its own folk heroes, conspiracy theories (Barack Obama is a secret Muslim! The U.S. is going to merge with Canada and Mexico!), and laundry lists of injustices. The targets and scapegoats vary—from multiculturalism and affirmative action to a loss of moral values, from immigration to an economy that no longer guarantees the American worker a fair chance—and so do the political programs they inspire. (Ross Perot and Pat Buchanan both tapped into this white identity politics in the 1990s; today, its tribunes run the ideological gamut, from Jim Webb to Ron Paul to Mike Huckabee to Sarah Palin.) But the core grievance, in each case, has to do with cultural and socioeconomic dislocation—the sense that the system that used to guarantee the white working class some stability has gone off-kilter.

Wray is one of the founders of what has been called “white-trash studies,” a field conceived as a response to the perceived elite-liberal marginalization of the white working class. He argues that the economic downturn of the 1970s was the precondition for the formation of an “oppositional” and “defiant” white-working-class sensibility—think of the rugged, anti-everything individualism of 1977’s Smokey and the Bandit. But those anxieties took their shape from the aftershocks of the identity-based movements of the 1960s. “I think that the political space that the civil-rights movement opens up in the mid-1950s and ’60s is the transformative thing,” Wray observes. “Following the black-power movement, all of the other minority groups that followed took up various forms of activism, including brown power and yellow power and red power. Of course the problem is, if you try and have a ‘white power’ movement, it doesn’t sound good.”

The result is a racial pride that dares not speak its name, and that defines itself through cultural cues instead—a suspicion of intellectual elites and city dwellers, a preference for folksiness and plainness of speech (whether real or feigned), and the association of a working-class white minority with “the real America.” (In the Scots-Irish belt that runs from Arkansas up through West Virginia, the most common ethnic label offered to census takers is “American.”) Arguably, this white identity politics helped swing the 2000 and 2004 elections, serving as the powerful counterpunch to urban white liberals, and the McCain-Palin campaign relied on it almost to the point of absurdity (as when a McCain surrogate dismissed Northern Virginia as somehow not part of “the real Virginia”) as a bulwark against the threatening multiculturalism of Barack Obama. Their strategy failed, of course, but it’s possible to imagine white identity politics growing more potent and more forthright in its racial identifications in the future, as “the real America” becomes an ever-smaller portion of, well, the real America, and as the soon-to-be white minority’s sense of being besieged and disdained by a multicultural majority grows apace.

This vision of the aggrieved white man lost in a world that no longer values him was given its most vivid expression in the 1993 film Falling Down. Michael Douglas plays Bill Foster, a downsized defense worker with a buzz cut and a pocket protector who rampages through a Los Angeles overrun by greedy Korean shop-owners and Hispanic gangsters, railing against the eclipse of the America he used to know. (The film came out just eight years before California became the nation’s first majority-minority state.) Falling Down ends with a soulful police officer apprehending Foster on the Santa Monica Pier, at which point the middle-class vigilante asks, almost innocently: “I’m the bad guy?”

But this is a nightmare vision. Of course most of America’s Bill Fosters aren’t the bad guys—just as civilization is not, in the words of Tom Buchanan, “going to pieces” and America is not, in the phrasing of Pat Buchanan, going “Third World.” The coming white minority does not mean that the racial hierarchy of American culture will suddenly become inverted, as in 1995’s White Man’s Burden, an awful thought experiment of a film, starring John Travolta, that envisions an upside-down world in which whites are subjugated to their high-class black oppressors. There will be dislocations and resentments along the way, but the demographic shifts of the next 40 years are likely to reduce the power of racial hierarchies over everyone’s lives, producing a culture that’s more likely than any before to treat its inhabitants as individuals, rather than members of a caste or identity group.

Consider the world of advertising and marketing, industries that set out to mold our desires at a subconscious level. Advertising strategy once assumed a “general market”—“a code word for ‘white people,’” jokes one ad executive—and smaller, mutually exclusive, satellite “ethnic markets.” In recent years, though, advertisers have begun revising their assumptions and strategies in anticipation of profound demographic shifts. Instead of herding consumers toward a discrete center, the goal today is to create versatile images and campaigns that can be adapted to highly individualized tastes. (Think of the dancing silhouettes in Apple’s iPod campaign, which emphasizes individuality and diversity without privileging—or even representing—any specific group.)

At the moment, we can call this the triumph of multiculturalism, or post-racialism. But just as whiteness has no inherent meaning—it is a vessel we fill with our hopes and anxieties—these terms may prove equally empty in the long run. Does being post-racial mean that we are past race completely, or merely that race is no longer essential to how we identify ourselves? Karl Carter, of Atlanta’s youth-oriented GTM Inc. (Guerrilla Tactics Media), suggests that marketers and advertisers would be better off focusing on matrices like “lifestyle” or “culture” rather than race or ethnicity. “You’ll have crazy in-depth studies of the white consumer or the Latino consumer,” he complains. “But how do skaters feel? How do hip-hoppers feel?”

The logic of online social networking points in a similar direction. The New York University sociologist Dalton Conley has written of a “network nation,” in which applications like Facebook and MySpace create “crosscutting social groups” and new, flexible identities that only vaguely overlap with racial identities. Perhaps this is where the future of identity after whiteness lies—in a dramatic departure from the racial logic that has defined American culture from the very beginning. What Conley, Carter, and others are describing isn’t merely the displacement of whiteness from our cultural center; they’re describing a social structure that treats race as just one of a seemingly infinite number of possible self-identifications.


The problem of the 20th century, W. E. B. DuBois famously predicted, would be the problem of the color line. Will this continue to be the case in the 21st century, when a black president will govern a country whose social networks increasingly cut across every conceivable line of identification? The ruling of United States v. Bhagat Singh Thind no longer holds weight, but its echoes have been inescapable: we aspire to be post-racial, but we still live within the structures of privilege, injustice, and racial categorization that we inherited from an older order. We can talk about defining ourselves by lifestyle rather than skin color, but our lifestyle choices are still racially coded. We know, more or less, that race is a fiction that often does more harm than good, and yet it is something we cling to without fully understanding why—as a social and legal fact, a vague sense of belonging and place that we make solid through culture and speech.

But maybe this is merely how it used to be—maybe this is already an outdated way of looking at things. “You have a lot of young adults going into a more diverse world,” Carter remarks. For the young Americans born in the 1980s and 1990s, culture is something to be taken apart and remade in their own image. “We came along in a generation that didn’t have to follow that path of race,” he goes on. “We saw something different.” This moment was not the end of white America; it was not the end of anything. It was a bridge, and we crossed it.