Thursday, January 29, 2009

Should the Filibuster be put to Rest?

The filibuster is obstructive, anachronistic, and undemocratic. It's time to kill it off for good.

by Matthew Yglesias
The Silenced Majority

In March 2005, Senator Harry Reid, the leader of the Democratic Party’s then-minority in the Senate, engaged in some legislative brinkmanship. If the Republicans went through with a dastardly plan they had devised, he warned, “the majority should not expect to receive cooperation from the minority in the conduct of Senate business … even on routine matters.” Senator Ted Kennedy hailed Reid’s stand and called on Republicans to “obey the rule of law and abandon their reckless threat to use the ‘nuclear option.’”

What was the outrageous threat that Democrats were so eager to block? Some nefarious Patriot Act provision? A bill authorizing torture, or secret surveillance? No. The Republicans, as you may recall, wanted to change the Senate rules to prevent Democrats from blocking judicial nominees by using the filibuster, a parliamentary procedure in which a minority of senators can endlessly extend debate to prevent an issue from being voted on. Eventually, a group of legislators known as the “Gang of 14”—seven Democrats and seven Republicans—struck a deal on the nominations, thus saving the filibuster and forestalling any changes to the Senate rules, and the dispute ended.

But Democrats were right to look on the nuclear option skeptically, and not because the proposed change was “reckless.” Rather, it didn’t go far enough. Every word the Republicans said about the nominees’ deserving an up-or-down vote was perfectly true—and their argument applies not just to judicial nominees, but to every other case in which the filibuster subverts the will of the majority.

Democrats no doubt see that more clearly today. Since 2006, when they won majorities in both the House and the Senate, their approval ratings have plummeted, in large part because moderates and liberals have noticed their inability to get much of anything done. House Speaker Nancy Pelosi tried to blame “the obstructionism of the Republicans,” but realistically, one can hardly blame Senate Republicans for obstructing legislation they oppose. The fault lies not with the obstructionists, but with the procedural rule that facilitates obstruction. In short, with the filibuster—a dubious tradition that encourages senators to act as spoilers rather than legislators, and that has locked the political system into semipermanent paralysis by ensuring that important decisions are endlessly deferred. It should be done away with.

Back in 2005, Senate Democrats seeking to block the GOP majority portrayed the filibuster as a pillar of America’s democratic tradition. In fact, it’s no such thing. The original rules of the Senate allowed a simple majority of legislators to make a motion to end debate. In 1806, at the recommendation of Aaron Burr, those rules were amended to allow for unlimited argument—not to create a countermajoritarian check on legislation, but because the motion had been so rarely invoked that it “could not be necessary.” This decision paved the way for the modern filibuster. But no one actually attempted to use it until 1837, when a minority bloc of Whig senators prolonged debate to prevent Andrew Jackson’s allies from expunging a resolution of censure against him. The unlimited-debate rule eventually became so cumbersome that senators made attempts at reform in 1850, 1873, 1883, and 1890, all unsuccessful. Finally, in 1917, the Senate adopted a rule allowing a two-thirds supermajority to cut off debate.

Under this rule, in the years that followed, segregationists mounted a series of filibusters meant to block civil-rights legislation. In 1922, the mere threat of the procedure was enough to torpedo a bill to prevent lynchings. In 1946, a filibuster undermined a bill by Senator Dennis Chavez of New Mexico intended to block workplace discrimination. Strom Thurmond set the record for longest individual filibuster—at more than 24 hours—in an ultimately unsuccessful attempt to block the relatively mild Civil Rights Act of 1957. And the landmark Civil Rights Act of 1964 secured a filibuster-proof majority only after 57 days of debate and substantial watering down.

By 1975, the Senate was finally prepared for reform. But rather than eliminate the filibuster entirely and return to majority rule, the members merely diluted it, reducing the number of votes required to end debate from 67 to 60.

Since then, filibustering has only grown more frequent. In the 1960s, no Congress had more than seven filibusters. In the early 1990s, the 102nd Congress witnessed 47, more than had occurred throughout the entire 19th century. And that was not an especially filibuster-prone Congress—each subsequent one has seen progressively more. The 110th Congress, which just ended, featured 137.

The minority party of the day will inevitably defend such obstruction as a crucial bulwark of liberty. During the judicial-confirmations fight, the liberal Interfaith Alliance warned that a filibuster-free Senate “would leave the majority with the power to reign with absolute tyranny.” But the risk of one-party rule shouldn’t be exaggerated. Majority voting works fine for democracies around the world, and the need for legislation to pass through two separately elected houses of Congress and be signed into law by the president still gives our government more chances to veto objectionable bills than most other countries allow for.

In recent decades, periods of one-party rule have been rare and brief. The only circumstances under which party-line legislation is even a theoretical possibility for any length of time would be when the country feels that the party in power is doing a decent job. And that, one would think, is exactly the sort of situation in which an extended period of one-party rule might be deemed unobjectionable. The filibuster is hardly the only impediment to legislative change, but it’s the one least justified by our Constitution and least supported by our values. And eliminating it would drastically reduce excuses for inaction—the one thing Congress has produced in abundance in recent years.
The URL for this page is http://www.theatlantic.com/doc/200812u/filibuster

Stimulus Package

House OKs $819B stimulus bill with GOP opposition
By LIZ SIDOTI, Associated Press Writer - Thu Jan 29, 2:35 am ET

WASHINGTON – In a swift victory for President Barack Obama, the Democratic-controlled House approved a historically huge $819 billion stimulus bill Wednesday night with spending increases and tax cuts at the heart of the young administration's plan to revive a badly ailing economy. The vote was 244-188, with Republicans unanimous in opposition despite Obama's frequent pleas for bipartisan support.

"This recovery plan will save or create more than three million new jobs over the next few years," the president said in a written statement released moments after the House voted. Still later, he welcomed congressional leaders of both parties to the White House for drinks as he continued to lobby for the legislation.

Earlier, Obama declared, "We don't have a moment to spare" as congressional allies hastened to do his bidding in the face of the worst economic crisis since the Great Depression.

The vote sent the bill to the Senate, where debate could begin as early as Monday on a companion measure already taking shape. Democratic leaders have pledged to have legislation ready for Obama's signature by mid-February.

A mere eight days after Inauguration Day, Speaker Nancy Pelosi said the events heralded a new era. "The ship of state is difficult to turn," said the California Democrat. "But that is what we must do. That is what President Obama called us to do in his inaugural address."

With unemployment at its highest level in a quarter-century, the banking industry wobbling despite the infusion of staggering sums of bailout money and states struggling with budget crises, Democrats said the legislation was desperately needed.

"Another week that we delay is another 100,000 or more people unemployed. I don't think we want that on our consciences," said Rep. David Obey, D-Wis., chairman of the House Appropriations Committee and one of the leading architects of the legislation.

Republicans said the bill was short on tax cuts and contained too much spending, much of it wasteful, and would fall far short of the administration's predictions of job creation.

The party's leader, Rep. John Boehner of Ohio, said the measure "won't create many jobs, but it will create plenty of programs and projects through slow-moving government spending." A GOP alternative consisting almost entirely of tax cuts was defeated, 266-170.

On the final vote, the legislation drew the support of all but 11 Democrats, while all Republicans opposed it.

The White House-backed legislation includes an estimated $544 billion in federal spending and $275 billion in tax cuts for individuals and businesses. The totals remained in flux nearly until the final vote, due to official re-estimates and a last-minute addition of $3 billion for mass transit.

Included is money for traditional job-creating programs such as highway construction and mass transit projects. But the measure earmarks far more for unemployment benefits, health care and food stamp increases designed to aid victims of the worst economic downturn since the Great Depression of the 1930s.

Tens of billions of additional dollars would go to the states, which confront the prospect of deep budget cuts of their own. That money marks an attempt to ease the recession's impact on schools and law enforcement. With funding for housing weatherization and other provisions, the bill also makes a down payment on Obama's campaign promise of creating jobs that can reduce the nation's dependence on foreign oil.

The centerpiece tax cut calls for a $500 break for single workers and $1,000 for couples, including those who don't earn enough to owe federal income taxes.

The House vote marked merely the first of several major milestones for the legislation, which Democratic leaders have pledged to deliver to the White House for Obama's signature by mid-February.

Already a more bipartisan — and costlier — measure is taking shape in the Senate, and Obama personally pledged to House and Senate Republicans in closed-door meetings on Tuesday that he is ready to accept modifications as the legislation advances.

Rahm Emanuel, a former Illinois congressman who is Obama's chief of staff, invited nearly a dozen House Republicans to the White House late Tuesday for what one participant said was a soft sales job.

This lawmaker quoted Emanuel as telling the group that polling shows roughly 80 percent support for the legislation, and that Republicans oppose it at their political peril. The lawmaker spoke on condition of anonymity, saying there was no agreement to speak publicly about the session.

In fact, though, many Republicans in the House are virtually immune from Democratic challenges because of the makeup of their districts, and have more to fear from GOP primary challenges in 2010. As a result, they have relatively little political incentive to break with conservative orthodoxy and support hundreds of billions in new federal spending.

Also, some Republican lawmakers have said in recent days they know they will have a second chance to support a bill when the final House-Senate compromise emerges in a few weeks.

Rep. Randy Neugebauer, R-Texas, sought to strip out all the spending from the legislation before final passage, arguing that the entire cost of the bill would merely add to soaring federal deficits. "Where are we going to get the money?" he asked, but his attempt failed overwhelmingly, 302-134.

Obey had a ready retort. "They don't look like Herbert Hoover, I guess, but there are an awful lot of people in this chamber who think like Herbert Hoover," he said, referring to the president whose term is forever linked in history with the Great Depression.

___

Associated Press writers Andrew Taylor, Liz Sidoti and Ben Feller contributed to this story.

http://news.yahoo.com/s/ap/20090129/ap_on_go_co/obama_economy_208/print

In the Valley of Elah? A clash of norms



January 24, 2009
Associated Press
FORT BRAGG, N.C. - A Soldier found dead last summer complained about the price of beer and got in a fight at a bar before seven members of his own unit punched, choked and restrained him, a paratrooper testified at a hearing Friday.
Sgt. Mitchell Lafortune testified during an Article 32 hearing, similar to a civilian grand jury proceeding, for five of seven Soldiers charged with involuntary manslaughter in Pfc. Luke Brown's July death. The other two are scheduled to appear Feb. 27. The division commander will decide whether to convene a formal trial, or court-martial.
Defense attorney Todd Connormon, who represents 24-year-old Spc. Charles B. DeLong, one of those charged, called the situation "a tragedy," and said the Soldiers were trying to take care of a friend.
"I'm hoping this doesn't go to court," Connormon said. "I don't think it should."
Lafortune's testimony was the first public account of the night Brown died.
He said he saw the Soldiers "aggressively assault" Brown in a patch of woods after the group left a Fayetteville bar called the Ugly Stick early July 20. When the men drove him back to the barracks on Fort Bragg, Lafortune said he thought Brown was dead because he was pale and his eyes were closed.
"I should have done something to make sure he was OK," said Lafortune, who has not been charged and testified that he did not participate in choking Brown. "I should have been smart enough to walk out of the woods and at least call Fayetteville (police). It's something I regret to this day."
Lafortune said Brown, 27, an intelligence officer from Fredericksburg, Va., was drinking and socializing at the bar but seemed in a bad mood, complaining about the price of beer. Brown got into an argument with a Soldier from another unit, grabbed the man's beer and drank it.
When the group left, a Soldier found Brown in a patch of woods behind the bar. Lafortune said he heard a commotion and saw Brown being choked and punched. He said the Soldiers were trying to get Brown, who weighed 250 pounds, to pass out so they could move him.
The group carried Brown to the edge of the woods and bound his hands with a zip tie when he began to wake up. Then they put Brown in a vehicle and drove back to the barracks. Lafortune said he heard one of the other Soldiers say, "You've got to breathe, Brown, breathe."
They cut the zip ties off of his wrists and started CPR. Shortly after, an ambulance and military police arrived.
Chief Warrant Officer James Lyonais, called as a character witness for 28-year-old Sgt. Justin A. Boyle, discussed guidance he'd received in the past on safely getting a drunk Soldier home.
He said it was common to be told "it doesn't matter how you get them home. You knock them out, you bring them home and we'll deal with it later."
A prosecutor then asked if it was appropriate to "kick, punch, choke to unconsciousness and zip tie" a paratrooper if Soldiers needed to get him back to the base.
"If you were trying to save a Soldier from trouble with the law downtown ... it is acceptable," Lyonais said. "I don't think the answer is to physically harm people."
Navy Cmdr. Carol Solomon, a pathologist at the Washington-based Armed Forces Institute of Pathology, testified later Friday that choking a person to unconsciousness can cause a fatal brain injury. She said injuries on Brown's neck were consistent with choking.
"I believe their actions were involved in causing Pfc. Brown's death," she said of the accused.
Solomon had originally ruled the cause of Brown's death undetermined because she was concerned he may have had an enlarged heart. She said she changed her opinion after determining his heart was normal.
The Soldiers charged are DeLong, of Dade City, Fla.; Boyle, of Rocky Point, N.Y.; Sgt. Christopher Mignocchi, 22, of Hollywood, Fla.; Sgt. Kyle G. Saltz, 25, of Richland, Wash.; Spc. Ryan Sullivan, 23, of Mount Laurel, N.J.; Spc. Joseph A. Misuraca, 22, of Harper Woods, Mich.; and Pfc. Andrey Udalov, 21, of Brooklyn, N.Y.
The seven men are assigned to the 82nd Airborne Division's Headquarters and Headquarters Company, which was Brown's unit. The involuntary manslaughter charges carry a maximum 10-year prison sentence. Some of the Soldiers also face other charges.
http://www.military.com/news/article/dead-soldier-was-punched-choked.html?col=1186032325324&ESRC=army-a.nl

End of White America? Threatening?

State of the Union January/February 2009

The election of Barack Obama is just the most startling manifestation of a larger trend: the gradual erosion of “whiteness” as the touchstone of what it means to be American. If the end of white America is a cultural and demographic inevitability, what will the new mainstream look like—and how will white Americans fit into it? What will it mean to be white when whiteness is no longer the norm? And will a post-white America be less racially divided—or more so?

by Hua Hsu

The End of White America?

Illustrations By Felix Sockwell

"Civilization’s going to pieces,” he remarks. He is in polite company, gathered with friends around a bottle of wine in the late-afternoon sun, chatting and gossiping. “I’ve gotten to be a terrible pessimist about things. Have you read The Rise of the Colored Empires by this man Goddard?” They hadn’t. “Well, it’s a fine book, and everybody ought to read it. The idea is if we don’t look out the white race will be—will be utterly submerged. It’s all scientific stuff; it’s been proved.”

He is Tom Buchanan, a character in F. Scott Fitzgerald’s The Great Gatsby, a book that nearly everyone who passes through the American education system is compelled to read at least once. Although Gatsby doesn’t gloss as a book on racial anxiety—it’s too busy exploring a different set of anxieties entirely—Buchanan was hardly alone in feeling besieged. The book by “this man Goddard” had a real-world analogue: Lothrop Stoddard’s The Rising Tide of Color Against White World-Supremacy, published in 1920, five years before Gatsby. Nine decades later, Stoddard’s polemic remains oddly engrossing. He refers to World War I as the “White Civil War” and laments the “cycle of ruin” that may result if the “white world” continues its infighting. The book features a series of foldout maps depicting the distribution of “color” throughout the world and warns, “Colored migration is a universal peril, menacing every part of the white world.”

As briefs for racial supremacy go, The Rising Tide of Color is eerily serene. Its tone is scholarly and gentlemanly, its hatred rationalized and, in Buchanan’s term, “scientific.” And the book was hardly a fringe phenomenon. It was published by Scribner, also Fitzgerald’s publisher, and Stoddard, who received a doctorate in history from Harvard, was a member of many professional academic associations. It was precisely the kind of book that a 1920s man of Buchanan’s profile—wealthy, Ivy League–educated, at once pretentious and intellectually insecure—might have been expected to bring up in casual conversation.

As white men of comfort and privilege living in an age of limited social mobility, of course, Stoddard and the Buchanans in his audience had nothing literal to fear. Their sense of dread hovered somewhere above the concerns of everyday life. It was linked less to any immediate danger to their class’s political and cultural power than to the perceived fraying of the fixed, monolithic identity of whiteness that sewed together the fortunes of the fair-skinned.

From the hysteria over Eastern European immigration to the vibrant cultural miscegenation of the Harlem Renaissance, it is easy to see how this imagined worldwide white kinship might have seemed imperiled in the 1920s. There’s no better example of the era’s insecurities than the 1923 Supreme Court case United States v. Bhagat Singh Thind, in which an Indian American veteran of World War I sought to become a naturalized citizen by proving that he was Caucasian. The Court considered new anthropological studies that expanded the definition of the Caucasian race to include Indians, and the justices even agreed that traces of “Aryan blood” coursed through Thind’s body. But these technicalities availed him little. The Court determined that Thind was not white “in accordance with the understanding of the common man” and therefore could be excluded from the “statutory category” of whiteness. Put another way: Thind was white, in that he was Caucasian and even Aryan. But he was not white in the way Stoddard or Buchanan were white.

The ’20s debate over the definition of whiteness—a legal category? a commonsense understanding? a worldwide civilization?—took place in a society gripped by an acute sense of racial paranoia, and it is easy to regard these episodes as evidence of how far we have come. But consider that these anxieties surfaced when whiteness was synonymous with the American mainstream, when threats to its status were largely imaginary. What happens once this is no longer the case—when the fears of Lothrop Stoddard and Tom Buchanan are realized, and white people actually become an American minority?

Whether you describe it as the dawning of a post-racial age or just the end of white America, we’re approaching a profound demographic tipping point. According to an August 2008 report by the U.S. Census Bureau, those groups currently categorized as racial minorities—blacks and Hispanics, East Asians and South Asians—will account for a majority of the U.S. population by the year 2042. Among Americans under the age of 18, this shift is projected to take place in 2023, which means that every child born in the United States from here on out will belong to the first post-white generation.

Obviously, steadily ascending rates of interracial marriage complicate this picture, pointing toward what Michael Lind has described as the “beiging” of America. And it’s possible that “beige Americans” will self-identify as “white” in sufficient numbers to push the tipping point further into the future than the Census Bureau projects. But even if they do, whiteness will be a label adopted out of convenience and even indifference, rather than aspiration and necessity. For an earlier generation of minorities and immigrants, to be recognized as a “white American,” whether you were an Italian or a Pole or a Hungarian, was to enter the mainstream of American life; to be recognized as something else, as the Thind case suggests, was to be permanently excluded. As Bill Imada, head of the IW Group, a prominent Asian American communications and marketing company, puts it: “I think in the 1920s, 1930s, and 1940s, [for] anyone who immigrated, the aspiration was to blend in and be as American as possible so that white America wouldn’t be intimidated by them. They wanted to imitate white America as much as possible: learn English, go to church, go to the same schools.”

Today, the picture is far more complex. To take the most obvious example, whiteness is no longer a precondition for entry into the highest levels of public office. The son of Indian immigrants doesn’t have to become “white” in order to be elected governor of Louisiana. A half-Kenyan, half-Kansan politician can self-identify as black and be elected president of the United States.

As a purely demographic matter, then, the “white America” that Lothrop Stoddard believed in so fervently may cease to exist in 2040, 2050, or 2060, or later still. But where the culture is concerned, it’s already all but finished. Instead of the long-standing model of assimilation toward a common center, the culture is being remade in the image of white America’s multiethnic, multicolored heirs.

For some, the disappearance of this centrifugal core heralds a future rich with promise. In 1998, President Bill Clinton, in a now-famous address to students at Portland State University, remarked:

Today, largely because of immigration, there is no majority race in Hawaii or Houston or New York City. Within five years, there will be no majority race in our largest state, California. In a little more than 50 years, there will be no majority race in the United States. No other nation in history has gone through demographic change of this magnitude in so short a time ... [These immigrants] are energizing our culture and broadening our vision of the world. They are renewing our most basic values and reminding us all of what it truly means to be American.

Not everyone was so enthused. Clinton’s remarks caught the attention of another anxious Buchanan—Pat Buchanan, the conservative thinker. Revisiting the president’s speech in his 2001 book, The Death of the West, Buchanan wrote: “Mr. Clinton assured us that it will be a better America when we are all minorities and realize true ‘diversity.’ Well, those students [at Portland State] are going to find out, for they will spend their golden years in a Third World America.”

Today, the arrival of what Buchanan derided as “Third World America” is all but inevitable. What will the new mainstream of America look like, and what ideas or values might it rally around? What will it mean to be white after “whiteness” no longer defines the mainstream? Will anyone mourn the end of white America? Will anyone try to preserve it?


Another moment from The Great Gatsby: as Fitzgerald’s narrator and Gatsby drive across the Queensboro Bridge into Manhattan, a car passes them, and Nick Carraway notices that it is a limousine “driven by a white chauffeur, in which sat three modish negroes, two bucks and a girl.” The novelty of this topsy-turvy arrangement inspires Carraway to laugh aloud and think to himself, “Anything can happen now that we’ve slid over this bridge, anything at all …”

For a contemporary embodiment of the upheaval that this scene portended, consider Sean Combs, a hip-hop mogul and one of the most famous African Americans on the planet. Combs grew up during hip-hop’s late-1970s rise, and he belongs to the first generation that could safely make a living working in the industry—as a plucky young promoter and record-label intern in the late 1980s and early 1990s, and as a fashion designer, artist, and music executive worth hundreds of millions of dollars a brief decade later.

In the late 1990s, Combs made a fascinating gesture toward New York’s high society. He announced his arrival into the circles of the rich and powerful not by crashing their parties, but by inviting them into his own spectacularly over-the-top world. Combs began to stage elaborate annual parties in the Hamptons, not far from where Fitzgerald’s novel takes place. These “white parties”—attendees are required to wear white—quickly became legendary for their opulence (in 2004, Combs showcased a 1776 copy of the Declaration of Independence) as well as for the cultures-colliding quality of Hamptons elites paying their respects to someone so comfortably nouveau riche. Prospective business partners angled to get close to him and praised him as a guru of the lucrative “urban” market, while grateful partygoers hailed him as a modern-day Gatsby.

“Have I read The Great Gatsby?” Combs said to a London newspaper in 2001. “I am the Great Gatsby.”

Yet whereas Gatsby felt pressure to hide his status as an arriviste, Combs celebrated his position as an outsider-insider—someone who appropriates elements of the culture he seeks to join without attempting to assimilate outright. In a sense, Combs was imitating the old WASP establishment; in another sense, he was subtly provoking it, by over-enunciating its formality and never letting his guests forget that there was something slightly off about his presence. There’s a silent power to throwing parties where the best-dressed man in the room is also the one whose public profile once consisted primarily of dancing in the background of Biggie Smalls videos. (“No one would ever expect a young black man to be coming to a party with the Declaration of Independence, but I got it, and it’s coming with me,” Combs joked at his 2004 party, as he made the rounds with the document, promising not to spill champagne on it.)

In this regard, Combs is both a product and a hero of the new cultural mainstream, which prizes diversity above all else, and whose ultimate goal is some vague notion of racial transcendence, rather than subversion or assimilation. Although Combs’s vision is far from representative—not many hip-hop stars vacation in St. Tropez with a parasol-toting manservant shading their every step—his industry lies at the heart of this new mainstream. Over the past 30 years, few changes in American culture have been as significant as the rise of hip-hop. The genre has radically reshaped the way we listen to and consume music, first by opposing the pop mainstream and then by becoming it. From its constant sampling of past styles and eras—old records, fashions, slang, anything—to its mythologization of the self-made black antihero, hip-hop is more than a musical genre: it’s a philosophy, a political statement, a way of approaching and remaking culture. It’s a lingua franca not just among kids in America, but also among young people worldwide. And its economic impact extends beyond the music industry, to fashion, advertising, and film. (Consider the producer Russell Simmons—the ur-Combs and a music, fashion, and television mogul—or the rapper 50 Cent, who has parlayed his rags-to-riches story line into extracurricular successes that include a clothing line; book, video-game, and film deals; and a startlingly lucrative partnership with the makers of Vitamin Water.)

But hip-hop’s deepest impact is symbolic. During popular music’s rise in the 20th century, white artists and producers consistently “mainstreamed” African American innovations. Hip-hop’s ascension has been different. Eminem notwithstanding, hip-hop never suffered through anything like an Elvis Presley moment, in which a white artist made a musical form safe for white America. This is no dig at Elvis—the constrictive racial logic of the 1950s demanded the erasure of rock and roll’s black roots, and if it hadn’t been him, it would have been someone else. But hip-hop—the sound of the post-civil-rights, post-soul generation—found a global audience on its own terms.

Today, hip-hop’s colonization of the global imagination, from fashion runways in Europe to dance competitions in Asia, is Disney-esque. This transformation has bred an unprecedented cultural confidence in its black originators. Whiteness is no longer a threat, or an ideal: it’s kitsch to be appropriated, whether with gestures like Combs’s “white parties” or the trickle-down epidemic of collared shirts and cuff links currently afflicting rappers. And an expansive multiculturalism is replacing the us-against-the-world bunker mentality that lent a thrilling edge to hip-hop’s mid-1990s rise.

Peter Rosenberg, a self-proclaimed “nerdy Jewish kid” and radio personality on New York’s Hot 97 FM—and a living example of how hip-hop has created new identities for its listeners that don’t fall neatly along lines of black and white—shares another example: “I interviewed [the St. Louis rapper] Nelly this morning, and he said it’s now very cool and in to have multicultural friends. Like you’re not really considered hip or ‘you’ve made it’ if you’re rolling with all the same people.”

Just as Tiger Woods forever changed the country-club culture of golf, and Will Smith confounded stereotypes about the ideal Hollywood leading man, hip-hop’s rise is helping redefine the American mainstream, which no longer aspires toward a single iconic image of style or class. Successful network-television shows like Lost, Heroes, and Grey’s Anatomy feature wildly diverse casts, and an entire genre of half-hour comedy, from The Colbert Report to The Office, seems dedicated to having fun with the persona of the clueless white male. The youth market is following the same pattern: consider the Cheetah Girls, a multicultural, multiplatinum, multiplatform trio of teenyboppers who recently starred in their third movie, or Dora the Explorer, the precocious bilingual 7-year-old Latina adventurer who is arguably the most successful animated character on children’s television today. In a recent address to the Association of Hispanic Advertising Agencies, Brown Johnson, the Nickelodeon executive who has overseen Dora’s rise, explained the importance of creating a character who does not conform to “the white, middle-class mold.” When Johnson pointed out that Dora’s wares were outselling Barbie’s in France, the crowd hooted in delight.

Pop culture today rallies around an ethic of multicultural inclusion that seems to value every identity—except whiteness. “It’s become harder for the blond-haired, blue-eyed commercial actor,” remarks Rochelle Newman-Carrasco, of the Hispanic marketing firm Enlace. “You read casting notices, and they like to cast people with brown hair because they could be Hispanic. The language of casting notices is pretty shocking because it’s so specific: ‘Brown hair, brown eyes, could look Hispanic.’ Or, as one notice put it: ‘Ethnically ambiguous.’”

“I think white people feel like they’re under siege right now—like it’s not okay to be white right now, especially if you’re a white male,” laughs Bill Imada, of the IW Group. Imada and Newman-Carrasco are part of a movement within advertising, marketing, and communications firms to reimagine the profile of the typical American consumer. (Tellingly, every person I spoke with from these industries knew the Census Bureau’s projections by heart.)

“There’s a lot of fear and a lot of resentment,” Newman-Carrasco observes, describing the flak she caught after writing an article for a trade publication on the need for more-diverse hiring practices. “I got a response from a friend—he’s, like, a 60-something white male, and he’s been involved with multicultural recruiting,” she recalls. “And he said, ‘I really feel like the hunted. It’s a hard time to be a white man in America right now, because I feel like I’m being lumped in with all white males in America, and I’ve tried to do stuff, but it’s a tough time.’”

“I always tell the white men in the room, ‘We need you,’” Imada says. “We cannot talk about diversity and inclusion and engagement without you at the table. It’s okay to be white!

“But people are stressed out about it. ‘We used to be in control! We’re losing control!’”

If they’re right—if white America is indeed “losing control,” and if the future will belong to people who can successfully navigate a post-racial, multicultural landscape—then it’s no surprise that many white Americans are eager to divest themselves of their whiteness entirely.

For some, this renunciation can take a radical form. In 1994, a young graffiti artist and activist named William “Upski” Wimsatt, the son of a university professor, published Bomb the Suburbs, the spiritual heir to Norman Mailer’s celebratory 1957 essay, “The White Negro.” Wimsatt was deeply committed to hip-hop’s transformative powers, going so far as to embrace the status of the lowly “wigger,” a pejorative term popularized in the early 1990s to describe white kids who steep themselves in black culture. Wimsatt viewed the wigger’s immersion in two cultures as an engine for change. “If channeled in the right way,” he wrote, “the wigger can go a long way toward repairing the sickness of race in America.”

Wimsatt’s painfully earnest attempts to put his own relationship with whiteness under the microscope coincided with the emergence of an academic discipline known as “whiteness studies.” In colleges and universities across the country, scholars began examining the history of “whiteness” and unpacking its contradictions. Why, for example, had the Irish and the Italians fallen beyond the pale at different moments in our history? Were Jewish Americans white? And, as the historian Matthew Frye Jacobson asked, “Why is it that in the United States, a white woman can have black children but a black woman cannot have white children?”

Much like Wimsatt, the whiteness-studies academics—figures such as Jacobson, David Roediger, Eric Lott, and Noel Ignatiev—were attempting to come to terms with their own relationships with whiteness, in its past and present forms. In the early 1990s, Ignatiev, a former labor activist and the author of How the Irish Became White, set out to “abolish” the idea of the white race by starting the New Abolitionist Movement and founding a journal titled Race Traitor. “There is nothing positive about white identity,” he wrote in 1998. “As James Baldwin said, ‘As long as you think you’re white, there’s no hope for you.’”

Although most white Americans haven’t read Bomb the Suburbs or Race Traitor, this view of whiteness as something to be interrogated, if not shrugged off completely, has migrated to less academic spheres. The perspective of the whiteness-studies academics is commonplace now, even if the language used to express it is different.

“I get it: as a straight white male, I’m the worst thing on Earth,” Christian Lander says. Lander is a Canadian-born, Los Angeles–based satirist who in January 2008 started a blog called Stuff White People Like (stuffwhitepeoplelike.com), which pokes fun at the manners and mores of a specific species of young, hip, upwardly mobile whites. (He has written more than 100 entries about whites’ passion for things like bottled water, “the idea of soccer,” and “being the only white person around.”) At its best, Lander’s site—which formed the basis for a recently published book of the same name (reviewed in the October 2008 Atlantic)—is a cunningly precise distillation of the identity crisis plaguing well-meaning, well-off white kids in a post-white world.

“Like, I’m aware of all the horrible crimes that my demographic has done in the world,” Lander says. “And there’s a bunch of white people who are desperate—desperate—to say, ‘You know what? My skin’s white, but I’m not one of the white people who’s destroying the world.’”

For Lander, whiteness has become a vacuum. The “white identity” he limns on his blog is predicated on the quest for authenticity—usually other people’s authenticity. “As a white person, you’re just desperate to find something else to grab onto. You’re jealous! Pretty much every white person I grew up with wished they’d grown up in, you know, an ethnic home that gave them a second language. White culture is Family Ties and Led Zeppelin and Guns N’ Roses—like, this is white culture. This is all we have.”

Lander’s “white people” are products of a very specific historical moment, raised by well-meaning Baby Boomers to reject the old ideal of white American gentility and to embrace diversity and fluidity instead. (“It’s strange that we are the kids of Baby Boomers, right? How the hell do you rebel against that? Like, your parents will march against the World Trade Organization next to you. They’ll have bigger white dreadlocks than you. What do you do?”) But his lighthearted anthropology suggests that the multicultural harmony they were raised to worship has bred a kind of self-denial.

Matt Wray, a sociologist at Temple University who is a fan of Lander’s humor, has observed that many of his white students are plagued by a racial-identity crisis: “They don’t care about socioeconomics; they care about culture. And to be white is to be culturally broke. The classic thing white students say when you ask them to talk about who they are is, ‘I don’t have a culture.’ They might be privileged, they might be loaded socioeconomically, but they feel bankrupt when it comes to culture … They feel disadvantaged, and they feel marginalized. They don’t have a culture that’s cool or oppositional.” Wray says that this feeling of being culturally bereft often prevents students from recognizing what it means to be a child of privilege—a strange irony that the first wave of whiteness-studies scholars, in the 1990s, failed to anticipate.

Of course, the obvious material advantages that come with being born white—lower infant-mortality rates and easier-to-acquire bank loans, for example—tend to undercut any sympathy that this sense of marginalization might generate. And in the right context, cultural-identity crises can turn well-meaning whites into instant punch lines. Consider ego trip’s The (White) Rapper Show, a brilliant and critically acclaimed reality show that VH1 debuted in 2007. It depicted 10 (mostly hapless) white rappers living together in a dilapidated house—dubbed “Tha White House”—in the South Bronx. Despite the contestants’ best intentions, each one seemed like a profoundly confused caricature, whether it was the solemn graduate student committed to fighting racism or the ghetto-obsessed suburbanite who had, seemingly by accident, named himself after the abolitionist John Brown.

Similarly, Smirnoff struck marketing gold in 2006 with a viral music video titled “Tea Partay,” featuring a trio of strikingly bad, V-neck-sweater-clad white rappers called the Prep Unit. “Haters like to clown our Ivy League educations / But they’re just jealous ’cause our families run the nation,” the trio brayed, as a pair of bottle-blond women in spiffy tennis whites shimmied behind them. There was no nonironic way to enjoy the video; its entire appeal was in its self-aware lampooning of WASP culture: verdant country clubs, “old money,” croquet, popped collars, and the like.

“The best defense is to be constantly pulling the rug out from underneath yourself,” Wray remarks, describing the way self-aware whites contend with their complicated identity. “Beat people to the punch. You’re forced as a white person into a sense of ironic detachment. Irony is what fuels a lot of white subcultures. You also see things like Burning Man, when a lot of white people are going into the desert and trying to invent something that is entirely new and not a form of racial mimicry. That’s its own kind of flight from whiteness. We’re going through a period where whites are really trying to figure out: Who are we?”

The “flight from whiteness” of urban, college-educated, liberal whites isn’t the only attempt to answer this question. You can flee into whiteness as well. This can mean pursuing the authenticity of an imagined past: think of the deliberately white-bread world of Mormon America, where the ’50s never ended, or the anachronistic WASP entitlement flaunted in books like last year’s A Privileged Life: Celebrating WASP Style, a handsome coffee-table book compiled by Susanna Salk, depicting a world of seersucker blazers, whale pants, and deck shoes. (What the book celebrates is the “inability to be outdone,” and the “self-confidence and security that comes with it,” Salk tells me. “That’s why I call it ‘privilege.’ It’s this privilege of time, of heritage, of being in a place longer than anybody else.”) But these enclaves of preserved-in-amber whiteness are likely to be less important to the American future than the construction of whiteness as a somewhat pissed-off minority culture.

This notion of a self-consciously white expression of minority empowerment will be familiar to anyone who has come across the comedian Larry the Cable Guy—he of “Farting Jingle Bells”—or witnessed the transformation of Detroit-born-and-bred Kid Rock from teenage rapper into “American Bad Ass” southern-style rocker. The 1990s may have been a decade when multiculturalism advanced dramatically—when American culture became “colorized,” as the critic Jeff Chang put it—but it was also an era when a very different form of identity politics crystallized. Hip-hop may have provided the decade’s soundtrack, but the highest-selling artist of the ’90s was Garth Brooks. Michael Jordan and Tiger Woods may have been the faces of athletic superstardom, but it was NASCAR that emerged as professional sports’ fastest-growing institution, with ratings second only to the NFL’s.

As with the unexpected success of the apocalyptic Left Behind novels, or the Jeff Foxworthy–organized Blue Collar Comedy Tour, the rise of country music and auto racing took place well off the American elite’s radar screen. (None of Christian Lander’s white people would be caught dead at a NASCAR race.) These phenomena reflected a growing sense of cultural solidarity among lower-middle-class whites—a solidarity defined by a yearning for American “authenticity,” a folksy realness that rejects the global, the urban, and the effete in favor of nostalgia for “the way things used to be.”

Like other forms of identity politics, white solidarity comes complete with its own folk heroes, conspiracy theories (Barack Obama is a secret Muslim! The U.S. is going to merge with Canada and Mexico!), and laundry lists of injustices. The targets and scapegoats vary—from multiculturalism and affirmative action to a loss of moral values, from immigration to an economy that no longer guarantees the American worker a fair chance—and so do the political programs they inspire. (Ross Perot and Pat Buchanan both tapped into this white identity politics in the 1990s; today, its tribunes run the ideological gamut, from Jim Webb to Ron Paul to Mike Huckabee to Sarah Palin.) But the core grievance, in each case, has to do with cultural and socioeconomic dislocation—the sense that the system that used to guarantee the white working class some stability has gone off-kilter.

Wray is one of the founders of what has been called “white-trash studies,” a field conceived as a response to the perceived elite-liberal marginalization of the white working class. He argues that the economic downturn of the 1970s was the precondition for the formation of an “oppositional” and “defiant” white-working-class sensibility—think of the rugged, anti-everything individualism of 1977’s Smokey and the Bandit. But those anxieties took their shape from the aftershocks of the identity-based movements of the 1960s. “I think that the political space that the civil-rights movement opens up in the mid-1950s and ’60s is the transformative thing,” Wray observes. “Following the black-power movement, all of the other minority groups that followed took up various forms of activism, including brown power and yellow power and red power. Of course the problem is, if you try and have a ‘white power’ movement, it doesn’t sound good.”

The result is a racial pride that dares not speak its name, and that defines itself through cultural cues instead—a suspicion of intellectual elites and city dwellers, a preference for folksiness and plainness of speech (whether real or feigned), and the association of a working-class white minority with “the real America.” (In the Scots-Irish belt that runs from Arkansas up through West Virginia, the most common ethnic label offered to census takers is “American.”) Arguably, this white identity politics helped swing the 2000 and 2004 elections, serving as the powerful counterpunch to urban white liberals, and the McCain-Palin campaign relied on it almost to the point of absurdity (as when a McCain surrogate dismissed Northern Virginia as somehow not part of “the real Virginia”) as a bulwark against the threatening multiculturalism of Barack Obama. Their strategy failed, of course, but it’s possible to imagine white identity politics growing more potent and more forthright in its racial identifications in the future, as “the real America” becomes an ever-smaller portion of, well, the real America, and as the soon-to-be white minority’s sense of being besieged and disdained by a multicultural majority grows apace.

This vision of the aggrieved white man lost in a world that no longer values him was given its most vivid expression in the 1993 film Falling Down. Michael Douglas plays Bill Foster, a downsized defense worker with a buzz cut and a pocket protector who rampages through a Los Angeles overrun by greedy Korean shop-owners and Hispanic gangsters, railing against the eclipse of the America he used to know. (The film came out just eight years before California became the nation’s first majority-minority state.) Falling Down ends with a soulful police officer apprehending Foster on the Santa Monica Pier, at which point the middle-class vigilante asks, almost innocently: “I’m the bad guy?”

But this is a nightmare vision. Of course most of America’s Bill Fosters aren’t the bad guys—just as civilization is not, in the words of Tom Buchanan, “going to pieces” and America is not, in the phrasing of Pat Buchanan, going “Third World.” The coming white minority does not mean that the racial hierarchy of American culture will suddenly become inverted, as in 1995’s White Man’s Burden, an awful thought experiment of a film, starring John Travolta, that envisions an upside-down world in which whites are subjugated to their high-class black oppressors. There will be dislocations and resentments along the way, but the demographic shifts of the next 40 years are likely to reduce the power of racial hierarchies over everyone’s lives, producing a culture that’s more likely than any before to treat its inhabitants as individuals, rather than members of a caste or identity group.

Consider the world of advertising and marketing, industries that set out to mold our desires at a subconscious level. Advertising strategy once assumed a “general market”—“a code word for ‘white people,’” jokes one ad executive—and smaller, mutually exclusive, satellite “ethnic markets.” In recent years, though, advertisers have begun revising their assumptions and strategies in anticipation of profound demographic shifts. Instead of herding consumers toward a discrete center, the goal today is to create versatile images and campaigns that can be adapted to highly individualized tastes. (Think of the dancing silhouettes in Apple’s iPod campaign, which emphasizes individuality and diversity without privileging—or even representing—any specific group.)

At the moment, we can call this the triumph of multiculturalism, or post-racialism. But just as whiteness has no inherent meaning—it is a vessel we fill with our hopes and anxieties—these terms may prove equally empty in the long run. Does being post-racial mean that we are past race completely, or merely that race is no longer essential to how we identify ourselves? Karl Carter, of Atlanta’s youth-oriented GTM Inc. (Guerrilla Tactics Media), suggests that marketers and advertisers would be better off focusing on matrices like “lifestyle” or “culture” rather than race or ethnicity. “You’ll have crazy in-depth studies of the white consumer or the Latino consumer,” he complains. “But how do skaters feel? How do hip-hoppers feel?”

The logic of online social networking points in a similar direction. The New York University sociologist Dalton Conley has written of a “network nation,” in which applications like Facebook and MySpace create “crosscutting social groups” and new, flexible identities that only vaguely overlap with racial identities. Perhaps this is where the future of identity after whiteness lies—in a dramatic departure from the racial logic that has defined American culture from the very beginning. What Conley, Carter, and others are describing isn’t merely the displacement of whiteness from our cultural center; they’re describing a social structure that treats race as just one of a seemingly infinite number of possible self-identifications.

The problem of the 20th century, W. E. B. DuBois famously predicted, would be the problem of the color line. Will this continue to be the case in the 21st century, when a black president will govern a country whose social networks increasingly cut across every conceivable line of identification? The ruling of United States v. Bhagat Singh Thind no longer holds weight, but its echoes have been inescapable: we aspire to be post-racial, but we still live within the structures of privilege, injustice, and racial categorization that we inherited from an older order. We can talk about defining ourselves by lifestyle rather than skin color, but our lifestyle choices are still racially coded. We know, more or less, that race is a fiction that often does more harm than good, and yet it is something we cling to without fully understanding why—as a social and legal fact, a vague sense of belonging and place that we make solid through culture and speech.

But maybe this is merely how it used to be—maybe this is already an outdated way of looking at things. “You have a lot of young adults going into a more diverse world,” Carter remarks. For the young Americans born in the 1980s and 1990s, culture is something to be taken apart and remade in their own image. “We came along in a generation that didn’t have to follow that path of race,” he goes on. “We saw something different.” This moment was not the end of white America; it was not the end of anything. It was a bridge, and we crossed it.

Monday, January 12, 2009

Predictors of Success

Finger length may predict financial success
By RANDOLPH E. SCHMID, AP Science Writer - 1 hr 11 mins ago
WASHINGTON – The length of a man's ring finger may predict his success as a financial trader. Researchers at the University of Cambridge in England report that men with longer ring fingers, compared to their index fingers, tended to be more successful in the frantic high-frequency trading in the London financial district.

Indeed, the impact of biology on success was about equal to years of experience at the job, the team led by physiologist John M. Coates reports in Monday's edition of Proceedings of the National Academy of Sciences.

The same ring-to-index finger ratio has previously been associated with success in competitive sports such as soccer and basketball, the researchers noted.

The length ratio between those two fingers is determined during the development of the fetus, and a relatively longer ring finger indicates greater exposure to the male hormone androgen, the researchers noted.

Previous studies have found that such exposure can lead to increased confidence, risk preferences, search persistence, heightened vigilance and quickened reaction times.
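As a rough illustration of the measurement the researchers describe, here is a minimal sketch, not code from the study, that computes a ring-to-index digit ratio from hypothetical finger lengths. The function name and the sample measurements are assumptions made purely for the example.

```python
# Minimal sketch, not from the study: the ring-to-index digit ratio described
# above. Finger lengths below are hypothetical measurements in millimeters.

def ring_to_index_ratio(ring_mm: float, index_mm: float) -> float:
    """Return ring-finger length divided by index-finger length.

    A value above 1.0 means the ring finger is relatively long, which the
    researchers associate with greater fetal androgen exposure.
    """
    if ring_mm <= 0 or index_mm <= 0:
        raise ValueError("finger lengths must be positive")
    return ring_mm / index_mm

# Example with made-up measurements: a 78 mm ring finger and a 75 mm index finger.
print(round(ring_to_index_ratio(78.0, 75.0), 3))  # -> 1.04
```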

In a separate study last year, Coates and colleagues reported that the hormone that drives male aggression and sexual interest also seemed able to boost short-term success at finance.

They studied male financial traders in London, taking saliva samples in the morning and evening. They found that those with higher levels of testosterone in the morning were more likely to make an unusually big profit that day. Testosterone, best known as the male sex hormone, affects aggression, confidence and risk-taking.

In the new study, the researchers measured the right hands of 44 male stock traders who were engaged in a type of trade that involved rapid decision-making and quick physical reactions.

Over 20 months, those with longer ring fingers compared to their index fingers made 11 times more money than those with the shortest ring fingers. Over the same period, the most experienced traders made about 9 times more than the least experienced ones.

Looking only at experienced traders, the long-ring-finger folks earned 5 times more than those with short ring fingers.

While the finger ratio, showing fetal exposure to male hormones, appears to signal likely success in high-activity trading that calls for risk-taking and quick reactions, it may not identify people who would do well at other sorts of financial activities, the researchers said.

Some trading jobs also require additional skills in dealing with clients and sales workers.

And the advantage may even reverse for some, Coates' team said, such as traders who take a more analytical, long-term approach to the markets.

One study, which looked at average finger ratios in university departments, found that faculty in math, science and engineering tended to have relatively longer index fingers rather than ring fingers, they noted.

http://news.yahoo.com/s/ap/20090113/ap_on_sc/sci_financial_finger

Wednesday, January 7, 2009

Sociology in Action




Teens divulge risky behavior on social networking sites

By Serena Gordon, HealthDay Reporter - Tue Jan 6, 8:48 PM PST


MONDAY, Jan. 5 (HealthDay News) -- More than half of teens who use the social networking site MySpace have posted information about sexual behavior, substance abuse or violence, new research shows.
The good news, according to a second study from the same research group, is that a simple intervention -- in this case, an e-mail from a physician -- made some of the teens change their risky behaviors.
"I was surprised, at least to some extent, at how clearly teens were discussing behaviors that we struggle to get out of them," said Dr. Megan Moreno, an assistant professor of pediatrics at the University of Wisconsin-Madison.
"Once we started getting the findings, we wondered, why are they doing this?" Moreno said. "Do they not get it? And, if they don't understand that this is public, can we send them a cautionary message to let them know just how public their information really is?" Moreno was working at the University of Washington and Seattle Children's Research Institute at the time the studies were done.
"We need to devise ways to teach teens and their parents to use the Internet responsibly," study senior author Dr. Dimitri Christakis, director of the Center for Child Health, Behavior and Development at Seattle Children's Research Institute, said in a statement.
Results from the two studies appear in the January issue of the Archives of Pediatrics and Adolescent Medicine.
More than 90 percent of teens in the United States have access to the Internet, according to background information from the studies. About half of all teens who use the Internet also use social networking sites, such as MySpace and Facebook. MySpace boasts more than 200 million profiles, according to the studies, and about one-quarter of those belong to teens under 18.
Moreno and her colleagues randomly selected 500 MySpace profiles from people who reported their age as 18. They collected the information during the summer of 2007.
They found that 54 percent of the profiles contained information on risky behaviors, with 24 percent referencing sexual behaviors, 41 percent referring to substance abuse and 14 percent posting violent information.
Factors associated with a decreased risk of posting risky behaviors included displaying religious involvement or involvement with sports or hobbies.
For the second study, the researchers randomly selected 190 profiles of people between 18 and 20 who displayed risky behaviors, such as sexual information. Half were sent an e-mail from a physician that pointed out that the physician had noticed risky behavior on their profile and suggested changing the displayed information. The e-mail message also provided information on where to be tested for sexually transmitted diseases.
Almost 14 percent of those who got the e-mail deleted references to sexual behavior, compared with 5 percent of the others.
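To get a sense of how large that gap is, here is a rough back-of-the-envelope check in Python; the raw counts (95 profiles per group, 13 and 5 deletions) are assumptions reconstructed from the reported percentages, not figures published by the authors.

import math

# Assumed counts, inferred from "190 profiles, half e-mailed, almost 14 percent vs. 5 percent".
emailed_deleted, emailed_total = 13, 95
control_deleted, control_total = 5, 95

p1 = emailed_deleted / emailed_total      # ~13.7% deleted sexual references after the e-mail
p2 = control_deleted / control_total      # ~5.3% did so without it
pooled = (emailed_deleted + control_deleted) / (emailed_total + control_total)

# Simple two-proportion z statistic
se = math.sqrt(pooled * (1 - pooled) * (1 / emailed_total + 1 / control_total))
z = (p1 - p2) / se
print(f"e-mail group: {p1:.1%}, comparison group: {p2:.1%}, z = {z:.2f}")
# A z value near 2 suggests the difference is unlikely to be pure chance, though the sample is small.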
"This was a creative and unique way to reach kids," said Kimberly Mitchell, the author of an accompanying editorial in the same issue of the journal and a research professor at the Crimes Against Children Research Center at the University of New Hampshire in Durham.
Mitchell advised parents not to try to forbid their children from using these sites altogether. "It's important for parents to understand how important these social networking sites are to kids," she said. "They're here to stay, and they're not all evil. There can be some really positive aspects to these sites. But adolescents aren't necessarily thinking 10 years ahead, when employers or college administrators may look at these sites. Teens live in the here and now, so parents need to talk to kids about the longer-term impacts and help them think through some of the repercussions."
Moreno suggested that parents ask teens to show them their MySpace or Facebook pages. "Teens will definitely balk, but they balk at lots of things, like curfews," she said. "Some parents feel it's a violation of privacy, like reading a diary, but it's out there, it's public."
Parents should use this information as a conversation starter, Moreno suggested.

Tuesday, January 6, 2009

Maslow's Hierarchy of Needs

How far is too far? Whose safety takes precedence?

240,000 dollars awarded to man forced to cover Arabic T-shirt

NEW YORK (AFP) – An airline passenger forced to cover his T-shirt because it displayed Arabic script has been awarded 240,000 dollars in compensation, campaigners said Monday.

Raed Jarrar received the payout on Friday from two US Transportation Security Administration officials and from JetBlue Airways following the August 2006 incident at New York's JFK Airport, the American Civil Liberties Union (ACLU) announced.

"The outcome of this case is a victory for free speech and a blow to the discriminatory practice of racial profiling," said Aden Fine, a lawyer with ACLU.

Jarrar, a US resident, was apprehended as he waited to board a JetBlue flight from New York to Oakland, California, and told to remove his shirt, which had "We will not be silent" written on it in Arabic.

He was told other passengers felt uncomfortable because an Arabic-inscribed T-shirt in an airport was like "wearing a T-shirt at a bank stating, 'I am a robber,'" the ACLU said.

Jarrar eventually agreed to cover his shirt with another provided by JetBlue. He was allowed aboard but his seat was changed from the front to the back of the aircraft.

Last week, nine Muslims, including three children, were ordered off a domestic US flight after passengers heard what they believed were suspicious remarks about security.

Although the passengers, eight of them US citizens, were cleared by the FBI, they were reportedly still barred from the AirTran flight.

Security has been at a high level in US airports since the September 11, 2001 hijacked airliner attacks against the World Trade Center in New York and the Pentagon in Washington.

However, rights groups and representatives of the Muslim community say the security measures have led to frequent discrimination and harassment.


http://news.yahoo.com/s/afp/20090106/ts_alt_afp/ustransportairsecuritymuslimsrights_090106002219/print

Monday, January 5, 2009

KC Native, Gulf War Pilot, MIA or KIA

Posted on Sun, Jan. 04, 2009
Navy board to review status of missing pilot, a KC native

By BEN EVANS, The Associated Press

WASHINGTON – The family of a Navy pilot missing since his plane was shot down in the Persian Gulf War isn’t ready to give up hope that he is alive.

Capt. Michael Scott Speicher’s family says they will oppose any decision to declare him killed in action.

The Navy has scheduled a review board hearing for today on the status of Speicher, who has been missing since January 1991, when his F/A-18 Hornet was shot down in Iraq on the first night of the war.

The hearing comes several months after the Navy received a fresh intelligence report on Speicher from Iraq.

Speicher was born in Kansas City and attended Winnetonka High School before his family moved to Jacksonville, Fla., when he was 15.

Speicher’s family, which has seen the latest intelligence report, thinks Navy Secretary Donald Winter is moving toward changing Speicher’s status from missing/captured to killed, said Cindy Laquidara, the family’s lawyer and spokeswoman.

The family — including two college-age children who were toddlers when Speicher went missing — thinks the Pentagon should do more to determine definitively what happened, Laquidara said. They see the outcome as setting a standard for future missing-in-action investigations, she said.

“This really is a precedent for every other captive serviceman or woman, and it needs to be done right,” Laquidara said. “We’ve looked at the information that’s going to be presented to the board, and we feel pretty confident that it’s not time under the standards that they’ve set to change the status. There are things that need to be done before one can be certain.”

Speicher was the first American lost in the Gulf War.

Some think Speicher ejected from the plane and was captured by Iraqi forces, and potential clues later emerged that he might have survived: The initials “MSS” were found scrawled on a prison wall in Baghdad, for example, and there were reports of sightings.

The Pentagon has changed Speicher’s status several times. He was publicly declared killed in action hours after his plane went down. Ten years later, the Navy changed Speicher’s status to missing in action, citing an absence of evidence that he had died.

In October 2002, the Navy switched Speicher’s status to “missing/captured,” although it has never said what evidence it had that he was ever in captivity.

Another review was done in 2005 with information gleaned after Baghdad fell in the U.S.-led invasion, which allowed American officials to search inside Iraq. The review board recommended then that the Pentagon work with the State Department, the U.S. Embassy in Baghdad and the Iraqi government to “increase the level of attention and effort inside Iraq” to resolve the question of Speicher’s fate.

The Defense Intelligence Agency, which tracks the cases of missing military personnel and works with other intelligence agencies, submitted its latest report last fall.

“Captain Speicher’s status remains a top priority for the Navy and the U.S. government,” Cmdr. Cappy Surette, a Navy spokesman, said recently. “The recent intelligence community assessment reflects exhaustive analysis of information related to Captain Speicher’s case.”

The final decision on changing Speicher’s status must come from the secretary of the Navy; the review board’s decision is only a recommendation, said Lt. Sean Robertson, another Navy spokesman.

Robertson said that once the board meets, it has up to 30 days to complete its report. The family then would have up to 30 days to comment on the board’s recommendation before it is forwarded to the secretary for a decision.

The board will be composed of three officers, including one experienced with F/A-18 aircraft. A legal adviser is assigned to the board, and legal counsel will also represent Speicher, looking after his and his family's interests, Robertson said.

Laquidara said family members would attend the hearing.

“It’s really easy to put out a yellow ribbon but not so easy to allocate resources to find a missing serviceman or woman,” she said. “If Scott’s not alive now, he was for a very long time, and that could happen to somebody else.”

Civil War Submarine Investigations

Posted on Mon, Jan. 05, 2009
CSI Hunley: Fate of historic sub a cold case file

By BRUCE SMITH, Associated Press Writer

It could be one of the nation's oldest cold case files: What happened to eight Confederate sailors aboard the H.L. Hunley after it became the first submarine in history to sink an enemy warship?
Their hand-cranked sub rammed a spar tipped with black powder into the Union blockade ship Housatonic off Charleston on a chilly winter night in 1864 but never returned.

Its fate has been the subject of almost 150 years of conjecture and almost a decade of scientific research since the Hunley was raised in 2000. But the submarine has been agonizingly slow to surrender her secrets.

"She was a mystery when she was built. She was a mystery as to how she looked and how she was constructed for many years and she is still a mystery as to why she didn't come home," said state Sen. Glenn McConnell, R-Charleston and chairman of the South Carolina Hunley Commission, which raised the sub and is charged with conserving and displaying it.

Scientists hope the next phase of the conservation, removing the hardened sediment coating the outside of the hull, will provide clues to the mystery.

McConnell, who watched the sub being raised more than eight years ago, thought at the time the mystery would be easily solved.

"We thought it would be very simple ... something must have happened at the time of the attack," he said. "We would just put those pieces together and know everything about it."

But what seemed so clear then seems as murky now as the sandy bottom where the Hunley rested for 136 years. When the Hunley was raised, the design was different from what scientists expected and there were only eight, not nine, crewmen, as originally thought.

The first phase of work on the Hunley consisted of photographing and studying the outside of the hull. Then several iron hull plates were removed, allowing scientists to enter the crew compartment to remove sediment, human remains and a cache of artifacts.

Thousands of people, many re-enactors in period dress, turned out in April 2004 when the crew was buried in what has been called the last Confederate funeral.

With the inside excavated, the outside of the hull will now be cleaned before the sub is put in a chemical bath to remove salts left by years on the ocean floor. The Hunley will eventually be displayed in a new museum in North Charleston.

Archaeologist Maria Jacobsen said the Hunley is like a crime scene except that, unlike on television shows, there is no smoking gun.

"If we compare this crime site investigation with, say, a tragic plane crash in the mountains, that investigation would be a lot easier," she said. "You can go to the crash, you can see the metal pieces, and they have the fingerprints of the crash site."

In the case of the Hunley, some of those fingerprints may be covered with the encrusted sediment on the hull that scientists refer to as concretion.

When the sub was found there was no window in the front conning tower, suggesting it had been shot out, perhaps by Union sharpshooters.

But no glass was found inside the sub and the remains of the captain, Lt. George Dixon, showed no injuries to his skull or body consistent with being shot while looking through the window, McConnell said.

The crew's bodies were found at their duty stations, suggesting there was no emergency resulting in a scramble to get out of the sub. And the controls on the bilge pump were not set to pump water from the crew compartment, suggesting there was no water flooding in.

After the attack both Confederates on shore and Union ships reported seeing a blue light, believed to be the Hunley signaling it had completed its mission.

A lantern with a thick lens that would have shifted the light spectrum and appeared blue from a distance was found in the wreck.

But after the attack, the USS Canandaigua rushed to the aid of the Housatonic, and there is speculation that the light could have come from that ship instead.

Could the Canandaigua have grazed the Hunley, disabling her so the sub couldn't surface? A good look at the hull in the coming months may provide the answer.

Historians also know the Hunley needed to wait for the incoming tide to return to shore.

"Were they waiting down there and miscalculated their oxygen and blacked out?" said McConnell.

He said a grappling hook, believed to serve as an anchor of the Hunley, was found near the wreck. Cleaning the hull may produce evidence of a rope showing the sub was anchored, perhaps waiting for the tide to change.

Then there is the mystery of Dixon's watch, which stopped at 8:23 p.m. Although times were far from uniform in the Civil War era, the Housatonic was attacked about 20 minutes later, according to federal time, McConnell said.

One theory is that the concussion of the attack stopped the watch and knocked out the sailors on the sub. Or the watch might simply have run down without anyone noticing in the excitement of the attack. That could have led to a miscalculation of the time they were under water.

Union troops reported seeing the Hunley approaching and the light through the tower window "like dinosaur eyes or a giant porpoise in the water," McConnell said.

If the Hunley crew miscalculated and surfaced too close to the Housatonic on their final approach they would not have had enough time to replenish their oxygen before the attack, he said.

The clues now seem to indicate the crew died of anoxia, a lack of oxygen, and didn't drown. "Whatever happened, happened unexpectedly, with no warning," McConnell said.

Running out of oxygen can quickly cause unconsciousness.

"Once you reach that critical stage, it's like you flick a switch," he said. "It's that fast, like on an airplane."

Saturday, January 3, 2009

"Prickly City" takes on the Constitution and the Takings Clause


This comic is from the paper; all credit given, it ran in the December 31st KC Star. Prickly City is one of the few comic strips that leans right. The storyline included here originally ran in 2005 and deals with a fight over whether Walmart can build. Walmart has stirred up controversy in Virginia by proposing to build within a mile of the entrance to the historic Wilderness battlefield (http://www.google.com/hostednews/ap/article/ALeqM5gJpplE9BdF5uEHq4nJg1n5D9o45AD95F6OGG0).
Is Walmart what the founding fathers had in mind when they wrote the Takings Clause? Where can we see it in use today, in our area? Should Kansas City invoke it to build a light rail system? Where should the rail line run? Who should be forced to give up their property?