Thursday, June 05, 2003


The following is wholly, irredeemably cynical. It may also be true. I hope not.

= = = = = = = = = = = =

The presidency is a four-year television program carried on almost every channel and every network. Like other popular reality shows, such as American Idol, the public is given nominal input into the content of the show. We're allowed to vote every four years instead of every week, but we do get to pick the star of the show.

And that, really, is what we do. Regardless of a candidate's policies, qualifications, philosophy of governance or lack thereof, Americans know one indisputable fact about the person they elect as president -- we'll be watching this man on television for the next four years.

This is the biggest show on television. And unlike other broadcasts, it cannot be avoided. The inescapable daily fact of every election is that we are selecting the star of the TV program we'll all be watching -- like it or not -- for years to come.

Jimmy Carter won the White House in 1976 because Gerald Ford was a lousy TV show. The gentle, conflict-averse Carter wasn't much better, of course -- we all loved Mr. Rogers, too, but he was never king of the ratings.

After four years of Carter's PBS-like approach to the presidency, we were glad to replace him with the charismatic movie star Ronald Reagan. His star-spangled showmanship was a smash hit -- Grenada! -- and the public choked on the idea of replacing his show with the Dick Cavett-ish Walter Mondale.

But alas, no show can run for more than eight years. And, anyway, the scandal-ridden, addled Ronnie wasn't as much fun to watch in the second season. His show, like The X-Files, withered in its later years.

So in 1988 viewers were offered a spin-off: George Herbert Walker Bush. As spin-offs go, this had all the potential of Enos, or Flo or After MASH. Jesse Jackson was a promising, entertaining alternative, but the Democrats weren't ready for In Living Color, so they offered Michael Dukakis -- Unliving Pallor. The horrifying prospect of watching four years of Dukakis TV made George Bush seem almost exciting. So viewers opted for the spin-off and bided their time.

For a while, the Bush show was more popular than expected. The writing was terrible, but the Hollywood special effects were really something. Still, the show had no plot, no central theme and no compelling characters. It was cancelled after four years.

Bill Clinton was a natural TV star -- the Letterman to Reagan's Carson. Ratings soared even -- especially -- during his salacious, scandal-plagued second season. (Scandals are the presidential equivalent of Cousin Oliver -- a desperate ploy for ratings in a once-great, but fading series. That's why every two-term president since the dominance of television has been threatened with impeachment after six years or so.)

The proposed Clinton spin-off -- the Al Gore show -- promised to carry on much of the Clinton show's winning formula. But viewers weren't quite as enthusiastic about watching their favorite show without their favorite star. A plurality still wanted to, but nearly as many opted for the Pat Sajak-like George W. Bush, so the broadcasters stepped in and went with a kind of sequel.

That decision, so far, has proved popular -- thanks again to the special effects, now CGI- and WMD-enhanced, used in the first Bush show.

(Ultimately, though, the popularity of the Bush show hinges on its allowing viewers to see the kind of TV spectacular that previous presidents had kept off the air. 9/11 was a ratings bonanza. The Clinton show, in its second season, had decided the Millennium bombing was inappropriate for television, so they kept it off the air. In retrospect, had they allowed that episode to be broadcast, we'd probably be watching their spin-off today.)

So what does this mean for 2004? Who will be the star of the big show for the next four years?

The Bush show remains popular, but it's working too hard. Viewers have gotten a little bit savvier about and less patient with sweeps-week ratings stunts. And for the Bush show, every week is sweeps week. But stunts aside, the show's star is merely good, not great, at television.

What of the alternatives? Howard Dean, this year's Bruce Babbitt, offers some substance, but substance makes for lousy TV. John Kerry and Richard Gephardt are stars, but not top-tier stars. Remember The Geena Davis Show? I'm not sure viewers are apt to want to tune in to either one for the next four years.

Which brings us to John Edwards. The guy looks good on TV. Viewers like him, and they might just decide that this is the show they want to watch for the next four years.

= = = = = = = = = = = =

Do I really believe this? Do I really think that presidential politics -- the selection of "the most powerful man in the world" -- amounts to such a clumsy ratings contest? Do I really believe that elections simply come down to who makes for better television?

Not really. But then again, that's how every election in the last 40 years has gone.

posted by Fred Clark 5:10 PM

Wednesday, June 04, 2003


Tom Friedman's column in The New York Times succinctly restates his position on the war on Iraq. I've always thought that Friedman presented a somewhat credible argument for a war, but that this argument was unrelated to this war. But it is still an argument for war as a first resort -- which is bass ackwards and against the rules.

What's startling is that Friedman shares the Claude Rains-ish surprise so many war supporters are expressing over the unearthing of mass graves in Iraq:

Once the war was over and I saw the mass graves and the true extent of Saddam's genocidal evil, my view was that Mr. Bush did not need to find any W.M.D.'s to justify the war for me.

Set aside Friedman's reckless stretching of the word "genocidal" here. Friedman writes for The New York Times -- doesn't he also read the paper? Why is he so startled at the existence of these mass graves? And why are the mass graves considered more newsworthy than the mass killings that produced all those bodies?

Many of the recently uncovered mass graves contained the bodies of Shiite Muslims killed after the 1991 uprising urged by the first President Bush. The Shiites rose up, the promised American support never arrived, and Saddam slaughtered them wholesale -- even using helicopters against them, some flying with impunity through alleged "No Fly Zones." (Mr. Friedman may have forgotten America's role in provoking, then abandoning, this uprising, but Shiite leaders in Iraq have not. It's a standard part of their anti-American stump speeches.)

This mass slaughter was common knowledge -- it was in all the papers at the time. But why is it that the slaughter was downplayed, buried deep in the paper, while the uncovering of the graves is front-page news? It's as though we consider the mass burial of his victims worse than the killings themselves.

Some of the other mass graves probably date back to before the first Gulf War, back to the 1980s, when Donald Rumsfeld was sucking up to Saddam as a special envoy, pitching Bechtel's pipeline pipe dreams. Back then, America viewed Saddam with the hard-edged pragmatism of Cold War realism. He was the enemy of our enemy, and we ignored Halabja and other atrocities in pursuit of longer-term strategic aims.

So why, now, are Tom Friedman and so many others shocked -- shocked -- to learn that all that mass-killing produced so many dead bodies? What on earth did they expect?

Friedman would have us believe that there is some great lesson about Iraq to be learned from its mass graves. But the real lesson here goes far beyond Saddam Hussein's Iraq. The lesson is this: massive slaughter produces death, even when we ignore it.

Right now, horrific violence and death are a daily occurrence in the Congo. It's in all the papers, but it's not front-page news. Nor does it seem to be within the scope of America's foreign policy. Years from now, perhaps when it becomes politically expedient, the victims whose plight we glibly abide today will be exhumed and we will have the opportunity, once again, to express how utterly shocked we are at the "true extent" of all that death.
posted by Fred Clark 4:29 PM

Tuesday, June 03, 2003


Bad credit? No credit? You're screwed.

Secured credit cards are the widely prescribed remedy for Americans seeking to establish -- or re-establish -- their all-important credit history.

For young people starting out, or recently divorced women (mostly women) starting over, these low-limit cards are held up as a good way to begin rebuilding a record as a responsible debtor. (For the quasi-governmental credit reporting agencies, the only people worse than irresponsible debtors are people with no debts at all.) They're also the suggested course of action for people who -- through a run of bad luck or bad judgment -- have a history of bad debt and missed payments.

Whether they have a bad history or no history at all, Americans without a decent credit rating are financially marginalized. Economic self-sufficiency and financial independence require good standing with the unelected credit bureaus, and one of the most common means of improving this standing is to acquire and responsibly use a secured credit card.

But secured credit cards are an indefensible scam.

Regular credit cards -- unlike credit extended for a mortgage or an auto loan -- allow consumers to borrow without collateral. This is, of course, riskier for the creditor. That's why non-collateralized credit card debt is more expensive. The interest rates for credit cards are far higher than the rates for collateralized loans -- reflecting the added costs of the higher risk. Fair enough. (Within limits).

A person with a higher-than-average risk (i.e. someone with a lower credit rating) will be charged a higher rate for using credit cards than a "preferred" customer whose lower risks allow for lower rates. This is still reasonable -- although it's also a good example of what James Baldwin meant when he said "It's expensive bein' poor."

Secured credit cards are the option of last resort for the very riskiest customers -- those with damaged or nonexistent credit histories. Because these customers carry the most extreme risk, they pay the most extreme costs. The rates for secured credit cards reflect this -- they are far higher than the rates charged for regular cards.

But here is where the logic breaks down and the predatory scam begins. Unlike regular credit cards, secured cards are secured. They are fully collateralized. For creditors, they carry no risk at all.

To obtain a secured credit card, the borrower -- a term that scarcely applies -- must first make a deposit equal to 100 percent of the line of credit offered. A customer puts $500 into an account and is extended $500 in credit. But this isn't really "credit" and it isn't really "extended." These borrowers are, in fact, paying to borrow their own money.
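To put that arithmetic in black and white -- a back-of-the-envelope sketch, in which the deposit, credit line, and interest rate are illustrative assumptions, not any particular issuer's terms:

```python
# Back-of-the-envelope arithmetic for a secured credit card.
# All figures are illustrative assumptions, not quotes from any real issuer.
deposit = 500.00      # cash collateral: 100 percent of the credit line
credit_line = 500.00  # the "extended" credit -- the borrower's own money
apr = 0.20            # a hypothetical secured-card interest rate

# Cost of carrying the full balance for one year:
interest_paid = credit_line * apr

# The bank's uncollateralized exposure if the borrower defaults:
bank_exposure = credit_line - deposit  # fully secured, so zero

print(f"Interest paid by the borrower: ${interest_paid:.2f}")
print(f"Bank's actual risk exposure:   ${bank_exposure:.2f}")
```

Under these assumed numbers, the borrower pays $100 a year for the privilege of spending $500 of his or her own money, while the bank's risk is exactly zero.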

Consider also that this 100-percent collateral is in cash. Unlike the collateral for a home equity loan, it's utterly liquid. And unlike the collateral for an auto loan, it's not subject to depreciation. Actually, it's subject to appreciation -- deposits, after all, are what provide the capital that lets banks make their money.

Secured credit lending is 100-percent risk free for the institution making the loan. How then is the high-risk rate justified? It isn't. It can't be. When a loan is 100-percent secured with liquid, appreciating capital, it matters not a whit whether the borrower is in the good graces of TransUnion, Equifax and Experian.

That loan is safe as houses. Safer, in fact.

The secured-credit scam is defended by the claim that it is more important for low-income Americans to build up their "credit rating" than to build up real-world assets. A $500 bank deposit should be, for our poorer neighbors, a stepping stone to establishing assets and building personal wealth and self-sufficiency. Instead, that deposit becomes a risk-free bonanza for bankers and an excuse to charge usurious rates for the poor to access their own money.

The Federal Trade Commission warns consumers against "secured credit marketing scams." This is a worthy warning -- dubious agents and agencies on the fringe of the financial world victimize poor people with scams masquerading as mainstream secured credit offers from mainstream financial institutions. The FTC is right to be concerned about these fringe players, but the biggest scam of all is being conducted by precisely those mainstream institutions -- there's no such thing as a secured credit card offer that isn't a marketing scam.

So what would a legitimate offer look like? Here's a modest proposal.

Some financial institutions still exist to serve their customers -- institutions like credit unions or those few Savings & Loan associations that ignored deregulation and its temptation to swell to bursting (their model remains the Bailey Bros. Building & Loan from It's a Wonderful Life -- there are still a few George Baileys out there). Small, innovative, community-based financial institutions are also growing all over the country. Any of these would be a good place to start providing an alternative to the secured credit scam.

Such institutions could begin offering secured credit cards with rates that reflect the negligible risks involved. These cards would enable the working poor to establish credit histories while also developing assets. The cards could charge a minimal rate -- I'm thinking 5 percent or less -- to cover the administrative costs of the service. The deposits themselves could also likely cover those costs -- this is, again, the business of banks. But I would prefer to leave those deposits untouched, accumulating modest savings interest and savings habits for the depositors.

Such a system could allow for a reasonable profit for the financial institution, and a far more reasonable rate for the customer. A more aggressively mission-oriented version of such a system could go further. The cards could be offered, for instance, in conjunction with an IDA (individual development account -- see here) to create a more vigorous asset-development program that would also help customers repair or establish their credit rating.

The goals of such a system would be: 1) to create an alternative to the current practice of charging exorbitant, high-risk rates for no-risk, secured loans; 2) to empower low-income households by enabling them to create or rebuild their credit histories without paying punitive rates; 3) to empower these same households by helping them to create assets and accumulate capital; and 4) to help consumers who may have only a marginal relationship to financial institutions to acquire and develop the financial skills and habits needed for a greater degree of economic self-sufficiency. (Another possible benefit: major institutions might respond to this competitive lower-rate offer by lowering their own rates slightly.)

The grant-writer-ese of that paragraph ("empower," "self-sufficiency," etc.) is intentional. Even in these funding-challenged times, I can envision several foundations that would be willing to help fund a trial program along these lines. The outline here is, I realize, still rather sketchy.

I'm hoping for feedback:

1. Does anyone know of a program like this currently in place? If so, how well is it working?

2. What am I missing? What unforeseen or unaccounted for complications might make such an effort unworkable? What are the potential pitfalls of such a program?

Whether or not my proposed remedy is workable doesn't alter the essential point I'm arguing here: high-risk rates for secured credit cards are unnecessary and unfair. This predatory practice is overripe for legislative correction. The current scheme should be illegal.

posted by Fred Clark 2:47 PM

Monday, June 02, 2003


Credit ratings: The quantification of class posing as a measure of character.

Your credit rating -- that arcane score created by the three private agencies that function as a kind of shadow judiciary -- governs a lot more than the kind of house you may be able to live in. And "governs" is the key word here -- these unelected, unaccountable agencies function with a kind of sovereign power. We are not only their objects, we are their subjects.

Credit ratings are now used to screen job applicants and to limit access to health, life and auto insurance. And that's just the uses we know about.

"Credit worthiness" is marketed as a synonym for trustworthiness, but such a thing is not readily quantified. What the ratings actually measure are things like income and assets -- wealth, in other words. They also take into account payment histories -- whether a person has been overdue with or negligent in making payments. This payment history is, of course, merely a way of restating and reinforcing the prior category of income and assets. Payment history is just one way to inflate the simple question of wealth or poverty into a matter of character.

This is what the credit-rating system does. It quantifies and stratifies wealth, while pretending to be a qualitative measure of character.

If a person is poor, it follows that she or he will have difficulty paying the bills. The payment history is more interesting, more significant, if the person in question is wealthy, but still has a history of defaults and late payments. Such a person, it might be reasonable to infer, is untrustworthy. But the credit ratings cannot account for such a case. Imagine two men: one poor, but honest; and one rich, but reckless. Their payment histories will be quantified the same, but since the second man is wealthier, his credit-rating will be higher and he will be regarded as more "trustworthy."

This illusion of measuring character is what accounts for its use by employers and insurers. The perverse result is that an unemployed job applicant may be rejected because she is unemployed.

The quasi-governmental bureaus that compile and oversee credit ratings would protest that their scores do more than simply stratify wealth. For instance, the ratings also consider a person's total load of debt.

Perhaps, but this consideration includes absurdities. A prudent person, for example, will try to pay off their car loan. That will hurt their credit rating. Consolidating credit card debt into a single, lower-interest loan is a responsible, money-saving step. But this too can lower your score. (See Kathy Kristof for details.)

But in any case, these complex and counter-intuitive tertiary measurements cannot overshadow the basic factor that determines credit rating: Are you wealthy or are you poor?

That such a measurement has become so central to our society belies our claims of classless, democratic egalitarianism.

The fact that this crude measurement of wealth is becoming more widely utilized as a refined measurement of a person's character is simply obscene.

posted by Fred Clark 7:08 PM

Sunday, June 01, 2003


My first impression of H. Ross Perot was that he was some sort of Jim Henson creation. The giant ears, exaggerated gestures and improbable voice made me suspect that Frank Oz was at work.

My second impression -- once I realized he wasn't really a muppet -- was that he was Buzz Windrip come at last. Windrip is the plainspoken populist demagogue in Sinclair Lewis' It Can't Happen Here, the man whose unexplored "can do" promises of results for the American people lead to the rise of totalitarianism in the novel.

That was probably unfair, too. H. Ross Perot was evasive, dismissive, erratic and demagogic -- but he probably would not have ushered in totalitarian rule.

Anyway, my point here is to give Perot credit for his one very real, major accomplishment -- he put the federal budget deficit squarely, and unavoidably, in the center of the political agenda. The Clinton administration deserves enormous credit for turning around the snowballing deficits of the Reagan and Bush I years, but some of that credit must be shared with Perot, whose half-hour infomercials on the deficit helped create a groundswell of popular support -- even popular anger -- for balancing the budget.

I've completely lost track of Perot -- is he even still alive? If he is, I hope he's reading the newspaper. The deficit is back, bigger than ever. Ol' Ross needs to get his pointer and easel out of storage, whip up another batch of pie charts and rent himself some air time.
posted by Fred Clark 1:55 PM


Politicians think more gambling is good, more civic duty bad

The item below, or something like it, will be appearing soon in an evangelical magazine. The unspoken theme here is the way that our Frat Boy in Chief has eroded any sense of civic responsibility -- sneering at those who would suggest that citizens must pay taxes, or that the war on terrorism requires us to do anything more personally costly than just going shopping. This contempt for responsibility -- for substantial, as opposed to symbolic, patriotism -- makes it very difficult for the states to address their fiscal crises with legitimate revenue from taxes. As a result, they're building more casinos and expanding the lotteries, trying to balance their budgets by ripping off the poor and the stupid. Anyway, here's a sneak preview at an upcoming issue of Prism magazine ...

= = = = = = = = = = = = =

In the year 2000, the federal government had a budget surplus of $200 billion. This year, it will have a deficit of at least $300 billion -- the largest shortfall in history.

The states, too, are broke. A stagnating economy, reduced federal support and unfunded security mandates following 9/11 have put 47 of the 50 states into the deepest fiscal crisis they have experienced since World War II. Across the country, state budgets are being slashed -- school libraries are closing, bridge repairs go unfinished, safety nets are frayed.

States are taking desperate measures. Every third lightbulb has been unscrewed in Missouri's state offices, Timothy Egan reported in The New York Times. Egan also tells of Kentucky's releasing prisoners early, of Connecticut's laying off state prosecutors, and of police in Michigan selling advertising on the sides of patrol cars. But even these drastic steps cannot make up for the record shortfalls facing so many state budgets. The states need revenue.

Twenty-five years ago, New Jersey lawmakers took a drastic step of their own to save the dying resort town of Atlantic City. They invited developers to build casinos, hoping to create an American Monte Carlo. That didn't happen -- thousands of nickel slots don't afford the glamour of the Riviera -- but they did succeed in turning the seaside town into an East Coast, organized-crime-free version of Las Vegas.

Results have been mixed. Beyond the Boardwalk, much of the city remains in poor shape. But they do have a state-of-the-art high school funded by part of the $70 billion that gamblers have spent in the city since 1978. And the $5.6 billion the casinos have paid in taxes has gotten the attention of elected officials elsewhere.

Other states have tried to stake out their own piece of the action. Cash-strapped states, like all gamblers, are tempted by the chance to make money without making anything else -- to generate income without creating wealth.

Efforts to expand gambling revenues are under way in at least half of the states -- from Olympia to Tallahassee.

States are expanding their lotteries -- adding more drawings and new "games" and increasing their advertising, which critics complain is heavily focused on poor and minority neighborhoods. If your state isn't yet part of Powerball, it will be soon. Lottery ads are also conveniently exempt from federal truth-in-advertising rules, so you'll never hear Ray Charles singing about how the odds of winning Powerball are one in 121 million.

And more gambling is on the way. Maryland Governor Robert Ehrlich plans to line the Chesapeake with slot machines. Ehrlich has staked his governorship on gambling revenue, which he vows will balance his state's budget without citizens having to pay a penny more in taxes.

Delaware already derives 9 percent of its state budget from its lottery and racetrack slots, but state legislators are pushing for organized sports betting -- legal only in Nevada, Oregon and Delaware.

Pennsylvania plans to add slot machines at state racetracks, including the new track being built in impoverished Chester. Racetrack slots, which give new meaning to "Off Track Betting," are being considered in several other states as well. In Michigan, proponents call them "racinos" and claim that they could provide Detroit with the kind of revenue boost that riverboat casinos have given Tunica, Miss.

The country has also seen an explosive proliferation of venues operated by Indian tribes. Americans spent $14.1 billion in Indian casinos in 2002, an 11 percent increase over the previous year. The tribes, as sovereign nations (when it's convenient for the United States to allow them to be), are exempt from taxes. But cooperative "compacts" with state governments let the states share the take. In 2002, according to Los Angeles-based analyst Alan Meister, state and local governments reaped $563 million from compacts with 348 tribal casinos.

One objection to this increasing legitimization and expansion of gambling is that it hurts the poor. The people most likely to play the lottery or to blow their paycheck on the slots are the people least able to afford the statistically inevitable losses. When your household expenses exceed household revenue, a lottery windfall is a powerful temptation.

It's an imprudent choice, but prudence holds little hope for the poor.

The states are behaving just like these households. Giving up hope for increased legitimate revenue, they turn to gambling.

The difference, of course, is that the states control the odds. In the long run, the house always wins. This house advantage is what makes gambling appear a safe bet for state leaders.
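That house advantage can be put in rough numbers. Here is a quick expected-value sketch, using the 1-in-121-million Powerball odds mentioned above, plus an assumed $1 ticket and an assumed $100 million jackpot (and ignoring smaller prizes), purely for illustration:

```python
# Rough expected value of a lottery ticket.
# Ticket price and jackpot are assumed figures for illustration;
# smaller secondary prizes are ignored.
ticket_price = 1.00
odds_of_winning = 1 / 121_000_000  # the Powerball odds cited above
jackpot = 100_000_000.00           # an assumed jackpot

# On average, each ticket returns a fraction of the jackpot...
expected_payout = odds_of_winning * jackpot
# ...and the rest of the ticket price is the house's take.
expected_loss = ticket_price - expected_payout

print(f"Expected payout per ticket: ${expected_payout:.2f}")
print(f"Expected loss per ticket:   ${expected_loss:.2f}")
```

However the assumed jackpot is tuned, as long as the expected payout stays below the ticket price, every ticket sold is -- on average -- revenue for the house. That is the whole business model.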

But this is only true if you do not consider that for the state budgets to win, somebody has to lose. Gambling is a zero-sum game. And the patsies in the states' big push for gambling are the residents of those very states.

This is a perverse notion of statecraft and public service. A state cannot advance its overall health by preying on its own people. It's a Möbius-strip economic model recalling Mark Twain's description of the poor neighborhood where people "made a precarious living taking in one another's laundry."

States respond to this charge by claiming that expanded gambling will draw revenue primarily from out-of-staters. Thus Gov. Ehrlich says his casinos will draw gamblers mainly from Delaware and Pennsylvania. Pennsylvania claims its Chester racino will draw gamblers primarily from Maryland and Delaware. And Delaware legislators say sports betting will draw gamblers from Pennsylvania and Maryland. (This fleecing of outsiders is central to the success of the tribal casinos. Foxwoods in Connecticut is slowly buying back Manhattan, $40 at a time. While not exactly justice, it is rather like karma.)

They can't all be right. Very likely, all are wrong.

Atlantic City, 25 years ago, had East Coast gamblers all to itself. But the city of Monopoly -- like its model, Las Vegas -- has lost its regional monopoly on gambling. It's like when you're at a ballgame and you can't see, so you stand up -- that gives you a clearer view and an advantage on those seated around you. But then everyone else stands up too and you lose your advantage, your view and your seat.

The states' claims that their expanded gambling will prey primarily on outsiders are false because everyone is standing now. The states' new gambling enterprises will, like their lotteries, feed primarily on the losses of their own residents.

The assumption, largely unchallenged, is that citizens are better off losing their money at a sucker's game than they would be saddled with the civic responsibility of paying taxes.

The states did not invent this idea; it's all the rage in Washington. It's part of why that word "citizens" sounds vaguely archaic, like some relic from ancient Greece. Athens had citizens. America has "consumers."

The states' desperately needed revenue will come either from gambling or from taxes. Gambling, perversely, is considered less burdensome.

The good news is that from Annapolis to Springfield to Olympia, elected officials are standing against the tide of expanded gambling. A bipartisan coalition of Maryland legislators has, for now, staved off the bayside casinos. Illinois Governor Rod Blagojevich has rejected any expansion in gambling, while increasing the tax rates paid by the state's casinos. And Washington legislators, who agree on little else about that state's ballooning deficit, have agreed that they will not turn to gambling to cover the gap.

These public servants need the support of their constituents to reject the promised magic solution of gambling fever. Let them know that there are still a few citizens left.

posted by Fred Clark 1:36 PM
