Tuesday, March 31, 2009

"Can-do" attitude

I don't know exactly how I feel about what actions we should take concerning the Bush-Cheney torture activities. Truth commission, war crimes trials (if The Hague doesn't get there first), investigation followed by censure - I have not been able to reach a conclusion on which choice is best.

Our nation has trouble dealing with bad things done in our name or by our representatives. White people certainly benefited from the destruction of the indigenous population, and our attempts to make it right have never seemed satisfactory. Ford pardoned Nixon, reasoning that the country had already been through enough; I can't say if the catharsis of a trial would have made things better.

The Clinton impeachment showed the alternative at its worst, as it devolved into a partisan witch hunt. (I should say more about this, because I think quite a few people shared my opinion. The Monica Lewinsky business was, yes, a personal matter that did not merit impeachment. But it was reprehensible for this activity to take place in one of our hallowed national places, the Oval Office. It demonstrated Clinton's recklessness and arrogance, and we should never forget that.)

So I don't have anything to offer as to what we should do about the perversion that was Bush-Cheney policy. I have this vague sense that we can't just leave it as it is, let it go while we deal with our pressing problems. To see these two men strutting around, Cheney in particular, proudly defending one of the shameful periods in American history, is an offense to everything I've ever believed was right about this country.

On the other hand, to allow the real business of the nation to be sidetracked while a Democratic Congress postures for the people back home seems a poor solution also. Letting the likes of Pelosi and Reid, sham public servants that they are, use such a proceeding as a means of gaining political capital disgusts me.

For now, then, I'll leave the discussion to others. Andrew Sullivan has written a lot about this issue, apparently believing that no stone should be left unturned in determining the truth and bringing everyone involved to account. In a Sunday post, he again takes up the case of Abu Zubaida, someone we tortured though we had evidence that he was a low-level al-Qaeda functionary. Apparently, whatever useful information he had, essentially names, he gave up before the torture actually began - no plot was foiled as a result of our "interrogation."

[Here's one thing I never quite understand. If you were running a terrorist outfit, and someone with critical information disappeared, wouldn't you then change your plans? "Hey, Osama's not in his cave any more. Well, maybe he's just on vacation, let's go ahead with those attacks exactly as planned." I know that some would say you extract the information before anyone knows he's gone, which is inevitably the situation on 24, but it doesn't appear that our post-9/11 procedures were restricted to that.]

Sullivan quotes an article from the Washington Post:
As weeks passed after the capture without significant new confessions, the Bush White House and some at the CIA became convinced that tougher measures had to be tried. The pressure from upper levels of the government was "tremendous," driven in part by the routine of daily meetings in which policymakers would press for updates, one official remembered. "They couldn't stand the idea that there wasn't anything new," the official said. "They'd say, 'You aren't working hard enough.' There was both a disbelief in what he was saying and also a desire for retribution -- a feeling that 'He's going to talk, and if he doesn't talk, we'll do whatever.' "
I think I understand the mindset that gave rise to this attitude. George W. Bush was in so many ways the epitome of a bad CEO. (This isn't surprising in that he was, in fact, a bad CEO.) I've worked for a few (fortunately, very few) people who adopt a credo that any failure is the result of not making sufficient effort, that something not getting done can be overcome by more hard work.

There are, of course, times when that's true. There are, of course, other times when that's not. A great deal of wisdom is understanding the difference.

Some things are impossible or infeasible. You can push someone to create faster-than-light travel by next week, and it still won't happen. You can insist on extracting information from someone who doesn't have any, and you may get information, but it won't be true.

When one works for someone who doesn't understand this, the negative effects go beyond the wasting of time. It creates a lack of trust up and down the chain of command. Imagine being the person who had to go into the Oval Office and say, "This guy clearly doesn't have anything more to give," and to be told, "You're wrong, go back again." Your boss is saying, in effect, that you're an idiot who doesn't know his work, doesn't have the ability to distinguish between the hard and the impossible. I can tell you, that makes for a very corrosive work environment.

Monday, March 30, 2009

Participation and democracy

John McIntyre at You Don't Say has begun another fascinating thread, which started in response to a blogger "rejoicing in the death of the newspaper." (McIntyre's second post on the subject is here.) David Eaves essentially makes this case:
  • Participation leads to greater democracy.
  • Old media is not participative, it's presented.
  • New media is participative.
  • Therefore, new media is more supportive of democracy.
  • So, we should be happy that new media is replacing old media.
McIntyre does such a good job of dismantling this argument that I should probably leave it alone, but I won't.

He questions the first point, arguing persuasively that it is "accurate information" that leads to greater democracy. This seems right to me; I've never actually believed that mere participation was sufficient for democracy. I'm not one of those who thinks voting should be mandatory, or that the votes of those who vote for the Irish name (or whatever criterion they use) should count as much as those of people who spend time evaluating candidates and issues (not that there should be a test, just that showing up is as close a proxy for knowledge as we're likely to get).

It's almost too easy to point out that participation as measured by blog comments or Wikipedia is a false democracy, as blather (or, all too often, hate) tends to permeate the "discussion":
Look at the news stories on sites that permit comments — a story followed by, say, 157 comments. After the first dozen, the responses become duplicative. Cranks arrive. If the comments are not mediated, there’s an excellent chance that the “conversation” will sink into vulgar abuse and the racists and anti-Semites will crawl out into the sunlight.
The second point above seems slightly more persuasive in an era of Fox News and other overtly partisan outlets. The profit motive has too often replaced judgment, but, and I think McIntyre would agree with me, the solution is not getting rid of old media, it's raising our expectations of it. He effectively argues that the old system has served us quite well, for the most part:
It has been the task of the newspaper to perform the functions necessary to permit such reflection and choice: investigation and reporting, selection of significant information, verification of its accuracy, and publication in a clear and compact form. And despite the sneers at our outdated 19th-century industrial model, with reporters answering to assigning editors and copy editors independently examining the texts, we can still do the job tolerably well.
I'll add one more thing: the argument laid out by Eaves depends on a hidden assumption, that new media and old media are independent, that one can replace the other with no loss. That is nonsense: if old media disappears, new media will have little raw material with which to work.

[I flipped McIntyre's title, Democracy and Participation. I'll leave it to the reader to determine what percentage of that is homage, what percentage laziness.]

Health care jobs

Let me add a bit more to my argument that President Obama, for all his good intentions to control health care costs through computerization, is missing some of the more important aspects of the transition. As I've written before, there seems to be a theory at the White House that pumping money into getting medical records from paper into the computer will: cut costs markedly, decrease life-threatening errors, and create (or, in the strange and unquantifiable term, "save") jobs.

I'll dispense with the second item first, if only to say that I have no idea to what extent computerization will save lives. It's pretty easy to posit situations in which that will help; for example, in caring for an elderly person who can't remember their drug sensitivities. This all sounds good, to imagine a system in which test results go directly from the lab into the Multivac, available for all downstream health providers to see.

Of course, there are also some massive security and policy issues, and I have no idea how we'll deal with that. As it is, we go to the doctor's office and routinely sign our HIPAA form, but with only the vague sense that we're giving access to the people in that office. How will we feel when we realize that we're letting anyone who has access to the system around the world see the details of our care? And we'll have to do that to make some of the more rosy scenarios come to pass. If we want the French paramedic to be able to treat us optimally at the scene of our car accident, they need to be able to look at our health history. (In a world where we allow offshored mortgage processors access to our financial statements, maybe this isn't so much of a concern.)

To put a financial value on this right now is almost impossible - our data is weak when it comes to understanding just how many people are injured or killed due to a lack of the information that would presumably be available. And I need to stress that, all things being equal, I'm very much in favor of this kind of initiative. I'm just trying to explore whether we can expect all the gains that have been promised.

As for my first and third points, at first blush they seem incompatible. We all know that the major component of any business expense is personnel (at least that's what we're told when the heads roll). If we're cutting costs while preserving or expanding jobs, then there must be some magic expense that's hidden in the health-care system that has nothing to do with people.

Except there is no magic: the money that will be saved comes directly from letting go of the people who handle the paper. I go to my doctor's office, and there are a whole lot of non-doctor people drifting around, more than seem necessary to answer the phones or take my blood pressure. Of course that's because many of them are handling paperwork, whether it be patient records or insurance forms. Automate that and, for better or worse, those people are gone.

The point might be raised that these people will transition into the new health care jobs that will be created by this vast modernization project. And here's where the details get in the way of the theory.

I've been involved with several projects that take a paper-based way of accomplishing something and put it on the computer. Here are the jobs that come during this process: subject matter experts (SMEs) who create the specs for the new system, designers and developers who create the new system, and clerical support that somehow gets old records into the new system. There are some ancillary jobs, managers and testers and the like, but these would be the three major categories.

So we have two questions: where will the "old" people fit into this process, and who exactly will get jobs in each of these categories, that is, how much will this big-P project actually enhance the employment situation for Americans?

Unless there are some closet programmers in the ranks of the support staff at my doctor's office, their new jobs will have to come as SMEs or scanners. But there is no way that the staff-to-SME ratio is one-to-one; the SMEs will tend to come from the ranks of the most senior office staff. Because this is to be a national system, there will be relatively few experts enlisted in setting the specifications.

And few of these trained staff people will want to move to the scanners. That will be mind-numbing work, whether it consists of literally scanning in massive paper files or retyping a doctor's chicken scratches. (This is also the place where compromises will be made for time and cost considerations. Dead people are unlikely to get computerized records, and there may be an attempt only to get the last 10 or so years into the system.)

Furthermore, these jobs will either pay very little, or they will be outsourced (or, maybe, offshored?). This is public money, so there will be a lot of pressure to "Let Brown Do It," to ship boxes of medical records to a concrete-block building in North Dakota (or Bangalore or Manila) to be converted into a new file format.

Essentially, these jobs will be lost in the interests of efficiency, and that might be a good thing in terms of productivity. But it seems at odds with one of the major goals of the program.

Here is the place where I can reiterate my earlier point that even the hard-core IT jobs are unlikely to create a job boom for Americans. The offshore BPO (business process outsourcing) firms are reportedly already buzzing around this business, and I really don't see how we'll be able to withstand the cost differential. If I had to guess, I would say that very few jobs will actually be created in the U.S. to build the system. There will be a few - UPS may be able to add some people, and we'll need some technicians for installation and training - but it isn't even clear that the number will come close to the jobs that are lost.

Once again, we should embark on this effort; it only makes sense that a modern country has modern medical records, and there will undoubtedly be gains in safety and efficiency. But let's not fool ourselves that this can lead to a jobs boom; there is absolutely no evidence for that.

[Note: I have omitted other issues here, such as legal liability and the false certainty of "it's on the computer, so it must be right." These are potentially large, absolutely non-trivial, and perhaps they'll find their way into another post one day.]

Sunday, March 29, 2009

A math primer

Not a very light topic for a Sunday, perhaps, but I like math, so this post seems like fun to me. I'll try to keep it brief, but I think I need to pull together some of the myths that drive thinking about mathematical topics. I've written about one or more of these before, but it's important to keep them in mind.

1) "Exponential" growth

One reads all the time about something that's experiencing exponential growth. Cell phone penetration, chip utilization, and so forth: many things are seen as following an inexorable tendency toward ever-accelerating growth.

Of course, that's not true, not possible, and I wrote about it about a year ago. But it bears repeating. Growth in any real situation more likely follows the logistic curve, the elongated S that features slow growth at the beginning, apparent exponential growth in its middle phase, then a flattening at maturity as some kind of natural limit presents itself.

There is an upper limit on the number of cell phones we can possibly have; we can differ on that limit (I'd say 30 billion is higher than we will ever see), but it exists. Therefore, any business model or op-ed that depends on "exponential growth" is bound to be wrong. More importantly, past the middle of the S, the rate of growth begins to slow down. Finding this inflection point is vital to understanding the profile of the market in question.
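To see why "exponential" growth can't last, it helps to put the two curves side by side. Here is a minimal sketch in Python (the growth rate r and the ceiling K are illustrative values of my choosing, not data about any real market):

```python
import math

def exponential(t, x0=1.0, r=0.5):
    # Unbounded growth: keeps doubling on a fixed schedule forever
    return x0 * math.exp(r * t)

def logistic(t, K=100.0, x0=1.0, r=0.5):
    # Bounded by the carrying capacity K; looks exponential early on
    return K / (1 + ((K - x0) / x0) * math.exp(-r * t))

# Early on the two are nearly indistinguishable...
print(exponential(2), logistic(2))    # roughly 2.72 vs 2.67
# ...but the logistic curve flattens toward its ceiling K
print(exponential(20), logistic(20))  # roughly 22026 vs 99.55
```

The early agreement is exactly the trap: from the bottom of the S, the two curves are nearly indistinguishable, which is why so many trend-spotters mistake a logistic process for an exponential one.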

2) Curve fitting

Much of the current discussion about what kind of recession we're having depends on historical precedent. Economists take what happened in the past and extrapolate it to the present, despite a lack of relevant data points (is our current situation like the Great Depression? Who knows; we've only had one).

This attitude is typified by a recent post from Kevin Drum, where he argues:
If you want to know what's going to happen in the future, you should pay attention to what's happened before...."This time it's different" is probably the most dangerous phrase in the world. It's especially dangerous because every once in a while it's true. But not often.
Drum is arguing for a kind of determinism that is actually more dangerous than the phrase he warns against. I'm not contending that one should accept convoluted arguments to ignore what is patently obvious, just that most real-world models are complex, made up of many components. Ignoring changes to assumptions or conditions has its own perils.

Some simple math. If you tell a student to fit a curve to data points (1, 2) and (2, 4), most will pretty easily come up with y = 2x. But here's what I can do: I can fit a slightly more complicated curve to these points: y = 2x + k(x - 1)(x - 2), where k is any number I want. This is a family of parabolas that pass through the two given points, but I can make one pass through any other point I want, simply by choosing the right k. (And there are literally an infinite number of other curves I could fit, including circles and squares and triangles.)

My point is this: the fewer data points we have, the less likely we are to get a model that predicts anything. If I give you those two points and ask you what y will be when x = 3, you may say 6, but I can make a model that will give any answer I desire. We need to be quite wary about using the past to predict the future.
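The parabola family is easy to check in code. A short sketch (the helper k_for_target is my own illustrative name, not a standard function) showing that every member hits the two given points, yet the value at x = 3 can be sent anywhere:

```python
def fit(k):
    # One member of an infinite family of curves through (1, 2) and (2, 4)
    return lambda x: 2 * x + k * (x - 1) * (x - 2)

# Every member passes through both given points, whatever k is
for k in (0, 5, -3):
    f = fit(k)
    assert f(1) == 2 and f(2) == 4

# At x = 3, the family gives y = 6 + 2k, so any target is reachable;
# solve for the k that produces it
def k_for_target(y, x=3):
    return (y - 2 * x) / ((x - 1) * (x - 2))

f = fit(k_for_target(100))
print(f(3))  # 100.0, the "model" predicts whatever we want
```

Two data points constrain the line, but not the family; the "prediction" at x = 3 is whatever the modeler's choice of k says it is.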

3) Nonlinearity

[I'm going to be brief and imprecise here, give just enough detail to make my point.]

Nonlinear systems are, roughly, any real system of any complexity. They feature components that depend on one another, that are changed by the changes they themselves generate in other components. Weather is one such system. I suspect the global financial system is the same.

The main result of the study of such systems is that small changes in initial conditions can bring about huge changes down the road. One person decides to go out for a drive in Iowa, and the extra heat generated by burning the gas brings about a monsoon in Bangladesh. (That's an extreme example, and there is evidence that some systems, including weather, feature dampening effects that keep such extremes from happening, but the math is consistent with that.)

Therefore, it's virtually impossible to get a precise forecast of what any real nonlinear system will do. I would wager that we'll never have absolutely exact weather forecasts ("that corner will be 74º with 48% humidity and a NNE wind at 7.6 mph a week from Thursday").

By the same token we can't really predict what, for example, a given stimulus package will do. It is not impossible that an $800 billion package will work perfectly, creating jobs and improving GDP, while $801 billion will force the economy into a chaotic condition that will be worse than what we have now (and, oddly, $802 billion might lead to a perfect result again). But we don't know that ahead of time, and we can't.
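The textbook demonstration of this sensitivity is the logistic map, x -> rx(1 - x), which is chaotic at r = 4. A minimal sketch (the starting point, perturbation, and step count are my own illustrative choices):

```python
def iterate(x0, r=4.0, steps=50):
    # Logistic map x -> r * x * (1 - x); chaotic for r = 4
    x = x0
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

a = iterate(0.2)
b = iterate(0.2 + 1e-10)  # nudge the start by one part in ten billion
print(abs(a - b))  # typically order 0.1 to 1 after 50 steps
```

The perturbation roughly doubles each step, so even a one-part-in-ten-billion difference in the starting point saturates into a completely different trajectory within a few dozen iterations; that is the whole forecasting problem in miniature.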

So we need to take all the certainties and forget them, and just hope that we will move in the right direction. But we also need to be realistic and understand that...no one really knows.

Saturday, March 28, 2009

Happy happy joy joy

There continues to be a prevailing attitude that a lot of our problems are simply the result of pervasive pessimism, that, if we could just perk up and feel good about what's going on, we would see a resurgence of financial strength and national pride. The media comes in for particular criticism here; they insist on talking about layoffs and unemployment and housing prices and dead 401(k)s, and they bring everyone down, and that creates a vicious cycle that makes things even worse.

Rob Horning discussed this in an item that ran about a month ago, where he parses an op-ed by Robert Shiller that replicates the idea of his book (with George Akerlof), Animal Spirits (which has given us one of the more blogged phrases of the past month), that "the Depression narrative could easily end up as a self-fulfilling prophecy." Horning:
If people invest, or spend, not out of strict need or want, but in accordance with how they feel about wanting, then the implication is that the media owes society some happy talk about the economy to keep up the “animal spirits”—Keynes’s term for the irreducible ambition that drives entrepreneurs regardless of their probability of success.
Akerlof has gone so far as to argue that government's role is one of countercyclical confidence: cutting down people's confidence when times are good, and pumping us up when times are bad. Horning quotes Will Wilkinson, who is not very pleased with this reasoning:
I’m extremely suspicious of what strike me as intellectually contentious, ad hoc interventions into the economy aimed at expectation management. Countercyclical economic mood-control initiatives seem to me inconsistent with the maintenance of a general framework of stable rules — that is, they don’t take the importance of expectations seriously enough — while also smacking of illiberal state propaganda.
A skeptical Horning concludes:
Delusional thinking about credit risk got us into this mess, so now the only thing to get us out is more widespread and more doggedly institutionalized delusional thinking? All right then! Not sure how this would help the “trust” and “faith in the system” components of animal spirits, but oh, well. Maybe if we perfect the dissemination of these delusions, we’ll be free at last from those ultimately irrelevant real economic conditions, and the state can just drop in to tell us what condition our animal spirits should be in.
I liken the Shiller-Akerlof attitude to our mental model of how football works. Conditioned by years of "Win one for the Gipper" and "Do you believe in miracles?," Americans have come to believe that any situation can be overcome by the right frame of mind and simply "wanting it more." All the economy needs to come back strong is the right halftime speech, and we will all be inspired to do anything necessary for success.

This seems compelling until we think about it for, say, 15 seconds or so. A football team made up of 160-pounders can be as inspired as we want, but, barring massive injections of PCP, is going to be crushed by the 300-pound linemen of a major college or pro team. No amount of "animal spirits" is going to change the reality that facts are facts. If you've lost your job, confidence doesn't permit you to believe you still have one.

Anyone who's worked in business for a while has seen an example or two of the person who brings very little competence, but a whole mess of confidence. Enthusiasm is confused with ability; that's possibly the result of management's own experience, that projection of a certain image can bring real results, whether in sales or the conference room.

But that should have nothing to do with hiring a programmer or an accountant, jobs where skills actually matter. You can trump up a theory that a positive attitude helps any team, and that may be true in the short run, but, eventually, management's regard for someone who can't actually do anything but be peppy corrodes the team.

The same is true of the economy as a whole. To say that everything's fine, just ignore those pesky media reports or the for sale signs, is to brand yourself as a nincompoop. It's possible to project long-term optimism while being realistic about today; that's something, I think, that Obama has done quite well (though I personally still find his outlook overly rosy). Irrational optimism, optimism that is at odds with actual on-the-ground conditions, is foolish, and the purveyor of such ideas should not be listened to.

Friday, March 27, 2009

I have failed

I know that you, Gentle Reader, look to me for my opinions on issues of globalization. As such, I imagine that you are all anticipating my take on yesterday's Charlie Rose show, not the part on female desire, but the interview with Nandan Nilekani, co-chair of big Indian offshorer Infosys, who has a new book, and Thomas Friedman of the New York Times, who has the same old books to plug.

I wanted to watch and listen and give my impressions, to offer up some insights as to what I saw as flaws in the logic, on how these high priests of globalization would discuss changes to the model in light of the world financial crisis. Of course, I didn't expect them to take up basic questions of how American politicians justify visa policy or offshoring, but I could hope.

And then they launched into the interview, and the first thing they talked about was how Nandan gave Tom the idea for his Einsteinian revelation that, THE WORLD IS FLAT. It's a story that sums up the work of Friedman so well that I recreate it here (from the transcript on Rose's web site):
CHARLIE ROSE: The least. Just quickly, let’s review that. What happened? You were in India.

THOMAS FRIEDMAN: And Nandan was actually out of the country when I came over. And we had shot about 60 hours of film over 11 days. And over those 11 days, I was getting this sinking feeling that something really big had happened with globalization, and I had missed it. But I couldn’t quite put my finger on it.

And the last interview was with Nandan. He just came back. And the film crew was setting up in his office. I was sitting on the couch outside. Nandan came out, we sat down, I took out my laptop, as is my habit, and started interviewing him.

And at one point, Nandan said, “Tom, I’ve got to tell you, the global economic playing field is being leveled.” And it all just sort of -- I said global economic -- leveled -- and I wrote down in my notebook -- and we did the interview. But that just stuck in my head. And in the Jeep on the way back to the hotel, I just kept going over the global economic playing field is being leveled. I said the global -- but Nandan said global economic playing field is kind of being flattened. And then it just sort of popped into my head that what Nandan Nilekani, India’s premiere engineer-entrepreneur was telling me was that the world was flat, and that was a book.

And then we actually went over to his house that day. It was your birthday, I think, right?

You know, historians of the future are going to be so lucky. We have little idea how Tolstoy came up with the concept of War and Peace, but we have in laborious detail the story of how Tom Friedman came up with the flat world idea. And we will also know that Friedman was a very important man, because he dealt only with very important people, and was even invited to their homes on his birthday.

I don't want to belabor the infelicities that define the "thought entity" that was Tom Friedman (but I can't resist one; as David Smick showed in his book, the financial world is curved, even crooked; Friedman's flat world is a vinyl sheet laid on a non-smooth underfloor, and we're seeing the results of that now - it's a lousy analogy, and I'm royally tired of it), but, as he launched into this story for the umpteenth time, I couldn't take it any more. I grabbed the remote and turned to something, anything else.

You see, I could just tell how this interview was going to go. Friedman, there for no appreciable reason other than to give his NYT bestseller cred to his buddy's new book, was going to butt into the conversation whenever he could with yet another faux insight, and Rose would let him get away with it. Nilekani would likely get very little time to talk about his book, even though Charlie would hold it up a couple of times. The viewers would gain no insight into, well, anything. Then Charlie, Tom, and Nandan would head out to one of New York's finest steakhouses for a big piece of beef and a couple of bottles of wine, while Charlie and Nandan would wait around for another Friedman gem ("It's interesting that beef and wine are both red, and businesspeople love both of them, but they hate it when their numbers are in the red - there's a book there, and I'll call it 'The Redding of Business.'").

In the interest of serving my public, I went back today and checked out the transcript of the interview (mercifully, there isn't one of the steak dinner). And it went just as I said. Nilekani made some banal observations about the challenges facing India today - his book may be better than that, I don't know. Any attempt to dig deeper was interrupted by Friedman's usual drivel (six-lane superhighway, market and Mother Nature hitting a wall, sustainable living, we're not too big to fail - all the usual stuff that real people have been seeing for years, but comes as revelation to this Pulitzer winner).

What's sad is what's always sad. There is a real opportunity to ask Nilekani some tough probing questions. They don't have to be adversarial, they just have to be firm, recognizing the challenges that confront these two nations. Maybe he's dealt with them in his book, maybe he hasn't, but they are questions that won't be asked on any broadcast outlet in this country other than, possibly, PBS.

And Charlie Rose missed this chance, so busy was he offering hospitality just short of cigars and brandy. But letting Friedman ramble on in the same vein as he always does isn't politeness; it's a disservice to the viewers. Rose needs to emerge from what I like to call his Friedman fog; if Tom's presence isn't a help, get him off the show.

Thursday, March 26, 2009

Martha and Bernie

I try to be philosophically consistent, or explain when it doesn't appear so. For example, I support making education better, but don't believe that we can try to educate everyone to the same level and then be surprised when we fail to see the outstanding kids do as well as we need them to do. Is this inconsistent, that education needs to be better but, for some students, we need to get them off the college track? I can see where some might think my logic doesn't hold up, but I've laid out my thinking in several posts and ask only for people to try to follow it, then evaluate it.

(At the same time, I don't find it consistent that people say we need to pay teachers more, but we also have to standardize the curriculum; so we're paying more money at the same time we're making teaching easier.)

Crises often present opportunities for great inconsistency. The problems are so large, and, clearly, no one orthodoxy is up to the task of making them better. It might seem incongruous to believe simultaneously in the power of the free market and in the Obama programs; if you believe both those things, you need to be prepared to explain yourself further.

David Letterman is going through such a time himself. He has been merciless in his treatment of "investor" Bernie Madoff, with a barrage of jokes and skits that can only make one believe that Dave has a few million dollars with the faux financier.

At the same time, better living guru Martha Stewart continues to get gentle handling on Dave's show. He has joked repeatedly (and somewhat tediously) about how, first, we can feel safer now that Martha's locked up, and now, how we should be worried that she's back on the streets (chortle, chortle).

But these crimes come out of the same sensibility. It's pat to argue that Bernie victimized people, while Martha's offense was "victimless." But, if you were one of the people who bought stock while a tipped-off Martha was selling, you lost money just as surely as if you wrote Bernie a check. In fact, in some ways, her crime was worse; each of the people who signed up with Bernie had every opportunity to do due diligence, and they didn't. The other parties to the Martha trades had every right to believe that they possessed the requisite information.

I'm not saying that the crimes were exactly equivalent, and the relative prison terms strike me as roughly correct. But it's a much harder argument to claim that one case was unfair and the other fair. For both Martha and Bernie, the attitude of "something for nothing" was present, and it informed their actions. And there were people on the other side of those trades who were hurt. Just because it's easy to see one group and not the other doesn't mean we can't be bothered by both cases, and it doesn't mean that one perpetrator should escape punishment.

Wednesday, March 25, 2009

How we can make globalization work for us

It's no secret to regular readers of this blog that I have concerns about the hidden costs of globalization. I do not, cannot, deny the positive effects of all the policies that fall within this term, but I think the standard economic model, the place where a great deal of the discussion begins and ends, is too limited to capture reality. We can draw all the graphs we want that demonstrate that two countries that engage in free trade both end up better off; the flaws come when we ignore all the constituent groups that make up a nation, and when we accept GDP as a proxy for the health of that nation.

For the average American, the picture of the "benefits" is a lot blurrier; in many cases, they've seen careers (not just "jobs") disappear, and, with them, large sections of whole cities and towns. The offshorers are doing a lot better than the offshored, as they've taken huge percentages of the gains out of the stream. Neither customers nor shareholders seem to have realized the huge benefits that have been promised; saving a buck on a sweater doesn't seem like much next to a loss of livelihood.

We have not done a good job at all of assessing these costs, as the discussions generally devolve into tiresome rants about "America First" vs. economic orthodoxy. In reality, we have no idea of the extent to which American students are shying away from technology because they don't expect to find jobs in those fields. Instead, we urge them to enter those fields anyway because "that's the future."

In economics terms, we focus on expanding the supply on the questionable proposition that an increase will somehow create its own demand; we pay no attention to the reality that demand for Americans has been dropping, and we never touch on the certainty that differing wage scales may have something to do with that. Furthermore, to forestall the possibility that someone may raise such an argument, our leading offshorers and their apologists have decided to adopt the strategy of calling out American workers as stupid and incompetent, and the press has happily gone along with that.

But why should an American youngster, looking at the job market of the future, take on the challenge of a curriculum in science or technology? Your education is substandard, your work habits are terrible, you're more concerned with updating your Facebook page than doing any hard work. And the genetically-endowed, 100 hours a week young people from China and India are going to eat your lunch anyway.

Even if you are good, and hard-working, you run into another obstacle: college tuition rates seem impervious to economic reality. The Indian who attends one of the Indian Institutes of Technology (IIT) pays about $750 a year. You will pay $45,000 for a year at MIT. If the IIT Indian gets a job at $10,000, he can pay his whole tuition in under four months. The MIT grad will need a job at $600,000 to do the same. Obviously, that's unlikely, so the American will tend to start with a huge burden of debt, in fields that have uncertain prospects.
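The back-of-the-envelope comparison above can be checked with a few lines of Python. The tuition and salary figures are the rough 2009 numbers cited in the text, not official data:

```python
# Rough tuition-payback comparison, using the approximate
# 2009 figures cited above (not official data).

def months_to_repay(annual_tuition, years, annual_salary):
    """Months of gross salary needed to cover the full degree cost."""
    total_cost = annual_tuition * years
    return total_cost / (annual_salary / 12)

iit = months_to_repay(annual_tuition=750, years=4, annual_salary=10_000)
mit = months_to_repay(annual_tuition=45_000, years=4, annual_salary=10_000)

print(f"IIT grad at $10,000/yr: {iit:.1f} months")   # ~3.6 months
print(f"MIT grad at $10,000/yr: {mit:.0f} months")   # 216 months, i.e. 18 years
# Salary the MIT grad would need to match the IIT grad's payback period:
print(f"Required MIT salary: ${45_000 * 4 / (iit / 12):,.0f}")  # ~$600,000
```

The ratio is what matters: a 60x difference in tuition against, at worst, a severalfold difference in starting salary.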

But there is an answer, and the only question I have is why it hasn't happened yet. What we need is for IIT to create a degree-granting distance learning program. Offer a B.Tech. over the Internet, and I guarantee Americans will flock to it, especially when it only costs, say, $2000 a year. (It can't be too hard for those brilliant students to set up, not if you listen to Tom Friedman or Bill Gates; it's probably no more than a weekend project for these bright-eyed geniuses.)

I'm not being sarcastic here; I honestly want to see this happen. You see, it's pretty obvious that higher education in the U.S. has turned into a scam. The economics makes no sense, with high tuition rates propped up by every expert who tells us that the key to our future is to get those degrees, and go back for more and more training, and that will make us globally competitive (despite the massive wage disparities). Let's run some numbers.

Visualize an Intro to Econ course in one of our larger universities. We see a giant amphitheater with students filling every one of the 1000 seats. At the front is an ABD (all but dissertation) lecturer who is, for all intents and purposes, a university contractor. Let's say each student pays $1000 per course they take (that's conservative). So we have a million dollars in revenue. The lecturer gets $5000 for teaching the course (that's generous), and let's multiply that by four to capture overhead, foregone tuition for the TA, and other expenses. Thus, the course costs about $20,000 to present. $1,000,000 vs. $20,000: that's pretty good profit for the school. Each student in that class is subsidizing something else (landscaping for the president's residence, the lacrosse team, and so forth).
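As a sanity check, here is the same calculation in a few lines of Python; all the figures are the illustrative assumptions from the paragraph above, not actual university accounting:

```python
# Illustrative economics of a 1000-seat intro lecture,
# using the assumed (not actual) figures from the text.
students = 1000
tuition_per_course = 1_000      # conservative per-course price
lecturer_fee = 5_000            # generous pay for the course
overhead_multiplier = 4         # overhead, TA tuition waivers, other expenses

revenue = students * tuition_per_course        # $1,000,000
cost = lecturer_fee * overhead_multiplier      # $20,000
margin = (revenue - cost) / revenue

print(f"Revenue: ${revenue:,}  Cost: ${cost:,}  Margin: {margin:.0%}")
# → Revenue: $1,000,000  Cost: $20,000  Margin: 98%
```

Even if every assumed figure is off by a factor of two in the school's disfavor, the course still clears better than 90%.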

The apologists for our university system will argue that I'm being simplistic here, that the value of a degree from the University of Podunk is far greater than simple dollars and cents. The experience, the exposure to alternative points of view, the friends that will last a lifetime.

If that's true, then existing on-line courses should be cheaper than on-campus instruction. One example: the University of Illinois Graduate School of Library and Information Science charges in-state students $5363 per semester. Their on-line students pay $1952 for 4 graduate hours. A typical load is 12 hours per semester. So on-line students actually pay about $500 more (maybe the U of I servers eat a lot).
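The comparison works out like this (the dollar figures are the 2009 rates quoted above):

```python
# On-campus vs. on-line tuition at U of I GSLIS,
# using the 2009 rates quoted in the text.
on_campus_per_semester = 5363      # in-state, per semester
online_per_4_hours = 1952          # on-line rate, per 4 graduate hours
semester_load_hours = 12           # typical load

online_per_semester = online_per_4_hours * (semester_load_hours / 4)
difference = online_per_semester - on_campus_per_semester

print(f"On-line: ${online_per_semester:,.0f}  On-campus: ${on_campus_per_semester:,}")
print(f"On-line students pay ${difference:,.0f} more per semester")  # ~$493 more
```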

I have to admit I'm at a loss as to why IIT hasn't set something up in the U.S. This is a huge arbitrage opportunity; they could charge three (four, five) times their normal tuition, American students would save 90-95%, and American companies would get more of these highly-coveted graduates, young people who, because they would be free of crippling student loan debt, could work for less than they possibly can now. Seems like a win-win to me.

Oh, there would be one big loser in this: the current American higher education system. One wonders how supportive our economics departments would be of free trade when their doors close because, say, Cambridge is offering degrees over the Internet ("Get your degree at the place that taught Keynes everything he knew"). They'd fight back, of course; we'd see and hear learned PhDs arguing why Internet learning is inferior, how it leads to a lack of quality, how we can't equate the on-campus experience to that on a computer screen, how we need to have an American presence in the field. You know, all the arguments they blithely ignore when the same objections are raised about people's jobs.

One has to expect that the resistance would be keen. The supposedly-independent accreditation bodies might refuse to approve an IIT engineering curriculum. If they did that, they would be the ones called into question. After all, award-winning journalist Tom Friedman has claimed that IIT is "more selective than Harvard," so no one could argue that their program isn't up to snuff.

I'm not one to look for conspiracies everywhere. I can't seriously argue that there's some kind of collusion going on between foreign schools and our schools that prevents on-line learning from taking place. But you have to wonder when you see what would be a huge opportunity for, say, IIT to pull in some serious bucks and extend their brand, and they don't take advantage of it.

If this ever does happen, the entire field of American higher education will be rocked to the core. A system that depends on massive cross-subsidization will collapse under competitive pressures. To see a model for this, we need only look at American manufacturing, at our auto companies.

But Americans should, just once, have the chance to take advantage of offshoring in the same way that CEOs have. If that means a few free-market economists lose their jobs, well, hey, that's just the inexorable workings of the great market, and none of them, surely, could object to that.

Tuesday, March 24, 2009

Time to get out the map

Citizen Carrie shows us today, among other things, why the Internet is crushing traditional news media. In Stealth White House Meeting with Indian IT Delegation, she reports on a meeting of White House economic officials with a group of business leaders from India. Read it, because it's about all you're going to see from a U.S.-sourced writer about this meeting, and Carrie was able to do that by referencing Indian sources.

But, on to content. The major thrust of the meetings was America's H-1B program, and how desirous the Indian leaders were to have it continue and be expanded. One of the big players in the meetings was Larry Summers, director of Obama's National Economic Council and one of the administration's leading spokespersons for all things economic. Apparently, according to the Indian press, the Indian businessmen came away with confidence that our government, including Obama, was going to continue their support of the program.

India has the natural concern that waning American support for visas that allow Indians to come here, and for offshoring generally, will hurt its hanging-by-a-thread economy. They see the H-1B program as a cornerstone of free trade; since there are not a lot of goods traveling back and forth between the two countries, it's necessary to keep people, and work in the form of services, crossing the borders. (Of course, we never hear in any of these articles about the restrictions on Americans taking jobs in India; I guess it would be pretty inconvenient to mention that.)

Carrie cites a Business Week article that fails to mention the White House meeting, but does take up the party line espoused by Infosys co-chairman Nandan Nilekani, that we need to be concerned that any restrictions on anything will lead to a situation where "trade between the US and India will be neither free nor fair. That's something people on both sides of the globe need to be concerned about." No mention of possible trade-offs here, but I'll let Carrie have the last word on this point:
Why don't you write a full series of posts along these lines so you can educate Americans on how the loss of good-paying jobs for U.S. citizens is vital for continued good political and economic relations between the U.S. and India? I'm sure if we are fully able to understand the benefits of middle-class workers moving into cardboard boxes, while our "healthy" GDP creates good-paying jobs for Harvard Business School grads, lobbyists, and a select few who are able to latch onto the coattails of the business school grads and the lobbyists, we'd be less likely to call for those uncouth protectionist measures.
One other thing that interested me about the Indian visit was a report of another meeting the delegation had, with "thought leaders" Henry Kissinger and other luminaries (objective sorts like a CEO of a multinational and the former US ambassador to India). What's Kissinger doing here?

It's always possible that Henry has massive contracts with India and so wants to keep the U.S. financial pump flowing, but I had another thought, and, for that, we'll have to refer to the world map (you can look at your atlas if you don't have a map on the wall of your office).

Let's take the 30th parallel north, and start in Saudi Arabia and move east. We start with our "friends" the Saudis, then travel to Iraq, Afghanistan, Pakistan, India, and China. I'm not going to detail our manifold problems in those nations, other than to suggest that an amazing number of our current difficulties have something to do with those six.

And of those six, not one can be counted upon to be a reliable friend to American interests, except, maybe...India? And, if we have to pay a high price for that support, it's very possible that the calculation is that it's worth it. If we have to give a remarkably generous nuclear package to India, we'll do it. If we have to let them have our call-center jobs, we'll do it. If we have to educate their students, then allow them to stay in this country (at least for six years), we'll do it.

But we can't appear to be appeasing India - we're the strongest country in the world - so we wrap the discussion in "free trade," or "jobs Americans can't do," or "helping the downtrodden masses" (the last argument allowing CEOs to think of themselves as Mother Teresas with private jets).

It seems we've completely fallen away from the concept of evaluating policies on the basis of whether they make overall sense. It's not very far from that to the point where we treat any suggestion that India is anything other than our good friend (as opposed to a nation with interests of its own) as somehow unpatriotic. We've done that with other countries, and it hasn't led to positive outcomes; the U.S. ends up seen not as an honest broker for good, but as a country in the pocket of another. I don't think that makes a lot of sense.

Monday, March 23, 2009

A dollar a minute?

I don't write often about being a runner. It's something I do every day, I've done it for many years (moving on toward 32), and I love it, but I have no ability to make it interesting for a reader. Any stories I have would probably be pretty uninteresting to the non-runner; I suppose I could start a separate running blog, but, otherwise, I will likely continue to write very little about it here.

However, the newest issue of Track & Field News is out, and it informs us that the basic price to enter this fall's New York City Marathon is $171, a $16 (>10%) increase over last year. I get that this is one of those events that's pretty much recession-proof, at least that's the theory, but that sure seems like a lot of money to me.

When I started running, the sport was just emerging from its origins. The usual non-scholastic race, just before my time, went like this: a few people would get together in a park, one guy would yell "Go," start a stopwatch, and toss it into the grass, and the runners would race around a somewhat-measured course. The first finisher (hard to call him a winner, as there would rarely be a prize) would come across the "line," pick up the stopwatch, and record everyone else's times. I never actually ran a race like this, but there were still a few around.

The earliest races I ran were analogous to those of today. No timing chips, of course, but runners would line up, get their numbers and T-shirts, and off we'd go. There were very few 5Ks in those days; the base distance was 10K, but there were aid stations and finish lines and big clocks.

But the pricing was certainly different. The rule of thumb was to pay about $1 or so per mile. A 10K would cost $6-8, a marathon $25-30. As the running boom (erroneously credited to Frank Shorter's 1972 Olympic marathon win) progressed, prices went up to $1 per K: $10 for a 10K, $40 for a marathon.

Now it seems the standard charge for a 5K, at least in my area, is $25. Marathons are routinely over $100, with NYC going up to $171 ($6.52 per mile). Many people are paying more than a dollar per minute of running. Of course, price levels are higher than they were in 1977, but according to the BLS Inflation Calculator, the factor we should use is 3.50. So it would appear that the per-mile cost of a distance race has outpaced the rate of inflation. Not sure I like that. (Of course, shoe prices have gone up even more, but that's a topic for another day...)
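A quick calculation bears this out; the race prices and the 3.50 inflation factor are the ones cited above:

```python
# Has per-mile race pricing outpaced inflation since 1977?
# Prices and the BLS inflation factor are those cited in the text.
inflation_factor = 3.50            # BLS calculator, 1977 -> 2009

price_1977_per_mile = 1.00         # the old "$1 a mile" rule of thumb
expected_today = price_1977_per_mile * inflation_factor   # $3.50/mile

nyc_marathon_fee = 171
marathon_miles = 26.2
nyc_per_mile = nyc_marathon_fee / marathon_miles          # ~$6.53/mile

print(f"Inflation-adjusted 1977 price: ${expected_today:.2f}/mile")
print(f"NYC Marathon 2009:            ${nyc_per_mile:.2f}/mile")
print(f"Real price multiple: {nyc_per_mile / expected_today:.1f}x")  # ~1.9x
```

In real terms, the per-mile price has nearly doubled.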

Sunday, March 22, 2009

The rosiness of memory - Oscar version

This is about as untimely as can be, but I tend to write what rolls through my head, so here it is. The ads for the new Wolverine movie starring Hugh Jackman had me ruminating on the reviews of his stint as Oscar telecast host. In general, they seemed to be negative, with the most positive of them liking his personal charm but finding him unfunny; the worst branded his performance as a disaster.

I wonder what magical Oscar telecast people are remembering. I've been around long enough to remember some of the Bob Hope and Johnny Carson-hosted Oscars, and you know what? They were never drop-dead funny, never the transcendent event that people seem to think they watched when they were kids. The basic formula hasn't changed in years, and you can bring Billy Crystal out on a burro, or drop Whoopi Goldberg in from the ceiling, but the show's the same: an opening comedy bit (except for those years they started with a musical number - curse you, Rob Lowe!), award presentations preceded by unfunny banter, Best Songs (no matter how you present them, they are still usually pretty bad songs), dead people montage, and so forth.

We want to believe that the Oscars are significant, so they have to be the greatest entertainment extravaganza of all time. But it's an awards show, and it honors something that cannot possibly be shown (now that would be an interesting broadcast: show every nominee in full). The Grammys finally figured out that they were different, that they could be a music show interrupted occasionally by awards. They hand out a fraction of the total statuettes on air, but, because the statuettes are there, they can command amazing talent. (Then they do the odd slice-and-dice thing, where we stand to honor the once-in-a-lifetime collaboration of Yo-Yo Ma, T.I., Garth Brooks, and Jimmy Sturr, whose record was nominated in the Best Rap, Classical, Country, and Polka Song category.)

The Oscars can't do that (hey, look, it's Meryl Streep and Kate Winslet doing the famous taxicab scene from On the Waterfront - the Tonys used to try stuff like this until someone realized it's totally unwatchable), so it is all that it can be. It's an awards show, it cannot transcend that, and Hugh Jackman can pirouette around to his heart's content and it won't transform the Oscars.

So cut the telecast a break, think seriously how you might improve it (and most of the lists that come out every year would make it a lot worse), but understand the limitations come from the inherent nature of the enterprise.

Saturday, March 21, 2009

Good words, bad words

John McIntyre at You Don't Say offers a valuable post on 200 words you can well do without. The post talks about a list of "200 words that public bodies should not use if they want to communicate effectively with local people" produced by the British Local Government Association. McIntyre's comment:

Two points of interest here: dishonesty and pretense.

Jargon and euphemism can simply serve the purpose of gulling the unwary. In such cases, the writer knows full well that he or she is being duplicitous. Imagine the vendor of a product who assures you of your good fortune at being able to buy a smaller quantity for a higher price.

But I think that pretense is the more common cause of the kind of bloating that the LGA’s little list represents. That is what makes it difficult to tell whether the language contains an obscure meaning or conceals a lack of meaning. Jargon, of course, signifies that one is a member of the club. For some years past, people who pretended to scholarly study of literature had to master the opacities of deconstructionism, which the uncharitable insisted was an assertion that texts have no meaning, demonstrated by example. To be accepted as a proper member of the bureaucracy, one must write like a bureaucrat, and anyone who marinates in that stuff long enough winds up unable to distinguish any other flavor.

There's nothing here with which I can disagree, certainly, especially when I actually take a look at the 200 words. The Association has offered these terms with alternatives that are better for clear communication. No one who has worked in a company or dealt with government will fail to recognize most of these, words such as:
Actioned (alternative: do)
Autonomous (independent)
Commissioning (buy)
Engagement (working with people)
Exemplar (example)
Promulgate (spread)
and on and on...
One could take issue with a few of these; "outsourced" isn't quite the same thing as "privatised," and there are times the distinction is important. More importantly, as McIntyre points out, a list like this is unlikely to have much effect. Quite appropriately, he urges us to use the list not as a weapon, but as a personal reminder to avoid falling into bad language habits.


Much as I agree with the excision of these words of pretense, I would argue that we sometimes need the so-called five dollar words. English is a marvelous language, with influences from around the world, and has so many words that we can often find a word that is exactly right. And sometimes that word is long or unfamiliar. Being either doesn't necessarily imply dishonesty or pretension (not that I believe McIntyre is implying that, but it's an attitude I've run into a few times).

For an example of what I mean, see this post by programmer Jeff Atwood. Atwood exposes us to the words "idempotent," "orthogonality," and "immutability," and explains why they are vital to the software developer.

Perhaps these words, and others mentioned by commenters on the post, are jargon, but they're useful jargon, limited only by the reality that too few developers are aware of their meaning. But understanding them can save a lot of time, as each expresses in one word a concept of depth and complexity.
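To see why the vocabulary earns its keep, take "idempotent." Here is a minimal sketch; the example and function names are mine, purely illustrative, not from Atwood's post:

```python
# A tiny illustration of idempotence: applying the operation twice
# gives the same result as applying it once. (Example is mine,
# not from Atwood's post.)

def normalize_whitespace(s: str) -> str:
    """Collapse runs of whitespace to single spaces -- idempotent."""
    return " ".join(s.split())

def append_exclamation(s: str) -> str:
    """Tack on a '!' -- NOT idempotent."""
    return s + "!"

text = "too   many    spaces"
once = normalize_whitespace(text)
twice = normalize_whitespace(once)
assert once == twice    # idempotent: safe to run again, e.g. on a retry

# Not idempotent: each application changes the result.
assert append_exclamation(append_exclamation("hi")) != append_exclamation("hi")
```

One word, and a developer knows immediately whether an operation is safe to retry; that's the time-saving depth the post is talking about.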

The danger, of course, is that they will creep imperfectly into the vocabularies of the obfuscators, and we'll have politicians and business leaders talking about "immutable orthogonal idempotence," and no one should want that.


It almost seems pointless to criticize the "entertainment news" shows; they're ridiculous parodies of real news shows (which are massively pointless as it is), and it's a measure of how we've debased public discussion that Brangelina and the octomom are given the massive attention that they are. We should take no pride in the success of Entertainment Tonight and Extra and so forth. They probably don't represent the death of our civilization, but, as the saying goes, if you're not part of the solution, you're part of the problem.

Occasionally, though, it gets bad enough that I find it really irksome. As I'm watching rather too much basketball this weekend, I have been treated to a barrage of ads for Monday's ET. The big story: the world says a tear-filled goodbye to Natasha Richardson.

I'm willing to bet that ET hasn't devoted more than a few minutes to the entire life of Natasha Richardson. She was, by all accounts, a great stage actress, but her film career was spotty; she may have received some attention for being a part of the great Redgrave acting family, and I'd imagine there was a flurry of stories when she left one marriage to be with Liam Neeson (maybe not, I don't really remember, maybe Neeson himself was not a big enough star at that time).

She was just not a "star" in the ET/Extra sense, but now it's blanket coverage - why? Because she died in an "interesting" way, and there is no end to the medical experts who want to weigh in with their opinions on a case they know nothing about. So these entertainment journalists can whip up some frenzy, accompanied by pictures of famous people dressed in black, and they've got hours of stories at hand.

It's disgusting, and the people who produce these shows should be ashamed, and the people who are tuning in to hear every detail of the death of someone they never valued in life should turn off their sets and think seriously about how they spend their time.

Friday, March 20, 2009

A new blog

I don't mean this to become the "Commenting on Joe Posnanski" blog (though I might get more readers if I did), but I'm going to mention him yet again. Posnanski is the excellent sportswriter who writes for the Kansas City Star, Sports Illustrated, and has a very entertaining blog that is mostly about sports. (He also has a new book coming out in September, but I'll let him push that on his blog.)

He started a new blog a couple of days ago, and I'll let him describe what it's about:
So, here's the concept of this blog: Newspapers, we all know, are drowning. There are reasons for this, numerous reasons, some complicated, some very simple, some interesting, some boring as sin. The times are changing. The economy is in the tank. Readership is breaking into niche audiences. The business model is broken. Print is dying. Newspapers lost touch with readers. Readers lost touch with newspapers. On and on and on and on and on.

People also feel very differently about what's happening to papers. Some say good riddance. Some worry for the future of Democracy. Some believe newspapers wrote their own death notice with their greed and stubborn reluctance to adjust to a changing world. Some believe this future was inevitable, that the overpowering newspaper business that once existed could not survive, it had to be picked apart piece by piece like the fish in "Old Man and the Sea." Some believe that a technological breakthrough and an economic turn can give newspapers (or whatever they will be called by then) a new life -- there are more readers now than ever before. Some think that if you hooked up a heart monitor to the newspaper business right now, you'd get a straight line.
The blog is called The Future of Newspapers, and, even though Posnanski has not mentioned my post that, coincidentally, came out the same day (The future of journalism), I thought I'd steer any of you who have an interest in this topic to it. What it will become, I don't know, but I always find it intriguing to get in on the ground floor of a blog that discusses topics in which I have an interest.

Thursday, March 19, 2009

He's so funny

Rush Limbaugh is truly one of the great comedians of our times. He professes to be the voice of the Republican party even as he points the way to its demise.

He hopes that Obama will fail. Even though he is theoretically in touch with "the people," he misses the reality that Obama is perceived as trying to fix the country's woes. Hoping that he will fail is the same, in most people's eyes, as hoping the country will fail. Of course, he says now that's not what he meant, that he simply didn't want Obama's techniques of fixing the economy to succeed. Sure not what it sounded like.

Now Limbaugh says, "I am all for the AIG bonuses." Even the people who believe that they need to be paid because of contracts, or who think these all-so-talented folks need to be retained to undo the harm they've created, would not be "all for" the payments.

Perhaps Limbaugh has simply talked so much that he can no longer hear, or, just maybe, he is a brilliant conceptual comedian. What he is not is someone that anyone should listen to for insight on the country's condition.

An ex ante vs. ex post post

The AIG bonus controversy is interesting, though not so much for the predictable back and forth between the few who are lining up in favor of the payments (see Ruth Marcus of the Washington Post for the basic argument, one shared by the editorial staff of the Chicago Tribune) and those who are outraged that a company that got itself into so much trouble that the federal government had to buy 80% of it would pay massive sums to the very people who ostensibly caused that trouble.

What it does show is a dichotomy in the way two groups of people look at the world. The common concept of a bonus is that it is something paid ex post, that is, it's earned for work that has been done. Someone has a good year, produces a lot for their company, and they're rewarded with some extra compensation. The baseball player has a good first half, is named to the All Star team, and there's something extra in the paycheck. We all get that idea.

But there is another group of people who hand out (and collect) money for what they hope will happen, ex ante payments. To them, a bonus is not a reward for a job well done, but a sweetener to induce the recipient to do the good job. Under this theory, the financial analyst who would go home early if he's only making $150K will keep his nose to the grindstone for $300K. (Note that I'm not ignoring the argument that the analyst has to be paid or he'll go elsewhere, but, if that's all we were trying to accomplish, we'd just pay him $300K in the first place, not structure it as a "bonus.")

Most CEOs tend to favor the latter interpretation, even for themselves. This creates some odd situations, as when huge "incentive" payments are given to people who are leaving the company (11 of the AIG bonus millionaires have already left the company; for another example, see Bob Nardelli and Home Depot).

Philosophically, the ex post interpretation is consistent, and people sense that consistency: bonuses are a reward for exceptional performance and nothing else.

Ex ante, however, requires the believer to jump through mental hoops. You see, there are no companies in which every employee gets pre-agreed bonuses. A CEO puts one group of people in one category, the rest in another. And they don't even see the inconsistency of handing out speculative bonuses to one set of employees, while insisting that bonuses to the other, larger, group result from a lengthy (and imperfect) evaluation process.

Of course, the discerning reader is already thinking it: the CEO believes that the first set is valuable, integral to the success of the enterprise, while the second set is drawn from a vast fungible supply of workers who can easily be replaced. That's not what they say, but it's certainly how they feel. So, perhaps, this post is belaboring the obvious: that "Our power is our people" is just a meaningless group of words that these captains of industry mouth whenever they want to seem like men and women of the people.

But there is still value in pointing out yet another way in which American companies do not exist to employ Americans or compensate Americans, and our insistence on acting in this way is a profound problem. We save one group of auto companies with operations and employees in this country in preference to another group of auto companies with operations and employees in this country, based solely on where their ultimate corporate headquarters is located, then we're surprised when a large part of the strategy involves the firing of American workers.

And this is why any attempt to save American jobs by funneling the money to corporations is problematic. Any such result is collateral, no matter what the corporate leaders might say. We have no proof that the money will be used to beef up domestic payrolls. Unfortunately, short of starting up another CCC, I can't imagine a better solution, so we'll probably be stuck using this indirect method of trying to bring down the unemployment rate; we simply shouldn't assume that the most rosy predictions will come true.

Wednesday, March 18, 2009

The new economic boom

Andrew Sullivan prints a letter from a reader who has just been laid off. He's not a financial executive or a realtor, he's in the new "boom" industry of infrastructure improvement:
Some construction companies, including the biggest names in the D.C. area, have reduced their workforces by over 90% just to stay afloat. Others have gone bankrupt.

With the collapse in residential construction, and with commercial construction struggling badly, contractors have directed their focus on government projects. For instance, my company bid a small ($600,000) project to demolish an existing government building, haul the material offsite, and restore the work area with new topsoil and grass -- a two, maybe three month-long project. How many contractors bid for this project? THIRTY-SIX! THIRTY-FRIGGING-SIX! Three years ago there probably would have been no more than six or seven bidders, because everybody was so busy. My company cut our bid to the absolute bone, then cut some more. I think we came in 7th or 8th place.

Every bid for government or public authority work we've submitted as a general contractor over the past year has been at cost, meaning break-even. Right now, as my boss told me yesterday morning, we can't even buy jobs (i.e. bidding contracts) at a loss just to keep revenue coming in and the field guys busy.
Competition is a wonderful thing, it really is, and we can see why from this story. With 36 bidders for every project, all these new initiatives that will be generated from the stimulus money will be done at the absolute lowest cost. Our public funds will get the biggest possible bang for the buck.

But there are potential downsides as well, and they may undermine the supposed advantages. The first is quality. I would love to think that all these firms that are winning these bids are doing so while upholding quality, but experience tells me otherwise. Either they're cutting corners somewhere, or they're lowballing the bid to win the contract, and we'll end up paying the overage.

The second is the actions that will be taken to keep costs low, each of which will seem justifiable when we're "spending the taxpayer's money." In this environment, offshoring begins to look downright patriotic, even though it undercuts one of the main reasons we want to embark on this stimulus - that of helping stimulate consumer spending through renewed employment.

We've already seen GM invest a billion dollars, money that was backed up by us, in Brazil. We can bet that the bulk of the modernization of health care systems will be done in other countries.

Thus, we need to temper our expectations for this stimulus. I don't know how it will end up shaking out, but it could well lead, not to a V- or U-shaped output pattern, but to an L, in which we end up stagnant for a long time (while we rebuild the rest of the world). I know that any measures that try to counter this will be labeled as that filthy word "protectionism," but I also know that the last thing we need is an ineffective outlay of public funds.

The future of journalism

Several trends have come together lately, and I think I can make a prediction about the future of journalism. Recently we've seen the folding of the Rocky Mountain News and the complete web-ification of the Seattle Post-Intelligencer; every industry expert is predicting the demise of many more newspapers; and large media companies have filed or will file for bankruptcy.

Journalism is one of those fields, like education, that is forever catching up with business trends and presenting them as new-found knowledge. As newspapers and TV stations are folded into large "media companies," they will be treated like any other subsidiaries, with the same ethos and behavioral structure.

In particular, branding will be seen as the way to add value. For those not up on cutting-edge marketing lingo, branding is the practice of creating value in a product that is not intrinsically in the product. The classic example is the soft drink industry, which sells flavored sugar water at far higher prices than would be warranted by the cost of ingredients and distribution. The high margins are justified by "brand value," the feeling of homespun comfort you get when you drink Coca-Cola, or the youthful outlook of Pepsi-Cola. A lot of this is pretentious claptrap, but it works well enough that "branding" has become the unifying concept of the consumer goods market.

Local television news has known this for some time, so we have the female anchor who acts like nothing so much as a party hostess, ushering us into her living room to hear 10-second snatches of the news of the day. Chicago's CBS affiliate tried an experiment several years ago, of having one of our more respected journalists, Carol Marin, host a serious news program that might cover a story in depth, making room by giving only 30 seconds to the weather...and it was a total failure (at least in terms of the only thing that counts, the ratings). The brand message of local news is, "Come join us for some family fun; we'll slip in some serious news :-(, but we'll have the zany weather guy, and the zany sports guy, and generally enjoy a good time," and the ratings measure how successfully any "news" organization fulfills this strategy.

In print, we've heard of layoffs of reporters who cover uninteresting things like business, we've heard of publishers talking about "news productivity," and we've seen a greater emphasis on soft topics - even important news is increasingly covered with a kind of breezy irreverence.

So what I think is going to happen is that reporters are going to have to become brands, and they will be charged with providing an endless stream of content that can be "repurposed" by "content managers" (no editors here) to the various media platforms that are part of the modern news organization. A columnist will write pretty much nonstop, and some portion of that writing will go to the print newspaper (if there is one), some will go to the blog, some will go to the Twitter feed.

Someone like Eric Zorn of the Chicago Tribune will churn out endless words. I pick Zorn because he was one of the first mainstream journalists to get a blog, and he is now up on Twitter. His print column, which used to be pretty much it for him, is now a tiny proportion of his weekly output. I don't know what kind of pressure is on him to be typing all the time, but it can only intensify as the Zorn brand becomes increasingly important to the health of the Tribune company.

And that brings us to Q Score, the measurement of a public figure or product's penetration and popularity. Q Scores will be used to determine which "content provider" is reaching the public, and it will become a vital part of the decision to retain certain reporters or columnists. Each company will have to decide which part of the score they wish to emphasize, but, since attention is what every company craves, it's likely that penetration will be the dominant factor. We'll see more "controversial" writers in place of solid thinking and writing (hence, new media star Karl Rove).

It's not entirely impossible that this strategy will be successful enough that there will be money sufficient to support some of the traditional news-gathering tasks, such as investigative reporting or foreign bureaus. I'm not real hopeful, because part of this strategy is to pick the low-hanging fruit, and any activity that doesn't generate profits on its own is going to be discarded.

That's the future: the successful journalist will be the guy or gal who can pump out vast numbers of words (thinking optional) and can be appealing enough to gain a high Q Score, to be branded. It will not be cost-effective to have a reporter spend four months on a story that only generates a few thousand words, so depth will decline. Competition for "branded" reporters will be high, so the stars will make more money. On the other hand, the probability of a young reporter catching on will be far lower. If print survives at all, it will only do so because of failings in the Internet advertising model. Expect more multimedia figures, as the columnist will do a two-hour radio show and write longer-form pieces for the magazine section; this will lead to a breakdown in specialties, as we want to hear what the "star" has to say about the banking crisis. Therefore, we'll see less expertise as the brands are spread across the news universe (in other words, there will be no sportswriters or political writers, we'll just have the usual subjects commenting on whatever's hot).

It's difficult to see this as positive for those of us who actually value the gathering of news, but it could save the business ("Hey, let's see what Angelina Jolie is writing about today!").

Also: I didn't mean to imply that Q Scores would be the only metric; obviously Internet page clicks will be an important input as well. That these numbers may have little to do with actual quality will be of little consequence.

Tuesday, March 17, 2009

Two worlds

Every so often, I realize that I don't necessarily see the world the way other people do, and that that is not always to my credit. Many of my blog posts assume the opposite, that I challenge conventional wisdom in a positive way (if only everyone would listen to me...).

Being out of step, though, cuts both ways. It can blind me to the natural perceptions of others. One such realization came while reading a blog post by Joe Posnanski (I've written previously of Posnanski, most recently this past Saturday; he is a perceptive writer, and a fellow modern enough to use computers to analyze baseball). He was writing humorously about a trip with his family, and then came this:
We are the Jetsons. Map navigation systems in cars and telephones the size of baseball cards that play movies and cars that tell you when someone is calling and allow you to talk to them as if they’re sitting in the back seat. It isn’t just that none of these things existed a few years ago. None of them even seemed remotely possible. None of them even seemed like an invention you could dream about. I know I’ve written here before, but when I was 12 years old, I was utterly convinced that everything that could be invented had been invented. I guess everyone is like that, but the logic was so clear to me. Someone invented a box that sound came out of — that was radio. Someone then invented a box that had sound and pictures, and that was television. Someone made the pictures on the box colorful, that was color TV. Thus ended the great era of invention. There were no worlds left to conquer. It had all been done except, maybe, a flying car.

Now, these things — the Internet, bluetooth, handheld devices, the Amazon Kindle*, 24-hour banking, high def television, Tivo, the Slap Chop, iTunes, DVDs, the Wii — these are all part of our daily lives. They are so intertwined in our daily lives that in many ways they lose their wonder, and it becomes hard to remember how we lived without them.
And I realized something: that is exactly how most people apprehend the world. They believe that we are incredibly advanced; then something new comes out and they goggle at it anew.

That is 180° away from how I look at the world. Maybe it's because I've worked in technology most of my adult life, maybe it's because I read a lot of science fiction and science fact as a kid, but most new things that come out are pretty much what I expected. New concepts in science impress me (I'm still not sure how Einstein did what he did), but the engineering that grows from that science seems fairly natural.

People have been writing about Internet-type networks for decades. Geosynchronous satellites were first proposed in the 1920s, and Arthur C. Clarke wrote about them in the '40s, so GPS systems are not so much, "How do they do that?" as "What took them so long?" (Clarke also wrote, famously, "Any sufficiently advanced technology is indistinguishable from magic," which probably is a far better summary of what I mean than this blog post.)

I, and I would guess others who work in technological fields, despair at the lack of progress we've made, the low level of engineering that's actually been done. I look at Twitter or Facebook, which are what I would call "dummy apps" (not for their impact, and social innovation is important, but for their low level of technical complexity), and am impressed only with certain data storage implementation details; as computer applications they are remarkably simple.

Yet others look at Twitter or Facebook as incredible pieces of technology, as advanced science making its way into the workaday world. This is something that technically-oriented people (well, me) need to remember as they design things and talk about things.

Pre-publication update: I wrote the foregoing, then put it aside to ruminate a bit. While wandering about, I came across a post from today by Hank Williams (welcome back to the blogging world, Hank). In one of those examples of synchronicity, he wrote about these same things, and probably did so better than I:
I have often lamented the short term focus of those of us who create products and services based on bits and bytes. The last decade has, in many respects, been depressing to me. The internet did a great thing in that it made all kinds of services accessible to more people. But it also redefined what "technology" means. Today, a web page is considered tech. And so, Alltop, or Digg or Blahgirls, are considered technology. This kind of stuff, which may have merit, muddies the waters, when it is stirred in the same pot with tools that require serious technical depth to create. The other thing that has happened is that an enormous amount of our focus and mindshare has moved to quasi-entertainment focused tools such as Twitter.
He laments our focus on the trivial, when technologists have the power to dig deep and do significant things:
And so as information technologists, what can we do about it? Well obviously we don't make food. But we may create a tool that helps farmers increase yield, or perhaps distribute more effectively. We can't build a new power grid, but perhaps we can develop a new modeling tool that helps develop insights into the most efficient way to organize such a grid. We will not, for the most part, be teachers, but perhaps we can develop software that helps increase the efficiency of learning.

The point is that we don't tend to solve the most important problems directly. But we can make it easier for those that are solving the problems to do so. We create tools that increase efficiency, and that indeed may make new more effective approaches possible.

And given that this is what we can do, this is what we must do.
He's so right, but there's no real roadmap as to how we might get there. Any accounting or inventory system has far more complexity than toys like Facebook, but venture capitalists aren't lining up to fund the former, much less power grid optimization software. We confuse the time something consumes with its significance, but the millions of hours a day people spend on Facebook is not creating much of anything (yes, there are the isolated examples of the Facebook group that raises money for something or other, but I'm talking net cost-benefit).

But this circles back to my original point: as long as people are amazed at the bells and whistles, it will be hard to get them to focus on the ways technology might be used to take on bigger challenges. And it will be hard to make a living in doing so until someone is convinced of that.

Monday, March 16, 2009


I've written before that one of our assumptions, that democracy goes hand in hand with free market capitalism, is flawed. They are two systems which really don't have all that much in common, given that one features "one person, one vote," while the other has as its centerpiece "one person, 40 billion votes" (if that person is Bill Gates). The tensions between the two systems become grossly apparent at various times in our history, and I believe one of those times is happening now.

However, there is one critical factor common to democracy and capitalism, and that factor is choice. Each of them gives a sense of choice that is not present in other systems; we can choose any of a number of candidates, and we can choose any of a number of toothpaste varieties. Of course, these are still constrained choice sets, as we see every presidential campaign when the voters of numerous states are presented with no choice at all, the decision already having been made. Even on the consumer side, choices are certainly not infinite. In general, though, we are usually satisfied that we are being given freer rein in our selections than, say, people had in the old Soviet Union.

Naturally, choice requires knowledge, and, while we've made great strides toward making that kind of knowledge more available, it is still not clear that we want a gigantic number of choices for every single thing. Most people prefer Social Security to self-directed investments because there are too many options and the information is not 100% reliable (note that this is an inherent quality of anything that requires a forecast; information can never be complete, and has a cost that can exceed its marginal value).

Choice is also confounded by deliberate attempts to obfuscate and blur. We know this to be true in both the political sphere and the economic one. Misleading campaign ads, puffy advertising - these techniques have converged in modern marketing, which has become, essentially, the science of influencing people's choices, even when the resulting selection contradicts a person's ultimate best interest.

What seems hard for people in these turbulent times is taking a consistent stand on one side or the other of the issue of choice. And it's not easy, especially as we each would like to have our own comfort level of choice (a lawyer who deals with health care issues might feel perfectly comfortable making a decision about a medical plan, and would find a national plan restrictive and unsuitable; the factory worker or software developer who is not immersed daily in these details would welcome the simplicity).

But that doesn't mean that we can't have individual clarity on these matters. If you believe that every home buyer made a free choice, that is, he or she was not defrauded, and you believe that choices have deserved consequences, then it's hard for you to support any kind of restructuring or bailout for homeowners.

And that attitude requires you to be consistent. As we see various companies using their federal money to pay bonuses to their employees (kudos for a job well done??), we hear an argument that is commonly made. Many of these employees are not the high-rolling decision makers who brought the world financial system down, they're clerks and secretaries and mailroom employees who happen to have large amounts of their compensation paid as a year-end bonus. Surely they don't deserve to be penalized for their innocent involvement in the scandals of overleverage.

But that argument ignores choice, the very real choice that each of these employees made when they decided to accept the job. Think of the secretary who is deciding between two jobs, one that pays $30K, the other $20K with a typical $20K bonus. We know that the hiring manager will tout that bonus as money in the bank, as something that has never failed to be paid. Choice, however, requires us to see through that argument to the very real possibility that conditions may not permit that bonus to be paid.
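
To make the arithmetic of that choice concrete, here is a hypothetical back-of-the-envelope sketch using the numbers in the example above, treating the bonus as something paid only with some probability (the probabilities are made up for illustration):

```python
# The two hypothetical job offers from the example above.
flat_salary = 30_000          # job A: straight salary
base, bonus = 20_000, 20_000  # job B: base plus a "typical" year-end bonus

def expected_pay(p_bonus: float) -> float:
    """Expected total pay for job B if the bonus is paid with probability p_bonus."""
    return base + bonus * p_bonus

for p in (1.0, 0.75, 0.5, 0.25):
    better = "B" if expected_pay(p) > flat_salary else "A (or a tie)"
    print(f"p={p:.2f}: job B expects ${expected_pay(p):,.0f}; better bet: {better}")
```

The bonus job only beats the flat offer when the odds of the bonus actually being paid stay above 50% — which is exactly the contingency the hiring manager's pitch glosses over.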

So I don't see how we, in effect, bail out that employee by looking the other way on his bonus simply because he doesn't make as big a base salary as someone who made the other choice. It still comes down to a matter of the choices people have made, and where an individual draws the line on how much we protect people from those choices.

Because then we get into a big argument as to which choices are protectable and which aren't, and I'm not real comfortable that we can reach an equitable conclusion. If there are larger considerations (we have to bail out the banks themselves because they are the blood of our global business system), then, fine, let's get that out on the table. If community stability is an argument for helping overextended homeowners, again, let's be upfront about that.

However, cherry-picking certain groups and their choices as being things we should shore up, and others we should not, runs a risk of being political pandering, and creates a very real possibility of social upheaval on a large scale. With everything else we've got on our plate, we sure don't need that.

Sunday, March 15, 2009


Fleetwood Mac was in Chicago last week, and I've always found their story fascinating. Not the soap opera of their personal entanglements; that all had a cliché aspect to it that was more boring than anything.

No, it's the way that their massive fame and success tended to overshadow the real virtues of the band. It's a pop group that had real chops, with a fabulous rhythm section and one transcendent talent who inhabits a world of catchy hooks and virtuosic guitar playing and has, despite the acclaim, been overlooked. (I'm not going to get into the Christine McVie vs. Stevie Nicks business in this post; I've had too many battles over it in the past. McVie is better.)

I'm talking about Lindsey Buckingham, who has gone through an amazing back-and-forth journey between his work with FM, his solo work, and the solo work that bled back into the FM reunions. He may forever be lost in the Fleetwood Mac mix, and that would be a shame.

So this Sunday I'm going to forget Rumours and Tusk, and think about Gift of Screws, Under the Skin, and the remarkably overlooked 1992 record, Out of the Cradle (which peaked at #128 on the U.S. charts).  There is a whole lot of good stuff on this album.

Here's a video from 1992 of Buckingham on Dave Letterman doing Countdown from that album:

And doesn't Dave look young here?

One more, this one a near-novelty song from the movie National Lampoon's Vacation:

One might consider Holiday Road just a throwaway, but it is as hooky and fun as anything Buckingham's done.

Saturday, March 14, 2009

If only you were there too, Tom

Yglesias gets it exactly right in a comment on Tom Friedman's latest bit of brilliance. First, Tom:
There is a huge amount of money on the sidelines eager to bet again on America. But right now, there is too much uncertainty; no one knows what will be the new rules governing investments in our biggest financial institutions. If President Obama can produce and sell that plan, private investors, big and small, will give us a stimulus like you’ve never seen.

Which is why I wake up every morning hoping to read this story: “President Obama announced today that he had invited the country’s 20 leading bankers, 20 leading industrialists, 20 top market economists and the Democratic and Republican leaders in the House and Senate to join him and his team at Camp David. ‘We will not come down from the mountain until we have forged a common, transparent strategy for getting us out of this banking crisis,’ the president said, as he boarded his helicopter.”
Beyond the bipartisanship, in the real world this would be in practice a recipe for rule-by-CEO. A key constraint on the decision-making would need to be that it served the personal financial interests of the 20 “leading bankers” and “leading industrialists” (whatever that might mean) and there’s no reason to think that would serve the public interest. It would be interesting to speculate about what would happen if you held a meeting with all those people and gave them some kind of truth serum that made them speak honestly and bargain in good faith, but that’s not going to happen. Instead, the way the system works is that Obama and his team will need to craft a response and will need to take responsibility for its success or failure.
Of course, this is the usual Friedman recipe for every problem, journalistic or otherwise. Find the nearest rich person, assume that they will act for the common interest instead of their own, and let them run with the ball. If you read Friedman at all closely, you'll see that his interactions are invariably with sheiks and CEOs and princes and international consultants. His few mentions of "regular people" come from other people's reporting.

More than baseball

I've written about sportswriter Joe Posnanski before. He writes for the Kansas City Star, and, now, for Sports Illustrated. (He even has the SI cover story this week, a fine piece on Albert Pujols.) He also has one of the great blogs around, which you can see if you turn your head a little to the right.

Posnanski is, I gather, pretty self-critical; in fact, one of the positives about his blog is his self-deprecating tone that crops up every so often. So, if he says he's happy about something he's written, it's likely worth checking out.

Such a story is this one from the Star. It's about a baseball scout, but it really isn't about baseball at all, it's just a lovely story about a man. I can't excerpt it, you'll have to read it, and I urge you to do so.

Friday, March 13, 2009

The rich are hurting, but not that much

Forbes is out with its annual look at the richest people in the world. This is a pointless exercise for many reasons, not the least of which is the unreality of the numbers. Let's say that Bill Gates has a crisis of some sort, and he has to turn all his holdings into cash. There's no way that he's going to be able to sell all his Microsoft stock at the same price; as he starts to sell, the price will drop, and the last share he sells will be at a severely lower price. But that's only a technical objection.
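
That technical objection can be sketched with a toy model. This assumes a naive linear price impact (every share sold knocks the price down by a fixed amount), and every number here is invented purely for illustration:

```python
# Naive linear price-impact model: each share sold lowers the price a tiny, fixed amount.
shares = 1_000_000
start_price = 20.0
impact_per_share = 0.00001  # assumed drop in price per share sold

# Paper value assumes every share fetches today's quote.
paper_value = shares * start_price

# Closed form for selling one share at a time at a linearly declining price:
# start_price + (start_price - impact) + (start_price - 2*impact) + ...
proceeds = shares * start_price - impact_per_share * shares * (shares - 1) / 2

print(f"paper value:     ${paper_value:,.0f}")
print(f"actual proceeds: ${proceeds:,.0f}")  # about 25% less in this toy setup
```

Real market impact is far messier than a straight line, but the direction of the gap is the point: the quoted price times the share count overstates what a forced liquidation would actually raise.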

This is more a post about innumeracy. Take a quote from the Reuters story about this exercise:
The net worth of the world's billionaires fell from $4.4 trillion to $2.4 trillion, while the number of billionaires was down to 793 from 1,125.
The way Forbes itself put it is similar:
The world's richest are also a lot poorer. Their collective net worth is $2.4 trillion, down $2 trillion from a year ago.
The reader is left with the impression that these chieftains have lost 45% of their riches. Perhaps that's supposed to make us feel better about the similar drops we've seen in our 401(k)'s. But, how did Forbes get these numbers?

They took all the billionaires on the list from last year, added up their values, then did the same for this year, and compared. This technique, however, is spurious because, as the Reuters quote above notes, there are far fewer billionaires on this year's list.

It's as if we compared the wealth of two countries through GDP without correcting for population - oh, wait, the press does that all the time, too. To be fair, Forbes does mention the average (down 23%), but the first number has no business being reported at all. I would wager that this would be impossible to explain to any news editor.
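
A quick sanity check on the two framings, using the figures quoted above:

```python
# Last year's list vs. this year's, per the Reuters/Forbes numbers quoted above.
last_total, last_count = 4.4e12, 1125
this_total, this_count = 2.4e12, 793

# The headline comparison: aggregate wealth of two differently sized groups.
total_drop = 1 - this_total / last_total

# The like-for-like comparison: average wealth per billionaire on each list.
avg_drop = 1 - (this_total / this_count) / (last_total / last_count)

print(f"drop in aggregate wealth: {total_drop:.0%}")  # 45%
print(f"drop in average wealth:   {avg_drop:.0%}")    # 23%
```

The 45% figure mixes a wealth decline with a shrinking roster; the 23% average is the only number that tells you how much the typical billionaire actually lost.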
