For those of us who live and work in Sydney there was no avoiding US vice president Dick Cheney's recent visit to Australia.
The arrival of the Queen Mary 2 and Queen Elizabeth 2 had brought traffic snarls and disruption to the city days earlier, but Cheney seemed an even bigger presence in Sydney than the ocean liners. The VP's 30-car motorcade closed the Harbour Bridge and several blocks of the CBD for hours, while at night Blackhawk helicopters patrolled the skies, scaring the bejesus out of much of the inner city in the process.
The Iraq War is a lot like one of those Blackhawk helicopters hovering outside your flat at 2am; it refuses to be ignored. I admit that I am far away from the centres of power in Washington and Canberra, but the more that I read, watch and learn about the war in Iraq, the more I want to ask: what role has IT played in all this?
You can blame Rumsfeld or Bush or Howard, but if we sidestep the political arguments for a minute and look at the systems underpinning the war, there is plenty of evidence of what retired US Army Colonel Andrew Bacevich calls "the assumption among forward thinkers that technology - above all information technology - has rendered obsolete the conventions traditionally governing the preparation and conduct of war".
Bacevich's comments appear in Fiasco, two-time Pulitzer Prize-winning journalist Thomas E Ricks' account of the Iraq War's strategic shortcomings. Ricks chronicles several mistakes made by the US and its allies in Iraq, but many of the most dire ones are a direct outgrowth of new military efficiencies driven (supposedly, at least) by IT. For example, the US Defense Department's belief that the war could be fought with a relatively small number of troops can, at least in part, be attributed to a misplaced confidence in IT-related advances in warfare, such as better battlefield communications, more precise global positioning, and superior command and control capabilities.
I'm no lecturer at West Point or Sandhurst, but when I read criticisms of the war that involve poor intelligence sharing, an emphasis on tactics over strategy and an over-reliance on technology to provide solutions, alarm bells go off. Substitute "implementation" for "tactics", and you could be describing just about any failed IT project in history.
I'm not bashing IT managers, or their military equivalents; they have to do more with less, just like everybody else these days. I'm talking about something deeper: our belief that technology can make things better.
Thirty years after the PC was supposed to make us more efficient, we all do more work than ever. And our willingness to believe that technology can be applied to any problem is the kind of thinking that leads us to view people as metrics, and wars as bloodless, surgical endeavours that can be risk-managed from afar.
Not only does technology fail to solve problems that require human solutions, frequently it doesn't even work as it's supposed to. Just ask Dick Cheney, whose flight out of Sydney was diverted to Singapore because of "mechanical problems".