When it comes to environmental sustainability, the information technology community has seriously mistaken its priorities. Our latest research confirms what we have been saying for four years: the IT industry is already energy-neutral, with its savings offsetting its consumption, but there is still no credible scenario for safely managing the global production and disposal of billions of personal computers, mobile phones and other electronic devices. Yet even today, improving the energy efficiency of IT equipment remains the overwhelming focus of the Green IT community.
Stories by David Moschella
Having written nearly 200 columns for Computerworld over the past eight years, it's time for me to sign off, at least for a while. Writing regularly from atop a powerful platform such as Computerworld can get in your blood, so letting go isn't easy. But I will always be grateful to everyone at Computerworld for letting me try my hand at this, especially my numerous, and invariably helpful, editors. Most of all, I would like to thank Computerworld's readers for their countless thoughtful comments, and even their sometimes stinging critiques.
The news that Bill Gates plans to cease being a full-time Microsoft employee in July 2008 was anything but surprising. It's been apparent for some time that Microsoft's founder and chairman (a title he will retain) is more energized by the work of the Bill & Melinda Gates Foundation than by the day-to-day churn of the software business. Taking on the world's many inequities and diseases has clearly emerged as a higher calling for a man whose single-minded focus has often been the key to his success.
The consumerization of IT continues apace. Over the past two years, I have watched technology products and services initially developed for consumers and small businesses have an increasingly significant impact on enterprise computing. In fact, I wrote two columns on that topic: "Keeping Up With Your IT Consumers" and "Change at Hand for PC Management".
Today's IT debate is about how the Internet should be managed.
Few topics get more attention in our industry than the relationship between IT and competitive advantage. Does IT help drive revenues, gain market share or improve margins? That's what ultimately matters. My firm's research shows that a little more than a quarter of business executives believe that their use of IT gives them these sorts of direct advantages in their markets.
You don't have to be a weatherman to detect a touch of froth in the air. For the first time in more than five years, the IT industry press is pretty much all positive, with the eyes of the Internet world focused on the future. There is even talk about IT being "disruptive" again. Has a new up cycle begun, perhaps even a minibubble? It would seem so.
You may not have noticed recent stories explaining why German book publishers are saying no to Google's -- and previously Amazon.com's -- requests to put German-language books online. But while it's easy to dismiss this as just another publishing industry story, there are larger lessons that continue to get too little emphasis.
Many years ago, a major truck manufacturer discovered that its costs were higher than those of its competitors because it used so many more parts to build its fleet. On further analysis, it learned that it actually used eight oil dipstick designs, while its competitors used just one. Though each dipstick added design and manufacturing costs, the eight designs contributed no meaningful marketplace value. Many other parts of the truck were similarly different, but not differentiating.
Everyone agrees about the problem. With very few exceptions, enterprise IT organizations are drowning in low-value work. My company's research shows that roughly two-thirds of most IT staffers' time is spent on the installation, migration, patching, security and compliance work that, while clearly necessary, contributes little or no direct business value. In some organizations, the figure is closer to 90 per cent.
How do you know when a best practice is about to give way to a next practice? Typically, you look for credible signs of change and a compelling story line that points the way toward a different future. Well, if you believe in those two criteria, you might want to start to rethink the way your company manages its PCs.
In the early 1980s, when the PC emerged and computer hardware began its long transition toward commodity status, there were three great US computer technology companies: IBM, Digital and Hewlett-Packard. While we all know now that the power of microprocessor-based systems changed computing forever, to appreciate the challenges that HP currently faces, it's worth revisiting how each of those companies responded to the changes that roiled the industry.
In a column last year, I argued that one of the main reasons so many business people embraced Nicholas Carr's erroneous but provocative Harvard Business Review article, "IT Doesn't Matter," was that it played to some of their deep-rooted fears and anxieties. Simply put, if IT doesn't matter, then it's not really worth learning about, a reassuring thought to many IT-phobic business people at both the staff and executive levels.
Over the past several years, few IT industry developments have been covered by the media as eagerly as Wal-Mart Stores' efforts in RFID and McDonald's and Starbucks' support for Wi-Fi. Much of this interest stems from the simple fact that most people prefer to read and write about things they actually know something about. By contrast, most stories about the projects of IT customers are a step or two away from our daily experiences, which makes them feel somewhat less real and compelling.
It was one of those days when you wonder if you're spending too much time on the conference circuit. The topic seemed important enough: the future of software licensing. But while the vendor panelists were all sufficiently rehearsed, it was obvious that they were struggling to find something new to say.