"The Active Desk" was a touch-screen device developed in 1992 at the University of Toronto.
Stories by Gary Anthes
Forty years ago this summer, a programmer sat down and knocked out in one month what would become one of the most important pieces of software ever created.
The information architect talks about the lessons of computing history, the re-emergence of oral culture and all the data that Google doesn't index.
What's holding users back? Potential security risks and a loss of IT control topped the list of perceived barriers to SaaS adoption. With so much trepidation in the air, Computerworld decided to get the real scoop, so we interviewed six executives who have tackled SaaS projects.
Computer science -- it's not just about hardware and software anymore.
It would be hard to exaggerate the angst that has gripped the US in recent months as the election nears, markets churn and assets melt. But the headlines that have made us dread picking up the newspaper mask a long-term problem that may shape the future of America more than John McCain's plan for Iraq, Barack Obama's health care ideas or Uncle Sam's heroic efforts to rescue the economy.
Science and technology may not have been the focus of the recent debates between presidential hopefuls John McCain and Barack Obama, but both candidates have outlined some broad policy proposals and goals. That's a good thing, because, as some of the top technology thinkers in the United States today recently shared with Computerworld, the next president will have to tackle the country's ongoing decline in global technological competitiveness.
There's an age-old choice in IT -- whether to adopt a "best of breed" strategy for the power and flexibility it can bring, or to go with a single vendor for accountability and simplicity. The J. Craig Venter Institute Inc. (JCVI) believes in best of breed. The genomic research institute runs Linux, Unix, Windows and Mac OS in its data center. For storage, it draws on technology from EMC, NetApp, Isilon, DataDomain and Symantec.
Every June and November, with fanfare lacking only actual drum rolls and trumpet blasts, a new list of the world's fastest supercomputers is revealed. Vendors brag, and the media reach for analogies such as "It would take a patient person with a handheld calculator x number of years (think millennia) to do what this hunk of hardware can spit out in one second."
It has been just over a year since the introduction of Version 3 of the IT Infrastructure Library (ITIL). The update to ITIL, a framework for best practices in IT service delivery, was intended to sharpen its focus and attract a new group of followers.
Is R&D in the US losing focus, or just shifting focus?
It's impossible to look at the x86 family of microprocessors without wondering if, after three decades of dominance, the architecture might be running out of steam. Intel, naturally, says the x86 still has legs, while hastening to add that its battles with competing architectures are far from over.
Thirty years ago, on June 8, 1978, Intel introduced its first 16-bit microprocessor, the 8086, with a splashy ad heralding "the dawn of a new era." Overblown? Sure, but also prophetic. While the 8086 was slow to take off, its underlying architecture -- later referred to as x86 -- would become one of technology's most impressive success stories.
Patrick Gelsinger is an electrical engineer. He joined Intel in 1979, worked on the design of the 80286 and 80386 microprocessors, and was the chief architect for the 80486 chip.
William Scherlis is a professor of computer science at Carnegie Mellon University and director of the Institute for Software Research there. He specializes in software assurance, software evolution and technology to support software teams. He has a long association with NASA and the US Department of Defense. Scherlis spoke with Gary Anthes about progress in software development.