Emerging Technology

FRAMINGHAM (07/06/2000) - New compression tools can help ease bandwidth crunch

By Fred Hapgood

CIOs who need to improve network performance--meaning all of them--can hedge their money across three bets: more bandwidth, caching or multicasting systems, and more compact compression. At first glance, compression seems the smartest option: It is typically cheaper than the other choices, it's usually effective immediately, it avoids installation and upgrade complications, and it leverages current investments in bandwidth or caching technologies. Unfortunately, compression has long suffered from several complications that often make it a CIO's last choice--but that may soon change.

HURDLES BY THE SCORE Compression's problems are pretty basic. For instance, it saves the most transmission resources when it runs end to end, desktop to desktop. (There is no benefit from a compression-decompression cycle in the middle of a connection.) That requirement can demand a high level of cross-platform compatibility and often some end-user training. Even more frustrating, compression makes files opaque to many network management tools, since it alters the format of the data that the tools use to identify and understand the traffic.

Compression is also a resource vacuum. The most common compression technique identifies repeated chunks of data and builds lists of these chunks at each end of a connection. The compression software then simply points at a place on the list instead of sending blocks of data itself. These "substitutional" or "dictionary-based" compressors can work wonders but at a cost: They gobble processing power and memory. These resource constraints can bite especially deep for network applications, where the compression algorithm may have only a few milliseconds in which to do its work. As a result, compression users often cut corners--such as limiting the number and complexity of the objects the system is able to handle--reducing resource load but also restricting compression efficiency.
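To make the mechanism concrete, here is a minimal sketch of a substitutional compressor in Python. The chunk size, token format and sample payload are illustrative assumptions, not any vendor's actual scheme; real dictionary coders work at finer granularity and manage memory far more carefully.

# Both ends keep the same list of chunks they have seen; a repeated chunk is
# replaced by a short reference to its position in that list.
CHUNK = 16  # bytes per chunk; real products tune this adaptively

def compress(data: bytes):
    dictionary = {}   # chunk -> index in the shared list
    tokens = []       # stream of ("ref", index) or ("raw", chunk) tokens
    for i in range(0, len(data), CHUNK):
        chunk = data[i:i + CHUNK]
        if chunk in dictionary:
            tokens.append(("ref", dictionary[chunk]))   # point at the list instead of resending
        else:
            dictionary[chunk] = len(dictionary)
            tokens.append(("raw", chunk))               # first sighting: send the data itself
    return tokens

def decompress(tokens):
    chunks, out = [], bytearray()
    for kind, value in tokens:
        chunk = value if kind == "raw" else chunks[value]
        if kind == "raw":
            chunks.append(chunk)                        # receiver rebuilds the same list in the same order
        out.extend(chunk)
    return bytes(out)

payload = b"GET /index.html HTTP/1.0\r\n" * 50          # highly repetitive traffic
assert decompress(compress(payload)) == payload

The processing and memory cost the article describes comes from building and searching that shared dictionary on both ends while packets are in flight.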

RAISING THE BARRIERS Fortunately, computers have begun to deliver enough horsepower to take the roof off serious compression development. The growing density of corporate wide area networks has raised the cost of across-the-board bandwidth upgrades. And the globalization of commerce has made compatibility with the low-bandwidth connections and per-bit pricing found on other continents more important. As a result, a host of new network-centered compression products and services are emerging from laboratories and vendors.

Developers have made considerable progress on compressors specialized for specific types of content, for example. The more assumptions a compressor can make about its material, the better it works. Medical Synergy Technologies of Rochester, New York, uses this fact in its medical dictation system that lets radiologists define common medical conditions and treatments as "repeated objects." With the system, doctors simply point and click from a list of objects as they dictate, using speech only to specify how a certain patient differs from the predefined cases. The system then sends only the location of that case in the library on the other end of the connection, radically reducing time and bandwidth demands for both dictation and transcription.
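A hedged sketch of that "repeated objects" idea: both ends hold an identical library of predefined findings, so only a library index and the patient-specific differences cross the wire. The library entries and field names below are invented for illustration and are not Medical Synergy's actual data.

FINDINGS_LIBRARY = [   # identical copy installed at the sending and receiving ends
    "Chest X-ray: no acute cardiopulmonary disease.",
    "Chest X-ray: mild cardiomegaly, no pleural effusion.",
]

def encode_report(finding_index: int, patient_deltas: dict) -> dict:
    # Only the library position and the case-specific deltas are transmitted.
    return {"finding": finding_index, "deltas": patient_deltas}

def decode_report(message: dict) -> str:
    text = FINDINGS_LIBRARY[message["finding"]]
    extras = "; ".join(f"{k}: {v}" for k, v in message["deltas"].items())
    return f"{text} Additional notes: {extras}" if extras else text

msg = encode_report(1, {"comparison": "improved since prior study"})
print(decode_report(msg))   # a few bytes sent, a full sentence reconstructed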

SMART COMPRESSION ARRIVES Some companies have gone one step further. They combine several specialist programs into a library with enough intelligence to know which specialist to pick and when. For example, such programs can recognize a spreadsheet file, decompose it into a more succinct set of formulas and data, and then pass that information down the line. The receiving end then recognizes the information as spreadsheet data and can regenerate the columns and rows of cells into their original format. Different compressors can do the same for presentation and word processing files. Such library programs may also recognize a file that was compressed with a less-effective algorithm, fetch the original application, decompress the file, recompress it with a more powerful algorithm and then send it on--all in real time.
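A minimal sketch of that dispatching idea, assuming just two content types: route an object to a compressor that understands its structure, and fall back to a generic byte-level compressor otherwise. The handlers are placeholders; shipping products decompose formats such as spreadsheets far more thoroughly.

import zlib

def compress_spreadsheet(rows):
    # Placeholder specialist: serialize the cell structure compactly, then deflate it.
    flat = "\n".join(",".join(str(cell) for cell in row) for row in rows)
    return ("spreadsheet", zlib.compress(flat.encode()))

def compress_generic(data: bytes):
    return ("generic", zlib.compress(data))

def smart_compress(obj):
    # The "intelligence": pick the specialist that matches the content type.
    if isinstance(obj, list) and all(isinstance(row, (list, tuple)) for row in obj):
        return compress_spreadsheet(obj)
    return compress_generic(obj if isinstance(obj, bytes) else str(obj).encode())

kind, blob = smart_compress([[1, 2, 3], ["=SUM(A1:C1)", "", ""]])
print(kind, len(blob), "bytes on the wire")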

Intelligent Compression Technologies (ICT) of Falmouth, Massachusetts, makes a suite of such products. Loretta Michaels, director of Nortel Networks' Wireless Internet Division, says that ICT's compressors can make a 9600bps or 14.4Kbps connection behave like a 56Kbps connection or better. "A file containing a PowerPoint presentation that would take an hour to download without intelligent compression could be received in four minutes," she says. According to Michaels, Toronto-based Nortel was so impressed by ICT's compressors, which it originally bought for internal use, that the company is incorporating them into products of its own.
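A quick back-of-the-envelope check of those figures, assuming they describe effective throughput: the quoted line speeds imply roughly a sixfold gain, and the download times imply the transmitted data shrinking by about fifteen to one.

line_speedup = 56_000 / 9_600        # ~5.8x apparent improvement over a 9600bps link
size_reduction = 60 / 4              # an hour-long download finishing in four minutes
print(f"{line_speedup:.1f}x faster line, about {size_reduction:.0f}:1 less data to move")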

ICT isn't the only advanced compression company out there. Expand Networks of Roseland, New Jersey, builds network accelerators that save bandwidth by using several techniques. For example, the accelerators can recognize the data packets in a single communications session, strip out their headers (which, by definition, are all the same), and replace them with a two-bit session number.
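The header-reduction technique lends itself to a short sketch, under the assumption of a fixed-size header that repeats on every packet in a session: the first packet carries the full header and assigns a short session ID, later packets carry only the ID, and the far end restores the header from a table. The header length, tag bytes and two-byte ID field below are illustrative, not Expand's wire format.

import struct

HEADER_LEN = 40  # e.g. a combined IP/TCP header that is identical across a session

class HeaderStripper:
    def __init__(self):
        self.sessions = {}            # header bytes -> session id
    def encode(self, packet: bytes) -> bytes:
        header, payload = packet[:HEADER_LEN], packet[HEADER_LEN:]
        if header not in self.sessions:
            self.sessions[header] = len(self.sessions)
            # The full header travels once, tagged so the peer can learn it.
            return b"F" + struct.pack("!H", self.sessions[header]) + header + payload
        return b"C" + struct.pack("!H", self.sessions[header]) + payload

class HeaderRestorer:
    def __init__(self):
        self.headers = {}             # session id -> header bytes
    def decode(self, frame: bytes) -> bytes:
        kind, sid = frame[:1], struct.unpack("!H", frame[1:3])[0]
        if kind == b"F":
            self.headers[sid] = frame[3:3 + HEADER_LEN]
            return self.headers[sid] + frame[3 + HEADER_LEN:]
        return self.headers[sid] + frame[3:]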

Feargal Ledwidge, network manager for Irvine, California-based Wyle Electronics, says he's seen a threefold to fourfold increase in effective bandwidth by using the products. "Without these accelerators we would probably have had to double our bandwidth purchases," he says. Tomer Zaidel, technical manager for Internet Gold--one of Israel's largest ISPs and another Expand customer--points out that compression is especially valuable when it replaces relatively costly international bandwidth. For his company, he says the Expand products produced a return on investment in a matter of weeks.

CLOSE ENOUGH FOR JAZZ While a spreadsheet or network traffic needs to be identical--bit-for-bit--on both ends of a compression chain, other data types aren't nearly so finicky. Speech, music and video can tolerate an error or two without serious impact on the receiving end. Knowing that, some developers have created techniques that send just the essential elements of the communication, at the cost of introducing (hopefully) unimportant differences between what the sender transmits and the recipient receives. These so-called lossy algorithms have a significant advantage: They can be adjusted to send more or less exact versions of a file depending on the users' needs and network conditions.
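A minimal illustration of that adjustable knob, assuming simple quantization of audio-like samples: a coarser step means fewer distinct values to send and a less exact copy, a finer step means the opposite. Real lossy codecs are far more sophisticated, but the tunable accuracy-for-size trade-off is the same idea.

def lossy_encode(samples, step):
    # Larger step -> more loss, smaller numbers to transmit.
    return [round(s / step) for s in samples]

def lossy_decode(codes, step):
    return [c * step for c in codes]

samples = [0.12, 0.98, 0.53, -0.44]
for step in (0.5, 0.1, 0.01):        # quality dialed up or down to suit network conditions
    restored = lossy_decode(lossy_encode(samples, step), step)
    worst = max(abs(a - b) for a, b in zip(samples, restored))
    print(f"step={step}: restored={restored}, worst error={worst:.3f}")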

Workfire.com of Scottsdale, Arizona, for instance, makes a product called Workfire Server that can store several compressed variants of a given file, detect the connection speed of an end user and then pick the appropriately sized variation--faster connections get bigger files, slower connections get smaller ones. And Lucent Technologies has successfully demonstrated (but as of this writing not yet marketed) a system that senses network conditions and adjusts compression in telephony applications accordingly.
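A hedged sketch of that variant-selection behavior: the server keeps several precompressed versions of an object and picks one based on the measured link speed. The thresholds, file names and sizes are assumptions for illustration, not Workfire's actual logic.

VARIANTS = {                         # quality tier -> (file name, approximate size in KB)
    "high":   ("photo_high.jpg", 400),
    "medium": ("photo_med.jpg", 120),
    "low":    ("photo_low.jpg", 30),
}

def pick_variant(link_kbps: float) -> str:
    if link_kbps >= 256:
        return "high"
    if link_kbps >= 56:
        return "medium"
    return "low"

for speed in (28.8, 56, 1500):
    name, size_kb = VARIANTS[pick_variant(speed)]
    seconds = size_kb * 8 / speed    # rough transfer time at the given rate
    print(f"{speed} Kbps link -> {name} ({size_kb} KB, ~{seconds:.0f} s)")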

Even the management penalties imposed by compression are gradually dissolving.

NetScout Systems of Westford, Massachusetts, recently released a Decompression Probe that solves the problem of indecipherably compressed network traffic. The product takes a slice out of the flow of compressed data, identifies the program that compressed it, decompresses the slice and hands the necessary information over to the proper monitoring and analysis programs. John W. Parsons, manager of Global Telecom Planning and Design at Eastman Kodak in Rochester, New York, says his company successfully uses the NetScout Probes to analyze network traffic flowing into applications such as SAP and Lotus Notes.
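Conceptually, a decompression probe does something like the following sketch, which uses standard formats as stand-ins: sample a slice of the flow, identify the compressor from its signature bytes, expand the slice and hand the clear data to the analysis tool. The signatures shown are common magic numbers; the dispatch logic is an assumption, not NetScout's implementation.

import gzip, zlib

def identify_and_decompress(slice_: bytes) -> bytes:
    if slice_[:2] == b"\x1f\x8b":    # gzip magic number
        return gzip.decompress(slice_)
    if slice_[:1] == b"\x78":        # common zlib header byte
        return zlib.decompress(slice_)
    return slice_                    # unknown format: pass through untouched

sample = gzip.compress(b"SAP RFC call payload ...")
print(identify_and_decompress(sample))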

MEASURING SUCCESS While all these tools seem to offer significant benefits, the increasing complexity of the compressor market means that CIOs will need to create their own standards for success and return on investment. And even some vendors admit that there are no guarantees. "Nobody's claims should be accepted at face value," stresses William Sebastian, president of ICT.

But while many people think of compression as a temporary fix--useful only until we can enjoy the ocean of bandwidth promised by analysts such as George Gilder--history suggests that human needs will expand even faster than bandwidth technology can provide. HDTV can already gobble up 6Mbps. In a few years we may find ourselves with even more data-hungry wall-sized displays, panoramic video and 3-D images. The ultimate video technology, holography, could suck up terabits of bandwidth and still ask for more. Given that, unless human appetites change remarkably, CIOs will be buying better compression for many years to come.

How are you handling compression problems? Tell Technology Editor Christopher Lindquist about it at et@cio.com. Fred Hapgood, a Boston-based technology writer, can be reached at hapgood@pobox.com.

NEW PRODUCTS

RESTORE DEAD DOCUMENTS It may not happen very often, but a damaged Microsoft Word, Excel or PowerPoint document can ruin a user's day--and the day of any help desk people who need to deal with the problem. To try to make everyone's life a little happier, Concept Data has created OfficeRecovery, a suite of tools intended to help restore corrupted Microsoft Office files. The products work with Word and PowerPoint files, starting with the Office 95 versions. ExcelRecovery works with the 5.0, 95, 98 and 2000 versions as well as files from the Macintosh releases of Excel 5.0 and 98. The entire suite of tools costs $399 ($269 without PowerPoint recovery tools). For more information, visit www.officerecovery.com.

WORK IN ARUBA As networks expand, the tools that monitor them need to grow with the flow. Aruba 1.5 from Valencia Systems is a Java-based network monitoring and reporting package that the company claims can support more than a quarter-million network interfaces. Network administrators can view the data either in the included Aruba Console interface or via a Web browser. Administrators can also partition networks as needed to make sure that they view only the data relevant to their needs. The product currently runs on Windows NT, HP-UX, AIX and Solaris Unix. Pricing starts at $10,000. For more information, visit www.valenciasystems.com.

PCS FOR EVERYBODY Ford's doing it. Delta Air Lines is doing it. Do you plan on providing PCs and Internet access for all your employees? If so, enRamp of Huntington Beach, California, would like to help. Companies with 50 or more employees can partner with enRamp to provide workers with new PCs, software and Internet access in their homes. Companies can also work with enRamp to provide secure intranet access, telecommuting and automated administration. EnRamp also offers laptops and support packages designed to prevent a sudden load on internal support services. The systems start at $24.95 a month. For more information, visit www.enramp.com or call 714 799-7267.

FILES, FILES EVERYWHERE Easy, inexpensive wireless file access existed only in dreamland just a few months ago. Now X:drive Inc. offers low-cost wireless access to corporate files with little effort on the customer's part. Users can access files via Palm VII wireless devices or wireless application protocol (WAP)-enabled cell phones. The service is free for individual users. Enterprise Solutions packages--which include guaranteed 100 percent uptime, multiuser folder sharing and enhanced security--start at $4.95 per month per user. For more information, visit www.xdrive.com or call 310 883-2800.

FIGHTING ONLINE EVIL If your company's network traffic swamps the average firewall, there may be a new answer for you. CyberGuard Corp. and Data General have joined forces to create the KnightStar Plus, a combination of CyberGuard's KnightStar firewall appliance and Data General's Aviion server products. The products support remote monitoring and administration via a Web browser and are available in rack-mountable configurations. Prices start at $27,995. For more information, visit www.cyberguard.com or call 800 666-4273.

OFF TO A QUICKSTART Getting an e-commerce site off the ground should no longer take days, weeks or months. InterLan Technologies' new QuickStart service offers preconfigured Unix, Linux and Windows NT servers that it claims can be up and running within hours. Small to midsize businesses simply visit the InterLan website or call a sales rep, request the specific system configuration they need--including processor type, operating system, memory, storage and backup options--and InterLan can connect it to its network in less than a day. Prices start at $650 a month plus a $1,000 installation fee. For more information, visit www.interlan.net or call 888 452-6825.

REVISIT GRAPHICAL USER INTERFACES

STICKY GUIS A decade ago, graphical user interfaces drew skeptical looks. Not anymore.

BY FRED HAPGOOD

In May 1990, Microsoft literally changed the face of computing, proving that even the most clichéd hyperbole can be accurate on occasion. The release of Windows 3.0 decisively shifted the nature of the computer interface from command-line prompts (remember "c:>"?) to a collection of onscreen controls: icons, windows, pull-down menus and pointers--all activated by a cutely named pointing device, the mouse.

Microsoft's operating system wasn't the first of its kind. Several computers with graphical user interfaces (GUIs) had debuted during the previous 15-plus years, including Apple Computer's famous Macintosh and Xerox's groundbreaking-if-unmarketed Alto. Until Windows 3.0, however, users could debate which interface--old-style text or the new graphical--made more sense in a given situation. By the end of 1990, Microsoft's marketing muscle had made these debates academic. It was clear that for the foreseeable future, GUIs would be the interface of choice, like it or not.

On balance, CIO didn't like it. In two articles, in April 1991 and July 1992, we advanced several concerns about this new technology. We worried that developers would slight some useful programs (such as statistics packages) because they did not fit naturally into the GUI model. We worried that the interfaces would promote increased error rates. We worried about the deterioration of user skills, the robustness of GUIs in networked environments, their high resource demands and their sluggish performance.

Above all we worried about their development expense. "Just to match the functionality you had in your previous application takes almost an order of magnitude more work," we warned. GUIs were time-consuming to write and test. ("In the typical GUI application, the user interface accounts for half the programming.") They were an invitation to waste dozens--even hundreds--of expensive man-hours on aesthetic intangibles that command-line interfaces had evaded entirely. We knew that GUIs saved the end user time and trouble, but that time and trouble did not simply disappear. Instead, it shifted upstream to lodge in some poor CIO's development budget.

Eight years later, we see that many of our concerns were misplaced. GUIs worked fine in networks, processing power caught up with graphical demands, and developers didn't neglect programs of interest to business. If anything, however, we understated the cost implications.

That's a bit of a paradox, as application costs generally go down over time. Standards emerge, tools improve, and developers acquire experience. And all of these things have happened with GUIs. Graphics tools are incomparably more powerful than 10 years ago, for instance. And GUI design has become much smarter. "We've stopped using 'world' metaphors, where if you wanted mail you had to hunt through a picture of a town to find the 'post office,'" says Alan Millar, creative director of Chicago-based e-business consultancy Xpedior.

If nothing else had changed, GUI development costs would probably be well under control. Instead, the Web appeared and lifted these expenses to whole new levels. A GUI is no longer just a control panel for a piece of software; it is a company's public face. Further, website GUIs need to work for a diverse and impatient population, provide orderly access to very large libraries of information and resources, and reflect changing fashions. (These days not even a bank can afford to look out of date.) "Every site has to wrestle continuously with the balance between convention and innovation," observes Karen McGrane, senior director of information architecture at Razorfish, a New York City-based digital solutions provider. All this added complexity combined with the need for near-endless updating and improvement constantly raises costs.

There are some reasons to hope this burden might eventually lift. If specialized devices--such as PDAs and cell phones--take over more computing, their tiny screens might reduce the importance of GUI design. Speech recognition might also reduce some of the emphasis on graphics. And today's pressure toward greater customization suggests that eventually all users will have their own custom interface that will make display decisions--harvesting raw, XML-enabled data directly from company servers and formatting it according to the user's needs.

All of this might happen, but few in the field seem to expect it. Experts like McGrane believe that designing applications for mobile devices and configurable user interfaces will just increase development constraints and make things even more expensive, as will integrating speech recognition. In addition, new interface technologies, such as 3-D, virtual reality and animation, seem likely to keep costs high.

One technology produces particular chills: It projects heads-up displays onto eyeglasses so that the user seems to be looking at a very large screen. If accepted by business, the technology could generate square yards of display surface to tinker with--and to pay for.

PREDICTIONS ISPS

ISPS ON THE ROPES

CHECKED YOUR Internet service provider's financial health recently? It might not be a bad idea, particularly if your company gets its access from a second- or third-tier provider not associated with a major phone company or backbone network.

According to Steven Harris, a senior research analyst for business network services in the New York City office of market researcher IDC, the recent exit of money from the tech markets has left some ISPs light on cash. The problem, Harris says, is that almost none of the providers (America Online being a rare exception) are actually making money. Instead, they've been paying the bills thanks to Internet-hype-inflated stock prices. But now that a bit of the air has gone out of the techs, these same ISPs may be forced to depend on nonexistent earnings to cover their costs.

The market's continuing slump (at least as of this May) pretty much guarantees that ISP stocks will continue to lag, as now-wary investors expect something more from their investments--like profits. "It's been a number of years," Harris says, "and I think it's getting harder for [the ISPs] to say, 'Oh, just give us more time and we'll make money eventually.'" A slowly sinking ISP probably isn't a reason to panic, however. Harris says that most ISPs pay their bills after they've already provided their customers with service. Given that, customers would probably have at least a little time to change providers before being cut off, even if their current ISP choice goes belly up.

An ISP consolidation probably won't mean increased prices, either, says Harris. Many large ISPs are tied to major parent companies such as Baby Bells, which see Internet access as a long-term opportunity, even if it's currently unprofitable. New backbone carriers will most likely offer Internet access as a way to entice corporate customers that must provide service to mobile employees. And phone companies, such as WorldCom (formerly MCI WorldCom) and AT&T, often bundle Internet access into contracts as a way to attract or retain telephone customers.

Given that level of competition, Harris says he expects Internet access prices to go down, even if a few providers curl up and die. And that's good news--unless, of course, your provider is the one that goes silent.

-Christopher Lindquist

UNDER DEVELOPMENT HIGH-TECH CHIPS

CERAMICS WITH A TWIST A double gyroid might sound like a fancy figure skating maneuver, but it's actually a structure that could lay the foundation for a new generation of cheap and powerful integrated circuits and storage products.

The double gyroid--a three-stemmed vine intertwined with a mirror twin in a seemingly endless matrix--is the molecular configuration of a new kind of ceramic (a nanostructured ceramic) that could someday replace the chemical polymers now used in the manufacturing of chips and magnetic disks. The double gyroid structure creates a densely packed ceramic composed of millions of interconnected struts, which measure only about one-thousandth the diameter of a human hair. The result is a durable, resilient substance that's also lightweight and cheap to make.

The new material's primary technical benefit is that it has a very low "dielectric constant," which makes it less susceptible to electrical crosstalk than today's commonly used polymers. That's an important consideration as chips get smaller and more susceptible to electrical interference. Additionally, designers can spray the material into a film, which they can then twist into a variety of shapes to meet a product's needs. It could even be applied to sheets of paper or fabric.

Ceramics don't naturally self-assemble into a double gyroid. Yet creating a nanostructured ceramic is relatively simple, says Edwin Thomas, a professor of materials science and engineering at the Massachusetts Institute of Technology, and the developer, along with his graduate students, of the process used to produce the material. A single-step, room temperature process forms the ceramic by exposing two organic polymer blocks--one containing silicon, the other hydrocarbon--to ozone. Chip makers could coat a silicon chip substrate with the material using only ozone and ultraviolet light. "You don't need a billion-dollar fabrication line," says Thomas.

Since a nanostructured ceramic is something of a snap to make and use, Thomas believes that his technology and others that similarly lower the cost and increase the ease of chip production will greatly reduce the cost of building chips and storage products. This could lead to a boom in disposable computer products within five to 10 years, he predicts. "You can imagine having the price of electronics so inexpensive that they'll be in clothing and objects you use once and throw away."

Nanostructured ceramic's commercial potential has attracted the attention of IBM. The company, along with the National Science Foundation, is funding Thomas' research. The potential payoff could be huge. "This is the way to go if you want to put a circuit into a newspaper, cereal box or T-shirt," says Thomas. -John Edwards
