Without some good means of input and output, a computer makes a pretty good doorstop or boat anchor. Once we start plugging things into a computer, however, it can receive data and instructions, control other machines, and produce text and images on paper or on a screen. But "plugging in" implies the existence of sockets, jacks or other types of fittings into which something can be connected.
Though it seems like such a simple idea, plugging in calls for standards of compatibility and interoperability, both mechanical (will it fit?) and electrical (will it work?). Consider the lowly electrical outlet in the wall. It has two or three prong holes in a standard arrangement, and in the U.S. we can expect it to deliver 110-volt, 60-Hz alternating current. Go to the U.K., however, and you'll find incompatible plugs and outlets delivering 220 volts, enough to fry your American appliances.
Computers aggravate this problem because they use a multitude of different connectors and sometimes use the same type of connector for entirely different devices.
As technology has advanced, older connectors often couldn't handle the newer, higher-density signals, and replacements had to be created. Thus the number of different connectors grew, until the industry realized that it had to simplify things.
The Universal Serial Bus (USB) idea was launched by Intel Corp. with the goal of replacing virtually all other connectors except for video. It was then taken up by many other companies, and in 1995 the USB 1.0 standard was announced. In 1997, a Portland, Ore.-based organization called USB Implementers Forum Inc. was created to coordinate USB development and standards.
USB developers also took the opportunity of the clean, new design to add a significant new capability: USB connectors and cables can deliver not only data to an attached peripheral, but power as well, thus eliminating the need for (and the cost of) those bricklike power-supply transformers that come with so many peripherals.
Today, most desktop computers and laptops have at least two USB ports, and they may or may not have the older legacy connections.
With USB, you can often connect one device to another in a daisy-chain fashion. Thus a mouse can connect to a keyboard instead of plugging into the back of the computer. And that keyboard can connect to a USB port in the monitor. If you need more USB ports, just connect a hub and get three to seven more ports. In theory, at least, up to 127 USB devices can be interconnected.
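That 127-device ceiling isn't arbitrary: USB assigns each attached device a 7-bit address, and address 0 is reserved for devices that have just been plugged in and not yet configured. A quick sketch of the arithmetic:

```python
# USB identifies each attached device with a 7-bit address.
ADDRESS_BITS = 7
total_addresses = 2 ** ADDRESS_BITS   # 128 possible addresses
reserved = 1                          # address 0 is reserved for newly attached devices
max_devices = total_addresses - reserved
print(max_devices)                    # prints 127
```
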
But for users and IT managers, the best thing about USB may be that it simplifies the installation of new peripherals. Many don't even require separate drivers: the computer recognizes the device and installs whatever is needed. And users can plug in or unplug devices whenever they want, with no further steps required.
Within the past few months, USB 2.0 has become a reality, offering up to 40 times the throughput of the earlier 1.1 standard, which was announced in 1998. The 2.0 devices will also work with older USB ports but only at the older, slower speeds.
To get a sense of what this increase means in practical terms, I recently compared backup times to a Maxtor 3000LE 120GB USB 2.0 external hard drive. For USB 1.1, I simply connected the drive to my laptop's built-in USB port. For USB 2.0, I tried out PC card adapters (seemingly identical except for their labels) from Richmond, Calif.-based Keyspan, a division of InnoSys Inc., and from IOgear Inc. in Irvine, Calif. Over USB 1.1, it took 422 seconds to transfer 355MB, while USB 2.0 required only 48 seconds for the same data.
In comparison, copying the same data to another folder on the laptop's hard drive took 93 seconds while copying it to a Toshiba CardBus 5GB drive took 498 seconds.
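Converting those times into effective throughput makes the comparison easier to read. The sketch below uses only the figures reported above; note that even USB 2.0's measured rate falls well short of the bus's 480M bit/sec. signaling rate (40 times USB 1.1's 12M bit/sec.), because protocol overhead and the drive itself limit real-world throughput.

```python
# Effective throughput for each transfer of the same 355MB data set,
# using the times reported in the article.
DATA_MB = 355

times_sec = {
    "USB 1.1": 422,
    "USB 2.0": 48,
    "Internal hard drive": 93,
    "Toshiba CardBus drive": 498,
}

for name, secs in times_sec.items():
    mb_per_sec = DATA_MB / secs
    mbit_per_sec = mb_per_sec * 8      # 1 byte = 8 bits
    print(f"{name}: {mb_per_sec:.2f} MB/sec. ({mbit_per_sec:.1f}M bit/sec.)")
```

By this measure, USB 1.1 managed roughly 6.7M bit/sec. and USB 2.0 roughly 59M bit/sec. on the same job.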
The most recent variation is called USB On-the-Go (OTG). Its smaller connectors let devices communicate directly, without a PC or a modem. Two OTG-equipped digital cameras can exchange pictures by hooking a cable between them. We can expect to see OTG appear on phones, personal digital assistants and notebook computers.
USB devices use a small four-pin connector (two for power, two for data) rather than the much larger connectors typical of serial devices.
FireWire, The Alternative
Before USB, the primary high-speed serial connection was the one defined by IEEE standard 1394, better known by the name Apple Computer Inc. gave it: FireWire. (Sony Corp. and others call it iLink.) Announced by Apple in 1995, FireWire was used for high-speed data transfers and for downloading digital video directly from FireWire-equipped camcorders. With a transmission rate of 400M bit/sec., FireWire handily outpaced USB 1.1's 12M bit/sec. maximum throughput, a real advantage where speed was essential, such as in moving large graphics files. FireWire also needs no computer host, nor must a device periodically signal that it's "alive," as current USB implementations require. Those interruptions in the data stream make USB impractical for most professional video work.
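The speed gap is easy to put in concrete terms. The sketch below estimates the best-case time to move a file at each bus's raw signaling rate; the 1GB file size is just an illustrative assumption, and real transfers take longer because of protocol overhead.

```python
# Best-case transfer time for a 1GB file at each bus's raw signaling rate.
FILE_GBIT = 1 * 8                 # 1G byte = 8G bits

rates_mbit = {
    "FireWire": 400,              # 400M bit/sec.
    "USB 1.1": 12,                # 12M bit/sec.
}

for name, rate in rates_mbit.items():
    seconds = FILE_GBIT * 1000 / rate   # Gbit -> Mbit, divided by Mbit/sec.
    print(f"{name}: at least {seconds:.0f} seconds")
```

At the raw rates, that works out to about 20 seconds over FireWire versus roughly 11 minutes over USB 1.1.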
However, Apple collects a small royalty on each product that uses FireWire, and FireWire is more expensive to implement. These two factors have kept FireWire from supplanting USB in low-end computer peripherals, where cost is critical.
Like USB, FireWire can daisy-chain peripherals together (up to 63, in a treelike structure), and it delivers up to 60 watts of power to peripherals. FireWire allows peer-to-peer device communication as well as hot-plugging and -unplugging. Cable length is limited to 4.5 meters between devices.
With USB 2.0, FireWire has been leapfrogged. However, a higher-speed release of FireWire is expected to up the ante again, providing speeds of up to 2G bit/sec.