Technology Product Focus: How to tell your NAS from your elbow

Many IT and network managers are grappling with whether to choose relatively inexpensive, easy-to-implement network-attached storage (NAS) or storage-area networks (SANs), which are potentially more powerful but also more expensive and harder to implement.

Managers tend to go with NAS if they have tight budgets, need to bring more storage online quickly and work at companies leery of fast-changing technology. NAS devices are low-end, self-contained appliances that need little more than an Ethernet connection and an AC power jack to deliver gigabytes of disk space to a workgroup or remote office.

Besides replacing a standard PC-based Windows NT or NetWare server, NAS devices suit small businesses and remote offices where support staff is thin to nonexistent. They are also useful when a server must sit in a public area: with no keyboard or monitor to tempt prying eyes and fingers, there is an added measure of security. And some offer easy ways to add extra storage when the original drives inevitably fill up with files.

SANs are more appealing to companies that need fast data access for widely distributed users and have the money to make long-term investments in their storage infrastructures.

Information technology managers must weigh cost against ease of implementation and management, speed of data access, scalability, backup and fail-over capabilities and interoperability with other parts of the network. The decisions will become more urgent as the Internet and applications such as customer relationship management and enterprise resource planning generate more customer data.

Even when IT does decide on a strategy, management must be convinced that the move is worth it.

NAS usually occupies its own node on a LAN, typically an Ethernet network. In this configuration, the NAS device handles data storage for the network, taking that load off the application or enterprise servers. Because storage is detached from individual servers, the data is available to any user on the network. NAS is essentially plug-and-play storage that uses proven Ethernet and SCSI technology.
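From an application's point of view, a NAS share looks like any other directory once the export has been mounted. As a minimal sketch (the mount point and file names below are hypothetical, and assume an NFS or SMB export already mounted at that path), ordinary file I/O is all that is required:

```python
from pathlib import Path

# Hypothetical mount point where the NAS export (NFS or SMB) is mounted.
NAS_SHARE = Path("/mnt/workgroup_nas")

def archive_report(name: str, data: bytes) -> Path:
    """Write a file to the NAS share exactly as if it were a local disk."""
    target = NAS_SHARE / "reports" / name
    target.parent.mkdir(parents=True, exist_ok=True)
    target.write_bytes(data)
    return target

if __name__ == "__main__":
    saved = archive_report("q3_summary.txt", b"quarterly figures...")
    print(f"Stored {saved.stat().st_size} bytes on the NAS at {saved}")
```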

A SAN, by contrast, is a high-speed dedicated subnetwork connecting storage disks or tapes with their associated servers. Although these components can be connected via other protocols, including SCSI or IBM's ESCON optical fibre, SANs are most closely associated with the emerging high-speed (133Mbit to 4.25Gbit/sec), long-distance Fibre Channel protocol.

SAN technology is designed to support disk mirroring, backup and restoration, archiving and retrieval, data migration among storage devices and sharing of stored data among servers. SANs can also be configured to incorporate subnetworks such as NAS systems.
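To put the Fibre Channel line rates quoted above in perspective, here is a rough conversion into Mbytes/sec (encoding and protocol overhead ignored), alongside the 20Mbyte/sec Ultra SCSI figure that appears later in this article:

```python
# Rough conversion of quoted link speeds into Mbytes/sec (8 bits per byte;
# encoding and protocol overhead are ignored for simplicity).
def mbit_to_mbyte(mbit_per_sec: float) -> float:
    return mbit_per_sec / 8

fc_low = mbit_to_mbyte(133)      # ~17 Mbyte/sec at the bottom of the range
fc_high = mbit_to_mbyte(4250)    # ~531 Mbyte/sec at the top of the range
ultra_scsi = 20                  # Mbyte/sec, the Ultra SCSI figure cited below

print(f"Fibre Channel: {fc_low:.0f} to {fc_high:.0f} Mbyte/sec")
print(f"Ultra SCSI:    {ultra_scsi} Mbyte/sec")
```

On this crude comparison, the low end of the Fibre Channel range is roughly comparable to an Ultra SCSI connection, while the top end is more than an order of magnitude faster.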

Weighing the cost

An information systems manager at a health care group, for example, says he wants to implement a SAN to accommodate the group's storage needs and give patients access to billing records over the Web. Earlier this year, he began talking to Compaq Computer about building a SAN, but his bosses recently applied the brakes to the project, wanting more time to consider costs and the still-evolving SAN technology. The situation is a common one.

"These are expensive decisions - I spend a lot of my time thinking about storage and trying to think about it strategically," the health care group's IS manager said. Its 1Tbyte of data is currently stored on rack-mounted disks connected to individual servers via SCSI buses. This common approach is easy and relatively cheap. Storage can be increased merely by adding SCSI host bus adapter ports in the form of add-in cards to the server, daisy-chaining more devices off existing buses or adding servers - or all three. However, each SCSI bus can support a maximum of 15 disk arrays, and each SCSI bus can stretch no farther than 22.8m. from the host.

Large storage needs can quickly translate into a dense jumble of hardware, with data accessible only through individual servers. To see data on other servers, users must go through the network - a process that's slow for the user and bogs down the network. In some cases, the user may not be able to see data without switching drives.

In addition, if any device needs maintenance, the entire string must be taken offline.

On the other hand, "we can get a NAS box in a shipment at 9 in the morning, and it can be up and involved in production by noon," says Mark Dahl, distributed systems manager at BP Exploration (Alaska).

"NAS is very stable technology," says Lauri Vickers, an industry analyst at Cahners In-Stat Group. "It provides a lot more scalability than conventional storage" but costs much less.

However, she adds, SANs have "the most strategic intelligence that can be applied best to large volumes of stored data".

The promise of SANs must be balanced against their comparatively high costs, as well as implementation and management headaches, say analysts.

"Ninety-one cents of every dollar spent on nontraditional storage, specifically for SANs, goes to management and maintenance," says Vickers.

Another negative is the question of interoperability. The Storage Networking Industry Association (SNIA) and the Fibre Alliance are engaged in an ongoing battle over Fibre Channel standards. The SNIA has support from Compaq, Sun Microsystems and other vendors, while the Fibre Alliance is backed by EMC Corp.

Complex needs

BP Exploration recently invested in both NAS and SAN technologies to accommodate almost 5Tbytes of data on Unix and Windows NT platforms.

NAS technology is "way ahead" of SAN technology for usability and reliability, especially in cross-platform applications, says Dahl. However, he says, SANs let him allocate data to any free space on a storage network.

The health care group's major storage requirements are for its online transaction service, which demands high availability but not huge capacity, and a growing 3Tbyte data warehouse. The data warehouse must store large amounts of data but has comparatively lower uptime requirements.

The group has added 800Gbytes of StorEdge disk arrays from Sun while it evaluates SAN products. Its systems manager has been talking to EMC and Veritas Software; he says he has a lot of confidence in Veritas but isn't sure he can afford EMC's high-end products on his $US2 million budget.

"If money weren't an issue, we'd go with EMC because we know they're solid products," he says. "Moving to a SAN, I do worry about interoperability and management issues." Those concerns lead some customers to stick with safe storage choices.

Earlier this year Dow Jones & Co, an investment and publishing company in New York, spent about $US3.5 million on storage technology, most of it configured as NAS, according to senior systems administrator Marc Appelbaum.

"Dow Jones doesn't want to set the pace with a new technology," says Appelbaum. "This is a conservative company [that] is not going to take a risk on a technology that's not mature."

The 100Gbytes of data being stored by Dow Jones includes customer preferences, archived stories from The Wall Street Journal and the backup for the Web version of the newspaper.

Dow Jones turned to Storage Technology, a vendor that deals in SAN products and tape storage devices, for both products and help in implementing them. The primary storage device is the StorageTek L700 tape library, with a 13.6Tbyte capacity and Ultra SCSI (20Mbyte/sec data transfer) connections.
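A quick estimate (single stream, no protocol or tape-handling overhead, so the real window would be longer) shows what that 20Mbyte/sec link means for moving the roughly 100Gbytes Dow Jones stores:

```python
# Time to stream ~100 Gbytes over one 20Mbyte/sec Ultra SCSI connection.
data_mbytes = 100 * 1024     # ~100 Gbytes expressed in Mbytes
rate_mbytes_per_sec = 20     # the quoted Ultra SCSI transfer rate

seconds = data_mbytes / rate_mbytes_per_sec
print(f"Single-stream copy time: about {seconds / 3600:.1f} hours")  # ~1.4 hours
```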

Appelbaum says his department is planning to set up separate Compaq StorageWorks SANs for individual business units that request them because they offer high-speed access to data.

Easier choices

For some IT managers, the choice of SAN vs NAS was easier.

Ken Ciaccia, information services project manager at Armstrong World Industries, has implemented a NAS configuration. He says the flooring company is currently storing 628Gbytes of data from its SAP applications, with another 28Gbytes added every month. The SAP data is stored on EMC drives attached to the Armstrong LAN.
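As a rough projection (the 1Tbyte ceiling below is a hypothetical planning figure, not one Armstrong has cited), that growth rate can be turned into a timeline:

```python
# Simple linear projection of the SAP data growth described above.
current_gb = 628             # Gbytes stored today
growth_gb_per_month = 28     # Gbytes added every month
ceiling_gb = 1000            # hypothetical 1Tbyte planning threshold

months = (ceiling_gb - current_gb) / growth_gb_per_month
print(f"About {months:.0f} months until the SAP data passes {ceiling_gb} Gbytes")
```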

In addition, the company is moving to an imaging system and needs to store contracts and other documents in Portable Document Format.

"SAN is a technology that makes sense for applications where customers constantly have to access your data," Ciaccia says. "We have storage needs but not so much constant access needs."

John Stone, administration director at the Office of the Public Defender for the Ninth Judicial Circuit in Orlando, oversees the court records for two counties. Staff members in both his and the state attorney's office, as well as independent lawyers, need constant access to these files. Nine months ago, he implemented a SAN to provide faster access to data.

"We have over 240Gbytes of storage capacity, [which] we don't really need all of right now, but we do need the speed that the storage network gives us," he says. "When you have six or seven hundred people trying to access data at the same time, you need a SAN, or they might as well go to lunch every time they try to get to a document file."

The court's SAN comprises one Compaq RAID Array 8000 Pedestal, two HSG80 array controllers, two Peripheral Component Interconnect (PCI)-to-Fibre Channel adapters, two eight-port Fibre Channel switches, 15 9Gbyte Ultra SCSI drives and six 18Gbyte Ultra SCSI drives, all spinning at 10,000 RPM.
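For reference, the listed drives add up to the capacity Stone quotes; the mirroring note is an approximation, since mirroring (described below) keeps a second copy of every block:

```python
# Raw capacity of the drive set listed above.
raw_gbytes = 15 * 9 + 6 * 18     # 135 + 108 = 243 Gbytes, matching the
                                 # "over 240Gbytes" figure Stone quotes
print(f"Raw capacity: {raw_gbytes} Gbytes")

# Mirroring, as described below, keeps a second copy of every block,
# so usable space is roughly half the raw figure.
print(f"Approximate usable capacity with mirroring: {raw_gbytes // 2} Gbytes")
```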

Another factor in Stone's decision to choose SAN technology was the electrical isolation the SAN allows for switches and controllers.

"We were looking for the most reliable and the fastest option, and this was what we chose," says Stone. "Everything is mirrored and striped on the drives. This provides everything, including hot backup, unless you lose the drive, and then all the data is mirrored."

After a "substantial" discount, Stone says, the court paid a little more than $US100,000 for the storage network. Even though Stone says he's satisfied with the technology, his shop wasn't immune to the implementation hassles widely reported with SANs.

"[Compaq] got the technology, but you need to find the gurus to help you through the process," says Stone. "There wasn't a lot of knowledge in programming the Fibre Channel switches and the controllers [that allocate data among storage devices] here, and it took a while to find the people we needed to help."

Some predict the choices will become easier. Many industry observers say that the implementation headaches will fade as the distinctions between NAS and SAN fade. "The time will come when you won't tell the difference," says Thomas Coughlin, an independent industry consultant. "Users are going to insist that . . . SANs become easier to implement and cheaper to maintain and that NAS becomes more scalable and flexible."

Vickers says she agrees, to a point. "NAS and SAN features will converge, especially at the high-volume, high-price end, and will look like SANs with NAS devices as part of the storage network," she says. "But you're not going to want to centralise all your storage, which is the general direction. Low-end NAS will continue in that form."

* See Buyers Guide
