BOSTON (06/12/2000) - More than two years after storage vendors introduced the concept of the storage-area network (SAN) as a high-performance alternative to server-based storage, interoperability remains the major implementation problem.
Customers who were swept up by the initial SAN hype discovered that while off-loading storage functions from data network servers onto a specially designed, high-speed, high-availability storage net made sense in theory, in real-world conditions it was extremely difficult.
First of all, no standards existed for the SAN itself, particularly in the area of management. While standards did exist for Fibre Channel technology, those standards were interpreted differently by vendors, and a virtual sandstorm of interoperability problems resulted.
Some SAN vendors began working on standards through organizations such as the Storage Networking Industry Association (SNIA), the Fibre Channel Industry Association, the Fibre Channel Alliance and the Open System Fabric Initiative (OSFI).
Others began to invest heavily in interoperability labs to pretest SANs before delivery, and they embarked on an educational program in the form of seminars, white papers and so-called plug-fests, where vendors could see how well their equipment worked with components from other vendors.
To some extent, these efforts have paid off, according to Nick Allen, a vice president at Gartner Group Inc. "We're not past interoperability issues; we're not at plug-and-play. It's better than it was, and it's getting better steadily. But we continue to advise our customers to buy only turnkey, totally certified systems," he says.
Kumar Malavalli, vice president of technology at Fibre Channel switch vendor Brocade Communications Systems Inc., says, "We have achieved interoperability on the system level, but have not achieved interoperability between the switch vendors' products."
After trying unsuccessfully to put together a SAN, Bret Cox, managing director of Pacific Ocean Post Technology, a digital effects firm in Santa Monica, California, sums up his experience this way: "The promise of a SAN and the theory behind it are sound, but the reality of putting one together - especially for a person who is not very familiar with high-speed technologies and setting up complex systems - is just not there."
But progress is being made on the standards front. For example:
-- The OSFI has defined the standard for interoperability between Fibre Channel switches manufactured by different vendors. Brocade donated its protocol to the group to hasten the acceptance of the standard, which is now with the Internet Engineering Task Force awaiting ratification.
-- The Fibre Channel Alliance is working on a Management Information Base (MIB) specification for SAN management. According to Said Rahmani, senior vice president and director of technology for Pathlight Technology in Ithaca, New York, substantial progress has been made on the SAN MIB. Pathlight announced it will implement the MIB in several of its products, in the hope that other SAN vendors such as Veritas and Legato will follow. Both of those vendors currently have MIB-enabled products on their roadmaps.
-- The SNIA, the major SAN standards body, currently has 10 working groups addressing standards relating to Fibre Channel storage, discovery, policy-based management and security, says Andrea Westerinen, the organization's technical director.
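The appeal of a common SAN MIB is easy to see: if every vendor's switch or router exposes the same management objects, a single monitoring tool can poll a mixed-vendor fabric without vendor-specific logic. The following Python sketch illustrates that idea in miniature; the field names and vendor adapters are hypothetical, invented for illustration, and are not drawn from the actual Fibre Channel Alliance MIB.

```python
# Hypothetical illustration of why a shared SAN MIB matters: two vendors
# report the same facts under different native key names, and a common
# schema lets one management tool normalize them into a uniform view.

# Fields of our imaginary common MIB (not the real Fibre Channel Alliance MIB).
COMMON_MIB_FIELDS = {"portCount", "firmwareRev", "operStatus"}

def normalize_vendor_a(raw):
    """Map one vendor's native keys onto the common schema."""
    return {"portCount": raw["num_ports"],
            "firmwareRev": raw["fw"],
            "operStatus": "up" if raw["online"] else "down"}

def normalize_vendor_b(raw):
    """A second vendor exposes the same facts under different names."""
    return {"portCount": raw["ports"],
            "firmwareRev": raw["firmware_version"],
            "operStatus": raw["status"]}

ADAPTERS = {"vendor_a": normalize_vendor_a,
            "vendor_b": normalize_vendor_b}

def poll(devices):
    """Return a uniform view of every device, whatever its vendor."""
    return [ADAPTERS[vendor](raw) for vendor, raw in devices]

if __name__ == "__main__":
    fabric = [
        ("vendor_a", {"num_ports": 16, "fw": "2.1.9", "online": True}),
        ("vendor_b", {"ports": 8, "firmware_version": "3.0", "status": "up"}),
    ]
    for entry in poll(fabric):
        # Every device, regardless of vendor, now answers the same questions.
        assert set(entry) == COMMON_MIB_FIELDS
        print(entry["portCount"], entry["operStatus"])
```

In practice a real management station would retrieve these objects over SNMP rather than from in-memory dictionaries, but the normalization step is the part the standard eliminates: with a ratified SAN MIB, every device already speaks the common schema.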
The interoperability labs
In addition, the largest SAN vendors have invested in interoperability labs.
EMC Corp. has devoted 150,000 square feet, more than US$2 billion in equipment and 1.5 petabytes of storage to its interoperability lab in Hopkinton, Massachusetts, according to Lou Przystas, a consulting engineer. Being a storage-only company requires EMC to have all major vendors' equipment in its lab, Przystas says. As a result, the company has 1,000 systems for testing and reproducing customer environments and 750 open systems hosts.
A typical operation that takes place in the lab is the qualification of new products. "We bring in new products and run them through our testing procedures," Przystas says. "If we detect a problem, we return the product to the vendor that supplied it and they upgrade their firmware or code and give it back to us to retest. If we find no other issues during the retest, we qualify it."
Dale Pickford, chief technology officer at eData.com, a data services provider in Boca Raton, Florida, found out the hard way the value of an interoperability lab.
Pickford and an EMC field engineering team spent 10 long days trying, without success, to get a SAN to run. Pickford called a halt to the effort, and requested a meeting with EMC's interoperability team. "We reconvened back at EMC's headquarters and went through the installation piece by piece for a couple of days. EMC then flew a team of 18 people to Boca Raton, and we crushed the project in one weekend. And sure enough, we came out the other end of the tunnel on Monday morning and everything was up and running exactly as it should be," he says.
Without the help of EMC's interoperability lab, Pickford says, his SAN project might never have succeeded. "Their contribution was extensive because the engineering team actually tested all of the features that I was asking to use. And they actually trained the field team that was going to do the implementation in Florida," he says.
Like EMC, IBM Corp. has invested heavily in interoperability. It has dedicated 80,000 square feet of space and $500 million worth of equipment to its interoperability lab in Gaithersburg, Maryland. The lab includes equipment from all major server vendors including Compaq Computer Corp., IBM and Sun Microsystems Inc.; switch vendors such as Ancor, Brocade, Gadzoox and Vixel; routers from Chaparral and Crossroads; and software from Computer Associates International Inc., Hewlett-Packard Co., Legato and Veritas.
"Our role is to get the interoperability team together with our skills so that we can help clients design, do proof of concept, and test their planned solutions," says John Berg, director of the IBM interoperability lab. The team creates customer environments within the lab and conducts presale briefings using typical configurations.
Roger Schwanhauser, director of storage and SAN services for IBM Global Services, says the IBM lab goes beyond simple interoperability. It also looks at the application level to help companies improve the management of their business through better availability of information.
Mass Mutual Insurance in Springfield, Massachusetts, recently made use of the IBM interoperability lab while planning its SAN implementation, according to Gerry Roberts, second vice president for IT. "Our intention was twofold: We wanted to validate our design assumptions and get a first-hand look at the technology that they had implemented in the lab because a lot of the equipment that they have is equivalent to the equipment that our company has."
Roberts and his colleagues in IT wanted to make storage as easy to obtain within their Springfield campus as electricity or telephony.
"In other words," Roberts says, "we wanted to make the storage environment a utility, so that it would be available on-demand and movable among applications."
By using the IBM Global Services Lab, Roberts found he could skip the pilot stage (and the expenses associated with it) that many other companies had experienced in the course of deploying SAN technology. "We didn't have to go through the trouble and investment of bringing the components in-house to find out if they worked together. We were able to go to Gaithersburg and kick the tires, which was very beneficial to us. It made the whole process a lot easier," he says.
Recently, Compaq began building a large interoperability lab on its campus in Colorado Springs, Colorado. But this interoperability lab has a new twist.
Compaq has built the lab in cooperation with SNIA and is making it available to all SNIA members. According to Compaq, this will make its interoperability lab the first large-scale commercial lab to be "open to the public." The lab is scheduled to be completed in 2001.
The turnkey solution
Another approach vendors have taken to eliminate interoperability problems is the turnkey SAN. McData's FabricPak is an example of this approach.
Kendall Fay, director of computing services at Corporate Express, an office supplies company in Broomfield, Colorado, went with a McData FabricPak SAN when his Unix servers in the data center began to run out of storage space and he got tired of buying a new server each time he needed to add storage.
A typical McData FabricPak consists of the McData ED-5000 Fibre Channel Director, JNI host bus adapter, EMC or Hitachi data arrays, Chaparral or Crossroads bridges and tape libraries from a variety of vendors, including ADIC, ATL, StorageTek and Exabyte.
Recently, McData has expanded on this core concept by adding services and applications such as NetBackup 3.2 from Veritas. The company also plans to add database and enterprise resource planning configurations.
By investing heavily in interoperability labs, coming together to develop SAN standards, and sticking with turnkey solutions that they knew would work, SAN vendors have survived the interoperability SANstorm that at one point threatened to bury them.
They have found an oasis of success in the first phase of the SAN life cycle.
This means users can purchase pretested SANs without fear of running into major problems.
But the struggle is far from over. Customers still can't mix and match switches and hubs from various vendors. And there are still problems at the storage-management level.
Clark is a freelance writer based in Haverhill, Massachusetts. He can be reached at firstname.lastname@example.org.