When to go virtual
As with most emerging technologies, storage virtualization isn’t a cure-all, and it can bring a host of unexplored risks and challenges for the unwary.
Firstly, organisations must assess the cost and complexity of deploying virtualization technologies to ensure a given offering is the right solution for them, warns IBRS' McIsaac.
“In my experience, unless you’re an organisation with very large data sets, like a Melbourne IT or a Telstra, and you’re constantly moving data around and every six months you’re disposing of an array and buying a new one, you’re vastly better off buying one or two arrays from a vendor that will hold all your data rather than trying to knit together a quilt of different arrays from different places,” he says.
IDC’s Oostveen advises end users to make sure that each option is compared against their particular set of requirements and that they evaluate price based on dollars per gigabyte as well as dollars per IOPS (input/output operations per second) and IOPS per gigabyte. Doing this will ensure they are buying all necessary aspects of storage and not just passive capacity.
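Those three metrics can be computed directly from a vendor quote. The sketch below uses entirely hypothetical prices, capacities and IOPS figures (none come from the article) to show how two offerings might compare on dollars per gigabyte, dollars per IOPS and IOPS per gigabyte:

```python
# Hypothetical comparison of two storage offerings using the metrics
# IDC suggests. All figures are illustrative, not real vendor pricing.

def storage_metrics(price, capacity_gb, iops):
    """Return the three unit-cost/density metrics for an array."""
    return {
        "dollars_per_gb": price / capacity_gb,
        "dollars_per_iops": price / iops,
        "iops_per_gb": iops / capacity_gb,
    }

arrays = {
    "Array A (capacity-oriented)":    {"price": 120_000, "capacity_gb": 200_000, "iops": 40_000},
    "Array B (performance-oriented)": {"price": 150_000, "capacity_gb": 80_000,  "iops": 200_000},
}

for name, spec in arrays.items():
    m = storage_metrics(**spec)
    print(f"{name}: ${m['dollars_per_gb']:.2f}/GB, "
          f"${m['dollars_per_iops']:.2f}/IOPS, "
          f"{m['iops_per_gb']:.2f} IOPS/GB")
```

The cheaper array per gigabyte here is the more expensive one per IOPS, which is exactly the trade-off that evaluating on capacity alone would hide.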
Melbourne IT’s Gore echoes this sentiment, noting that anyone operating more than one SAN environment will reap the benefits, while a smaller environment with a single SAN will see none.
“You can’t virtualise one SAN but the moment you’ve got two or more in operation, virtualization benefits start kicking in and the more controllers you have the more those benefits start coming in,” he explains.
However, even in situations where virtualization is feasible, the skills required to manage virtualised environments remain scarce in Australia. Part of the decision to move to IBM at Edith Cowan, according to Griffin, was the desire to in-source management of the university's storage, which was previously "tightly managed" by EMC.
While the solution itself is slightly more complex — and the learning curve a challenge — Griffin says in-house management has made day to day maintenance much simpler.
“Being able to have the staff really closely acquainted with how the gear works helps because it is quite simple to operate and they can really get some benefit out of understanding it which helps to prevent any major hiccups,” Griffin says.
“We can measure whether storage is performing appropriately and how a project runs and then adjust how the storage is provisioned in the backend as the project evolves or progresses."
Melbourne IT’s Gore claims there isn’t a huge degree of risk in migrating data over to virtualised storage, though the company has spent the last 12 months conducting rigorous tests on the new platform before it goes into production. He does, however, stress the importance of training staff to manage the environment.
Staff must have the skills required to manage virtual storage, which Gore notes differ slightly from those for traditional storage, as do the risks: should an operational issue arise at the virtualization layer, it could potentially take out the entire storage infrastructure.
According to Gore, just as with server virtualization, storage virtualization involves a different thought process, as data becomes transient and is no longer locked to a single device. Staff must be aware of the potential impacts of moving storage and invest accordingly in tools that let them view their storage environment and its performance.
For Griffin, there's no going back.
“We deliberated over whether it was the right thing to do, but when you look at the benefits we get out of it, it certainly turned out to be the right choice. We went through a process internally where we looked at all the technical options we wanted to put into the request for tender, and it became very clear that with that simplified, standardised and virtualised strategy, we needed to put this technology in place to get those benefits,” he says.