Government websites have played a vital role in managing the current Queensland flood disaster, statistics released by hosting company Melbourne IT show.
According to the hoster’s chief technology officer, Glenn Gore, four major sites — the Queensland Flood Appeals site, the Department of Transport and Main Roads, Volunteering Queensland and TransLink — have in some instances received up to a 50-fold increase in traffic.
According to Gore, the Flood Appeals site, which went live on 29 December and was receiving approximately 10,000 visitors per day, jumped to 40,000 on 9 January following the Channel Nine telethon.
Following news that Brisbane would flood, visits jumped to 185,000; when the floods finally hit, visitor numbers climbed to just under 355,000.
According to Gore, server and storage virtualisation was essential in enabling the company to scale hosting to meet public demand.
“To be honest, without virtualisation these sites wouldn’t have held the load,” he said. “The ability to very quickly and easily move resources around, apply resources not only at the server layer but storage layer too to be able to move databases onto faster storage as they needed it. Those two elements — the servers and storage working in concert — made it possible.”
According to Gore, Melbourne IT had to double the amount of computing infrastructure originally assigned to the sites to cope with public demand. On the storage side, a sizable increase in higher speed storage was required to deal with a tripling in input/output load.
At its peak the Department of Transport and Main Roads received a 50-fold increase over its standard daily baseline traffic, according to Gore.
“The department has done a very clever thing and split their website into two components — the standard business as usual content… and a second site known as a high load site,” he said.
“We use smart load balancing to allow fail over to the high-load site which uses things like low resolution images and has just the critical information — we get rid of the heavy lifting and that allows the information on the roads to get out there.”
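The failover approach Gore describes can be sketched in simplified form. The following is an illustrative Python sketch only — the hostnames, threshold and health-check logic are assumptions for the example, not details of Melbourne IT's actual setup:

```python
# Sketch of "smart load balancing" failover: route requests to a
# stripped-down high-load site when the primary site is down or slow.
# Hostnames and the latency threshold below are hypothetical.

FULL_SITE = "primary.example.gov.au"      # business-as-usual content
HIGH_LOAD_SITE = "highload.example.gov.au"  # low-res images, critical info only


def choose_backend(primary_healthy: bool, avg_response_ms: float,
                   threshold_ms: float = 2000.0) -> str:
    """Pick a backend: fail over when the primary is unhealthy or too slow."""
    if not primary_healthy or avg_response_ms > threshold_ms:
        return HIGH_LOAD_SITE
    return FULL_SITE
```

Under normal load `choose_backend(True, 150.0)` keeps traffic on the full site; once health checks fail or latency spikes, requests are steered to the lean site so critical road information still gets out.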
Volunteering Queensland saw its traffic jump from a baseline of 10,000 visits per day to 100,000 at its peak, Gore said, while TransLink saw a sustained traffic increase of 20 to 30 per cent over the period of flooding.
Twitter and disasters
Despite social media such as Twitter serving as a source of localised information during the Queensland floods, Gore said it was still essential that government websites stay up and running during disasters.
“One of the best ways of keeping up to date was following some of the feeds [on Twitter] but that highlighted some problems,” he said. “For one, you get a lot of rumour or non-factual information… so people tried to correct that by telling people to go to authoritative sites. What you saw was that some sites couldn’t handle the load, which meant people couldn’t get authoritative information and then the rumours spun out of control.
“Whereas keeping sites up and running meant that those rumours were killed off quickly. It’s interesting to see how the two interact and it shows the need for an authoritative source of information people can go and check.”
Cloud to the rescue
According to Gore, one of the major lessons IT managers should draw from the floods was the usefulness of barebones fail-over websites.
Gore said organisations should think of their end users, what information they had to have as a minimum, then develop a process for getting that information onto a back-up website which could be quickly cut over to.
“The other thing is having access to spare capacity which you can enable very quickly and direct traffic to as the load comes in,” he said. “Whether it be Cloud, hosting or other, getting that quick access to capacity is important.”
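Enabling spare capacity as load arrives amounts to a simple scaling decision. A minimal sketch, assuming hypothetical per-server capacity and headroom figures (none of these numbers come from Melbourne IT):

```python
import math

# Assumed figures for illustration only.
CAPACITY_PER_SERVER = 50.0  # requests/sec one server can comfortably handle
HEADROOM = 1.5              # keep 50% spare so a spike doesn't saturate


def servers_needed(current_rps: float) -> int:
    """How many servers should be enabled for the current request rate."""
    return max(1, math.ceil(current_rps * HEADROOM / CAPACITY_PER_SERVER))
```

For example, at a baseline of 100 requests/sec this sketch keeps 3 servers enabled; a 50-fold surge to 5,000 requests/sec would call for 150 — which is why pre-arranged access to quickly enabled capacity, whether cloud or hosted, matters.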
Follow Tim Lohman on Twitter: @tlohman
Follow Computerworld Australia on Twitter: @ComputerworldAu