Westpac has moved to dispel a number of common myths associated with banks using Cloud services, following its own recent journey to the Cloud.
Addressing attendees at this year’s Tech Ed conference on the Gold Coast, Westpac principal architect, Ward Britton, detailed the bank’s deployment of Microsoft’s Azure hosting platform.
Previously, the bank’s quantitative analysts used an in-house analytics platform to carry out risk calculations overnight, but to produce increasingly granular information the bank needed much more computing power.
“The journey started with one of our quantitative analysts ... who said, ‘Look I’ve got some issues here, this isn’t working, it’s taking a long time, when it’s running my desktop is locked up and I can’t do anything, but I need to do it quicker, and I need it to run reliably as sometimes it even crashes’,” Britton said.
Westpac kicked off a proof of concept (POC) to show that it could take the Numerix application and the job running on the analyst’s desktop, plug in High Performance Computing (HPC) and Azure, and “make it run way faster” and more reliably.
The Numerix application plugs into Excel and enables the quantitative analyst to utilise a mathematical library and a pricing library, which is provided in the package.
Myth number one, he said, is that banks cannot use the public Cloud as it is too insecure for the sensitive data.
“Customer information is extremely important to banks; it’s looked after with maximum care and security. It doesn’t leave the bank, go offshore or get outsourced, and as such it doesn’t belong on the public Cloud,” he said.
“But what about data that doesn’t have customer-related information in it ... and what if the processing requirements for this information require a huge compute capacity? Perhaps this could be a candidate to put in the public Cloud.”
After establishing there was no customer information being used in the POC, Westpac charged ahead with the Cloud.
“With the right data, and the appropriate amount of governance and diligence around that, I reckon this myth is busted,” Britton said.
Britton also quashed the myth that buying a bigger desktop will prevent desktop apps from locking it up.
“Desktop apps can grind the desktop to a halt. Numerix plugs into Excel 2010 and runs Monte Carlo simulations; it uses a lot of CPU and RAM to do that, so the biggest, baddest desktop you put there to run this on will eventually run out of pump,” he said.
“By shifting the Numerix workload off the PC and onto the HPC grid, we could run it pretty quickly, so this myth that desktop apps will lock up your PC is busted.”
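Britton did not detail Numerix’s internals, but the workload he describes, a Monte Carlo pricing run, is a classic example of an embarrassingly parallel job that saturates a single desktop yet splits cleanly across a grid. The sketch below is purely illustrative and makes no claim about Numerix’s actual API or Westpac’s models: it prices a hypothetical European call option by Monte Carlo and fans the paths out across worker processes, the same divide-and-conquer pattern an HPC scheduler applies at grid scale.

```python
import math
import random
from multiprocessing import Pool

def price_paths(args):
    """Sum discounted-payoff samples for a batch of Monte Carlo paths.

    Illustrative only; parameters are hypothetical, not from Numerix.
    """
    n_paths, s0, strike, rate, sigma, t, seed = args
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_paths):
        # Simulate the terminal price under geometric Brownian motion.
        z = rng.gauss(0.0, 1.0)
        st = s0 * math.exp((rate - 0.5 * sigma**2) * t + sigma * math.sqrt(t) * z)
        total += max(st - strike, 0.0)  # European call payoff
    return total

if __name__ == "__main__":
    # Hypothetical inputs: spot 100, strike 100, 5% rate, 20% vol, 1 year.
    n_workers, paths_per_worker = 4, 250_000
    jobs = [(paths_per_worker, 100.0, 100.0, 0.05, 0.2, 1.0, seed)
            for seed in range(n_workers)]
    # Each batch is independent, so the work scales out to as many
    # nodes as the grid (or Cloud burst) can supply.
    with Pool(n_workers) as pool:
        payoff_sum = sum(pool.map(price_paths, jobs))
    price = math.exp(-0.05 * 1.0) * payoff_sum / (n_workers * paths_per_worker)
    print(f"Estimated call price: {price:.2f}")  # analytic Black-Scholes ~ 10.45
```

Because each batch of paths shares nothing with the others, moving from a desktop to an HPC grid is largely a matter of scheduling more batches on more nodes, which is why offloading the run made it “pretty quick” without changing the maths.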
The third myth, Britton said, was that having the capability to burst to the Cloud would be “cheap, quick and easy”.
“To solve the initial problem of Excel locking up the desktop, we offloaded the work onto HPC. The desktop was still running, numbers came back and results happened quickly; essentially, problem solved,” he said.
One of the objectives of the POC was to be cost effective, Britton said, yet standing up an HPC framework, including the compute nodes, the patching, the backup and all of the “watering and feeding” to keep it going, carries its own costs.
“We needed to find another way and that’s where the HPC Cloud bursting piece came in,” he said.
“When we first began negotiations with Microsoft we thought it would be easy, just deploy the module and you’re away, but it definitely wasn’t like that.”
Chloe Herrick travelled to Tech Ed 2011 as a guest of Microsoft Australia.