- 08 July, 2003 12:28
Banks come in all shapes and sizes, from global financial services firms down to the smallest credit unions. But finding better ways to ensure security is a common concern. Large banks face another challenge - the need to deploy data-management tools as data volumes grow.
Finding ways to identify the holes in applications and patch them is a top priority because of the many computer worms and hacker assaults that are aimed at exploiting specific software vulnerabilities. "We have to apply patches nearly every day," says Bill Arnold, IT manager at Purdue Employees Federal Credit Union in West Lafayette, Ind.
The credit union uses a mix of Unix, Linux and Windows servers and desktops. While patching had been a manual job until recently, the credit union now uses a tool from SecurityProfiling to automate the patch process for some of its computers. SecurityProfiling's software agents sit on servers, where they receive and install software patch updates.
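The agent-side logic of such a tool can be sketched simply: compare what is installed on the machine against what the patch server advertises, and queue anything missing. This is an illustrative sketch only; the function and data shapes are assumptions, not SecurityProfiling's actual design.

```python
# Minimal sketch of a patch agent's decision step: given the patch levels
# installed locally and the levels the server advertises, compute which
# patches still need to be applied. Names and formats are hypothetical.

def pending_patches(installed, available):
    """Return (package, patch_level) pairs that should be applied.

    installed: dict mapping package name -> highest installed patch level
    available: dict mapping package name -> list of patch levels on the server
    """
    queue = []
    for package, levels in available.items():
        current = installed.get(package, 0)  # 0 means nothing installed yet
        for level in sorted(levels):
            if level > current:
                queue.append((package, level))
    return queue
```

An agent would run this on a schedule, then download and install each queued patch, which is what removes the daily manual chore Arnold describes.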
However, Arnold says it's hard to feel the battle can be completely won. For example, Microsoft Corp. doesn't provide patches for some of the older software the credit union still uses. The organization eventually will upgrade to newer desktop software, but the patch management problem remains a tough one to solve.
A Deloitte & Touche survey shows IT security managers from 175 firms have big concerns about network attacks. According to the consultancy's 2003 Security Survey, only 13 percent of the industry's IT security professionals felt "extremely confident" that their organizations are shielded from Internet-based attacks. Moreover, 18 percent said they were "not very confident" their systems are safe from insider attacks. And 39 percent acknowledged that their systems had been compromised in some way last year.
"We're a pure Microsoft shop; our core concern is that we be up and running, and we want to patch in production as much as possible," says John Shields, senior vice president of e-business at Patelco Credit Union in San Francisco.
Patelco uses nCircle's IP360 appliance to scan its internal network for vulnerabilities. "We use the reports to generate the patches we need," Shields says.
Shields and Arnold want to see tighter integration between the patching tools, scanners and intrusion-detection systems they use so that security alerts from the IDSs are more relevant to their network traffic. Today, IDSs have little knowledge of the systems they are intended to protect.
Kirk Drake, CIO at National Institutes of Health Federal Credit Union in Bethesda, Md., says IDSs must improve to give reliable alerts about attacks. "They give you an insane amount of data, notifying you of about 200 to 300 alerts each day, but most of them are false positives," Drake says about the Sourcefire IDS that the credit union uses. "They trigger an alert, but if they knew what was really in your network and what was patched, it wouldn't be giving you that alert."
In recognition of that problem, Sourcefire is developing a network-discovery tool that will share such information with the IDS to improve the quality of alerts. Other vendors, including nCircle, Internet Security Systems and SecurityProfiling, are also pursuing tighter integration between IDSs and scanning tools.
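The integration Drake is asking for amounts to a cross-reference: before raising an alert, check whether the targeted host is actually still vulnerable. A minimal sketch, with purely illustrative data structures rather than any vendor's real format:

```python
# Suppress IDS alerts aimed at vulnerabilities already patched on the
# destination host. With an accurate patch inventory, most of the 200-300
# daily false positives Drake describes would be filtered out.

def relevant_alerts(alerts, patched):
    """Keep only alerts whose target host is still vulnerable.

    alerts:  list of (host, vulnerability_id) pairs from the IDS
    patched: dict mapping host -> set of vulnerability ids already patched
    """
    return [
        (host, vuln)
        for host, vuln in alerts
        if vuln not in patched.get(host, set())
    ]
```

The hard part in practice is keeping the `patched` inventory current, which is exactly why vendors are wiring their discovery and scanning tools into the IDS.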
Another challenge for banks is finding people to maintain IDS implementations. JP Morgan Chase, for example, wanted to make use of IDS but had no staff experienced with the technology, so the choice was either to hire trained security specialists or to turn to a managed security service.
"We had no experience with IDS," says Jennifer Briggs, vice president of enterprise technology services for the bank in Newark, Del. A careful review of managed security services providers led JP Morgan Chase to Ubizen of Brussels, which had the kind of global presence the bank needed.
The destructive power of natural disasters, such as tornadoes and hurricanes, is also a major preoccupation among IT managers. EECU, a Fort Worth, Texas, credit union, hosts its own Web servers, but for disaster-recovery purposes it recently decided to have Southwestern Bell host its online banking home page in a hardened Web-hosting facility in Dallas built to endure a Category 3 hurricane.
Kenneth Mahan, EECU's IS manager, says the organization's goal is to preserve the online banking home page - the first place customers visit online - in the event of a catastrophe. That way, router tables can be changed to direct traffic to alternate servers and databases where customer information is stored.
In addition to security needs, the large banks are struggling with a problem of mammoth proportions: data. Banks need to centralize and analyze all the transaction data that streams through mainframes and servers. Many say they're storing data for longer periods, both to meet complex privacy or capital-risk regulations and to better understand the bank's business.
"We're storing more historical information," says Doug Welch, senior manager of business intelligence for the Bank of Montreal. Moving 10 terabytes of data daily and analyzing it requires a new breed of tool for high-volume "extract, transform and load" (ETL) processing. A handful of vendors, such as Informatica, Ascential and Ab Initio, offer tools for this.
Bank of Montreal uses Ab Initio tools for this heavy lifting of data as it's hauled across networks and "cleaned" for analysis. Another banking giant, KeyCorp, faces a similar challenge, according to Kevin Sexton, vice president and division manager in Cleveland.
"We don't want to use last month's data to make decisions today," says Sexton, who is aiming to process 4 to 5 terabytes of data daily instead of weekly or monthly by using the new breed of ETL tools. "We need to increase the data feeds to give business managers the information they need now."
KeyCorp uses a home-grown data-management tool. "We're going to see if these ETL tools, which are part of an aggressively growing niche in the marketplace, can do it better," Sexton says.
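The pattern these tools implement can be illustrated at toy scale. The sketch below shows the three ETL stages on a handful of transaction records; real high-volume tools parallelize each stage across machines, but the record format and aggregation here are invented for illustration.

```python
# A toy extract-transform-load pipeline: parse raw transaction records,
# "clean" them into a consistent shape, then load per-account totals into
# the analysis store. High-volume ETL tools do this at terabyte scale.

def extract(raw_lines):
    # Extract: split comma-separated "account,amount" records into fields.
    return [line.split(",") for line in raw_lines]

def transform(rows):
    # Transform: normalize account ids and convert dollar amounts to cents.
    return [
        {"account": acct.strip().upper(), "cents": int(float(amount) * 100)}
        for acct, amount in rows
    ]

def load(records, store):
    # Load: aggregate per-account totals into the target store.
    for rec in records:
        store[rec["account"]] = store.get(rec["account"], 0) + rec["cents"]
    return store
```

Run daily instead of monthly, a pipeline like this is what lets business managers see current rather than month-old figures, which is the shift Sexton describes.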