Review uncovers rampant virus infections

When the call came, I wasn't surprised. The voice at the other end of the line said, "Our network security analysis tool has found something we call 'suspicion indicators' on your network. I think you need to look at this right away." I had a sick feeling as my mind raced to the conversations I would soon be having with senior management. I tried to focus on the conversation at hand.

After several years of network security consulting in health care, financial services and high-tech companies, I had accepted a full-time position at a not-so-high-tech company, thinking it would be an easy, kick-back kind of job with a regular paycheck.

In the short time I've been here, I've realized that I traded one set of problems for another. Before, I could point out all the security issues to my clients and get paid for that -- and charge additional large sums for the follow-on remediation work. But now, I'm on the other end of the equation. I am now responsible for the network, and the security consultants are pointing out all the problems to me. What was I thinking?

The caller was with an outside network security company we had brought in to do a high-level security assessment. I had hired this firm under the constraints of a small budget and tight timeline, and with the internal and external IT auditors one week away from walking in the door, a quick decision was needed.

The security assessment vendors we had used in the past had primarily focused on access-control issues and known vulnerabilities -- the usual fare. This time, I wanted consultants who were more advanced and experienced than our staff.

I had interviewed a number of consulting firms and was becoming a little desperate. When I asked the team if they knew of anyone, I was introduced to a company that had developed a proprietary tool that incorporated many of the open-source tools we use. I was impressed by its functionality and said, "This is like ArcSight on steroids!" The lead security consultant's resume read like a who's who of the scientific world. He was involved in the Internet before it was ever called that. And to top it off, he was the most unassuming and interesting person I had ever met. I hired his firm, and we got started right away.

Within days of starting data collection and analysis, the consultants had discovered hundreds of thousands of suspicious connections to sites all over the world, many of which were very undesirable. Using their reports, I typed in a Google search string that included the port numbers and country names. The information I found pointed to backdoor Trojan horses associated with a number of viruses. These Trojan horses had to be the culprits behind the mysterious connections.
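
For the curious, here is a minimal sketch of the kind of triage I was doing by hand, assuming the consultants' report could be exported as a CSV of connection records (the file name and column names here are placeholders, not their actual format):

```python
import csv
from collections import Counter

# Hypothetical export of the consultants' connection report:
# columns: src_ip, dst_ip, dst_port, dst_country
SUSPICIOUS_REPORT = "suspicious_connections.csv"

def summarize(path):
    """Count connections by (destination port, country) so the most
    frequent pairs can be researched first."""
    counts = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            counts[(row["dst_port"], row["dst_country"])] += 1
    return counts

if __name__ == "__main__":
    for (port, country), n in summarize(SUSPICIOUS_REPORT).most_common(20):
        # Each pair became a search term, e.g. "port 6667 backdoor trojan"
        print(f"{n:6d} connections to port {port} in {country}")
```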

I tracked down my boss and explained the findings, my theory of virus/Trojan activity and my plan of attack. I also told her that I couldn't say whether our network had been compromised but that I would keep her updated. We discussed whether we should take this information to the C level or do more research. Realizing the legal implications of a network compromise, we decided that we needed more information before communicating upward. My lead security engineer was on vacation, so I got to work.

Ports of Call

I had to determine which connections were valid. A review of outbound firewall rules confirmed that they were appropriately restrictive. I put together a spreadsheet comparing the firewall rules with known backdoor Trojans that operate over our allowed ports. That at least narrowed the list of Trojans to look for. I decided to focus immediately on ports 80 (HTTP) and 25 (SMTP), since they are the most widely exploited.
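
The spreadsheet logic boils down to a simple set intersection. Here's a rough sketch of it, assuming we already know the outbound ports the firewall allows and have a mapping of backdoor Trojans to the ports they use (the entries below are purely illustrative, not our actual findings):

```python
# Outbound ports permitted by the firewall, from the rule review.
ALLOWED_OUTBOUND_PORTS = {25, 53, 80, 443}

# Illustrative mapping of backdoor Trojans to the ports they are known to use.
TROJAN_PORTS = {
    "Example backdoor A": {80},
    "Example backdoor B": {25, 587},
    "Example backdoor C": {6667},
}

# Keep only the Trojans that could actually talk through our firewall.
candidates = {
    name: ports & ALLOWED_OUTBOUND_PORTS
    for name, ports in TROJAN_PORTS.items()
    if ports & ALLOWED_OUTBOUND_PORTS
}

for name, ports in sorted(candidates.items()):
    print(f"{name}: reachable over allowed port(s) {sorted(ports)}")
```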

We use Web filtering technology and have a robust antivirus program, so I couldn't understand what was going on. Then I found that most of the suspicious activity was on a network segment that wasn't being filtered, which means malicious Web sites weren't being blocked. Great. I asked the network architect if we could deploy our Web filtering technology to that segment. His answer: We needed two more routers, and it would take weeks to get the equipment in place. Roadblock.

The next step was to identify and inspect the systems from which the connections were originating, but the DHCP logs were basically useless. Another roadblock.

I logged onto the central antivirus control manager. It was reporting that hundreds of systems were infected! I ran month-to-month reports beginning in January to see if there was a pattern. Sure enough, January through April, we had no virus outbreaks. In May, something went boink. From May through the present, the control manager was reporting a long list of viruses active on our network, many of which installed backdoor Trojans. This was not a good sign. I asked the data center operations manager for his help.
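
Spotting that pattern was essentially a matter of tallying detections per month. A quick sketch of that tally, assuming the control manager can export its detection log as a CSV (the file and column names here are hypothetical):

```python
import csv
from collections import defaultdict

# Hypothetical CSV export from the antivirus control manager:
# columns: detected_on (YYYY-MM-DD), host, virus_name
DETECTION_LOG = "av_detections.csv"

def detections_by_month(path):
    """Tally detections per month so an outbreak pattern stands out."""
    monthly = defaultdict(int)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            month = row["detected_on"][:7]  # "YYYY-MM"
            monthly[month] += 1
    return dict(sorted(monthly.items()))

if __name__ == "__main__":
    for month, count in detections_by_month(DETECTION_LOG).items():
        print(f"{month}: {count} detections")
```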

Our antivirus vendor confirmed that in May the antivirus services went down on the primary e-mail server because of file corruption. It took days to get the system back. During that time, an unknown number of e-mail-borne viruses came into our network. Many of those, though detected, couldn't be quarantined. The cleanup will require manual effort and resources. Time to involve upper management.

I have had quite the headache in the past few weeks, but the good news is that the unsavory results of the security assessment have allowed us to identify a number of things that need to be fixed and to justify a budget for better tools, additional head count and possibly an organizational adjustment. The consultants recommended that information security be given visibility at the C level, as in CSO. But for me, it's back to the salt mines. I will be spearheading a virus cleanup effort.
