Cloud computing needs better security, interoperability to live up to hype
- 03 April, 2009 08:19
If cloud computing is to move beyond the hype cycle, vendors need to put aside their differences and agree on common principles related to security and the interoperability of cloud platforms, a growing number of industry players are saying.
Two events last week demonstrated rising interest in making security a priority and creating an open infrastructure that lets applications and data move freely from one cloud to another.
A mix of user companies and vendors, highlighted by ING and eBay, announced the formation of the Cloud Security Alliance, saying the delivery of on-demand computing capacity over the Web is putting new demands on security tools.
"The very nature of how businesses use information technology is being transformed by the on-demand cloud computing model," says Dave Cullinane, CISO at eBay. "It is imperative that information security leaders are engaged at this early stage to help assure that the rapid adoption of cloud computing builds in information security best practices without impeding the business."
Separately, a large collection of vendors threw their support behind the Open Cloud Manifesto, which challenges the industry to avoid proprietary technologies that would limit cloud choices. Besides security, the manifesto urges vendors to focus on portability and interoperability of data and applications, governance and management, and metering and monitoring.
Customers need to be skeptical, particularly when they are considering sending critical data and applications to cloud providers, says David Snead, an attorney who spoke about legal issues related to virtualization and cloud computing at Sys-Con's Cloud Computing Conference & Expo in New York City last week. Companies such as Amazon do have downtime, and service-level agreements may not guarantee severe penalties, he said.
"There's no such thing as a cloud," Snead said. "Your data is going somewhere. It's going to some infrastructure provider. ... Something I don't think a lot of companies understand when they're sending things out to the cloud, is where it's going and what companies are going to stand behind it."
Critical applications such as databases, transaction processing and ERP workloads probably should not be the first ones sent out to the cloud, said Kristof Kloeckner, the cloud computing software chief at IBM. Kloeckner recommended that enterprises just now looking at the cloud choose a few "quick wins" that benefit many employees, but carefully analyze applications with mission-critical requirements before making any decisions. Beyond simply outsourcing, the cloud could provide opportunities for enterprises to take on new workloads, such as high-volume, low-cost analytics, or collaborative business networks, he said.
Controversy over the "open cloud"
Last week's debut of the Open Cloud Manifesto was not without controversy, as Microsoft claimed that an open process was not used to create the document, and that it was asked to sign it without the opportunity to provide feedback or revisions.
Reuven Cohen, the founder and chief technologist for cloud computing start-up Enomaly, and one of the people responsible for bringing the manifesto to the public, is advocating for the creation of an industry association focused on marketing a cohesive picture of what cloud computing is.
While many vendors are still defining cloud computing in different ways, Cohen argues that "we can still compete, but we don't necessarily have to tell different stories about what the cloud is. There is an opportunity to come together and grow the market."
How the cloud is defined will be important to limit confusion in the marketplace. Every vendor is using the word "cloud" to suit its own purposes, but the Sys-Con conference last week demonstrated that a common definition is probably not that far away.
As an approach to building IT services, cloud computing harnesses several converging factors in the IT world, including the rapidly increasing horsepower of servers and virtualization technologies that combine many servers into large computing pools and divide single servers into multiple virtual machines that can be spun up and powered down at will.
Led by companies such as Amazon, vendors are building massively scalable server farms to offer compute power, storage, business software and application building platforms over the Internet, using self-service interfaces that let customers acquire resources whenever they want and get rid of them the instant they are no longer needed. Private clouds deployed by enterprises for their own users are built along the same principles, but entirely within the firewall.
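The self-service model described above can be sketched in a few lines of Python. This is purely illustrative: the `CloudPool` class and its `acquire`/`release` methods are hypothetical, not any vendor's actual API, and stand in for the idea of a shared capacity pool that hands out and reclaims virtual machines on demand.

```python
# A minimal sketch of self-service provisioning: a hypothetical provider
# pools virtualized server capacity and lets customers acquire and release
# virtual machines at will. All names here are invented for illustration.

class CloudPool:
    """Simulates a provider's pool of virtualized server capacity."""

    def __init__(self, total_vms):
        self.total = total_vms      # capacity of the shared pool
        self.in_use = {}            # vm_id -> owner
        self.next_id = 0

    def acquire(self, owner):
        """Self-service: spin up a VM the moment it is requested."""
        if len(self.in_use) >= self.total:
            raise RuntimeError("pool exhausted")
        vm_id = self.next_id
        self.next_id += 1
        self.in_use[vm_id] = owner
        return vm_id

    def release(self, vm_id):
        """Return capacity to the pool the instant it is no longer needed."""
        del self.in_use[vm_id]

    def available(self):
        return self.total - len(self.in_use)

pool = CloudPool(total_vms=100)
vm = pool.acquire("acme-corp")
print(pool.available())   # 99
pool.release(vm)
print(pool.available())   # 100
```

The point of the sketch is the lifecycle: capacity is drawn from and returned to a shared pool in seconds, rather than purchased and racked per customer.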
"There is a shift from infrastructure being a capital expense to a variable cost," said Amazon CTO Werner Vogels, during a speech at Sys-Con.
If you are the founder of a start-up that is building an application for Facebook, you have to prepare for the possibility of becoming immensely popular overnight, Vogels said. But you might also fail. That's why you need on-demand access to the power of 5,000 servers at any time, without having to spend the money up front. Or if you run a seasonal business, you may need huge amounts of computing power one month out of the year, but very little during the remaining 11 months.
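Vogels' capital-versus-variable-cost point can be made concrete with some back-of-the-envelope arithmetic. The per-server-hour price and the server counts below are invented for illustration; the comparison is between paying only for hours used and provisioning for the annual peak year-round.

```python
# Illustrative only: hypothetical prices for a seasonal business that needs
# 1,000 servers for one peak month and 20 servers the other 11 months.
HOURS_PER_MONTH = 730
RATE = 0.10  # hypothetical price per server-hour, in dollars

def on_demand_cost(servers_by_month, rate=RATE):
    """Variable cost: pay only for server-hours actually consumed."""
    return sum(n * HOURS_PER_MONTH * rate for n in servers_by_month)

def owned_capacity_cost(servers_by_month, rate=RATE):
    """Capital-style cost: provision for the peak, all year round."""
    peak = max(servers_by_month)
    return peak * HOURS_PER_MONTH * 12 * rate

year = [20] * 11 + [1000]   # 11 quiet months, one peak month
print(round(on_demand_cost(year)))       # 89060
print(round(owned_capacity_cost(year)))  # 876000
```

Under these made-up numbers, paying per hour costs roughly a tenth of owning peak capacity, which is the economic shift from capital expense to variable cost that Vogels describes.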
Cloud computing borrows concepts from grid computing, namely the ability to harness large collections of independent computing resources to perform large tasks; and from utility computing, namely the metered consumption of IT services, according to Kloeckner.
But perhaps the real impetus for cloud computing is the failings of the current IT infrastructure, Kloeckner said. Seven out of ten IT dollars are spent on maintaining systems, and perhaps 85% of capacity in distributed computing environments sits idle at any given time, he said. Storage requirements are escalating too quickly for many data centers to keep up.
The basic message from vendors: Cloud computing, while still in its infancy, is the solution to these problems.
Early days, lingering issues
Still, there's a lot more work that needs to be done to address the concerns customers have when deciding whether to move key applications outside of their firewalls. Ideally, an application built for one cloud service should not be locked into that service forever. It should be able to move freely from one to another, or from within an enterprise's network to outside the network.
Some vendors are already working on portability. An application virtualization company called AppZero recently unveiled technology that moves server-based applications from within the enterprise data center to services such as Amazon's Elastic Compute Cloud in seconds. Moving applications from Amazon to another cloud provider, such as GoGrid, also is possible with the AppZero tool set.
The problem also is being addressed in the academic world, where a standards group called the Open Cloud Consortium is trying to improve the performance of computing clouds spread across geographically disparate data centers and also promote open frameworks that will let clouds operated by different entities work seamlessly together.
Within security there are many issues that must be addressed, according to the Cloud Security Alliance, including compliance and auditing, e-discovery, encryption and key management, identity and access management, disaster recovery, and incident response, notification and remediation.
Ultimately, cloud vendors will be judged on five broad categories: security, scalability, availability, performance and cost-effectiveness, Vogels said. While there are shortcomings today, he predicted huge advancements in the next few years.
"It is still day one," Vogels said. "We've just begun widespread deployment of these services."