Security in the cloud: cloud control

Angela Partington examines the issues surrounding security in the cloud. How can service providers balance the need for data protection with the need to deliver a functional and reliable service?

Cloud computing is not a new phenomenon, but the role which telecommunications providers are playing in the cloud is evolving rapidly. They are fighting for market position with IT service providers and systems integrators, such as Accenture and IBM; web-based service providers, like Amazon and Google; network equipment vendors; and infrastructure integrators. They also appear to be finding a more secure footing in the cloud.

Recent research conducted by Ovum shows that 49% of multinational corporations (MNCs) rate telecommunications providers as trusted partners for cloud services, a jump of 12 percentage points over the last 12 months.

The other aspect of cloud computing which appears to have grown disproportionately of late is the number of reported breaches, and with it public awareness of the threat. Amazon’s major cloud outage in April, caused by a configuration error made during a network upgrade, lasted throughout a weekend and took a number of major websites offline. Epsilon recently reported that millions of individual email addresses were exposed during an attack on its servers. “We’ve had a breach a week this year,” says Josh Corman, research director for enterprise security at the 451 Group. “This is going to kill cloud adoption.”

Numerous other bugs, leaks and data security risks have plagued cloud providers in recent months. Threats include ‘sniffing’ technologies, which intercept traffic moving across networks and pick out valuable information. Malware takes numerous forms, including password stealers that capture and replicate keyed log-in details. And good old-fashioned hackers continue to probe firewalls for data that is at risk.

The renewed awareness of the risks challenging data security could be one of the reasons why telcos appear to be thriving. “Research shows that telcos are generally trusted for managed networks and managed communications,” says Evan Kirchheimer, practice leader for enterprise services at Ovum. “That’s not surprising: they’re telcos’ core activities.” Ken Owens, VP of security and virtualisation technologies at Savvis, agrees: “Telcos traditionally were looked at as not being early adopters and being slow to move. On the positive side, telcos are very good at availability and reliability, at showing that products are secure and are going to work well.”

Guaranteeing security

The physical security of the cloud rests with the data centre, and can be as basic as ensuring comprehensive access control, functional generators and a ready supply of diesel, separate redundant connectivity to the internet via different suppliers, access to cables following different routes and an ability to mirror data off-site.

Of course, essential as these steps are, they barely scratch the surface when it comes to the complexities of protecting data and processing. Cloud computing security is a tangled web of aspirations, best practices, standards, principles and acronyms, reflecting the fact that this is truly a process in development. Andy Burton, chairman of the Cloud Industry Forum, speaks of its code of practice “requiring self-certification around three basic facets: transparency, capability and accountability”. Corman believes there are three optimisation points for all security: “CIA: confidentiality, integrity and availability.”

One of the major difficulties is that, due to the evolving nature of the service, there are no formal, universally accepted standards governing best practice in cloud security. There are, however, a number of standards and tools which are commonly referred to by cloud providers. ISO 27001 certification attests to a management system intended to bring information security under explicit management control. The payment card industry data security standard (PCI DSS) provides an actionable framework for developing a robust data security process for card payments, with individual transactions further authenticated by 3-D Secure. IPsec is a protocol suite for securing IP communications by authenticating and encrypting each IP packet of a communication session.
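
For illustration, the per-packet principle behind IPsec can be sketched in a few lines. The snippet below is a minimal, hypothetical Python sketch (using the third-party cryptography package) of encrypting and authenticating each packet individually, in the spirit of IPsec’s ESP; the packet layout is invented for clarity and is not the real wire format.

```python
# Illustrative sketch only: per-packet authenticated encryption in the
# spirit of IPsec ESP. Requires: pip install cryptography
import os
import struct

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # in IPsec, agreed via IKE
aead = AESGCM(key)

def protect(seq: int, payload: bytes) -> bytes:
    """Encrypt and authenticate one packet, binding in its sequence number."""
    nonce = os.urandom(12)
    header = struct.pack("!I", seq)  # cleartext header, loosely like ESP's seq field
    ciphertext = aead.encrypt(nonce, payload, header)  # header is authenticated too
    return header + nonce + ciphertext

def unprotect(packet: bytes) -> bytes:
    """Verify and decrypt one packet; raises InvalidTag if it was tampered with."""
    header, nonce, ciphertext = packet[:4], packet[4:16], packet[16:]
    return aead.decrypt(nonce, ciphertext, header)

packet = protect(1, b"GET /account HTTP/1.1")
assert unprotect(packet) == b"GET /account HTTP/1.1"
```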

Standard solutions?

All may contribute a degree of reassurance, but none of these is a bespoke standard designed to provide comprehensive assessment and certification of the cloud. Nor is one likely to emerge soon. Eric Hemmendinger, senior product manager, managed security services at Tata Communications, says: “You won’t see a single standard. It’s going to develop first and foremost based on early adopters. Financial services are an early adopter, so they will be defining their requirements and you could argue they will set the standards. Unlike internet protocols, this will be a situation where the buying community really defines what the requirements are. We’ll turn around within three to four years and realise that this has become the standard; this is what we all expect now.”

There may not be a standardised requirement for compliance but, without doubt, cloud providers need sophisticated and comprehensive security measures to counteract the threats they face. With an array of protective tools and systems at their disposal – firewalls, VPN gateways, intrusion prevention systems (IPS), anti-spam and anti-virus systems, authentication and encryption – one of the key issues appears to be the architecture of the systems. There is general consensus amongst industry experts that these tools should be incorporated while planning and designing the system architecture. As Corman says, “If you don’t plan for it in advance, then it’s much harder and more expensive to do after you’ve architected your system.”

Encryption

Encryption is a major tool in the cloud provider’s arsenal. With continual advances in cryptography, it provides a robust challenge to sophisticated data attacks, and different algorithms can protect data to varying strengths, all the way up to military-grade encryption if desired.

The challenge lies with the choice of which data should be encrypted and the use to which encrypted data is put. Part of the problem is simple storage, as Burton explains: “Storage has become very efficient. Data that doesn’t get used very often is compacted very finely; data that is used more regularly is compacted more lightly. But when you store information in an encrypted form, it is not very efficient in how it uses space. By nature, the space that is used for encrypted data is not as efficient as that used for unencrypted data.” There are quite straightforward financial repercussions for heavy-handed encryption.
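
Burton’s point is easy to demonstrate: well-encrypted data is indistinguishable from random noise, so the compression and deduplication that storage systems rely on stop working on it. A small illustrative sketch (again assuming the third-party cryptography package):

```python
# Compare how well repetitive data compresses before and after encryption.
import os
import zlib

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

data = b"customer_id,balance,last_login\n" * 10_000  # highly repetitive records

aead = AESGCM(AESGCM.generate_key(bit_length=128))
ciphertext = aead.encrypt(os.urandom(12), data, None)

print(f"plaintext:             {len(data):>7} bytes")
print(f"compressed plaintext:  {len(zlib.compress(data)):>7} bytes")        # shrinks dramatically
print(f"compressed ciphertext: {len(zlib.compress(ciphertext)):>7} bytes")  # barely shrinks at all
```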

More serious, however, are the efficiency implications. Generally, encrypted data must be decrypted before use and re-encrypted afterwards, which can slow execution. There is a balance to be struck between security and efficiency. “You must look at the balancing act between security and operational efficiency, and ease of use for the end user,” says Burton. “Security should be there to protect, but not to hinder business from happening.” Key to successful encryption is selecting which data needs to be encrypted, and establishing appropriate security to meet differing priority levels. “Data classification is the first step,” says Owens. “But acceleration technologies need to be in place to really benefit from encryption. These basically provide the encryption in hardware, so it becomes a hardware acceleration process rather than slowing down the virtual machine to do the encryption.”
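
Owens’s classification-first approach can be pictured as a simple lookup from sensitivity tier to protection level. The sketch below is purely illustrative: the tier names, key lengths and offload choices are assumptions, not any provider’s actual policy.

```python
# Hypothetical classification-driven encryption policy; tiers are invented.
from dataclasses import dataclass

@dataclass
class Policy:
    encrypt_at_rest: bool
    key_bits: int           # 0 means no encryption
    hardware_offload: bool  # e.g. AES-NI, so encryption doesn't slow the VM

POLICIES = {
    "public":       Policy(encrypt_at_rest=False, key_bits=0,   hardware_offload=False),
    "internal":     Policy(encrypt_at_rest=True,  key_bits=128, hardware_offload=False),
    "confidential": Policy(encrypt_at_rest=True,  key_bits=256, hardware_offload=True),
}

def policy_for(classification: str) -> Policy:
    """Classification comes first; the encryption decision follows from it."""
    return POLICIES[classification]

print(policy_for("confidential"))
```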

Feeling private?

One of the most common questions is the relative merits of the public cloud versus the private cloud. Private clouds can be run independently by an organisation, using the technological framework of storage, processing, a hypervisor (or virtual machine manager) and virtualisation technology to run its own data centre as a private cloud linked to the internet. Alternatively, they can be run using a cloud provider. As Burton explains, “Enterprises effectively have their own resources, separated from the other resources in the data centre through virtual LAN, virtual private network and firewall technology. It’s a private set of resources for them to run, controlling the shape and use of their resources. Private clouds are generally more popular at the moment.”

There are, however, resourcing issues with the private cloud. Steve Holford, the product and marketing director of Fasthosts Internet Group, explains the problems with selling dedicated servers: “Businesses have to overbuy so they can grow into their servers. If they bought what they needed on day one, they’d be at capacity by day two. So in a data centre full of dedicated servers, a lot of that capacity isn’t actually being used.”

Public cloud, on the other hand, is a multi-tenancy environment in which organisations share access to a pool of resources. It allows enterprises to pay for what they use, expanding their requirements only as they need to – a more efficient and cost-effective undertaking for both parties. But when securing the public cloud, focus moves to managing access to the environment, ensuring that only the right people have access to a specific instance within a public cloud. Stan Fromhold, senior deal architect with BT, explains: “In a multi-tenanted environment, you have to determine whether there’s enough isolation provided between the customers that are running those virtual machines, removing the risk of someone else accessing the virtual machines that don’t belong to them. You also need a unique encryption relationship for each customer that stores their data in the cloud.”
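
Fromhold’s unique encryption relationship per customer can be approximated with per-tenant key derivation. A minimal sketch, assuming a single master key and HKDF; a production platform would more likely hold keys in an HSM or a key-management service.

```python
# Illustrative only: derive a separate data-encryption key for each tenant.
# Requires: pip install cryptography
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

master_key = os.urandom(32)  # in practice, protected by an HSM or KMS

def tenant_key(tenant_id: str) -> bytes:
    """Derive a 256-bit key unique to one tenant from the master key."""
    return HKDF(
        algorithm=hashes.SHA256(),
        length=32,
        salt=None,
        info=tenant_id.encode(),  # binds the derived key to this tenant
    ).derive(master_key)

assert tenant_key("acme") != tenant_key("globex")  # no shared keys between tenants
```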

Private and public clouds are not binary states, however. More enterprises are seeking bursting capabilities, increasing the complexity of how cloud providers segment their customers from each other. Owens explains: “A lot of customers with private clouds don’t want to buy significant amounts of excess capacity. They want to buy a private cloud or dedicated infrastructure that they run their business on 24/7, 365 days a year. But perhaps once or four times in a year they need additional capacity. Then, they’d like to use a public cloud infrastructure, and just pay for what they need while they’re doing their critical processing.” Protecting that data transfer between private and public cloud is critical if the integrity of the private cloud is to remain intact. Owens adds, “They should be able to stipulate a private connection or a public connection with an IPsec tunnel solution.”

Testing times

Proving a system secure requires auditing it, with vulnerability testing and penetration testing. A big problem at the moment is that many data centres, in the name of security, won’t let inspectors in; yet many comprehensive audits require on-site, physical inspection of the data centre.
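
At its most basic, vulnerability testing starts with discovering what a host actually exposes. The toy TCP connect scan below is only a sketch of that first step, and should only ever be pointed at systems you are authorised to test; real audits use dedicated tooling.

```python
# Minimal sketch of the first step of a vulnerability scan: find open ports.
import socket

def open_ports(host: str, ports: range, timeout: float = 0.5) -> list[int]:
    """Attempt a TCP connection to each port and report the ones that accept."""
    found = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means the connection succeeded
                found.append(port)
    return found

print(open_ports("127.0.0.1", range(20, 1025)))  # scan only hosts you own
```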

Forensics, as the name suggests, provides an analysis of cloud computing systems and processing in order to gain a comprehensive understanding of any security breaches. As Corman says, “It’s incredibly difficult to do court-admissible forensics without having logs at many levels that cross the shared responsibility boundary between the cloud provider and the enterprise and given the highly virtualised nature of the cloud. To do proper forensics, you often need to chain together a time-synchronised chain of events, and this is making forensics a very ominous and difficult problem in any public cloud.” A more basic challenge to successful forensics is that some providers consider it to be a breach of privacy. As Holford comments, “We have no real visibility of what our customers use on their servers. They have the opportunity to lock us out and stop us from accessing it. Which is absolutely right. That’s part of their own security.”
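
The time-synchronised chain of events Corman describes amounts to merging logs owned by different parties, at different layers, into one timeline. A toy sketch with invented log formats, assuming the provider and the enterprise have both handed over clock-synchronised logs:

```python
# Toy forensic timeline: merge logs from different layers by timestamp.
from datetime import datetime

hypervisor_log = [("2011-04-21T09:01:03Z", "hypervisor", "vm-42 live-migrated to host-7")]
vm_log = [("2011-04-21T09:01:05Z", "vm-42", "sshd: accepted key for user admin")]
app_log = [("2011-04-21T09:01:09Z", "app", "bulk export of customer table started")]

def timeline(*logs):
    """Flatten per-layer logs into one chronological chain of events."""
    events = [event for log in logs for event in log]
    return sorted(events, key=lambda e: datetime.strptime(e[0], "%Y-%m-%dT%H:%M:%SZ"))

for ts, layer, message in timeline(hypervisor_log, vm_log, app_log):
    print(ts, layer.ljust(10), message)
```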

Successful forensics can provide a clearer understanding of any breaches and demonstrate accountability for any errors, but – wherever it shows the liability for a breach to lie – the enterprise is ultimately responsible for the security of the data which it has placed in the cloud. Many involved in cloud computing argue, plausibly, that most credible cloud providers are more secure than the alternative on-premise environments, given their array of firewalls, encryption, anti-malware, identity management and authentication processes. But as in any service, there will always be a tension between businesses that meet only minimum security standards while trying to minimise costs, and those that do all they can to guarantee the security and safety of the data entrusted to them.

Service level agreements (SLAs) and governance therefore need to be in place in order to guarantee protection and business continuity, allowing businesses to define their expectations and to establish what will happen if those expectations are not met. Telcos are completely familiar with the concept of 5-9s, ensuring that their services are available 99.999% of the time. “It’s one of the things the industry has been designed around,” says Gordon Rawling, director of EMEA marketing for Oracle. “I think there will be more of an appreciation of that in the marketplace. 5-9s is a wonderful underpinning which communications service providers can use to differentiate their provision of cloud services.”
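
The arithmetic behind 5-9s is worth spelling out: 99.999% availability leaves a downtime budget of barely five minutes a year. A quick calculation:

```python
# Downtime budget implied by each level of availability.
MINUTES_PER_YEAR = 365 * 24 * 60

for nines in (3, 4, 5):
    availability = 1 - 10 ** -nines
    downtime = MINUTES_PER_YEAR * (1 - availability)
    print(f"{availability:.3%} uptime -> {downtime:8.2f} minutes of downtime a year")
```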

So while security may currently be the main objection to using cloud computing, it could also be the major selling point. Corman says, “If we did measurably better than our peers, we could charge twice as much for a secure cloud. We’ll have succeeded if, in a few years, we don’t see security as the number one inhibitor of the cloud, but the number one differentiator of successful clouds.” If telcos can continue to reassure enterprises that they are a trustworthy and competent repository for their systems and information, the opportunities offered by cloud computing could reach to the skies.
