We asked three industry experts how organizations can ensure effective cloud security. Here are their responses:
Jon Toor, CMO, Cloudian
A common pitfall we see is organizations relying solely on their cloud provider for security and data protection. In fact, ESG Senior Analyst Christophe Bertrand recently stated that 35% of organizations using Software-as-a-Service (SaaS) completely abdicate their data backup and recovery responsibilities to SaaS vendors. What they fail to realize is that many SaaS providers don’t actually provide full data protection.
For example, Microsoft Office 365 offers geo-redundancy, which protects data from site or device failure, but this does not constitute a true data backup. If data is accidentally deleted or maliciously attacked, Microsoft 365 offers limited recovery options. While it does provide basic recycle bin capabilities, Microsoft 365 only stores deleted files for a limited period; beyond that time frame, the data is permanently deleted. Worse, if a user account is accidentally deleted, that user’s data is erased from the entire Microsoft 365 network.
All of this highlights the need for organizations to take greater responsibility for safeguarding their data, particularly with the proliferation of ransomware attacks over the last two years. This means ensuring their cloud provider has comprehensive security measures in place or adopting a hybrid cloud strategy in which such measures are applied on-premises in their own data center.
Regardless of where they’re deployed, these measures should include traditional defenses such as anti-malware software and anti-phishing training. However, because these defenses often fall short – in a recent survey we sponsored, 49% of ransomware victims had perimeter defenses in place and 54% had conducted anti-phishing training – organizations must also protect data at the storage layer.
This means encrypting data both in flight and at rest to keep cybercriminals from reading it or making it public in any intelligible form. In addition, organizations should have an immutable (unchangeable) backup copy of their data. Immutability prevents such criminals from altering or deleting the data and ensures the ability to recover the uninfected backup copy in the event of a ransomware attack, without having to pay ransom.
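The write-once, read-many (WORM) semantics behind an immutable backup can be illustrated with a toy sketch. The class and method names below are purely illustrative, not any vendor’s API; real immutability is enforced at the storage layer, but the behavior is the same: an existing copy can be read, never overwritten or deleted.

```python
# Toy sketch of WORM (write-once, read-many) semantics, the property
# that makes a backup copy "immutable". Names are illustrative only.

class ImmutableBackupStore:
    def __init__(self):
        self._objects = {}

    def put(self, key: str, data: bytes) -> None:
        # Write-once: refuse to overwrite an existing backup copy.
        if key in self._objects:
            raise PermissionError(f"{key} is immutable and cannot be overwritten")
        self._objects[key] = data

    def get(self, key: str) -> bytes:
        # Reads are always allowed, e.g. to restore after an attack.
        return self._objects[key]

    def delete(self, key: str) -> None:
        # Deletion is blocked while the retention policy applies.
        raise PermissionError(f"{key} is immutable and cannot be deleted")
```

The point of the design is that even an attacker who obtains valid credentials cannot alter or remove the stored copy, so a clean version is always available for recovery.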
Whatever your cloud-based application may be, maintaining a copy of your data on-prem gives you recourse in the event that something goes wrong. Whether it is a data corruption issue, service interruption or hacker encryption, having an on-prem copy of your data gives you options. Furthermore, it gives you full control over the management policies, retention of deleted data and immutability settings.
Data is the lifeblood of any organization. Think of the cloud as yet another IT resource, not an infallible entity, and then act accordingly.
Don Boxley, CEO and Co-Founder, DH2i
The network perimeter continues to evolve – perhaps never more so than over the past year during which the way we work, learn, shop and live changed so dramatically. We were fortunate to be able to leverage mobility, cloud computing, the Internet of Things (IoT), Edge Computing and other innovative, advanced technologies to enable most of the necessary changes.
However, the road to today’s new compute paradigm was not without its bumps. It is a road that remains rather bumpy for many – especially, when it comes to security. Or more specifically, securing the network perimeter.
The past year has proven without a doubt that traditional VPN and direct-link approaches to communications and security are ill-equipped to face today’s security demands. Current VPN and direct-link approaches are cumbersome to maintain and open the entire network to lateral movement.
What is required is an ‘unVPN’ – i.e., a solution that takes a more secure approach, giving users app-level access rather than network-level access, thereby reducing the attack surface. And it should do all of this with the most secure and performant approach to create a Software Defined Perimeter (SDP) to grant connectivity to distributed apps and clients running across multiple sites, clouds and domains.
Of course, not all SDP solutions are created equal. First and foremost, today’s enterprise IT executives should seek a solution that ensures a Zero Trust architecture by permitting users to access only authorized apps, not a slice of the network, thereby eliminating the ability for any lateral movement.
Ideally, data should flow directly between users, sites, and clouds using application-level DTLS encrypted micro-tunnels and Public Key Authentication. The SDP solution should also only use randomly generated non-standard UDP ports, making the tunnels and servers untrackable and invisible to port scanners and other hacking tools.
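The randomly generated, non-standard UDP ports described above can be sketched in a few lines. This is a simplified illustration, not an SDP implementation: the port range and retry count are assumptions, and a real solution would layer DTLS encryption and key authentication on top of the socket.

```python
import random
import socket

def open_random_udp_socket(low: int = 20000, high: int = 60000,
                           attempts: int = 20):
    """Bind a UDP socket to a randomly chosen non-standard port.

    Because the port is random and non-standard, it is far less likely
    to be found by port scanners probing well-known service ports.
    """
    for _ in range(attempts):
        port = random.randint(low, high)
        s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        try:
            s.bind(("127.0.0.1", port))  # localhost for illustration
            return s, port
        except OSError:
            s.close()  # port already taken; try another random one
    raise RuntimeError("no free port found in range")
```

Each micro-tunnel in such a scheme would get its own randomly selected port, so there is no fixed, advertised endpoint for an attacker to target.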
Configuration and management should be uncomplicated. The software should integrate into any existing networking infrastructure. With no appliances to install, configure or maintain, you will get a vastly simplified deployment with no ACLs or firewall configuration headaches. And remote users can easily connect to their tunnels from wherever they are.
Last but not least, traditional networking tools for multi-site connectivity can be complex and expensive to maintain – especially for the cloud. SDP does not require a dedicated VPN appliance, so for cloud connectivity there is no requirement to pay cloud vendors an hourly VPN fee to allow clients to connect. That means costly direct links and VPNs can be phased out for even more savings.
Tom Callahan, Director of Operations, MDR – PDI Software
When most people think of security, they tend to focus on why they need it in the first place. If you know what your primary threats are, it’s much easier to define a security strategy to help prevent or stop those threats. This approach definitely applies to safeguarding cloud-based data.
Even as the popularity of cloud computing soars, many organizations still have fundamental security concerns. However, the cloud isn’t necessarily more or less vulnerable than any local IT systems as long as you utilize security best practices. You can start by defining a clear cybersecurity strategy and avoid migrating any data to the cloud until your IT team thoroughly understands that strategy and any related processes.
As you begin to work with cloud services providers, you need to do your due diligence in understanding service level agreements and identifying which party is responsible for certain areas of security (what’s commonly known as a ‘shared responsibility model’).
This really comes into play if you handle financial or personally identifiable information that’s subject to compliance and industry regulations. Most cloud providers offer some level of security, but you’re typically on the hook for key items such as backups, passwords, multi-factor authentication and logon restrictions.
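Of the items you are typically on the hook for, multi-factor authentication is often implemented with time-based one-time passwords (TOTP, standardized in RFC 6238 on top of RFC 4226’s HOTP). A minimal sketch using only the Python standard library shows the mechanism:

```python
import hashlib
import hmac
import struct

def totp(secret: bytes, for_time: int, step: int = 30, digits: int = 6) -> str:
    """Compute an RFC 6238 time-based one-time password (SHA-1 variant)."""
    counter = for_time // step                       # 30-second time window
    msg = struct.pack(">Q", counter)                 # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)
```

Using the RFC 6238 test secret `12345678901234567890` at Unix time 59 yields the published 8-digit vector `94287082`, which is how authenticator apps and servers independently agree on the same short-lived code.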
It’s important to train all personnel who will be interacting with the cloud platform about security. Many of the greatest vulnerabilities for breaches in the cloud stem from improperly maintained access controls, or weak passwords and logon credentials. In fact, many ransomware attacks in the cloud rely on account hijacking or stolen credentials to access sensitive data. This is where ongoing security awareness training and easy-to-understand security policies go a long way in reducing risk.
Threat detection and response capabilities are also critical for securing the cloud. If you can’t identify potential threats in real time, you’ll struggle to prevent breaches. Programmatic detection and response tools are usually a good way to strengthen your overall security posture and proactive 24/7/365 monitoring is a must.
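A programmatic detection rule can be as simple as flagging unusual patterns in authentication logs. The sketch below is a toy example, not a production detector; the event format and threshold are illustrative assumptions, and real tooling would correlate many more signals in real time.

```python
from collections import Counter

def flag_brute_force(events, threshold: int = 5):
    """Flag source IPs with an unusually high count of failed logins.

    events: iterable of (source_ip, outcome) tuples collected over a
    monitoring window, where outcome is "failure" or "success".
    Returns a sorted list of IPs at or above the failure threshold.
    """
    failures = Counter(ip for ip, outcome in events if outcome == "failure")
    return sorted(ip for ip, count in failures.items() if count >= threshold)
```

Rules like this are the building blocks behind detection-and-response tooling: each flagged IP can trigger an automated response such as locking the account or blocking the address.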
If you don’t happen to have the in-house resources or expertise to handle this type of cybersecurity work, you should seek out a reliable partner that can provide services such as extended detection and response. And don’t forget that you should always have a reliable Disaster Recovery and Business Continuity plan no matter where your data resides.
Sudip Banerjee, Senior Director, Transformation Strategy, Zscaler
Growing use of cloud-based resources is delivering both benefits and challenges to organizations throughout the region. On one hand there is improved flexibility and efficiency, while on the other there’s a need for better IT security.
The continuing increase in cloud usage is being driven by a range of factors, but one of the primary ones is a significant change in work patterns. Where traditionally only a small proportion of employees would work outside an organization’s premises, doing so has now become the norm.
The change, brought about by the COVID-19 pandemic, appears likely to be permanent. Staff may eventually start to work from offices again, but it’s unlikely to be on a five-day-a-week basis.
For this reason, demand for cloud-based resources is going to remain. Staff will continue to rely on productivity platforms such as Microsoft’s Office 365, collaboration services such as Zoom and Teams, and storage resources such as Dropbox and Google Drive.
A changed technology landscape
This new landscape is very different. In the past, an organization would have had the majority of its IT infrastructure located on-premises and connected using local-area and wide-area networks.
This has now changed, with remote workers accessing resources over the public Internet via home Wi-Fi networks. They are also going directly to cloud-based resources without the protection afforded by office-based security measures.
For IT teams, the challenge therefore becomes finding a way to protect the overall IT infrastructure while at the same time ensuring staff still have efficient and reliable access. This is critical to ensure productivity can be maintained while unauthorized access and threats such as ransomware can also be prevented.
For this to be achieved, IT teams need to have full visibility of all network traffic. Unless they can see what’s going on, they’ll be unable to protect their organization.
However, while this visibility is critical, it can be frustratingly difficult to attain. The IT team needs to know who their users are, what devices they are using, and how they are connecting to both on-premises and cloud-based resources.
Further challenges are being posed by the continuing growth in network traffic. This growth is being driven both by increasing use of cloud resources and by a rise in multimedia traffic generated by video conferencing and other activities.
A new approach to security
For these reasons, organizations increasingly need to take a different approach to IT security. The combination of remote working and growing use of cloud resources means traditional measures are simply no longer effective.
In many cases, existing virtual private network (VPN) links can’t deliver the goods. They were designed to support small numbers of remote users, but are challenging and expensive to scale to meet the demands of larger user numbers.
Effective cloud security needs to be based on a zero-trust strategy, where all users must prove their identity and authority to connect. Replacing traditional perimeter defenses, zero trust offers effective protection whether resources are located on-premises or in the cloud.
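The deny-by-default logic at the heart of zero trust can be sketched in a few lines. This is a conceptual illustration with a hypothetical policy table, not a real implementation: in practice, identity verification and policy enforcement are handled by dedicated platforms, but the decision shape is the same.

```python
# Toy deny-by-default authorization check in the spirit of zero trust:
# every request must carry a verified identity AND an explicit grant
# for the specific application. Anything not explicitly allowed is
# refused. The policy table and fields are illustrative assumptions.

POLICY = {
    ("alice", "payroll-app"): True,
    ("bob", "crm-app"): True,
}

def authorize(user: str, identity_verified: bool, app: str) -> bool:
    if not identity_verified:               # prove identity first, every time
        return False
    return POLICY.get((user, app), False)   # deny unless explicitly granted
```

Note that there is no notion of being "inside the network" in this model: even a verified user is refused any app they have not been explicitly granted, which is what eliminates lateral movement.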
With the growth in cloud usage only likely to climb, adopting zero trust makes increasing sense. It’s time to start your journey today.