How can IT leaders ensure better visibility over cloud access in the workplace?

Remote work will catalyze a shift from the corporate perimeter concept to micro-office security certification. Outsourcing IT and cybersecurity functions will be crucial for addressing expertise shortages and saving budget. And to coordinate managed service providers while using multiple cloud services, cloud security and management skills will become a ‘must-have’.

These and other cybersecurity challenges and trends will be among those that businesses will have to manage this year, according to a new Kaspersky report.

A shift to remote work, financial constraints caused by economic recession and the growth of cyberthreats driven by the global pandemic will affect the day-to-day role of cybersecurity professionals in 2021.

Understanding the challenges, but also perceiving the opportunities, in IT and IT security management is key for companies to maintain their protection. The recent Kaspersky report, Plugging the Gaps: 2021 Corporate IT Security Predictions, offers advice for each role related to cybersecurity, including CEOs or business owners, CISOs, SOC team leads and IT managers.

Here are some of the main trends to monitor:

  • Protecting the perimeter is no longer enough – home office assessment and certification will be needed. There should be tools to scan the level of security in a workplace – from the presence of software vulnerabilities to connecting to an unreliable or unprotected Wi-Fi hotspot. It will also require wider adoption of VPN, privileged access management, multifactor authentication systems, the implementation of stricter monitoring and the updating of existing contingency and emergency plans.
  • Transition to a service model will enable required levels of IT and IT security with lower investments. According to Kaspersky’s survey, many businesses are planning to use a managed service provider (MSP) or managed security service provider (MSSP) in the next 12 months. This is for good reason as the service model helps to minimize capital investments and transition business costs from CAPEX to OPEX.
  • Training for internal IT security specialists should incorporate management skills. Cybersecurity professions are splitting into very narrow specializations, meaning that hiring staff for each specific role may be too expensive. This is where outsourcing can help plug the gap. However, businesses that outsource key cybersecurity components still need to focus on developing management skills for their in-house teams so they can handle those outsourced functions.
  • The survey showed that in 2020, a significant number of employees used non-corporate software and cloud services such as social networks, messengers or other applications. This is unlikely to change when staff return to the office. To ensure that any corporate data is kept under control, better visibility over cloud access will be necessary. IT security managers will need to align themselves with this cloud paradigm and develop skills for cloud management and protection.

Steve Leeper, Office of the CTO, Datadobi

Unstructured data (i.e. file and object) is growing exponentially. In addition to user-generated data, application-generated and machine-generated data is consuming more and more capacity and placing strain on storage devices and budgets. 

The answer to this dilemma is not always to expand on-prem storage capacity. Rather, the key to providing the flexibility organizations need today is procuring enterprise-grade data mobility software that can scan, analyze and visualize the characteristics of stored content, and then take action on it.

These actions could include classifying, moving, migrating, archiving or even deleting obsolete data, and they are the key to managing ever-growing content. As is becoming more common, public cloud storage is filling a new role, either replacing or augmenting traditional enterprise storage systems.

Today, data is often collected on an on-prem NAS or object storage system and many organizations are examining how to best derive value from that data. They are taking large existing collections of unstructured data and either making a copy for resiliency’s sake or transferring data to a public cloud provider for the purpose of executing analytics. Plus, when data is relocated to a public cloud there are times when it must be recalled to local on-prem systems.

For data that remains in the public cloud, there are further requirements such as lifecycle management, which is needed to manage cost efficiently, and governance obligations to comply with. Key to handling these requirements is the aforementioned visibility of the overall environment. This is critical to understanding what data is stored where, what data needs to be relocated, being able to relocate that data and ensuring the validity of that data as it is relocated.
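To make the lifecycle point concrete, here is a minimal sketch, using AWS S3 and boto3 purely as an example (the bucket name, prefix and day counts are hypothetical), of the kind of rule that tiers aging data to cheaper storage and eventually expires it:

```python
# Illustrative only: an S3 lifecycle rule that moves colder data to cheaper
# storage tiers and expires it after a retention period.
import boto3

s3 = boto3.client("s3")
s3.put_bucket_lifecycle_configuration(
    Bucket="example-unstructured-data",          # hypothetical bucket
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "tier-and-expire-archive-data",
                "Filter": {"Prefix": "archive/"},
                "Status": "Enabled",
                # Move objects to cheaper tiers as they age
                "Transitions": [
                    {"Days": 30, "StorageClass": "STANDARD_IA"},
                    {"Days": 180, "StorageClass": "GLACIER"},
                ],
                # Delete objects once the retention period has passed
                "Expiration": {"Days": 730},
            }
        ]
    },
)
```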

Surya Varanasi, Chief Technology Officer, StorCentric

Cloud computing has maintained an impressive growth trajectory over the past two decades and for good reason. Whether you are looking to extend and enhance your organization’s competitive advantage, further protect your business, or level the competitive playing field, the cloud enables IT and business capabilities, as well as cost efficiencies that would otherwise be quite challenging, if not impossible.

In fact, recent research from Synergy Research Group found that even during the on-going COVID-19 pandemic, while many in the tech industry were taking tremendous hits, Q4 2020 enterprise spending on cloud infrastructure services grew to US$37 billion, up 35% from Q4 2019.

However, before you can utilize the cloud, you need to get there. Data migration, data replication and data synchronization can be complicated endeavors that end up creating obstacles instead of delivering the strategic business value, IT benefits and budgetary advantages for which they are intended.

The ideal data mobility and management solution should enable the seamless movement of data to, from and between heterogeneous hybrid on-site, remote and cloud infrastructures. This capability also eliminates vendor lock-in and enables more extensive content-sharing opportunities – increasingly critical during what will likely become an enduring work from home (WFH) paradigm.

More specifically, an ideal data mobility and management solution should streamline point-to-point data movement and tackle data flow requirements from any storage platform to any other, with fine-grained filtering and continuous incremental updates to alleviate the challenges of moving and consolidating data across heterogeneous environments.
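As a rough illustration of what fine-grained filtering with incremental updates means in practice, here is a minimal Python sketch, assuming hypothetical source and target paths and a simple pattern filter; real data mobility products add verification, parallelism and cloud or object targets on top of this basic idea:

```python
# Copy only files that match a filter and are new or changed since the last run.
import shutil
from pathlib import Path

SOURCE = Path("/mnt/nas/projects")        # hypothetical source share
TARGET = Path("/mnt/staging/projects")    # hypothetical target

def needs_copy(src: Path, dst: Path) -> bool:
    """Copy if the target is missing, a different size, or older."""
    if not dst.exists():
        return True
    s, d = src.stat(), dst.stat()
    return s.st_size != d.st_size or s.st_mtime > d.st_mtime

for src in SOURCE.rglob("*.parquet"):     # fine-grained filter by pattern
    dst = TARGET / src.relative_to(SOURCE)
    if needs_copy(src, dst):
        dst.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dst)            # preserves timestamps for the next run
```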

Next, it should provide complete visibility and management control via an intuitive interface for efficient replication and content distribution across on-premises, remote and cloud resources.

Files should be replicated from anywhere to anywhere, quickly and securely, addressing Business Continuity, Disaster Recovery (DR) and archive requirements, as well as legal and regulatory compliance mandates.

This enables organizations with remote branch offices to ensure fast, local access to critical files, to gain the greatest efficiency and capabilities at the Edge. Reverse workflow capabilities are also necessary for data consolidation from remote locations back to central locations.

And last but not least, the data mobility and management solution should enable files to be synchronized across multiple storage repositories, including disk and tape, as well as private and public cloud providers. This will ensure internal and external users can be provided with safe, secure and uninterrupted access to all appropriate data.

Asif Savvas, Senior Vice President, Product and Offerings at Simeio

Ensuring clear visibility into cloud applications and data is a priority for every IT leader. Yet, many are still unable to properly monitor and manage user access across multiple cloud environments.

Enterprises require seamless and robust visibility over their diverse cloud environments, to protect users, data and applications. Whether on-premises, in the cloud or within a hybrid environment, all user credentials must be secured from those who might use them for untoward purposes.

Visibility starts with controlling access and granting permissions. However, this presents a challenge when every cloud provider has its own dashboard, access controls and processes. Coalescing access records from multiple cloud providers becomes an arduous task, and it is nearly impossible to have the visibility and context needed to quickly identify and analyze attacks.

The complexities extend much further than access management. Working with multi-cloud providers requires federated single sign-on, onboarding and off-boarding users, authentication, authorization and privileged account management. Additionally, automated controls with policy-based and centralized orchestration are needed to govern identity across multi-clouds.

Visibility across an organization’s entire ecosystem of on-prem, multi-clouds and hybrid environments must apply to all users. This includes business users accessing SaaS, like Salesforce and Office 365. It applies to privileged users and DevOps team members building and working with apps within IaaS and PaaS environments, like AWS, Azure and Google Cloud, where access is typically accomplished through PAM or native DevOps management systems.

These independent cloud platforms represent different siloed access solutions, each with their own unique login methods.

Securing data is difficult to accomplish with insufficient visibility across complex multi-cloud and hybrid environments. The problems associated with siloed access to multi-clouds are many, including potential data breaches from hackers and insider threats. Misconfigurations and insufficient change controls, insecure interfaces and APIs, and inadequate credential and key management all undermine security.

Seamless and robust visibility into multi-clouds requires a single identity platform with an orchestration layer that unifies access to all cloud services. This would include management and control of access, authentication, authorization and governance for business users, privileged users and DevOps secrets.

IT leaders seeking simplified and actionable cloud visibility require a common identity fabric that abstracts siloed cloud access services. This provides a unified view of multi-cloud platforms, on-premises infrastructure and the identities of business, privileged and DevOps users, all within a single pane of glass. Corporate policies can be applied once and enforced consistently and automatically across all cloud environments. This is how enterprise IT and DevOps teams can ensure all of their user identities, applications and systems are protected.
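As a simplified sketch of the ‘define once, enforce everywhere’ idea behind an identity fabric, the following Python outline pushes one common policy through per-cloud adapters; the adapter classes and policy fields are hypothetical and do not represent any vendor’s actual API:

```python
# One corporate access policy, translated into each provider's native controls.
from dataclasses import dataclass

@dataclass
class AccessPolicy:
    require_mfa: bool
    max_session_hours: int
    allowed_roles: tuple

class CloudAdapter:
    """Base adapter; each provider implements its own translation."""
    def apply(self, policy: AccessPolicy) -> None:
        raise NotImplementedError

class AwsAdapter(CloudAdapter):
    def apply(self, policy: AccessPolicy) -> None:
        print(f"AWS: MFA={policy.require_mfa}, roles={policy.allowed_roles}")

class AzureAdapter(CloudAdapter):
    def apply(self, policy: AccessPolicy) -> None:
        print(f"Azure: MFA={policy.require_mfa}, session<{policy.max_session_hours}h")

corporate_policy = AccessPolicy(require_mfa=True, max_session_hours=8,
                                allowed_roles=("analyst", "dev"))

# Define once, enforce consistently across every connected cloud.
for adapter in (AwsAdapter(), AzureAdapter()):
    adapter.apply(corporate_policy)
```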

Kunal Agarwal, CEO, Unravel Data

We’ve built our DataOps platform to address cost, performance and reliability concerns, for every cloud platform. It interfaces nicely with governance and security concerns such as those described above. So we discuss these issues with customers all the time.

So this question about cloud access is valuable, but it’s tactical. It’s part of a bigger question: how can IT leaders have better visibility over access to all of an organization’s data processing tools and all of its data, whether the code and data involved are on-premises, in any cloud or in transit?

You should always have access to your current and budgeted spending for all your data services. We think of this as a DataOps issue. DataOps is the intersection between data engineering, data processing and operations. It’s like DevOps but it starts with data flows first.

When software is in development and being run against test data, or copies of existing data, it can more easily be kept controlled. But when software and data are used in production, you get operational concerns.

These questions gain a sharper edge in the cloud because it’s new, so let me answer this directly. Governance controls need to be implemented and maintained at the technical level in order to meet a wide variety of concerns, including security of code, data and metadata. Metadata leaks, at very large scale, have been hugely embarrassing and costly to some of the companies that have suffered them.

Cost concerns also come up forcefully in the cloud. On-premises, costs are largely sunk – you paid for a certain number of servers, within a deliberative process and any large expenditure is closely scrutinized. The cloud is ‘pay as you go’; if you turn that around, it means, ‘as you go, you pay.’ When you’re working at scale, it’s easy to run up a six-figure bill in the cloud in just a couple of days, even if you’re just testing a new service on production-type volumes of data.

We have careful performance and resource use tracking in the Unravel data platform. As people have moved to cloud, we are adding in explicit cost reporting and controls. You can put a ‘stop loss’ on cloud spending for a job – for example, alerting you, or even pausing a job if spending on a workload reaches, say, four figures in cost.
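The ‘stop loss’ concept can be sketched in a few lines of Python. This is purely illustrative and not Unravel’s actual API; get_workload_spend() and pause_workload() are hypothetical stand-ins for whatever cost reporting and job control a given platform exposes:

```python
# Illustrative cost watchdog: alert at one threshold, pause the job at another.
import time

SPEND_ALERT_USD = 5_000     # alert threshold (hypothetical)
SPEND_PAUSE_USD = 10_000    # hard stop threshold (hypothetical)

def get_workload_spend(workload_id: str) -> float:
    """Return current accrued cloud spend for a workload (stubbed)."""
    return 0.0  # replace with the platform's cost-reporting call

def pause_workload(workload_id: str) -> None:
    """Pause the workload (stubbed)."""
    print(f"Pausing {workload_id}")

def watch(workload_id: str, interval_s: int = 300) -> None:
    while True:
        spend = get_workload_spend(workload_id)
        if spend >= SPEND_PAUSE_USD:
            pause_workload(workload_id)       # stop loss: halt the job
            break
        if spend >= SPEND_ALERT_USD:
            print(f"ALERT: {workload_id} has spent ${spend:,.0f}")
        time.sleep(interval_s)
```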

So it may be that moving some workloads to the cloud is exposing the fact that you didn’t have the controls needed – which, perhaps, you should have had – on-premises, as well. Take a holistic approach to your entire data and processing estate. ‘Solve the problem’ across your organization, rather than playing ‘whack-a-mole’ as questions get asked, or problems arise. You’ll sleep better at night and so will your customers and other stakeholders.

JG Heithcock, GM of Retrospect, a StorCentric Company

The cloud has long been viewed by IT leaders as indispensable, as it provides the ability to access and combine a virtually limitless expanse of IT infrastructure and software, in order to meet almost any IT, business and budgetary requirement.

More recently, in the face of the on-going COVID-19 pandemic, organizations around the world experienced an almost overnight shift of tens of millions of workers from an on-site to work from home (WFH) scenario (a paradigm that will likely endure for years, if not indefinitely). For IT leaders, this meant that enabling convenient, continuous and secure remote access to the central data center, as well as to cloud resources, became priority number one just as quickly. 

While there are countless uses and benefits of the cloud, topping the list for many is ensuring data protection and Business Continuity. Certainly, over the past year, as ransomware and other cyberattacks became increasingly prevalent, driven by bad actors hoping to exploit WFH security vulnerabilities, the need to fortify data protection and Business Continuity grew exponentially in importance.

For most IT leaders, a clear backup strategy enabled them to sleep at night regardless of their security strategy. This is because they knew secure and clean copies existed that would keep their business running and enable them to avoid paying an exorbitant ransom in the case of data loss, a hack or a disaster.

And, this is where the cloud comes in. A popular backup method is the 3-2-1 backup rule, which specifies that a minimum of three different copies of data be saved across multiple locations to help organizations quickly and easily recover data and avoid disruption. With this method, at a minimum data should be stored on the computer, on local storage and on offsite storage. While this third copy can be tape or a remote data center, the cloud has emerged as perhaps the fastest, easiest, safest and most affordable option.
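A bare-bones sketch of the 3-2-1 idea might look like the following, with the working copy on the machine, a second copy on local storage and a third copy pushed offsite to cloud object storage; the paths and bucket name are hypothetical:

```python
# Three copies in three places: live data, local backup and an offsite cloud copy.
import shutil
import boto3

source_dir = "/home/user/projects"       # 1st copy: the live data
local_backup = "/mnt/backup/projects"    # 2nd copy: local storage

# Archive the source and keep a copy on local backup storage
archive = shutil.make_archive("/tmp/projects-backup", "zip", source_dir)
shutil.copy2(archive, local_backup)

# 3rd copy: push the same archive offsite to a cloud bucket
s3 = boto3.client("s3")
s3.upload_file(archive, "example-offsite-backups", "projects-backup.zip")
```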

The question then becomes how to choose the ideal backup solution to get your data to where it needs to be. My advice would be to first choose a solution that connects to virtually any cloud storage provider, anywhere in the world, to avoid vendor lock-in and ensure you are able to obtain the capabilities and pricing that makes the most sense for you, your users and your applications.

The backup solution should not only back up from your data center but also be capable of moving data from one cloud to another, and back again. Certainly, security is critical in the backup solution as well. Look for providers that offer AES-256 encryption in transit and at rest, so that only those who are approved can access the backups. And of course, time is money. Choose a solution that provides fast upload speeds; one that can saturate any connection with multiple simultaneous backups or restores is ideal.
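For the encryption-at-rest point, the sketch below shows client-side AES-256-GCM encryption of a backup archive before it leaves the site, using the widely available ‘cryptography’ package; the file names are hypothetical and real key management (KMS/HSM, rotation) is out of scope here:

```python
# Encrypt a backup archive with AES-256-GCM before uploading it anywhere.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # 256-bit key; store it securely
aead = AESGCM(key)
nonce = os.urandom(12)                      # unique nonce per encryption

with open("projects-backup.zip", "rb") as f:
    plaintext = f.read()

ciphertext = aead.encrypt(nonce, plaintext, None)

# Store the nonce alongside the ciphertext; both are needed to decrypt
with open("projects-backup.zip.enc", "wb") as f:
    f.write(nonce + ciphertext)
```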
