As more and more organisations look to implement edge computing strategies, it’s important that they ensure their infrastructure is future-proofed for the technology. Olivier Alquier, Head of Enterprise, Europe, at CommScope, talks us through five points that service providers ought to consider before moving to the edge.
Currently sitting at the peak of Gartner’s hype cycle alongside other hot technology trends such as IoT and Machine Learning, edge computing is attracting significant interest from businesses across a range of sectors.
According to Futurum Research’s ‘Edge Computing Index: From Edge to Enterprise’, nearly three quarters (72%) of organisations have already implemented an edge computing strategy, or are in the process of doing so, and 93% of those that aren’t intend to invest in the technology over the next year.
The popularity of edge computing is hardly surprising given the increasing demand for a better customer experience and for improved latency as networks continue to grow and become more widely distributed.
Indeed, meeting this demand has a direct effect on an organisation’s bottom line. Amazon, for example, found that every 100ms of latency cost the company 1% in sales, while Google saw traffic fall by 20% for every extra half-second of search page generation time.
Building an edge data centre is not an exercise to be undertaken lightly, however, and requires significant planning and preparation.
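The scale of those figures is worth making concrete. As a minimal sketch, the following applies the 1%-per-100ms rate quoted above to a hypothetical baseline revenue (the annual revenue figure is an assumption for illustration, not from the article):

```python
def latency_revenue_loss(annual_revenue, added_latency_ms, loss_per_100ms=0.01):
    """Estimate annual revenue lost for a given amount of added latency.

    Uses the linear 1%-of-sales-per-100ms rate cited in the article;
    real-world sensitivity will vary by business and traffic profile.
    """
    return annual_revenue * loss_per_100ms * (added_latency_ms / 100)

# 300ms of extra latency on a hypothetical $1bn of annual sales:
loss = latency_revenue_loss(1_000_000_000, 300)
print(f"${loss:,.0f}")  # → $30,000,000
```

Even modest latency regressions compound into material revenue at scale, which is the core economic argument for pushing compute closer to the user.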
Here are five points we’d suggest service providers ought to consider before moving to the edge.
Location, location, location
When it comes to geographical area and the site’s physical characteristics, location is everything. It’s worth considering, for example, whether the data centre is located close enough to customers to ensure minimal latency and the best possible experience. It’s important too to bear in mind the impact that relevant regional data privacy regulations will have on an organisation and to ensure that each site is compliant.
There are several factors to bear in mind with regard to the physical building, such as whether there is sufficient square footage for the number of racks and cabinets required and whether the space will allow for future expansion. The building’s existing infrastructure will require scrutiny; if it’s not found to be fit for purpose, the entire building may need to be retrofitted.
Power planning
The idea of power planning for a data centre may appear obvious, but edge data centres have very specific needs. Power redundancy, for instance, tends to be a given for traditional data centres. At the edge, however, it can often be too expensive or, in some cases, simply unavailable.
Ideally, power should enter a facility via a number of different points. Providers should therefore consider whether that facility could be serviced by more than one utility feed. At the same time, though, it’s important to plan for the worst. Should a power outage occur, back-up generators should be able to support the data centre for at least 48 hours.
Keeping cool
While heating, ventilation and air-conditioning (HVAC) is essential to the smooth running of a data centre, it is also one of the biggest consumers of power, currently accounting for around half of a data centre’s total power use. It’s important, therefore, that service providers find ways to make cooling more efficient and cost-effective.
Free-cooling or hot-aisle/cold-aisle designs, for example, are simple, cost-effective means of controlling the temperature within a facility, while temperature sensors on racks are an efficient means of monitoring it.
Safety and security
Every edge data centre must be designed to protect its valuable infrastructure and the data it holds. While cybersecurity dominates the conversation around the subject nowadays, it’s important not to overlook physical security measures such as biometrics, used in addition to key cards and more traditional means of identification.
Fire safety is another key consideration and must be approached from a data centre-specific perspective. To prevent specialist equipment suffering water damage, for example, inert gas-based suppression systems should be installed in place of traditional sprinkler systems.
Forward to the future
It is essential to ensure that any data centre is ready to deal with whatever the future may hold. After all, with the first commercial rollouts of 5G happening this year and all that this entails, the volume of data due to be processed at the edge is expected to be huge.
Service providers looking to embrace the cloud and virtualisation should prepare for any eventuality and design their physical layer infrastructure to accommodate a number of upgrades over its first three to five years.
The points outlined above offer an overview of just some of the considerations service providers should bear in mind when building an edge data centre.
Ultimately though, with demands around latency, bandwidth and customer experience growing – and showing no sign of slowing in the foreseeable future – the key is to accommodate today’s needs and prepare for tomorrow’s.