How edge computing will shake up IT

Over the last few years, edge computing has come to the fore as organisations seek to overcome the problems of latency that can bedevil cloud computing deployments.

It’s clearly set to grow massively. The worldwide edge computing market was estimated to be around $4 billion in 2020 and is forecast to grow to $17.8 billion by 2026, according to research firm StrategyR.

Why choose edge and when?

So, why should CIOs investigate edge computing? Put simply, as companies scale their computing power, that power tends to be concentrated in central locations. This concentration puts stress on networking capacity and creates problems for infrastructure architecture.

Edge computing overcomes these problems. It places processing power near the “edge” of a network so that data can be processed locally, rather than needing to be transmitted over a wide area network. This differs from cloud computing, where data is routed to a central location. Edge computing instead localises data, reducing data transfer and improving response times.
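To make this concrete, the sketch below (Python, purely illustrative) shows an edge node summarising a batch of sensor readings locally and forwarding only a compact summary upstream; the send_to_cloud stub and the alert threshold are hypothetical placeholders rather than any particular product’s API.

```python
# Illustrative sketch only: process sensor readings at the edge and send just a
# summary upstream. send_to_cloud() is a hypothetical stand-in for whatever
# uplink a real deployment would use.

def send_to_cloud(summary):
    # Placeholder uplink: a real deployment would post this to a central service.
    print(f"uploading summary: {summary}")

def summarise(readings):
    """Reduce a batch of raw readings to a compact summary."""
    return {
        "count": len(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }

def process_batch(readings, alert_threshold=80.0):
    summary = summarise(readings)            # raw data is analysed locally
    if summary["max"] > alert_threshold:     # only a small summary leaves the site
        send_to_cloud(summary)
    return summary

# Example: a batch of temperature readings handled entirely at the edge.
process_batch([71.2, 69.8, 84.5, 70.1])
```

The point is simply that only a few bytes of summary cross the wide area network instead of the full raw stream.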

This last part is important. For example, driverless cars can’t wait the milliseconds it takes to send data to a distant data centre before deciding whether to stop at a red light. Heart monitoring systems need to maintain a connection with the devices that record a patient’s heart activity to know whether that person is stable or in distress. Point-of-sale systems need to process card transactions even if the connection to head office is offline or poor. And if there is a gas leak at a factory while the network connection is unavailable, will that pollution go unnoticed?

Sending data back to the cloud would not be practical – there would be too much latency and not enough bandwidth to send all the data in time. Any delay in processing and making decisions could have potentially devastating consequences.

Factors to consider

If your organisation needs to overcome latency and bandwidth issues, then taking computing to the edge may well be the solution. But there can be several issues to consider.

For a start, it won’t make your cloud or centralised data centre redundant overnight. Edge computing is a complementary technology that plays to its strengths when used as near as possible to the data it needs to process.

Security - Security will be a factor in any edge deployment. Data is collected and processed close to the sensors that create it, where it may be more vulnerable to hacking. Strong end-to-end security, from the sensor to the data centre, is needed to mitigate these threats (a minimal example of one such safeguard appears after this list).

Interoperability - This is essential, as edge computing deployments will differ in terms of sensors, devices, and connectivity. Sensors on the factory floor will have very different needs from heart monitoring systems in hospitals. Organisations will need an edge provider that can help not only implement endpoint sensors but also connect them to the cloud and secure them.

Support – Data centres at the edge will be in more geographically diverse locations, making maintenance more difficult. The management systems supporting edge computing need to be highly automated and orchestrated.
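As a simple illustration of the end-to-end security point above, the Python sketch below signs each sensor payload with a shared key so the receiving data centre can detect tampering in transit. It uses only the standard library; the key shown is a placeholder, and key provisioning and transport encryption (such as TLS) are left out of scope.

```python
# Minimal sketch of one end-to-end safeguard: the edge node signs every payload
# with a shared key so the central data centre can verify its integrity.

import hashlib
import hmac
import json

SHARED_KEY = b"replace-with-a-provisioned-device-key"   # placeholder only

def sign_payload(payload: dict) -> dict:
    body = json.dumps(payload, sort_keys=True).encode()
    signature = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    return {"body": body.decode(), "signature": signature}

def verify_payload(message: dict) -> bool:
    expected = hmac.new(SHARED_KEY, message["body"].encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, message["signature"])

# Example: a reading signed at the edge and verified centrally.
message = sign_payload({"sensor": "gas-01", "ppm": 412})
assert verify_payload(message)
```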

Overcoming the challenges and operating an edge computing set-up

Computing at the edge means deploying sensors, devices, and servers at remote locations, yet it must remain a completely integrated part of an organisation’s IT infrastructure.

The capabilities needed at the edge are similar to those in traditional IT infrastructure. But that is not all – the infrastructure may be placed in locations exposed to the elements, so devices and systems may need to be ruggedised or given special packaging, and physical security will also be required.

Edge applications, while business critical, will be located remotely. This raises the bar for reliability and calls for specialised maintenance planning.

Infrastructure at the edge may not always have a direct connection to the enterprise network and may need to rely on third-party connectivity such as 5G. Edge sensors may be located where connectivity is unreliable, so applications at the edge need to keep running even when there is no connection to the main network.
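One common pattern for coping with intermittent connectivity is store-and-forward buffering. The sketch below is a minimal illustration in Python, assuming a hypothetical uplink object with is_connected() and send() methods; it is not tied to any particular edge product.

```python
# Minimal store-and-forward sketch: readings are queued locally and flushed
# whenever connectivity returns, so the edge application keeps working offline.

import collections

class StoreAndForward:
    def __init__(self, uplink, max_buffer=10_000):
        self.uplink = uplink                                  # hypothetical transport client
        self.buffer = collections.deque(maxlen=max_buffer)    # oldest data dropped if full

    def record(self, reading):
        self.buffer.append(reading)       # always accepted, even while offline
        self.flush()

    def flush(self):
        while self.buffer and self.uplink.is_connected():
            reading = self.buffer[0]
            if self.uplink.send(reading): # assumed to return True on success
                self.buffer.popleft()
            else:
                break                     # try again on the next flush
```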

Changing the system architecture to accommodate edge computing

With edge computing, it will be necessary to create a system architecture that fulfils both user and application requirements. CIOs need to understand which parts of the system run in the cloud and which run at the edge. That means there are several issues to consider when designing an architecture that is right for edge computing.

Data response times – How quickly does data need to be analysed at the source to make decisions? Machine controls and similar real-time systems should always run at the edge; for everything else, it is a question of acceptable response times. The further computing resources sit from the edge, the slower data response and analysis times become.

Data transfer and connectivity – What data needs to be processed centrally versus at the edge? If connectivity and bandwidth are limited, more processing needs to be done at the edge. If bandwidth is reliable and fast, that processing can be done centrally in the cloud.

Cybersecurity and regulation - When designing a system architecture that incorporates edge computing resources, CIOs need to ensure that this architecture adheres to cybersecurity best practices and any regulations that govern the data where it is located. CIOs should also implement a zero-trust policy for all systems and devices on the edge, with a least-privileged approach to access control.
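To illustrate the least-privileged, zero-trust idea, the Python sketch below shows a deny-by-default authorisation check: every request from an edge device must carry an explicit scope for the exact action it wants to perform. The device IDs and scopes are invented for the example.

```python
# Minimal sketch of a deny-by-default, least-privileged check for edge devices.
# Unknown devices and unlisted scopes are always rejected.

ALLOWED_SCOPES = {
    "sensor-042": {"telemetry:write"},                     # a sensor may only push readings
    "gateway-007": {"telemetry:write", "config:read"},     # a gateway gets slightly more
}

def authorise(device_id: str, requested_scope: str) -> bool:
    return requested_scope in ALLOWED_SCOPES.get(device_id, set())

assert authorise("sensor-042", "telemetry:write") is True
assert authorise("sensor-042", "config:write") is False    # least privilege in action
```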

How can AI help at the edge?

Artificial intelligence has come on in leaps and bounds, and the cloud has given it massive datasets from which to learn and make better decisions for organisations. As computing moves further to the edge, it makes sense for AI to go there too, because that is where the data is. Simply put, it doesn’t make sense for data such as video or audio streams to be sent to the cloud and back for every endpoint or situation.
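As a rough illustration, the Python sketch below runs inference locally and uploads only high-confidence detections rather than streaming raw video; capture_frame, model and send_event are hypothetical stand-ins for whatever camera interface, model runtime and uplink a real deployment would use.

```python
# Illustrative edge-inference loop: raw frames never leave the device; only
# small, high-confidence events are sent upstream.

def run_edge_inference(capture_frame, model, send_event, confidence_threshold=0.9):
    while True:
        frame = capture_frame()                  # raw data stays on the device
        if frame is None:                        # no more frames available
            break
        result = model.predict(frame)            # local inference, no round trip
        if result["confidence"] >= confidence_threshold:
            send_event({                         # only a compact event leaves the edge
                "label": result["label"],
                "confidence": result["confidence"],
            })
```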

What vendors are doing to give organisations the edge

The move to edge computing has had vendors re-thinking their approaches. Dell, for example, has introduced its VxRail satellite nodes to crack what Michael Dell calls “the new frontier”. These extend the benefits of the core data centre to most edge use cases, giving the edge the same management, security and automation features as the core.

With digital transformations underway in the enterprise, workloads expanding outside traditional data centres and 5G networks proliferating, there is an immediate need for a small-footprint, low-cost, easy-to-manage infrastructure solution. With a standardised architecture such as VxRail, the IT silos typically found are considerably reduced and unintentional architectures (or sprawl) become a thing of the past.

Dell has helped thousands of organisations modernise their infrastructure and deploy edge solutions critical to their businesses.

Learn more about Dell Technologies and VMware solutions: APEX Cloud Services - Cloud-as-a-Service | Dell Technologies United Kingdom