Edge computing is becoming enormously important in the age of IoT and 5G. It is changing the way data is aggregated, processed, and transferred by millions of devices worldwide. While edge computing was initially adopted to shorten the distances over which data has to travel, more is now at stake: the rapid growth of the Internet of Things (IoT) and the emergence of novel applications that require real-time data are making edge computing systems increasingly important and increasingly in demand.
Edge Computing – Definition
According to Gartner analysts, edge computing is defined as “part of a distributed computing topology where data processing takes place at the edge of the network – where things or people consume the information.”
In other words, edge computing relies on decentralized data processing: instead of resorting to a distant data center, data is processed in the devices that collect it – or at least in close proximity to them. This is done primarily to reduce latency, which brings a significant speed advantage, especially for applications that require near-real-time data. In addition, companies can save on bandwidth, data volume, and cloud storage when data is processed locally. Sensitive data and proprietary algorithms remain on company premises and do not migrate to the cloud. The technology was developed primarily to cope with the exponential growth of IoT devices and the amounts of data they generate.
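To make this concrete, here is a minimal sketch of edge-side preprocessing in Python. The sensor and the uplink are simulated – `read_sensor()` and `send_to_cloud()` are illustrative placeholders, not a real API. Instead of streaming every raw reading to a distant data center, the device aggregates locally and transmits only a compact summary:

```python
import random
import statistics

def read_sensor() -> float:
    """Simulated temperature reading from a local sensor."""
    return 20.0 + random.uniform(-0.5, 0.5)

def send_to_cloud(summary: dict) -> None:
    """Stand-in for the actual upstream transmission."""
    print("uplink:", summary)

def process_window(samples: int = 60) -> None:
    readings = [read_sensor() for _ in range(samples)]
    # Only the aggregate leaves the device: less bandwidth and cloud
    # storage, and the raw data never leaves the premises.
    send_to_cloud({
        "count": len(readings),
        "mean": round(statistics.mean(readings), 2),
        "max": round(max(readings), 2),
    })

if __name__ == "__main__":
    process_window()
```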
---
A wide variety of devices can serve as edge devices – from an IoT sensor on a machine on the factory floor to a smartphone to an internet-connected microwave. Edge gateways are themselves edge devices within an edge computing infrastructure.
Computing at the Network Edge – Advantages & Disadvantages
For many companies, the cost savings alone are reason enough to adopt an edge computing architecture. In particular, companies that moved to the cloud early were often surprised by the actual cost of bandwidth.
The ability to process and store data faster is increasingly perceived as the biggest benefit of edge computing; after all, real-time applications are critical to success for more and more companies. Without edge computing, for example, a smartphone with facial recognition would first have to send the scan data to a cloud instance and then wait for the response. With edge computing, the algorithm can handle the data locally on an edge server or gateway (or on the smartphone itself). Technologies such as virtual and augmented reality, autonomous driving, smart cities, and building automation systems in particular require especially fast data processing.
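The following sketch illustrates the two paths described above: a cloud round trip versus local inference on the device. The latency figures are made-up placeholders chosen only to show the structural difference, not measurements:

```python
import time

CLOUD_RTT_S = 0.120    # assumed network round trip to a distant data center
LOCAL_INFER_S = 0.015  # assumed on-device inference time

def recognize_via_cloud(scan: bytes) -> str:
    # Upload the scan, compute remotely, download the result.
    time.sleep(CLOUD_RTT_S + LOCAL_INFER_S)
    return "match"

def recognize_on_device(scan: bytes) -> str:
    # The model runs on the edge device itself; no round trip.
    time.sleep(LOCAL_INFER_S)
    return "match"

for path in (recognize_via_cloud, recognize_on_device):
    start = time.perf_counter()
    path(b"face-scan")
    print(f"{path.__name__}: {(time.perf_counter() - start) * 1000:.0f} ms")
```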
Improved interconnectivity, as well as new IoT and industry-specific use cases, will make edge computing one of the key growth areas in the server and storage market over the next decade and beyond. Companies have long recognized the increasing demand for data processing at the network edge and are working on new system modules that also incorporate artificial intelligence.
As with any new technology, edge computing also has downsides: from a security perspective, data at the network edge can be problematic, especially when processing involves a variety of devices that may be significantly less secure than centralized systems or cloud instances. It is therefore essential that the IT specialists involved are aware of the potential security risks of IoT devices and secure them accordingly. This includes, for example (a minimal encryption sketch follows the list):
- the encryption of data
- robust access controls
- the use of Virtual Private Networks
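As a sketch of the first item, here is how sensor data could be encrypted on an edge device before it leaves the network, using the symmetric Fernet scheme from Python's `cryptography` package. Key management (generation, distribution, rotation) is out of scope here; in practice the key would come from a secure key store rather than being generated inline:

```python
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # in production: load from a secure key store
cipher = Fernet(key)

reading = b'{"sensor": "temp-01", "value": 21.7}'
token = cipher.encrypt(reading)   # what actually travels over the network
restored = cipher.decrypt(token)  # done by the authorized receiver

assert restored == reading
print(token[:32], b"...")
```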
In addition, the varying computing-power and connectivity requirements of IoT equipment can affect the reliability of edge devices. This makes redundancy and failover management mandatory for devices that process data at the network edge. Only then can it be ensured that all data is transmitted and processed correctly if a single network node fails.
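A minimal failover sketch along these lines is shown below. Both endpoint URLs are hypothetical placeholders; a real deployment would add retries with backoff, local buffering, and health checks:

```python
import urllib.request

ENDPOINTS = [
    "https://edge-primary.example.com/ingest",  # assumed primary node
    "https://edge-backup.example.com/ingest",   # assumed redundant node
]

def transmit(payload: bytes) -> str:
    """Try each endpoint in order; fall back if a node is unreachable."""
    last_error = None
    for url in ENDPOINTS:
        try:
            req = urllib.request.Request(url, data=payload, method="POST")
            with urllib.request.urlopen(req, timeout=5) as resp:
                return f"delivered via {url} (HTTP {resp.status})"
        except OSError as exc:  # covers URLError and timeouts
            last_error = exc    # node failed: fall through to the next one
    raise RuntimeError(f"all endpoints failed: {last_error}")
```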
Edge Computing – The Role of 5G
5G technology is increasingly outgrowing its experimental status worldwide. More and more network operators are rolling out the wireless technology, which promises high bandwidth with low latency. Many carriers are integrating edge computing strategies into their 5G deployments to enable real-time data processing – for example, with a focus on mobile devices as well as connected and autonomous vehicles.
5G is thus becoming a catalyst for edge computing. In their predictions for 2020, Forrester analysts likewise assume that the increasing demand for on-demand computing power and real-time applications will be a key driver for edge computing.