The Advantages and Disadvantages of Edge Computing

November 6, 2020 • Devin Partida


The rise of real-time IoT data analytics has businesses turning to edge computing, which distributes data processing to the network’s edge. The approach can cut latency dramatically and reduce the amount of processing that needs to happen in the cloud. However, there are also some significant disadvantages of edge computing that companies will need to contend with.

How Edge Computing Enables New IoT Innovations

For some in-development IoT technology, like remote surgical robots or self-driving cars, latency is literally a matter of life and death. A big enough processing delay could cause serious harm, or at the very least result in less-than-adequate performance or analysis.

Edge computing tries to solve this problem by distributing the function of the cloud. Rather than rely on a central cloud to handle all requests from IoT devices, businesses can move processing directly onto edge gateways, or onto the devices themselves.

By keeping data on the device that created it, you can analyze information in real time more cost-effectively.

Edge computing is sometimes used in conjunction with fog computing. In fog computing, a company also maintains nodes of processing power closer to the “edge” of the network.  

These edge nodes handle some portion of the processing or storage needed by edge devices. If data needs time-critical analysis or temporary storage that can’t be handled on-device, it will be sent to an edge node. If more intensive and less time-sensitive processing is necessary, data will be sent to the cloud instead.
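The tiered routing described above can be sketched as a simple dispatch rule. This is a hypothetical illustration, not a real API: the latency budget, the `Reading` fields, and the destination names are all assumptions made for the sake of the example.

```python
from dataclasses import dataclass

# Hypothetical latency budget (ms) below which processing must stay near the device.
EDGE_LATENCY_BUDGET_MS = 50

@dataclass
class Reading:
    payload: bytes
    deadline_ms: int      # how quickly a result is needed
    heavy_compute: bool   # needs more processing than the device can offer

def route(reading: Reading, device_can_handle: bool) -> str:
    """Decide where an IoT reading should be processed."""
    if device_can_handle:
        return "device"       # process in place: lowest latency
    if reading.deadline_ms <= EDGE_LATENCY_BUDGET_MS:
        return "edge-node"    # time-critical, but too big for the device
    if reading.heavy_compute:
        return "cloud"        # intensive and not time-sensitive
    return "edge-node"        # default: keep traffic off the cloud

# A time-critical camera frame the device itself can't process:
print(route(Reading(b"frame", deadline_ms=20, heavy_compute=False),
            device_can_handle=False))  # prints "edge-node"
```

The point of the sketch is the ordering of the checks: locality first, then the deadline, then compute intensity, so the cloud is only reached when nothing closer to the edge will do.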

In some applications, it may even be possible to do away with the cloud altogether.

Why Edge Computing Can Be Hard to Implement

Unfortunately, the approach comes with real drawbacks that IoT companies will need to weigh.

IoT devices are notoriously difficult to keep secure, and medium-to-large businesses may be managing fleets of hundreds of devices or more. Without robust security practices, data collected and analyzed on edge devices could be extremely vulnerable to hackers.

With control of an edge device, a hacker could feed other devices bad data, causing them to behave unpredictably. They could even gain access to the business’s main network.

All this, however, is less a risk of edge computing itself and more a risk inherent to IoT technology today. Even without edge computing, IoT devices create significant security challenges.

With edge computing, devices and gateways on the edge may also be working with incomplete data. IoT data-collection solutions often focus on collecting vast stores of data. This is done to provide the clearest possible picture of business operations. 

With edge computing, a device is only guaranteed to have access to a subset of this data. Because some data analysis is handled on-device, some data may not be sent to the enterprise data center, depending on how edge computing was implemented. 

As a result, a business may have an incomplete picture of their operations.
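One common pattern behind this trade-off is on-device aggregation: the device reduces raw readings to a compact summary and forwards only the summary upstream. The sketch below is illustrative; the summary fields are assumptions, not a standard schema.

```python
from statistics import mean

def summarize(samples: list[float]) -> dict:
    """Reduce raw sensor samples to a compact summary for upstream reporting.

    The raw samples are discarded after this step, which is exactly why the
    central data center ends up with an incomplete picture.
    """
    return {
        "count": len(samples),
        "mean": round(mean(samples), 2),
        "max": max(samples),
        "min": min(samples),
    }

raw = [21.0, 21.4, 35.9, 21.1]   # one anomalous spike in the readings
report = summarize(raw)
# The spike's timing and shape are lost; only its magnitude survives in "max".
```

A business that receives only `report` can still spot that something unusual happened, but it cannot reconstruct when or how, which is the "incomplete picture" described above.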

Edge computing can also increase hardware costs at the edge of the network. Real-time data analysis isn’t cheap. If you want a device to process some of the data it collects, you’ll need to spend on compute power and storage at the network edge.

The Edge May Still Be the Future

Edge computing may be inevitable. Some IoT devices simply won’t be safe or useful without near-real-time processing, and cloud round-trip latency can render them unusable. Some combination of edge and fog computing can reduce that latency to an acceptable level.

It’s likely that despite the disadvantages of the tech, companies will start to adopt edge computing over the next few years. As they do, they’ll need to contend with the serious pitfalls of the approach. 

Edge computing may significantly increase security risks. The approach can also make data transparency and availability more difficult to manage.