Definition

edge computing

Edge computing is a distributed information technology (IT) architecture in which client data is processed at the periphery of the network, as close to the originating source as possible. The move toward edge computing is driven by mobile computing, the decreasing cost of computer components and the sheer number of networked devices in the internet of things (IoT).

Depending on the implementation, time-sensitive data in an edge computing architecture may be processed at the point of origin by an intelligent device or sent to an intermediary server located in close geographical proximity to the client. Data that is less time-sensitive is sent to the cloud for historical analysis, big data analytics and long-term storage.

How does edge computing work?

One simple way to understand the basic concept of edge computing is to compare it to cloud computing. In cloud computing, data from a variety of disparate sources is sent to a large, centralized data center that is often geographically far from the source of the data. This is why the name "cloud" is used -- data gets uploaded to a monolithic entity far from the source, like evaporation rising to a cloud in the sky.

[Image: Edge-to-cloud layers. Unlike cloud computing, edge computing allows data to exist closer to the data sources through a network of edge devices.]

By contrast, edge computing is sometimes called fog computing. The word "fog" is meant to convey the idea that the advantages of cloud computing should be brought closer to the data source. (In meteorology, fog is simply a cloud that is close to the ground.) The benefits of the cloud are brought closer to the ground (data source) and are spread out instead of centralized.

The name "edge" in edge computing is derived from network diagrams; typically, the edge in a network diagram signifies the point at which traffic enters or exits the network. The edge is also the point at which the underlying protocol for transporting data may change. For example, a smart sensor might use a low-latency protocol like MQTT to transmit data to a message broker located on the network edge, and the broker would use the hypertext transfer protocol (HTTP) to transmit valuable data from the sensor to a remote server over the Internet.

Edge infrastructure places computing power closer to the source of data. The data has less distance to travel, and more places to travel to than in a typical cloud infrastructure. Edge technology -- sometimes called edge nodes -- is established at the periphery of a network and may take the form of edge servers, edge data centers or other networked devices with compute power. Instead of sending massive amounts of data from disparate locations to a centralized data center, smaller amounts are sent to the edge nodes to be processed and returned, only being sent along to a larger remote data center if necessary.

Why does edge computing matter?

Edge computing is important because the amount of data being generated keeps growing, and so does the array of devices that generate it. The rise of IoT and mobile computing contributes to both factors.

Transmitting massive amounts of raw data over a network puts a tremendous load on network resources. It can also be difficult to process and maintain that volume of data when it is gathered from disparate sources. Data quality and data types may vary significantly from source to source, and many resources are used to funnel it all to one centralized location. An edge network architecture can ease this strain on resources by decentralizing processing and handling the generated data closer to the source.

In some cases, it is much more efficient to process data near its source and send only the data that has value over the network to a remote data center. Instead of continually broadcasting data about the oil level in a car's engine, for example, an automotive sensor might simply send summary data to a remote server on a periodic basis. A smart thermostat might only transmit data if the temperature rises or falls outside acceptable limits. Or an intelligent Wi-Fi security camera aimed at an elevator door might use edge analytics and only transmit data when a certain percentage of pixels significantly change between two consecutive images, indicating motion.
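A minimal sketch of the camera example: the edge device compares two consecutive frames and only reports upstream when enough pixels change. The thresholds, frame shapes and the decision to print rather than transmit are illustrative assumptions, not details from the article.

```python
# Sketch: edge analytics that detects motion from pixel changes between frames.
import numpy as np

PIXEL_DELTA = 25          # per-pixel intensity change considered "significant"
CHANGED_FRACTION = 0.05   # report only if more than 5% of pixels changed

def motion_detected(prev_frame: np.ndarray, curr_frame: np.ndarray) -> bool:
    """Return True when a significant share of pixels changed between frames."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    changed = np.count_nonzero(diff > PIXEL_DELTA)
    return changed / diff.size > CHANGED_FRACTION

# Example with synthetic 8-bit grayscale frames:
prev = np.zeros((480, 640), dtype=np.uint8)
curr = prev.copy()
curr[200:300, 300:400] = 200   # simulate an elevator door opening
if motion_detected(prev, curr):
    print("Motion detected -- send this frame to the data center")
```

The same pattern covers the thermostat example: compare a reading against acceptable limits locally and transmit only when the check fails.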

Benefits of edge computing

A major benefit of edge computing is that it improves time to action and reduces response times to milliseconds, while also conserving network resources. Some specific benefits of edge computing are:

  • Increases capacity for low-latency applications and reduces bottlenecks.
  • Enables more efficient use of IoT and mobile computing.
  • Enables 5G connectivity.
  • Enables real-time analytics and improved business intelligence (BI) insights by using machine learning and smart devices within the edge computing environment.
  • Enables quick response times and more accurate processing of time-sensitive data.
  • Centralizes management of devices by giving end users more access to data processes, allowing for more specific network insight and control.
  • Increases availability of devices and decreases strain on centralized network resources.
  • Enables data caching closer to the source using content delivery networks (CDNs); a minimal caching sketch follows this list.
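As referenced in the last item, here is a toy illustration of edge-side caching: a node near users keeps recently fetched content for a short time to live (TTL) so repeated requests are served locally instead of traveling to the remote origin. The fetch_from_origin() helper and the 60-second TTL are hypothetical placeholders.

```python
# Sketch: a tiny TTL cache at an edge node that shields a distant origin server.
import time

CACHE_TTL_SECONDS = 60
_cache = {}  # path -> (timestamp, body)

def fetch_from_origin(path):
    """Stand-in for an expensive request to a distant origin server."""
    return f"content for {path}".encode()

def get(path):
    """Serve from the edge cache when fresh; otherwise refresh from origin."""
    now = time.time()
    cached = _cache.get(path)
    if cached and now - cached[0] < CACHE_TTL_SECONDS:
        return cached[1]              # cache hit: no trip to the origin
    body = fetch_from_origin(path)
    _cache[path] = (now, body)
    return body

print(get("/index.html"))  # miss -> fetched from origin
print(get("/index.html"))  # hit  -> served from the edge cache
```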

Challenges of edge computing

Despite its benefits, edge computing is not expected to replace cloud computing entirely. While it can reduce latency and network bottlenecks, edge computing can pose significant security, licensing and configuration challenges.

  • Security challenges: Edge computing's distributed architecture increases the number of attack vectors: the more intelligence an edge client has, the more vulnerable it becomes to malware infections and security exploits. Edge devices are also likely to have less physical security than a traditional data center.
  • Licensing challenges: Smart clients can have hidden licensing costs. While the base version of an edge client might initially have a low ticket price, additional functionalities may be licensed separately and drive the price up.
  • Configuration challenges: Unless device management is centralized and extensive, administrators may inadvertently create security holes by failing to change the default password on each edge device or by neglecting to update firmware consistently, causing configuration drift.

Edge computing use cases and examples

Edge computing can benefit remote office/branch office (ROBO) environments and organizations that have a geographically dispersed user base. In such a scenario, intermediary micro data centers or high-performance servers can be installed at remote locations to replicate cloud services locally, improving performance and enabling a device to act on perishable data in fractions of a second. Depending on the vendor and technical implementation, the intermediary may be referred to by one of several names, including edge gateway, base station, hub, cloudlet or aggregator.

Another edge computing use case is in the mining industry, where network service and the ability to transmit data to a centralized location for processing vary greatly. Edge computing, paired with artificial intelligence and IoT, allows data processing to take place on the devices used in the mine, giving workers access to real-time insights even in extreme conditions.

Edge computing, IoT and 5G possibilities

5G requires mobile edge computing, largely because 5G relies on a far greater number of network nodes than 4G, which relied on larger, centralized cell towers. 5G's frequency bands generally travel shorter distances and are weaker than 4G's, so more nodes are required to pass the signal between them and mobile service users. More nodes mean a higher likelihood that one may be compromised, so centralized management of these nodes is crucial. The nodes must also be able to collect and process real-time data that enables them to better serve users.

Edge computing is being driven by the proliferation and expansion of IoT. With so many networked devices, from smart speakers to autonomous vehicles to the industrial internet of things (IIoT), there needs to be processing power close to or even on those devices to handle the massive amounts of data they constantly generate.

IoT provides the devices and the data they collect, 5G provides a faster, more powerful network for that data to travel on, and edge computing provides the processing power to handle and make use of that data in real time. The combination of these three makes for a tightly networked world that could make newer technologies like autonomous vehicles more feasible for public use. Together, the proliferation of 5G, IoT and edge computing could also make smart cities more feasible in places where they are not today.

This was last updated in July 2020
