How Edge Computing Compares with Cloud Computing

lamisha
« on: July 07, 2019, 09:56:15 AM »
Edge computing is a form of cloud computing, but unlike traditional cloud architectures that centralize compute and storage in a single data center, edge computing pushes data processing out to the edge devices themselves. Only the results of that processing need to be transported over the network. In many situations, this delivers results with lower latency and consumes far less network bandwidth.
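To make the bandwidth difference concrete, here is a minimal sketch (the sensor values, sample rate, and summary fields are illustrative assumptions, not part of any real deployment): a cloud-style design ships every raw sample upstream, while an edge-style design processes the samples locally and ships only a small summary.

```python
import json

# Assumed workload: one hour of temperature samples at 1 Hz.
samples = [20.0 + (i % 10) * 0.1 for i in range(3600)]

# Cloud-style: transport every raw sample to the central data center.
raw_payload = json.dumps(samples)

# Edge-style: process locally on the device, transport only the result.
summary = {
    "min": min(samples),
    "max": max(samples),
    "mean": round(sum(samples) / len(samples), 2),
}
edge_payload = json.dumps(summary)

print(len(raw_payload), "bytes raw vs", len(edge_payload), "bytes summarized")
```

The raw payload runs to thousands of bytes per hour per sensor, while the summarized payload is a few dozen bytes, which is the gap the post is describing.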

The internet of things is the most common use case for edge computing. IoT is all about collecting data from geographically dispersed areas using edge sensors. Those sensors connect over a data network that often leverages WAN technologies such as MPLS, cellular, and VPNs. In a traditional IoT architecture, all collected sensor data is transported to a central repository, where it is combined and processed collectively. That works well when the data must be collected and analyzed cumulatively. But what if combining the data isn't necessary to get the desired results? What if each IoT sensor simply needs to process the data it collects and send results only when certain conditions are met?

This is where we start to see the benefits of edge computing. If there's no real need to collect all the data in a centralized cloud repository, it makes no sense to waste expensive bandwidth transporting it. In fact, a perfectly valid IoT design is one where sensors connect to the cloud only when they have something important to report. That design opens the door to lowering IoT networking costs, for example by using cellular plans with lower-cost, pay-per-kilobit billing rather than more expensive always-on connectivity.
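The "report only when something important happens" pattern above can be sketched as a simple threshold filter running on the sensor itself. The threshold value and the readings here are made-up placeholders; a real deployment would set them from its own requirements.

```python
# Hypothetical report-by-exception loop: the sensor evaluates each
# reading locally and transmits only the ones that cross a threshold,
# instead of streaming every reading to the cloud.
ALERT_THRESHOLD = 75.0  # assumed alert limit for this sketch

def filter_readings(readings, threshold=ALERT_THRESHOLD):
    """Return only the readings worth sending upstream."""
    return [r for r in readings if r > threshold]

readings = [70.1, 70.3, 69.8, 76.2, 70.0, 80.5]
to_send = filter_readings(readings)
print(f"sending {len(to_send)} of {len(readings)} readings: {to_send}")
# prints: sending 2 of 6 readings: [76.2, 80.5]
```

On a pay-per-kilobit cellular plan, every reading that stays on the device is bandwidth (and money) saved, which is exactly the cost argument the paragraph makes.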