The rise of the IoT
IoT devices generate a tremendous amount of data, changing the way information is uploaded and shared. Traditionally, the cloud has played a pivotal role in managing the data churned out by IoT devices. However, as the distance between the cloud and IoT devices increases, so does transmission latency. Consequently, edge computing, which shortens this distance by situating an edge server between the cloud and the devices, has been championed as a way to tackle this hurdle. As a result, many are wondering whether edge computing is weeding out the cloud.
The need for decentralization
One reason many suspect edge computing is weeding out the cloud is the need among IoT devices to process data quickly. Edge computing accomplishes this by processing data close to its source rather than in a centralized location, as is the case with cloud computing. The approach involves dividing processing between the centralized system and the device.
Take smart traffic lights as an example, which have to process data to determine which color to display. A smart traffic light needs to process the criteria it is fed immediately, so that whatever response it is considering is still pertinent once executed. Edge computing can, in theory, meet this need by processing data near the source, such as inside the light itself. Autonomous vehicles are another common example of IoT systems that must have computing capabilities built into them, as the sketch below illustrates.
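To make that division of labor concrete, here is a minimal Python sketch. All names, thresholds, and the sensor-reading format are illustrative assumptions, not part of any real traffic-light system: latency-critical work is handled on the device, while everything else is queued for the cloud.

```python
import queue
from typing import Optional

# Hypothetical latency budget (ms): work that must finish inside this
# window is handled on the device; everything else goes to the cloud.
LATENCY_BUDGET_MS = 50

# Readings destined for the cloud are batched here for later upload.
cloud_upload_queue: "queue.Queue[dict]" = queue.Queue()

def decide_light_phase(reading: dict) -> str:
    """Toy local rule: pick the light phase from a sensor reading."""
    return "green" if reading["vehicles_waiting"] > 5 else "red"

def handle_reading(reading: dict) -> Optional[str]:
    """Route latency-critical work to the edge; defer the rest."""
    if reading["deadline_ms"] <= LATENCY_BUDGET_MS:
        # Edge path: act immediately, near the data source.
        return decide_light_phase(reading)
    # Cloud path: no tight deadline, so queue it for upload and analysis.
    cloud_upload_queue.put(reading)
    return None

if __name__ == "__main__":
    urgent = {"vehicles_waiting": 8, "deadline_ms": 40}
    routine = {"vehicles_waiting": 2, "deadline_ms": 60_000}
    print(handle_reading(urgent))      # handled at the edge -> "green"
    handle_reading(routine)            # queued for the cloud
    print(cloud_upload_queue.qsize())  # 1
```

The point of the split is that the deadline, not the data itself, decides where the processing happens: the same reading could go either way depending on how quickly a response is needed.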
What is left for the cloud?
This doesn’t mean there is nothing left for the cloud to process, however. While it is true that issues related to network performance need to be addressed, abandoning the cloud entirely is an extreme countermeasure. The cloud can still be used to provide various services, including media and entertainment. It can also be used to store and share data produced by IoT devices so that stakeholders can review it.
Moreover, from the viewpoint of data centers, building many smaller data centers or disaster recovery sites has the potential to decrease economies of scale and make operations less efficient. And although edge storage can reduce latency issues, pertinent data stored on edge devices may be lost if a device is corrupted. Therefore, it may be beneficial for companies to also store data somewhere other than the edge of the network. Over a wide-area network (WAN), for example, it is possible to build data centers at a distance from each other without incurring prohibitive network latency. Latency and its impact can be significantly mitigated regardless of where the data is located.
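As a minimal illustration of that idea, the sketch below (the paths and names are hypothetical) copies an edge device's local buffer to a second location, so that a corrupted device does not take the only copy of its data with it:

```python
import shutil
from pathlib import Path

# Illustrative paths only: a local buffer on the edge device and a
# stand-in for remote (cloud or distant data center) storage.
EDGE_BUFFER = Path("/var/edge/readings.jsonl")
REMOTE_STORE = Path("/mnt/cloud-replica/readings.jsonl")

def replicate_edge_buffer() -> None:
    """Copy the edge device's local buffer to a second location.

    If the edge device is later corrupted, the replica preserves the
    pertinent data. In production this copy would travel over the WAN
    to a cloud object store rather than to a mounted path.
    """
    if EDGE_BUFFER.exists():
        REMOTE_STORE.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(EDGE_BUFFER, REMOTE_STORE)
```

Run on a schedule, this kind of replication keeps the edge fast for local decisions while the durable copy lives off-device.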
Complementary approaches
Despite the opportunities opened up by edge computing, many companies and industries will keep using the cloud for years to come. Rather than one weeding out the other, cloud and edge computing can be regarded as two complementary data management approaches. With a hybrid approach, each method can compensate for the other's blind spots, allowing enterprises to reap the benefits of both worlds.