To say cloud services are pervasive would be an understatement. From individuals binge-watching content from OTT video platforms to enterprises deploying SaaS solutions, cloud services have numerous touch points throughout our home and working lives. For years, large, centralized data center and cloud architectures have provided access to infrastructure for these services, but that is rapidly evolving.

Today’s centralized cloud comprises more than 10,000 data centers scattered across the globe, but a new generation of cloud-native applications is arriving across entertainment, retail, manufacturing, and automotive that is more compute-intensive and latency-sensitive than those we are accustomed to. Consequently, the traditional centralized cloud architecture will no longer suffice: it cannot meet the Quality of Experience (QoE) expectations of this new generation of applications and use cases, which require a more dynamic and distributed cloud model.

A rethink of cloud architecture is happening across the industry, as stakeholders realize compute and storage cloud resources must move closer to the edge of the network, where content is created and consumed, to meet the expected QoE. In the next five years, driven by a need to get data and applications closer to end-users, tens of thousands of additional scaled-down data centers will sprout up at the edge of the network to form what is referred to as the Edge Cloud.

So what applications are driving the need for edge cloud?

There are a handful of use cases acting as the main drivers toward edge cloud and accelerating this rethink. Let’s explore a few of these in greater detail, covering both consumer and enterprise applications:

For starters, cloud gaming is touted as the most exciting use case for low-latency edge cloud networks due to its potential to expand the horizons of the already lucrative gaming industry even further. The point of cloud gaming is that gamers no longer require dedicated hardware, such as a traditional gaming console, smartphone or PC, to render the game environment. They continue to play with gaming controllers but connect through a gaming app on their device of choice, with gaming video streamed to that device from a server in an edge compute data center. This gives gamers on the move the flexibility to pick up and play at any time, and it allows gaming companies to increase their addressable market and move the industry toward a subscription-based model.

The comparison is always made: “cloud gaming will do to games consoles what Netflix did to DVDs.” But to match the performance of a console gaming set-up, the network needs to deliver low-latency responses and the bandwidth required for 4K gaming video. This simply must be served from the cloud edge to have any chance of success; otherwise the network won’t keep up, leaving gamers frustrated when their inputs aren’t matched with outputs in real time.

And, of course, there’s the video streaming market, where the numbers have soared off the charts this year as more and more live and recorded video content is streamed by users at home and on mobile devices. Video consumption has been driving the telecoms industry forward for years, but this latest surge in content consumption means there’s an increased need to move content closer to users to improve performance and optimize long-haul networking costs.

The use of Content Delivery Networks (CDNs) and local caching has been a traditional use case for edge storage. Most streaming services cache popular content in the edge data centers closest to where end-users consume it, using AI and predictive analytics to judge which shows should be cached where. When a request for the same content arrives from a different user, it can be served from the local cache, and thus served quicker, instead of being fetched from a server in a centralized data center.
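The cache-hit behavior described above can be sketched as a tiny LRU (least-recently-used) cache. This is purely illustrative: the class and method names are hypothetical, and real CDNs use far more sophisticated placement and eviction policies than plain LRU.

```python
from collections import OrderedDict


class EdgeCache:
    """Minimal LRU cache standing in for an edge data center's content store."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self._store = OrderedDict()  # content_id -> content payload
        self.hits = 0
        self.misses = 0

    def fetch_from_origin(self, content_id: str) -> str:
        # Placeholder for the slow round trip to a centralized data center.
        return f"content-for-{content_id}"

    def get(self, content_id: str) -> str:
        if content_id in self._store:
            # Cache hit: serve locally and mark the item as recently used.
            self.hits += 1
            self._store.move_to_end(content_id)
            return self._store[content_id]
        # Cache miss: fetch from origin, then cache it for the next viewer.
        self.misses += 1
        content = self.fetch_from_origin(content_id)
        self._store[content_id] = content
        if len(self._store) > self.capacity:
            self._store.popitem(last=False)  # evict the least recently used item
        return content
```

The first viewer of a show pays the origin round trip; every subsequent viewer in the same region is served from the edge, which is exactly the latency and long-haul-cost saving the paragraph above describes.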

With the growth of streaming video only headed in one direction, streaming services will add more real-time content, such as live sports, in the coming years, and higher-quality outputs will become the norm. Enabling the edge cloud for both real-time and time-shifted multicast delivery will therefore be crucial to the future of streaming video.

Another example is autonomous vehicles, which became almost the flagship idealistic use case for next-gen low-latency networks when 5G was first mentioned some years ago. One reason we haven’t seen them cruising our highways just yet is the extremely sensitive latency requirements needed to make them operate safely. It is also important to note that until 5G is widely deployed, it will be some time before drivers fully embrace this use case. While autonomous vehicles are expected to process vast amounts of data from sensors within the car, there’s also a need to locally combine a subset of this data with that of adjacent cars and street sensors to support traffic congestion management, road-condition monitoring and accident response. That localized compute requirement is exactly the kind of problem the edge cloud can solve.

The potential value of the autonomous driving industry is huge, and those looking to develop autonomous vehicles will no doubt be looking at the level of investment in edge cloud as one key to its success.

Another use case is Industrial IoT. This is an initiative around Industry 4.0 where the manufacturing process is highly customizable and automated. Manufacturing lines are occupied by industrial robots whose functions are carefully controlled by local computing resources that use machine learning and AI to detect defects in the manufacturing process and adjust accordingly in near real time. To simplify connectivity on the plant floor, intelligent industrial robots will be connected via 5G (private or carrier managed) and will require low-latency, and often high-capacity, network performance to ensure low manufacturing defects and maximize safety of local workers.

As with autonomous driving, this is a use case that has been spoken about for some time, but the potential value of the IIoT is so great that it will continue to push investment in edge cloud infrastructure. It is only with interconnected micro data centers that the levels of low latency required to power the IIoT become possible.

And finally, there are cashier-less retail stores, a new approach in which retailers use embedded cameras and near-field communication techniques to create a digital twin of a conventional brick-and-mortar store; think Amazon Go stores, for example. Image and location information is analyzed through AI to determine what goods shoppers have taken, eliminating the need to go through a cashier at the check-out counter, with credit cards billed directly for the goods purchased. The system also needs to be responsive enough to identify when customers return items to the shelf after deciding not to purchase them. Significant compute and storage resources are needed to perform this near real-time image processing, both to deliver a seamless customer experience and to ensure the system functions as intended for the retailer, in that goods are billed to the customers who take them.

Adapting the network to serve 3x the sites or more

These use cases all share similar characteristics: the network needs to enable split-second decisions for services that are, in some instances, replacing human inputs. The network allows compute and storage to move closer to the end user, all while maintaining the capabilities of the centralized cloud.

This shift to a distributed edge cloud model is already happening. Some have estimated that a fully built-out edge cloud could, by 2025, result in at least five times as many data centers at the network edge as exist today. These scaled-down centers, located in closer proximity to end users, both humans and machines, will require the entire cloud ecosystem to think differently about the role of network connectivity.

The key challenge for edge providers is efficiently managing network and application resources for edge cloud data centers during peak periods of usage. The industry should look toward adaptive networking solutions that provide a framework for the edge cloud, allowing providers collectively to achieve an end-to-end network that grows smarter and more agile every day, with the scale required to respond dynamically to the constant, ever-changing pressures placed upon it.

The post The services taking cloud to the edge (Reader Forum) appeared first on RCR Wireless News.