AWS exec lays out four principles guiding edge computing vision

According to Satyen Yadav, general manager of IoT and edge services at Amazon Web Services, the network edge is all around us—at our homes, along our roadways, at the hospitals we visit and at the manufacturing facilities where we work. And by investing in edge computing, we can improve healthcare, use energy more efficiently and create safer cities that offer residents a better quality of life, he told attendees of the Wireless Infrastructure Association’s recent Connect X event in Charlotte, North Carolina. “So the question is,” he said, “what advancement can we expect in the future, together as an industry, to make that happen?”

Yadav made a distinction between edge computing and intelligent edge computing, which adds machine learning and artificial intelligence to the equation. He gave the example of a connected camera capable of monitoring traffic or safety conditions. “It can do a lot more things when you apply machine learning to it. It can detect what other city services you need to do. If trash cans are lined up in the view, it can detect when the trash can is full. You don’t have to put any more sensors on it. If you apply intelligence to these use cases, you can see that over time the same use cases can improve because the systems are learning and changing their behavior.”
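
To make that pattern concrete, the sketch below shows the basic shape of intelligent edge processing: frames are analyzed on the device and only small, high-confidence events are sent upstream, rather than a raw video feed. It is a minimal illustration in Python, not AWS code; the function names and the detection stub are hypothetical placeholders.

    import json
    import random
    import time

    CONFIDENCE_THRESHOLD = 0.8  # only report detections we are reasonably sure about

    def capture_frame():
        """Hypothetical stand-in for grabbing a frame from the connected camera."""
        return b"raw-frame-bytes"

    def detect_full_trash_can(frame):
        """Hypothetical stand-in for an on-device ML model returning a confidence score.

        In practice this would be an image classifier or object detector running
        locally, so the raw video never has to leave the edge device.
        """
        return random.random()

    def publish_alert(event):
        """Hypothetical stand-in for sending a small event (not video) to the cloud."""
        print("publishing:", json.dumps(event))

    while True:
        confidence = detect_full_trash_can(capture_frame())
        if confidence >= CONFIDENCE_THRESHOLD:
            # Only a tiny structured event crosses the network, not the video stream.
            publish_alert({"event": "trash_can_full", "confidence": round(confidence, 2)})
        time.sleep(5)  # sample every few seconds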

During his presentation, Yadav delineated four principles that can inform how intelligent edge computing systems are designed:

  • “The speed of light is constant, so the distance from the source of the data that you want to process and where you want to process it will dictate how fast you can respond.” (A back-of-the-envelope latency sketch follows this list.)
  • Should data be processed locally or in a centralized cloud? “The second factor to consider is the cost of the data transportation—not [all] data has the same value.”
  • Regulations around data privacy, such as HIPAA and GDPR, “will dictate other boundaries.”
  • And, “To make this work, you need to think of a consistent programming model. One of the difficulties we have today is the embedded systems are built to run for a number of years as an independent entity. In order to make this happen…there has to be a programming model that we could apply across from the edge to the cloud.”
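
Yadav’s first principle, that distance dictates responsiveness, lends itself to a quick back-of-the-envelope calculation—the sketch referenced after the first bullet above. The figures below assume signals travel through fiber at roughly two-thirds the speed of light and use illustrative distances; real networks add processing, queuing and routing delays on top of this floor.

    # Propagation delay only: how far the data travels sets a hard latency floor.
    # Assumes ~200,000 km/s in fiber (about two-thirds of c); distances are illustrative.
    SPEED_IN_FIBER_KM_S = 200_000

    def round_trip_ms(distance_km: float) -> float:
        """One request/response round trip, ignoring processing and queuing."""
        return 2 * distance_km / SPEED_IN_FIBER_KM_S * 1000

    for label, km in [("edge site 10 km away", 10),
                      ("regional data center 500 km away", 500),
                      ("distant cloud region 4,000 km away", 4000)]:
        print(f"{label}: ~{round_trip_ms(km):.2f} ms")
    # edge site 10 km away: ~0.10 ms
    # regional data center 500 km away: ~5.00 ms
    # distant cloud region 4,000 km away: ~40.00 ms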

In terms of infrastructure, the edge is essentially a network endpoint, whether that’s a macro tower, a small cell or colocated base stations serving a group of buildings or a large enterprise campus. To that point, Jay Brown, CEO of Crown Castle, touched on edge computing in a Q4 2017 earnings call earlier this year. “The development of future technologies has the potential to further extend the runway of growth. Emerging technologies including 5G, autonomous vehicles, augmented or virtual reality and internet of things applications will require mission critical network infrastructure that provides availability anywhere at any time on any device.” Noting latency and density needs, Brown said, “With our distributed real estate…we believe Crown Castle is in a unique position to benefit from these trends longer-term.”

In addition to the siting component, Crown Castle last year invested in Vapor IO, an Austin, Texas-based firm keenly focused on edge computing. Vapor IO produces the Vapor Edge Module, designed for multi-tenant edge processing.

Watch Yadav’s full presentation here.

