
Distributed Cloud Edge Computing Architecture

(Photo: Luzern, Switzerland - Alvin Wei-Cheng Wong)
 

 

- The Cloud Is Descending To The Network Edge

"Pushing computing, control, data storage and processing into the cloud has been a key trend in the past decade. However, cloud alone is encountering growing limitations in meeting the computing and intelligent networking demands of many new systems and applications. Local computing both at the network edge and among the connected things is often necessary to, for example, meet stringent latency requirements, integrate local multimedia contextual information in real time, reduce processing load and conserve battery power on the endpoints, improve network reliability and resiliency, and overcome the bandwidth and cost constraints for long-haul communications.

The cloud is now "descending" to the network edge and is sometimes diffused onto end-user devices, forming the "fog". Fog computing acts as a mediator between local hardware and remote servers: it regulates which information should be sent to the cloud and which can be processed locally. In this way, the fog serves as an intelligent gateway that offloads the cloud, enabling more efficient data storage, processing, and analysis.
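
To make that gateway role concrete, here is a minimal sketch of the routing decision a fog node might apply to each sensor reading. The thresholds, field names, and processing functions are illustrative assumptions, not the API of any specific fog platform.

# Minimal sketch (assumed thresholds and placeholder functions): a fog
# gateway deciding, per reading, whether to process locally or forward
# the data to the cloud.

from dataclasses import dataclass

@dataclass
class Reading:
    sensor_id: str
    value: float
    latency_budget_ms: float   # how quickly a response is needed
    payload_bytes: int         # size of the raw data

LOCAL_LATENCY_LIMIT_MS = 20    # assumed: tighter budgets must stay local
UPLINK_BUDGET_BYTES = 10_000   # assumed: large payloads stay local

def handle(reading: Reading) -> str:
    """Route a reading to local processing or the remote cloud."""
    if reading.latency_budget_ms < LOCAL_LATENCY_LIMIT_MS:
        return process_locally(reading)   # real-time control loop
    if reading.payload_bytes > UPLINK_BUDGET_BYTES:
        return process_locally(reading)   # avoid costly long-haul upload
    return forward_to_cloud(reading)      # batch analytics, archival

def process_locally(reading: Reading) -> str:
    # Placeholder for on-gateway analytics (filtering, thresholding, control).
    return f"local:{reading.sensor_id}"

def forward_to_cloud(reading: Reading) -> str:
    # Placeholder for an upload to a central cloud service.
    return f"cloud:{reading.sensor_id}"

if __name__ == "__main__":
    print(handle(Reading("temp-01", 21.7, latency_budget_ms=5, payload_bytes=64)))
    print(handle(Reading("cam-02", 0.0, latency_budget_ms=500, payload_bytes=2_000_000)))
    print(handle(Reading("meter-03", 3.2, latency_budget_ms=500, payload_bytes=128)))

Readings with tight latency budgets or large payloads stay at the edge; everything else is forwarded to the cloud for centralized storage and analysis.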

Fog computing distributes computing, data processing, and networking services closer to the end users along the cloud-to-things (C2T) continuum. Instead of concentrating data and computation in a small number of large clouds, fog computing envisions many fog systems deployed close to the end users, or wherever computing and intelligent networking can best meet user needs. Fog computing and networking present a new architectural vision in which distributed edge and user devices collaborate with each other and with the clouds to carry out computing, control, networking, and data management tasks.

 

- The Distributed Cloud Edge Computing Infrastructure

For over a decade, centralized cloud computing has been considered a standard IT delivery platform. Though cloud computing is ubiquitous, emerging requirements and workloads are beginning to expose its limitations. Few cloud service providers seriously considered the requirements needed to support resource-constrained nodes reachable only over unreliable or bandwidth-limited network connections, or thought about the needs of applications that demand very high bandwidth, low latency, or widespread compute capacity across many sites. 

New applications, services, and workloads increasingly demand a different kind of architecture, one that is built to directly support a distributed infrastructure. Availability and cloud capability are now required at remote sites to support both today’s requirements (retail data analytics, network services) and tomorrow’s innovations (smart cities, AR/VR). The maturity, robustness, flexibility, and simplicity of the cloud now need to be extended across multiple sites and networks in order to cope with evolving demands.

Edge computing is a distributed computing model in which computing takes place near the physical location where data is collected and analyzed, rather than on a centralized server or in the cloud. This infrastructure pairs sensors that collect data with edge servers that securely process it in real time on site, while also connecting other devices, such as laptops and smartphones, to the network. Edge computing solutions that process data at the point where business data originates reduce the number of data-handling steps and thereby increase workflow efficiency.
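
As an illustration of cutting data-handling steps, the sketch below shows an edge server summarizing a window of raw sensor samples on site so that only a compact record crosses the wide-area link. The upload function is a placeholder, not a real cloud API, and the sample values are invented.

# Minimal sketch (illustrative only): an edge server aggregates raw
# sensor samples locally and ships only a small summary to the cloud.

import statistics
from typing import Iterable

def summarize(samples: Iterable[float]) -> dict:
    """Reduce a window of raw samples to a small summary record."""
    data = list(samples)
    return {
        "count": len(data),
        "mean": statistics.fmean(data),
        "min": min(data),
        "max": max(data),
    }

def upload_summary(summary: dict) -> None:
    # Placeholder: a real deployment would call a cloud ingestion API here.
    print("uploading summary:", summary)

if __name__ == "__main__":
    # 1,000 raw readings stay on site; only one small record crosses the WAN.
    window = [20.0 + (i % 7) * 0.1 for i in range(1000)]
    upload_summary(summarize(window))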

Edge computing is a major driver and key enabler of digital transformation projects. One of the guiding principles behind digital transformation initiatives is to make business workflows more efficient. Any additional step in handling data, and any delay in processing digital inputs, is likely to have a negative impact. This challenge is amplified by an exploding volume of business inputs from new sources, including IoT devices.

Historically, high speeds and ultra-low latencies have been concentrated in the centralized core of data centers. That capability now needs to be distributed to the edges of networks so that technologies like 5G can truly take off. Compute power at the edge is the only way to provide low-latency support to applications such as self-driving vehicles, high-frequency trading, and mobile VR. A distributed edge computing infrastructure is therefore key to mobile 5G.
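
A rough back-of-envelope calculation shows why distance to the compute site dominates the latency budget of such applications. The sketch below assumes signals travel through fiber at roughly two-thirds the speed of light, ignores queuing and processing delays, and uses purely illustrative site distances.

# Back-of-envelope sketch (simplified, assumed numbers): one-way fiber
# propagation delay only, to compare an edge site with distant clouds.

SPEED_IN_FIBER_KM_PER_MS = 200  # roughly 2/3 of the speed of light

def round_trip_ms(distance_km: float) -> float:
    """Round-trip propagation delay for a given one-way fiber distance."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_PER_MS

if __name__ == "__main__":
    for site, km in [("metro edge site", 50), ("regional cloud", 500), ("remote cloud", 2000)]:
        print(f"{site:>16}: ~{round_trip_ms(km):.1f} ms round trip")

Even before any processing, a round trip to a cloud thousands of kilometers away consumes tens of milliseconds, while a metro edge site stays well under a millisecond.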

 

  

 [More to come ...]
