Foundations of Edge AI

Princeton University_050622A
[Princeton University]

 

- Overview

Artificial intelligence (AI) solutions, especially those based on deep learning in the field of computer vision, are typically developed and trained in cloud environments, which requires large amounts of computing power. 

Inference is less computationally intensive than training, but latency matters more, because the model must return results immediately. Most inference is still performed in the cloud or on servers, but as the diversity of AI applications grows, this centralized training and inference paradigm is being questioned. 
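The split described above can be sketched in a few lines: heavyweight training happens once in a central environment, only the exported model travels to the device, and inference there is a cheap local operation. Everything here (the threshold classifier, the data, the function names) is illustrative, not a real deployment.

```python
# Hypothetical sketch: train once "in the cloud", run lightweight
# inference "at the edge" using only the exported parameters.

def cloud_train(samples):
    """Simulate heavyweight central training: fit a threshold classifier."""
    pos = [x for x, label in samples if label == 1]
    neg = [x for x, label in samples if label == 0]
    # Midpoint of the two class means acts as the learned parameter.
    threshold = (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2
    return {"threshold": threshold}  # the small "exported model"

def edge_infer(model, x):
    """Lightweight on-device inference: one comparison, no network hop."""
    return 1 if x >= model["threshold"] else 0

training_data = [(0.1, 0), (0.2, 0), (0.8, 1), (0.9, 1)]
model = cloud_train(training_data)   # done once, centrally
print(edge_infer(model, 0.85))       # runs locally -> 1
print(edge_infer(model, 0.15))       # runs locally -> 0
```

The point of the sketch is the asymmetry: `cloud_train` touches the whole dataset, while `edge_infer` needs only the tiny exported model, which is what makes on-device inference practical.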

Edge AI, or edge artificial intelligence, is the use of artificial intelligence (AI) techniques in an edge computing environment. 

Presently, common examples of edge AI include smartphones, wearable health-monitoring accessories (e.g., smart watches), real-time traffic updates in autonomous vehicles, connected devices, and smart appliances.

 

- Micro-Data Centers

Edge computing plays a vital role in the efficient implementation of several embedded applications such as artificial intelligence (AI), machine learning (ML), deep learning (DL), and the Internet of Things (IoT). However, today's data centers are unable to meet the requirements of these types of applications. This is where the Edge-Micro Data Center (EMDC) comes into play.

By moving intelligence closer to the embedded system (i.e., the edge), it is possible to create systems with a high degree of autonomy and decision-making capabilities. In this way, reliance on the cloud (typically centralized systems) is reduced, resulting in benefits in terms of energy savings, reduced latency, and lower costs.
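The latency benefit of moving intelligence closer to the edge can be made concrete with a back-of-envelope model: total response time is roughly the network round trip plus the processing time. The numbers below are purely hypothetical, chosen only to show that a nearby, slower device can still beat a distant, faster one.

```python
# Illustrative latency model (all numbers hypothetical):
# total latency = network round-trip time + processing time.

def total_latency_ms(network_rtt_ms, processing_ms):
    """End-to-end response time for one request, in milliseconds."""
    return network_rtt_ms + processing_ms

# Fast GPU, but far away across a WAN.
cloud = total_latency_ms(network_rtt_ms=80.0, processing_ms=5.0)
# Slower embedded chip, but sitting next to the sensor.
edge = total_latency_ms(network_rtt_ms=1.0, processing_ms=20.0)

print(cloud, edge)  # 85.0 21.0 -> the edge wins despite slower hardware
```

This is why latency-sensitive applications gain from EMDC-style infrastructure even when the edge hardware is less powerful than the cloud's.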

Self-driving cars, robotic surgery, augmented reality in manufacturing, and drones are a few examples of early applications of edge computing. As of today, data centers offering "cloud services" (hyperscale, mega, and colocation) cannot meet the requirements of these applications, thus requiring complementary edge infrastructure such as EMDC and "edge services".

This edge infrastructure, hardware, and edge services must meet the following requirements:

  • High computational speed, requiring data to be processed as locally as possible (i.e. at the edge)
  • High elasticity
  • High efficiency

 

The University of Chicago_050723B
[The University of Chicago]

- Edge AI Is The Next Wave of AI

Edge AI is the next wave of AI, removing the dependence on cloud systems. Edge AI processes information closer to the users and devices that require it, rather than sending that data to central locations in the cloud for processing.

In the last few years, AI adoption in companies around the world has changed. As enterprise-wide efforts came to dominate, cloud computing became an essential component of the AI evolution. 

As customers spend more time on their devices, businesses increasingly realize the need to bring essential computation onto the device to serve more customers. This is the reason that the Edge Computing market will continue to accelerate in the next few years.

Today, it is possible, and increasingly easy, to run AI and ML and perform analytics at the edge, depending on the size and scale of the edge site and the specific systems used. 

Although edge-site computing systems are much smaller than those in central data centers, they have matured and, thanks to the tremendous growth in processing power of today's commodity x86 servers, can now successfully run many workloads.

  

- Distributed Edge Computing and Edge AI

Distributed edge computing and edge AI are two popular paradigms. 

Distributed edge computing delegates computational workloads to autonomous devices located at the data source. This is different from edge computing, which moves computation and data storage closer to the data source. 
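The delegation idea can be sketched simply: each autonomous device reduces its own raw readings to a small summary, and only the summaries travel upstream. The device names and readings below are made up for illustration; the pattern is the point.

```python
# Sketch of delegating work to devices at the data source:
# raw data stays local, only compact summaries are transmitted.

def local_summary(readings):
    """Runs on the device: compress raw readings to (count, sum)."""
    return len(readings), sum(readings)

def aggregate(summaries):
    """Runs upstream: combine per-device summaries into a global mean."""
    count = sum(c for c, _ in summaries)
    total = sum(s for _, s in summaries)
    return total / count

device_data = {
    "sensor-a": [20.0, 22.0],  # hypothetical temperature readings
    "sensor-b": [24.0],
}
summaries = [local_summary(r) for r in device_data.values()]
print(aggregate(summaries))  # global mean 22.0, without moving raw data
```

Each device here is autonomous in the sense the paragraph describes: it decides how to process its own data, and the upstream system sees only the result.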

Edge AI uses artificial intelligence (AI) techniques to enable a data gathering device in the field to provide actionable intelligence. Edge AI chips have three main parts: 

  • Scalar engines: Run Linux-class applications and safety-critical code
  • Adaptable engines: Process data from sensors
  • Intelligence engines: Run common edge workloads such as AI
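A minimal sketch of how workloads might map onto the three engine types above. The routing table and workload categories are hypothetical and only illustrate the division of labor; on a real chip this scheduling is done in hardware and firmware.

```python
# Hypothetical dispatcher mirroring the three engine types above.
ENGINES = {
    "application": "scalar",     # Linux-class apps, safety-critical code
    "sensor": "adaptable",       # pre-processing raw sensor streams
    "ai": "intelligence",        # ML inference workloads
}

def route(workload_kind):
    """Map a workload category to the engine that should run it."""
    # Unknown workloads fall back to the general-purpose scalar engine.
    return ENGINES.get(workload_kind, "scalar")

print(route("ai"))       # intelligence
print(route("sensor"))   # adaptable
print(route("unknown"))  # scalar
```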


Edge AI has many notable examples, including: 

  • Facial recognition
  • Real-time traffic updates in semi-autonomous vehicles
  • Connected devices
  • Smartphones
  • Robots
  • Drones
  • Wearable health monitoring devices
  • Security cameras

 
 

[More to come ...]


