Edge AI: AI and Edge in synergy
Reading time: 5 min
Fog and Edge Computing are extensions of, and further advances on, Cloud computing. Focusing on Edge Computing, we will see how it synergizes with Artificial Intelligence to make this association the winning pair of the century. With the advent of 5G wireless networks, it is timely to discuss the leading role Edge Computing plays in this advanced technology. [1] In fact, not only does AI extend Edge Computing's capacity, but the reverse is also true: each pushes the other to expand the boundaries of the user experience. Together, these two trending technologies add more value than they do separately. [2] Let us see why.
Edge AI, or why combine Edge Computing with Artificial Intelligence
Artificial Intelligence and Edge Computing are cutting-edge technologies that are each gaining popularity on their own. Nevertheless, what we will see in this article is the case where they should be considered correlated and complementary, each bringing benefits to the other.
Indeed, while AI represents a significant application of edge computing, the latter, in turn, provides a channel for unlocking AI's full power. This is what is called Edge AI.
The idea is to provide enough bandwidth to stream data such as video and audio while at the same time keeping production cheap and consumption seamless. To optimize throughput, the backhaul and cloud storage costs of streaming and storing that data have to be weighed against each other.
This is most noticeable during data buffering: the smoother the internet connection, the less erratic loading an online application or playing a movie will be, for instance.
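As a rough illustration of that trade-off, here is a minimal back-of-the-envelope sketch; the traffic volumes, prices, and reduction ratio are hypothetical assumptions, not figures from this article:

```python
# Hypothetical comparison of backhaul + storage cost when raw device data is
# shipped to the cloud versus pre-aggregated at the edge first.

RAW_GB_PER_DAY = 500               # assumed raw data produced by all devices per day
EDGE_REDUCTION = 0.85              # assumed fraction of data filtered/aggregated at the edge
BACKHAUL_COST_PER_GB = 0.08        # assumed $/GB of backhaul transfer
STORAGE_COST_PER_GB_MONTH = 0.023  # assumed $/GB-month of cloud object storage

def monthly_cost(gb_per_day: float) -> float:
    """Backhaul + storage cost for one month of ingested data."""
    monthly_gb = gb_per_day * 30
    return monthly_gb * (BACKHAUL_COST_PER_GB + STORAGE_COST_PER_GB_MONTH)

cloud_only = monthly_cost(RAW_GB_PER_DAY)
with_edge = monthly_cost(RAW_GB_PER_DAY * (1 - EDGE_REDUCTION))

print(f"Cloud-only ingestion: ${cloud_only:,.2f}/month")
print(f"With edge reduction:  ${with_edge:,.2f}/month")
print(f"Estimated savings:    ${cloud_only - with_edge:,.2f}/month")
```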
What exactly does AI bring to Edge computing?
The main goal of edge computing is to reduce content-delivery latency while easing the trade-off between production cost and data-processing cost. AI simply promotes all of these elements in the most effective way possible.
The two broad lines of action consist of making the most of limited resources while aiming for high throughput at a reduced cost. [1] This is done simply by retrieving data directly from the source and providing end-users with a seamless content-delivery experience. End-users should receive the same quality wherever they are, whether in video or audio format, and regardless of how many of them there are worldwide.
Therefore, adding AI into the loop can only amplify the power and accuracy of the algorithms needed to optimize such operations.
A lack of competitiveness puts the future of a given telco firm at stake, so it pays to prepare, both in terms of market coverage and of customer retention. As part of product performance, long waiting times compared with other networks are among the significant reasons customers become dissatisfied. [1] Since there is hardly a battlefield as fierce as the telco market, it is certainly a proactive move to embed AI to steer clear of these issues one after another.
Why and how do Artificial Intelligence and Edge Computing make interesting allies?
To explain the important role AI plays in edge computing, the SFL Partners consulting website mentions three leading reasons:
- Reduction of the amount of data before it is centralized and transmitted to the cloud or a remote repository: by trimming the data piece by piece rather than in one batch, it can be ingested into the data lake or warehouse sooner instead of piling up before ever getting processed (a short sketch of this edge-side reduction follows below).
- Simultaneous synchronization of data upon receipt: insights are gained faster, within the constraints of data-driven decision-making.
- Protection of credentials: sensitive information is processed directly at the source rather than being stored in the cloud, which avoids cyber threats and attacks during data transfer.
ZDNet corroborates this by highlighting two other benefits apart from low latency. Based on what Edge Computing does, not only does it provide easier access to maintenance and troubleshooting via micro data centers (µDC), it also optimizes power usage effectiveness (PUE), a metric that compares the total energy a facility draws (including cooling) with the energy actually consumed by its IT equipment. [2]
Read their article for a complete, in-depth analysis of the matter.
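To make the first point concrete, here is a minimal sketch of what edge-side reduction can look like; the simulated sensor, the 60-reading window, and the idea of printing instead of posting the payload are assumptions for illustration, not details from the articles cited above:

```python
import json
import random
import statistics

WINDOW = 60  # assumed number of raw readings aggregated before each upload

def read_sensor() -> float:
    """Stand-in for a real sensor driver; returns a simulated temperature."""
    return random.uniform(20.0, 25.0)

def summarize(readings: list[float]) -> dict:
    """Reduce a window of raw readings to a few summary statistics."""
    return {
        "count": len(readings),
        "mean": round(statistics.fmean(readings), 3),
        "min": min(readings),
        "max": max(readings),
    }

def upload(summary: dict) -> None:
    """In a real deployment this would POST to the cloud ingestion endpoint;
    here we only print the payload that would leave the device."""
    print(json.dumps(summary))

# Instead of streaming WINDOW raw readings to the cloud, the edge node ships
# one compact summary, cutting backhaul traffic by roughly WINDOW to 1.
buffer = [read_sensor() for _ in range(WINDOW)]
upload(summarize(buffer))
```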
Use Cases
As its name suggests, edge analytics is a method of performing analytics on the peripheral elements of a system.
It is used in distributed systems to intercept signals coming from connected devices, sensors, transducers, and other data-acquisition devices. Sending and receiving, uploading and downloading: these are actions that absorb huge resources.
The Internet of Things and the manufacturing field fit that description. Both demand acute precision as well as security.
If signals were lost along the way, the consequences at the receiving end would be dramatic. Performance therefore needs to remain high no matter what happens, even in bad weather.
This is why, with AI, risk management is handled through predictive analytics, computer vision (a niche domain of AI) is put to work in IoT and camera surveillance, and robotic commands involving feed-forward control loops become easier to implement in manufacturing.
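As an illustration of running computer vision at the edge rather than in the cloud, here is a minimal sketch of on-device inference with TensorFlow Lite; the model file name, the input shape, and the idea of uploading only flagged frames are assumptions, not part of this article:

```python
import numpy as np
import tflite_runtime.interpreter as tflite  # lightweight runtime for edge devices

# Hypothetical quantized image-classification model stored locally on the device.
interpreter = tflite.Interpreter(model_path="surveillance_classifier.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()[0]
output_details = interpreter.get_output_details()[0]

def classify(frame: np.ndarray) -> int:
    """Run one camera frame through the local model and return the top class id."""
    # The frame is assumed to already match the model's input shape (1x224x224x3, uint8).
    interpreter.set_tensor(input_details["index"], frame.astype(np.uint8))
    interpreter.invoke()
    scores = interpreter.get_tensor(output_details["index"])[0]
    return int(np.argmax(scores))

# In a real camera-surveillance loop, only frames flagged as anomalous would be
# uploaded; routine frames never leave the device.
dummy_frame = np.zeros((1, 224, 224, 3), dtype=np.uint8)
print("Predicted class id:", classify(dummy_frame))
```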
Here is a list of a few use cases:
- Supervision and tracking
- Autopilot-equipped cars: Tesla; Acura RLX; Audi A8, E-Tron, Q5/SQ5, Q7, Q8; BMW 5 Series, 6 Series, 7 Series; Ford Edge...
- AI-based voice assistants and smart speakers: Siri, Alexa, Cortana
- Industrial distributed systems
Examples
Firms working in and around telecommunications (telco) are among the first to make use of AI in Edge Computing:
- AWS IoT Greengrass
- Cisco SmartAdvisor
- Dell Statistica
- HPE Edgeline
- IBM Watson IoT Edge Analytics
- Intel IoT Developer Kit
- Microsoft Azure IoT Edge
- Oracle Edge Analytics (OEA)
- PTC ThingWorx Analytics
- Streaming Lite by SAP HANA
Experiments
When should you implement Edge AI in your connectivity infrastructure?
Edge AI consists of bringing the AI workload, normally trained and run in the cloud, closer to the connected device, or even right inside it or at its output port. In other words, the preprocessing of data is done on on-premise hardware.
To observe the difference, this article suggests comparing how an Edge AI-based architecture performs relative to a cloud-based one.
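A minimal way to observe that difference yourself is to time one local inference against one cloud round trip; the endpoint URL and the simulated local model call below are placeholders, not details from the cited experiment:

```python
import time
import urllib.request

CLOUD_ENDPOINT = "https://example.com/predict"  # hypothetical cloud inference API

def local_inference(payload: bytes) -> bytes:
    """Placeholder for an on-device model call (e.g., a TFLite interpreter)."""
    time.sleep(0.005)  # simulate roughly 5 ms of on-device compute for this sketch
    return b"ok"

def timed_ms(fn, *args) -> float:
    """Return how long one call to fn takes, in milliseconds."""
    start = time.perf_counter()
    fn(*args)
    return (time.perf_counter() - start) * 1000

payload = b"\x00" * 1024  # dummy 1 KB input

edge_ms = timed_ms(local_inference, payload)
try:
    cloud_ms = timed_ms(
        urllib.request.urlopen,
        urllib.request.Request(CLOUD_ENDPOINT, data=payload),
    )
    print(f"Edge: {edge_ms:.1f} ms  |  Cloud round trip: {cloud_ms:.1f} ms")
except OSError:
    print(f"Edge: {edge_ms:.1f} ms  |  Cloud endpoint unreachable in this sketch")
```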
The results of the experiment can be summarized as follows:
=> To integrate AI into your company's edge computing, first consider these three parameters:
- whether you already possess an architecture made of edge devices and want to exploit it,
- whether you have previously encountered performance and cost problems with the cloud,
- whether you want very sensitive data to stay local rather than be released into the wild.
Interesting DIY projects with nice Edge AI devices
Seeedstudio.com gathered a list of interesting Edge AI devices for those who want to try DIY projects with them.
The NVIDIA Jetson Nano is a developer kit for those who want to get started; a tutorial is also provided here.
It is a powerful, compact little module for running AI models and frameworks.
The Jetson Nano can be paired with the NVIDIA JetPack SDK, which comes with a board support package (BSP).
Seeed Studio also mentions that, among others, it supports "Linux OS, NVIDIA CUDA®, cuDNN, and TensorRT™ software libraries for deep learning, computer vision, GPU computing, multimedia processing". It can be acquired for $89, which is a reasonable price given all its capabilities.
The specifications of this little marvel are as follows:
NVIDIA® Jetson Nano™ Developer Kit:
- CPU: Quad-core ARM® A57 CPU
- GPU: 128-core NVIDIA Maxwell™ GPU
- RAM: 4 GB 64-bit LPDDR4
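As a quick sanity check once the board is flashed with JetPack, a short script like the following confirms that the Maxwell GPU is visible to your AI framework; it assumes a Jetson-compatible PyTorch build is installed, which the article does not cover:

```python
import torch  # assumes a Jetson-compatible PyTorch build is installed on the device

if torch.cuda.is_available():
    device = torch.device("cuda")
    print("GPU detected:", torch.cuda.get_device_name(0))
    # Tiny matrix multiply on the GPU to verify that CUDA actually executes.
    a = torch.rand(256, 256, device=device)
    b = torch.rand(256, 256, device=device)
    print("Sample result checksum:", (a @ b).sum().item())
else:
    print("No CUDA device found; check the JetPack / driver installation.")
```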
Takeaway
In a nutshell, associating Artificial Intelligence with Edge Computing can only benefit consumers, who are becoming more and more demanding and hungrier for the best content delivery these days. With the arrival of 5G in cellular data networks, more computing power at the edge is indispensable to manage shared resources and avoid consumer discontent.
Associating AI with Edge Computing offers a new perspective in terms of smarter technology. As two of the most promising technologies in telco, they pave the way to a more economical use of bandwidth and computational resources.
Improving the user experience is essentially what this is about, and the ability to do it on a tight budget is the show of strength these two technologies foster.
These are some of the reasons to go for it; otherwise, you might find that it is not worth the effort or even superfluous.
Either way, there are good devices for starting to experiment with Edge AI and seeing what it is about.
Read also
[1] Artificial intelligence: a killer app for edge computing?
[2] Edge computing: use cases, business opportunities and growth strategies
[3] Edge Computing Benefits for AI Grow More Apparent
[4] Edge Analytics in 2020: Guide to intelligent IoT applications
[5] Cloud, Fog, and Edge Computing: 3 Differences That Matter
[6] How to include edge computing in your 2020 IT budget
[7] The edge takes shape: The 5G telco cloud that would compete with Amazon