Latency-aware and Resilient Stream Data Processing in Edge Computing


Abstract

Recent years have witnessed a huge proliferation of low-cost devices connected to the Internet of Things. Given the large amounts of data these devices generate, there is an increasing need to process that data near the network edge in order to meet strict application latency requirements. For instance, Connected Autonomous Vehicle applications such as collision warning, autonomous driving and traffic efficiency typically require latencies between 10 ms and 100 ms. Edge computing improves the quality of service for such applications by bridging the latency gap between the devices and typical cloud infrastructures. While Micro Data Centers provide geographically distributed computing resources, careful management of these resources near the network edge is vital for ensuring efficient, cost-effective and resilient operation of the system while providing low-latency access for applications executing there. This talk will first introduce the notion of Micro Data Centers and the edge computing architecture. We will then discuss algorithms, techniques and design methodologies for efficient and resilient resource allocation for latency-sensitive stream data processing in edge computing. Finally, we will discuss some open research problems in this area and potential directions for future work.