Paper accepted at the 21st IEEE/ACM International Symposium on Cluster, Cloud and Internet Computing (CCGrid 2021)

The paper “A Two-Sided Matching Model for Data Stream Processing in the Cloud–Fog Continuum” has been accepted for publication at the 21st IEEE/ACM International Symposium on Cluster, Cloud and Internet Computing (CCGrid 2021).

Authors: Narges Mehran, Dragi Kimovski and Radu Prodan

Abstract: Latency-sensitive and bandwidth-intensive stream processing applications are dominant traffic generators over the Internet. A stream consists of a continuous sequence of data elements that require processing in near real-time. To improve communication latency and reduce network congestion, Fog computing complements Cloud services by moving the computation towards the edge of the network. Unfortunately, the heterogeneity of the new Cloud–Fog continuum raises important challenges related to deploying and executing data stream applications. In this work, we explore a two-sided stable matching model called Cloud–Fog to data stream application matching (CODA) for deploying a distributed application, represented as a workflow of stream processing microservices, on heterogeneous Cloud–Fog computing resources. In CODA, the application microservices rank the continuum resources based on their microservice stream processing time, while the resources rank the stream processing microservices based on their residual bandwidth. A stable many-to-one matching algorithm assigns microservices to resources based on their mutual preferences, aiming to optimize the complete stream processing time on the application side and the total streaming traffic on the resource side.
We evaluate the CODA algorithm using simulated and real-world Cloud–Fog scenarios and achieve 11–45% lower stream processing time and 1.3–20% lower streaming traffic compared to related state-of-the-art approaches.
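
For readers unfamiliar with two-sided matching, the following minimal Python sketch illustrates the general idea of a many-to-one deferred-acceptance matching of the kind the abstract describes: microservices propose to resources in order of their own preferences (for example, estimated stream processing time), while each resource keeps only its most preferred microservices within its capacity (for example, ranked by how well the streams fit its residual bandwidth). All names, data and the simplified ranking criteria below are illustrative assumptions, not the CODA implementation from the paper.

# Minimal sketch of a many-to-one deferred-acceptance (Gale–Shapley style)
# matching. Toy data and ranking criteria are illustrative assumptions only.
def stable_many_to_one(ms_prefs, res_prefs, capacity):
    """ms_prefs: microservice -> list of resources, best first
       res_prefs: resource -> list of microservices, best first
       capacity:  resource -> maximum number of hosted microservices"""
    # Position of each microservice in each resource's preference list.
    rank = {r: {m: i for i, m in enumerate(prefs)} for r, prefs in res_prefs.items()}
    assigned = {r: [] for r in res_prefs}      # resource -> currently hosted microservices
    next_choice = {m: 0 for m in ms_prefs}     # index of the next resource to propose to
    free = list(ms_prefs)                      # microservices not yet matched

    while free:
        m = free.pop()
        if next_choice[m] >= len(ms_prefs[m]):
            continue                           # m has exhausted its preference list
        r = ms_prefs[m][next_choice[m]]
        next_choice[m] += 1
        assigned[r].append(m)
        if len(assigned[r]) > capacity[r]:
            # The resource keeps its most preferred microservices and
            # rejects the least preferred one, which becomes free again.
            assigned[r].sort(key=lambda x: rank[r][x])
            free.append(assigned[r].pop())
    return assigned

# Hypothetical toy instance: microservices rank resources by estimated
# processing time; resources rank microservices by residual-bandwidth fit.
ms_prefs = {"decode": ["fog1", "cloud"], "analyze": ["cloud", "fog1"], "encode": ["fog1", "cloud"]}
res_prefs = {"fog1": ["decode", "encode", "analyze"], "cloud": ["analyze", "decode", "encode"]}
capacity = {"fog1": 1, "cloud": 2}
print(stable_many_to_one(ms_prefs, res_prefs, capacity))

In this toy instance the sketch assigns "decode" to fog1 and "analyze" and "encode" to the cloud resource, a matching in which no microservice and resource both prefer each other over their current assignment.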

Paper accepted in IEEE Internet Computing: “Cloud, Fog or Edge: Where to Compute?”

The manuscript “Cloud, Fog or Edge: Where to Compute?” has been accepted for publication in an upcoming issue of IEEE Internet Computing.

Authors: Dragi Kimovski, Roland Mathá, Josef Hammer, Narges Mehran, Hermann Hellwagner and Radu Prodan

Abstract: The computing continuum extends high-performance cloud data centers with energy-efficient and low-latency devices close to the data sources located at the edge of the network.
However, the heterogeneity of the computing continuum raises multiple challenges related to application management. These include deciding where along the continuum, from the cloud to the edge, an application should be offloaded so that its computation and communication requirements are met.
To support these decisions, this article provides a detailed performance and carbon footprint analysis of a selection of use-case applications with complementary resource requirements across the computing continuum, conducted on a real-life evaluation testbed.