OpenWolf: A Serverless Workflow Engine for AI on the Continuum

Abstract

Serverless Computing and Function as a Service (FaaS) are emerging as prominent approaches for simpler, faster, and architecture-independent software deployment across Cloud and Edge tiers. This trend has been partly adopted in Cloud-Edge Continuum applications that need to run functions on different kinds of nodes in order to meet defined QoS requirements, such as data locality or high-performance computing. Unfortunately, FaaS is not yet ready for building complex applications due to the lack of a function composition component, and synchronising functions across the Continuum therefore remains an open challenge. In this paper, we propose OpenWolf, a newly developed open-source serverless workflow engine for deploying, orchestrating, and synchronising functions over a Continuum environment. We evaluate this solution by executing an image classification algorithm. In particular, we compare the performance of the same workflow when training, data collection, and inference are centralised entirely in the Cloud, then entirely at the Edge, and finally when the functions are distributed between the Cloud and the Edge.