Towards smarter continuum application designs

In the context of our «Smart Cities and Regions Services Enablement» efforts, space (and to some extent time) are important dimensions. First, digital transformation has an inherent spatial component. While the research application field is pragmatically scoped to cities and regions, it in fact spans a wider spectrum, from households, quarters and districts to countries and even supranational entities. The recent wave of «surface digitalisation» has primarily affected mobile citizens (pandemic apps) and workers (video conferencing in home offices) around the world. It expanded a digital surface that for most citizens had previously encompassed e-banking, e-ticketing and e-tax declarations, with varying degrees of voluntariness.

A much larger wave, with a deeper impact on society and a stronger presence in industrial processes, is still ahead though, and it warrants a holistic understanding of the technical and societal effects as well as the associated value and sustainability models. Second, with emphasis on the computer science perspective, and equating digital transformation in simplified terms with softwareisation, space determines where the software application parts execute. Some applications, like e-tax, are cloud-hosted and offer a web interface, making them suitable for a conventional cloud-native architecture. Most others, however, will involve mobile devices, embedded computers, sensor gateways and various forms of clouds in almost arbitrary combinations. Evidently, programming for the continuum is a prerequisite for achieving flexible continuum applications. The services enablement is a complementary effort that also investigates the combination of different computing paradigms and smart coding techniques.

According to the project description, we are primarily interested in three aspects:

  • Exploring spatial notions in APIs, e.g. geoinformatics syntax and spatial/spatio-temporal queries (a minimal query sketch follows this list).
  • Ensuring non-intrusiveness and adherence to principles of (personal) data avoidance, based on selectively computable compression; this also involves the on-demand integration of public open data.
  • Avoiding fragmentation between the various deployed software parts across the continuum, and fostering re-use, flexible deployment and migration based on portable microservice artefacts such as the cloud functions known from serverless application engineering.
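
To make the first aspect more tangible, the following minimal sketch shows what a spatio-temporal query in an API could look like: observations are selected by a bounding box and a time window. The names (BoundingBox, Observation, query_window) are illustrative assumptions rather than an existing geoinformatics library; a production design would more likely build on established standards such as OGC geometries.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass(frozen=True)
class BoundingBox:
    """A rectangular area in WGS84 coordinates."""
    min_lat: float
    min_lon: float
    max_lat: float
    max_lon: float

    def contains(self, lat: float, lon: float) -> bool:
        return (self.min_lat <= lat <= self.max_lat
                and self.min_lon <= lon <= self.max_lon)

@dataclass(frozen=True)
class Observation:
    """One sensor reading with its spatial and temporal coordinates."""
    lat: float
    lon: float
    ts: datetime
    metrics: dict

def query_window(observations, area: BoundingBox,
                 start: datetime, end: datetime) -> list[Observation]:
    """Spatio-temporal selection: inside `area`, within [start, end)."""
    return [o for o in observations
            if area.contains(o.lat, o.lon) and start <= o.ts < end]

# Example: all observations in a city-sized area over the last hour.
now = datetime.now()
area = BoundingBox(47.35, 8.45, 47.45, 8.65)  # illustrative coordinates
recent = query_window([], area, now - timedelta(hours=1), now)
```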

By combining these techniques, we expect that future application architectures will deviate significantly from their present evolution and enable more disruptive, yet societally acceptable, forms of digital transformation. As this may sound very abstract, a first potential scenario sketch (out of several to come) is given here. The scenario only involves devices; future scenarios will also involve humans.

A weather station network consists of dozens of distributed stations and an aggregator node. Each station is composed of a set of sensors and a gateway. Every minute, the gateway reads the current information about humidity, temperature and wind from the sensors, stores the current metrics into a timeline and synchronises the timeline with the aggregator. The aggregated timeline is stored with a flexible trade-off between safety and efficiency, using n+m erasure coding. All necessary resources are requested on demand, and differentiated resource services are combined according to the self-declared or machine-learned application requirements (e.g. read/write patterns).

Various event-triggered microservices rely on individual fragments or on the entire data, i.e. a combination of two fragments, to perform statistical analysis, prediction or incident detection such as spotting missing measurements. How these microservices are integrated – directly within the stream processing or as hand-off deployments as in FaaS – needs to be investigated. On a higher level, area-level queries are supported across multiple streams, delivering either fuzzy or precise results which are differentiated by credentials linked to microbilling. For instance, a weather lab that pays a fee gets precise data and makes the deployment sustainable, while the general public gets acceptable data that fulfils the legal open data requirements.

Further flexibility is achieved by various persistence models: gateway only, aggregator only, or both; including all data, sampled excerpts or just the recent history. This scenario combines elements from at least five computing paradigms: cloud, fog, serverless, spatial and dispersed computing. A custom application implementing the scenario can already be constructed from off-the-shelf components. But achieving this combination with little effort and with generic, re-usable parts remains a challenge, yet also an opportunity for more dependable digital services, to be addressed with convincing solution proposals.
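
As a rough illustration of the storage side of the scenario, the sketch below instantiates the n+m erasure coding with the smallest interesting parameters, n=2 and m=1: a measurement record is split into two data fragments plus one XOR parity fragment, so any two of the three fragments suffice to reconstruct the whole record. This is a minimal sketch under assumed names (encode_2p1, decode_2p1, the record layout); a real deployment would use a dedicated erasure coding library and larger n+m values.

```python
import json
import time

def encode_2p1(payload: bytes) -> tuple[bytes, bytes, bytes]:
    """Split payload into halves A and B and add parity P = A XOR B."""
    framed = len(payload).to_bytes(4, "big") + payload  # length header
    if len(framed) % 2:
        framed += b"\x00"                               # pad to even size
    half = len(framed) // 2
    a, b = framed[:half], framed[half:]
    p = bytes(x ^ y for x, y in zip(a, b))
    return a, b, p

def decode_2p1(a, b, p):
    """Reconstruct the payload from any two fragments (one may be None)."""
    if a is None:
        a = bytes(x ^ y for x, y in zip(b, p))
    elif b is None:
        b = bytes(x ^ y for x, y in zip(a, p))
    framed = a + b
    size = int.from_bytes(framed[:4], "big")
    return framed[4:4 + size]

# One minute tick of the gateway: read metrics (stubbed here), append them
# to the timeline and hand the three fragments to the aggregator's stores.
record = {"ts": int(time.time()), "humidity": 61.0, "temp_c": 8.4, "wind_ms": 3.2}
a, b, p = encode_2p1(json.dumps(record).encode())
# Even if fragment A is lost, the full record remains recoverable:
assert json.loads(decode_2p1(None, b, p)) == record
```

Event-triggered microservices as described in the scenario could then operate on a single fragment (e.g. detecting missing measurements from timestamp gaps) or request a second fragment whenever full reconstruction is required.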
