Science Meets Industry and Innovation Alignment

by Josef Spillner

Often, researchers produce results which are neither re-used nor transferred to practice, while businesses ask for solutions which have existed for a long time, albeit perhaps not in packaged and polished form. Such mismatches should not happen; rather, the goal must be to align the innovation needs of businesses and the wider industry with the capabilities of researchers. For this purpose, Science Meets Industry has been proposed as a new event format to bring together scientific researchers and practitioners, in particular in the domain of information technology.

The first Science Meets Industry event was jointly organised by Silicon Saxony and its Cool Silicon cluster of excellence, and hosted by four co-located Fraunhofer institutes. Josef Spillner from the Service Prototyping Lab at Zurich University of Applied Sciences was invited as keynote speaker and shared his thoughts on «Serverless Cyber-Physical Applications», a topic which connected well with other talks during the event. This blog post not only reports briefly on the event, but also details the thoughts behind the talk and reflects on the need for innovation alignment, incorporating feedback and additional ideas from the discussions after the talk.

As outlined by one of the speakers, SMEs increasingly risk becoming dependent on a few large players across the ICT spectrum. Smaller companies in particular can no longer afford to hire the specialists needed to keep up with the technologies these players push into the market at an ever-increasing pace. In Switzerland, as a university of applied sciences we can help remove this barrier through Innosuisse projects, and Swiss software developers can contact us to discuss their research and innovation needs. On a global level, however, and increasingly also here despite all current countermeasures, this dependency becomes a severe issue with both vendor lock-in and technical debt aspects.

A key development of significance to both academics and software or hardware developers is the increasing deployment of sensors and their use in distributed applications. Compared to merely transferring data from sensors to data processing entities or sending commands back to actuators, today's systems available for research are far more sophisticated. This became apparent during a demonstration of augmented reality interaction following complex workflows, in which the interaction path was determined by the user's previous selections. During the impressive and interactive demo, users were able to activate projected virtual buttons to enrich physical scenes.
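To make the basic sensor-to-processing data path concrete, the following minimal Python sketch shows a simulated sensor pushing readings to a processing entity and receiving actuator commands in response. The endpoint URL, topic of the payload and response format are assumptions for illustration only, not part of the demonstrated system.

```python
# Minimal sensor-to-processing data path sketch; endpoint URL and payload
# format are illustrative assumptions, not part of the demonstrated system.
import json
import random
import time
import urllib.request

ENDPOINT = "https://processing.example.org/readings"  # assumed ingestion endpoint


def push_reading(sensor_id: str, value: float) -> dict:
    """Send one sensor reading; the response may carry an actuator command."""
    payload = json.dumps({"sensor": sensor_id, "value": value,
                          "ts": time.time()}).encode()
    request = urllib.request.Request(ENDPOINT, data=payload,
                                     headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(request) as response:
        # Assumed response shape, e.g. {"actuator": "valve1", "state": "open"}
        return json.loads(response.read())


while True:
    command = push_reading("temp1", 20.0 + random.random())  # simulated value
    print("command received:", command)
    time.sleep(5)
```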

However, software applications in conjunction with such systems, so-called cyber-physical applications, are still not well understood. In the keynote talk, the benefits of building these systems with cloud functions, in order to achieve flexibly composable atomic and molecular units, were outlined. This notion was inspired by a previous talk on currently still unsolved problems in physics, such as why the masses of the constituents of an atom are what they are, and why there are four fundamental forces in nature. Obviously, in computer science, we know even less about the why. In particular, why do certain computing paradigms exist, and how can we find out which ones are more or less suited for a given task? And why are some commercially more successful than others?
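As a rough illustration of the atomic and molecular composition idea from the talk, the sketch below composes two minimal cloud-function-style handlers into a larger deployable unit. The handler names and the plain-Python composition are assumptions; on a real FaaS platform, each atomic function would be deployed and invoked separately.

```python
# Sketch of atomic cloud functions composed into a molecular unit.
# Handler names and composition logic are illustrative assumptions.

def parse_reading(event: dict) -> dict:
    """Atomic function: normalise a raw sensor event."""
    return {"sensor": event["sensor"], "celsius": float(event["value"])}


def check_threshold(reading: dict, limit: float = 30.0) -> dict:
    """Atomic function: derive an actuator command from a reading."""
    state = "on" if reading["celsius"] > limit else "off"
    return {"actuator": "fan1", "state": state}


def molecule(event: dict) -> dict:
    """Molecular unit: a composition of atomic functions, itself
    deployable as a single cloud function."""
    return check_threshold(parse_reading(event))


# Example invocation, e.g. as triggered by a platform event:
print(molecule({"sensor": "temp1", "value": "31.5"}))  # -> fan1 on
```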

This leads to the requirement of teaching computing paradigms in universities, and of performing research on them in order to gain a better understanding and a more systematic and notational representation. Beyond cloud, edge and fog computing, we are now confronted with mist, fluid and dew computing as well as osmotic computing, and we do not know why and when each should be used, or whether any of them should be included in a bachelor's or master's curriculum. This poses a severe macro-scientific problem in terms of systematic and traceable problem solving.
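One conceivable first step towards such a systematic representation would be a machine-readable taxonomy of paradigms. The sketch below is a hypothetical strawman, not an established classification; the placement tiers and latency figures are assumptions chosen purely for illustration.

```python
# Hypothetical strawman taxonomy of computing paradigms; placement tiers
# and latency bounds are illustrative assumptions, not established values.
from dataclasses import dataclass


@dataclass
class Paradigm:
    name: str
    placement: str            # where computation predominantly runs
    typical_latency_ms: int   # assumed order of magnitude, for illustration


PARADIGMS = [
    Paradigm("cloud", "remote data centre",         100),
    Paradigm("fog",   "local network aggregation",   20),
    Paradigm("edge",  "gateway near the sensors",    10),
    Paradigm("mist",  "on the sensing device",        1),
    Paradigm("dew",   "offline-capable client",       1),
]

# A representation like this would at least let curricula and papers
# refer to paradigms by comparable, documented attributes.
for p in PARADIGMS:
    print(f"{p.name:>5}: {p.placement} (~{p.typical_latency_ms} ms)")
```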

Related to these questions is the need for new notation formats to properly document new applications without ambiguity. There are few suitable notation formats covering implementation architectures, including a rigorous specification of which runtime technologies and external service references are used. The implication is that researchers are not in a position to judge whether a proposed idea is really feasible and novel, which in turn limits their ability to enable co-innovation with a company.
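Absent an established notation, one can at least imagine what such a descriptor might capture. The following sketch expresses a hypothetical format as Python data structures; all field names and example values are assumptions, not a proposed standard.

```python
# Hypothetical application descriptor capturing runtime technologies and
# external service references; all field names are illustrative assumptions.
from dataclasses import dataclass, field


@dataclass
class ServiceReference:
    name: str        # e.g. an object storage bucket or message queue
    provider: str    # external provider the application depends on
    interface: str   # protocol or API used


@dataclass
class ArchitectureDescriptor:
    application: str
    runtimes: list[str]                       # runtime technologies in use
    services: list[ServiceReference] = field(default_factory=list)


descriptor = ArchitectureDescriptor(
    application="sensor-analytics",
    runtimes=["python3.11", "nodejs20"],
    services=[ServiceReference("readings-bucket", "example-cloud",
                               "s3-compatible")],
)

# With such a rigorous description, reviewers could check feasibility and
# novelty claims against the concrete technologies an idea depends on.
print(descriptor)
```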

Such computing trends, and the evolving requirements to introduce new formats, techniques and tools, need to be analysed carefully yet thoroughly. After all, some of the paradigms may offer more suitable abstractions than the current cloud computing consensus architectures. Eventually, we will need to determine an innovation alignment between the long-term needs of users and the long-term enablement through research funding, including for applied institutions. From an SME perspective, it is clear that royalty-free access to many more innovative building blocks is needed already today to become competitive, and even more so in the coming years.

