For the third time in a row we attended ROSCon, this year held in beautiful Vancouver.
Besides seeing the newest trends in the ROS and robotics universe first-hand and finding new robotic hardware directly from manufacturers, our goal was to support our partners from Rapyuta Robotics (RR) in presenting and performing a demo of the first preview of their upcoming Cloud Robotics Platform.
In the context of the ECRP Project, we need to orchestrate intercommunicating components and services running on robots and in the cloud. The communication between these components relies on several protocols, spanning L7 as well as L4 protocols such as TCP and UDP.
One of the solutions we are testing as the base technology for the ECRP cloud platform is OpenShift. As a proof of concept, we wanted to test TCP connectivity to components deployed in our OpenShift 1.3 cluster. We chose to run two RabbitMQ instances and make them accessible from the Internet to act as TCP endpoints for incoming robot connections.
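To make each RabbitMQ instance addressable inside the cluster, one would typically put a Kubernetes Service in front of it that exposes the AMQP port. A minimal sketch of such a Service (the names and labels here are invented for illustration, not taken from our actual deployment) could look like this:

```yaml
# Hypothetical Service for one of the two RabbitMQ instances
apiVersion: v1
kind: Service
metadata:
  name: rabbitmq-1
spec:
  selector:
    app: rabbitmq-1        # matches the pods of this RabbitMQ instance
  ports:
    - name: amqp
      port: 5672           # standard AMQP client port
      targetPort: 5672
```

A second Service (e.g. `rabbitmq-2`) would front the other instance; the remaining problem, discussed below, is getting raw TCP traffic from outside the cluster to the right Service.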
The concept of a “route” in OpenShift exists to enable connections from outside the cluster to services and containers. Unfortunately, the default router component in OpenShift only supports HTTP/HTTPS traffic and hence cannot natively support our intended use case. However, OpenShift routing can be extended with so-called “custom routers”.
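The key idea behind SNI routing is that a router can pick the right backend for a TLS-over-TCP connection without terminating TLS: the Server Name Indication extension sits in cleartext in the very first ClientHello bytes. As a minimal illustration (this is not our router implementation, and the hostname is made up), the following Python sketch generates a real ClientHello via in-memory BIOs and extracts the SNI hostname from the raw bytes:

```python
import ssl

def extract_sni(client_hello):
    """Parse the server_name extension out of a raw TLS ClientHello record."""
    # TLS record header: type(1) + version(2) + length(2); 0x16 = handshake
    if len(client_hello) < 5 or client_hello[0] != 0x16:
        return None
    pos = 5
    # Handshake header: type(1) + length(3); 0x01 = ClientHello
    if client_hello[pos] != 0x01:
        return None
    pos += 4
    pos += 2 + 32                          # legacy_version + random
    pos += 1 + client_hello[pos]           # session_id (length-prefixed)
    cs_len = int.from_bytes(client_hello[pos:pos + 2], "big")
    pos += 2 + cs_len                      # cipher_suites
    pos += 1 + client_hello[pos]           # compression_methods
    ext_total = int.from_bytes(client_hello[pos:pos + 2], "big")
    pos += 2
    end = pos + ext_total
    while pos + 4 <= end:                  # walk the extension list
        ext_type = int.from_bytes(client_hello[pos:pos + 2], "big")
        ext_len = int.from_bytes(client_hello[pos + 2:pos + 4], "big")
        pos += 4
        if ext_type == 0:                  # 0 = server_name extension
            # list length(2) + name type(1) + name length(2) + name
            name_len = int.from_bytes(client_hello[pos + 3:pos + 5], "big")
            return client_hello[pos + 5:pos + 5 + name_len].decode("ascii")
        pos += ext_len
    return None

# Generate a genuine ClientHello without any network I/O, using memory BIOs.
ctx = ssl.create_default_context()
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE
incoming, outgoing = ssl.MemoryBIO(), ssl.MemoryBIO()
tls = ctx.wrap_bio(incoming, outgoing,
                   server_hostname="rabbitmq-1.example.com")  # invented name
try:
    tls.do_handshake()
except ssl.SSLWantReadError:
    pass                                   # expected: no peer has answered yet
hello = outgoing.read()
print(extract_sni(hello))                  # rabbitmq-1.example.com
```

A TCP router built on this principle peeks at these bytes, chooses the backend (e.g. one of the two RabbitMQ endpoints) based on the hostname, and then simply proxies the connection onward, leaving the TLS session itself untouched.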
This blog post will lead you through the process of creating and deploying a custom router supporting TCP traffic and SNI routing in OpenShift.
In the context of the ECRP Project, which is part of our cloud robotics initiative, we are aiming to build a PaaS solution for robotic applications.
The “Robot Operating System” (ROS) is widely used on several robotics platforms, and also runs on the TurtleBot robots in our lab. One of the ideas behind cloud robotics is to enable ROS components (so-called ROS nodes) to run distributed across the cloud infrastructure and the robot itself, so we can shift certain parts of a robotics application to the cloud. As a logical first step, we tried to run existing ROS nodes, such as a ROS master, in containers on Kubernetes; then we moved to a proper Platform as a Service (PaaS) solution, in our case Red Hat OpenShift.
OpenShift offers a full PaaS experience: you can build and run code from source or run pre-built containers directly, and all of these features can be managed via an intuitive web interface.
However, OpenShift imposes tight security restrictions on the containers it runs.
Tobias is an assistant researcher at the ZHAW Service Prototyping Lab. He completed his Bachelor's degree in computer science at ZHAW in 2016 and now works on the cloud robotics initiative, which aims to connect the world of robotics with cloud computing. He likes to challenge himself and try out all sorts of new technologies, such as the Google Tango platform, which he used in his Bachelor's thesis. During his studies he learned the basics of cloud computing and is now eager to dive into the details. Besides work, he is a passionate skier and loves to be in the mountains.