{"id":12344,"date":"2019-04-08T12:05:31","date_gmt":"2019-04-08T10:05:31","guid":{"rendered":"https:\/\/blog.zhaw.ch\/icclab\/?p=12344"},"modified":"2021-03-04T18:45:29","modified_gmt":"2021-03-04T16:45:29","slug":"running-the-icclab-ros-kinetic-environment-on-your-own-laptop","status":"publish","type":"post","link":"https:\/\/blog.zhaw.ch\/icclab\/running-the-icclab-ros-kinetic-environment-on-your-own-laptop\/","title":{"rendered":"Running the ICCLab ROS Kinetic environment on your own laptop"},"content":{"rendered":"\n<p>As we are making progress on the development of robotic applications in our lab, we experience benefits from providing an easy-to-deploy common ROS Kinetic environment for our developers so that there is no initial setup time needed before starting working on the real code. At the same time, any interested users that would like to test and navigate our code implementations could do this with a few commands. One <em>git clone <\/em>command is now enough to download our up-to-date<em> repository<\/em> to your local computer and run our ROS kinetic environment including a workspace with the current ROS projects. <\/p>\n\n\n\n<p>To reach this goal we created a container that includes the ROS Kinetic distribution,  all needed dependencies and software packages needed for our projects. No additional installation or configuration steps are needed before testing our applications. 
The reference git repository can be found at <a href=\"https:\/\/github.com\/icclab\/rosdocked-irlab\">https:\/\/github.com\/icclab\/rosdocked-irlab<\/a>.<\/p>\n\n\n\n<!--more-->\n\n\n\n<p>After cloning the repository on your laptop, you can run the ROS Kinetic environment, including the workspace and projects, with two simple commands:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>cd workspace_included\n.\/run-with-dev.sh<\/code><\/pre>\n\n\n\n<p>This will pull the <em>robopaas\/rosdocked-kinetic-workspace-included<\/em> container to your laptop and start it with access to your X server.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">The two projects you can test<\/h2>\n\n\n\n<p>Once inside the container, you will have everything needed to test and play around with the two projects we are currently working on: <em>robot navigation<\/em> and <em>pick&amp;place<\/em>. Both projects are based on our recently acquired hardware: a <a href=\"https:\/\/www.robotnik.eu\/mobile-robots\/summit-xl-steel\/\">SUMMIT-XL Steel<\/a> from <a href=\"https:\/\/www.robotnik.eu\/\">Robotnik<\/a>, equipped with a <a href=\"https:\/\/www.universal-robots.com\/products\/ur5-robot\/\">Universal Robots UR5<\/a> arm and a <a href=\"https:\/\/schunk.com\/de_en\/gripping-systems\/series\/co-act-egp-c\/\">Schunk Co-act EGP-C 40<\/a> gripper (see the picture below). In addition, we mounted an <a href=\"https:\/\/click.intel.com\/intelr-realsensetm-depth-camera-d435.html\">Intel Realsense D435<\/a> camera on the UR5 arm and two <a href=\"https:\/\/www.sparkfun.com\/products\/retired\/14117\">Scanse Sweep LIDARs<\/a> on <a href=\"https:\/\/grabcad.com\/library\/lidar-mount-for-summit-xl-steel-1\">3D-printed mounts<\/a>. 
Please have a look at our previous blog post for more details about the <a href=\"https:\/\/blog.zhaw.ch\/icclab\/configuring-the-ros-navigation-stack-on-a-new-robot\/#more-12279\">robot setup and configuration<\/a>.<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"768\" src=\"https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/03\/20190304_165700-1024x768.jpg\" alt=\"Summit_xl and Intel Realsense camera\" class=\"wp-image-12265\" srcset=\"https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/03\/20190304_165700-1024x768.jpg 1024w, https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/03\/20190304_165700-300x225.jpg 300w, https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/03\/20190304_165700-768x576.jpg 768w, https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/03\/20190304_165700-400x300.jpg 400w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><figcaption>Summit_xl, with UR5 arm and Schunk gripper<\/figcaption><\/figure>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>robot navigation project<\/strong><\/h3>\n\n\n\n<p>You can test our robot navigation project by launching a single launch file from the <em>icclab_summit_xl <\/em>project in the container:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>roslaunch icclab_summit_xl irlab_sim_summit_xls_amcl.launch<\/code><\/pre>\n\n\n\n<p>A Gazebo simulation environment will be started with an indoor simulated scenario where the <em>Summit_xl<\/em> robot can be moved around. Additionally Rviz will be launched for visualization of the Gazebo data (see picture below). 
<\/p>\n\n\n\n<ul class=\"wp-block-gallery columns-2 is-cropped wp-block-gallery-1 is-layout-flex wp-block-gallery-is-layout-flex\"><li class=\"blocks-gallery-item\"><figure><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"576\" src=\"https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/04\/Screenshot-from-2019-04-08-10-32-35-1024x576.png\" alt=\"\" data-id=\"12356\" class=\"wp-image-12356\" srcset=\"https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/04\/Screenshot-from-2019-04-08-10-32-35-1024x576.png 1024w, https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/04\/Screenshot-from-2019-04-08-10-32-35-300x169.png 300w, https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/04\/Screenshot-from-2019-04-08-10-32-35-768x432.png 768w, https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/04\/Screenshot-from-2019-04-08-10-32-35-500x281.png 500w, https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/04\/Screenshot-from-2019-04-08-10-32-35.png 1920w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure><\/li><li class=\"blocks-gallery-item\"><figure><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"576\" src=\"https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/04\/Screenshot-from-2019-04-08-10-32-28-1024x576.png\" alt=\"\" data-id=\"12357\" data-link=\"https:\/\/blog.zhaw.ch\/icclab\/?attachment_id=12357\" class=\"wp-image-12357\" srcset=\"https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/04\/Screenshot-from-2019-04-08-10-32-28-1024x576.png 1024w, https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/04\/Screenshot-from-2019-04-08-10-32-28-300x169.png 300w, https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/04\/Screenshot-from-2019-04-08-10-32-28-768x432.png 768w, https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/04\/Screenshot-from-2019-04-08-10-32-28-500x281.png 500w, https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/04\/Screenshot-from-2019-04-08-10-32-28.png 1920w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure><\/li><\/ul>\n\n\n\n<p>By selecting the 2D Nav Goal top bar option in Rviz it is 
possible to set a navigation goal on the map. The robot will then plan a path towards the goal, avoiding obstacles sensed through the LIDAR scans. If a viable path is found, the robot will move accordingly.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>Pick&amp;Place project<\/strong><\/h3>\n\n\n\n<p>You can test our <em>pick&amp;place<\/em> application by calling another launch file from the <em>icclab_summit_xl<\/em> project, which is part of the workspace in the container:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>roslaunch icclab_summit_xl irlab_sim_summit_xls_grasping.launch<\/code><\/pre>\n\n\n\n<p>In this case, a Gazebo simulation environment will be started with an empty-world scenario containing the <em>Summit_xl<\/em> robot and a sample object to be grasped placed in front of it (since the deployed gripper opens only 1.8 cm, the selected object is quite small). Rviz will again be launched for visualization of the Gazebo data (see picture below), with MoveIt configured for the arm movement. 
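Both the 2D Nav Goal set in Rviz and the arm goals used by MoveIt are ultimately a position plus an orientation quaternion, and when scripting such goals (e.g. for move_base) a planar heading has to be converted into a quaternion. Here is a minimal, self-contained sketch of that conversion; the function names are our own illustration, not taken from the icclab_summit_xl scripts:

```python
import math

def yaw_to_quaternion(yaw):
    # For a rotation about the z-axis only, the quaternion is
    # (0, 0, sin(yaw/2), cos(yaw/2)).
    return (0.0, 0.0, math.sin(yaw / 2.0), math.cos(yaw / 2.0))

def make_nav_goal(x, y, yaw, frame='map'):
    # Build a plain-dict goal; in a real ROS node this data would
    # populate a geometry_msgs/PoseStamped sent to move_base.
    qx, qy, qz, qw = yaw_to_quaternion(yaw)
    return {
        'frame_id': frame,
        'position': (x, y, 0.0),
        'orientation': (qx, qy, qz, qw),
    }

# Example: a goal 2 m from the map origin, facing 90 degrees left.
goal = make_nav_goal(2.0, 0.0, math.pi / 2)
```

In a running container the same data would be wrapped in the corresponding ROS message types and sent through the move_base action interface.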
<\/p>\n\n\n\n<ul class=\"wp-block-gallery columns-2 is-cropped wp-block-gallery-2 is-layout-flex wp-block-gallery-is-layout-flex\"><li class=\"blocks-gallery-item\"><figure><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"576\" src=\"https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/04\/Screenshot-from-2019-04-08-10-51-56-1024x576.png\" alt=\"\" data-id=\"12362\" data-link=\"https:\/\/blog.zhaw.ch\/icclab\/?attachment_id=12362\" class=\"wp-image-12362\" srcset=\"https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/04\/Screenshot-from-2019-04-08-10-51-56-1024x576.png 1024w, https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/04\/Screenshot-from-2019-04-08-10-51-56-300x169.png 300w, https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/04\/Screenshot-from-2019-04-08-10-51-56-768x432.png 768w, https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/04\/Screenshot-from-2019-04-08-10-51-56-500x281.png 500w, https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/04\/Screenshot-from-2019-04-08-10-51-56.png 1920w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure><\/li><li class=\"blocks-gallery-item\"><figure><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"576\" src=\"https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/04\/Screenshot-from-2019-04-08-10-51-46-1024x576.png\" alt=\"\" data-id=\"12363\" data-link=\"https:\/\/blog.zhaw.ch\/icclab\/?attachment_id=12363\" class=\"wp-image-12363\" srcset=\"https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/04\/Screenshot-from-2019-04-08-10-51-46-1024x576.png 1024w, https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/04\/Screenshot-from-2019-04-08-10-51-46-300x169.png 300w, https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/04\/Screenshot-from-2019-04-08-10-51-46-768x432.png 768w, https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/04\/Screenshot-from-2019-04-08-10-51-46-500x281.png 500w, https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/04\/Screenshot-from-2019-04-08-10-51-46.png 1920w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" 
\/><\/figure><\/li><\/ul>\n\n\n\n<p>As visible from the Rviz visualization picture above, an <a href=\"http:\/\/wiki.ros.org\/octomap\">octomap<\/a> is configured for collision avoidance in the arm movements. The octomap is built based on the pointcloud received from the camera mounted on the arm. A first simple test to see the UR5 arm moving, is to define a goal for the end-effector of the arm and make <a href=\"https:\/\/moveit.ros.org\/\">moveit<\/a> plan a possible path. If a plan is found it can be executed and see the resulting arm movement. <\/p>\n\n\n\n<p>To test our own python scripts for the pick&amp;place application, you can run the following commands within the container:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>cd catkin_ws\/src\/icclab_summit_xl\/scripts\npython pick_and_place_summit_simulation.py<\/code><\/pre>\n\n\n\n<p>The python script will move the arm towards an initial position so that the object to be grasped can be seen with the front and the arm-mounted cameras. A pointcloud will be built based on the pointcloud from both cameras. Based on the resulting pointcloud, the object to grasp will be identified and a number of possible poses will be found for the gripper to grasp the object. Then moveit will look for a collision-free movement plan to grasp the object. If all of these steps are successfully executed, the object will be grasped and a new movement plan will be computed for placing the object on top of the robot (note that this last step might require some more time as we are adding orientation constraints to the object placement). 
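The grasp-pose search in the script works on the camera pointcloud, but the underlying idea can be illustrated without ROS: sample a set of candidate gripper poses around the object and ask the planner to reach them one by one until a collision-free plan is found. A simplified, self-contained sketch; the sampling scheme and the names are illustrative, not the actual implementation:

```python
import math

def candidate_grasp_poses(obj_x, obj_y, obj_z, n=8, standoff=0.10):
    # Sample n candidate gripper poses on a horizontal circle around
    # the object; standoff is the pre-grasp distance (metres) from the
    # object centre. Each pose is (position, yaw pointing at the object).
    poses = []
    for i in range(n):
        angle = 2.0 * math.pi * i / n
        x = obj_x + standoff * math.cos(angle)
        y = obj_y + standoff * math.sin(angle)
        yaw = angle + math.pi  # face back towards the object
        poses.append(((x, y, obj_z), yaw))
    return poses

# Eight approach candidates around an object 0.5 m in front of the robot.
candidates = candidate_grasp_poses(0.5, 0.0, 0.2)
```

In the real pipeline the candidates come from the pointcloud and are filtered first (for instance, the grasped face must fit within the 1.8 cm gripper opening); MoveIt is then asked for a collision-free plan to each remaining pose until one succeeds.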
You can watch a video of the pick&amp;place simulation you can perform with our project <a href=\"https:\/\/drive.google.com\/file\/d\/1oGHSzNYxg79vnK0K8GTVuBIBajxkC6SD\/view?usp=sharing\">below<\/a>.<\/p>\n\n\n<figure class=\"wp-block-embed-youtube wp-block-embed is-type-video is-provider-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"lyte-wrapper\" title=\"Summit grasping simulation\" style=\"width:640px;max-width:100%;margin:5px;\"><div class=\"lyMe\" id=\"WYL_UhN8KWOTRWQ\"><div id=\"lyte_UhN8KWOTRWQ\" data-src=\"https:\/\/blog.zhaw.ch\/icclab\/wp-content\/plugins\/wp-youtube-lyte\/lyteCache.php?origThumbUrl=https%3A%2F%2Fi.ytimg.com%2Fvi%2FUhN8KWOTRWQ%2Fhqdefault.jpg\" class=\"pL\"><div class=\"tC\"><div class=\"tT\">Summit grasping simulation<\/div><\/div><div class=\"play\"><\/div><div class=\"ctrl\"><div class=\"Lctrl\"><\/div><div class=\"Rctrl\"><\/div><\/div><\/div><noscript><a href=\"https:\/\/youtu.be\/UhN8KWOTRWQ\" rel=\"nofollow\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/blog.zhaw.ch\/icclab\/wp-content\/plugins\/wp-youtube-lyte\/lyteCache.php?origThumbUrl=https%3A%2F%2Fi.ytimg.com%2Fvi%2FUhN8KWOTRWQ%2F0.jpg\" alt=\"Summit grasping simulation\" width=\"640\" height=\"340\" \/><br \/>Watch this video on YouTube<\/a><\/noscript><\/div><\/div><div class=\"lL\" style=\"max-width:100%;width:640px;margin:5px;\"><\/div><figcaption><\/figcaption><\/figure>\n\n\n<p>As stated earlier, our default simulation setup mirrors our acquired hardware and therefore uses a Schunk gripper. 
However, you can also simulate a Robotiq gripper for the given robot configuration by changing a parameter when launching the project and using a second Python script, as shown below:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>roslaunch icclab_summit_xl irlab_sim_summit_xls_grasping.launch robotiq_gripper:=true\ncd catkin_ws\/src\/icclab_summit_xl\/scripts\npython pick_and_place_summit_simulation_robotiq.py<\/code><\/pre>\n\n\n\n<div class=\"pt-sm\">Tags: <a href=\"https:\/\/blog.zhaw.ch\/icclab\/tag\/autonomous-driving\/\">autonomous driving<\/a>, <a href=\"https:\/\/blog.zhaw.ch\/icclab\/tag\/grasping\/\">grasping<\/a>, <a href=\"https:\/\/blog.zhaw.ch\/icclab\/tag\/iccla\/\">iccla<\/a>, <a href=\"https:\/\/blog.zhaw.ch\/icclab\/tag\/navigation-stack\/\">navigation stack<\/a>, <a href=\"https:\/\/blog.zhaw.ch\/icclab\/tag\/robotics\/\">robotics<\/a>, <a href=\"https:\/\/blog.zhaw.ch\/icclab\/tag\/ros\/\">ROS<\/a>, <a href=\"https:\/\/blog.zhaw.ch\/icclab\/tag\/summit\/\">Summit<\/a><br><\/div>","protected":false},"excerpt":{"rendered":"<p>As we are making progress on the development of robotic applications in our lab, we experience benefits from providing an easy-to-deploy common ROS Kinetic environment for our developers so that there is no initial setup time needed before starting working on the real code. 
At the same time, any interested users that would like to [&hellip;]<\/p>\n","protected":false},"author":400,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"ngg_post_thumbnail":0,"footnotes":""},"categories":[1,943],"tags":[929,932,934,928,692,741,814],"features":[],"class_list":["post-12344","post","type-post","status-publish","format-standard","hentry","category-allgemein","category-cloud-robotics-articles","tag-autonomous-driving","tag-grasping","tag-iccla","tag-navigation-stack","tag-robotics","tag-ros","tag-summit"],"yoast_head":"<!-- This site is optimized with the Yoast SEO Premium plugin v27.2 (Yoast SEO v27.2) - https:\/\/yoast.com\/product\/yoast-seo-premium-wordpress\/ -->\n<title>Running the ICCLab ROS Kinetic environment on your own laptop - Service Engineering (ICCLab &amp; SPLab)<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/blog.zhaw.ch\/icclab\/running-the-icclab-ros-kinetic-environment-on-your-own-laptop\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Running the ICCLab ROS Kinetic environment on your own laptop\" \/>\n<meta property=\"og:description\" content=\"As we are making progress on the development of robotic applications in our lab, we experience benefits from providing an easy-to-deploy common ROS Kinetic environment for our developers so that there is no initial setup time needed before starting working on the real code. 
At the same time, any interested users that would like to [&hellip;]\" \/>\n<meta property=\"og:url\" content=\"https:\/\/blog.zhaw.ch\/icclab\/running-the-icclab-ros-kinetic-environment-on-your-own-laptop\/\" \/>\n<meta property=\"og:site_name\" content=\"Service Engineering (ICCLab &amp; SPLab)\" \/>\n<meta property=\"article:published_time\" content=\"2019-04-08T10:05:31+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2021-03-04T16:45:29+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/03\/20190304_165700-1024x768.jpg\" \/>\n<meta name=\"author\" content=\"milt\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"milt\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"5 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/blog.zhaw.ch\/icclab\/running-the-icclab-ros-kinetic-environment-on-your-own-laptop\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/blog.zhaw.ch\/icclab\/running-the-icclab-ros-kinetic-environment-on-your-own-laptop\/\"},\"author\":{\"name\":\"milt\",\"@id\":\"https:\/\/blog.zhaw.ch\/icclab\/#\/schema\/person\/3c82a3b5b55efe2621071e338371ee82\"},\"headline\":\"Running the ICCLab ROS Kinetic environment on your own 
laptop\",\"datePublished\":\"2019-04-08T10:05:31+00:00\",\"dateModified\":\"2021-03-04T16:45:29+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/blog.zhaw.ch\/icclab\/running-the-icclab-ros-kinetic-environment-on-your-own-laptop\/\"},\"wordCount\":854,\"commentCount\":2,\"image\":{\"@id\":\"https:\/\/blog.zhaw.ch\/icclab\/running-the-icclab-ros-kinetic-environment-on-your-own-laptop\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/03\/20190304_165700-1024x768.jpg\",\"keywords\":[\"autonomous driving\",\"grasping\",\"iccla\",\"navigation stack\",\"robotics\",\"ROS\",\"Summit\"],\"articleSection\":[\"*.*\",\"Cloud Robotics\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\/\/blog.zhaw.ch\/icclab\/running-the-icclab-ros-kinetic-environment-on-your-own-laptop\/#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/blog.zhaw.ch\/icclab\/running-the-icclab-ros-kinetic-environment-on-your-own-laptop\/\",\"url\":\"https:\/\/blog.zhaw.ch\/icclab\/running-the-icclab-ros-kinetic-environment-on-your-own-laptop\/\",\"name\":\"Running the ICCLab ROS Kinetic environment on your own laptop - Service Engineering (ICCLab &amp; 
SPLab)\",\"isPartOf\":{\"@id\":\"https:\/\/blog.zhaw.ch\/icclab\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/blog.zhaw.ch\/icclab\/running-the-icclab-ros-kinetic-environment-on-your-own-laptop\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/blog.zhaw.ch\/icclab\/running-the-icclab-ros-kinetic-environment-on-your-own-laptop\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/03\/20190304_165700-1024x768.jpg\",\"datePublished\":\"2019-04-08T10:05:31+00:00\",\"dateModified\":\"2021-03-04T16:45:29+00:00\",\"author\":{\"@id\":\"https:\/\/blog.zhaw.ch\/icclab\/#\/schema\/person\/3c82a3b5b55efe2621071e338371ee82\"},\"breadcrumb\":{\"@id\":\"https:\/\/blog.zhaw.ch\/icclab\/running-the-icclab-ros-kinetic-environment-on-your-own-laptop\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/blog.zhaw.ch\/icclab\/running-the-icclab-ros-kinetic-environment-on-your-own-laptop\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/blog.zhaw.ch\/icclab\/running-the-icclab-ros-kinetic-environment-on-your-own-laptop\/#primaryimage\",\"url\":\"https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/03\/20190304_165700.jpg\",\"contentUrl\":\"https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/03\/20190304_165700.jpg\",\"width\":4032,\"height\":3024,\"caption\":\"Summit_xl and Intel Realsense camera\"},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/blog.zhaw.ch\/icclab\/running-the-icclab-ros-kinetic-environment-on-your-own-laptop\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Startseite\",\"item\":\"https:\/\/blog.zhaw.ch\/icclab\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Running the ICCLab ROS Kinetic environment on your own laptop\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/blog.zhaw.ch\/icclab\/#website\",\"url\":\"https:\/\/blog.zhaw.ch\/icclab\/\",\"name\":\"Service Engineering (ICCLab &amp; 
SPLab)\",\"description\":\"A Blog of the ZHAW Zurich University of Applied Sciences\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/blog.zhaw.ch\/icclab\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Person\",\"@id\":\"https:\/\/blog.zhaw.ch\/icclab\/#\/schema\/person\/3c82a3b5b55efe2621071e338371ee82\",\"name\":\"milt\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/secure.gravatar.com\/avatar\/123e7c17511d6676a346322fb4f15f5b0544094b0b6f4779ae2f6e5cf1bbf2f8?s=96&d=mm&r=g\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/123e7c17511d6676a346322fb4f15f5b0544094b0b6f4779ae2f6e5cf1bbf2f8?s=96&d=mm&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/123e7c17511d6676a346322fb4f15f5b0544094b0b6f4779ae2f6e5cf1bbf2f8?s=96&d=mm&r=g\",\"caption\":\"milt\"},\"url\":\"https:\/\/blog.zhaw.ch\/icclab\/author\/milt\/\"}]}<\/script>\n<!-- \/ Yoast SEO Premium plugin. -->","yoast_head_json":{"title":"Running the ICCLab ROS Kinetic environment on your own laptop - Service Engineering (ICCLab &amp; SPLab)","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/blog.zhaw.ch\/icclab\/running-the-icclab-ros-kinetic-environment-on-your-own-laptop\/","og_locale":"en_US","og_type":"article","og_title":"Running the ICCLab ROS Kinetic environment on your own laptop","og_description":"As we are making progress on the development of robotic applications in our lab, we experience benefits from providing an easy-to-deploy common ROS Kinetic environment for our developers so that there is no initial setup time needed before starting working on the real code. 
At the same time, any interested users that would like to [&hellip;]","og_url":"https:\/\/blog.zhaw.ch\/icclab\/running-the-icclab-ros-kinetic-environment-on-your-own-laptop\/","og_site_name":"Service Engineering (ICCLab &amp; SPLab)","article_published_time":"2019-04-08T10:05:31+00:00","article_modified_time":"2021-03-04T16:45:29+00:00","og_image":[{"url":"https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/03\/20190304_165700-1024x768.jpg","type":"","width":"","height":""}],"author":"milt","twitter_card":"summary_large_image","twitter_misc":{"Written by":"milt","Est. reading time":"5 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/blog.zhaw.ch\/icclab\/running-the-icclab-ros-kinetic-environment-on-your-own-laptop\/#article","isPartOf":{"@id":"https:\/\/blog.zhaw.ch\/icclab\/running-the-icclab-ros-kinetic-environment-on-your-own-laptop\/"},"author":{"name":"milt","@id":"https:\/\/blog.zhaw.ch\/icclab\/#\/schema\/person\/3c82a3b5b55efe2621071e338371ee82"},"headline":"Running the ICCLab ROS Kinetic environment on your own laptop","datePublished":"2019-04-08T10:05:31+00:00","dateModified":"2021-03-04T16:45:29+00:00","mainEntityOfPage":{"@id":"https:\/\/blog.zhaw.ch\/icclab\/running-the-icclab-ros-kinetic-environment-on-your-own-laptop\/"},"wordCount":854,"commentCount":2,"image":{"@id":"https:\/\/blog.zhaw.ch\/icclab\/running-the-icclab-ros-kinetic-environment-on-your-own-laptop\/#primaryimage"},"thumbnailUrl":"https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/03\/20190304_165700-1024x768.jpg","keywords":["autonomous driving","grasping","iccla","navigation stack","robotics","ROS","Summit"],"articleSection":["*.*","Cloud 
Robotics"],"inLanguage":"en-US","potentialAction":[{"@type":"CommentAction","name":"Comment","target":["https:\/\/blog.zhaw.ch\/icclab\/running-the-icclab-ros-kinetic-environment-on-your-own-laptop\/#respond"]}]},{"@type":"WebPage","@id":"https:\/\/blog.zhaw.ch\/icclab\/running-the-icclab-ros-kinetic-environment-on-your-own-laptop\/","url":"https:\/\/blog.zhaw.ch\/icclab\/running-the-icclab-ros-kinetic-environment-on-your-own-laptop\/","name":"Running the ICCLab ROS Kinetic environment on your own laptop - Service Engineering (ICCLab &amp; SPLab)","isPartOf":{"@id":"https:\/\/blog.zhaw.ch\/icclab\/#website"},"primaryImageOfPage":{"@id":"https:\/\/blog.zhaw.ch\/icclab\/running-the-icclab-ros-kinetic-environment-on-your-own-laptop\/#primaryimage"},"image":{"@id":"https:\/\/blog.zhaw.ch\/icclab\/running-the-icclab-ros-kinetic-environment-on-your-own-laptop\/#primaryimage"},"thumbnailUrl":"https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/03\/20190304_165700-1024x768.jpg","datePublished":"2019-04-08T10:05:31+00:00","dateModified":"2021-03-04T16:45:29+00:00","author":{"@id":"https:\/\/blog.zhaw.ch\/icclab\/#\/schema\/person\/3c82a3b5b55efe2621071e338371ee82"},"breadcrumb":{"@id":"https:\/\/blog.zhaw.ch\/icclab\/running-the-icclab-ros-kinetic-environment-on-your-own-laptop\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/blog.zhaw.ch\/icclab\/running-the-icclab-ros-kinetic-environment-on-your-own-laptop\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/blog.zhaw.ch\/icclab\/running-the-icclab-ros-kinetic-environment-on-your-own-laptop\/#primaryimage","url":"https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/03\/20190304_165700.jpg","contentUrl":"https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/03\/20190304_165700.jpg","width":4032,"height":3024,"caption":"Summit_xl and Intel Realsense 
camera"},{"@type":"BreadcrumbList","@id":"https:\/\/blog.zhaw.ch\/icclab\/running-the-icclab-ros-kinetic-environment-on-your-own-laptop\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Startseite","item":"https:\/\/blog.zhaw.ch\/icclab\/"},{"@type":"ListItem","position":2,"name":"Running the ICCLab ROS Kinetic environment on your own laptop"}]},{"@type":"WebSite","@id":"https:\/\/blog.zhaw.ch\/icclab\/#website","url":"https:\/\/blog.zhaw.ch\/icclab\/","name":"Service Engineering (ICCLab &amp; SPLab)","description":"A Blog of the ZHAW Zurich University of Applied Sciences","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/blog.zhaw.ch\/icclab\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Person","@id":"https:\/\/blog.zhaw.ch\/icclab\/#\/schema\/person\/3c82a3b5b55efe2621071e338371ee82","name":"milt","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/secure.gravatar.com\/avatar\/123e7c17511d6676a346322fb4f15f5b0544094b0b6f4779ae2f6e5cf1bbf2f8?s=96&d=mm&r=g","url":"https:\/\/secure.gravatar.com\/avatar\/123e7c17511d6676a346322fb4f15f5b0544094b0b6f4779ae2f6e5cf1bbf2f8?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/123e7c17511d6676a346322fb4f15f5b0544094b0b6f4779ae2f6e5cf1bbf2f8?s=96&d=mm&r=g","caption":"milt"},"url":"https:\/\/blog.zhaw.ch\/icclab\/author\/milt\/"}]}},"_links":{"self":[{"href":"https:\/\/blog.zhaw.ch\/icclab\/wp-json\/wp\/v2\/posts\/12344","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/blog.zhaw.ch\/icclab\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/blog.zhaw.ch\/icclab\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/blog.zhaw.ch\/icclab\/wp-json\/wp\/v2\/users\/400"}],"replies":[{"embeddable":true,"href":"https:\/\/blog.zhaw.ch\/icclab\/wp-json\/wp\/v2\/commen
ts?post=12344"}],"version-history":[{"count":31,"href":"https:\/\/blog.zhaw.ch\/icclab\/wp-json\/wp\/v2\/posts\/12344\/revisions"}],"predecessor-version":[{"id":12827,"href":"https:\/\/blog.zhaw.ch\/icclab\/wp-json\/wp\/v2\/posts\/12344\/revisions\/12827"}],"wp:attachment":[{"href":"https:\/\/blog.zhaw.ch\/icclab\/wp-json\/wp\/v2\/media?parent=12344"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/blog.zhaw.ch\/icclab\/wp-json\/wp\/v2\/categories?post=12344"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/blog.zhaw.ch\/icclab\/wp-json\/wp\/v2\/tags?post=12344"},{"taxonomy":"features","embeddable":true,"href":"https:\/\/blog.zhaw.ch\/icclab\/wp-json\/wp\/v2\/features?post=12344"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}