{"id":12279,"date":"2019-03-12T17:20:29","date_gmt":"2019-03-12T15:20:29","guid":{"rendered":"https:\/\/blog.zhaw.ch\/icclab\/?p=12279"},"modified":"2021-03-04T18:45:29","modified_gmt":"2021-03-04T16:45:29","slug":"configuring-the-ros-navigation-stack-on-a-new-robot","status":"publish","type":"post","link":"https:\/\/blog.zhaw.ch\/icclab\/configuring-the-ros-navigation-stack-on-a-new-robot\/","title":{"rendered":"Configuring the ROS Navigation Stack on a new robot"},"content":{"rendered":"\n<p>Our lab has acquired a new robot as part of its ROS based robotic fleet. We opted with the <a href=\"https:\/\/www.robotnik.eu\/mobile-robots\/summit-xl-steel\/\">SUMMIT-XL Steel<\/a> from <a href=\"https:\/\/www.robotnik.eu\/\">Robotnik<\/a>; a behemoth compared to our much-loved TurtleBots.<\/p>\n\n\n\n<div class=\"wp-block-image\"><figure class=\"aligncenter\"><img decoding=\"async\" src=\"https:\/\/www.robotnik.eu\/web\/wp-content\/uploads\/\/2018\/07\/Robotnik_SUMMIT-XL-STEEL-01.jpg\" alt=\"\" \/><figcaption>The Summit-XL Steel is advertised to be a great platform for robotic application that require transporting heavy loads (up to 250 kg) such as warehouse automation (retrieved from <a href=\"https:\/\/www.robotnik.eu\/web\/wp-content\/uploads\/\/2018\/07\/Robotnik_SUMMIT-XL-STEEL-01.jpg\">https:\/\/www.robotnik.eu\/web\/wp-content\/uploads\/\/2018\/07\/Robotnik_SUMMIT-XL-STEEL-01.jpg<\/a>).<br><\/figcaption><\/figure><\/div>\n\n\n\n<!--more-->\n\n\n\n<p>The first vital step for any mobile robot is to setup the ROS navigation stack: the piece of software that gives the robot the ability to autonomously navigate through an environment using data from different sensors. <\/p>\n\n\n\n<p>A major component of the stack is the ROS node <a href=\"http:\/\/wiki.ros.org\/move_base\">move_base<\/a> which provides implementation for the costmaps and planners. 
A <a href=\"http:\/\/wiki.ros.org\/costmap_2d\/\">costmap<\/a> is a grid map where each cell is assigned a specific value or cost: higher costs indicate a smaller distance between the robot and an obstacle. Path-finding is done by a planner, which uses a series of algorithms to find the shortest path while avoiding obstacles. Driving in close proximity to obstacles is optimized by the local costmap and local planner, whereas the full path is optimized by the global costmap and global planner. Together, these components find an optimal path to a given navigation goal in the real world. <\/p>\n\n\n\n<p>Most of the configuration process is spent tuning parameters in YAML files; however, this process is time-consuming and possibly frustrating if a structured approach is not taken and time is not spent reading up on the details of how the stack works. Many helpful tuning guides are already available: <a href=\"http:\/\/wiki.ros.org\/navigation\/Tutorials\/Navigation%20Tuning%20Guide\">Basic Navigation Tuning Guide<\/a> and <a href=\"http:\/\/kaiyuzheng.me\/documents\/navguide.pdf\">ROS Navigation Tuning Guide<\/a> to name a few (we encourage anyone new to the stack to read these thoroughly). Hence, this post aims to give solutions to some less-discussed problems. <\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><font size=\"+2\"><strong>Configuring a 2-LIDAR setup<\/strong><\/font><\/h2>\n\n\n\n<p>To give the robot a full 360-degree view of its surroundings, we initially mounted two <a href=\"https:\/\/www.sparkfun.com\/products\/retired\/14117\">Scanse Sweep LIDARs<\/a> on <a href=\"https:\/\/grabcad.com\/library\/lidar-mount-for-summit-xl-steel-1\">3D-printed mounts<\/a>. One of them recently broke and was replaced with an <a href=\"http:\/\/emanual.robotis.com\/docs\/en\/platform\/turtlebot3\/appendix_lds_01\/\">LDS-01<\/a> laser scanner from one of our TurtleBot3s. Each laser scanner provides 270 degrees of range data, as shown in the diagram below. 
Apart from the LIDARs, the robot came equipped with a front depth camera.<\/p>\n\n\n\n<figure class=\"wp-block-image is-resized\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.roscomponents.com\/815-thickbox_default\/summit-xl-steel.jpg\" alt=\"\" width=\"597\" height=\"681\" \/><figcaption>The front and back laser scanners are located at opposite edges of the robot and each provides a field of view of 270 degrees (retrieved from <a href=\"https:\/\/www.roscomponents.com\/815-thickbox_default\/summit-xl-steel.jpg\">https:\/\/www.roscomponents.com\/815-thickbox_default\/summit-xl-steel.jpg<\/a>).<\/figcaption><\/figure>\n\n\n\n<div class=\"wp-block-image\"><figure class=\"aligncenter is-resized\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/03\/IMG_5372-1-768x1024.jpg\" alt=\"\" class=\"wp-image-12324\" width=\"382\" height=\"517\" \/><figcaption>Rear Scanse Sweep LIDAR on a 3D-printed mount (in blue).<\/figcaption><\/figure><\/div>\n\n\n\n<p>The biggest challenge in setting up our own LIDARs was aligning all three range sensors: the front <a href=\"https:\/\/orbbec3d.com\/product-astra-pro\/\">Orbbec Astra Pro 3D camera<\/a>, the front LIDAR, and the rear LIDAR. The key here is to make precise coordinate measurements of each mounted laser scanner with respect to the origin of the robot, i.e. 
the <code>base_link<\/code> frame.<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"832\" src=\"https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/03\/rviz_screenshot_2019_02_05-16_36_29-1024x832.png\" alt=\"\" class=\"wp-image-12283\" srcset=\"https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/03\/rviz_screenshot_2019_02_05-16_36_29-1024x832.png 1024w, https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/03\/rviz_screenshot_2019_02_05-16_36_29-300x244.png 300w, https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/03\/rviz_screenshot_2019_02_05-16_36_29-768x624.png 768w, https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/03\/rviz_screenshot_2019_02_05-16_36_29-369x300.png 369w, https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/03\/rviz_screenshot_2019_02_05-16_36_29.png 1096w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><figcaption>Screenshot of RViz showing alignment of all three range sensors on the robot: blue points = rear LIDAR, orange points = front LIDAR, and white points = front depth camera.<\/figcaption><\/figure>\n\n\n\n<h2 class=\"wp-block-heading\"><font size=\"+2\">Merging laser scans<strong><\/strong><\/font><\/h2>\n\n\n\n<p>Both <a href=\"http:\/\/wiki.ros.org\/amcl\">AMCL<\/a> and <a href=\"http:\/\/wiki.ros.org\/gmapping\">GMapping<\/a> require as input a single <code>LaserScan<\/code> type message with a single frame which is problematic with a 2-LIDAR setup such as ours. <\/p>\n\n\n\n<p>To solve this issue we used the <code>laserscan_multi_merger<\/code> node from <a href=\"https:\/\/github.com\/iralabdisco\/ira_laser_tools\">ira_laser_tools<\/a> to merge the <code>LaserScan<\/code> topics to <code>scan_combined<\/code> and to the frame <code>base_link<\/code>. 
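<\/p>\n\n\n\n<p>A launch sketch along these lines performs the merge (the two input topic names are placeholders rather than our exact topic names; the parameter names are those of the <code>laserscan_multi_merger<\/code> node):<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>&lt;launch&gt;\n  &lt;node pkg=\"ira_laser_tools\" type=\"laserscan_multi_merger\" name=\"laserscan_multi_merger\" output=\"screen\"&gt;\n    &lt;!-- frame the merged scan is expressed in --&gt;\n    &lt;param name=\"destination_frame\" value=\"base_link\" \/&gt;\n    &lt;!-- topic the merged LaserScan is published on --&gt;\n    &lt;param name=\"scan_destination_topic\" value=\"\/scan_combined\" \/&gt;\n    &lt;!-- space-separated list of input scan topics --&gt;\n    &lt;param name=\"laserscan_topics\" value=\"\/front_laser\/scan \/rear_laser\/scan\" \/&gt;\n  &lt;\/node&gt;\n&lt;\/launch&gt;<\/code><\/pre>\n\n\n\n<p>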
The <a href=\"http:\/\/wiki.ros.org\/topic_tools\/relay\">relay<\/a> ROS node would be insufficient here, since it would simply create a topic that alternately publishes messages from the two incoming <code>LaserScan<\/code> topics.<\/p>\n\n\n\n<p>Note that there is a known bug with the <code>laserscan_multi_merger<\/code> node which sometimes prevents it from subscribing to the specified topics when the node is brought up at the same time as the LIDARs (i.e. in the same launch file). A simple fix we found is to use the ROS package <a href=\"http:\/\/wiki.ros.org\/timed_roslaunch\">timed_roslaunch<\/a>, which can delay the bring-up of the <code>laserscan_multi_merger<\/code> node by a configurable time interval.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><font size=\"+2\">Clearing the 2D obstacle layer (<code>ObstacleLayer<\/code>) on the costmap<\/font><\/h2>\n\n\n\n<p>&#8220;Ghost obstacles&#8221; (as the online community likes to call them) are points on the costmap that indicate no-longer-existing obstacles. This issue was seen with the Scanse Sweep LIDARs and is prevalent among cheaper laser scanners. Many possible reasons exist as to why this occurs, although the most probable cause has to do with the costmap parameter <code>raytrace_range<\/code> (definition provided below) and the <code>max_range<\/code> of the LIDAR.<\/p>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\"><p>raytrace_range &#8211; <strong>The maximum range in meters at which to raytrace out obstacles from the map using sensor data.<\/strong><\/p><cite>source &#8211; <a href=\"http:\/\/wiki.ros.org\/costmap_2d\/hydro\/obstacles\">http:\/\/wiki.ros.org\/costmap_2d\/hydro\/obstacles<\/a><br><br><\/cite><\/blockquote>\n\n\n\n<p>Here\u2019s a simple example to clarify the issue at hand. An obstacle appears in the line of sight of the laser scanner and is marked on the costmap at a distance of 2 meters. 
The obstacle then disappears and the laser scanner returns a distance of 6 meters at the original radial position of the obstacle. Ray tracing, which is set to a max distance of 3 meters, is unable to clear these points, and thus the costmap now contains ghost obstacles. From this example it is clear that <code>raytrace_range<\/code> needs to be set to a slightly higher value than the maximum valid measurement returned by the laser scanner.<\/p>\n\n\n\n<p>If the issue persists, the following are a few other costmap parameters worth looking into:<\/p>\n\n\n\n<ul class=\"wp-block-list\"><li><code>inf_is_valid = true<\/code> &#8211; should be set for sensors that return <code>inf<\/code> for invalid measurements<\/li><li>Specify each observation source to be solely for clearing or solely for marking, e.g. <code>clearing = true; marking = false<\/code><\/li><li><code>always_send_full_costmap = true<\/code><\/li><\/ul>\n\n\n\n<h2 class=\"wp-block-heading\"><font size=\"+2\">Clearing the 3D obstacle layer (<code>VoxelLayer<\/code>) on the costmap<\/font><\/h2>\n\n\n\n<p>Contrary to the obstacle layer discussed above, which does 2D obstacle tracking, the voxel layer is a separate plugin which tracks obstacles in 3D. It uses data from sensors that publish messages of type <code>PointCloud<\/code>; in our case, the front depth camera.<\/p>\n\n\n\n<p>A similar costmap-clearing issue was observed with the voxel layer. Specifically, ghost obstacles tended to linger around the camera\u2019s blind spot. Two solutions were implemented to mitigate this issue.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">#1 Clearing the costmap using recovery behavior:<\/h3>\n\n\n\n<p>Recovery behavior was set up such that it clears the <code>obstacle_3d_layer<\/code> whenever the planner fails to find a plan. Note that recovery behavior is only executed when a navigation goal has been sent, so this solution can only clear the costmap while a goal is being pursued. 
Below are the parameters required for the fix and screenshots of the solution working as intended in simulation.<\/p>\n\n\n\n<p><strong>global_costmap_params_map.yaml\/local_costmap_params.yaml<\/strong><br><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>plugins:\n     - {name: obstacle_3d_layer, type: \"costmap_2d::VoxelLayer\"}\n     - {name: obstacle_2d_layer, type: \"costmap_2d::ObstacleLayer\"}<\/code><\/pre>\n\n\n\n<p><strong>costmap_common_params.yaml <\/strong><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>obstacle_3d_layer:\n&lt;Hidden parameters&gt;\nobstacle_2d_layer:\n&lt;Hidden parameters&gt;<\/code><\/pre>\n\n\n\n<p><strong>move_base_params.yaml<\/strong><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>recovery_behavior_enabled: true\nrecovery_behaviors:\n  - name: 'aggressive_reset'\n    type: 'clear_costmap_recovery\/ClearCostmapRecovery'\n\naggressive_reset:\n  reset_distance: 0.0\n  layer_names: [\"obstacle_3d_layer\"]<\/code><\/pre>\n\n\n\n<figure class=\"wp-block-image is-resized\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/03\/clearing_voxel_layer_1-1-1024x394.png\" alt=\"\" class=\"wp-image-12297\" width=\"584\" height=\"224\" srcset=\"https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/03\/clearing_voxel_layer_1-1-1024x394.png 1024w, https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/03\/clearing_voxel_layer_1-1-300x115.png 300w, https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/03\/clearing_voxel_layer_1-1-768x295.png 768w, https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/03\/clearing_voxel_layer_1-1-500x192.png 500w\" sizes=\"auto, (max-width: 584px) 100vw, 584px\" \/><figcaption>Setup for demoing recovery behavior to clear 3D obstacle layer.<br><\/figcaption><\/figure>\n\n\n\n<figure class=\"wp-block-image\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"394\" src=\"https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/03\/clearing_voxel_layer_2-1-1024x394.png\" alt=\"\" class=\"wp-image-12298\" 
srcset=\"https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/03\/clearing_voxel_layer_2-1-1024x394.png 1024w, https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/03\/clearing_voxel_layer_2-1-300x115.png 300w, https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/03\/clearing_voxel_layer_2-1-768x295.png 768w, https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/03\/clearing_voxel_layer_2-1-500x192.png 500w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><figcaption>Obstacle on costmap persists even though virtual obstacle is no longer in front of depth camera.<br><\/figcaption><\/figure>\n\n\n\n<figure class=\"wp-block-image\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"394\" src=\"https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/03\/clearing_voxel_layer_3-1-1024x394.png\" alt=\"\" class=\"wp-image-12299\" srcset=\"https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/03\/clearing_voxel_layer_3-1-1024x394.png 1024w, https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/03\/clearing_voxel_layer_3-1-300x115.png 300w, https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/03\/clearing_voxel_layer_3-1-768x295.png 768w, https:\/\/blog.zhaw.ch\/icclab\/files\/2019\/03\/clearing_voxel_layer_3-1-500x192.png 500w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><figcaption>Global Planner fails to find a plan so recovery behavior is initiated and deletes the 3D obstacle layer from costmaps.<br><\/figcaption><\/figure>\n\n\n\n<h3 class=\"wp-block-heading\">#2 Replacing default Voxel Layer plugin with Spatio-Temporal Voxel Layer:<\/h3>\n\n\n\n<p>To further alleviate this issue, specifically when the planner does indeed find a valid plan, the <a href=\"http:\/\/wiki.ros.org\/spatio_temporal_voxel_layer\">Spatio-Temporal Voxel Layer<\/a> was implemented to replace the default Voxel Layer costmap plugin. 
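<\/p>\n\n\n\n<p>In the costmap configuration, the swap essentially amounts to changing the plugin type of the 3D layer. A sketch (the parameter values shown are illustrative, not our tuned settings):<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>plugins:\n  - {name: obstacle_3d_layer, type: \"spatio_temporal_voxel_layer\/SpatioTemporalVoxelLayer\"}\n\nobstacle_3d_layer:\n  voxel_decay: 15.0  # seconds before a marked voxel decays\n  decay_model: 0     # linear decay\n  &lt;remaining parameters&gt;<\/code><\/pre>\n\n\n\n<p>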
This improved voxel grid package has a <code>voxel_decay<\/code> parameter which clears 3D obstacles from the costmap progressively over time, eliminating the issue completely if <code>voxel_decay<\/code> is set to 0 seconds (though this is not entirely favorable when there is no rear depth camera).<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/user-images.githubusercontent.com\/14944147\/37010885-b18fe1f8-20bb-11e8-8c28-5b31e65f2844.gif\" alt=\"\" \/><figcaption>Tally robot from <a href=\"https:\/\/www.simberobotics.com\/\">Simbe Robotics<\/a> using the Spatio-Temporal Voxel Layer to mark and clear obstacles (retrieved from <a href=\"https:\/\/user-images.githubusercontent.com\/14944147\/37010885-b18fe1f8-20bb-11e8-8c28-5b31e65f2844.gif\">https:\/\/user-images.githubusercontent.com\/14944147\/37010885-b18fe1f8-20bb-11e8-8c28-5b31e65f2844.gif<\/a>).<\/figcaption><\/figure>\n\n\n\n<h2 class=\"wp-block-heading\"><font size=\"+2\"><strong>Closing Thoughts and Next Steps<\/strong><\/font><\/h2>\n\n\n\n<p>The ROS Navigation Stack is simple to implement regardless of the robot platform and can be highly effective if dedicated time is spent tuning parameters. Issues with the stack will depend on the type of mobile platform and the quality\/type of range sensors used. We hope this blog has provided new insight into solving some of these issues.<\/p>\n\n\n\n<p>In the future we aim to extend the SUMMIT\u2019s navigational capabilities with 3D LIDARs or an additional depth camera for a full 3D view of the environment, web access to navigation control, and real-time updating of a centralized map server. 
<\/p>\n<div class=\"pt-sm\">Schlagw\u00f6rter: <a href=\"https:\/\/blog.zhaw.ch\/icclab\/tag\/autonomous-driving\/\">autonomous driving<\/a>, <a href=\"https:\/\/blog.zhaw.ch\/icclab\/tag\/navigation-stack\/\">navigation stack<\/a>, <a href=\"https:\/\/blog.zhaw.ch\/icclab\/tag\/robotics\/\">robotics<\/a>, <a href=\"https:\/\/blog.zhaw.ch\/icclab\/tag\/ros\/\">ROS<\/a>, <a href=\"https:\/\/blog.zhaw.ch\/icclab\/tag\/summit\/\">Summit<\/a><br><\/div>","protected":false},"excerpt":{"rendered":"<p>Our lab has acquired a new robot as part of its ROS based robotic fleet. We opted with the SUMMIT-XL Steel from Robotnik; a behemoth compared to our much-loved TurtleBots. Schlagw\u00f6rter: autonomous driving, navigation stack, robotics, ROS, Summit<\/p>\n","protected":false},"author":398,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"ngg_post_thumbnail":0,"footnotes":""},"categories":[1,943],"tags":[929,928,692,741,814],"features":[],"class_list":["post-12279","post","type-post","status-publish","format-standard","hentry","category-allgemein","category-cloud-robotics-articles","tag-autonomous-driving","tag-navigation-stack","tag-robotics","tag-ros","tag-summit"],"yoast_head":"<!-- This site is optimized with the Yoast SEO Premium plugin v27.2 (Yoast SEO v27.2) - https:\/\/yoast.com\/product\/yoast-seo-premium-wordpress\/ -->\n<title>Configuring the ROS Navigation Stack on a new robot - Service Engineering (ICCLab &amp; SPLab)<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/blog.zhaw.ch\/icclab\/configuring-the-ros-navigation-stack-on-a-new-robot\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Configuring the ROS Navigation Stack on a new robot\" \/>\n<meta 
property=\"og:description\" content=\"Our lab has acquired a new robot as part of its ROS based robotic fleet. We opted with the SUMMIT-XL Steel from Robotnik; a behemoth compared to our much-loved TurtleBots. Schlagw\u00f6rter: autonomous driving, navigation stack, robotics, ROS, Summit\" \/>\n<meta property=\"og:url\" content=\"https:\/\/blog.zhaw.ch\/icclab\/configuring-the-ros-navigation-stack-on-a-new-robot\/\" \/>\n<meta property=\"og:site_name\" content=\"Service Engineering (ICCLab &amp; SPLab)\" \/>\n<meta property=\"article:published_time\" content=\"2019-03-12T15:20:29+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2021-03-04T16:45:29+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/www.robotnik.eu\/web\/wp-content\/uploads\/\/2018\/07\/Robotnik_SUMMIT-XL-STEEL-01.jpg\" \/>\n<meta name=\"author\" content=\"Rodrigue de Schaetzen\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Rodrigue de Schaetzen\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"7 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/blog.zhaw.ch\/icclab\/configuring-the-ros-navigation-stack-on-a-new-robot\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/blog.zhaw.ch\/icclab\/configuring-the-ros-navigation-stack-on-a-new-robot\/\"},\"author\":{\"name\":\"Rodrigue de Schaetzen\",\"@id\":\"https:\/\/blog.zhaw.ch\/icclab\/#\/schema\/person\/8a1588c73711626280931cde40d629b1\"},\"headline\":\"Configuring the ROS Navigation Stack on a new robot\",\"datePublished\":\"2019-03-12T15:20:29+00:00\",\"dateModified\":\"2021-03-04T16:45:29+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/blog.zhaw.ch\/icclab\/configuring-the-ros-navigation-stack-on-a-new-robot\/\"},\"wordCount\":1320,\"commentCount\":2,\"image\":{\"@id\":\"https:\/\/blog.zhaw.ch\/icclab\/configuring-the-ros-navigation-stack-on-a-new-robot\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/www.robotnik.eu\/web\/wp-content\/uploads\/\/2018\/07\/Robotnik_SUMMIT-XL-STEEL-01.jpg\",\"keywords\":[\"autonomous driving\",\"navigation stack\",\"robotics\",\"ROS\",\"Summit\"],\"articleSection\":[\"*.*\",\"Cloud Robotics\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\/\/blog.zhaw.ch\/icclab\/configuring-the-ros-navigation-stack-on-a-new-robot\/#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/blog.zhaw.ch\/icclab\/configuring-the-ros-navigation-stack-on-a-new-robot\/\",\"url\":\"https:\/\/blog.zhaw.ch\/icclab\/configuring-the-ros-navigation-stack-on-a-new-robot\/\",\"name\":\"Configuring the ROS Navigation Stack on a new robot - Service Engineering (ICCLab &amp; 
SPLab)\",\"isPartOf\":{\"@id\":\"https:\/\/blog.zhaw.ch\/icclab\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/blog.zhaw.ch\/icclab\/configuring-the-ros-navigation-stack-on-a-new-robot\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/blog.zhaw.ch\/icclab\/configuring-the-ros-navigation-stack-on-a-new-robot\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/www.robotnik.eu\/web\/wp-content\/uploads\/\/2018\/07\/Robotnik_SUMMIT-XL-STEEL-01.jpg\",\"datePublished\":\"2019-03-12T15:20:29+00:00\",\"dateModified\":\"2021-03-04T16:45:29+00:00\",\"author\":{\"@id\":\"https:\/\/blog.zhaw.ch\/icclab\/#\/schema\/person\/8a1588c73711626280931cde40d629b1\"},\"breadcrumb\":{\"@id\":\"https:\/\/blog.zhaw.ch\/icclab\/configuring-the-ros-navigation-stack-on-a-new-robot\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/blog.zhaw.ch\/icclab\/configuring-the-ros-navigation-stack-on-a-new-robot\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/blog.zhaw.ch\/icclab\/configuring-the-ros-navigation-stack-on-a-new-robot\/#primaryimage\",\"url\":\"https:\/\/www.robotnik.eu\/web\/wp-content\/uploads\/\/2018\/07\/Robotnik_SUMMIT-XL-STEEL-01.jpg\",\"contentUrl\":\"https:\/\/www.robotnik.eu\/web\/wp-content\/uploads\/\/2018\/07\/Robotnik_SUMMIT-XL-STEEL-01.jpg\"},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/blog.zhaw.ch\/icclab\/configuring-the-ros-navigation-stack-on-a-new-robot\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Startseite\",\"item\":\"https:\/\/blog.zhaw.ch\/icclab\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Configuring the ROS Navigation Stack on a new robot\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/blog.zhaw.ch\/icclab\/#website\",\"url\":\"https:\/\/blog.zhaw.ch\/icclab\/\",\"name\":\"Service Engineering (ICCLab &amp; SPLab)\",\"description\":\"A Blog of the ZHAW Zurich University of Applied 
Sciences\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/blog.zhaw.ch\/icclab\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Person\",\"@id\":\"https:\/\/blog.zhaw.ch\/icclab\/#\/schema\/person\/8a1588c73711626280931cde40d629b1\",\"name\":\"Rodrigue de Schaetzen\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/secure.gravatar.com\/avatar\/d55d2188788e83852ded2f67fe5835921874debb87fffd9e559be3e46427683f?s=96&d=mm&r=g\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/d55d2188788e83852ded2f67fe5835921874debb87fffd9e559be3e46427683f?s=96&d=mm&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/d55d2188788e83852ded2f67fe5835921874debb87fffd9e559be3e46427683f?s=96&d=mm&r=g\",\"caption\":\"Rodrigue de Schaetzen\"},\"url\":\"https:\/\/blog.zhaw.ch\/icclab\/author\/desc\/\"}]}<\/script>\n<!-- \/ Yoast SEO Premium plugin. -->","yoast_head_json":{"title":"Configuring the ROS Navigation Stack on a new robot - Service Engineering (ICCLab &amp; SPLab)","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/blog.zhaw.ch\/icclab\/configuring-the-ros-navigation-stack-on-a-new-robot\/","og_locale":"en_US","og_type":"article","og_title":"Configuring the ROS Navigation Stack on a new robot","og_description":"Our lab has acquired a new robot as part of its ROS based robotic fleet. We opted with the SUMMIT-XL Steel from Robotnik; a behemoth compared to our much-loved TurtleBots. 
Schlagw\u00f6rter: autonomous driving, navigation stack, robotics, ROS, Summit","og_url":"https:\/\/blog.zhaw.ch\/icclab\/configuring-the-ros-navigation-stack-on-a-new-robot\/","og_site_name":"Service Engineering (ICCLab &amp; SPLab)","article_published_time":"2019-03-12T15:20:29+00:00","article_modified_time":"2021-03-04T16:45:29+00:00","og_image":[{"url":"https:\/\/www.robotnik.eu\/web\/wp-content\/uploads\/\/2018\/07\/Robotnik_SUMMIT-XL-STEEL-01.jpg","type":"","width":"","height":""}],"author":"Rodrigue de Schaetzen","twitter_card":"summary_large_image","twitter_misc":{"Written by":"Rodrigue de Schaetzen","Est. reading time":"7 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/blog.zhaw.ch\/icclab\/configuring-the-ros-navigation-stack-on-a-new-robot\/#article","isPartOf":{"@id":"https:\/\/blog.zhaw.ch\/icclab\/configuring-the-ros-navigation-stack-on-a-new-robot\/"},"author":{"name":"Rodrigue de Schaetzen","@id":"https:\/\/blog.zhaw.ch\/icclab\/#\/schema\/person\/8a1588c73711626280931cde40d629b1"},"headline":"Configuring the ROS Navigation Stack on a new robot","datePublished":"2019-03-12T15:20:29+00:00","dateModified":"2021-03-04T16:45:29+00:00","mainEntityOfPage":{"@id":"https:\/\/blog.zhaw.ch\/icclab\/configuring-the-ros-navigation-stack-on-a-new-robot\/"},"wordCount":1320,"commentCount":2,"image":{"@id":"https:\/\/blog.zhaw.ch\/icclab\/configuring-the-ros-navigation-stack-on-a-new-robot\/#primaryimage"},"thumbnailUrl":"https:\/\/www.robotnik.eu\/web\/wp-content\/uploads\/\/2018\/07\/Robotnik_SUMMIT-XL-STEEL-01.jpg","keywords":["autonomous driving","navigation stack","robotics","ROS","Summit"],"articleSection":["*.*","Cloud 
Robotics"],"inLanguage":"en-US","potentialAction":[{"@type":"CommentAction","name":"Comment","target":["https:\/\/blog.zhaw.ch\/icclab\/configuring-the-ros-navigation-stack-on-a-new-robot\/#respond"]}]},{"@type":"WebPage","@id":"https:\/\/blog.zhaw.ch\/icclab\/configuring-the-ros-navigation-stack-on-a-new-robot\/","url":"https:\/\/blog.zhaw.ch\/icclab\/configuring-the-ros-navigation-stack-on-a-new-robot\/","name":"Configuring the ROS Navigation Stack on a new robot - Service Engineering (ICCLab &amp; SPLab)","isPartOf":{"@id":"https:\/\/blog.zhaw.ch\/icclab\/#website"},"primaryImageOfPage":{"@id":"https:\/\/blog.zhaw.ch\/icclab\/configuring-the-ros-navigation-stack-on-a-new-robot\/#primaryimage"},"image":{"@id":"https:\/\/blog.zhaw.ch\/icclab\/configuring-the-ros-navigation-stack-on-a-new-robot\/#primaryimage"},"thumbnailUrl":"https:\/\/www.robotnik.eu\/web\/wp-content\/uploads\/\/2018\/07\/Robotnik_SUMMIT-XL-STEEL-01.jpg","datePublished":"2019-03-12T15:20:29+00:00","dateModified":"2021-03-04T16:45:29+00:00","author":{"@id":"https:\/\/blog.zhaw.ch\/icclab\/#\/schema\/person\/8a1588c73711626280931cde40d629b1"},"breadcrumb":{"@id":"https:\/\/blog.zhaw.ch\/icclab\/configuring-the-ros-navigation-stack-on-a-new-robot\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/blog.zhaw.ch\/icclab\/configuring-the-ros-navigation-stack-on-a-new-robot\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/blog.zhaw.ch\/icclab\/configuring-the-ros-navigation-stack-on-a-new-robot\/#primaryimage","url":"https:\/\/www.robotnik.eu\/web\/wp-content\/uploads\/\/2018\/07\/Robotnik_SUMMIT-XL-STEEL-01.jpg","contentUrl":"https:\/\/www.robotnik.eu\/web\/wp-content\/uploads\/\/2018\/07\/Robotnik_SUMMIT-XL-STEEL-01.jpg"},{"@type":"BreadcrumbList","@id":"https:\/\/blog.zhaw.ch\/icclab\/configuring-the-ros-navigation-stack-on-a-new-robot\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Startseite","item":"https:\/\
/blog.zhaw.ch\/icclab\/"},{"@type":"ListItem","position":2,"name":"Configuring the ROS Navigation Stack on a new robot"}]},{"@type":"WebSite","@id":"https:\/\/blog.zhaw.ch\/icclab\/#website","url":"https:\/\/blog.zhaw.ch\/icclab\/","name":"Service Engineering (ICCLab &amp; SPLab)","description":"A Blog of the ZHAW Zurich University of Applied Sciences","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/blog.zhaw.ch\/icclab\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Person","@id":"https:\/\/blog.zhaw.ch\/icclab\/#\/schema\/person\/8a1588c73711626280931cde40d629b1","name":"Rodrigue de Schaetzen","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/secure.gravatar.com\/avatar\/d55d2188788e83852ded2f67fe5835921874debb87fffd9e559be3e46427683f?s=96&d=mm&r=g","url":"https:\/\/secure.gravatar.com\/avatar\/d55d2188788e83852ded2f67fe5835921874debb87fffd9e559be3e46427683f?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/d55d2188788e83852ded2f67fe5835921874debb87fffd9e559be3e46427683f?s=96&d=mm&r=g","caption":"Rodrigue de 
Schaetzen"},"url":"https:\/\/blog.zhaw.ch\/icclab\/author\/desc\/"}]}},"_links":{"self":[{"href":"https:\/\/blog.zhaw.ch\/icclab\/wp-json\/wp\/v2\/posts\/12279","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/blog.zhaw.ch\/icclab\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/blog.zhaw.ch\/icclab\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/blog.zhaw.ch\/icclab\/wp-json\/wp\/v2\/users\/398"}],"replies":[{"embeddable":true,"href":"https:\/\/blog.zhaw.ch\/icclab\/wp-json\/wp\/v2\/comments?post=12279"}],"version-history":[{"count":44,"href":"https:\/\/blog.zhaw.ch\/icclab\/wp-json\/wp\/v2\/posts\/12279\/revisions"}],"predecessor-version":[{"id":12739,"href":"https:\/\/blog.zhaw.ch\/icclab\/wp-json\/wp\/v2\/posts\/12279\/revisions\/12739"}],"wp:attachment":[{"href":"https:\/\/blog.zhaw.ch\/icclab\/wp-json\/wp\/v2\/media?parent=12279"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/blog.zhaw.ch\/icclab\/wp-json\/wp\/v2\/categories?post=12279"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/blog.zhaw.ch\/icclab\/wp-json\/wp\/v2\/tags?post=12279"},{"taxonomy":"features","embeddable":true,"href":"https:\/\/blog.zhaw.ch\/icclab\/wp-json\/wp\/v2\/features?post=12279"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}