I followed this tutorial:
http://wiki.ros.org/slam_gmapping/Tutorials/MappingFromLoggedData
I used the provided bag file to test. When playing the bag back, I keep getting a warning in the console where the slam_gmapping node was started:
> [ WARN] [1542313904.356507102, 139.682228808]: MessageFilter [target=odom ]: Dropped 100.00% of messages so far. Please turn the [ros.gmapping.message_notifier] rosconsole logger to DEBUG for more information.
After the bag file finishes and I attempt to generate the map, the console hangs at "Waiting for the map".
I have no idea what to do.
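For what it's worth, a 100%-dropped MessageFilter warning during bag playback usually means nodes are comparing the bag's old timestamps against wall-clock time. A minimal launch sketch of the usual fix (the bag path is a placeholder, not the tutorial's actual file name): enable `use_sim_time` before the nodes start and play the bag with `--clock` so `/clock` drives ROS time.

```xml
<launch>
  <!-- Sketch only: replace the bag path with the tutorial's actual file. -->
  <param name="use_sim_time" value="true"/>
  <node pkg="gmapping" type="slam_gmapping" name="slam_gmapping"/>
  <node pkg="rosbag" type="play" name="bag_play"
        args="--clock /path/to/your_data.bag"/>
</launch>
```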
↧
Error when running slam_gmapping tutorial
↧
How do you add new frames to the tf tree of a Jackal robot?
I have a Jackal robot running Ubuntu 14.04 with ROS Indigo installed. I have attached a Hokuyo UTM-30LX laser to the Jackal, so I am trying to modify the jackal.urdf.xacro file to include the Hokuyo laser and use the laser scans for gmapping.

I got the system working in simulation: I was able to add the Hokuyo laser to the Jackal in Gazebo and use the scans for autonomous navigation. However, when I copy the jackal.urdf.xacro file onto the real robot, I am unable to see the hokuyo_base_link and front_laser (Hokuyo laser) links in the tf tree, and so when I visualize the robot in RViz I get an error on LaserScan: "Transform [sender=unknown_publisher] for frame [front_laser]: frame [front_laser] does not exist."

I can attach my jackal.urdf.xacro file if needed. Could someone please explain how I need to modify my files to get the Hokuyo laser working on the real robot? Thank you very much.
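For reference, a fixed laser frame is typically added to a xacro as a link plus a fixed joint. This is only a sketch: the parent link name (`front_mount`) and the mounting offsets are placeholders, not Jackal specifics. If the frames show up in simulation but not on the real robot, the usual first thing to check is that the `robot_description` parameter on the robot is actually loaded from your modified xacro rather than the stock one.

```xml
<!-- Sketch only: parent link name and offsets are placeholders. -->
<link name="hokuyo_base_link"/>
<joint name="hokuyo_base_mount_joint" type="fixed">
  <parent link="front_mount"/>
  <child link="hokuyo_base_link"/>
  <origin xyz="0 0 0.05" rpy="0 0 0"/>
</joint>
<link name="front_laser"/>
<joint name="front_laser_joint" type="fixed">
  <parent link="hokuyo_base_link"/>
  <child link="front_laser"/>
  <origin xyz="0 0 0.02" rpy="0 0 0"/>
</joint>
```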
↧
↧
Odometry for a holonomic robot with 3 wheels
Hi,
Considering a mobile base with three omni-directional wheels, is there a ROS package that converts motor encoder positions into an odometry message (pose and twist) to feed to the navigation stack?
Thanks!
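I'm not aware of a drop-in package covering every three-wheel omni layout, but the conversion is compact enough to write as a small node. A sketch of the kinematics only (no ROS API); the wheel angles, radii, and function names are all illustrative assumptions:

```python
import math
import numpy as np

# Assumed geometry: three omni wheels spaced 120 degrees apart, mounted at
# angles THETAS from the robot's x-axis, at distance R from the center,
# with wheel radius r. All values here are illustrative.
R = 0.15  # wheel mount radius [m]
r = 0.05  # wheel radius [m]
THETAS = [math.radians(a) for a in (90.0, 210.0, 330.0)]

# Each wheel's rolling surface speed v_i relates to the body twist (vx, vy, w):
#   v_i = -sin(theta_i)*vx + cos(theta_i)*vy + R*w
J = np.array([[-math.sin(t), math.cos(t), R] for t in THETAS])
J_inv = np.linalg.inv(J)

def twist_from_encoders(wheel_omegas):
    """Body-frame twist (vx, vy, wz) from the three wheel angular rates."""
    v = r * np.asarray(wheel_omegas)  # rolling surface speeds [m/s]
    return J_inv @ v

def integrate_pose(pose, twist, dt):
    """Euler-integrate the body twist into the odom-frame pose (x, y, yaw)."""
    x, y, yaw = pose
    vx, vy, w = twist
    # Rotate the body-frame velocity into the odom frame before integrating.
    x += (vx * math.cos(yaw) - vy * math.sin(yaw)) * dt
    y += (vx * math.sin(yaw) + vy * math.cos(yaw)) * dt
    yaw += w * dt
    return (x, y, yaw)
```

A node built around this would fill a `nav_msgs/Odometry` message with the integrated pose and the instantaneous twist, and broadcast the odom -> base_link transform.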
↧
No transform from [robot0/robot0/map] to [world]
Hey everyone
I was trying to add a fixed "world" frame to join my separate tf trees (robot/map -> robot/odom -> robot/base_link) using a static transform publisher when I came across this problem: when I try to visualize my map in rviz, I get the following error: "No transform from [robot0/robot0/map] to [world]". (I get this error whether I add the fixed frame or not.)
All I do is bring up a map, separate sdf models and run gmapping for each one of them using this node in my launch file:
How can I avoid the two prefixes without having to use map without the robot_name argument? (since I need separate map frames for my navigation package)
I have uploaded the picture of my rviz and TF tree [here](https://ufile.io/rmzc0).
Thank you!
↧
How do you integrate IMU data with odom data generated from Gazebo Skid Steer Plugin?
I have gmapping working with a skid-steer robot, using the Gazebo skid-steer plugin to actuate and control it. I am attempting to add an IMU and fuse its data with the odom data currently used for SLAM. Should I implement this similarly to the Husky and other skid-steer robots? Or is there another way, since I am using the Gazebo controller plugin? My main issue is that I am using the actual skid-steer plugin, whereas they use joint state controllers and adapt differential drive to achieve skid steering.
I attempted to write a localization.yaml and control.yaml similar to the Husky's and use the robot_localization package to fuse the data; however, I am confused about how to write control.yaml, mainly what to replace "diff_drive_controller" with under type.
Any guidance would be appreciated.
↧
↧
Problem creating a map with gmapping
I added a laser scanner to a custom robot model, but when I ran the gmapping node, the robot was not mapping the environment.
The warnings I got when I ran my launch file are:
Timed out waiting for transform from base_footprint to map to become available before running costmap, tf error: Could not find a connection between 'map' and 'base_footprint' because they are not part of the same tree.Tf has two or more unconnected trees.. canTransform returned after 98.215 timeout was 0.1.
The origin for the sensor at (71.50, -50.00) is out of map bounds. So, the costmap cannot raytrace for it.
Scan Matching Failed, using odometry. Likelihood=0 lp:71.5 -50 1.49224e-17 op:70.6211 -50.1456 0.000266226
The image of my TF tree is [https://imgur.com/a/shzkIOB](https://imgur.com/a/shzkIOB)
↧
Gmapping with V-REP
Hello,
I'm doing a course project whose goal is to build a SLAM simulation with ROS and V-REP.
I'm new to ROS and V-REP and I'd like some help from you guys.
Here is what I did:
1. I created a scene with a p3dx robot in V-REP.
2. I publish Twist messages on ROS to control the motion of the robot in V-REP.
3. V-REP is publishing three frames to ROS:
 - "odom": the fixed world frame;
 - "base_link": the robot frame;
 - "laser": the laser sensor frame.
 V-REP is also publishing LaserScan data.
4. I run gmapping node, subscribing to the LaserScan data.
5. I run RViz and change the fixed frame to /map, so that I can visualize the ongoing mapping while the robot moves in V-REP.
Here comes the question: I'm getting an error for the Global Status in RViz:
"Fixed Frame [map] does not exist"
How should I fix this?
I assume that I need to define a new frame corresponding to "map", but how am I supposed to do it? Just create a new frame and add a tf between "map" and "odom"?
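A note on that last idea: the map frame normally comes from gmapping itself, which publishes the map -> odom transform once it receives scans and can chain odom -> base_link -> laser in tf, so you don't create it by hand. If the frame never appears, gmapping is probably not receiving the scan topic or the tf chain. A launch sketch, where the scan topic name is an assumption to be replaced with whatever V-REP actually publishes:

```xml
<launch>
  <node pkg="gmapping" type="slam_gmapping" name="slam_gmapping">
    <!-- Placeholder topic: point it at the V-REP LaserScan topic. -->
    <remap from="scan" to="/vrep/laser_scan"/>
    <param name="odom_frame" value="odom"/>
    <param name="base_frame" value="base_link"/>
  </node>
</launch>
```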
↧
GMapping with dynamic objects and occupancy updates
Hi Everyone,
I am using GMapping as my mapping package with an MRS6124 (I have a node to convert its output to LaserScan). My problem is that GMapping doesn't update an obstacle's location on the map when the obstacle is moved. It updates the freespace behind the obstacle once it has moved, but it doesn't remove the obstacle, even though it now knows the obstacle isn't there; the obstacle's new location also isn't marked on the map. Is this due to a setting I haven't configured correctly, or is it outside GMapping's capabilities?
I was working from an example by Nick Charron of Clearpath Robotics and noticed that, although his obstacles are stationary, his map always updates the obstacle positions. Is this because I haven't run the global and local costmaps yet? I am currently only running gmapping, not the costmap_2d package.
Thank you in advance for any help understanding the limitations, capabilities, or configuration of this package.
↧
How to load a saved map into SLAM and resume mapping in unexplored areas
On the first day, using SLAM, I create a corridor map and save it via map_saver.
On the second day, I need to load that map into the SLAM node and continue mapping the unexplored rooms.
If I load the map (saved with map_saver) into SLAM, it is simply erased.
How can I save a map from SLAM so that it can be loaded back in, to continue exploring unknown areas?
↧
↧
Automatic generation of map of a world
I'm new to ROS and I'm working with a TurtleBot to understand things clearly.
I know we can use the gmapping module with a keyboard/joystick as the operator, but say I have a lot of maps and I don't want to traverse all of them manually.
One way of solving this would be to generate random points within the world and let the robot traverse from one point to another. If we use enough points, the robot should be able to create a map of the whole world. But I'm not able to implement this idea.
Can anyone help me implement the above solution?
I'm also interested in any other method that can solve this problem.
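The random-points idea can be sketched as follows, assuming an occupancy grid encoded like nav_msgs/OccupancyGrid (-1 unknown, 0 free, 100 occupied). The helper name and grid geometry are made up for illustration; in a full system each returned point would be sent to move_base as a navigation goal.

```python
import random

def sample_free_goal(grid, width, resolution, origin, rng=random):
    """Pick a random free cell and return its world (x, y) coordinates.

    grid: flat row-major list of occupancy values (-1/0/100).
    width: number of columns; resolution: meters per cell;
    origin: world (x, y) of the grid's lower-left corner.
    """
    free = [i for i, c in enumerate(grid) if c == 0]
    if not free:
        return None  # nothing known to be free yet
    i = rng.choice(free)
    gx, gy = i % width, i // width
    # Return the center of the chosen cell in world coordinates.
    x = origin[0] + (gx + 0.5) * resolution
    y = origin[1] + (gy + 0.5) * resolution
    return (x, y)
```

Sampling only known-free cells keeps goals reachable; a refinement in the same spirit is frontier-based exploration, which targets the boundary between free and unknown space instead of uniform random points.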
↧
ROSSharp RVIZ Odometry Issue
Hello everyone,
I am facing the following problem: the robot in RViz does not move according to its behavior in Unity.
I am using ROS# (ROSSharp) with Unity, and ROS Kinetic is running on an Ubuntu VM.
Unity publishes with the JointStatePublisher and the PoseStampedPublisher.
My tf frames file looks fine, as all frames are properly connected (https://docdro.id/cIGCq2o).
**Question:**
When I do not set a static transform between base_footprint and base_link in my launch file, there is no movement at all in RViz, although this transform should be handled by robot_state_publisher.
When I add a static transform between base_footprint and base_link, the robot does move.
BUT: it more or less jumps around the map.
My launch file looks like the following:
**Second Question**
How can I set the appropriate values for the static transformation?
Best Regards
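On the second question: as far as I know, the numbers for a static transform come from where the child frame physically sits relative to the parent; for base_footprint -> base_link this is usually just a small z offset with identity rotation (e.g. `0 0 0.1 0 0 0 1`, where the 0.1 is purely illustrative). On the first question: robot_state_publisher only broadcasts transforms for joints that actually appear in the URDF, so if base_footprint is not connected there, nothing will publish that transform. If you need the quaternion form of the `static_transform_publisher` arguments, a sketch of the standard ZYX yaw/pitch/roll conversion:

```python
import math

def quat_from_ypr(yaw, pitch, roll):
    """Quaternion (qx, qy, qz, qw) from ZYX Euler angles in radians."""
    cy, sy = math.cos(yaw / 2), math.sin(yaw / 2)
    cp, sp = math.cos(pitch / 2), math.sin(pitch / 2)
    cr, sr = math.cos(roll / 2), math.sin(roll / 2)
    qx = sr * cp * cy - cr * sp * sy
    qy = cr * sp * cy + sr * cp * sy
    qz = cr * cp * sy - sr * sp * cy
    qw = cr * cp * cy + sr * sp * sy
    return (qx, qy, qz, qw)
```

For example, `quat_from_ypr(0, 0, 0)` gives the identity rotation (0, 0, 0, 1), which is what you would pass when the two frames differ only by a translation.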
↧
What do the colors in gmapping mean?
I'm pretty new to ROS and I've been trying to do gmapping for the TurtleBot2 and TurtleBot3 on Ubuntu 16.04 with ROS Kinetic. I have been successful; however, I've been trying to figure out what the different colors mean. I've tried googling the answer, but to no avail.
Below are two links to see what I'm talking about: the first is an image of TurtleBot2 gmapping, and the second is of TurtleBot3 gmapping.
https://drive.google.com/file/d/12ZIyruHkfT3gHhix2DmvUKCg7WPjgSsD/view?usp=sharing
https://drive.google.com/file/d/1pPFaqS6OIkRPBaflxoxmWmYxp-0dWKYB/view?usp=sharing
Could anyone explain what the different colors surrounding the objects mean in each image?
What do the light blue, dark blue, red, and pink colors in the TurtleBot2 gmapping mean?
In TurtleBot3, there are light blue colors surrounding the objects, but why are there also light blue spots or circles in places that have no objects?
Your answer would be much appreciated.
↧
Error launching Gmapping node on ROS
Hi,
I am using ROS Kinetic on Ubuntu 16.04 and have installed the binaries for openslam and slam_gmapping using the commands
`sudo apt-get install ros-kinetic-slam-gmapping` and `sudo apt-get install ros-kinetic-openslam-gmapping`.
However, whenever I am trying to use the gmapping node with ROS, it always throws the following error
ERROR: cannot launch node of type [gmapping/slam_gmapping]: can't locate node [slam_gmapping] in package [gmapping]
I am unable to figure out how to solve this problem. My environment variables are properly defined in my .bashrc.
Thanks
↧
↧
gmapping cannot create map
Ubuntu 16.04 LTS & ROS Kinetic.
I got warnings like this:
[ WARN] [1544758709.859826979]: MessageFilter [target=odom ]: Dropped 100.00% of messages so far. Please turn the [ros.gmapping.message_notifier] rosconsole logger to DEBUG for more information.
[ WARN] [1544758709.859953898]: MessageFilter [target=odom ]: The majority of dropped messages were due to messages growing older than the TF cache time. The last message's timestamp was: 5505952.000000, and the last frame_id was: laser
and here is the tf tree:

↧
Help with Localization Algorithm
Hello, I've been working on a simulation of an autonomous robot, and I'm very new to ROS. So far I've used the gmapping algorithm, which successfully generates a 2D map; now I need a localization algorithm to locate the robot. I tried using AMCL, but the problem is that I have to provide AMCL with an already-generated map for it to localize. Can anyone suggest an algorithm that will generate a map and localize at the same time? I'm using a Hokuyo laser sensor on my robot.
Thank you!
↧
How to use depthimage_to_laserscan package to create map?
I am running `rosrun depthimage_to_laserscan depthimage_to_laserscan` and the scan topic is published, but when I run `rosbag record -o /scan /tf` the scan topic is not recorded and the map is not created. I am new to ROS and have read the wiki page, but I didn't understand how to do this; any suggestions? I want to make a 2D map to implement localization.
↧
slam_gmapping Laser is mounted upwards
Hello friends,
I am trying to run
rosrun gmapping slam_gmapping
and I am getting this result:
[ INFO] [1546503887.324696498]: Laser is mounted upwards.
Error in `/home/nvidia/catkin-ws/devel/lib/gmapping/slam_gmapping': malloc(): smallbin double linked list corrupted: 0x0000000000588bc0
Aborted (core dumped)
I have odom -> base_link and base_link -> base_laser tf
    transforms:
      -
        header:
          seq: 0
          stamp:
            secs: 1546503885
            nsecs: 8856919
          frame_id: "base_link"
        child_frame_id: "base_laser"
        transform:
          translation:
            x: 0.0
            y: 10.0
            z: 10.0
          rotation:
            x: 0.0
            y: 0.0
            z: 0.0
            w: 1.0
    ---
    transforms:
      -
        header:
          seq: 0
          stamp:
            secs: 1546503884
            nsecs: 900882660
          frame_id: "odom"
        child_frame_id: "base_link"
        transform:
          translation:
            x: 0.225268739852
            y: 0.0
            z: 0.0
          rotation:
            x: 0.0
            y: 0.0
            z: 0.0
            w: 1.0
I am publishing laser scans to /scan topic.
And this is my ros graph

What do you think I can do? Is the problem with compiling gmapping (because of the core dump), or with tf (because of the "Laser is mounted upwards" message)?
Thank you.
↧
↧
SLAM Gmapping map to odom remapping error
Hello,
I am trying to perform slam_gmapping for my project, using filtered odometry obtained from an EKF. However, the tf tree shows map->odom (from slam_gmapping), while the second side of the tree has map->/odometry/filtered (the EKF odometry).
I am attaching my configuration file and the tf diagram I got while running the system. I have remapped odom to /odometry/filtered in the slam_gmapping section, but it seems this is not taking effect. Currently, no map is being received.
Can anyone tell me how to fix it so I can map the environment using the filtered odometry from ekf for gmapping?
[C:\fakepath\frames.png](/upfiles/15475825844479636.png)
Code:
    [true,  true,  true,
     false, false, false,
     true,  true,  true,
     false, false, false,
     false, false, false]

    [true,  false, true,
     false, false, true,
     false, false, false,
     false, false, true,
     false, false, false]

    [false, false, false,
     true,  true,  true,
     false, false, false,
     true,  true,  true,
     false, false, false]

    [false, false, false,
     true,  true,  true,
     false, false, false,
     true,  true,  true,
     true,  true,  true]

    [0.005, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
     0, 0.005, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
     0, 0, 0.006, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
     0, 0, 0, 0.003, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
     0, 0, 0, 0, 0.003, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
     0, 0, 0, 0, 0, 0.006, 0, 0, 0, 0, 0, 0, 0, 0, 0,
     0, 0, 0, 0, 0, 0, 0.0025, 0, 0, 0, 0, 0, 0, 0, 0,
     0, 0, 0, 0, 0, 0, 0, 0.0025, 0, 0, 0, 0, 0, 0, 0,
     0, 0, 0, 0, 0, 0, 0, 0, 0.004, 0, 0, 0, 0, 0, 0,
     0, 0, 0, 0, 0, 0, 0, 0, 0, 0.001, 0, 0, 0, 0, 0,
     0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.001, 0, 0, 0, 0,
     0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.002, 0, 0, 0,
     0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.001, 0, 0,
     0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.001, 0,
     0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.0015]
Thanks,
Hdbot
↧
Poor quality maps produced from odometry based slam algorithms tested in real TurtleBot
We are currently testing several SLAM algorithms on a real TurtleBot (ROS Kinetic). Although everything seems to be working fine on the TurtleBot, we have come across a problem with the maps coming from odometry-based SLAM algorithms. **Although we swapped the TurtleBot base to figure out whether the base had hardware or odometry issues, the maps remained the same.** The lidar we use has a maximum range of 17 m.
## **Gmapping (using odometry)** ##
We tested gmapping with these parameters:
The map from **Gmapping** tested in the whole lab is the following.

## **KartoSlam(using odometry)** ##
The map produced by **KartoSlam** tested in a lab's room with the default parameters is the following.

## **CRSM Slam (no odometry used)** ##
The map produced by **CRSM Slam**, which does not use odometry, tested in a lab's room, is the following.

As you can see the CRSM map is far better than the previous two.
----------
The questions :
1. Where should we look for a fix, given that we have tried the algorithms on two different TurtleBots?
2. How could we improve the map quality, since what we get so far is really poor?
↧
map_saver for different map topics
Hey there.
I am trying to figure out how it is possible to use
rosrun map_server map_saver -f name_of_map
to save maps made simultaneously by different gmapping nodes, that is, how to save maps from a topic name other than /map.
For instance, /map_1 and /map_2 are being created, and the point is to save both maps in different files.
I was looking at the map_server code on GitHub and found this test launcher:
But I don't know how to modify it to accomplish that.
Could you give me a hand?
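One way, sketched below, is to run one map_saver per map and remap each one's `map` topic; since map_saver subscribes to `map`, a standard topic remap redirects it. The same idea works on the command line as `rosrun map_server map_saver -f map_1 map:=/map_1`. The node and file names here are illustrative.

```xml
<launch>
  <!-- Each saver writes its own map_N.pgm/map_N.yaml pair. -->
  <node pkg="map_server" type="map_saver" name="map_saver_1" args="-f map_1">
    <remap from="map" to="/map_1"/>
  </node>
  <node pkg="map_server" type="map_saver" name="map_saver_2" args="-f map_2">
    <remap from="map" to="/map_2"/>
  </node>
</launch>
```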
↧