slam gmapping

Hello, I'm trying to build a map from a bag file. I run:

    rosrun gmapping slam_gmapping
    [ WARN] [1298485439.195475816, 1297352013.562990207]: Message from [/play_1298485437842772645] has a non-fully-qualified frame_id [laser]. Resolved locally to [/laser]. This is will likely not work in multi-robot systems. This message will only print once.
    [ WARN] [1298485454.051371332, 1297352028.420031693]: MessageFilter [target=/odom ]: Dropped 100.00% of messages so far. Please turn the [ros.gmapping.message_notifier] rosconsole logger to DEBUG for more information.

In rxbag I've checked the bag file and there is /pose instead of /odom. How do I set up the transforms? Thank you!
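The message filter drops everything because gmapping never sees a usable transform chain from the odom frame to the laser frame. The workflow I'm trying is roughly the following (a sketch only; it assumes the bag actually carries the odom->base_link transform on /tf, and `mybag.bag` is just a placeholder name):

    # Use the bag's timestamps instead of wall-clock time
    rosparam set use_sim_time true

    # In one terminal: publish a fixed base_link -> laser transform
    # (adjust the offsets to the real mounting)
    rosrun tf static_transform_publisher 0 0 0 0 0 0 base_link laser 100

    # In another terminal: start gmapping, telling it which frame carries the odometry
    rosrun gmapping slam_gmapping scan:=scan _odom_frame:=odom

    # In a third terminal: replay the bag and publish /clock
    rosbag play --clock mybag.bag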

A more risk taking move_base

I am using the turtlebot Gazebo simulation. I launched the turtlebot_world, demo_amcl and demo_gmapping launch files. I am sending goal positions to the turtlebot using a custom node that moves it 1 metre forward; it is the most basic move_base client node found in the tutorials available on the web. The difficulty is that it successfully sends the turtlebot to the desired location less than 50% of the time, and even then it does so while swirling on its axis trying to map its environment. This happens whether I use the goal_position_node or the 2D Nav Goal tool in RViz. Is there a way to improve the current move_base behaviour by simply fine-tuning parameters, or does a much more risk-taking alternative to move_base already exist? Otherwise, would a custom node that publishes to cmd_vel using laser scan data be a better option?
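For context, the client node is essentially the standard tutorial action client; a minimal sketch of what such a node typically looks like (the goal frame and values here are illustrative, not necessarily my exact code):

    #include <ros/ros.h>
    #include <move_base_msgs/MoveBaseAction.h>
    #include <actionlib/client/simple_action_client.h>

    typedef actionlib::SimpleActionClient<move_base_msgs::MoveBaseAction> MoveBaseClient;

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "goal_position_node");

      // Connect to the move_base action server
      MoveBaseClient ac("move_base", true);
      ac.waitForServer();

      // Goal: 1 m forward in the robot's own frame
      move_base_msgs::MoveBaseGoal goal;
      goal.target_pose.header.frame_id = "base_link";
      goal.target_pose.header.stamp = ros::Time::now();
      goal.target_pose.pose.position.x = 1.0;
      goal.target_pose.pose.orientation.w = 1.0;

      ac.sendGoal(goal);
      ac.waitForResult();

      if (ac.getState() == actionlib::SimpleClientGoalState::SUCCEEDED)
        ROS_INFO("The base moved 1 metre forward");
      else
        ROS_WARN("The base failed to reach the goal");
      return 0;
    }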

Localization problem with gmapping and RPLidar

Hello ROS community, I'm working on a project whose goal is to create an [autonomous mobile robot](http://geduino.blogspot.it/?zx=26b2e9d70329b947) running a SLAM algorithm using gmapping. I'm using odometry from the motor encoders and laser scan data from an RPLidar. ROS Hydro is running on a UDOO board. I completed the navigation stack setup and, as long as the robot is not moving, everything seems to work fine (transformations published, map published, laser scan data published). When I move the robot using teleop, the map->odom transformation broadcast by gmapping seems to make no sense. Better than any words is a [video of RViz](https://www.youtube.com/watch?v=KEykgJtSwNY). Apart from the localization problem, I cannot understand why the odom and base_link frames do not overlap after start. Shouldn't they? Here is the transformation tree: ![The transformations tree](/upfiles/14270481164793066.png) Here are the nodes and topics: ![The nodes and topics](/upfiles/14270481945751951.png) This is the gmapping node configuration:

    throttle_scans: 1
    base_frame: base_link
    map_frame: map
    odom_frame: odom
    map_update_interval: 10
    maxUrange: 5.5
    maxRange: 6
    sigma: 0.05
    kernelSize: 1
    lstep: 0.05
    astep: 0.05
    iterations: 5
    lsigma: 0.075
    ogain: 3.0
    lskip: 0
    minimumScore: 0.0
    srr: 0.1
    srt: 0.2
    str: 0.1
    stt: 0.2
    linearUpdate: 1.0
    angularUpdate: 0.5
    temporalUpdate: -1.0
    resampleThreshold: 0.5
    particles: 30
    xmin: -10
    xmax: 10
    ymin: -10
    ymax: 10
    delta: 0.05
    llsamplerange: 0.01
    llsamplestep: 0.01
    lasamplerange: 0.005
    lasamplestep: 0.005
    transform_publish_period: 0.1
    occ_thresh: 0.25

I will really appreciate any suggestion to fix my problem. I did not post other configuration since the problem seems to be related to gmapping: if other information is needed I will be happy to provide it. Many thanks! Ale

## UPDATE ##

As suggested by paulbovbel I followed the guide to test odometry quality. The result is quite good for straight paths, a little less so for rotation. Watching the video I think the problem is not in the odometry: during the first seconds after the robot starts moving (until time 0:08) everything seems fine. During this time the position is updated based on odometry only (at least... I guess!) and the laser scan data (in red) match the map: this means the odometry data is quite good. After 0:08 the map->odom transformation (broadcast by gmapping) changes, in theory to compensate for odometry drift, but in the end it spoils the estimated position of the robot. Once the position estimate is spoiled the laser scan data no longer make sense either, and this causes the built map to be... nonsense! Does this make sense, or is there a mistake in my reasoning? Just to give more info: the video shows the robot about a minute after the system starts. When the video starts the robot is stopped in its initial position: for this reason I expect the base_link, odom and map frames to overlap (drift must be zero and the robot is at the centre of the map).

## UPDATE ##

I'm still working on fixing this problem. I performed some tests to check the quality of my odometry. In the attached RViz image you can see the laser data impressions drawn during the complete test (decay time = 120000): consider that the robot travelled from one side of the room and back to its start position. Is it a good result? I think so, since when the robot goes back to its start position the laser scan is really close to the original and the result is really close to the real room. Unfortunately I have no other material to compare it to...
![image description](/upfiles/14274713901833255.png) [Here](https://www.dropbox.com/s/emn0i4cngmh88o4/laser_data.bag?dl=0) is the bag file recorded during this test. I used it to run gmapping and I got this map: ![image description](/upfiles/14274722019784027.png) I'm quite surprised to see that the raw laser scan data in the odom frame represent the real world better than the gmapping result! I expected gmapping to improve it... Now I want to run some tests changing gmapping parameters. Does anyone have some tips for me? Thanks Ale

## UPDATE ##

Looking for benchmark results for the RPLidar I found this [video](https://www.youtube.com/watch?v=pCF7P7u8pDk) uploaded by RoboPeak. I was really surprised by the quality of the result and disappointed when comparing it with my own! Since they were using hector_slam instead of gmapping I tried the same, and this is the result: ![image description](/upfiles/14274799368315787.png) This is really good compared to the gmapping result. I want to dig deeper into this topic and understand why they perform so differently. Any tips? Thanks

## UPDATE ##

I'm running some tests in order to find out how to improve gmapping performance on my robot. The first tests were conducted setting different values of the minimum score parameter in gmapping. Here is the result: ![image description](/upfiles/142764331169699.png) The improvement with a value of 500 is fake: with this value gmapping relies on odometry only! So minimum score is not the solution. For the next test I want to use a scan filter in order to clean the laser scan data and check whether this can improve the gmapping result. For those who are interested, a benchmark of the different SLAM algorithms available in ROS can be found [here](http://home.isr.uc.pt/~davidbsp/publications/SPR_SSRR2013_SLAM.pdf).

## SOLVED ##

In the end I found how to fix the gmapping and RPLidar problem. The problem came from different sources: the LaserScan message expected by GMapping; the RPLidar reference frame; and the LaserScan message created by the RPLidar node. [Here](https://github.com/afrancescon/rplidar_ros) is the RPLidar node with the fixes and [here](http://geduino.blogspot.it/2015/04/gmapping-and-rplidar.html) you can find more information about them. In order to make RPLidar and gmapping work together, some attention is required when broadcasting the laser frame transformation. Please read the posted links for the complete explanation.
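For anyone hitting the same problem: the part that needed care in my setup was how the laser frame transform is broadcast relative to base_link. A placeholder sketch of that static transform (the translation and rotation must match the actual RPLidar mounting and scan direction, so the numbers below are illustrative only; see the linked post for the real values):

    <!-- args: x y z yaw pitch roll parent_frame child_frame period_ms -->
    <node pkg="tf" type="static_transform_publisher" name="base_link_to_laser"
          args="0.0 0.0 0.1 3.14159 0.0 0.0 base_link laser 100"/>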

Supported hardware for Hokuyo_node

Does [hokuyo_node](http://wiki.ros.org/hokuyo_node) support the [Hokuyo URG-04LX-UG01 Scanning Laser](http://www.robotshop.com/en/hokuyo-urg-04lx-ug01-scanning-laser-rangefinder.html)? Can I use this node as the driver for the laser scanner?
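If it is supported, I assume starting it would look something like this (the device path is a guess for a USB-connected unit):

    # Point the driver at the scanner's device file and run it
    rosrun hokuyo_node hokuyo_node _port:=/dev/ttyACM0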

Autonomous mapping an unknown environment with Turtlebot2?

I'm looking for a way to map out an unknown environment autonomously using a Turtlebot2. Right now, I use the gmapping_demo from the turtlebot_navigation package, rviz, and teleop to manually map out an environment, but I would like to use an exploration node that would create this map automatically. I looked into the nav2d package but it looks outdated and incomplete; none of the tutorial launch files work for me. Is there a simple explorer package that will work out of the box with any gmapping package? I use the indigo distribution. Thanks!

Can't find robot footprint?

When I run gmapping_demo.launch for my turtlebot2, I get the error "You must specify at least three points for the robot footprint, reverting to previous footprint" even though my footprint should be set. In my costmap_common_params.yaml, I have robot_radius set to 0.18. When I echo the /move_base/local_costmap/footprint topic I get:

    header:
      seq: 206
      stamp:
        secs: 1458227575
        nsecs: 605247230
      frame_id: odom
    polygon:
      points:
        - {x: 0.189999997616, y: 0.0, z: 0.0}
        - {x: 0.176298320293, y: 0.0788830146194, z: 0.0}
        - {x: 0.137279227376, y: 0.137279227376, z: 0.0}
        - {x: 0.0788830146194, y: 0.176298320293, z: 0.0}
        - {x: 0.00999999977648, y: 0.189999997616, z: 0.0}
        - {x: -0.0788830146194, y: 0.176298320293, z: 0.0}
        - {x: -0.137279227376, y: 0.137279227376, z: 0.0}
        - {x: -0.176298320293, y: 0.0788830146194, z: 0.0}
        - {x: -0.189999997616, y: 0.00999999977648, z: 0.0}
        - {x: -0.176298320293, y: -0.0788830146194, z: 0.0}
        - {x: -0.137279227376, y: -0.137279227376, z: 0.0}
        - {x: -0.0788830146194, y: -0.176298320293, z: 0.0}
        - {x: -0.00999999977648, y: -0.189999997616, z: 0.0}
        - {x: 0.0788830146194, y: -0.176298320293, z: 0.0}
        - {x: 0.137279227376, y: -0.137279227376, z: 0.0}
        - {x: 0.176298320293, y: -0.0788830146194, z: 0.0}

Anyone know how to fix this?
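For reference, the relevant part of my costmap_common_params.yaml is just the radius; my understanding is that the other option is to spell out an explicit polygon instead (a sketch, values illustrative):

    # Circular footprint, as I currently have it
    robot_radius: 0.18

    # Alternative: an explicit polygon of [x, y] points in metres, e.g.
    # footprint: [[0.18, 0.0], [0.0, 0.18], [-0.18, 0.0], [0.0, -0.18]]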

Gmapping in a complex environment.

I am using a turtlebot2 with an ASUS Xtion laser scanner and I am trying to develop autonomous mapping functionality, but I am running into a lot of issues with path planning. The robot often gets stuck and is unable to find a path to its goal. I am currently testing in an office environment that has a lot of complex obstacle features (chair legs, table legs, human legs, etc.) and I was wondering whether these features might be affecting the performance of the path planner. There are also a few ceiling-to-floor windows which may be confusing the laser scanner. Does anyone know the limitations of the DWA local planner in complex environments? In videos I see online, people often create very structured, maze-like environments which make it easy for the robot to navigate. Otherwise, I'll need to tweak some parameters...
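In case parameter tweaking is the way to go, these are the kinds of DWA local planner settings I'd expect to experiment with (a sketch only; the values are placeholders, not a recommended tuning):

    DWAPlannerROS:
      # Trade-off between hugging the global path and heading straight for the goal
      path_distance_bias: 32.0
      goal_distance_bias: 24.0
      # Weight on staying away from obstacles
      occdist_scale: 0.02
      # How far ahead (in seconds) trajectories are simulated
      sim_time: 1.7
      # Velocity limits
      max_vel_x: 0.5
      min_vel_x: 0.0
      max_rot_vel: 1.0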

Wrong dimensions in map obtained using GMapping?

Hi all, I have a map which I obtained using GMapping and now I am using it for navigation. The robot gets lost when the laser scan particles do not fall directly on the same wall on the map. In the image shown there is a difference of 1.32m between the actual wall and the wall on the map. The dimensions of the map do not always match reality. Sometimes there are discrepancies of around 1m between the dimensions shown on the map and reality. Does anyone have any ideas why this is happening? ![image description](/upfiles/1458251228190975.png) Thanks.

why does gmapping only return a map with 0 and 100 but no other values?

Hello everyone, I'm using slam gmapping to produce a map, but I've noticed that the output occupancy grid map (which is broadcast on topic /map) contains only -1, 0 and 100, for unknown, free and occupied cells respectively. I'm wondering whether there is a way to produce a map with other values, say 0.05, 0.37, 0.52 etc., which represent the probability of a cell being occupied. I've checked the parameters available for gmapping but could not find a clue about this. If it's not available in gmapping, then how should I do this? Any help will be much appreciated, thank you all!
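To make the data layout concrete, this is how I currently read the map: a minimal sketch (my own code, not part of gmapping) of interpreting nav_msgs/OccupancyGrid values as probabilities, assuming the publisher actually fills in intermediate values:

    #include <ros/ros.h>
    #include <nav_msgs/OccupancyGrid.h>

    // Treat each cell value as a probability in [0, 1]; -1 stays "unknown".
    void mapCallback(const nav_msgs::OccupancyGrid::ConstPtr& map)
    {
      for (size_t i = 0; i < map->data.size(); ++i)
      {
        int8_t v = map->data[i];      // -1 unknown, otherwise 0..100
        if (v < 0)
          continue;                   // unknown cell
        double p_occ = v / 100.0;     // occupancy probability of this cell
        // ... use p_occ ...
        (void)p_occ;
      }
    }

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "map_probability_reader");
      ros::NodeHandle n;
      ros::Subscriber sub = n.subscribe("map", 1, mapCallback);
      ros::spin();
      return 0;
    }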

Navigation path is red in rviz

I am using a turtlebot 2 and am working on autonomous navigation. Whenever I am sending navigation goals in rviz, the path shows up as flashing red instead of green even though the robot is able to execute the navigation. For navigation of a known path, this is not an issue but when I try to do autonomous gmapping of an unknown environment, the DWA path planning fails often and I think the issue might be related to the red path. My only thought is that it is related to the robot_footprint because I always get the error "You must specify three points for the robot_footprint." Anyone know what might be causing this?

Configure nav2d for autonomous mapping with Turtlebot2

This question is for people who have used the nav2d package with a Turtlebot2 for autonomous exploration and mapping. Right now I am trying to customize the nav2d_tutorials/tutorial3.launch file to use with my turtlebot 2. I started by basically copying the customized launch file from [this](http://answers.ros.org/question/219227/nav2d-cant-find-path/) post. This is a good starting point for me; however, I am pretty much experiencing the same problems that the user in that post was having: `rosservice call /StartMapping 3` does nothing for me, and the fix that he used isn't working for me either. I understand that my problem probably lies in having the correct messages published to the right topics, since the nav2d package isn't pre-configured for the turtlebot 2, but I don't really know where to start. For anyone who has accomplished this, what parameters do I need to change or which configuration files do I need to adjust? Step-by-step instructions would be amazing, but I would be happy if you could just tell me the nodes I should be focusing on (the operator and navigator nodes, for example). Also, can tutorial3.launch act as a standalone launch file for the autonomous mapping, or do I need to run it in parallel with gmapping, move_base, or turtlebot_bringup nodes? Any help is greatly appreciated!! Thanks!!!

Map not received when using nav2d with gazebo

I'm trying to use nav2d to autonomously map out an environment in Gazebo, but I am unable to get the nav2d gmapping node ("mapper") to publish a /map message. First I launch turtlebot_gazebo/turtlebot_world.launch to start the turtlebot in Gazebo, and then I launch a modified version of the nav2d tutorial3.launch file. In RViz, I get an error message under the map section that says "No map received" and "No transform from [] to [map]". The output of `rostopic echo /map` is "WARNING: no messages received and simulated time is active. Is /clock being published?" I checked /clock and it is publishing, so that is not the issue. Also, when I run turtlebot_gazebo/gmapping_demo instead of nav2d, a /map message is published correctly, so I don't think it's an issue with my Gazebo configuration. My guess is that the nav2d "mapper" node is connected to the wrong laser frame, but I have tried changing the base frame from base_laser_link to base_laser and to camera_depth_frame and neither worked. rqt_graph shows a /scan message being published to mapper and a /map message being published to RViz. Below are my launch file and ros.yaml file which define the parameters for nav2d. Here is my current launch file:

How to stop gazebo publishing tf

Hi, all! I'm testing the /robot_pose_ekf node with the Gazebo simulator, and I have already made /gazebo publish IMU and odometry data to /robot_pose_ekf. Also, /robot_pose_ekf is connected to tf. However, when I checked the tf tree I found that the transform from odom -> base_footprint was provided by /gazebo, but I want it to come from /robot_pose_ekf. I guess this is because both /gazebo and /robot_pose_ekf are publishing /tf, so I think stopping /gazebo from publishing /tf could be a possible solution. FYI, I'm using gmapping, which subscribes to /tf to provide SLAM functionality. Can anyone tell me how to stop Gazebo publishing tf? Or any other solutions to my problem? Thank you in advance!
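One workaround I'm considering, if the plugin itself can't be silenced, is to remap the gazebo node's /tf topic in the launch file so its transforms end up on a dead-end topic. This is only a sketch: the node name, type and world argument are placeholders from my setup, and whether the remapping actually reaches the publishing plugin is something I still need to verify:

    <!-- Inside the launch file that starts the Gazebo server -->
    <node name="gazebo" pkg="gazebo_ros" type="gzserver" args="$(arg world_file)">
      <!-- Send Gazebo's transforms to an unused topic instead of /tf -->
      <remap from="tf" to="gazebo_tf"/>
    </node>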

Is there something wrong with this laser setup?

I'm trying to use gmapping in Gazebo using [this](http://wiki.ros.org/turtlebot_gazebo/Tutorials/indigo/Make%20a%20map%20and%20navigate%20with%20it) tutorial but I'm getting an error for my local and global costmaps saying "No map received" and "No transform from [] to [map]". When I run amcl in Gazebo, however, I have no issues with the costmap generation and everything works flawlessly. I've seen a lot of other people having issues with "upside down" lasers when using gmapping in Gazebo and I am wondering if I am having a similar issue. When I run gmapping_demo.launch from the turtlebot_gazebo package, I get this output on the terminal:

    [ INFO ] : Laser is mounted upwards.
     Laser Pose= -0.0866309 0.481637 -0.00628993

Is having a negative z component for the laser position an issue?

Costmap error for gmapping in gazebo

I'm trying to do the gmapping in gazebo tutorial found [here](http://wiki.ros.org/turtlebot_gazebo/Tutorials/indigo/Make%20a%20map%20and%20navigate%20with%20it) but I am getting an error with both my local and global costmaps. The three launch files that I use are turtlebot_gazebo/world.launch, turtlebot_gazebo/gmapping.launch, and turtlebot_rviz_launchers/view_navigation.launch. When RVIZ shows up, I have an error in both the local and global costmaps which says "No map received" and "No transform from [] to [map]" Does anyone know what I should do or where I should start to look?

Getting only the first snapshot of the scans with gmapping

Hi, I've run into what seems to be a shared problem, as the following links testify:

- [http://answers.ros.org/question/218227/how-to-create-map-with-gmapping/](http://answers.ros.org/question/218227/how-to-create-map-with-gmapping/)
- [http://answers.ros.org/question/32399/slam_gmapping-registering-first-scan-problem/](http://answers.ros.org/question/32399/slam_gmapping-registering-first-scan-problem/)
- [http://answers.ros.org/question/200870/gmapping-calculate-only-first-laser-scan/](http://answers.ros.org/question/200870/gmapping-calculate-only-first-laser-scan/)

Some of them didn't receive a satisfactory answer, while the replies that worked for others are not working for me. I'm trying to make a map with an iRobot Create 2, using the `irobotcreate2ros` node ([https://github.com/MirkoFerrati/irobotcreate2ros](https://github.com/MirkoFerrati/irobotcreate2ros)), and using a Kinect as the sensor (from which I obtain the laser scan using the `depth2laser` node, [https://github.com/mauriliodc/depth2laser](https://github.com/mauriliodc/depth2laser)). I changed the tf published by the irobotcreate2ros node so that the frames are directly `odom` and `base_link`, instead of `iRobot_0/odom` and `iRobot_0/base_link` (as in the original version), so I don't need a static_transform_publisher (it shouldn't affect the general behaviour, I suppose). What happens is that when I start `gmapping`, it seems to take only the first snapshot from the sensor, showing a small segment of the map, and not updating it while the robot moves. I tried both watching the real-time mapping in rviz and replaying a rosbag, but the result is always the same. The following picture shows a snapshot of what I see in rviz: [Rviz snapshot](https://imageshack.com/i/porEiqH3p) And the following picture shows my tfs (which are supposed to be correct, as far as I know): [View_frames output](http://imageshack.com/a/img924/7184/K9hzrG.jpg) I already checked that my odometry, the base_link, the xtion and the laser tf are consistent with each other (they move accordingly in rviz). I also have a bag, but I cannot upload it yet. I'm looking for somewhere to host it; as soon as I find one, I'll edit this post.

Odometry message errors

Hi! With examples I found online I made an odometry publisher that generates odometry data (for gmapping) from encoder ticks. But when the robot is going straight forward, the odometry data gives the same translation in the x and y direction. Where are my mistakes? Here is the full publisher:

    #include "ros/ros.h"
    #include <tf/transform_broadcaster.h>
    #include <nav_msgs/Odometry.h>
    #include <geometry_msgs/Vector3.h>
    #include "mastering_ros_demo_pkg/encoder_msg.h"

    long _PreviousLeftEncoderCounts = 0;
    long _PreviousRightEncoderCounts = 0;
    ros::Time current_time_encoder, last_time_encoder;
    //double DistancePerCount = (3.14159265 * 0.1524) / 200;
    double DistancePerCount = 1;
    double x;
    double y;
    double th = 0;
    double W;
    double V;
    double vx;
    double vy;
    double vth;
    double deltaLeft;
    double deltaRight;
    double d = 50; // distance between the wheels
    int ticks_links;
    int ticks_rechts;

    //void WheelCallback(const geometry_msgs::Vector3::ConstPtr& ticks)
    void WheelCallback(const mastering_ros_demo_pkg::encoder_msg::ConstPtr& msg)
    {
      // time of this measurement
      current_time_encoder = ros::Time::now();

      // difference in ticks compared to the previous calculation
      deltaLeft = msg->ticks_links - _PreviousLeftEncoderCounts;
      deltaRight = msg->ticks_rechts - _PreviousRightEncoderCounts;

      // left (x) and right (y) wheel velocity
      vx = deltaLeft * DistancePerCount / (current_time_encoder - last_time_encoder).toSec();
      vy = deltaRight * DistancePerCount / (current_time_encoder - last_time_encoder).toSec();

      // if left velocity == right velocity -> velocity = left velocity and angular velocity = 0
      if (vx == vy)
      {
        V = vx;
        W = 0;
      }
      else
      {
        // Assuming the robot is rotating about point A
        // W = vel_left/r = vel_right/(r + d), see the image below for r and d
        double r = (vx * d) / (vy - vx); // Anti Clockwise is positive
        W = vx / r;                      // Rotational velocity of the robot
        V = W * (r + d / 2);             // Translation velocity of the robot
      }

      // vth = angular velocity
      vth = W;

      // remember the tick counts
      _PreviousLeftEncoderCounts = msg->ticks_links;
      _PreviousRightEncoderCounts = msg->ticks_rechts;

      // remember the time of the last measurement
      last_time_encoder = current_time_encoder;
    }

    int main(int argc, char **argv)
    {
      ros::init(argc, argv, "odometry_publisher");
      ros::NodeHandle n;
      ros::Subscriber sub = n.subscribe("tick_topic", 100, WheelCallback);
      //ros::Subscriber sub1 = n.subscribe("tick_topic", 100, ticks_topic_callback);
      ros::Publisher odom_pub = n.advertise<nav_msgs::Odometry>("odom", 50);
      tf::TransformBroadcaster odom_broadcaster;

      ros::Time current_time, last_time;
      current_time = ros::Time::now();
      last_time = ros::Time::now();

      ros::Rate r(20.0);
      while (n.ok())
      {
        current_time = ros::Time::now();

        // compute odometry in a typical way given the velocities of the robot
        double dt = (current_time - last_time).toSec();
        double delta_x = (vx * cos(th) - vy * sin(th)) * dt;
        double delta_y = (vx * sin(th) + vy * cos(th)) * dt;
        double delta_th = vth * dt;

        x += delta_x;
        y += delta_y;
        th += delta_th;

        // since all odometry is 6DOF we'll need a quaternion created from yaw
        geometry_msgs::Quaternion odom_quat = tf::createQuaternionMsgFromYaw(th);

        // first, we'll publish the transform over tf
        geometry_msgs::TransformStamped odom_trans;
        odom_trans.header.stamp = current_time;
        odom_trans.header.frame_id = "odom";
        odom_trans.child_frame_id = "base_link";
        odom_trans.transform.translation.x = x;
        odom_trans.transform.translation.y = y;
        odom_trans.transform.translation.z = 0.0;
        odom_trans.transform.rotation = odom_quat;

        // send the transform
        odom_broadcaster.sendTransform(odom_trans);

        // next, we'll publish the odometry message over ROS
        nav_msgs::Odometry odom;
        odom.header.stamp = current_time;
        odom.header.frame_id = "odom";

        // set the position
        odom.pose.pose.position.x = x;
        odom.pose.pose.position.y = y;
        odom.pose.pose.position.z = 0.0;
        odom.pose.pose.orientation = odom_quat;

        // set the velocity
        odom.child_frame_id = "base_link";
        odom.twist.twist.linear.x = vx;
        odom.twist.twist.linear.y = vy;
        odom.twist.twist.angular.z = vth;

        // publish the message
        odom_pub.publish(odom);

        last_time = current_time;
        ros::spinOnce();
        r.sleep();
      }
    }
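My current suspicion is that the mistake is in treating the left and right wheel velocities as the body-frame vx and vy. For a differential drive, the forward velocity and yaw rate would normally come from the average and the difference of the wheel velocities, roughly like this (a sketch only, untested; `v_left`, `v_right` and `dt_encoder` are names I made up here for the wheel velocities and the encoder time step computed in the callback):

    // Differential-drive kinematics sketch: v_left and v_right are the wheel
    // velocities computed from the encoder deltas (what the code above stores
    // in vx and vy), d is the wheel separation.
    double v_left  = deltaLeft  * DistancePerCount / dt_encoder;
    double v_right = deltaRight * DistancePerCount / dt_encoder;

    V   = (v_right + v_left) / 2.0;   // forward velocity of the base
    W   = (v_right - v_left) / d;     // yaw rate (anti-clockwise positive)

    vx  = V;                          // body-frame forward velocity
    vy  = 0.0;                        // a diff-drive robot cannot translate sideways
    vth = W;

    // The integration loop can then stay as it is:
    //   delta_x  = (vx * cos(th) - vy * sin(th)) * dt;
    //   delta_y  = (vx * sin(th) + vy * cos(th)) * dt;
    //   delta_th = vth * dt;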

gmapping and rplidar

Hi friends... I don't want to just play back a sample bag file: I couldn't get gmapping working with the RPLidar. How can I build a map as simply as possible? Can you give me a launch file for creating a map with the RPLidar? I have the rplidar node, the gmapping node and the laser_scan_matcher tools. I mean a launch file like this:

    #### publish an example base_link -> laser transform ###########
    #### start the laser scan_matcher ##############################
    #### start rviz ################################################
    #### start gmapping ############################################

Please help me...
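To make the request concrete, something along these lines is the kind of launch file I have in mind (a sketch pieced together from the package wikis, untested; the device path, frame names and parameter values are placeholders to be adapted):

    <launch>
      <!-- publish an example base_link -> laser transform -->
      <node pkg="tf" type="static_transform_publisher" name="base_link_to_laser"
            args="0 0 0 0 0 0 base_link laser 100"/>

      <!-- start the RPLidar driver -->
      <node pkg="rplidar_ros" type="rplidarNode" name="rplidarNode" output="screen">
        <param name="serial_port" value="/dev/ttyUSB0"/>
        <param name="frame_id"    value="laser"/>
      </node>

      <!-- start the laser scan matcher (odometry estimated from scans) -->
      <node pkg="laser_scan_matcher" type="laser_scan_matcher_node" name="laser_scan_matcher" output="screen">
        <param name="fixed_frame" value="odom"/>
        <param name="base_frame"  value="base_link"/>
      </node>

      <!-- start gmapping -->
      <node pkg="gmapping" type="slam_gmapping" name="slam_gmapping" output="screen">
        <param name="base_frame" value="base_link"/>
        <param name="odom_frame" value="odom"/>
        <param name="map_frame"  value="map"/>
      </node>

      <!-- start rviz -->
      <node pkg="rviz" type="rviz" name="rviz"/>
    </launch>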

nan in laser scan data

I'm trying to run gmapping on my turtlebot 2 but I always get the error "scan matching failed, using odometry" and the results are awful. I echoed the /scan topic and half of the laserscan data was "nan". My thought is that my scan increment does not increment fully through my scan range such that the scan only picks up half of the laserscan data within the scan range. Can anyone verify this and tell me how to change the scan range/scan increment parameters? Thanks!!

Scan matching failed, using odometry

Whenever I'm trying to navigate while using gmapping with my turtlebot2, I get the error "scan matching failed, using odometry" and as a result my local and global costmaps do not get published. In fact, the map and odom topics never get published to the move_base local and global costmaps and as a result the robot relies on odometry for navigation which is really inaccurate. Thinking this was a problem with my parameters, I tried gmapping on gazebo on a *separate* pc and got the same error. Still convinced that I messed with the parameters somehow, I completely reinstalled ros and tried the gazebo gmapping tutorial found [here](http://wiki.ros.org/turtlebot_gazebo/Tutorials/indigo/Make%20a%20map%20and%20navigate%20with%20it) but got the same error. Is anyone else experiencing this issue? On my rqt_graph when using gazebo gmapping, the move_base node doesn't show up at all.