Channel: ROS Answers: Open Source Q&A Forum - RSS feed

[Kinetic] mapping is inaccurate

Hi all, I created my own launch file for gmapping, but the map generated in RViz is really inaccurate compared to the Gazebo world. Is there any solution to this? My robot's speed was set to 10 in the keyboard teleop, and rotations were done with slow movements. My launch files are as follows:

File: my_gmapping_launch.launch

File: gmapping_params.yaml

```yaml
base_frame: base_footprint
odom_frame: odom
map_update_interval: 5.0
maxUrange: 6.0
maxRange: 8.0
minimumScore: 200
linearUpdate: 0.1
angularUpdate: 0.1
temporalUpdate: -1.0
resampleThreshold: 0.5
particles: 80
lskip: 10
xmin: -10
ymin: -10
xmax: 10
ymax: 10
delta: 0.05
llsamplerange: 0.01
llsamplestep: 0.01
lasamplerange: 0.005
lasamplestep: 0.005
```

RViz mapping result and RViz settings:

![image description](/upfiles/15353791842296108.png) ![image description](/upfiles/1535379221992880.png) ![image description](/upfiles/15353792612760691.png)

Gazebo world:

![image description](/upfiles/15353793486531562.png)

After creating my map I'll be running my amcl launch file to localize my robot in the map.
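Fast turns are a common cause of gmapping drift: the scan matcher gets fewer overlapping scans per pose update, and with `minimumScore: 200` a failed match falls back to raw odometry. As an illustration of the "rotate slowly" advice, here is a minimal velocity-limiting relay sketch; it assumes teleop publishes geometry_msgs/Twist on /cmd_vel_raw and the base listens on /cmd_vel (both topic names and the limits are assumptions, not part of the original setup):

```python
#!/usr/bin/env python
# Minimal velocity-limiting relay: subscribes to raw teleop commands and
# republishes them with linear/angular speed clamped, so gmapping's scan
# matcher sees slower, easier-to-track motion. Topic names are assumptions.
import rospy
from geometry_msgs.msg import Twist

MAX_LINEAR = 0.3   # m/s, assumed safe mapping speed
MAX_ANGULAR = 0.5  # rad/s, keep rotations slow for scan matching

def clamp(v, limit):
    return max(-limit, min(limit, v))

def callback(msg):
    out = Twist()
    out.linear.x = clamp(msg.linear.x, MAX_LINEAR)
    out.angular.z = clamp(msg.angular.z, MAX_ANGULAR)
    pub.publish(out)

if __name__ == '__main__':
    rospy.init_node('cmd_vel_limiter')
    pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
    rospy.Subscriber('/cmd_vel_raw', Twist, callback)
    rospy.spin()
```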

Can 2D SLAM and RTAB-Map be done simultaneously on the robot?

I have tried many of the methods suggested in [Kinect + Odometry + 2D laser](http://wiki.ros.org/rtabmap_ros/Tutorials/SetupOnYourRobot) and changed the odom and scan topics to suit my robot's specifications. But I could not get any data, neither in RViz nor in the RTAB-Map GUI. I am getting this error:

```
[ WARN] [1535549844.214860264]: /rtabmap/rtabmap: Did not receive data since 5 seconds! Make sure the input topics are published ("$ rostopic hz my_topic") and the timestamps in their header are set. If topics are coming from different computers, make sure the clocks of the computers are synchronized ("ntpdate"). If topics are not published at the same rate, you could increase "queue_size" parameter (current=10).
/rtabmap/rtabmap subscribed to (approx sync): /odom, /camera/rgb/image_rect_color, /camera/depth/image_rect, /camera/depth/camera_info, /scan
```

When I echo each topic, data is being published, and I am able to get a 2D map with this data. But I could not integrate a 3D map with this odom and scan data (odom = IMU and encoder data).

I am actually trying to integrate gmapping and RTAB-Map and found a launch file that suits my requirements; I don't know whether it works. Here is the launch file I am using

I do not understand what voxel_cloud is here, and I am a newbie to RTAB-Map. Could someone suggest the right way to do this 2D + 3D integration, or to integrate a 3D map with the IMU and encoder data of my robot? I am also fine if someone suggests a way to integrate hector_slam and RTAB-Map.
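The warning usually means the synchronizer never sees all five topics with compatible timestamps. A minimal sketch of the same approximate-sync mechanism on the same topics can help verify whether the messages ever line up; topic names are taken from the warning message above, so adjust them to your setup:

```python
#!/usr/bin/env python
# Reproduces rtabmap's approximate time synchronization on its five input
# topics, so you can check whether they ever arrive with close timestamps.
import rospy
import message_filters
from nav_msgs.msg import Odometry
from sensor_msgs.msg import Image, CameraInfo, LaserScan

def callback(odom, rgb, depth, info, scan):
    rospy.loginfo('synced: odom %.3f rgb %.3f scan %.3f',
                  odom.header.stamp.to_sec(),
                  rgb.header.stamp.to_sec(),
                  scan.header.stamp.to_sec())

if __name__ == '__main__':
    rospy.init_node('sync_check')
    subs = [message_filters.Subscriber('/odom', Odometry),
            message_filters.Subscriber('/camera/rgb/image_rect_color', Image),
            message_filters.Subscriber('/camera/depth/image_rect', Image),
            message_filters.Subscriber('/camera/depth/camera_info', CameraInfo),
            message_filters.Subscriber('/scan', LaserScan)]
    sync = message_filters.ApproximateTimeSynchronizer(subs, queue_size=10, slop=0.1)
    sync.registerCallback(callback)
    rospy.spin()
```

If this never prints, the stamps are too far apart (or missing), which matches what the rtabmap warning is complaining about.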

Navigation Stack and Odometry issue

Hello everyone, I am trying to get the navigation stack with gmapping running on my robot. The odometry and gmapping are very good as long as I move the robot with my keyboard. As soon as I give the first navigation goal, the robot starts moving, but the position in the map is not updated until move_base stops its navigation. By the way, the robot does not reach its goal, but the direction is right.

So it seems like the control commands from the navigation stack block the new odometry information. I think that could happen because the serial connection to my hardware control board (which controls the sensors and motors) can't handle the odometry and the navigation communication together. Importantly, odometry and keyboard control work fine together.

So my question is: is there a way to reduce the frequency of the cmd_vel messages published by the navigation stack? Or does anyone have another idea of what could be wrong? Thanks so much, Lorenz
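move_base publishes cmd_vel at its `controller_frequency` parameter (20 Hz by default), so lowering that is the most direct knob; `topic_tools throttle` can also rate-limit an existing topic. To test the serial-bandwidth theory without touching move_base, here is a minimal rate-limiting relay sketch; it assumes move_base is remapped to publish on /cmd_vel_raw while the hardware listens on /cmd_vel (both names are assumptions):

```python
#!/usr/bin/env python
# Rate-limiting relay: stores the latest command from move_base and
# republishes it to the hardware at a fixed, lower frequency.
# /cmd_vel_raw and /cmd_vel are assumed names; remap move_base accordingly.
import rospy
from geometry_msgs.msg import Twist

latest = Twist()

def callback(msg):
    global latest
    latest = msg

if __name__ == '__main__':
    rospy.init_node('cmd_vel_throttle')
    rospy.Subscriber('/cmd_vel_raw', Twist, callback)
    pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
    rate = rospy.Rate(5)  # republish at 5 Hz instead of move_base's rate
    while not rospy.is_shutdown():
        pub.publish(latest)
        rate.sleep()
```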

Map conversion to image (hector_compressed_map_transport with gmapping)

Hello, I am trying to use hector_compressed_map_transport with other SLAM algorithms too. I know that it uses /pose. I am publishing the pose with the help of http://wiki.ros.org/pose_publisher in the required message type, but the map never shows up (the window freezes). Any help, please? I am using Kinetic and trying to view the map with

```
rosrun image_view image_view image:=/map_image/full _image_transport:=compressed
```

(It works perfectly for hector_slam.)
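For comparison, here is a minimal sketch of publishing the robot pose yourself from the map → base_link transform, which is equivalent to what pose_publisher does; the frame names and the /pose topic are assumptions that need to match your TF tree and hector_compressed_map_transport's subscription:

```python
#!/usr/bin/env python
# Publishes the robot pose in the map frame as geometry_msgs/PoseStamped,
# looked up from TF. Frame names and the /pose topic are assumptions.
import rospy
import tf
from geometry_msgs.msg import PoseStamped

if __name__ == '__main__':
    rospy.init_node('tf_pose_publisher')
    listener = tf.TransformListener()
    pub = rospy.Publisher('/pose', PoseStamped, queue_size=1)
    rate = rospy.Rate(10)
    while not rospy.is_shutdown():
        try:
            trans, rot = listener.lookupTransform('map', 'base_link', rospy.Time(0))
        except (tf.LookupException, tf.ConnectivityException, tf.ExtrapolationException):
            rate.sleep()
            continue
        msg = PoseStamped()
        msg.header.stamp = rospy.Time.now()
        msg.header.frame_id = 'map'
        msg.pose.position.x, msg.pose.position.y, msg.pose.position.z = trans
        (msg.pose.orientation.x, msg.pose.orientation.y,
         msg.pose.orientation.z, msg.pose.orientation.w) = rot
        pub.publish(msg)
        rate.sleep()
```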

Gmapping is adding extra points to my map that aren't from the laser scan.

I'm running gmapping using the SICK LMS1xx driver for the lidar scans and wheel encoders for odometry. I'm having a problem with gmapping adding points to the map that are outside of my walls and are not reported by the laser scanner. Please see the image in the link below for details: https://drive.google.com/file/d/0B-GsWiHCK1neOUkwODZOX2RlRUE/view?usp=sharing

I've tried adjusting almost all of the parameters, and I even tried filtering the laser scan message, but the points don't exist in the laser scan, so there isn't really anything to filter out. Hector mapping doesn't have this problem. Any thoughts on what's going on?

Edit:

While hector mapping creates better-looking maps without this issue, gmapping creates more accurate maps due to the underlying particle filter. Therefore, I would like to use gmapping over hector. To highlight the issue even more, here are two maps, one made with hector and one with gmapping. Gmapping is adding quite a few additional points to the map that are not included in the laser scan data.

Hector: https://drive.google.com/file/d/0B-GsWiHCK1neUU1LMVI0c19vMTQ/view?usp=sharing

Gmapping: https://drive.google.com/file/d/0B-GsWiHCK1necFo3UmlWRDFVLWc/view?usp=sharing

Manually looking at the laser scan data, I can see that the min range is 0.0 m and the max range is 12.17 m; neither is close to my 50 m laser maximum, and both are well below the outlier points on the map, which appear to be about 25 m away from the robot. I also determined that the 0.0 m points are not the problem by filtering those out. Any further help is appreciated. Thanks.
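Since the stray points lie beyond every range actually reported, it may be worth ruling out how out-of-range beams are encoded before they reach gmapping, because readings past maxUrange and readings past maxRange are treated differently. A minimal pre-filter sketch that forces suspect readings to an explicitly out-of-range value; the topic names and thresholds are assumptions to tune for the LMS1xx:

```python
#!/usr/bin/env python
# Scan filter: marks readings outside [MIN_VALID, MAX_VALID] as range_max + 1
# so gmapping treats them as out of range instead of mapping them.
# Topic names and thresholds are assumptions.
import rospy
from sensor_msgs.msg import LaserScan

MIN_VALID = 0.05   # m, below this assumed to be an invalid reading
MAX_VALID = 12.5   # m, above this assumed to be an invalid reading

def callback(scan):
    invalid = scan.range_max + 1.0
    scan.ranges = [r if MIN_VALID <= r <= MAX_VALID else invalid
                   for r in scan.ranges]
    pub.publish(scan)

if __name__ == '__main__':
    rospy.init_node('scan_range_filter')
    pub = rospy.Publisher('/scan_filtered', LaserScan, queue_size=1)
    rospy.Subscriber('/scan', LaserScan, callback)
    rospy.spin()
```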

Bad results with Gmapping

Hey everyone, I've got some trouble with gmapping. I'm using a robot called ["eddie"](https://www.google.com/search?q=eddie+parallax&client=ubuntu&hs=zo2&channel=fs&source=lnms&tbm=isch&sa=X&ved=0ahUKEwjk28L66-fdAhXqtYsKHTjWASQQ_AUICigB&biw=1375&bih=802), which is quite similar to the TurtleBot 2e; a Kinect is the only sensor used in this case. I wrote some nodes to get the navigation stack running (such as an odometry publisher). For SLAM I thought gmapping would be a good choice. I used the parameters they use with the TurtleBot, but I get really bad results.

At first I thought my odometry could be the problem. But the odometry is really good, as you can see on this map (made with sigma and lsigma = 0, so gmapping used only odometry; the robot covered 21 m of distance to create the map, as well as 47 rad of rotation in total, about 7.5 revolutions): ![image description](/upfiles/15384869305053507.png)

As soon as I change sigma and lsigma back to useful values, the results get as bad as this (made with the TurtleBot parameters): you can see the large jump in the top right corner. Most of the time gmapping says something like: `Scan Matching Failed, using odometry. Likelihood=-3889.53 lp:3.94415 2.44182 -0.742944` If it can match the scans, it often jumps, as in this case: ![image description](/upfiles/1538488235929914.png)

I use a notebook with an i7 620M (1st generation). Things I've tried:

- recording a bag file and playing it at 10% speed (to be sure the notebook has enough power for gmapping)
- increasing particles and angularUpdate
- many, many parameter changes, always worse than before!

Has anyone an idea why that happens? Thank you all so much!

TurtleBot parameters:

What is the output data format of the PS1080 chip used in the Kinect for Xbox 360?

Please elaborate on the data format, and also on how to process it using ROS.
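On the ROS side, the PS1080's depth stream is usually exposed by openni_launch or freenect_launch as a sensor_msgs/Image (16-bit depth in millimeters or 32-bit float meters, depending on the driver). A minimal consumer sketch, assuming the common /camera/depth/image_raw topic name:

```python
#!/usr/bin/env python
# Subscribes to the Kinect depth image and converts it to a NumPy array
# with cv_bridge. The topic name assumes openni_launch/freenect_launch defaults.
import rospy
from cv_bridge import CvBridge
from sensor_msgs.msg import Image

bridge = CvBridge()

def callback(msg):
    # 'passthrough' keeps the native encoding (16UC1 depth in millimeters,
    # or 32FC1 in meters, depending on the driver)
    depth = bridge.imgmsg_to_cv2(msg, desired_encoding='passthrough')
    rospy.loginfo('depth image %dx%d, center pixel: %s',
                  msg.width, msg.height,
                  str(depth[msg.height // 2, msg.width // 2]))

if __name__ == '__main__':
    rospy.init_node('depth_listener')
    rospy.Subscriber('/camera/depth/image_raw', Image, callback)
    rospy.spin()
```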

TF_OLD_DATA error when using gmapping with V-REP

Hi, I'm new to ROS (Kinetic) and dealing with a simulated robot (Robotino) and SLAM. The problem is that when I try to launch gmapping with the command

```
rosrun gmapping slam_gmapping scan:=/base_scan
```

I get the following error:

```
Warning: TF_OLD_DATA ignoring data from the past for frame odom at time 1.53886e+09 according to authority unknown_publisher
Possible reasons are listed at http://wiki.ros.org/tf/Errors%20explained
 at line 277 in /tmp/binarydeb/ros-kinetic-tf2-0.5.16/src/buffer_core.cp
```

I've already tried sending an empty message to the topic /reset_time, but it didn't work. I'm using the SICK_TIM310_fast laser in V-REP, and I publish its data using the following piece of Lua script:

```lua
laserInfo = {}
intensities = {}
data = sim.getStringSignal("measuredDataAtThisTime")
if data then
    measuredData = sim.unpackFloatTable(data)
    laserInfo['header'] = {seq=0, stamp=simROS.getTime(), frame_id="laser_lixo"}
    laserInfo['angle_min'] = -1.54
    laserInfo['angle_max'] = 1.54
    laserInfo['angle_increment'] = 0.017444444444444
    laserInfo['time_increment'] = 0.0666666667
    laserInfo['scan_time'] = 1.0
    laserInfo['range_max'] = 4.0
    laserInfo['range_min'] = 0.05
    laserInfo['ranges'] = measuredData
    -- one intensity entry per range reading (Lua tables are 1-indexed)
    for k = 1, #measuredData do
        table.insert(intensities, 0)
    end
    laserInfo['intensities'] = intensities
    simROS.publish(laserPub, laserInfo)
end
```

Thanks for your help, and if I'm missing some information that could be useful please tell me.

Where is the origin of a map made by gmapping?

When I construct a 2D map using laser gmapping, how can I be sure where the origin is? For example, before starting gmapping, if I go in the Gazebo simulation to the desired center of my environment and then start gmapping:

1. Will that be considered the origin?
2. If yes, then if I go left by 1 meter, does it mean that I have now moved to x=-1, y=0, z=0?
3. So with my resolution of 0.5, do I need to translate to ROS by saying x = (-1 * 0.5), y = 0, z = 0?
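For reference, slam_gmapping starts the robot at the origin of the map frame, and poses stay in meters, so moving 1 m left is x = -1.0 regardless of resolution; resolution (meters per cell) only enters when converting a world coordinate to a grid index. A minimal sketch reading the origin and doing that conversion, assuming the standard /map topic:

```python
#!/usr/bin/env python
# Reads the OccupancyGrid published by gmapping and shows where its origin
# is, plus a world-to-grid-cell conversion. Poses are always in meters;
# resolution (meters/cell) only matters when indexing into the grid.
import rospy
from nav_msgs.msg import OccupancyGrid

def callback(grid):
    origin = grid.info.origin.position  # pose of cell (0, 0) in the map frame
    res = grid.info.resolution
    rospy.loginfo('map origin: (%.2f, %.2f), resolution %.2f m/cell',
                  origin.x, origin.y, res)
    # example: which grid cell contains the world point (-1.0, 0.0)?
    wx, wy = -1.0, 0.0
    col = int((wx - origin.x) / res)
    row = int((wy - origin.y) / res)
    rospy.loginfo('world point (%.1f, %.1f) falls in cell (row %d, col %d)',
                  wx, wy, row, col)

if __name__ == '__main__':
    rospy.init_node('map_origin_inspector')
    rospy.Subscriber('/map', OccupancyGrid, callback)
    rospy.spin()
```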

How to create the odom --> base_link transform in gmapping

Hi, we are trying to run gmapping using ROS Indigo on Ubuntu 14.04. The problem we are encountering is that the transform between "odom" and "base frame" is not created correctly, even though we think it should exist, since the transform is visible when echoing the topic. Should the way we are launching gmapping work, or is the problem that the robot is not publishing the transforms correctly? Also, how do we create the transform between odom and base_link?

Running

```
rosrun rqt_tf_tree rqt_tf_tree
```

yields the following result: [link to tf_tree](https://ibb.co/mymHVf)

As you can see, the link between odom and base_link is missing. However, when we run

```
rostopic echo cvc/tf
```

the transform between odom and base_link is visible:

```
transforms:
  -
    header:
      seq: 0
      stamp:
        secs: 1539863401
        nsecs: 756178000
      frame_id: odom
    child_frame_id: base_link
    transform:
      translation:
```

We are using the following launch file:

Thanks in advance for any help.
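One thing to note is that the TF tree is built only from transforms broadcast on the /tf topic; a transform visible on a namespaced topic like cvc/tf never reaches the tree. A minimal sketch of broadcasting odom → base_link from an odometry message, assuming the robot publishes nav_msgs/Odometry on /odom (the topic and frame names are assumptions):

```python
#!/usr/bin/env python
# Rebroadcasts the pose from a nav_msgs/Odometry message as the
# odom -> base_link transform on /tf, which is what gmapping consumes.
import rospy
import tf
from nav_msgs.msg import Odometry

br = None

def callback(odom):
    p = odom.pose.pose.position
    q = odom.pose.pose.orientation
    br.sendTransform((p.x, p.y, p.z),
                     (q.x, q.y, q.z, q.w),
                     odom.header.stamp,
                     'base_link',   # child frame
                     'odom')        # parent frame

if __name__ == '__main__':
    rospy.init_node('odom_tf_broadcaster')
    br = tf.TransformBroadcaster()
    rospy.Subscriber('/odom', Odometry, callback)
    rospy.spin()
```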

error while launching gmapping launch file

```
WARNING: ignoring defunct tag
started roslaunch server http://mudassir:40969/

SUMMARY
========

PARAMETERS
 * /rosdistro: melodic
 * /rosversion: 1.14.3
 * /slam_gmapping/angularUpdate: 0.436
 * /slam_gmapping/base_frame: chassis
 * /slam_gmapping/delta: 0.01
 * /slam_gmapping/linearUpdate: 0.5
 * /slam_gmapping/particles: 80
 * /slam_gmapping/resampleThreshold: 0.5
 * /slam_gmapping/temporalUpdate: -1.0
 * /slam_gmapping/xmax: 20
 * /slam_gmapping/xmin: -20
 * /slam_gmapping/ymax: 20
 * /slam_gmapping/ymin: -20
 * /use_sim_time: True

NODES
  /
    slam_gmapping (gmapping/slam_gmapping)

ROS_MASTER_URI=http://localhost:11311

ERROR: cannot launch node of type [gmapping/slam_gmapping]: can't locate node [slam_gmapping] in package [gmapping]
No processes to monitor
shutting down processing monitor...
... shutting down processing monitor complete
```

Not able to create map using Gmapping?

I am trying to create a map from a rosbag file, doing it exactly as described in the tutorial. I am running Ubuntu 16.04 with ROS Kinetic. After starting the master I run:

```
rosparam set use_sim_time true
```

Then I start:

```
rosrun gmapping slam_gmapping _base_frame:=base_footprint
```

And then I start the bag file with `--clock` as a parameter. Even so, when I try to echo /map_metadata I get the error:

```
No messages received and simulated time is active. Is /clock being published?
```

As a matter of fact, /clock is being published, and when I echo that topic it's running fine. So why am I not getting a map, and why, when I play the bag file, are no diagnostics published with the gmapping node running? The link to the bag file is this: https://drive.google.com/file/d/1ZcELDsCUAkWaBajSwhhS7nyrMzLoYSqe/view?usp=sharing

How can I display a map using bag files saved while doing SLAM?

Hello, I want to display the map in MATLAB using bag files that I saved while doing SLAM with the cartographer and gmapping demo files. I thought it would be easy to feed information like odometry and lidar scans from the bag file into the MATLAB code from the examples available on the Internet, but it was not working. This is the URL that I referred to: https://kr.mathworks.com/help/robotics/ref/robotics.lidarslam-class.html

The bag files that I have contain /scan and /tf topics (gmapping), and /horizontal_laser_2d, /imu, /vertical_laser_2d topics (cartographer). I would really appreciate any help.
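If MATLAB's bag import doesn't cooperate, one workaround is to export the scans to a plain format first with the rosbag Python API. A minimal sketch, where the file name demo.bag and the /scan topic are assumptions (use /horizontal_laser_2d for the cartographer bags):

```python
#!/usr/bin/env python
# Exports laser scans from a bag to CSV (timestamp followed by ranges),
# which MATLAB can read with readmatrix/csvread. File and topic names
# are assumptions.
import csv
import rosbag

with rosbag.Bag('demo.bag') as bag, open('scans.csv', 'w') as f:
    writer = csv.writer(f)
    for topic, msg, t in bag.read_messages(topics=['/scan']):
        writer.writerow([msg.header.stamp.to_sec()] + list(msg.ranges))
```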

[noobie] Need help with custom mobile robot navigation (multiple questions)

Good evening guys (questions at the bottom). I'm a robotics student working on my senior design and REALLY NEED SOME HELP. The idea of the project is that we will build our own environment, then build a 2D map using hector_slam or gmapping, then send this map to the MR (mobile robot) to navigate. The MR will always start from the same spot, and we will give it a goal and pose using rviz. (The ultimate goal is to have the goals and poses of specific spots on the map predefined. In other words, I want to send it a number (1, 2, or 3) of predefined goals and poses, and it drives from point A, the predefined start point, to goal 1, 2, or 3. I want to send the number from another computer over the network.) I'm VERY new to ROS and trying my best to learn and finish this project so I can graduate.

My team and I finished building the 4-wheel-drive robot using:

- mobile robot chassis: http://www.lynxmotion.com/p-603-aluminum-4wd1-rover-kit.aspx
- Raspberry Pi 3 Model B running Ubuntu MATE 16.04 and ROS Kinetic
- Adafruit motor driver: https://learn.adafruit.com/adafruit-dc-and-stepper-motor-hat-for-raspberry-pi/overview
- optical encoders on 2 of the 4 wheels
- lidar sensor: https://www.slamtec.com/en/Lidar/A3Spec

My current achievements:

- installed ROS, the dependencies, and the packages
- using `roslaunch rplidar_ros rplidar.launch` to get the data and the "/scan" topic
- using `roslaunch hector_slam tutorial.launch` to build a map
- using `rosrun map_server map_saver -f mymap` to save a map
- using `rosrun map_server map_server mymap.yaml` to publish the saved map
- using the motor-hat-node package and its teleop (W-A-S-D) to manually move the robot around to build the map
- built the environment with dimensions in Gazebo
- built the mobile robot with dimensions in Gazebo

Stuck on (see the odometry sketch below):

- publishing odometry (is that the encoder data?)
- navigation (which package to use? I have the navigation stack installed but no idea how to use it)
- localization (is it done with the navigation stack?)
- how to add encoders for the wheels in Gazebo
- how to add the rplidar that I have in real life on top of the robot in Gazebo
- how to establish communication between any of the navigation packages and my robot, in simulation or in real life

Packages that I looked at but have no idea how to use for navigation:

- navigation stack
- mapping
- hector_slam
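On the odometry question: yes, wheel odometry is typically computed by integrating the encoder ticks, then published as nav_msgs/Odometry together with the odom → base_link transform (both of which the navigation stack requires). A minimal differential-drive sketch; read_ticks() is a placeholder for your encoder driver, and the wheel geometry constants are assumptions to be measured on your robot:

```python
#!/usr/bin/env python
# Differential-drive odometry: integrates left/right encoder ticks into an
# (x, y, yaw) pose and publishes nav_msgs/Odometry plus the odom->base_link
# TF. read_ticks() is a placeholder; the constants are assumptions.
import math
import rospy
import tf
from nav_msgs.msg import Odometry

WHEEL_RADIUS = 0.06      # m (assumed, measure on the robot)
WHEEL_SEPARATION = 0.30  # m (assumed)
TICKS_PER_REV = 1000     # encoder resolution (assumed)

def read_ticks():
    # placeholder: return cumulative (left, right) encoder counts
    return 0, 0

if __name__ == '__main__':
    rospy.init_node('wheel_odometry')
    pub = rospy.Publisher('/odom', Odometry, queue_size=10)
    br = tf.TransformBroadcaster()
    x = y = yaw = 0.0
    last_l, last_r = read_ticks()
    rate = rospy.Rate(20)
    while not rospy.is_shutdown():
        l, r = read_ticks()
        # distance each wheel traveled since the last loop
        d_l = (l - last_l) * 2 * math.pi * WHEEL_RADIUS / TICKS_PER_REV
        d_r = (r - last_r) * 2 * math.pi * WHEEL_RADIUS / TICKS_PER_REV
        last_l, last_r = l, r
        d = (d_l + d_r) / 2.0
        yaw += (d_r - d_l) / WHEEL_SEPARATION
        x += d * math.cos(yaw)
        y += d * math.sin(yaw)
        now = rospy.Time.now()
        q = tf.transformations.quaternion_from_euler(0, 0, yaw)
        br.sendTransform((x, y, 0), q, now, 'base_link', 'odom')
        odom = Odometry()
        odom.header.stamp = now
        odom.header.frame_id = 'odom'
        odom.child_frame_id = 'base_link'
        odom.pose.pose.position.x = x
        odom.pose.pose.position.y = y
        (odom.pose.pose.orientation.x, odom.pose.pose.orientation.y,
         odom.pose.pose.orientation.z, odom.pose.pose.orientation.w) = q
        pub.publish(odom)
        rate.sleep()
```

With odometry and /scan in place, amcl (localization on the saved map) and move_base (planning) from the navigation stack are the usual next steps.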

Reducing the number of occupancy grid cells

Hello, I am running the gmapping package and creating a map. I have used a standard resolution of 0.05. After I have finished with the mapping, my next step is to create a path plan. For that, I have used global_planner from the navigation stack. As my robot is very simple, it cannot follow curved tracks; therefore I needed a simple path-planning algorithm and used the parameter use_grid_path. Still, the plan seems to contain too many steps for a simple linear goal. So instead of turning and going straight to the goal, the planner estimates too many trajectories. Is this because gmapping produces too many occupancy grid cells, or is it something in global_planner? What can I do?

How to use gmapping with an existing map?

Hi all, I am working on this project and I'm wondering whether there is a way to use gmapping with an existing map, like doing some extra exploration based on a partial map. I've tried publishing the map to the /map topic directly, but it didn't work. I checked the source code; it seems the package stores its map in another variable, so when the map is updated my published map is simply overwritten. Thank you in advance. Hui.

Is there an existing Dubins pathing package?

Hi everyone, I have little experience with ROS and ROS open-source packages, so bear with me. I am hoping someone could point me to a move_base-like package that interfaces with gmapping using Dubins paths (circular arcs). The solution I am trying to implement is almost exactly like the Husky demonstrations using move_base and gmapping; however, I am unable to pivot on the spot like the Husky can (I can, but we prefer not to for mechanical reasons). Would anyone happen to know of a Dubins-style move_base package, or should I work on developing my own? Thanks, Grant.
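Two starting points that may help: the teb_local_planner plugin for move_base supports a minimum turning radius for car-like robots, and for prototyping outside move_base there is a standalone Python `dubins` package (`pip install dubins`). A minimal sketch with made-up start/goal poses and turning radius:

```python
#!/usr/bin/env python
# Computes a Dubins path between two (x, y, heading) configurations and
# samples it at a fixed step. Poses and radius are example values; the
# `dubins` module comes from `pip install dubins`, not from ROS.
import math
import dubins

q0 = (0.0, 0.0, 0.0)             # start: x, y, heading (rad)
q1 = (4.0, 4.0, math.pi / 2.0)   # goal
turning_radius = 1.0             # minimum turning radius of the vehicle (m)
step_size = 0.1                  # sampling distance along the path (m)

path = dubins.shortest_path(q0, q1, turning_radius)
configurations, _ = path.sample_many(step_size)
print('path length: %.2f m, %d samples'
      % (path.path_length(), len(configurations)))
```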

How to use two gmapping packages simultaneously

Hi all, I was trying to use two gmapping packages, one the original and the other a modified one. Here is what I did:

1. cloned the gmapping package
2. renamed the folder (slam_gmapping_modi)
3. modified the CMake file
4. compiled

But it just destroyed everything; somehow my AMCL didn't work (which is another mystery, and I had to reinstall my Ubuntu this afternoon). I think the problem is the modification of the CMake file, so could anyone please tell me which part of this file I need to change so it will be treated as a new package?

reinstallation of gmapping corrupts costmaps

I'm experimenting with ROS Kinetic on an Odroid XU4 running Ubuntu 16.04, together with an iRobot Create 2 and an RPLIDAR A1. My current goal is to build a simple self-navigating robot with the [create_autonomy](https://github.com/autonomylab/create_autonomy) package. For that I'm using [move_base](http://wiki.ros.org/move_base), [rplidar](http://wiki.ros.org/rplidar), and [gmapping](http://wiki.ros.org/gmapping?distro=kinetic).

With gmapping installed via `sudo apt-get install ros-kinetic-gmapping` I managed to get reasonable outputs for the laser, map, and costmaps in rviz. But when I tried to set a 2D Nav Goal I got `transform require extrapolation into the future` and `global frame: odom plan frame size 2: map` errors. Following the tips [here](https://answers.ros.org/question/227694/tf-error-transform-require-extrapolation/) and [here](https://github.com/ros-industrial/industrial_training/issues/77), I purged the gmapping package and reinstalled it via `git clone` from [github](https://github.com/ros-perception/slam_gmapping) into my catkin workspace. Before building it with catkin, I changed the `ros::Time(0)` parts in `slam_gmapping.cpp`.

As a result I now have no information in the costmap topics, while `/map` and `/rplidar_scan` are working perfectly fine. I made no changes to my [launchfile](https://pastebin.com/UBsJm0Bn) or my other configs, and my tf tree hasn't changed either. I have absolutely no idea what's going wrong and am just overwhelmed by the complexity of ROS (I've been playing with it for only a few months). Does someone have an idea what the problem with my setup is? Or did I make a general mistake?

robot_localization transform functions for move_base?

Hi, I am curious about the coordinate frames my odometry data is in after using the robot_localization package's EKF as a localization estimator. I am using wheel odometry and an IMU in a continuous EKF, and navsat data (from a GPS) plus the previous two sensors in a discrete EKF:

```yaml
ekf_cont:
  frequency: 30
  sensor_timeout: 0.1
  two_d_mode: true
  transform_time_offset: 0.0
  transform_timeout: 0.0
  print_diagnostics: true
  debug: false
  map_frame: map
  odom_frame: odom
  base_link_frame: base_link
  world_frame: odom

ekf_disc:
  frequency: 30
  sensor_timeout: 0.1
  two_d_mode: true
  transform_time_offset: 0.0
  transform_timeout: 0.0
  print_diagnostics: true
  debug: false
  map_frame: map
  odom_frame: odom
  base_link_frame: base_link
  world_frame: map
```

My solution needs to use this data in conjunction with move_base and gmapping. As I understand it, I need to provide a transform for the discrete data. My IMU is mounted at the exact center, and my drive shaft is mounted at the exact center in one dimension (at an equal distance from the center in the other dimension, so the speed is measured from the center in both dimensions).

As I understand it, my continuous data has to be presented to move_base in the odometry frame, and my discrete data has to be presented as a tf/tfMessage? [Clearpath Husky demo](https://roscon.ros.org/2015/presentations/robot_localization.pdf). If I understand correctly, my continuous estimate is in the odom frame and my discrete estimate is in the map frame, but how do I produce this as a tf/tfMessage for my move_base node?

Will
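Note that with `world_frame: map`, the second EKF instance itself broadcasts the map → odom transform on /tf, so move_base normally consumes the result through TF rather than a hand-assembled tf/tfMessage. A minimal sketch that verifies the chain move_base's global costmap will query, using the frame names configured above:

```python
#!/usr/bin/env python
# Verifies that the map -> odom -> base_link TF chain produced by the two
# robot_localization EKFs is complete, by looking up the composed transform.
import rospy
import tf2_ros

if __name__ == '__main__':
    rospy.init_node('tf_chain_check')
    buf = tf2_ros.Buffer()
    listener = tf2_ros.TransformListener(buf)
    rate = rospy.Rate(1)
    while not rospy.is_shutdown():
        try:
            t = buf.lookup_transform('map', 'base_link', rospy.Time(0))
            rospy.loginfo('map->base_link: (%.2f, %.2f)',
                          t.transform.translation.x,
                          t.transform.translation.y)
        except (tf2_ros.LookupException, tf2_ros.ConnectivityException,
                tf2_ros.ExtrapolationException) as e:
            rospy.logwarn('TF chain incomplete: %s', e)
        rate.sleep()
```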