Channel: ROS Answers: Open Source Q&A Forum - RSS feed

Is "rosparam set use_sim_time true" necessary for gmapping?

Hello, I would like to know whether "rosparam set use_sim_time true" is necessary when I build an exact map using gmapping and rviz. When I run "rosparam set use_sim_time true" before "rosrun gmapping slam_gmapping scan:=base_scan", rviz shows the error "No map received". If I don't set use_sim_time, the error does not appear. Can I still build an exact map without "rosparam set use_sim_time true"?
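For context on why this parameter matters (a sketch of the usual replay workflow, not taken from the question): use_sim_time tells every node to take its clock from the /clock topic instead of wall time, which only makes sense when something is publishing /clock, e.g. `rosbag play --clock` or Gazebo. On a live robot with a real laser, leave it false; when replaying logged data, a session might look like:

```shell
# Terminal 1: master
roscore

# Terminal 2: must be set before starting gmapping
rosparam set use_sim_time true
rosrun gmapping slam_gmapping scan:=base_scan

# Terminal 3: --clock makes rosbag publish /clock;
# without it, nodes waiting on sim time never advance and rviz reports "No map received"
rosbag play --clock mybag.bag   # mybag.bag is a placeholder name
```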

Issue while creating a map using gmapping

Hello, I am confused about what the issue really is. Can anyone please help solve it? We urgently need a solution.

Our mobile base robot is now at the stage of creating a map using gmapping, but while creating the map I am facing one issue. When I move the robot using teleop_twist_keyboard, cmd_vel is generated properly and the robot moves smoothly. But when I visualize it in RViz, the robot's movement is shown correctly for a few seconds, and then it suddenly changes its route or position on its own, even when the robot is not moving. I have analyzed my scan data and my odometry data, and both are good.

The issue does not appear when I select "odom" as the fixed frame; I face it only when I am trying to create a map, i.e. when I select "map" as the fixed frame. Can anyone suggest why this is happening and where the problem might be?

The real turtlebot pose in gmapping

Hi, I'm a beginner in ROS and am learning the gmapping package now. I would like to visualize both the real robot pose and the estimated robot pose in RViz. When I run gmapping and teleop the robot to explore the world, the robot jumps from one position to another, which, I guess, comes from the update of the robot pose in SLAM. But can I also get the real robot pose from Gazebo? Thank you guys, Yue
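One way to get the ground truth (a sketch, assuming a Gazebo simulation): Gazebo publishes gazebo_msgs/ModelStates on /gazebo/model_states, with parallel name and pose lists. In a subscriber callback you could pick out your robot's pose by model name, along these lines:

```python
def model_pose(names, poses, target):
    """Return the pose matching `target` from the parallel
    name/pose lists of a gazebo_msgs/ModelStates message,
    or None if the model is not present."""
    try:
        return poses[names.index(target)]
    except ValueError:
        return None
```

Republishing that pose as a geometry_msgs/PoseStamped in the map frame lets RViz draw it next to the gmapping estimate; the model name ("turtlebot", "mobile_base", etc.) depends on your world file.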

Ways to autonomously localize a turtlebot

Hello guys, I am working on a turtlebot with ROS Indigo on Ubuntu 14.04. I have created a map of an environment using gmapping and the maps turned out great. I am using this map for autonomous navigation and that works great too. But whenever I start the robot, I have to manually localize it in RViz for AMCL. Is there any way I can automate this process? I have seen a few videos where people localize the robot using voice commands, as shown in [this](https://www.youtube.com/watch?v=zq8AcR-vL7M) video. Can anyone point me in the right direction on how I can automate this? Thanks guys.
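Two hedged options, assuming a standard amcl setup: publish an initial pose estimate programmatically on /initialpose (the same topic RViz's "2D Pose Estimate" tool uses), or ask amcl to scatter its particles over the whole map via its global_localization service and let them converge as the robot moves:

```shell
# Option 1: publish a known starting pose (coordinates here are made up)
rostopic pub -1 /initialpose geometry_msgs/PoseWithCovarianceStamped \
  '{header: {frame_id: "map"}, pose: {pose: {position: {x: 1.0, y: 2.0}, orientation: {w: 1.0}}}}'

# Option 2: global localization - amcl spreads particles across the free space
rosservice call /global_localization "{}"
```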

Is there a Turtlebot package software for iRobot Roomba 600 series base?

Hi! I'm trying to make my own Turtlebot using a Roomba 645 base, a Kinect sensor, and ROS Indigo. I would like to use this platform to build maps (SLAM) using the gmapping package, for example. I've tried some packages, changing the parameters for the Roomba base, but the communication fails. Has anyone used the Turtlebot package with a Roomba 600 series base successfully? Thanks! Greetings

My robot position changes automatically

Hi, I am facing an issue in ROS. I am at the stage of creating a map. My odometry data is correct, the TF (transform) data between all frames is correct, and the laser scan is giving the right scan data. Now I am using the gmapping package to create a map. Since gmapping provides the map -> odom transform, my robot is publishing this transform from map -> odom:

At time 1475829818.208
- Translation: [-1.184, -1.583, 0.000]
- Rotation: in Quaternion [0.000, 0.000, 0.554, 0.832]
  in RPY (radian) [0.000, -0.000, 1.175]
  in RPY (degree) [0.000, -0.000, 67.323]
At time 1475829818.958
- Translation: [-1.184, -1.583, 0.000]
- Rotation: in Quaternion [0.000, 0.000, 0.554, 0.832]
  in RPY (radian) [0.000, -0.000, 1.175]
  in RPY (degree) [0.000, -0.000, 67.323]

Because of this map -> odom transform, my robot position in rviz has discrete jumps, due to which I am unable to create a perfect map. Can anyone suggest what transformation should be present in map -> odom? Thanks in advance.
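For background on why map -> odom behaves like this (a 2D sketch, not taken from the question): gmapping does not publish the robot's pose directly; it publishes map -> odom as the correction between its SLAM estimate of the robot (map -> base_link) and raw odometry (odom -> base_link). The transform therefore jumps whenever a scan-matching update revises the estimate, while odom -> base_link stays smooth:

```python
import math

def compose(a, b):
    """Compose 2D transforms (x, y, theta): apply b in a's frame."""
    ax, ay, ath = a
    bx, by, bth = b
    return (ax + bx * math.cos(ath) - by * math.sin(ath),
            ay + bx * math.sin(ath) + by * math.cos(ath),
            ath + bth)

def invert(t):
    """Inverse of a 2D transform (x, y, theta)."""
    x, y, th = t
    return (-x * math.cos(th) - y * math.sin(th),
             x * math.sin(th) - y * math.cos(th),
            -th)

def map_to_odom(map_base, odom_base):
    """map->odom = (map->base_link) * (odom->base_link)^-1."""
    return compose(map_base, invert(odom_base))
```

So identical repeated values from tf_echo simply mean the SLAM estimate was not revised between those stamps; small discrete jumps are expected, while large ones usually point at bad odometry or scan matching rather than a wrong frame setup.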

Z-coordinate has to be 1 or -1

Hello, I'm starting to build a map using gmapping, but whenever I start slam_gmapping, the following warning is shown:

Laser has to be mounted planar! Z-coordinate has to be 1 or -1, but gave: 0.00000

This is one of the published transformations: When I change the X argument to 1 or -1 the warning is gone! I don't know why, but I'm sure there is something wrong, and I need X to be 0. Any idea? UPDATE: These are the frames in rviz: ![frames](/upfiles/14284815379617383.png) and I cannot change the orientation values in rviz.

Alternative approach to gmapping and hector-mapping?

Hi there, I'm new to this forum and hope I'm asking my question the right way. I have started to work with ROS. To get used to it, I tested some SLAM algorithms for producing an occupancy grid map. The first one was [gmapping](http://wiki.ros.org/gmapping), the second one was [hector_mapping](http://wiki.ros.org/hector_slam); both worked fine for me. Now I'm searching for other algorithms for comparison and wanted to ask if you can recommend some. They should be as easy to use as gmapping and hector_mapping. I'm using laser data and odometry, but it's not necessary that the algorithm uses both. Thanks for your help. Sascha

How to use depthimage_to_laserscan package to create map?

I am running `rosrun depthimage_to_laserscan depthimage_to_laserscan` and the scan topic is published, but when I run `rosbag record -o /scan /tf`, the scan topic is not recorded and the map is not created. I am new to ROS and have also read the wiki page, but I didn't understand how to do this; any suggestions? I want to make a 2D map to implement localization.
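A likely culprit (an assumption based on the command as written): `-o` takes an argument, so in `rosbag record -o /scan /tf` the string `/scan` is consumed as the bag-name prefix and only /tf is recorded. Naming the bag explicitly with `-O` and listing both topics should fix it:

```shell
# -O sets the exact bag file name; /scan and /tf are then both topic arguments
rosbag record -O scan_and_tf.bag /scan /tf

# afterwards, confirm both topics made it into the bag
rosbag info scan_and_tf.bag
```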

Gmapping and tf

Hi all, my robot in rviz has some problems when I run the gmapping launch file: the robot model and its wheels come apart. How can I fix it? Sorry for my bad English. Thanks in advance. Here is the image: ![image description](http://sv1.upsieutoc.com/2016/10/29/Capture145dc.png) My launch file

How do I use a map to reduce odometry errors?

So, I have a *.pgm and *.yaml map and I want to use it, somehow, to reduce the odometry errors. I need to get accurate locations to set points on a map, and someone recommended this approach to me. However, I couldn't find information about it anywhere. Thanks!
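The usual tool for this is amcl: it matches live laser scans against a saved map served by map_server and publishes a map -> odom correction, which is exactly "using the map to reduce odometry error". A minimal sketch (file names are placeholders):

```shell
# serve the saved map (the .yaml references the .pgm)
rosrun map_server map_server mymap.yaml

# amcl localizes against that map using the laser scan and odometry
rosrun amcl amcl scan:=scan
```

Poses read from tf (map -> base_link) or from amcl's /amcl_pose topic are then corrected by the map instead of drifting with raw odometry.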

Error when gmapping

**I met an error when using gmapping:** in rviz, the simulated position is 1/2 of reality when I use `gmapping` + a laser scanner to build the map. When I echo the topic `/odom`, it shows the right value. Can any expert tell me how to configure this? Please see the figure below. Thank you very much! ![image description](/upfiles/1477972783528097.png) I referred to the gmapping wiki page and met the tf requirements. I'm using ROS Indigo.

how to configure gmapping params to make the map denser?

I'm using the `gmapping/slam_gmapping` node to create a map with `ros_arduino_bridge` and `neato_laser_driver`, which publish `/odom`, `/scan`, and `/tf`. I referred to the parameter configuration of rbx1 [here](https://github.com/pirobot/rbx1/blob/indigo-devel/rbx1_nav/launch/gmapping.launch), and I found that the map looks as if drawn with light strokes. There are also many stray, ray-like lines which do not match reality and should be removed. Can you tell me which parameters influence these features? Thank you! ![image description](/upfiles/14781513091525672.jpg)
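A few slam_gmapping parameters plausibly related to those symptoms (the values below are illustrative, not tuned for this robot): delta controls the grid resolution, maxUrange discards long noisy returns that often cause stray rays, minimumScore rejects poor scan matches, and particles trades accuracy for CPU. In a launch file in the style of the rbx1 one linked above:

```xml
<launch>
  <node pkg="gmapping" type="slam_gmapping" name="slam_gmapping" output="screen">
    <param name="delta" value="0.025"/>        <!-- finer grid: denser-looking map -->
    <param name="maxUrange" value="5.0"/>      <!-- ignore returns beyond 5 m -->
    <param name="minimumScore" value="100"/>   <!-- reject weak scan matches -->
    <param name="particles" value="60"/>       <!-- more particles, more CPU -->
  </node>
</launch>
```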

How to create a 2D map with a p3dx from a rosbag?

I recorded the /tf and /scan topics in a rosbag file, but when I run gmapping and play back the recorded data from the bag file, nothing changes, and the error is "100% messages dropped". Do I have to record different topics, and do I need to do something with the tf topic or not? The /scan topic is created by depthimage_to_laserscan and is working, but the map is not building; what is the error? I implemented the same method in Gazebo and it works, but with the p3dx it does not. I have followed the book Programming Robots with ROS.
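A hedged checklist for the "100% messages dropped" symptom, covering the usual replay pitfalls (the bag name is a placeholder): verify the topics were actually recorded, replay with a simulated clock, and make sure gmapping's scan topic name matches the bag's:

```shell
rosbag info mybag.bag                 # check that /scan and /tf really appear, with messages
rosparam set use_sim_time true        # before starting gmapping
rosrun gmapping slam_gmapping scan:=scan
rosbag play --clock mybag.bag         # --clock publishes /clock for sim time
```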

Turtlebot Gmapping with RPLIDAR encounters errors

Hi, I know there have been several questions about this error, but they are all about a Kinect connected to the turtlebot. I have an RPLIDAR A1 connected to the turtlebot and want to run the gmapping SLAM algorithm. I don't have a Kinect or any other sensors connected. I have Ubuntu 14.04 and ROS Indigo installed on an Intel NUC mounted on the turtlebot. I started these three commands in different terminals, plus a teleop command, to start the lidar scan, gmapping, and RViz:

roslaunch turtlebot_le2i rplidar_minimal.launch
roslaunch turtlebot_navigation gmapping_demo.launch
roslaunch turtlebot_rviz_launchers view_navigation.launch

It works fine and I can see the map being built in RViz, but I get these errors in the gmapping terminal:

No matching device found.... waiting for devices. Reason: std::string openni2_wrapper::OpenNI2Driver::resolveDeviceURI(const string&) @ /tmp/binarydeb/ros-indigo-openni2-camera-0.2.7/src/openni2_driver.cpp @ 631 : Invalid device number 1, there are 0 devices connected.

gmapping and frontier exploration acting weird

I am trying to use the frontier_exploration package to solve a coverage path planning problem. The robot needs to search for a target in an empty space; namely, there are no landmarks except the target in the search area. I set the boundary for frontier exploration as the four corners of the search range. After some experiments I noticed two problems. First, the Husky robot rotates 360 degrees at the starting point. Second, the odometry error is huge. For example, the actual heading is 0 degrees, but in RViz the estimated heading is 90 degrees. At first I thought I must have set some odometry-related parameter wrong. Later on, I figured the error might be caused by gmapping trying to find landmarks in the environment. However, as I mentioned above, there are no landmarks in the environment. I would be glad if you could share your experience or give me some suggestions.

Problem in map building using gmapping

I have created a bag file using the attached code of bagconvert.py. The code is pasted:

    #!/usr/bin/env python
    import rospy
    import rosbag
    from sensor_msgs.msg import LaserScan
    from math import pi
    from tf2_msgs.msg import TFMessage
    from geometry_msgs.msg import TransformStamped
    import tf

    def make_tf_msg(x, y, theta, t):
        trans = TransformStamped()
        trans.header.stamp = t
        trans.header.frame_id = '/odom'
        trans.child_frame_id = '/laser'
        trans.transform.translation.x = x
        trans.transform.translation.y = y
        q = tf.transformations.quaternion_from_euler(0, 0, theta)
        trans.transform.rotation.x = q[0]
        trans.transform.rotation.y = q[1]
        trans.transform.rotation.z = q[2]
        trans.transform.rotation.w = q[3]
        msg = TFMessage()
        msg.transforms.append(trans)
        return msg

    with open('intel.clf') as dataset:
        with rosbag.Bag('intel.bag', 'w') as bag:
            for line in dataset.readlines():
                line = line.strip()
                tokens = line.split(' ')
                if len(tokens) <= 2:
                    continue
                if tokens[0] == 'FLASER':
                    msg = LaserScan()
                    num_scans = int(tokens[1])
                    if num_scans != 180 or len(tokens) < num_scans + 9:
                        rospy.logwarn("unsupported scan format")
                        continue
                    msg.header.frame_id = 'laser'
                    t = rospy.Time(float(tokens[(num_scans + 8)]))
                    msg.header.stamp = t
                    msg.angle_min = -90.0 / 180.0 * pi
                    msg.angle_max = 90.0 / 180.0 * pi
                    msg.angle_increment = pi / num_scans
                    msg.time_increment = 0.2 / 360.0
                    msg.scan_time = 0.2
                    msg.range_min = 0.001
                    msg.range_max = 50.0
                    msg.ranges = [float(r) for r in tokens[2:(num_scans + 2)]]
                    bag.write('laser', msg, t)
                    odom_x, odom_y, odom_theta = [float(r) for r in tokens[(num_scans + 2):(num_scans + 5)]]
                    tf_msg = make_tf_msg(odom_x, odom_y, odom_theta, t)
                    bag.write('tf', tf_msg, t)
                elif tokens[0] == 'ODOM':
                    odom_x, odom_y, odom_theta = [float(t) for t in tokens[1:4]]
                    t = rospy.Time(float(tokens[7]))
                    tf_msg = make_tf_msg(odom_x, odom_y, odom_theta, t)
                    bag.write('tf', tf_msg, t)

Now a file called intel.bag is created.
Now I am following the tutorial on building a map from logged data:

    roscore
    rosparam set use_sim_time true
    rosrun gmapping slam_gmapping scan:=laser
    rosbag play intel.bag

After this, no map is built in the RViz window, and in the gmapping terminal an error message is shown: "[ WARN] [1478603677.898065363]: Failed to compute laser pose, aborting initialization ("base_link" passed to lookupTransform argument target_frame does not exist.)". Please provide a solution.
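A likely fix (not confirmed by the poster): the bag above only publishes an odom -> laser transform, while slam_gmapping looks up base_link by default, hence the lookupTransform failure. Pointing gmapping's base frame at the laser frame should satisfy the lookup, and --clock is also needed so use_sim_time has a clock source:

```shell
rosparam set use_sim_time true
# base_frame defaults to base_link, which this bag never publishes;
# use the laser frame as the base frame instead
rosrun gmapping slam_gmapping scan:=laser _base_frame:=laser _odom_frame:=odom

# in another terminal: --clock publishes /clock for sim time
rosbag play --clock intel.bag
```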

Closed Loop Detection gmapping & Hector SLAM

All, I found this [answer](http://answers.ros.org/question/199781/how-does-gmapping-detect-loop-closure/). How do I check Neff to see whether a loop closure has been found/detected? Thanks, Mark

Localizing and mapping with only tf, GPS, and scan

I need to do localization and mapping in an environment. I don't want to use gmapping, since the GPS and IMU on the robot can already give a very accurate pose estimate. My question can be divided into two sub-questions: 1. How do I use IMU and GPS data to do localization? 2. How do I build/update a map with tf (the transformation between map and base_link) and scan?
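For sub-question 1, the robot_localization package is the usual answer: navsat_transform_node converts GPS fixes into the odometry frame and an EKF node fuses them with the IMU. For sub-question 2, the core update is just projecting each beam endpoint through the map -> base_link pose into a grid cell; a minimal sketch (frame and parameter names are illustrative):

```python
import math

def hit_cell(pose, beam_angle, r, resolution, origin=(0.0, 0.0)):
    """Project a range reading into an occupancy-grid cell.

    pose: (x, y, theta) of the laser in the map frame (from tf)
    beam_angle: beam angle relative to the laser's x-axis
    r: measured range; resolution: cell size in metres
    Returns the (col, row) index of the cell the beam endpoint lands in.
    """
    x, y, th = pose
    wx = x + r * math.cos(th + beam_angle)
    wy = y + r * math.sin(th + beam_angle)
    return (int((wx - origin[0]) / resolution),
            int((wy - origin[1]) / resolution))
```

Marking that cell as occupied (and the cells along the beam as free, e.g. with Bresenham's line algorithm) and publishing the grid as nav_msgs/OccupancyGrid gives a map that simply follows the GPS/IMU pose, with no scan matching at all.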

tf tree details

After running gmapping from intel.bag, while the map is being generated, running rosrun tf view_frames gives me frames.pdf. In this case the transform chain is map -> odom -> base_link, plus odom -> laser. The broadcasting node for map -> odom is /slam_gmapping; the broadcasting node for odom -> laser is the /play_1479118047019813325 node; and the broadcasting node for odom -> base_link is the /static_transform_publisher_1479118028670611586 node. Please explain the tree.

