From Experimental Robotics
Cooperative Online SLAM
In the first experiment, we implement online SLAM by modifying our client code for communicating with Player.
Simulation Result in Stage
Using HIMM (Histogramic In-Motion Mapping) for mapping, we can employ more than one robot to complete the mapping process in a shorter time. In this method, each robot adds some information to the global map at each scan. This requires one of the following assumptions:
- At least one of the two robots can see the other. In this case we need a camera to identify the other robot and estimate its position. Or,
- Both robots start from the same place, i.e. the origin of the map, so that they begin from known positions in the map.
We chose the second assumption in simulation. Both robots build the map of the same assumed world.
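The update rule that lets both robots write into one global map can be sketched as follows. This is a minimal illustration of the standard HIMM rule (hit cell +3, cells along the beam -1, values clamped to [0, 15]); the grid size, cell size, robot poses, and fixed 1.5 m range used here are illustrative assumptions, not values from the experiment.

```python
# Sketch of cooperative HIMM mapping on a shared global grid, assuming
# both robots start at the map origin (the second assumption above).
import math

GRID = 50          # cells per side (hypothetical)
CELL = 0.1         # metres per cell (hypothetical)
HIMM_MAX = 15      # HIMM clamps cell values to [0, 15]

grid = [[0] * GRID for _ in range(GRID)]

def to_cell(x, y):
    """World coordinates (origin at grid centre) -> grid indices."""
    return int(GRID / 2 + x / CELL), int(GRID / 2 + y / CELL)

def himm_update(grid, x, y, theta, bearing, rng):
    """One laser beam: +3 on the hit cell, -1 on cells along the ray."""
    a = theta + bearing
    hx, hy = x + rng * math.cos(a), y + rng * math.sin(a)
    # Walk the ray in half-cell steps, decrementing free cells.
    for s in range(int(rng / (CELL / 2))):
        fx = x + (s * CELL / 2) * math.cos(a)
        fy = y + (s * CELL / 2) * math.sin(a)
        i, j = to_cell(fx, fy)
        if (i, j) != to_cell(hx, hy) and 0 <= i < GRID and 0 <= j < GRID:
            grid[i][j] = max(0, grid[i][j] - 1)
    i, j = to_cell(hx, hy)
    if 0 <= i < GRID and 0 <= j < GRID:
        grid[i][j] = min(HIMM_MAX, grid[i][j] + 3)

# Two robots, both starting at the origin, write into the same grid;
# here one faces +x and the other faces -x (illustrative poses).
for pose in [(0.0, 0.0, 0.0), (0.0, 0.0, math.pi)]:
    for beam in range(181):            # -90..+90 degrees in 1-degree steps
        himm_update(grid, pose[0], pose[1], pose[2],
                    math.radians(beam - 90), 1.5)
```

Because the two robots share one coordinate frame, their updates simply accumulate in the same array; no map alignment step is needed under the common-origin assumption.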
After running the client code for SLAM, one robot generated the map:
And the map generated by two robots:
Real Environment Result
Implementing the algorithm in the simulation world (i.e. using Stage) causes no problem regarding localization error. It uses odometry data for localization, and the final result is quite satisfactory.
The problem occurred when we applied the algorithm to the real robot. The odometry error is mostly caused by the small caster wheel at the rear of the Pioneer robot. In the first trial, the robot kept moving while mapping.
The quality of the map is acceptable, but it has some problems due to localization error. In the next experiment, the robot follows the same path, but stops, scans its surroundings, and then moves to the next position. In this way the localization error is reduced, and the resulting map is better than in the previous case.
Final Online Mapping
Cooperative Offline SLAM
In order to compare our algorithm with other approaches such as PMAP and DPSLAM, we implemented the simulation offline. That is to say, we converted our mapping program into one that extracts its data from log files generated by the simulator or by real robots.
The log file can be generated simply by adding a writelog driver to the Player configuration file, for example:

    driver
    (
      name "writelog"
      requires [ "laser:0" "position:1" ]
      provides [ "log:0" ]
      alwayson 1
      autorecord 1
      filename "xxxx.log"   # log file name
    )
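Reading the resulting log back could look like the sketch below. The field layout used here is a simplified assumption for illustration (time, host, robot, interface, index, type, subtype, then payload); the exact writelog format should be checked against a real "xxxx.log" before relying on the indices.

```python
# Hedged sketch of extracting poses and scans from a writelog-style file.
# The column layout is assumed, not the verified Player log format.
def parse_log(lines):
    poses, scans = [], []
    for line in lines:
        if line.startswith("#") or not line.strip():
            continue                                   # skip headers/comments
        f = line.split()
        # Assumed layout: time host robot interface index type subtype data...
        interface = f[3]
        if interface == "position2d":
            poses.append(tuple(map(float, f[7:10])))   # px, py, pa
        elif interface == "laser":
            scans.append([float(r) for r in f[7:]])    # range readings
    return poses, scans

# Hypothetical sample lines in the assumed layout:
sample = [
    "## Player log (hypothetical header)",
    "0.10 localhost 6665 position2d 01 001 001 0.50 0.25 0.00",
    "0.10 localhost 6665 laser 00 001 001 1.20 1.25 1.31",
]
poses, scans = parse_log(sample)
```

Pairing each scan with the pose logged at the same timestamp then gives the (X, Y, φ, R) tuples needed by the offline mapping pass.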
After identifying the data format in the log file, we need to convert it to the quantities used in our Occupancy Grid algorithm, such as the bearing φ, the robot position (X, Y), and the laser range R from the robot to an obstacle.
In our case, the bearing runs from -90° (or -1.5708 rad) to +90° (or +1.5708 rad). The resolution is 1° (or 0.0174533 rad). The log generated by the Stage simulation consists of 180 sets of laser data along with the coordinates of the robot, which measures its own position by odometry.
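The conversion described above can be sketched as follows: beam index to bearing φ, then robot pose (X, Y, θ) plus range R to the world position of the obstacle. The 180-beam, 1-degree layout follows the text; the pose and range in the usage line are illustrative.

```python
# Beam index -> bearing phi, then pose + range -> obstacle coordinates.
import math

FOV_MIN = -math.pi / 2          # scan starts at -90 degrees
RES = math.radians(1.0)         # 1-degree angular resolution

def beam_bearing(i):
    """Bearing phi of beam i relative to the robot heading."""
    return FOV_MIN + i * RES

def beam_endpoint(X, Y, theta, i, R):
    """World coordinates of the obstacle hit by beam i at range R."""
    phi = beam_bearing(i)
    return X + R * math.cos(theta + phi), Y + R * math.sin(theta + phi)

# Beam 90 points straight ahead; with the robot at the origin facing +x,
# a 2 m reading maps to roughly (2, 0).
x, y = beam_endpoint(0.0, 0.0, 0.0, 90, 2.0)
```

The resulting (x, y) is what gets rasterized into the occupancy grid, exactly as in the online case; only the data source (log file instead of a live connection) changes.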
Offline Simulation Result