Annex 2: Arbotix simulator


All the code used for the examples is in this repository: https://github.com/pirobot/rbx1

To make sure you have all the required ROS packages, run the following command:

sudo apt-get install ros-indigo-turtlebot-bringup \
ros-indigo-turtlebot-create-desktop ros-indigo-openni-* \
ros-indigo-openni2-* ros-indigo-freenect-* ros-indigo-usb-cam \
ros-indigo-laser-* ros-indigo-hokuyo-node \
ros-indigo-audio-common gstreamer0.10-pocketsphinx \
ros-indigo-pocketsphinx ros-indigo-slam-gmapping \
ros-indigo-joystick-drivers python-rosinstall \
ros-indigo-orocos-kdl ros-indigo-python-orocos-kdl \
python-setuptools ros-indigo-dynamixel-motor-* \
libopencv-dev python-opencv ros-indigo-vision-opencv \
ros-indigo-depthimage-to-laserscan ros-indigo-arbotix-* \
ros-indigo-turtlebot-teleop ros-indigo-move-base \
ros-indigo-map-server ros-indigo-fake-localization \
ros-indigo-amcl git subversion mercurial

Alternatively you can use this automated script:

cd ~
wget https://raw.githubusercontent.com/pirobot/rbx1/indigo-devel/\
rbx1-prereq.sh
sh rbx1-prereq.sh

Given you have installed ROS Indigo, to clone and build the rbx1 repository for Indigo for the first time, follow these steps:

cd ~/catkin_ws/src
git clone https://github.com/pirobot/rbx1.git
cd rbx1
git checkout indigo-devel
cd ~/catkin_ws
catkin_make
source ~/catkin_ws/devel/setup.bash
rospack profile

To list the packages, move into the parent of the rbx1 meta-package and use the Linux ls command:

roscd rbx1
cd ..
ls -F

You may include this line in your .bashrc so the ROS configuration is loaded in every new terminal:

source ~/catkin_ws/devel/setup.bash

Installing the Simulator

sudo apt-get install ros-indigo-arbotix-*
rospack profile

Get the ROS visualization tool RVIZ:

sudo apt-get install ros-indigo-rviz

And also this package to be able to launch single nodes from the command line:

sudo apt-get install ros-indigo-rosbash

To check that everything is working, make sure roscore is running, then launch the simulated TurtleBot as follows:

roslaunch rbx1_bringup fake_turtlebot.launch

To use a model of Pi Robot instead, run the command:

roslaunch rbx1_bringup fake_pi_robot.launch

Next, bring up RViz so we can observe the simulated robot in action:

rosrun rviz rviz -d `rospack find rbx1_nav`/sim.rviz

Note how we use the Linux backtick operator together with the rospack find command to locate the rbx1_nav package without having to type the entire path. To change views, click on the Panels menu in RViz and select the Views menu item. To stop the rotation, type Ctrl-C in the same terminal window, then publish the empty Twist message:
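Backtick substitution is a general shell feature, not something specific to ROS: the shell runs the enclosed command first and splices its standard output into the outer command line. A ROS-free illustration (using echo as a stand-in, since rospack requires a ROS installation):

```shell
# The command inside the backticks runs first and its output replaces
# the backtick expression, just as `rospack find rbx1_nav` expands to
# the package path above. Here echo stands in for rospack:
config_dir=`echo /tmp`
ls "$config_dir" > /dev/null    # ls receives the expanded path /tmp
# $(...) is the modern, nestable equivalent of backticks:
config_dir=$(echo /tmp)
```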

rostopic pub -1 /cmd_vel geometry_msgs/Twist '{}'

With RViz we can visualize the robot's motion as we try out various Twist commands and motion control scripts:

roslaunch rbx1_bringup fake_turtlebot.launch
rosrun rviz rviz -d `rospack find rbx1_nav`/sim.rviz

First example: 0.1 m/s linear velocity and -0.5 rad/s angular velocity:

rostopic pub -r 10 /cmd_vel geometry_msgs/Twist '{linear: {x: 0.1, y:0, z: 0}, angular: {x: 0, y: 0, z: -0.5}}'

We use the -r parameter to publish the Twist message continually at 10 Hz. Some robots like the TurtleBot require the movement command to be continually published or the robot will stop: a nice safety feature. While this parameter is not necessary when running the ArbotiX simulator, it doesn't hurt either. To stop the robot from rotating, type Ctrl-C in the same terminal window, then publish the empty Twist message:
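The stop-when-silent safety behavior described above can be sketched as a simple watchdog, independent of ROS. The class and method names here are illustrative, not part of the ArbotiX or TurtleBot API:

```python
class CmdVelWatchdog:
    """Stop the robot if no velocity command arrives within a timeout.

    This mimics the TurtleBot safety feature described above: motion
    commands must be re-published continually or the base halts.
    """

    def __init__(self, timeout=0.5):
        self.timeout = timeout          # seconds allowed between commands
        self.last_cmd_time = None       # time of the most recent command
        self.current_cmd = (0.0, 0.0)   # (linear x, angular z)

    def on_cmd_vel(self, linear_x, angular_z, now):
        """Record an incoming velocity command (e.g. from /cmd_vel)."""
        self.current_cmd = (linear_x, angular_z)
        self.last_cmd_time = now

    def effective_cmd(self, now):
        """Return the command to apply: zero if the watchdog expired."""
        if self.last_cmd_time is None or now - self.last_cmd_time > self.timeout:
            return (0.0, 0.0)           # safety stop
        return self.current_cmd
```

Publishing at 10 Hz keeps the gap between commands at 0.1 s, well inside a typical timeout, so the robot keeps moving.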

rostopic pub -1 /cmd_vel geometry_msgs/Twist '{}'

Now let's try a second example. First clear the odometry arrows by clicking the Reset button in RViz. The following pair of commands (separated by a semi-colon) will first move the robot straight for about 3 seconds (the "-1" option means "publish once"), then continue indefinitely in a counter-clockwise circle:

rostopic pub -1 /cmd_vel geometry_msgs/Twist '{linear: {x: 0.2, y: 0, z: 0}, angular: {x: 0, y: 0, z: 0}}'; rostopic pub -r 10 /cmd_vel geometry_msgs/Twist '{linear: {x: 0.2, y: 0, z: 0}, angular: {x: 0, y: 0, z: 0.5}}'
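For a constant Twist, the base traces a circle of radius v/ω; for the second command above that is 0.2 / 0.5 = 0.4 m. A minimal integration of these unicycle kinematics (plain Python, not the ArbotiX simulator's own code) shows that one full revolution brings the robot back near its starting point:

```python
import math

def integrate_twist(v, w, duration, dt=0.01):
    """Integrate unicycle kinematics under a constant Twist
    (linear speed v, angular rate w) and return the final pose."""
    x = y = theta = 0.0
    for _ in range(int(duration / dt)):
        x += v * math.cos(theta) * dt
        y += v * math.sin(theta) * dt
        theta += w * dt
    return x, y, theta

# The circling command above: v = 0.2 m/s, w = 0.5 rad/s.
# One full revolution takes 2*pi / 0.5 ~= 12.57 s, on a circle of
# radius v / w = 0.4 m, ending close to the origin.
x, y, theta = integrate_twist(0.2, 0.5, 2 * math.pi / 0.5)
```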

NOTE: Odometry is the use of data from motion sensors to estimate change in position over time. It is used by some legged or wheeled robots to estimate their position relative to a starting location. The method is sensitive to errors because it integrates velocity measurements over time to obtain position estimates, so rapid and accurate data collection, instrument calibration, and careful processing are required in most cases for odometry to be used effectively. Odometry should be calibrated for every real robot so that the simulation precisely reflects the robot's motion.
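The sensitivity to calibration errors mentioned in the note is easy to demonstrate numerically: a small bias in the measured wheel velocity, integrated over time, grows into a large position error. A self-contained sketch (not ROS code):

```python
import math

def dead_reckon(v_measured, w, duration, dt=0.01):
    """Dead-reckoning pose estimate obtained by integrating the
    (possibly miscalibrated) velocity readings over time."""
    x = y = theta = 0.0
    for _ in range(int(duration / dt)):
        x += v_measured * math.cos(theta) * dt
        y += v_measured * math.sin(theta) * dt
        theta += w * dt
    return x, y

# Drive straight at a true 0.2 m/s for 60 s, but with the encoders
# reporting 2% too fast -- a plausible miscalibration.
true_x, _ = dead_reckon(0.2, 0.0, 60.0)
est_x, _ = dead_reckon(0.2 * 1.02, 0.0, 60.0)
error = est_x - true_x  # the 2% bias becomes a 0.24 m position offset
```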