I had been looking for an example that covers most of the basics you use when learning ROS, ideally including Gazebo simulation, so that everything can be studied properly even without hardware, yet with a matching real robot available for later research.
Here I introduce a ROS robot example I stumbled upon: evarobot.
Official tutorial: http://wiki.ros.org/Robots/evarobot
Simulation source code: https://github.com/inomuh/evarobot_simulator
Now, on to the main content.
ROS (Indigo) Robot Operating System Gazebo Simulation Example: evarobot
1. Basic installation and build
After downloading the packages into catkin_ws, build the workspace; if everything goes OK, the packages are ready to use.
~/catkin_ws$ catkin_make
You may run into some problems; the following commands can help:
~/catkin_ws$ source devel/setup.bash
~/catkin_ws$ rosdep install --from-paths src/ -y -i
~/catkin_ws$ rosstack profile && rospack profile
2. Using the example
With the steps above completed, this is a good chance to review the common ROS commands:
~$ rosstack find evarobot_simulator
/home/exbot/catkin_ws/src/evarobot_simulator/evarobot_simulator
~$ rospack find evarobot_gazebo
/home/exbot/catkin_ws/src/evarobot_simulator/evarobot_gazebo
~$ rosls evarobot_gazebo
cfg CMakeLists.txt launch map src
CHANGELOG.rst include LICENSE package.xml worlds
There are others such as roscd, which I will not list one by one. In a terminal, run:
~$ roslaunch evarobot_gazebo evarobot.launch
~$ rosnode list
/gazebo
/joint_state_publisher
/robot_state_publisher
/rosout
/twist_marker_server
~$ rostopic list
/camera/depth/camera_info
/camera/depth/image_raw
/camera/depth/points
/camera/parameter_descriptions
/camera/parameter_updates
/camera/rgb/camera_info
/camera/rgb/image_raw
/camera/rgb/image_raw/compressed
/camera/rgb/image_raw/compressed/parameter_descriptions
/camera/rgb/image_raw/compressed/parameter_updates
/camera/rgb/image_raw/compressedDepth
/camera/rgb/image_raw/compressedDepth/parameter_descriptions
/camera/rgb/image_raw/compressedDepth/parameter_updates
/camera/rgb/image_raw/theora
/camera/rgb/image_raw/theora/parameter_descriptions
/camera/rgb/image_raw/theora/parameter_updates
/clock
/cmd_vel
/gazebo/link_states
/gazebo/model_states
/gazebo/parameter_descriptions
/gazebo/parameter_updates
/gazebo/sensor/Bumper
/gazebo/set_link_state
/gazebo/set_model_state
/imu
/ir0
/ir0/parameter_descriptions
/ir0/parameter_updates
/ir1
/ir1/parameter_descriptions
/ir1/parameter_updates
/ir2
/ir2/parameter_descriptions
/ir2/parameter_updates
/ir3
/ir3/parameter_descriptions
/ir3/parameter_updates
/joint_states
/lidar
/odom
/rosout
/rosout_agg
/sonar0
/sonar0/parameter_descriptions
/sonar0/parameter_updates
/sonar1
/sonar1/parameter_descriptions
/sonar1/parameter_updates
/sonar2
/sonar2/parameter_descriptions
/sonar2/parameter_updates
/sonar3
/sonar3/parameter_descriptions
/sonar3/parameter_updates
/sonar4
/sonar4/parameter_descriptions
/sonar4/parameter_updates
/sonar5
/sonar5/parameter_descriptions
/sonar5/parameter_updates
/sonar6
/sonar6/parameter_descriptions
/sonar6/parameter_updates
/tf
/tf_static
/twist_marker_server/feedback
/twist_marker_server/update
/twist_marker_server/update_full
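Beyond the command-line tools, the sensor streams can also be consumed programmatically. Below is a minimal rospy sketch of my own (not part of the evarobot packages); it assumes /lidar carries sensor_msgs/LaserScan messages, which you can confirm with rostopic info /lidar.

#!/usr/bin/env python
# Minimal sketch (not from the evarobot packages): print the closest obstacle
# reported by the simulated lidar. Assumes /lidar publishes sensor_msgs/LaserScan.
import rospy
from sensor_msgs.msg import LaserScan

def scan_callback(msg):
    # Drop invalid readings (inf or out of range) before taking the minimum.
    valid = [r for r in msg.ranges if msg.range_min < r < msg.range_max]
    if valid:
        rospy.loginfo("closest obstacle: %.2f m", min(valid))

if __name__ == '__main__':
    rospy.init_node('lidar_listener')
    rospy.Subscriber('/lidar', LaserScan, scan_callback)
    rospy.spin()

The rqt tools below give a graphical view of the same data: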
~$ rosrun rqt_reconfigure rqt_reconfigure
~$ rviz
~$ rosrun rqt_graph rqt_graph
~$ rosrun rqt_console rqt_console
~$ rosrun rqt_plot rqt_plot /cmd_vel
Add a few objects to the Gazebo environment and view them through the robot's camera:
$ rosrun rqt_image_view rqt_image_view
The robot in Gazebo can be teleoperated with a keyboard, joystick, or phone while you watch the various sensor readings change.
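Instead of a teleop node, a short script can publish velocity commands directly. The sketch below is my own addition (not from the evarobot packages); it drives the robot in a slow circle by publishing geometry_msgs/Twist on /cmd_vel, which is handy while watching rqt_plot /cmd_vel or the sensor topics.

#!/usr/bin/env python
# Minimal sketch (not from the evarobot packages): drive the simulated robot
# in a slow circle by publishing geometry_msgs/Twist messages on /cmd_vel.
import rospy
from geometry_msgs.msg import Twist

if __name__ == '__main__':
    rospy.init_node('circle_driver')
    pub = rospy.Publisher('/cmd_vel', Twist, queue_size=10)
    rate = rospy.Rate(10)        # publish at 10 Hz
    cmd = Twist()
    cmd.linear.x = 0.2           # forward speed in m/s
    cmd.angular.z = 0.5          # turn rate in rad/s
    while not rospy.is_shutdown():
        pub.publish(cmd)
        rate.sleep()

The same effect can be obtained from the shell:
~$ rostopic pub -r 10 /cmd_vel geometry_msgs/Twist '{linear: {x: 0.2}, angular: {z: 0.5}}'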
You can also use the simulated environment to test image processing algorithms, build your own 3D models, and so on; I will not go into detail here.
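As a starting point for such image experiments, here is a minimal sketch of my own (not part of the evarobot packages). It converts frames from /camera/rgb/image_raw to OpenCV images with cv_bridge and runs a Canny edge detector; it assumes cv_bridge and OpenCV are installed.

#!/usr/bin/env python
# Minimal sketch (not from the evarobot packages): grab frames from the simulated
# RGB camera, convert them with cv_bridge, and run an OpenCV edge detector.
import rospy
import cv2
from cv_bridge import CvBridge
from sensor_msgs.msg import Image

bridge = CvBridge()

def image_callback(msg):
    frame = bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)    # replace with your own image algorithm
    cv2.imshow('edges', edges)
    cv2.waitKey(1)

if __name__ == '__main__':
    rospy.init_node('image_tester')
    rospy.Subscriber('/camera/rgb/image_raw', Image, image_callback)
    rospy.spin()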
To view the point cloud, use rviz and add a PointCloud2 display subscribed to /camera/depth/points.
3. SLAM example
~$ roslaunch evarobot_gazebo evarobot.launch world_path:=$(rospack find evarobot_gazebo)/worlds/UPlat.sdf
~$ roslaunch evarobot_slam gazebo_slam.launch
~$ rosrun teleop_twist_keyboard teleop_twist_keyboard.py
~$ roslaunch evarobot_viz view_evarobot.launch
~$ rosrun map_server map_saver -f $(rospack find evarobot_slam)/gazebo_map/map
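While driving the robot around with the keyboard, you can also check the map as it is built. Here is a minimal sketch of my own (not from the evarobot packages); it assumes the SLAM launch file runs a mapper that publishes nav_msgs/OccupancyGrid on /map, which is the gmapping default.

#!/usr/bin/env python
# Minimal sketch (not from the evarobot packages): report the size and progress
# of the occupancy grid published on /map while SLAM is running.
import rospy
from nav_msgs.msg import OccupancyGrid

def map_callback(msg):
    known = sum(1 for cell in msg.data if cell >= 0)   # -1 means unknown
    rospy.loginfo("map %dx%d at %.3f m/cell, %d cells explored",
                  msg.info.width, msg.info.height, msg.info.resolution, known)

if __name__ == '__main__':
    rospy.init_node('map_monitor')
    rospy.Subscriber('/map', OccupancyGrid, map_callback)
    rospy.spin()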
4. Navigation
~$ roslaunch evarobot_navigation gazebo_navigation.launch world_path:=$(rospack find evarobot_gazebo)/worlds/UPlat.sdf
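If the navigation stack comes up correctly, goals can be sent from rviz with the 2D Nav Goal tool, or programmatically. The sketch below is my own addition (not from the evarobot packages) and assumes the launch file starts move_base, which listens for geometry_msgs/PoseStamped on /move_base_simple/goal.

#!/usr/bin/env python
# Minimal sketch (not from the evarobot packages): send a single navigation goal
# to move_base via the /move_base_simple/goal topic.
import rospy
from geometry_msgs.msg import PoseStamped

if __name__ == '__main__':
    rospy.init_node('send_goal')
    pub = rospy.Publisher('/move_base_simple/goal', PoseStamped,
                          queue_size=1, latch=True)
    goal = PoseStamped()
    goal.header.frame_id = 'map'      # goal pose expressed in the map frame
    goal.header.stamp = rospy.Time.now()
    goal.pose.position.x = 1.0        # 1 m along the map x axis
    goal.pose.orientation.w = 1.0     # facing the map x direction
    pub.publish(goal)
    rospy.sleep(1.0)                  # keep the node alive so the latched message is delivered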
I ran into a small problem here that I have not resolved yet.
-End-