Hand-eye calibration with a moving camera (eye-in-hand) example: handeye_movingcam_calibration

*
* This example explains how to use the hand eye calibration for the case where
* the camera is attached to the robot tool and the calibration object
* is stationary with respect to the robot (eye-in-hand, moving camera).
* The robot positions the camera with respect to the calibration plate.
* In this case, the goal of the hand eye calibration is to determine two unknown poses:
* - the pose of the calibration object in the robot base
*   coordinate system (CalObjInBasePose).
* - the pose of the robot tool (tool center point) in the
*   camera coordinate system (ToolInCamPose).
* Theoretically, as input the method needs at least 3 poses of the
* calibration object in the camera coordinate system.
* However, it is recommended to use at least 10 poses.
* The corresponding pose of the robot tool in the robot base coordinate system
* (ToolInBasePose) is different for each calibration image,
* because it describes the pose of the robot moving the camera.
* The poses of the calibration object are obtained from images of the
* calibration object recorded with the camera attached to the robot.
* To obtain good calibration results, it is essential to position
* the camera with respect to the calibration object so that the object appears
* tilted in the image.
* After the hand eye calibration, the computed transformations are
* extracted and used to compute the pose of the calibration object in the
* camera coordinate system.
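* As a worked relation (added here for clarity, not part of the original
* example code): for every calibration image i the pose chain
*   CalObjInCamPose[i] = ToolInCamPose * inv(ToolInBasePose[i]) * CalObjInBasePose
* must hold, where ToolInBasePose[i] is reported by the robot and
* CalObjInCamPose[i] is measured from the image, while ToolInCamPose and
* CalObjInBasePose are the two constant unknowns solved for by
* calibrate_hand_eye.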
dev_update_off ()
* Directories with calibration images and data files
ImageNameStart := '3d_machine_vision/handeye/movingcam_calib3cm_'
DataNameStart := 'handeye/movingcam_'
NumImages := 14
read_image (Image, ImageNameStart + '00')
dev_close_window ()
get_image_size (Image, Width, Height)
dev_open_window (0, 0, Width, Height, 'black', WindowHandle)
dev_set_line_width (2)
dev_set_draw ('margin')
dev_display (Image)
set_display_font (WindowHandle, 14, 'mono', 'true', 'false')
* Load the calibration plate description file.
* Make sure that the file is in the current directory or
* in HALCONROOT/calib, or use an absolute path.
CalTabFile := 'caltab_30mm.descr'
* Read the initial values for the internal camera parameters
read_cam_par (DataNameStart + 'start_campar.dat', StartCamParam)
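* Note (not part of the original example): if no start_campar.dat file is
* available, initial parameters for an 'area_scan_division' camera can also
* be given directly as a tuple [Focus, Kappa, Sx, Sy, Cx, Cy, ImageWidth,
* ImageHeight]; the 16 mm focal length and 7.4 um cell size below are
* illustrative assumptions only, not the data of this example:
* StartCamParam := [0.016, 0, 0.0000074, 0.0000074, Width / 2.0, Height / 2.0, Width, Height]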
* Create the calibration data model for the hand eye calibration
* where the calibration object is observed with a camera
create_calib_data ('hand_eye_moving_cam', 1, 1, CalibDataID)
* Set the camera type used and the initial internal camera parameters
set_calib_data_cam_param (CalibDataID, 0, 'area_scan_division', StartCamParam)
* Set the calibration object description file
set_calib_data_calib_object (CalibDataID, 0, CalTabFile)
* Set the optimization method to be used
set_calib_data (CalibDataID, 'model', 'general', 'optimization_method', 'nonlinear')
disp_message (WindowHandle, 'The calibration data model was created', 'window', 12, 12, 'black', 'true')
disp_continue_message (WindowHandle, 'black', 'true')
stop ()
* Start the loop over the calibration images
for I := 0 to NumImages - 1 by 1
read_image (Image, ImageNameStart + I$'02d')
* Search for the calibration plate, extract the marks and the
* pose of it, and store the results in the calibration data
* The poses are stored in the calibration data model for use by
* the hand eye calibration and do not have to be set explicitly
find_calib_object (Image, CalibDataID, 0, 0, I, [], [])
get_calib_data_observ_contours (Caltab, CalibDataID, 'caltab', 0, 0, I)
get_calib_data_observ_points (CalibDataID, 0, 0, I, RCoord, CCoord, Index, PoseForCalibrationPlate)
* Visualize the extracted calibration marks and the estimated pose (coordinate system)
dev_set_color ('green')
dev_display (Image)
dev_display (Caltab)
dev_set_color ('yellow')
disp_cross (WindowHandle, RCoord, CCoord, 6, 0)
dev_set_colored (3)
disp_3d_coord_system (WindowHandle, StartCamParam, PoseForCalibrationPlate, 0.01)
* Read pose of tool in robot base coordinates (ToolInBasePose).
* This pose differs from image to image whenever the robot tool has been
* translated or rotated relative to the calibration plate.
read_pose (DataNameStart + 'robot_pose_' + I$'02d' + '.dat', ToolInBasePose)
* Set the pose of the tool in robot base coordinates in the calibration data model
set_calib_data (CalibDataID, 'tool', I, 'tool_in_base_pose', ToolInBasePose)
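* Note (not part of the original example): if the tool pose is only
* available as a robot controller readout instead of a pose file, it could
* be built with create_pose; the translation/rotation values and the 'abg'
* Euler order below are illustrative assumptions and must match the
* robot's convention:
* create_pose (0.3, -0.1, 0.4, 180, 0, 90, 'Rp+T', 'abg', 'point', ToolInBasePose)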
* Uncomment for inspection of visualization
* disp_message (WindowHandle, 'Extracting data from calibration image ' + (I + 1) + ' of ' + NumImages, 'window', 12, 12, 'black', 'true')
* disp_continue_message (WindowHandle, 'black', 'true')
* stop ()
endfor
disp_message (WindowHandle, 'All relevant data has been set in the calibration data model', 'window', 12, 12, 'black', 'true')
disp_continue_message (WindowHandle, 'black', 'true')
stop ()
* Perform the hand eye calibration and store the results to file
* The calibration of the cameras is done internally prior
* to the hand eye calibration
dev_display (Image)
disp_message (WindowHandle, 'Performing the hand-eye calibration', 'window', 12, 12, 'black', 'true')
calibrate_hand_eye (CalibDataID, Errors)
* Query the camera parameters and the poses
get_calib_data (CalibDataID, 'camera', 0, 'params', CamParam)
* Get the poses computed by the hand eye calibration:
* 'tool_in_cam_pose': the pose of the tool coordinate system in the camera coordinate system
get_calib_data (CalibDataID, 'camera', 0, 'tool_in_cam_pose', ToolInCamPose)
* 'obj_in_base_pose': the pose of the calibration object in the robot base coordinate system
get_calib_data (CalibDataID, 'calib_obj', 0, 'obj_in_base_pose', CalObjInBasePose)
dev_get_preferences ('suppress_handled_exceptions_dlg', PreferenceValue)
dev_set_preferences ('suppress_handled_exceptions_dlg', 'true')
try
* Handle situation where user does not have the permission
* to write in the current directory.
*
* Store the calibrated internal camera parameters to file
write_cam_par (CamParam, DataNameStart + 'final_campar.dat')
* Save the hand eye calibration results to file:
* the pose of the tool in the camera coordinate system ...
write_pose (ToolInCamPose, DataNameStart + 'final_pose_cam_tool.dat')
* ... and the pose of the calibration plate in the robot base coordinate system
write_pose (CalObjInBasePose, DataNameStart + 'final_pose_base_calplate.dat')
catch (Exception)
* do nothing
endtry
dev_set_preferences ('suppress_handled_exceptions_dlg', PreferenceValue)
dev_display (Image)
* Display calibration errors
Message := 'Quality of the results: root mean square   maximum'
Message[1] := 'Translation part in meter: ' + Errors[0]$'6.4f' + '   ' + Errors[2]$'6.4f'
Message[2] := 'Rotation part in degree:   ' + Errors[1]$'6.4f' + '   ' + Errors[3]$'6.4f'
disp_message (WindowHandle, Message, 'window', 12, 12, 'black', 'true')
disp_continue_message (WindowHandle, 'black', 'true')
stop ()
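* Optional sanity check (not part of the original example; the thresholds
* of 1 mm and 0.1 degrees are illustrative assumptions, not HALCON defaults):
* if (Errors[0] > 0.001 or Errors[1] > 0.1)
*     disp_message (WindowHandle, 'Calibration error is high - check the input poses', 'window', 40, 12, 'red', 'true')
* endif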
* For the given camera, get the corresponding pose indices and calibration object indices
query_calib_data_observ_indices (CalibDataID, 'camera', 0, CalibObjIdx, PoseIds)
* Compute the pose of the calibration object in the camera coordinate
* system via calibrated poses and the ToolInBasePose and visualize it.
for I := 0 to NumImages - 1 by 1
read_image (Image, ImageNameStart + I$'02d')
dev_display (Image)
* Obtain the pose of the tool in robot base coordinates used in the calibration.
* The index corresponds to the index of the pose of the observation object.
get_calib_data (CalibDataID, 'tool', PoseIds[I], 'tool_in_base_pose', ToolInBasePose)
* Compute the pose of the calibration object relative to the camera
calc_calplate_pose_movingcam (CalObjInBasePose, ToolInCamPose, ToolInBasePose, CalObjInCamPose)
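* calc_calplate_pose_movingcam is a procedure delivered with this example;
* a sketch with basic pose operators (assuming the procedure evaluates
* exactly this chain) would be:
* pose_invert (ToolInBasePose, BaseInToolPose)
* pose_compose (ToolInCamPose, BaseInToolPose, BaseInCamPose)
* pose_compose (BaseInCamPose, CalObjInBasePose, CalObjInCamPose)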
* Display the coordinate system
dev_set_colored (3)
disp_3d_coord_system (WindowHandle, CamParam, CalObjInCamPose, 0.01)
Message := 'Using the calibration results to display '
Message[1] := 'the coordinate system in image ' + (I + 1) + ' of ' + NumImages
disp_message (WindowHandle, Message, 'window', 12, 12, 'black', 'true')
if (I < NumImages - 1)
disp_continue_message (WindowHandle, 'black', 'true')
stop ()
endif
endfor
* Clear the data model
clear_calib_data (CalibDataID)
*
* After the hand-eye calibration the computed pose
* ToolInCamPose can be used in robotic grasping applications.
* If the tool coordinate system is placed at the gripper
* and a detected object ObjInCamPose shall be grasped
* (here the calibration object),
* the pose of the detected object relative
* to the robot base coordinate system has to be computed.
* Invert ToolInCamPose to obtain the pose of the camera in the tool coordinate system
pose_invert (ToolInCamPose, CamInToolPose)
* Composing two known poses yields the third one; as with multiplying two
* fractions where one numerator equals the other denominator, the shared
* intermediate coordinate system cancels out.
pose_compose (ToolInBasePose, CamInToolPose, CamInBasePose)
pose_compose (CamInBasePose, CalObjInCamPose, ObjInBasePose)
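* Illustration (added, not part of the original example): extract a grasp
* point from ObjInBasePose; taking the object origin as the grasp point is
* an assumption made only for this sketch.
pose_to_hom_mat3d (ObjInBasePose, BaseHObj)
affine_trans_point_3d (BaseHObj, 0, 0, 0, GraspX, GraspY, GraspZ)
* (GraspX, GraspY, GraspZ) is the object origin expressed in robot base
* coordinates; together with an orientation derived from ObjInBasePose it
* can be sent to the robot controller as a grasping target.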

Original post: https://www.cnblogs.com/yangmengke2018/p/9742466.html
