Preface
The longer a lens's focal length, the narrower its field of view. Fisheye lenses, thanks to their wide field of view, high shooting efficiency, and image quality that is generally adequate for web viewing, are regarded as an ideal tool for capturing 3D panoramas. The main difference between a fisheye lens (field of view typically 180 degrees or more) and an ordinary ultra-wide-angle lens (typically 80-120 degrees) is the size of the field of view.
1. Fisheye camera basics
1.1 The fisheye camera model

1.2 Fitting the model from the distortion table
- Intrinsics
- Distortion table
- The OpenCV Kannala-Brandt camera distortion model describes, for a ray with a given angle of incidence, the distance r between the center of the camera's normalized plane and the ray's refracted projection onto that plane.
- The distortion table describes, for a ray with a given angle of incidence, the distance r between the imaging center and the ray's refracted projection onto the camera's real imaging plane.
theta_input = self.data[:, 0] * np.pi / 180  # incidence angle: degrees -> radians
theta_fit = np.arctan(self.data[:, 1] / 0.95)  # focal_length = 0.95
distort_data, _ = curve_fit(func1, theta_input, theta_fit)
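The table-fitting step can be sanity-checked on synthetic data. Since the Kannala-Brandt polynomial theta_d = theta + k1*theta^3 + k2*theta^5 + k3*theta^7 + k4*theta^9 is linear in the coefficients, plain least squares recovers them exactly from a noise-free table; the curve_fit call above solves the same problem. Everything below (coefficients, focal length, table range) is made up for illustration, and the sketch assumes a table convention of r = f * theta_d (the exact mapping from table to theta_d depends on how the table was generated):

```python
import numpy as np

# Synthetic distortion table: column 0 = incidence angle (deg),
# column 1 = real image height r (mm) for an assumed f = 0.95 mm lens,
# generated from made-up Kannala-Brandt coefficients.
true_k = np.array([-0.05, 0.01, -0.002, 0.0005])
angle_deg = np.linspace(0.5, 95.0, 80)
theta = np.deg2rad(angle_deg)
A = np.stack([theta**3, theta**5, theta**7, theta**9], axis=1)
f = 0.95
r = f * (theta + A @ true_k)          # table image height: r = f * theta_d

# Fit: recover theta_d = r / f, then solve theta_d - theta = A @ k.
theta_d = r / f
k_est, *_ = np.linalg.lstsq(A, theta_d - theta, rcond=None)
```

On a real table the same code applies with `r` read from the table instead of synthesized.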
2 Fisheye undistortion with the OpenCV API
cv::fisheye::initUndistortRectifyMap uses the camera intrinsics and distortion parameters to compute the remap tables mapx and mapy.
2.1 Basic fisheye undistortion
/** @param K Camera intrinsic matrix \f$cameramatrix{K}\f$.
@param D Input vector of distortion coefficients \f$\distcoeffsfisheye\f$.
@param R Rectification transformation in the object space: 3x3 1-channel, or vector: 3x1/1x3
1-channel or 1x1 3-channel
@param P New camera intrinsic matrix (3x3) or new projection matrix (3x4)
@param size Undistorted image size.
@param m1type Type of the first output map that can be CV_32FC1 or CV_16SC2 . See convertMaps()
for details.
@param map1 The first output map.
@param map2 The second output map.
*/
CV_EXPORTS_W void initUndistortRectifyMap(InputArray K, InputArray D, InputArray R, InputArray P,
                                          const cv::Size& size, int m1type, OutputArray map1, OutputArray map2);
cv::Mat R = cv::Mat::eye(3, 3, CV_32F);
cv::Mat mapx_open, mapy_open;
cv::Mat intrinsic_undis;
fish_intrinsic.copyTo(intrinsic_undis);
// intrinsic_undis.at<float>(0, 2) *= 2;
// intrinsic_undis.at<float>(1, 2) *= 2;
cv::fisheye::initUndistortRectifyMap(
    fish_intrinsic, m_undis2fish_params, R, intrinsic_undis,
    cv::Size(intrinsic_undis.at<float>(0, 2) * 2,
             intrinsic_undis.at<float>(1, 2) * 2),
    CV_32FC1, mapx_open, mapy_open);
cv::Mat test;
cv::remap(disImg[3], test, mapx_open, mapy_open, cv::INTER_LINEAR);
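Under the hood, initUndistortRectifyMap fills mapx/mapy by taking every pixel of the undistorted output, projecting it through the new camera matrix onto the normalized plane, applying the Kannala-Brandt polynomial, and landing in the fisheye image. A minimal NumPy sketch of that per-pixel computation (R assumed to be identity; the function and variable names here are illustrative, not OpenCV API):

```python
import numpy as np

def kb_theta_d(theta, k):
    # Kannala-Brandt forward polynomial: theta_d = theta + k1*t^3 + k2*t^5 + k3*t^7 + k4*t^9
    return theta + k[0]*theta**3 + k[1]*theta**5 + k[2]*theta**7 + k[3]*theta**9

def init_undistort_map(K_new, K_fish, k, size):
    """For each pixel of the undistorted output image, compute the source
    pixel in the fisheye image (the tables that initUndistortRectifyMap returns)."""
    w, h = size
    xs, ys = np.meshgrid(np.arange(w, dtype=np.float64),
                         np.arange(h, dtype=np.float64))
    # Undistorted pixel -> normalized plane.
    xn = (xs - K_new[0, 2]) / K_new[0, 0]
    yn = (ys - K_new[1, 2]) / K_new[1, 1]
    r = np.hypot(xn, yn)
    theta = np.arctan(r)                       # incidence angle
    # At r = 0, xn = yn = 0, so the value of scale is irrelevant there;
    # the maximum just avoids a division by zero.
    scale = kb_theta_d(theta, k) / np.maximum(r, 1e-12)
    # Distorted normalized coordinates -> fisheye pixel.
    mapx = K_fish[0, 0] * xn * scale + K_fish[0, 2]
    mapy = K_fish[1, 1] * yn * scale + K_fish[1, 2]
    return mapx.astype(np.float32), mapy.astype(np.float32)
```

cv::remap then pulls each output pixel from (mapx, mapy) with bilinear interpolation.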
2.2 Adjusting the principal point
cv::Mat R = cv::Mat::eye(3, 3, CV_32F);
cv::Mat mapx_open, mapy_open;
cv::Mat intrinsic_undis;
fish_intrinsic.copyTo(intrinsic_undis);
intrinsic_undis.at<float>(0, 2) *= 2;
intrinsic_undis.at<float>(1, 2) *= 2;
cv::fisheye::initUndistortRectifyMap(
    fish_intrinsic, m_undis2fish_params, R, intrinsic_undis,
    cv::Size(intrinsic_undis.at<float>(0, 2) * 2,
             intrinsic_undis.at<float>(1, 2) * 2),
    CV_32FC1, mapx_open, mapy_open);
cv::Mat test;
cv::remap(disImg[3], test, mapx_open, mapy_open, cv::INTER_LINEAR);
cv::Mat R = cv::Mat::eye(3, 3, CV_32F);
cv::Mat mapx_open, mapy_open;
cv::Mat intrinsic_undis;
fish_intrinsic.copyTo(intrinsic_undis);
intrinsic_undis.at<float>(0, 2) *= 4;
intrinsic_undis.at<float>(1, 2) *= 4;
cv::fisheye::initUndistortRectifyMap(
    fish_intrinsic, m_undis2fish_params, R, intrinsic_undis,
    cv::Size(intrinsic_undis.at<float>(0, 2) * 2,
             intrinsic_undis.at<float>(1, 2) * 2),
    CV_32FC1, mapx_open, mapy_open);
cv::Mat test;
cv::remap(disImg[3], test, mapx_open, mapy_open, cv::INTER_LINEAR);
2.3 Adjusting the focal length f
cv::Mat R = cv::Mat::eye(3, 3, CV_32F);
cv::Mat mapx_open, mapy_open;
cv::Mat intrinsic_undis;
fish_intrinsic.copyTo(intrinsic_undis);
intrinsic_undis.at<float>(0, 0) /= 4;
intrinsic_undis.at<float>(1, 1) /= 4;
/*intrinsic_undis.at<float>(0, 2) *= 4;
intrinsic_undis.at<float>(1, 2) *= 4;*/
cv::fisheye::initUndistortRectifyMap(
    fish_intrinsic, m_undis2fish_params, R, intrinsic_undis,
    cv::Size(intrinsic_undis.at<float>(0, 2) * 2,
             intrinsic_undis.at<float>(1, 2) * 2),
    CV_32FC1, mapx_open, mapy_open);
cv::Mat test;
cv::remap(disImg[3], test, mapx_open, mapy_open, cv::INTER_LINEAR);
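Why does dividing fx and fy widen the view? For the pinhole output camera, the ray that reaches the horizontal image edge makes an angle of atan(cx / fx) with the optical axis, so shrinking the focal length pulls more of the fisheye's field into the same output size. A quick check with made-up intrinsics:

```python
import numpy as np

fx, cx = 1000.0, 640.0   # hypothetical focal length and principal point x (pixels)

# Half horizontal FOV of the undistorted pinhole output: atan(cx / fx).
half_fov = np.degrees(np.arctan(cx / fx))             # original fx
half_fov_f4 = np.degrees(np.arctan(cx / (fx / 4.0)))  # fx /= 4, as in this section
```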
3 The fisheye undistortion algorithm and its implementation
/*
func: warp from undistorted to fisheye (distorted) coordinates
@param fish_scale: scale factor applied to the focal length (see section 3.2)
@param f_dx: f/dx
@param f_dy: f/dy
@param large_center_h: undistorted image center y
@param large_center_w: undistorted image center x
@param fish_center_h: fisheye image center y
@param fish_center_w: fisheye image center x
@param undis_param: factory-calibrated distortion coefficients
@param x: input coordinate x on the undistorted image
@param y: input coordinate y on the undistorted image
*/
cv::Vec2f warpUndist2Fisheye(float fish_scale, float f_dx, float f_dy, float large_center_h,
                             float large_center_w, float fish_center_h,
                             float fish_center_w, cv::Vec4d undis_param, float x,
                             float y) {
    f_dx *= fish_scale;
    f_dy *= fish_scale;
    float y_ = (y - large_center_h) / f_dy; // normalized plane
    float x_ = (x - large_center_w) / f_dx;
    float r_ = static_cast<float>(sqrt(pow(x_, 2) + pow(y_, 2)));
    // Look-up table alternative:
    /*int num = atan(r_) / atan(m_d) * 1024;
    float angle_distorted = m_Lut[num];*/
    float angle_undistorted = atan(r_); // theta
    float angle_undistorted_p2 = angle_undistorted * angle_undistorted;
    float angle_undistorted_p3 = angle_undistorted_p2 * angle_undistorted;
    float angle_undistorted_p5 = angle_undistorted_p2 * angle_undistorted_p3;
    float angle_undistorted_p7 = angle_undistorted_p2 * angle_undistorted_p5;
    float angle_undistorted_p9 = angle_undistorted_p2 * angle_undistorted_p7;
    float angle_distorted = static_cast<float>(angle_undistorted +
                                               undis_param[0] * angle_undistorted_p3 +
                                               undis_param[1] * angle_undistorted_p5 +
                                               undis_param[2] * angle_undistorted_p7 +
                                               undis_param[3] * angle_undistorted_p9);
    // scale = r_distorted / r_undistorted on the normalized plane
    float scale = angle_distorted / (r_ + 0.00001f);
    cv::Vec2f warp_xy;
    float xx = (x - large_center_w) / fish_scale;
    float yy = (y - large_center_h) / fish_scale;
    warpPointOpencv(warp_xy, fish_center_h, fish_center_w, xx, yy, scale);
    return warp_xy;
}

void warpPointOpencv(cv::Vec2f &warp_xy, float map_center_h, float map_center_w,
                     float x_, float y_, float scale) {
    warp_xy[0] = x_ * scale + map_center_w;
    warp_xy[1] = y_ * scale + map_center_h;
}
3.1 Basic fisheye undistortion (principal-point related)
1. For a pixel on the image plane, use the camera intrinsics f, dx, dy to project it onto the normalized plane (point e in the figure above). Compute its distance r_ from the center of the normalized plane and the corresponding incidence angle, the theta angle in the figure above.
2. With the Kannala-Brandt fisheye model formula and the pre-fitted parameters k1, k2, k3, k4, compute the distorted position r_distorted of the point on the normalized plane.
3. On the normalized plane, compute the ratio of the point's position after and before distortion: r_distorted / r_.
4. The ratio from step 3 is computed on the normalized plane, but it applies equally on the camera's imaging plane and on the image plane. Multiplying the pixel coordinates on the image plane by this factor therefore gives the pixel's position on the fisheye image.
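The four steps can be sketched per point. This is a hedged Python port of the warpUndist2Fisheye logic with fish_scale = 1; all parameter values in the example are made up:

```python
import math

def warp_undist_to_fisheye(x, y, f_dx, f_dy, undis_center, fish_center, k):
    # Step 1: image plane -> normalized plane; radius r_ and incidence angle theta.
    x_ = (x - undis_center[0]) / f_dx
    y_ = (y - undis_center[1]) / f_dy
    r_ = math.hypot(x_, y_)
    theta = math.atan(r_)
    # Step 2: Kannala-Brandt polynomial gives the distorted radius on the normalized plane.
    r_distorted = theta + k[0]*theta**3 + k[1]*theta**5 + k[2]*theta**7 + k[3]*theta**9
    # Step 3: ratio of the positions after/before distortion.
    scale = r_distorted / (r_ + 1e-5)
    # Step 4: apply the same ratio on the image plane, around the fisheye center.
    return ((x - undis_center[0]) * scale + fish_center[0],
            (y - undis_center[1]) * scale + fish_center[1])

# With all coefficients zero the model is the ideal equidistant fisheye:
# a pixel on a 45-degree ray (normalized radius 1) lands at radius pi/4.
pt = warp_undist_to_fisheye(200.0, 100.0, 100.0, 100.0, (100.0, 100.0), (50.0, 50.0),
                            (0.0, 0.0, 0.0, 0.0))
```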
3.2 Advanced fisheye undistortion (how to adjust f)
1. After changing the camera focal length to f/2, use the new focal length to project the pixel onto the normalized plane.
2. With the distortion parameters k1, k2, k3, k4, compute its distorted position on the normalized plane.
3. From the results of the previous two steps, compute the length ratio scale between the segments before and after distortion.
4. From the known pixel offset and the scale computed above, obtain the distorted position.
5. Multiplying by 2 maps the point back onto the f plane, which gives the undistortion mapping for the f/2 imaging plane.

- When f is made smaller, the same content is concentrated into a smaller resolution, which is friendlier to downstream image-processing algorithms; many aliasing and blur problems improve.
- Detecting checkerboard corners on the fisheye image is more accurate than detecting them on the undistorted image, because the large squares are stretched severely after undistortion. This conclusion is based on reprojection error, and on visually comparing fisheye-detected corner coordinates, mapped onto the undistorted image, against the corner positions detected there directly.
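The steps above match the fish_scale handling in warpUndist2Fisheye from section 3: the focal length is multiplied by fish_scale before normalization, and the pixel offset is divided by it afterwards. A hedged Python sketch with made-up parameters:

```python
import math

def warp_undist_to_fisheye_scaled(x, y, f_dx, f_dy, undis_center, fish_center, k, fish_scale):
    # Normalize with the scaled focal length (fish_scale = 0.5 gives the f/2 plane).
    f_dx *= fish_scale
    f_dy *= fish_scale
    x_ = (x - undis_center[0]) / f_dx
    y_ = (y - undis_center[1]) / f_dy
    r_ = math.hypot(x_, y_)
    theta = math.atan(r_)
    r_distorted = theta + k[0]*theta**3 + k[1]*theta**5 + k[2]*theta**7 + k[3]*theta**9
    scale = r_distorted / (r_ + 1e-5)
    # Dividing the pixel offset by fish_scale is step 5's "*2": it maps the point
    # back onto the original f plane before the radial scale is applied.
    return ((x - undis_center[0]) / fish_scale * scale + fish_center[0],
            (y - undis_center[1]) / fish_scale * scale + fish_center[1])

# Same pixel as the section 3.1 sketch, but on the f/2 plane: the larger
# normalized radius (2 instead of 1) corresponds to a wider incidence angle.
pt = warp_undist_to_fisheye_scaled(200.0, 100.0, 100.0, 100.0, (100.0, 100.0),
                                   (50.0, 50.0), (0.0, 0.0, 0.0, 0.0), 0.5)
```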
3.3 Implementation of the OpenCV API undistortPoints

# forward fit: incidence angle -> image height
self.distor_para, _ = curve_fit(self.func, self.data[:, 0], self.data[:, 1])
# inverse fit: image height -> incidence angle
f_inverse_para, _ = curve_fit(self.func_inverse, self.data[:, 1], self.data[:, 0])
cv::Vec2f CalibrateInit::warpFisheye2Undist(float fish_scale, float f_dx, float f_dy, float undis_center_h,
                                            float undis_center_w, float fish_center_h,
                                            float fish_center_w, cv::Vec4d undis_param, float x,
                                            float y) {
    // f_dx *= fish_scale;
    // f_dy *= fish_scale;
    float y_ = (y - fish_center_h) / f_dy; // normalized plane
    float x_ = (x - fish_center_w) / f_dx;
    float r_distorted = static_cast<float>(sqrt(pow(x_, 2) + pow(y_, 2)));
    float r_distorted_p2 = r_distorted * r_distorted;
    float r_distorted_p3 = r_distorted_p2 * r_distorted;
    float r_distorted_p4 = r_distorted_p2 * r_distorted_p2;
    float r_distorted_p5 = r_distorted_p2 * r_distorted_p3;
    // fitted inverse polynomial: distorted radius -> incidence angle theta
    float angle_undistorted = static_cast<float>(r_distorted +
                                                 undis_param[0] * r_distorted_p2 +
                                                 undis_param[1] * r_distorted_p3 +
                                                 undis_param[2] * r_distorted_p4 +
                                                 undis_param[3] * r_distorted_p5);
    // scale = r_undistorted / r_distorted on the normalized plane
    float r_undistorted = tanf(angle_undistorted);
    float scale = r_undistorted / (r_distorted + 0.00001f);
    cv::Vec2f warp_xy;
    float xx = (x - fish_center_w) * fish_scale;
    float yy = (y - fish_center_h) * fish_scale;
    warpPointInverse(warp_xy, undis_center_h, undis_center_w, xx, yy, scale);
    return warp_xy;
}

void CalibrateInit::warpPointInverse(cv::Vec2f& warp_xy, float map_center_h, float map_center_w,
                                     float x_, float y_, float scale) {
    warp_xy[0] = x_ * scale + map_center_w;
    warp_xy[1] = y_ * scale + map_center_h;
}
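The inverse mapping can be sketched in Python the same way: apply the fitted inverse polynomial to get the incidence angle, then take tan(theta) for the undistorted radius. Parameter values below are made up; with a zero inverse polynomial the model reduces to the ideal equidistant fisheye, for which theta simply equals the distorted normalized radius:

```python
import math

def warp_fisheye_to_undist(x, y, f_dx, f_dy, undis_center, fish_center, inv_k):
    # Fisheye pixel -> normalized plane (distorted radius).
    x_ = (x - fish_center[0]) / f_dx
    y_ = (y - fish_center[1]) / f_dy
    r_d = math.hypot(x_, y_)
    # Fitted inverse polynomial: distorted radius -> incidence angle theta.
    theta = r_d + inv_k[0]*r_d**2 + inv_k[1]*r_d**3 + inv_k[2]*r_d**4 + inv_k[3]*r_d**5
    # Undistorted radius on the normalized plane is tan(theta).
    scale = math.tan(theta) / (r_d + 1e-5)
    return ((x - fish_center[0]) * scale + undis_center[0],
            (y - fish_center[1]) * scale + undis_center[1])

# A fisheye pixel at normalized radius 1 (theta = 1 rad) maps out to radius tan(1).
pt = warp_fisheye_to_undist(150.0, 50.0, 100.0, 100.0, (100.0, 100.0), (50.0, 50.0),
                            (0.0, 0.0, 0.0, 0.0))
```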
Summary

