Lane-based Targetless Vehicle-Camera Calibration
This page lets users view, create, launch, and delete calibration datasets. Admins can manage users’ access to these datasets on this page.
Click on New Calibration to create a new calibration dataset.
Select Vehicle-Camera Calibration to create a new dataset.
Upon selecting Vehicle-Camera Calibration, the user is taken to the instructions page. Click Get started to begin the calibration setup.
Select the Terrain as Flat and the Approach as Targetless.
Select the lane-based calibration option "Atleast 3 equidistant lane boundary lines (New)".
Intrinsic parameters for the camera are to be added here. Users have three options.
Users can use the Camera Intrinsic Calibration tool to compute the intrinsics, save them to their profile, and then load them here. For more details, click here.
Users can also load the JSON file.
Users can manually enter the intrinsic parameters if they already have them.
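As a rough illustration of the JSON-loading option, the sketch below parses an intrinsics file and assembles the 3x3 camera matrix. The field names (`fx`, `fy`, `cx`, `cy`, `distortion`) are assumptions for illustration; the actual schema expected by the tool may differ, so check a JSON file exported from the tool itself.

```python
import json

# Hypothetical intrinsics JSON; the real schema may use different keys.
intrinsics_json = """{
  "fx": 721.5, "fy": 721.5, "cx": 609.6, "cy": 172.9,
  "distortion": [-0.36, 0.17, 0.0, 0.0, 0.0]
}"""

params = json.loads(intrinsics_json)

# 3x3 camera matrix K built from focal lengths and principal point
K = [[params["fx"], 0.0,           params["cx"]],
     [0.0,          params["fy"],  params["cy"]],
     [0.0,          0.0,           1.0]]

print(K)
```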
Select any of the uploaded images and draw at least three straight lines along the equidistant lane boundaries visible in the image.
Repeat the above step for at least 10 images to improve the accuracy of the final calibration results.
Upon completion, click 'Continue' to move to the calibration page.
On the calibration page, click Calibrate to compute the extrinsic parameters of the camera with respect to the vehicle coordinate system.
On successful calibration, click the 'Visualize' button on the top right to view the Bird's Eye View (BEV) representation of the camera image rendered with the calibrated extrinsic parameters.
For an ideal calibration, the lanes should appear parallel and equidistant when transformed into the BEV (Bird's Eye View) image.
Based on this property, the tool computes a Parallelism error and an Equidistance error and combines the two into the final error.
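The two error terms can be illustrated with a minimal sketch. This is not Deepen's exact formulation, only one plausible way to score the property described above: parallelism as the spread of line headings in the BEV plane, and equidistance as the spread of spacings between adjacent lane boundaries.

```python
import numpy as np

def lane_errors(lines):
    """lines: list of ((x1, y1), (x2, y2)) lane boundaries in the BEV plane.
    Returns (parallelism_error, equidistance_error) -- illustrative metrics."""
    headings, offsets = [], []
    for (x1, y1), (x2, y2) in lines:
        v = np.array([x2 - x1, y2 - y1], dtype=float)
        v /= np.linalg.norm(v)
        if v[0] < 0:                 # normalise sign so opposite directions match
            v = -v
        headings.append(np.arctan2(v[1], v[0]))
        n = np.array([-v[1], v[0]])  # unit normal of the line
        offsets.append(float(n @ np.array([x1, y1], dtype=float)))
    parallelism = float(np.std(headings))   # spread of line headings (radians)
    gaps = np.diff(np.sort(offsets))        # spacing between adjacent boundaries
    equidistance = float(np.std(gaps))      # spread of the spacings
    return parallelism, equidistance

# Perfectly parallel, equally spaced lanes -> both errors are zero.
lines = [((0.0, y), (10.0, y)) for y in (0.0, 3.5, 7.0)]
print(lane_errors(lines))  # (0.0, 0.0)
```

With a good calibration, both terms approach zero; a roll or pitch error skews the BEV and inflates one or both of them.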
The graph below plots the actual ground-truth error against the Deepen error function for 9 observations, including the ground truth, on a KITTI dataset.
The plot shows a strong correlation between our error function and the actual deviation from the ground truth, within 1.3 degrees of the ground truth.
The ground truth for the above-mentioned dataset is 0.634 (Roll), -0.430 (Pitch), 0.310 (Yaw), and the extrinsic parameters estimated by optimising our error function are 1.568, -0.215, 0.327. This is a deviation of just 0.95 degrees from the ground truth, with most of the error in the estimation of the Roll angle.
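The 0.95-degree figure can be reproduced as the geodesic angle between the two rotations. The sketch below assumes a Z-Y-X (yaw-pitch-roll) Euler convention; the tool's exact convention may differ, though at these small angles the result barely changes.

```python
import numpy as np

def rpy_to_matrix(roll, pitch, yaw):
    """Rotation matrix from roll/pitch/yaw in degrees (assumed Z-Y-X order)."""
    r, p, y = np.radians([roll, pitch, yaw])
    Rx = np.array([[1, 0, 0], [0, np.cos(r), -np.sin(r)], [0, np.sin(r), np.cos(r)]])
    Ry = np.array([[np.cos(p), 0, np.sin(p)], [0, 1, 0], [-np.sin(p), 0, np.cos(p)]])
    Rz = np.array([[np.cos(y), -np.sin(y), 0], [np.sin(y), np.cos(y), 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def angular_deviation_deg(rpy_a, rpy_b):
    """Geodesic angle (degrees) between two rotations given as RPY triples."""
    R = rpy_to_matrix(*rpy_a).T @ rpy_to_matrix(*rpy_b)
    cos_theta = np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0)
    return float(np.degrees(np.arccos(cos_theta)))

gt  = (0.634, -0.430, 0.310)   # ground truth (roll, pitch, yaw), from the text above
est = (1.568, -0.215, 0.327)   # estimated extrinsics, from the text above
print(round(angular_deviation_deg(gt, est), 2))  # close to the 0.95 deg quoted above
```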
The extrinsic parameters of the camera are with respect to the vehicle ROS coordinate system.
In the ROS coordinate system of a vehicle, the X-axis points along the vehicle's direction of travel, the Y-axis points to the left of the vehicle, and the Z-axis is perpendicular to the road plane, pointing up.
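Since the camera's optical frame uses a different convention (x right, y down, z forward), there is a fixed rotation between the two frames even before any roll/pitch/yaw offset is applied. The sketch below shows that alignment as an assumed illustration of the frame conventions, not the tool's internal representation.

```python
import numpy as np

# Rotation taking points from the camera optical frame (x right, y down,
# z forward) to the vehicle ROS frame (x forward, y left, z up).
# The calibrated roll/pitch/yaw describe small rotations on top of an
# alignment like this one (illustrative assumption).
R_vehicle_from_optical = np.array([
    [ 0.0,  0.0, 1.0],   # vehicle x (forward) =  optical z
    [-1.0,  0.0, 0.0],   # vehicle y (left)    = -optical x
    [ 0.0, -1.0, 0.0],   # vehicle z (up)      = -optical y
])

p_optical = np.array([0.0, 0.0, 5.0])   # a point 5 m ahead along the optical axis
print(R_vehicle_from_optical @ p_optical)  # lands 5 m along the vehicle's x-axis
```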
In the tool, the extrinsic parameters Roll, Pitch, and Yaw are expressed in degrees. These are the extrinsic parameters included in the results downloaded from the calibration tool.