Non-Overlapping-Camera Calibration


Overview: The Non-Overlapping-Camera Calibration app is a software tool that makes the process of calibrating non-overlapping cameras simple and quick.

  1. Calibration List Page

This page contains the list of calibrations. You can launch an existing calibration dataset, delete it, and manage your access to these datasets.

2. Non-Overlapping-Camera Calibration Launch

You can select an existing Non-Overlapping-Camera calibration from the list page, or click the “New Calibration” button on the top-right of the calibration list page and select Non-Overlapping-Camera calibration in the window that pops up.

3. Start Page

The page lists the important instructions and requirements that you need to follow in order to complete the calibration process.

  • Start the app by clicking on the “Get Started” button

  • Choose the calibration method (Calibration target / Targetless). The 'Calibration target' method requires checkerboard images in the overlapping field of view; the 'Targetless' approach does not require a checkerboard.

  • Input the name you wish to give this calibration. This name will be shown on the calibration list page. Then click on “Set Details”.

4. Set up Support-Camera details

You can fill in the basic details for the support camera here; the same kind of details are filled in later for camera 1 and camera 2 of the non-overlapping pair.

  • Camera name: Input the name for the support camera sensor

  • Camera lens model: Select the lens model of your support camera

  • Fill Camera intrinsic parameters: Import the intrinsics from a profile or type them in manually

  • Import from JSON: You can use JSON files to import the intrinsic parameters as well (a sketch of such a file follows this list).

  • Once all the details are filled in, click the “Continue” button on the bottom right.
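If you import the intrinsics from JSON, the file typically carries the camera matrix values and distortion coefficients. The exact schema is not documented on this page, so the snippet below is only a minimal sketch with assumed field names; adjust the keys to whatever the import dialog actually expects.

```python
import json

# Minimal sketch of an intrinsics file. Every field name here is an
# assumption made for illustration; match the keys and units to the
# schema the "Import from JSON" option actually expects.
intrinsics = {
    "camera_name": "support_camera",             # hypothetical field
    "lens_model": "pinhole",                     # hypothetical field
    "fx": 1432.7, "fy": 1433.1,                  # focal lengths (pixels)
    "cx": 960.0, "cy": 540.0,                    # principal point (pixels)
    "distortion": [-0.17, 0.03, 0.0, 0.0, 0.0],  # k1, k2, p1, p2, k3
}

with open("support_camera_intrinsics.json", "w") as f:
    json.dump(intrinsics, f, indent=2)
```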

5. Set up Camera 1 details

The process is similar to setting up the support camera details.

  • Distance between support camera and camera 1 (m): Input the distance between the support camera and camera 1 in meters. (Applicable for the targetless approach.)

6. Add image pairs

You need to add the images taken from camera 1 and the support camera. This page allows you to pair them manually, and it also has an “Auto Pair” button to pair the files automatically based on their filename order (a pairing sketch follows the list below). It is recommended to follow the file naming guidelines from the Non-Overlapping-Camera Calibration data collection doc.

  • Upload the images for camera 1

  • Upload the images for the support camera

  • Files are automatically paired based on the file name order.

  • Verify that the pairing is correct. If not, select the correct file on the left and right and pair them using the “Add Pair” button at the bottom.

  • Delete any unpaired files using the delete button that appears when you hover over a file.

  • Click on the “Continue” button.
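As mentioned above, Auto Pair matches files by their sorted filename order. The app's actual pairing rules are not published; the sketch below only illustrates the idea and why consistent, order-preserving file names matter.

```python
from pathlib import Path

# Pair camera 1 images with support camera images by sorted filename
# order, mirroring the idea behind "Auto Pair". The folder names are
# examples; the tool's real matching logic may differ.
cam1_files = sorted(Path("camera_1").glob("*.png"))
support_files = sorted(Path("support_camera").glob("*.png"))

if len(cam1_files) != len(support_files):
    print("Unequal file counts; delete unpaired files before continuing.")

for cam1_img, support_img in zip(cam1_files, support_files):
    print(f"{cam1_img.name}  <->  {support_img.name}")
```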

7. Set details

Checkerboard: Set checkerboard details

You need to fill in the checkerboard details: horizontal corners, vertical corners, and dimension (i.e. the square size of the checkerboard), then click on “Set Config”. Then click on Continue.

The square dimension should be in meters, and the horizontal and vertical corner counts refer to the inner corners of the checkerboard.

Targetless: Detect feature points

Click the "Detect" button to identify important point matches in the image pair. You can visualize the matches by following along the corresponding lines (Hover to highlight).

Go through the detected points and choose the best set of matching point results from the dropdown. (If there is only one result set, no action is needed.)
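The detector the app uses is not specified on this page. If you want to sanity-check an image pair offline, the sketch below uses OpenCV ORB features with a brute-force matcher, which is only a rough stand-in for whatever the "Detect" step actually runs.

```python
import cv2

# Illustrative feature matching for one camera 1 / support camera image
# pair. File paths are examples; the app's own detector may differ.
img1 = cv2.imread("camera_1/frame_000.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("support_camera/frame_000.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=2000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

# Draw the 50 strongest matches so they can be inspected visually,
# similar to following the match lines in the editor.
vis = cv2.drawMatches(img1, kp1, img2, kp2, matches[:50], None)
cv2.imwrite("matches.png", vis)
```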

8. Detect Checkerboard corners in the images

On this page, you just have to click “Run Detect corners” to get the checkerboard corners for all images. This is to make sure that the calibration identifies all checkerboard corners in all images for the best results. Once all checkerboard corners are detected, click on “Continue” to proceed to the calibration step.
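For reference, corner detection works on the inner corners of the checkerboard, using the horizontal and vertical counts entered on the Set details page. A minimal OpenCV sketch of detecting them in a single image (a 9x6 inner-corner board is assumed here; adjust to your board):

```python
import cv2

# Detect the inner corners of a checkerboard in one image. The
# (horizontal, vertical) inner-corner counts must match the values
# entered in the "Set checkerboard details" step; 9x6 is an example.
pattern_size = (9, 6)
img = cv2.imread("camera_1/frame_000.png", cv2.IMREAD_GRAYSCALE)

found, corners = cv2.findChessboardCorners(img, pattern_size)
if found:
    # Refine the detected corners to sub-pixel accuracy.
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3)
    corners = cv2.cornerSubPix(img, corners, (11, 11), (-1, -1), criteria)
print("corners detected:", found)
```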

9. Camera-2 Setup

Repeat the above three steps for camera 2, using the camera 2 and support camera images.

10. Calibrate

Checkerboard

In this step, you simply have to click on the “Run calibration” button. Once clicked, the images are processed and, once done, the calibration results are shown on the right side of the page. The Visualise button also becomes enabled and allows you to visualise the calibration results.

Targetless

On this page, you just have to click “Run Calibration” to get the calibration results. Once clicked, the images are processed and, once done, the calibration results are shown on the right side of the page. The 'Visualise' button and the 'Display Epilines' checkbox also become enabled, allowing you to visualise the calibration results.

11. Error stats

You can toggle the error stats and select either individual errors or average errors. Individual errors show the errors for the image pair selected in the left pane. The error stats are shown separately for the two camera pairs.

Checkerboard

  • Left Reprojection Error: The pixel error when the detected corners from the right image are projected onto the left image using the calibration results (a computation sketch follows this list)

  • Right Reprojection Error: The pixel error when the detected corners from the left image are projected onto the right image using the calibration results

  • Translation Error: The distance between the means of the 3d projections of the checkerboard in a 3d scene from the left images and right images

  • Rotation Error: The angle between the planes of the 3d projections in a 3d scene of the checkerboard from the left images and right images
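These definitions follow standard stereo-calibration conventions. The sketch below shows one plausible way to compute the left reprojection error offline from the board geometry, the detected corners, and the estimated extrinsics; the tool's exact computation may differ.

```python
import cv2
import numpy as np

def left_reprojection_error(obj_pts, corners_left, corners_right,
                            K_left, dist_left, K_right, dist_right, R, t):
    """Sketch of the 'Left Reprojection Error': corners detected in the
    right image are lifted to 3D via the board pose, moved into the left
    camera frame with the estimated extrinsics (R, t), projected, and
    compared in pixels with the corners detected in the left image.
    All arguments come from your own data and calibration result."""
    # Pose of the checkerboard in the right camera frame.
    _, rvec, tvec = cv2.solvePnP(obj_pts, corners_right, K_right, dist_right)
    R_board, _ = cv2.Rodrigues(rvec)
    pts_right = R_board @ obj_pts.T + tvec            # 3xN, right camera frame

    # Transform the 3D corners into the left camera frame and project them.
    pts_left = (R @ pts_right + t.reshape(3, 1)).T    # Nx3, left camera frame
    proj, _ = cv2.projectPoints(pts_left, np.zeros(3), np.zeros(3),
                                K_left, dist_left)

    diff = proj.reshape(-1, 2) - corners_left.reshape(-1, 2)
    return float(np.linalg.norm(diff, axis=1).mean())
```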

Targetless

  • Epiline Point Distance: Average pixel distance of each point to its corresponding projected epiline (a computation sketch follows this list).

  • Epipolar Error: Proportional to the distance of a point from its epiline. Does not have a physical meaning. It is the residual error from minimizing the epipolar constraints while calculating the fundamental/essential matrix
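If you want to reproduce the epiline point distance offline, OpenCV can compute epilines from the fundamental matrix. A minimal sketch, assuming F maps points in the first image to epilines in the second (the tool may also average the distance over both directions):

```python
import cv2
import numpy as np

def epiline_point_distance(pts1, pts2, F):
    """Average pixel distance of each point in image 2 to the epiline
    induced by its matched point in image 1 (a sketch of the 'Epiline
    Point Distance' stat; pts1/pts2 are matched Nx2 pixel coordinates)."""
    pts1 = np.asarray(pts1, np.float32).reshape(-1, 1, 2)
    pts2 = np.asarray(pts2, np.float32).reshape(-1, 2)

    # Epilines in image 2 for points from image 1: a*x + b*y + c = 0.
    lines = cv2.computeCorrespondEpilines(pts1, 1, F).reshape(-1, 3)
    a, b, c = lines[:, 0], lines[:, 1], lines[:, 2]
    dist = np.abs(a * pts2[:, 0] + b * pts2[:, 1] + c) / np.sqrt(a**2 + b**2)
    return float(dist.mean())
```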

12. Visualise

Clicking the Visualise button takes you to the visualisation screen. You can see the cameras and the checkerboard of the selected image pair on the screen. You can move around the 3d scene by click-dragging with the left (pan) or right (orbit) mouse button to get a better sense of the view. In the options dropdown on the top-right, you also have the option to show/hide the checkerboards from all the image pairs at once.

Only camera frustums are visible for the Targetless method.

13. Export

You can export the results of the calibration using the “Export” button on the top-right of the page. Fill in the name of the output file and click on Export.
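The structure of the exported file is not documented on this page. Purely as a hypothetical example of post-processing, assuming the export contains a 3x3 rotation matrix and a translation vector for a camera pair (the key names below are assumptions), you could convert the rotation to roll/pitch/yaw like this:

```python
import json
import numpy as np
from scipy.spatial.transform import Rotation

# Hypothetical post-processing of an exported result. The file name and
# the "rotation"/"translation" keys are assumptions for illustration;
# inspect your exported file and adjust the keys accordingly.
with open("non_overlapping_calibration.json") as f:
    result = json.load(f)

R = np.array(result["rotation"])       # assumed 3x3 rotation matrix
t = np.array(result["translation"])    # assumed translation vector, meters

rpy = Rotation.from_matrix(R).as_euler("xyz", degrees=True)
print("roll/pitch/yaw (deg):", rpy)
print("translation (m):", t)
```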

Checkerboard image pairs
Targetless images