Rough terrain vehicle Lidar calibration


Calibration list page:

Click on ‘New Calibration’ on the calibration list page.

Calibration type selection:

Click on Vehicle-LiDAR calibration to start it.

Calibration welcome page:

A welcome page with a set of instructions is shown at the beginning. Click ‘Get started’ to proceed.

Vehicle configuration page:

Enter the calibration configuration along with the calibration name. The following are the vehicle configuration parameters; a minimal configuration sketch follows the list.

  • Wheelbase: Distance between left/right front wheel center and left/right rear wheel center in meters. If asked for a Wheelbase without left/right, measure either the left or right wheelbase. If the wheelbase is the same on both left and right, the right wheelbase is optional.

  • Track: Distance between front/rear left wheel center and front/rear right wheel center in meters. If asked for just Track without front/rear, measure either the front or rear track.

  • Diameter: Distance from the bottom of the front/rear wheel to the top of the front/rear wheel (wheel height) in meters. Note: the wheel diameter must include the tire as well.
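
The sketch below shows one way these measurements could be organized before entering them in the tool. It is only illustrative: the key names and values are assumptions, not the exact fields used by Deepen.

```python
# Illustrative vehicle configuration; key names and values are placeholders,
# not the exact fields used by the Deepen tool.
vehicle_config = {
    "calibration_name": "rough_terrain_vehicle_lidar_demo",
    "wheelbase_left_m": 2.85,   # left front wheel center to left rear wheel center
    "wheelbase_right_m": 2.85,  # optional if equal to the left wheelbase
    "track_front_m": 1.60,      # front left wheel center to front right wheel center
    "track_rear_m": 1.62,       # rear left wheel center to rear right wheel center
    "wheel_diameter_m": 0.72,   # wheel height, including the tire
}
```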

Configure checkerboard and ArUco:

ArUco markers are used for automatic wheel detection. Add the square size of the marker.

Similarly, the checkerboard configuration needs to be updated; a sample sketch of these parameters follows the list.

  • Horizontal Corner Count: The number of corners in the top row, counted from first to last (left to right). The blue dots shown in the above preview correspond to the horizontal corners.

  • Vertical Corner Count: The number of corners in the left column, counted from first to last (top to bottom). The red dots shown in the above preview correspond to the vertical corners.

  • Square Size: The length of one side of a checkerboard square, in meters. It corresponds to the side length of the yellow square highlighted in the preview.

  • Left padding: The distance from the leftmost side of the board to the leftmost corner point in meters. Corresponds to the blue line in the preview.

  • Right padding: The distance from the rightmost side of the board to the rightmost corner point in meters. Corresponds to the red line in the preview.

  • Top padding: The distance from the topmost side of the board to the topmost corner point in meters. Corresponds to the red line in the preview.

  • Bottom padding: The distance from the bottommost side of the board to the bottommost corner point in meters. Corresponds to the blue line in the preview.
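
The sketch below gathers the target parameters described above into one place. The key names and numbers are illustrative assumptions, not the tool's exact fields.

```python
# Illustrative checkerboard/ArUco target configuration; names and values are
# placeholders, not the exact fields used by the Deepen tool.
target_config = {
    "aruco_square_size_m": 0.15,    # side length of the ArUco marker
    "horizontal_corner_count": 7,   # corners in the top row, left to right
    "vertical_corner_count": 5,     # corners in the left column, top to bottom
    "square_size_m": 0.10,          # side length of one checkerboard square
    "left_padding_m": 0.05,         # board edge to leftmost corner point
    "right_padding_m": 0.05,        # board edge to rightmost corner point
    "top_padding_m": 0.05,          # board edge to topmost corner point
    "bottom_padding_m": 0.05,       # board edge to bottommost corner point
}
```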

Name of the Lidar:

Upload mounted Lidar files:

Upload the left- and right-mounted Lidar files. The left-mounted Lidar PCD is recorded when the checkerboard is placed on the left side of the vehicle. Similarly, the right-mounted PCD is recorded when the checkerboard is placed on the right side of the vehicle.

For now, only the PCD format is supported for the Lidar recordings.
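
Before uploading, it can help to confirm that each recording is a readable, non-empty PCD. The snippet below is a minimal sketch using the Open3D library (an assumption; it is not part of the Deepen tool), with placeholder file names.

```python
import open3d as o3d  # assumed helper library, not part of the Deepen tool

def check_pcd(path: str) -> None:
    """Sanity-check that a Lidar recording is a readable, non-empty PCD file."""
    pcd = o3d.io.read_point_cloud(path)
    if len(pcd.points) == 0:
        raise ValueError(f"{path} has no points or is not a valid PCD file")
    print(f"{path}: {len(pcd.points)} points")

check_pcd("left_mounted.pcd")   # recorded with the board on the left side
check_pcd("right_mounted.pcd")  # recorded with the board on the right side
```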

Make sparse board dense:

Users need to select the four border corners of the board. The tool then identifies the edges of the board and makes the sparse board dense. Users can select the options shown in the top-right corner to display the edge points or the checkerboard.

This step has to be repeated for both point cloud files.
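
The idea behind the densification can be sketched as follows: once the four border corners are known, the board surface can be filled with a regular grid of interpolated points. This is only a conceptual sketch, not the tool's actual algorithm.

```python
import numpy as np

def densify_board(corners: np.ndarray, points_per_side: int = 100) -> np.ndarray:
    """Conceptual sketch: given the four selected border corners of the board
    (ordered top-left, top-right, bottom-right, bottom-left; shape (4, 3)),
    fill the board surface with a regular grid of bilinearly interpolated points."""
    tl, tr, br, bl = corners
    u, v = np.meshgrid(np.linspace(0, 1, points_per_side),
                       np.linspace(0, 1, points_per_side))
    u = u[..., None]  # left-to-right coordinate across the board
    v = v[..., None]  # top-to-bottom coordinate across the board
    grid = (1 - u) * (1 - v) * tl + u * (1 - v) * tr + u * v * br + (1 - u) * v * bl
    return grid.reshape(-1, 3)
```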

External camera intrinsics:

Users need to input the intrinsic parameters of the external camera used for the calibration.
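
The intrinsics are typically the camera matrix (focal lengths and principal point) plus lens distortion coefficients. The values below are placeholders purely for illustration.

```python
import numpy as np

# Placeholder intrinsics for the external camera (illustrative values only).
fx, fy = 1450.0, 1452.0          # focal lengths in pixels
cx, cy = 960.0, 540.0            # principal point in pixels
camera_matrix = np.array([[fx, 0.0, cx],
                          [0.0, fy, cy],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.array([-0.12, 0.05, 0.0, 0.0, 0.0])  # k1, k2, p1, p2, k3
```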

Upload images for the left-view:

Users need to upload the images taken from the external camera for the left-view of the vehicle.

A minimum of two images is required for the calibration.

Detect target corners:

Checkerboard corners, along with the ArUco markers attached to the vehicle wheels, are detected in this step.
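
As a rough illustration of what this step does, the sketch below detects checkerboard corners and ArUco markers in a single image with OpenCV. It assumes opencv-contrib-python is available; the pattern size and marker dictionary are placeholders, and the exact ArUco API differs between OpenCV versions.

```python
import cv2  # assumes opencv-contrib-python; not part of the Deepen tool itself

def detect_targets(image_path: str, pattern_size=(7, 5)):
    """Detect checkerboard corners and ArUco wheel markers in one image.
    pattern_size is (horizontal, vertical) inner-corner counts (placeholders)."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)

    # Checkerboard inner corners
    found, cb_corners = cv2.findChessboardCorners(gray, pattern_size)

    # ArUco markers on the wheels; the dictionary here is a placeholder
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    marker_corners, marker_ids, _ = cv2.aruco.detectMarkers(gray, dictionary)

    return found, cb_corners, marker_corners, marker_ids
```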

Upload images for the right-view:

Users need to upload the images taken from the external camera for the right-view of the vehicle.

A minimum of two images is required for the calibration.

Detect target corners:

Checkerboard corners, along with the ArUco markers attached to the vehicle wheels, are detected in this step.

Run calibrate:

Click on Run calibrate to perform the calibration. After calibration, the extrinsic parameters are displayed on the right side of the tool.
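
The extrinsics describe the rigid transform between the Lidar frame and the vehicle frame. As a small illustration of how such parameters are typically used (not a feature of the tool itself), the sketch below applies a rotation R and translation t to Lidar points.

```python
import numpy as np

def lidar_to_vehicle(points: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Transform an (N, 3) array of Lidar points into the vehicle frame
    using the extrinsic rotation R (3x3) and translation t (3,):
    p_vehicle = R @ p_lidar + t."""
    return points @ R.T + t
```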

Visualization:

The visualization shows the fused point cloud along with the vehicle. Users can verify the results visually using this option.