Vehicle Lidar Calibration API


Introduction

The API requires the client to upload the PCD files and the configuration for the vehicle-lidar setup as a zip file (.zip extension) in the format defined below. The contents of the zip file are called a dataset.

  1. The client makes an Upload and calibrate API call, which uploads the files and runs the calibration algorithm on the lidar data with the given configuration.

  2. The calibration process has completed without errors if the Upload and calibrate response contains dataset_id, extrinsic_parameters, and error_stats.

  3. The client can call the Get Extrinsic Parameters API using the dataset_id obtained from the Upload and calibrate call. This API responds with dataset_id, extrinsic_parameters, and error_stats.

Folder Structure

We require lidar frames for a given calibration.

  1. Place the point cloud data captured from the LiDAR in a folder.

  2. config.json contains the configuration details of the calibration (intrinsic parameters, calibration name, etc.).

Note: Folder structure is optional. Users can place all files in the main directory and zip it.

Note

  1. The folder and lidar file names shown here are for demonstration purposes. Users should avoid spaces in folder and lidar file names.

  2. The name of the JSON file must be config.json (case sensitive).
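
Following the folder structure and notes above, the dataset zip can be assembled with a few lines of Python. This is a minimal sketch, assuming a local folder named vehicle_lidar_dataset that contains config.json and the PCD files; the folder and file names are placeholders, not names required by the API.

# Package config.json and the lidar PCD files into a dataset zip.
# Paths are illustrative; they must match the paths referenced under
# "data" -> "files" in config.json.
import zipfile
from pathlib import Path

dataset_dir = Path("vehicle_lidar_dataset")   # contains config.json and lidar/*.pcd
zip_path = Path("vehicle_lidar_dataset.zip")

with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
    for file in dataset_dir.rglob("*"):
        if file.is_file():
            # Store paths relative to the dataset root,
            # e.g. "config.json", "lidar/lidardata.pcd".
            zf.write(file, file.relative_to(dataset_dir))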

config.json for multitarget

{
    "calibration_name": "Lidar vehicle testing ",
    "calibration_type": "lidar_vehicle_calibration",
    "calibration_group_id": "xxxxxxxxxxxxxxxxxx",
    "is_lidar_tilted": false,
    "lidar_ground_height": 1.24,
    "vehicle_configuration":
    {
        "vehicle_shape": "rectangle",
        "wheelbase": 1.2,
        "track": 1.2,
        "front_wheel_overhang": 0,
        "rear_wheel_overhang": 0
    },
    "is_targetless_3d_lidar": false,
    "is_2d_lidar": false,
    "auto_detect_lidar_board": true,
    "multi_target": true,
    "targets":
    {
        "left":
        {
            "length": 1.2,
            "width": 1.8
        },
        "right":
        {
            "length": 1.2,
            "width": 1.8
        },
        "front":
        {
            "length": 1.2,
            "width": 1.8
        }
    },
    "use_bounding_box_on_board_detection_failure": false,
    "bounding_box":
    {
        "left":
        {
            "xmin": 1,
            "xmax": 1,
            "ymin": 1,
            "ymax": 1,
            "zmin": 1,
            "zmax": 1
        },
        "right":
        {
            "xmin": 1,
            "xmax": 1,
            "ymin": 1,
            "ymax": 1,
            "zmin": 1,
            "zmax": 1
        },
        "front":
        {
            "xmin": 1,
            "xmax": 1,
            "ymin": 1,
            "ymax": 1,
            "zmin": 1,
            "zmax": 1
        }
    },
    "all_lidar_data":
    [
        {
            "lidar_name": "Main Lidar",
            "laser_channels": 16,
            "lidar_type": "directional",
            "lidar_fov_direction": "front",
            "board_configuration_data":
            {
                "front_board_distance": 1.8,
                "left_board_distance": 0.6,
                "right_board_distance": 0.6
            }
        }
    ],
    "data":
    {
        "files":
        {
            "file": "lidar/lidardata.pcd"
        }
    }
}

Sample config.json

config.json for a Single target

{
    "calibration_name": "Lidar vehicle testing ",
    "calibration_type": "lidar_vehicle_calibration",
    "calibration_group_id": "xxxxxxxxxxxxxxxxxx",
    "vehicle_configuration":
    {
        "vehicle_shape": "rectangle",
        "wheelbase": 1.2,
        "track": 1.2,
        "front_wheel_overhang": 0.9,
        "rear_wheel_overhang": 0.86
    },
    "is_targetless_3d_lidar": false,
    "is_2d_lidar": false,
    "auto_detect_lidar_board": true,
    "multi_target": false,
    "is_lidar_tilted": false,
    "lidar_ground_height": 1.24,
    "targets":
    {
        "left":
        {
            "length": 1,
            "width": 0.6
        },
        "right":
        {
            "length": 1,
            "width": 0.6
        },
        "front":
        {
            "length": 1,
            "width": 0.6
        }
    },
    "use_bounding_box_on_board_detection_failure": false,
    "bounding_box":
    {
        "left":
        {
            "xmin": 1,
            "xmax": 1,
            "ymin": 1,
            "ymax": 1,
            "zmin": 1,
            "zmax": 1
        },
        "right":
        {
            "xmin": 1,
            "xmax": 1,
            "ymin": 1,
            "ymax": 1,
            "zmin": 1,
            "zmax": 1
        },
        "front":
        {
            "xmin": 1,
            "xmax": 1,
            "ymin": 1,
            "ymax": 1,
            "zmin": 1,
            "zmax": 1
        }
    },
    "all_lidar_data":
    [
        {
            "lidar_name": "Main Lidar",
            "lidar_config": 16,
            "lidar_type": "directional",
            "lidar_fov_direction": "front",
            "board_configuration_data":
            {
                "front_board_distance": 0.5,
                "left_board_distance": 0.5,
                "right_board_distance": 0.5
            }
        }
    ],
    "data":
    {
        "files":
        {
            "left": "lidar/left_0.5.pcd",
            "right": "lidar/right_0.5.pcd",
            "front": "lidar/front_0.5.pcd"
        }
    }
}

Sample config.json

config.json for targetless calibration

{
    "calibration_name": "Lidar vehicle testing ",
    "calibration_type": "lidar_vehicle_calibration",
    "calibration_group_id": "xxxxxxxxxxxxxxxxxx",
    "is_targetless_3d_lidar": true,
    "is_2d_lidar": false,
    "slam_algorithm_to_use": "ICP",
    "all_lidar_data":
    [
        {
            "lidar_name": "Main Lidar",
            "laser_channels": 64,
            "ground_rectangle_length": 20,
            "ground_rectangle_breadth": 20
        }
    ],
    "data":
    {
        "files":
        [
            "1.pcd",
            "2.pcd",
            "3.pcd",
            "4.pcd",
            "5.pcd",
            "6.pcd",
            "7.pcd",
            "8.pcd",
            "9.pcd",
            "10.pcd",
            "11.pcd",
            "12.pcd",
            "13.pcd",
            "14.pcd",
            "15.pcd",
            "16.pcd",
            "17.pcd",
            "18.pcd",
            "19.pcd",
            "20.pcd",
            "21.pcd",
            "22.pcd"
        ]
    }
}

Sample config.json

config.json key description

calibration_name (string): Name of the calibration.
calibration_type (string): Non-editable field. The value must be lidar_vehicle_calibration.
calibration_group_id (string): Optional key. Provide a valid calibration_group_id to add the dataset to a calibration group.
multi_target (boolean): true if multiple targets are used; false if a single target is used.
lidar_name (string): Name given by the client to the lidar. The client can change it freely.
is_targetless_3d_lidar (boolean): true for targetless calibration; false for target-based calibration.
slam_algorithm_to_use (string): Required when is_targetless_3d_lidar is true. Accepted values are LOAM and ICP.
laser_channels (integer): Number of laser channels of the lidar. Accepted values are 16, 32, 64, and 128.
targets (Object): Dictionary of dictionaries, each entry holding the properties of one target. Accepted keys are left, right, front, and rear.
length (double): Length of the target board in meters.
width (double): Width of the target board in meters.
tilted (boolean): true if the board is tilted; false if it is not.
data (Object): Holds the data related to the lidar files.
files (Object): Object whose keys are "file" in the multi-target case, or "left", "right", "front", and "rear" in the single-target case, each containing the path to the corresponding file.
wheelbase (double): Distance from the middle of the rear wheel to the middle of the front wheel on the same side.
track (double): Distance from the middle of the right wheel to the middle of the left wheel on the same axle (front or rear).
vehicle_configuration (Object): Object holding all the vehicle measurements: vehicle_shape, wheelbase, track, front_wheel_overhang, and rear_wheel_overhang.
vehicle_shape (string): rectangle or trapezoid, based on the shape of the vehicle.
breadth (double): Breadth of the target board in meters.
front_wheel_overhang (double): Overhang from the middle of the front wheels to the front end of the vehicle, in meters.
rear_wheel_overhang (double): Overhang from the middle of the rear wheels to the rear end of the vehicle, in meters.
is_2d_lidar (boolean): true if the calibration uses a 2D lidar; false if it uses a 3D lidar.
auto_detect_lidar_board (boolean): true to auto-detect the board in the point cloud; false otherwise. When using the API this is usually true, unless a bounding box is provided.
use_bounding_box_on_board_detection_failure (boolean): true if the bounding box should be used when board detection fails.
bounding_box (Object of Objects): Object of objects, keyed by the target whose details are provided. Required if a bounding box is to be used for board detection. Each target's box is given by xmin, xmax, ymin, ymax, zmin, and zmax, the minimum and maximum values along each axis of the box within which the target is identified. These values are in the lidar frame.
all_lidar_data (Object): Object with all the lidar data and board configuration.
lidar_type (string): directional or 360, used with 4 boards and 3 boards respectively.
lidar_fov_direction (string): If the lidar is directional, this defines the direction the lidar faces with respect to the vehicle. Supported values are front, rear, left, and right.
front_board_distance (double): Distance from the front of the car to the board.
left_board_distance (double): Distance from the left of the car to the board.
right_board_distance (double): Distance from the right of the car to the board.
rear_board_distance (double): Distance from the rear of the car to the board.
target_distance (double): Approximate distance between the lidar and the centroid of the board, used to tell boards apart when more than one board has a similar size. Needed only if multiple boards have the same size.
is_lidar_tilted (boolean): If the lidar is tilted, the ground is auto-detected; otherwise, if lidar_ground_height is given, the ground is detected at that distance from the lidar.
lidar_ground_height (number): Distance from the ground to the lidar when the lidar is parallel to the ground.
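
Since config.json is plain JSON, it can also be generated programmatically using the keys described above. The sketch below writes a minimal single-target configuration; all names, measurements (in meters), and file paths are placeholder values taken from the single-target sample and should be replaced with your own.

# Generate a minimal single-target config.json for vehicle-lidar calibration.
# All values below are illustrative placeholders.
import json

config = {
    "calibration_name": "Lidar vehicle testing",
    "calibration_type": "lidar_vehicle_calibration",  # fixed, non-editable value
    "is_targetless_3d_lidar": False,
    "is_2d_lidar": False,
    "auto_detect_lidar_board": True,
    "multi_target": False,
    "is_lidar_tilted": False,
    "lidar_ground_height": 1.24,
    "vehicle_configuration": {
        "vehicle_shape": "rectangle",
        "wheelbase": 1.2,
        "track": 1.2,
        "front_wheel_overhang": 0.9,
        "rear_wheel_overhang": 0.86,
    },
    "targets": {
        "left": {"length": 1, "width": 0.6},
        "right": {"length": 1, "width": 0.6},
        "front": {"length": 1, "width": 0.6},
    },
    "all_lidar_data": [
        {
            "lidar_name": "Main Lidar",
            "laser_channels": 16,
            "lidar_type": "directional",
            "lidar_fov_direction": "front",
            "board_configuration_data": {
                "front_board_distance": 0.5,
                "left_board_distance": 0.5,
                "right_board_distance": 0.5,
            },
        }
    ],
    "data": {
        "files": {
            "left": "lidar/left_0.5.pcd",
            "right": "lidar/right_0.5.pcd",
            "front": "lidar/front_0.5.pcd",
        }
    },
}

with open("config.json", "w") as f:
    json.dump(config, f, indent=4)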

Quickstart

Upload file and calibrate

This POST API call sends a zip file to the server and runs the calibration algorithm. In the response, it returns dataset_id, extrinsic_parameters, and error_stats to the user (error_stats is not available for targetless calibration).

https://tools.calibrate.deepen.ai/api/v2/external/clients/{clientId}/calibration_dataset

Request

Path parameters

clientId (string): ClientId obtained from Deepen AI.

Body

file (.zip file): Zip file containing config.json and the PCD files in the format described above.
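
A minimal sketch of this call using Python's requests library, assuming a clientId and access token obtained from Deepen AI (see the note at the end of this page) and the dataset zip prepared earlier; all values are placeholders.

# Upload the dataset zip and run calibration.
import requests

CLIENT_ID = "your-client-id"        # placeholder, provided by Deepen AI
ACCESS_TOKEN = "your-access-token"  # placeholder, see "Access token for APIs"

url = f"https://tools.calibrate.deepen.ai/api/v2/external/clients/{CLIENT_ID}/calibration_dataset"
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

with open("vehicle_lidar_dataset.zip", "rb") as f:
    response = requests.post(url, headers=headers, files={"file": f})

response.raise_for_status()
result = response.json()
print(result["dataset_id"])
print(result["extrinsic_parameters"])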

Response

{
    "dataset_id": "XXXXXXXXXXXXXXXXX",
    "calibration_algorithm_version": "target_based:v1.01",
    "extrinsic_parameters": {
        "roll": 0.11618719284657622,
        "pitch": 16.0300957665656,
        "yaw": 0.39630825451294216,
        "px": 1.6981482368157552,
        "py": -0.16870946105733092,
        "pz": 1.411771066176465
    },
    "error_stats": {
        "distance_error": {
            "average": 0.016561422963971915,
            "front_plane": 0.005801826261770239,
            "rear_plane": null,
            "right_plane": 0.007627213567991918,
            "left_plane": 0.03803306081172632,
            "ground_plane": 0.014783591214399178
        },
        "angle_error": {
            "average": 6.645093770858262,
            "front_plane": 0.3282866337907041,
            "rear_plane": null,
            "right_plane": 0.34793152531147714,
            "left_plane": 9.873651140968278,
            "ground_plane": 16.03050578336259
        }
    }
}

dataset_id: A unique value that identifies the dataset. dataset_id can be used to retrieve the extrinsic parameters.

calibration_algorithm_version: The version of the algorithm used to calculate the extrinsic parameters. This value can be used to map extrinsic parameters to a specific algorithm version.

extrinsic_parameters: roll, pitch, and yaw are given in degrees; px, py, and pz are given in meters.

error_stats: distance_error is the mean of the distances from each plane's LiDAR points to the respective plane.
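
If the extrinsic parameters need to be applied as a rigid transform, they can be assembled into a 4x4 matrix. The sketch below is illustrative only: the Euler-angle order and convention (intrinsic ZYX, i.e. yaw-pitch-roll) are an assumption and are not specified on this page, so confirm the convention with Deepen AI before relying on the result.

# Assemble a 4x4 homogeneous transform from the returned extrinsic parameters.
# The rotation convention used here (intrinsic ZYX) is an assumption.
import numpy as np
from scipy.spatial.transform import Rotation

extrinsics = {
    "roll": 0.116, "pitch": 16.030, "yaw": 0.396,  # degrees (sample values)
    "px": 1.698, "py": -0.169, "pz": 1.412,        # meters (sample values)
}

R = Rotation.from_euler(
    "ZYX", [extrinsics["yaw"], extrinsics["pitch"], extrinsics["roll"]], degrees=True
).as_matrix()

T = np.eye(4)
T[:3, :3] = R
T[:3, 3] = [extrinsics["px"], extrinsics["py"], extrinsics["pz"]]
print(T)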

Get Extrinsic Parameters

This GET API call returns dataset_id, extrinsic_parameters, and error_stats.

https://tools.calibrate.deepen.ai/api/v2/external/datasets/{datasetId}/extrinsic_parameters

Request

Path parameters

datasetId (string): datasetId obtained from the response of the Upload file and calibrate API.
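
A minimal sketch of this call with Python's requests library, assuming the dataset_id from the upload response and the same Bearer access token; values are placeholders.

# Fetch the extrinsic parameters for a previously calibrated dataset.
import requests

DATASET_ID = "your-dataset-id"      # placeholder, from the Upload file and calibrate response
ACCESS_TOKEN = "your-access-token"  # placeholder, see "Access token for APIs"

url = f"https://tools.calibrate.deepen.ai/api/v2/external/datasets/{DATASET_ID}/extrinsic_parameters"
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

response = requests.get(url, headers=headers)
response.raise_for_status()
print(response.json()["extrinsic_parameters"])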

Response

{
    "dataset_id": "XXXXXXXXXXXXXXXXX",
    "calibration_algorithm_version": "target_based:v1.01",
    "extrinsic_parameters": {
        "roll": 0.11618719284657622,
        "pitch": 16.0300957665656,
        "yaw": 0.39630825451294216,
        "px": 1.6981482368157552,
        "py": -0.16870946105733092,
        "pz": 1.411771066176465
    },
    "error_stats": {
        "distance_error": {
            "average": 0.016561422963971915,
            "front_plane": 0.005801826261770239,
            "rear_plane": null,
            "right_plane": 0.007627213567991918,
            "left_plane": 0.03803306081172632,
            "ground_plane": 0.014783591214399178
        },
        "angle_error": {
            "average": 6.645093770858262,
            "front_plane": 0.3282866337907041,
            "rear_plane": null,
            "right_plane": 0.34793152531147714,
            "left_plane": 9.873651140968278,
            "ground_plane": 16.03050578336259
        }
    }
}

dataset_id: A unique value that identifies the dataset. dataset_id can be used to retrieve the extrinsic parameters.

calibration_algorithm_version: The version of the algorithm used to calculate the extrinsic parameters. This value can be used to map extrinsic parameters to a specific algorithm version.

extrinsic_parameters: roll, pitch, and yaw are given in degrees; px, py, and pz are given in meters.

error_stats: distance_error is the mean of the distances from each plane's LiDAR points to the respective plane.

Note

The setup should avoid false detections. For example, other planar surfaces of a similar shape may be identified as a board, which can produce incorrect solutions. You can always check the detected boards in the web application.

Before invoking the APIs, the client must obtain the clientId and auth token from Deepen AI. If you are a calibration admin, you can create different Access Tokens in the UI and use those instead. The clientId is part of the path parameters in most API calls, and the auth token must be prefixed with "Bearer " and passed in the Authorization header of all API requests. How to get Access Tokens is described at the following link:

Access token for APIs