refactor(lidar_centerpoint): rework parameters #7204

Closed
24 changes: 6 additions & 18 deletions perception/lidar_centerpoint/README.md
@@ -28,24 +28,12 @@ We trained the models using <https://github.com/open-mmlab/mmdetection3d>.

## Parameters

### Core Parameters

| Name                                              | Type         | Default Value             | Description                                                                |
| ------------------------------------------------- | ------------ | ------------------------- | -------------------------------------------------------------------------- |
| `encoder_onnx_path`                               | string       | `""`                      | Path to the VoxelFeatureEncoder ONNX file                                   |
| `encoder_engine_path`                             | string       | `""`                      | Path to the VoxelFeatureEncoder TensorRT engine file                        |
| `head_onnx_path`                                  | string       | `""`                      | Path to the DetectionHead ONNX file                                         |
| `head_engine_path`                                | string       | `""`                      | Path to the DetectionHead TensorRT engine file                              |
| `build_only`                                      | bool         | `false`                   | Shut down the node after the TensorRT engine file is built                  |
| `trt_precision`                                   | string       | `fp16`                    | TensorRT inference precision: `fp32` or `fp16`                              |
| `post_process_params.score_threshold`             | double       | `0.4`                     | Detected objects with a score below this threshold are ignored              |
| `post_process_params.yaw_norm_thresholds`         | list[double] | [0.3, 0.3, 0.3, 0.3, 0.0] | Per-class threshold values for the norm of the predicted yaw vector [rad]   |
| `post_process_params.iou_nms_target_class_names`  | list[string] | -                         | Target classes for IoU-based non-maximum suppression (NMS)                  |
| `post_process_params.iou_nms_search_distance_2d`  | double       | -                         | If two objects are farther apart than this value, NMS is not applied        |
| `post_process_params.iou_nms_threshold`           | double       | -                         | IoU threshold for IoU-based NMS                                             |
| `post_process_params.has_twist`                   | bool         | `false`                   | Indicates whether the model outputs a twist value                           |
| `densification_params.world_frame_id`             | string       | `map`                     | The world frame ID used to fuse multi-frame pointclouds                     |
| `densification_params.num_past_frames`            | int          | `1`                       | The number of past frames to fuse with the current frame                    |
{{ json_to_markdown("perception/lidar_centerpoint/schema/centerpoint.schema.json") }}
{{ json_to_markdown("perception/lidar_centerpoint/schema/centerpoint_ml_package.schema.json") }}
{{ json_to_markdown("perception/lidar_centerpoint/schema/centerpoint_tiny.schema.json") }}
{{ json_to_markdown("perception/lidar_centerpoint/schema/centerpoint_tiny_ml_package.schema.json") }}
{{ json_to_markdown("perception/lidar_centerpoint/schema/centerpoint_sigma_ml_package.schema.json") }}
{{ json_to_markdown("perception/lidar_centerpoint/schema/detection_class_remapper.schema.json") }}

### The `build_only` option

@@ -12,6 +12,26 @@
"default": "fp16",
"enum": ["fp32", "fp16"]
},
"encoder_onnx_path": {
"type": "string",
"description": "A path to the ONNX file of the encoder network.",
"default": "~/autoware_data/lidar_centerpoint/pts_voxel_encoder_centerpoint_tiny.onnx"
},
"encoder_engine_path": {
"type": "string",
"description": "A path to the TensorRT engine file of the encoder network.",
"default": "~/autoware_data/lidar_centerpoint/pts_voxel_encoder_centerpoint_tiny.engine"
},
"head_onnx_path": {
"type": "string",
"description": "A path to the ONNX file of the head network.",
"default": "~/autoware_data/lidar_centerpoint/pts_backbone_neck_head_centerpoint_tiny.onnx"
},
"head_engine_path": {
"type": "string",
"description": "A path to the TensorRT engine file of the head network.",
"default": "~/autoware_data/lidar_centerpoint/pts_backbone_neck_head_centerpoint_tiny.engine"
},
"post_process_params": {
"type": "object",
"properties": {
@@ -31,7 +51,7 @@
},
"circle_nms_dist_threshold": {
"type": "number",
"description": "",
"description": "It specifies the distance threshold for performing Circle NMS. Detection boxes within this distance that overlap or are close to each other will be suppressed, keeping only the one with the highest score.",
"default": 0.5,
"minimum": 0.0,
"maximum": 1.0
@@ -54,11 +74,6 @@
"default": 0.1,
"minimum": 0.0,
"maximum": 1.0
},
"has_twist": {
"type": "boolean",
"description": "Indicates whether the model outputs twist value.",
"default": false
}
}
},
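
For reference, the `circle_nms_dist_threshold` description above corresponds to a greedy distance-based suppression. A minimal sketch of that idea, assuming detections are simple `(x, y, score)` tuples (the package's actual implementation is C++/CUDA and is not reproduced here):

```python
# Hedged sketch of greedy Circle NMS: keep the highest-scoring detection and
# suppress any later detection whose center lies within dist_threshold of an
# already-kept one. This mirrors the schema description, not the real kernel.
from math import hypot


def circle_nms(detections, dist_threshold=0.5):
    """detections: list of (x, y, score) tuples (hypothetical format)."""
    kept = []
    for x, y, score in sorted(detections, key=lambda d: d[2], reverse=True):
        if all(hypot(x - kx, y - ky) > dist_threshold for kx, ky, _ in kept):
            kept.append((x, y, score))
    return kept


print(circle_nms([(0.0, 0.0, 0.9), (0.3, 0.0, 0.8), (2.0, 0.0, 0.7)]))
# -> [(0.0, 0.0, 0.9), (2.0, 0.0, 0.7)]  (the 0.3 m neighbor is suppressed)
```
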
@@ -46,6 +46,16 @@
"type": "integer",
"description": "A size of encoder input feature channels.",
"default": 9
},
"has_variance": {
"type": "boolean",
"description": "determines whether the covariance matrices for the object's position and velocity information are filled in.",
"default": false
},
"has_twist": {
"type": "boolean",
"description": "Indicates whether the model outputs twist value.",
"default": false
}
}
}
@@ -58,7 +68,7 @@
"type": "object",
"properties": {
"ros__parameters": {
"$ref": "#/definitions/centerpoint_ml_package"
"$ref": "#/definitions/centerpoint_tiny_ml_package"
}
},
"required": ["ros__parameters"]
@@ -0,0 +1,78 @@
{
"$schema": "http://json-schema.org/draft-07/schema#",
"title": "Parameters for Centerpoint Sigma ML model",
"type": "object",
"definitions": {
"centerpoint_sigma_ml_package": {
"type": "object",
"properties": {
"model_params": {
"type": "object",
"description": "Parameters for model configuration.",
"properties": {
"class_names": {
"type": "array",
"description": "An array of class names will be predicted.",
"default": ["CAR", "TRUCK", "BUS", "BICYCLE", "PEDESTRIAN"],
"uniqueItems": true
},
"point_feature_size": {
"type": "integer",
"description": "A number of channels of point feature layer.",
"default": 4
},
"max_voxel_size": {
"type": "integer",
"description": "A maximum size of voxel grid.",
"default": 40000
},
"point_cloud_range": {
"type": "array",
"description": "An array of distance ranges of each class, this must have same length with `class_names`.",
"default": [-76.8, -76.8, -4.0, 76.8, 76.8, 6.0]
},
"voxel_size": {
"type": "array",
"description": "An array of voxel grid sizes for PointPainting, this must have same length with `paint_class_names`.",
"default": [0.32, 0.32, 10.0]
},
"down_sample_factor": {
"type": "integer",
"description": "A scale factor of downsampling points",
"default": 1,
"minimum": 1
},
"encoder_in_feature_size": {
"type": "integer",
"description": "A size of encoder input feature channels.",
"default": 9
},
"has_variance": {
"type": "boolean",
"description": "determines whether the covariance matrices for the object's position and velocity information are filled in.",
"default": false
},
"has_twist": {
"type": "boolean",
"description": "Indicates whether the model outputs twist value.",
"default": false
}
}
}
},
"required": ["model_params"]
}
},
"properties": {
"/**": {
"type": "object",
"properties": {
"ros__parameters": {
"$ref": "#/definitions/centerpoint_tiny_ml_package"
}
},
"required": ["ros__parameters"]
}
},
"required": ["/**"]
}
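
To make the defaults above concrete: with `point_cloud_range` interpreted as `[x_min, y_min, z_min, x_max, y_max, z_max]` and `voxel_size` as `[x, y, z]` (a common CenterPoint convention, assumed here), the implied voxel grid is 480 x 480 x 1:

```python
# Hedged sketch: derive the voxel grid dimensions implied by the default
# point_cloud_range and voxel_size of the sigma ML package above.
point_cloud_range = [-76.8, -76.8, -4.0, 76.8, 76.8, 6.0]  # [x_min, y_min, z_min, x_max, y_max, z_max]
voxel_size = [0.32, 0.32, 10.0]  # [x, y, z] size of one voxel

grid = [
    round((point_cloud_range[i + 3] - point_cloud_range[i]) / voxel_size[i])
    for i in range(3)
]
print(grid)  # [480, 480, 1] -> a 480 x 480 BEV grid with a single z slice
```
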
119 changes: 119 additions & 0 deletions perception/lidar_centerpoint/schema/centerpoint_tiny.schema.json
@@ -0,0 +1,119 @@
{
"$schema": "http://json-schema.org/draft-07/schema#",
"title": "Parameters for CenterPoint Tiny Model",
"type": "object",
"definitions": {
"centerpoint_tiny": {
"type": "object",
"properties": {
"trt_precision": {
"type": "string",
"description": "TensorRT inference precision.",
"default": "fp16",
"enum": ["fp32", "fp16"]
},
"encoder_onnx_path": {
"type": "string",
"description": "A path to the ONNX file of the encoder network.",
"default": "~/autoware_data/lidar_centerpoint/pts_voxel_encoder_centerpoint_tiny.onnx"
},
"encoder_engine_path": {
"type": "string",
"description": "A path to the TensorRT engine file of the encoder network.",
"default": "~/autoware_data/lidar_centerpoint/pts_voxel_encoder_centerpoint_tiny.engine"
},
"head_onnx_path": {
"type": "string",
"description": "A path to the ONNX file of the head network.",
"default": "~/autoware_data/lidar_centerpoint/pts_backbone_neck_head_centerpoint_tiny.onnx"
},
"head_engine_path": {
"type": "string",
"description": "A path to the TensorRT engine file of the head network.",
"default": "~/autoware_data/lidar_centerpoint/pts_backbone_neck_head_centerpoint_tiny.engine"
},
"post_process_params": {
"type": "object",
"properties": {
"score_threshold": {
"type": "number",
"description": "A threshold value of existence probability score, all of objects with score less than this threshold are ignored.",
"default": 0.35,
"minimum": 0.0,
"maximum": 1.0
},
"yaw_norm_thresholds": {
"type": "array",
"description": "An array of distance threshold values of norm of yaw [rad].",
"default": [0.3, 0.3, 0.3, 0.3, 0.0],
"items": {
"type": "number",
"minimum": 0.0,
"maximum": 1.0
}
},
"circle_nms_dist_threshold": {
"type": "number",
"description": "It specifies the distance threshold for performing Circle NMS. Detection boxes within this distance that overlap or are close to each other will be suppressed, keeping only the one with the highest score.",
"default": 0.5,
"minimum": 0.0,
"maximum": 1.0
},
"iou_nms_target_class_names": {
"type": "array",
"description": "An array of class names to be target in NMS.",
"default": ["CAR"],
"items": {
"type": "string"
},
"uniqueItems": true
},
"iou_nms_search_distance_2d": {
"type": "number",
"description": "A maximum distance value to search the nearest objects.",
"default": 10.0,
"minimum": 0.0
},
"iou_nms_threshold": {
"type": "number",
"description": "A threshold value of NMS using IoU score.",
"default": 0.1,
"minimum": 0.0,
"maximum": 1.0
}
}
},
"densification_params": {
"type": "object",
"description": "Parameters for pointcloud densification.",
"properties": {
"world_frame_id": {
"type": "string",
"description": "A name of frame id where world coordinates system is defined with respect to.",
"default": "map"
},
"num_past_frames": {
"type": "integer",
"description": "A number of past frames to be considered as same input frame.",
"default": 1,
"minimum": 0
}
}
}
},
"required": ["post_process_params", "densification_params"]
}
},
"properties": {
"/**": {
"type": "object",
"properties": {
"ros__parameters": {
"$ref": "#/definitions/centerpoint_tiny"
}
},
"required": ["ros__parameters"]
}
},
"required": ["/**"]
}
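
The `densification_params` block above configures multi-frame fusion: recent clouds are kept, expressed in the configured world frame, and re-expressed in the current sensor frame before inference. A simplified, hedged sketch of that idea (the 4x4 transforms stand in for TF lookups; the class is illustrative and not part of the package):

```python
# Hedged sketch of pointcloud densification: keep the last num_past_frames
# clouds, store each in the world frame when it arrives, and re-express the
# whole buffer in the current sensor frame before inference.
from collections import deque

import numpy as np


class PointCloudDensifier:
    def __init__(self, num_past_frames=1):
        self.buffer = deque(maxlen=num_past_frames + 1)  # past frames + current

    def add_frame(self, points_xyz, sensor_to_world):
        """points_xyz: (N, 3) array; sensor_to_world: 4x4 homogeneous transform."""
        homo = np.hstack([points_xyz, np.ones((points_xyz.shape[0], 1))])
        self.buffer.append(homo @ sensor_to_world.T)  # store in world frame

    def densified(self, world_to_current_sensor):
        """Return all buffered points expressed in the current sensor frame."""
        merged = np.vstack(list(self.buffer))
        return (merged @ world_to_current_sensor.T)[:, :3]
```
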
@@ -46,6 +46,16 @@
"type": "integer",
"description": "A size of encoder input feature channels.",
"default": 9
},
"has_variance": {
"type": "boolean",
"description": "determines whether the covariance matrices for the object's position and velocity information are filled in.",
"default": false
},
"has_twist": {
"type": "boolean",
"description": "Indicates whether the model outputs twist value.",
"default": false
}
}
}
@@ -0,0 +1,46 @@
{
"$schema": "http://json-schema.org/draft-07/schema#",
"title": "Parameters for Detection Class Remapper Node",
"type": "object",
"definitions": {
"detection_class_remapper": {
"type": "object",
"properties": {
"allow_remapping_by_area_matrix": {
"type": "array",
"description": "A matrix that defines how to remap original classes to new classes based on area.",
"items": {
"type": "integer"
}
},
"min_area_matrix": {
"type": "array",
"description": "A matrix that defines the minimum area thresholds for each class.",
"items": {
"type": "number"
}
},
"max_area_matrix": {
"type": "array",
"description": "A matrix that defines the maximum area thresholds for each class.",
"items": {
"type": "number"
}
}
},
"required": ["allow_remapping_by_area_matrix", "min_area_matrix", "max_area_matrix"]
}
},
"properties": {
"/**": {
"type": "object",
"properties": {
"ros__parameters": {
"$ref": "#/definitions/detection_class_remapper"
}
},
"required": ["ros__parameters"]
}
},
"required": ["/**"]
}
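
Reading the three matrices together, a plausible interpretation (inferred from the descriptions above, not confirmed against the node's source) is that an object of class `i` is relabeled to class `j` when `allow_remapping_by_area_matrix[i][j]` is non-zero and its footprint area falls within `[min_area_matrix[i][j], max_area_matrix[i][j]]`:

```python
# Hedged sketch of area-based class remapping, assuming NxN matrices indexed as
# [original_class][new_class] and a per-object bounding-box footprint area.
import numpy as np


def remap_class(original_class, area, allow, min_area, max_area):
    """Return the remapped class index, or the original one if no rule matches."""
    for new_class in range(allow.shape[1]):
        if (
            allow[original_class, new_class]
            and min_area[original_class, new_class] <= area <= max_area[original_class, new_class]
        ):
            return new_class
    return original_class


# Toy 2-class example: remap class 0 to class 1 when the footprint exceeds 10 m^2.
allow = np.array([[0, 1], [0, 0]])
min_area = np.array([[0.0, 10.0], [0.0, 0.0]])
max_area = np.array([[0.0, 1e6], [0.0, 0.0]])
print(remap_class(0, 25.0, allow, min_area, max_area))  # -> 1
print(remap_class(0, 4.0, allow, min_area, max_area))   # -> 0
```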