# Robo3D Benchmark

## Outline

- Metrics
  - LiDAR Semantic Segmentation
  - 3D Object Detection
- Benchmark
  - SemanticKITTI-C
  - nuScenes-C (Seg)

## Metrics

### LiDAR Semantic Segmentation

The mean Intersection-over-Union (mIoU) is consistently used as the main indicator for evaluating model performance in our LiDAR semantic segmentation benchmark. The following two metrics are adopted to compare the robustness of models (see the sketch after this list):

- **mCE** (the lower the better): the average corruption error (in percentage) of a candidate model relative to the baseline model, calculated over all corruption types and three severity levels.
- **mRR** (the higher the better): the average resilience rate (in percentage) of a candidate model relative to its "clean" performance, calculated over all corruption types and three severity levels.
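For concreteness, here is a rough sketch of how these two metrics can be computed, following the common ImageNet-C-style convention (error measured against a baseline model, resilience measured against clean performance). The helper names and the assumption that each per-corruption score arrives pre-averaged over the three severity levels are ours, not the official toolkit's.

```python
# A minimal sketch (not the official Robo3D evaluation code) of the CE and
# RR metrics described above. We assume each score is an mIoU in [0, 1]
# that has already been averaged over the three severity levels of one
# corruption type; all function names here are our own.

def corruption_error(model_scores, baseline_scores):
    """Per-corruption CE (%): error of the model relative to the baseline."""
    return [100.0 * (1.0 - s) / (1.0 - b)
            for s, b in zip(model_scores, baseline_scores)]

def resilience_rate(model_scores, clean_score):
    """Per-corruption RR (%): score of the model relative to its clean score."""
    return [100.0 * s / clean_score for s in model_scores]

def mean(values):
    return sum(values) / len(values)

# Worked example on three of the nine corruption types (Fog, Wet Ground,
# Snow), using IoU scores from the table below; a real evaluation averages
# over all corruption types, so these numbers differ from the table's mCE/mRR.
baseline = [0.5587, 0.5399, 0.5328]    # MinkUNet18, the mCE reference model
candidate = [0.3489, 0.4844, 0.4555]   # SalsaNext
clean = 0.5580                         # SalsaNext, clean mIoU

mCE = mean(corruption_error(candidate, baseline))  # lower is better
mRR = mean(resilience_rate(candidate, clean))      # higher is better
print(f"mCE = {mCE:.2f}%  mRR = {mRR:.2f}%")
```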

### 3D Object Detection

The mean average precision (mAP) and the nuScenes detection score (NDS) are consistently used as the main indicators for evaluating model performance in our 3D object detection benchmark. The following two metrics are adopted to compare the robustness of models:

- **mCE** (the lower the better): the average corruption error (in percentage) of a candidate model relative to the baseline model, calculated over all corruption types and three severity levels.
- **mRR** (the higher the better): the average resilience rate (in percentage) of a candidate model relative to its "clean" performance, calculated over all corruption types and three severity levels.
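The helpers from the segmentation sketch above apply here unchanged; the snippet below assumes (our assumption, not stated in this page) that NDS plays the role mIoU plays in segmentation, and the NDS values are hypothetical.

```python
# Reuses corruption_error, resilience_rate, and mean from the sketch above.
# Hypothetical NDS values, one per corruption type, pre-averaged over severities.
nds_clean = 0.55
nds_baseline = [0.50, 0.47, 0.45]
nds_candidate = [0.48, 0.46, 0.40]

print(mean(corruption_error(nds_candidate, nds_baseline)))  # mCE: lower is better
print(mean(resilience_rate(nds_candidate, nds_clean)))      # mRR: higher is better
```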

## Benchmark

### 🚗 SemanticKITTI-C

#### Benchmark: IoU (%)

| Model | mCE (%) | mRR (%) | Clean | Fog | Wet Ground | Snow | Motion Blur | Beam Missing | Cross-Talk | Incomplete Echo | Cross-Sensor |
|---|---|---|---|---|---|---|---|---|---|---|---|
| SqueezeSeg | 164.87 | 66.81 | 31.61 | 18.85 | 27.30 | 22.70 | 17.93 | 25.01 | 21.65 | 27.66 | 7.85 |
| SqueezeSegV2 | 152.45 | 65.29 | 41.28 | 25.64 | 35.02 | 27.75 | 22.75 | 32.19 | 26.68 | 33.80 | 11.78 |
| RangeNet21 | 136.33 | 73.42 | 47.15 | 31.04 | 40.88 | 37.43 | 31.16 | 38.16 | 37.98 | 41.54 | 18.76 |
| RangeNet53 | 130.66 | 73.59 | 50.29 | 36.33 | 43.07 | 40.02 | 30.10 | 40.80 | 46.08 | 42.67 | 16.98 |
| SalsaNext | 116.14 | 80.51 | 55.80 | 34.89 | 48.44 | 45.55 | 47.93 | 49.63 | 40.21 | 48.03 | 44.72 |
| FIDNet34 | 113.81 | 76.99 | 58.80 | 43.66 | 51.63 | 49.68 | 40.38 | 49.32 | 49.46 | 48.17 | 29.85 |
| CENet34 | 103.41 | 81.29 | 62.55 | 42.70 | 57.34 | 53.64 | 52.71 | 55.78 | 45.37 | 53.40 | 45.84 |
| KPConv | 99.54 | 82.90 | 62.17 | 54.46 | 57.70 | 54.15 | 25.70 | 57.35 | 53.38 | 55.64 | 53.91 |
| PIDSNAS1.25x | 104.13 | 77.94 | 63.25 | 47.90 | 54.48 | 48.86 | 22.97 | 54.93 | 56.70 | 55.81 | 52.72 |
| PIDSNAS2.0x | 101.20 | 78.42 | 64.55 | 51.19 | 55.97 | 51.11 | 22.49 | 56.95 | 57.41 | 55.55 | 54.27 |
| WaffleIron | 109.54 | 72.18 | 66.04 | 45.52 | 58.55 | 49.30 | 33.02 | 59.28 | 22.48 | 58.55 | 54.62 |
| PolarNet | 118.56 | 74.98 | 58.17 | 38.74 | 50.73 | 49.42 | 41.77 | 54.10 | 25.79 | 48.96 | 39.44 |
| MinkUNet18 | 100.00 | 81.90 | 62.76 | 55.87 | 53.99 | 53.28 | 32.92 | 56.32 | 58.34 | 54.43 | 46.05 |
| MinkUNet34 | 100.61 | 80.22 | 63.78 | 53.54 | 54.27 | 50.17 | 33.80 | 57.35 | 58.38 | 54.88 | 46.95 |
| Cylinder3DSPC | 103.25 | 80.08 | 63.42 | 37.10 | 57.45 | 46.94 | 52.45 | 57.64 | 55.98 | 52.51 | 46.22 |
| Cylinder3DTSC | 103.13 | 83.90 | 61.00 | 37.11 | 53.40 | 45.39 | 58.64 | 56.81 | 53.59 | 54.88 | 49.62 |
| SPVCNN18 | 100.30 | 82.15 | 62.47 | 55.32 | 53.98 | 51.42 | 34.53 | 56.67 | 58.10 | 54.60 | 45.95 |
| SPVCNN34 | 99.16 | 82.01 | 63.22 | 56.53 | 53.68 | 52.35 | 34.39 | 56.76 | 59.00 | 54.97 | 47.07 |
| RPVNet | 111.74 | 73.86 | 63.75 | 47.64 | 53.54 | 51.13 | 47.29 | 53.51 | 22.64 | 54.79 | 46.17 |
| CPGNet | 107.34 | 81.05 | 61.50 | 37.79 | 57.39 | 51.26 | 59.05 | 60.29 | 18.50 | 56.72 | 57.79 |
| 2DPASS | 106.14 | 77.50 | 64.61 | 40.46 | 60.68 | 48.53 | 57.80 | 58.78 | 28.46 | 55.84 | 50.01 |
| GFNet | 108.68 | 77.92 | 63.00 | 42.04 | 56.57 | 56.71 | 58.59 | 56.95 | 17.14 | 55.23 | 49.48 |

#### Benchmark: CE (%)

| Model | mCE (%) | Fog | Wet Ground | Snow | Motion Blur | Beam Missing | Cross-Talk | Incomplete Echo | Cross-Sensor |
|---|---|---|---|---|---|---|---|---|---|
| SqueezeSeg | 164.87 | | | | | | | | |
| SqueezeSegV2 | 152.45 | | | | | | | | |
| RangeNet21 | 136.33 | | | | | | | | |
| RangeNet53 | 130.66 | | | | | | | | |
| SalsaNext | 116.14 | | | | | | | | |
| FIDNet34 | 113.81 | | | | | | | | |
| CENet34 | 103.41 | | | | | | | | |
| KPConv | 99.54 | | | | | | | | |
| PIDSNAS1.25x | 104.13 | | | | | | | | |
| PIDSNAS2.0x | 101.20 | | | | | | | | |
| WaffleIron | 109.54 | | | | | | | | |
| PolarNet | 118.56 | | | | | | | | |
| MinkUNet18 | 100.00 | | | | | | | | |
| MinkUNet34 | 100.61 | | | | | | | | |
| Cylinder3DSPC | 103.25 | | | | | | | | |
| Cylinder3DTSC | 103.13 | | | | | | | | |
| SPVCNN18 | 100.30 | | | | | | | | |
| SPVCNN34 | 99.16 | | | | | | | | |
| RPVNet | 111.74 | | | | | | | | |
| CPGNet | 107.34 | | | | | | | | |
| 2DPASS | 106.14 | | | | | | | | |
| GFNet | 108.68 | | | | | | | | |

#### Benchmark: RR (%)

| Model | mRR (%) | Fog | Wet Ground | Snow | Motion Blur | Beam Missing | Cross-Talk | Incomplete Echo | Cross-Sensor |
|---|---|---|---|---|---|---|---|---|---|
| SqueezeSeg | 66.81 | | | | | | | | |
| SqueezeSegV2 | 65.29 | | | | | | | | |
| RangeNet21 | 73.42 | | | | | | | | |
| RangeNet53 | 73.59 | | | | | | | | |
| SalsaNext | 80.51 | | | | | | | | |
| FIDNet34 | 76.99 | | | | | | | | |
| CENet34 | 81.29 | | | | | | | | |
| KPConv | 82.90 | | | | | | | | |
| PIDSNAS1.25x | 77.94 | | | | | | | | |
| PIDSNAS2.0x | 78.42 | | | | | | | | |
| WaffleIron | 72.18 | | | | | | | | |
| PolarNet | 74.98 | | | | | | | | |
| MinkUNet18 | 81.90 | | | | | | | | |
| MinkUNet34 | 80.22 | | | | | | | | |
| Cylinder3DSPC | 80.08 | | | | | | | | |
| Cylinder3DTSC | 83.90 | | | | | | | | |
| SPVCNN18 | 82.15 | | | | | | | | |
| SPVCNN34 | 82.01 | | | | | | | | |
| RPVNet | 73.86 | | | | | | | | |
| CPGNet | 81.05 | | | | | | | | |
| 2DPASS | 77.50 | | | | | | | | |
| GFNet | 77.92 | | | | | | | | |

### 🚙 nuScenes-C (Seg)

To be updated.