# Neighbor-Vote: Improving Monocular 3D Object Detection through Neighbor Distance Voting (ACM MM 2021)
Xiaomeng Chu, Jiajun Deng, Yao Li, Zhenxun Yuan, Yanyong Zhang, Jianmin Ji, Yu Zhang
```bibtex
@inproceedings{chu2021neighbor,
  title={Neighbor-vote: Improving monocular 3d object detection through neighbor distance voting},
  author={Chu, Xiaomeng and Deng, Jiajun and Li, Yao and Yuan, Zhenxun and Zhang, Yanyong and Ji, Jianmin and Zhang, Yu},
  booktitle={Proceedings of the 29th ACM International Conference on Multimedia},
  pages={5239--5247},
  year={2021}
}
```
This repository is an official implementation of Neighbor-Vote, a novel method that incorporates neighbor predictions to improve 3D object detection from severely deformed pseudo-LiDAR point clouds.
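As a rough conceptual illustration only (not the repo's network, which learns its votes end-to-end; all names and shapes below are assumptions of this sketch), neighbor distance voting can be pictured as refining each detection's bird's-eye-view center with confidence-weighted votes cast by neighboring detections:

```python
import numpy as np

def neighbor_vote(centers, offsets, weights):
    """Toy illustration of neighbor distance voting (not the paper's model).

    centers: (N, 2) BEV center each detection predicts for itself.
    offsets: (N, N, 2) offsets[i, j] = voter i's predicted offset to target j
             (offsets[i, i] should be zero).
    weights: (N, N) weights[i, j] = confidence of voter i's vote for target j
             (the diagonal weights each self-prediction).
    Returns (N, 2) refined centers: a confidence-weighted average of each
    detection's own prediction and its neighbors' votes.
    """
    votes = centers[:, None, :] + offsets                # votes[i, j]: i's vote for j
    norm = weights / weights.sum(axis=0, keepdims=True)  # normalize over voters i
    return (norm[..., None] * votes).sum(axis=0)
```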
- Prepare the running environment. You can follow the installation steps in [OpenPCDet](https://github.com/open-mmlab/OpenPCDet).
- Prepare the data. Please download the official KITTI 3D object detection dataset, prepare your depth maps, and put them in `data/kitti/training/dorn`. For ease of use, PatchNet provides the estimated depth maps generated from the pretrained DORN model, and you can directly download the results of the 2D detector FCOS on the KITTI train set from here. Please organize the downloaded files as follows (a sketch of how such depth maps are typically turned into pseudo-LiDAR points is given after this list):

  ```
  Neighbor-Vote
  ├── data
  │   ├── kitti
  │   │   │── ImageSets
  │   │   │── training
  │   │   │   ├── calib & velodyne & label_2 & image_2 & dorn & 2d_score_fcos
  ├── pcdet
  ├── tools
  ```
  Generate the data infos by running the following command:

  ```shell
  python -m pcdet.datasets.kitti.kitti_dataset create_kitti_infos tools/cfgs/dataset_configs/kitti_dataset.yaml
  ```
- Setup.

  ```shell
  python setup.py develop
  ```
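For reference, here is a minimal sketch of how a depth map is commonly back-projected into a pseudo-LiDAR point cloud, assuming a standard pinhole model with the focal lengths and principal point taken from KITTI's `P2` calibration matrix. The function name and interface are illustrative assumptions; the actual preprocessing applied to the `dorn` depth maps may differ:

```python
import numpy as np

def depth_to_pseudo_lidar(depth, fu, fv, cu, cv):
    """Back-project a depth map (H, W) into camera-frame 3D points (H*W, 3).

    fu, fv: focal lengths; cu, cv: principal point (from KITTI's P2 matrix).
    Pinhole model: x = (u - cu) * z / fu, y = (v - cv) * z / fv.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth.reshape(-1)
    x = (u.reshape(-1) - cu) * z / fu
    y = (v.reshape(-1) - cv) * z / fv
    return np.stack([x, y, z], axis=1)  # points in the rectified camera frame
```

In KITTI, such camera-frame points are usually transformed further into the velodyne frame with the `R0_rect` and `Tr_velo_to_cam` calibration matrices before being consumed by a LiDAR-style detector.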
The model weights can be downloaded from here. The configuration file is `pointpillar.yaml` in `tools/cfgs/kitti_models`, and the validation script is in `tools/scripts`:
```shell
cd tools
sh scripts/dist_test.sh ${NUM_GPUS} \
    --cfg_file ${CONFIG_FILE} --batch_size ${BATCH_SIZE} --ckpt ${CKPT}
```
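For example, a concrete invocation could look like the following, where the GPU count, batch size, and checkpoint path are placeholders to adapt to your setup:

```shell
cd tools
sh scripts/dist_test.sh 2 \
    --cfg_file cfgs/kitti_models/pointpillar.yaml --batch_size 4 \
    --ckpt ../checkpoints/neighbor_vote.pth
```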
Thanks to the strong and flexible [OpenPCDet](https://github.com/open-mmlab/OpenPCDet) codebase maintained by Shaoshuai Shi (@sshaoshuai) and Chaoxu Guo (@Gus-Guo). This repository is implemented by Xiaomeng Chu ([email protected]).