This is the official PyTorch implementation of the method described in
URCDC-Depth: Uncertainty Rectified Cross-Distillation with CutFlip for Monocular Depth Estimation
Shuwei Shao, Zhongcai Pei, Weihai Chen, Ran Li, Zhong Liu and Zhengguo Li
We have released the code of CutFlip, which has been incorporated into dataloader.py. Beyond the results reported in the article, we also applied CutFlip to other monocular depth estimation algorithms on the KITTI dataset, such as BTS, TransDepth and AdaBins.
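The core idea of CutFlip is to cut an image/depth pair at a random height and swap the resulting horizontal strips, weakening the strong correlation between vertical image position and depth. A minimal sketch (a hypothetical `cut_flip` helper, not the exact implementation in dataloader.py):

```python
import numpy as np

def cut_flip(image, depth, rng=None, p=0.5):
    """Sketch of CutFlip with two strips: cut the image/depth pair at a
    random height and swap the strips above and below the cut. Each strip
    keeps its own orientation; only the strip order changes.
    `image` and `depth` are H x W x C arrays; applied with probability `p`.
    Hypothetical helper for illustration, not the repository's exact code."""
    rng = rng or np.random.default_rng()
    if rng.random() >= p:
        return image, depth
    h = image.shape[0]
    # pick a cut height away from the image borders
    cut = int(rng.integers(int(0.2 * h), int(0.8 * h)))
    # swap the strips for both image and ground-truth depth
    image = np.concatenate([image[cut:], image[:cut]], axis=0)
    depth = np.concatenate([depth[cut:], depth[:cut]], axis=0)
    return image, depth
```

The same cut is applied to the image and its depth map so that pixel-to-depth correspondence is preserved.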
We have released the complete code.
If you find our work useful in your research, please consider citing our paper:
@article{shao2023urcdc,
title={URCDC-Depth: Uncertainty Rectified Cross-Distillation with CutFlip for Monocular Depth Estimation},
author={Shao, Shuwei and Pei, Zhongcai and Chen, Weihai and Li, Ran and Liu, Zhong and Li, Zhengguo},
journal={IEEE Transactions on Multimedia},
year={2023},
}
conda create -n urcdc python=3.8
conda activate urcdc
conda install pytorch=1.10.0 torchvision cudatoolkit=11.1 -c pytorch
pip install matplotlib tqdm tensorboardX timm mmcv
You can prepare the KITTI and NYUv2 datasets according to here, and then modify the data paths in the config files to point to your dataset locations.
First download the pretrained encoder backbone from here, and then modify the pretrained-weights path in the config files.
Training the NYUv2 model:
python urcdc/train.py configs/arguments_train_nyu.txt
Training the KITTI model:
python urcdc/train.py configs/arguments_train_kittieigen.txt
Evaluate the NYUv2 model:
python urcdc/eval.py configs/arguments_eval_nyu.txt
Evaluate the KITTI model:
python urcdc/eval.py configs/arguments_eval_kittieigen.txt
Model | Abs. Rel. | Sq. Rel. | RMSE | RMSE log | a1 | a2 | a3 |
---|---|---|---|---|---|---|---|
NYUv2 (code: urcd) | 0.088 | - | 0.316 | - | 0.933 | 0.992 | 0.998 |
KITTI_Eigen (code: urcd) | 0.050 | 0.142 | 2.032 | 0.076 | 0.977 | 0.997 | 0.999 |
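For reference, the metrics in the table are the standard monocular depth evaluation measures. A minimal sketch of how they are computed (assuming `pred` and `gt` are flat arrays of valid depths in metres; this is the conventional formulation, not code from this repository):

```python
import numpy as np

def depth_metrics(pred, gt):
    """Standard monocular depth metrics: Abs. Rel., Sq. Rel., RMSE,
    RMSE log, and the threshold accuracies a1/a2/a3 (delta < 1.25^k)."""
    thresh = np.maximum(gt / pred, pred / gt)
    a1 = (thresh < 1.25).mean()
    a2 = (thresh < 1.25 ** 2).mean()
    a3 = (thresh < 1.25 ** 3).mean()
    abs_rel = np.mean(np.abs(gt - pred) / gt)
    sq_rel = np.mean(((gt - pred) ** 2) / gt)
    rmse = np.sqrt(np.mean((gt - pred) ** 2))
    rmse_log = np.sqrt(np.mean((np.log(gt) - np.log(pred)) ** 2))
    return dict(abs_rel=abs_rel, sq_rel=sq_rel, rmse=rmse,
                rmse_log=rmse_log, a1=a1, a2=a2, a3=a3)
```

Lower is better for the error metrics; higher is better for a1, a2 and a3.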
If you have any questions, please feel free to contact swshao@buaa.edu.cn.
Our code is based on the implementations of NeWCRFs and BTS. We thank the authors for their excellent work.