- Prepare the DTU training set (640x512) and the BlendedMVS dataset (768x576).
- Edit config.py and set "DatasetsArgs.root_dir", "LoadDTU.train_root & train_pair", and "LoadBlendedMVS.train_root" (a hypothetical sketch of these fields is given after the training commands below).
- Run the script for training:
```bash
# DTU
python train.py -d dtu
# BlendedMVS
python train.py -d blendedmvs
```
Pretrained models are provided in the `pth` directory.
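config.py itself is not reproduced in this README; the block below is only a hypothetical sketch of the training-related fields named above, with an assumed class-attribute layout and placeholder paths.

```python
# Hypothetical sketch of the training-related fields in config.py.
# Only the attribute names come from this README; the layout and all paths are placeholders.
class DatasetsArgs:
    root_dir = "/data/mvs"                          # common dataset root

class LoadDTU:
    train_root = "/data/mvs/dtu_training"           # DTU training set (640x512)
    train_pair = "/data/mvs/dtu_training/pair.txt"  # view-pairing file (assumed location)

class LoadBlendedMVS:
    train_root = "/data/mvs/blendedmvs"             # BlendedMVS dataset (768x576)
```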
- Prepare the DTU test set (1600x1200) (Baidu Netdisk code: 6au3) and the Tanks and Temples dataset (Baidu Netdisk code: a4oz).
- Edit config.py and set "DatasetsArgs.root_dir", "LoadDTU.eval_root & eval_pair", and "LoadTanks.eval_root" (see the sketch after the test commands below).
- Run the script for testing:
```bash
# DTU
python eval.py -p pth/dtu_29.pth -d dtu
# Tanks and Temples
python eval.py -p pth/blendedmvs_29.pth -d tanks
```
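Likewise, a hypothetical sketch of the evaluation-related fields (attribute names from the step above; layout and paths are assumptions):

```python
# Hypothetical sketch of the evaluation-related fields in config.py.
# Only the attribute names come from this README; all paths are placeholders.
class LoadDTU:
    eval_root = "/data/mvs/dtu_test"           # DTU test set (1600x1200)
    eval_pair = "/data/mvs/dtu_test/pair.txt"  # view-pairing file (assumed location)

class LoadTanks:
    eval_root = "/data/mvs/tanksandtemples"    # Tanks and Temples dataset
```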
There are three fusion methods in "tools": "filter", "gipuma", and "pcd".
- Install the fusibile tool from tools/fusibile or https://github.com/kysucix/fusibile.
- Edit tools/gipuma/conf.py and set "root_dir", "eval_folder", and "fusibile_exe_path" (a hypothetical sketch follows the commands below).
- Run the script:
```bash
cd tools/gipuma
python fusion.py -cfmgd
```
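tools/gipuma/conf.py is not shown here either; a minimal hypothetical sketch of the three settings mentioned above (variable names from this README, all paths are placeholders):

```python
# tools/gipuma/conf.py -- hypothetical sketch; only the three variable names
# below come from this README, the paths are placeholders.
root_dir = "/data/mvs/dtu_test"                          # dataset root (DTU test set)
eval_folder = "/data/mvs/eval_output"                    # directory produced by eval.py
fusibile_exe_path = "/path/to/fusibile/build/fusibile"   # compiled fusibile binary
```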
- Run the script:
```bash
# filter (main method)
cd tools/filter
python dynamic_filter_gpu.py -e EVAL_OUTPUT_LOCATION -r DATASET_PATH -o OUTPUT_PATH

# pcd
cd tools/pcd
chmod +x ninja_init.sh
source ninja_init.sh
python fusion.py -e EVAL_OUTPUT_LOCATION -r DATASET_PATH -o OUTPUT_PATH
```
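For orientation, the sketch below illustrates the reprojection-based geometric consistency check that MVSNet-style depth-map fusion is commonly built on. It is not the actual dynamic_filter_gpu.py implementation: the function names, thresholds, view-count rule, and camera conventions (world-to-camera 4x4 extrinsics, standard pinhole intrinsics) are assumptions.

```python
# Illustrative NumPy sketch of a reprojection-based geometric consistency check.
# NOT the repository's dynamic_filter_gpu.py; conventions and thresholds are assumptions.
import numpy as np

def reproject(depth_ref, K_ref, E_ref, depth_src, K_src, E_src):
    """Project reference pixels into the source view and back again."""
    h, w = depth_ref.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    pix = np.stack([u, v, np.ones_like(u)]).reshape(3, -1).astype(np.float64)
    # Back-project reference pixels to world space.
    cam_ref = np.linalg.inv(K_ref) @ (pix * depth_ref.reshape(1, -1))
    world = np.linalg.inv(E_ref) @ np.vstack([cam_ref, np.ones((1, cam_ref.shape[1]))])
    # Project into the source view and sample its depth (nearest neighbour).
    cam_src = (E_src @ world)[:3]
    uvw = K_src @ cam_src
    u_src = np.clip(np.round(uvw[0] / uvw[2]).astype(int).reshape(h, w), 0, w - 1)
    v_src = np.clip(np.round(uvw[1] / uvw[2]).astype(int).reshape(h, w), 0, h - 1)
    d_src = depth_src[v_src, u_src]
    # Back-project the sampled source depths and re-project into the reference view.
    pix_src = np.stack([u_src, v_src, np.ones_like(u_src)]).reshape(3, -1).astype(np.float64)
    cam_src2 = np.linalg.inv(K_src) @ (pix_src * d_src.reshape(1, -1))
    world2 = np.linalg.inv(E_src) @ np.vstack([cam_src2, np.ones((1, cam_src2.shape[1]))])
    cam_ref2 = (E_ref @ world2)[:3]
    uvw2 = K_ref @ cam_ref2
    u2 = (uvw2[0] / uvw2[2]).reshape(h, w)
    v2 = (uvw2[1] / uvw2[2]).reshape(h, w)
    return u2, v2, cam_ref2[2].reshape(h, w), u, v

def consistent_mask(depth_ref, K_ref, E_ref, depth_src, K_src, E_src,
                    pix_th=1.0, rel_depth_th=0.01):
    """Reference pixels whose depth agrees with one source view."""
    u2, v2, d2, u, v = reproject(depth_ref, K_ref, E_ref, depth_src, K_src, E_src)
    pix_err = np.hypot(u2 - u, v2 - v)
    depth_err = np.abs(d2 - depth_ref) / np.maximum(depth_ref, 1e-8)
    return (pix_err < pix_th) & (depth_err < rel_depth_th)

# A reference pixel is typically kept only if it passes this check for enough
# source views, e.g. the per-view masks summed over all source views >= 3.
```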
| Method | Acc. (mm) | Comp. (mm) | Overall (mm) | Time (s/view) | Memory (MB) |
|---|---|---|---|---|---|
| MDFNet (4 scale) | 0.349 | 0.303 | 0.326 | 0.376 | 4396 |

| Method | Intermediate | Advanced |
|---|---|---|
| MDFNet (4 scale) | 56.18 | 34.70 |
| MDFNet (3 scale) | 60.24 | 37.31 |
Our work is partially based on these open-source works: MVSNet, MVSNet-pytorch, D2HC-RMVSNet, and pcd-fusion. We appreciate their contributions to the MVS community.
This work will be published in Machine Vision and Applications.