This repository is forked from [caffe-fast-rcnn](https://github.com/Eniac-Xie/caffe-fast-rcnn.git); thanks to Eniac-Xie.
Now I want to do:
- Implement FPN (Feature Pyramid Network)
- Test on VOC
- Test on COCO
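As a reference for the first item, the FPN top-down pathway (upsample the coarser level by 2x and add a 1x1-projected lateral connection) can be sketched in plain NumPy. This is only an illustrative sketch; the function names, channel counts, and shapes below are assumptions, not code from this repository:

```python
# Illustrative sketch of the FPN top-down pathway (Lin et al., 2017).
# All names and shapes are hypothetical; this is not the repo's implementation.
import numpy as np

def upsample2x(x):
    """Nearest-neighbor 2x upsampling of a (C, H, W) feature map."""
    return x.repeat(2, axis=1).repeat(2, axis=2)

def build_pyramid(backbone_feats, lateral_weights):
    """backbone_feats: list of (C_i, H_i, W_i) maps, finest first.
    lateral_weights: list of (256, C_i) 1x1-conv weights (toy values)."""
    # 1x1 lateral projections of each backbone level to 256 channels
    laterals = [np.einsum('dc,chw->dhw', w, f)
                for w, f in zip(lateral_weights, backbone_feats)]
    # start from the coarsest level, then add upsampled coarser maps
    pyramid = [laterals[-1]]
    for lat in reversed(laterals[:-1]):
        pyramid.append(lat + upsample2x(pyramid[-1]))
    return pyramid[::-1]  # finest first, matching the input order
```

Each output level keeps the spatial size of its backbone level but has a fixed channel count, which is what lets a single RPN/head run on every level.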
This code extends py-faster-rcnn with a ResNet implementation and Online Hard Example Mining (OHEM). The Faster R-CNN code is based on py-faster-rcnn, and the OHEM code is based on ohem. To reduce memory usage, we use the batchnorm layer from Microsoft's caffe.
- The caffe-fast-rcnn we use differs slightly from the one py-faster-rcnn uses: it takes the batchnorm layer from Microsoft's caffe to reduce memory usage.
- It uses the in-place eltwise sum from the corresponding Caffe PR.
- To further reduce memory usage, we also release a pretrained ResNet-101 model in which each batchnorm layer's parameters are merged into the following scale layer's; see tools/merge_bn_scale.py for more detail.
- Online Hard Example Mining is used during training.
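The batchnorm-into-scale merge mentioned above is a small algebraic fold: since batchnorm computes `gamma * (x - mean) / sqrt(var + eps) + beta`, the whole layer collapses into a single affine `scale * x + bias`. A minimal NumPy sketch of that idea (illustrative only; variable names here are not those of tools/merge_bn_scale.py):

```python
# Sketch of folding batchnorm statistics into a single scale/bias pair,
# in the spirit of tools/merge_bn_scale.py. Names are illustrative.
import numpy as np

def merge_bn_into_scale(gamma, beta, mean, var, eps=1e-5):
    """Return (scale, bias) such that
    scale * x + bias == gamma * (x - mean) / sqrt(var + eps) + beta."""
    std = np.sqrt(var + eps)
    scale = gamma / std
    bias = beta - gamma * mean / std
    return scale, bias
```

At inference time `mean` and `var` are frozen running statistics, so the merged model is mathematically identical to the original while saving the memory and compute of a separate batchnorm layer.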
The usage is similar to py-faster-rcnn.

- Clone this repository:

  ```shell
  git clone https://github.com/Eniac-Xie/faster-rcnn-resnet.git
  ```

  We'll call the directory that you cloned faster-rcnn-resnet `$ROOT`.

- Clone the modified caffe-fast-rcnn:

  ```shell
  cd $ROOT/
  git clone https://github.com/Eniac-Xie/caffe-fast-rcnn.git
  ```

- Build the Cython modules:

  ```shell
  cd $ROOT/lib/
  make
  ```

- Build Caffe:

  ```shell
  cd $ROOT/caffe-fast-rcnn
  make all -j8
  make pycaffe
  ```
model | training data | test data | ohem | [email protected]
---|---|---|---|---
Faster-RCNN, ResNet-50 | VOC 07+12 trainval | VOC 07 test | False | 78.78%
Faster-RCNN, ResNet-101 | VOC 07+12 trainval | VOC 07 test | True | 79.44%
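The `ohem` column above toggles Online Hard Example Mining (Shrivastava et al., 2016): run a read-only forward pass over all RoIs, rank them by loss, and backpropagate only the hardest ones. The selection rule itself is tiny; here is an illustrative NumPy sketch (function name and signature are assumptions, not this repo's code):

```python
# Illustrative OHEM selection rule: keep only the highest-loss RoIs.
# Real OHEM computes per_roi_loss in a read-only forward pass first.
import numpy as np

def select_hard_rois(per_roi_loss, batch_size):
    """Return the (sorted) indices of the `batch_size` RoIs with highest loss."""
    order = np.argsort(per_roi_loss)[::-1]  # indices by descending loss
    return np.sort(order[:batch_size])
```

Only the selected RoIs contribute gradients, which is why OHEM can improve accuracy (as in the table above) without changing the network architecture.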
Download faster-rcnn-resnet weights from:

- faster-rcnn-resnet without ohem (BaiduYun)
- faster-rcnn-resnet without ohem (OneDrive)
- faster-rcnn-resnet with ohem (BaiduYun)
- faster-rcnn-resnet with ohem (OneDrive)
Then you can run:

```shell
cd $ROOT/
sh experiments/scripts/train_resnet101_bn_scale_merged_0712_end2end.sh
```

or:

```shell
cd $ROOT/
sh experiments/scripts/train_resnet101_bn_scale_merged_0712_end2end_ohem.sh
```
Download the ResNet-101 pretrained model. Note that we use a modified version in which each batchnorm layer's parameters are merged into the following scale layer's; you can download the model from Baidu Yun or OneDrive.
Then you can run:

```shell
cd $ROOT/
sh experiments/scripts/train_resnet101_bn_scale_merged_0712_end2end.sh
```

or:

```shell
cd $ROOT/
sh experiments/scripts/train_resnet101_bn_scale_merged_0712_end2end_ohem.sh
```