# MaskFlownet-Pytorch

Unofficial PyTorch implementation of MaskFlownet (https://github.com/microsoft/MaskFlownet).

Tested with:
* PyTorch 1.5.0
* CUDA 10.1

### Install

The correlation package must be built and installed first:

```
cd model/correlation_package
python setup.py install
```

### Inference

So far I have implemented only the inference script; it supports KITTI 2012/2015, MPI Sintel, and FlyingChairs:

```
python predict.py CONFIG -c CHECKPOINT --dataset_cfg DATASET -f ROOT_FOLDER [-b BATCH_SIZE]
```

For example:
* `python predict.py MaskFlownet.yaml -c 5adNov03-0005_1000000.pth --dataset_cfg sintel.yaml -f ./SINTEL -b 4`
* `python predict.py MaskFlownet.yaml -c 8caNov12-1532_300000.pth --dataset_cfg kitti.yaml -f ./KITTI -b 4`
* `python predict.py MaskFlownet_S.yaml -c 771Sep25-0735_500000.pth --dataset_cfg chairs.yaml -f ./FLYINGCHAIRS -b 4`
* `python predict.py MaskFlownet_S.yaml -c dbbSep30-1206_1000000.pth --dataset_cfg sintel.yaml -f ./SINTEL -b 4`
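If you want to compare predictions against the dataset ground truth yourself: FlyingChairs and Sintel ship their ground-truth flow as Middlebury `.flo` files (KITTI uses 16-bit PNGs instead). Below is a minimal sketch of a `.flo` reader; `read_flo` is a hypothetical helper for convenience, not part of this repository's API.

```python
import numpy as np

def read_flo(path):
    """Read a Middlebury .flo file into an (H, W, 2) float32 array of (u, v) flow."""
    with open(path, "rb") as f:
        # The 4-byte magic "PIEH" reads as the float32 202021.25.
        magic = np.fromfile(f, np.float32, count=1)[0]
        if magic != 202021.25:
            raise ValueError("not a valid .flo file (bad magic number)")
        w, h = (int(x) for x in np.fromfile(f, np.int32, count=2))
        # Row-major, interleaved (u, v) components.
        data = np.fromfile(f, np.float32, count=w * h * 2)
    return data.reshape(h, w, 2)
```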
### Differences with the original implementation

The results differ slightly from the original implementation:

| Checkpoint | Network | Implementation | KITTI2012 | KITTI2015 | Sintel Clean | Sintel Final | FlyingChairs |
| --- | --- | --- | --- | --- | --- | --- | --- |
| 771Sep25 | MaskFlownet_S | Original AEPE | 4.12 | 11.52 | 3.38 | 4.71 | 1.84 |
| | | PyTorch AEPE | 4.18 | 11.82 | 3.38 | 4.70 | 1.83 |
| dbbSep30 | MaskFlownet_S | Original AEPE | 1.27 | 1.92 | 2.76 | 3.29 | 2.36 |
| | | PyTorch AEPE | 1.28 | 1.93 | 2.78 | 3.32 | 2.36 |
| 5adNov03 | MaskFlownet | Original AEPE | 1.16 | 1.66 | 2.58 | 3.14 | 2.23 |
| | | PyTorch AEPE | 1.18 | 1.68 | 2.59 | 3.17 | 2.23 |
| 8caNov12 | MaskFlownet | Original AEPE | 0.82 | 1.38 | 4.34 | 5.27 | 4.01 |
| | | PyTorch AEPE | 0.82 | 1.38 | 4.40 | 5.33 | 3.99 |
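All numbers above are AEPE (average end-point error): the Euclidean distance between the predicted and ground-truth flow vectors at each pixel, averaged over the image. A minimal sketch of the metric, assuming batched `(B, 2, H, W)` tensors; this is a hypothetical helper for illustration, not this repository's evaluation code:

```python
import torch

def aepe(flow_pred: torch.Tensor, flow_gt: torch.Tensor) -> torch.Tensor:
    """Average end-point error for (B, 2, H, W) optical-flow tensors."""
    # Per-pixel end-point error: L2 norm over the (u, v) channel dimension.
    epe = torch.norm(flow_pred - flow_gt, p=2, dim=1)  # shape (B, H, W)
    # AEPE averages the per-pixel error over all pixels and batch elements.
    return epe.mean()
```

Note that on KITTI, where ground truth is sparse, the average is normally taken only over pixels with valid ground truth; the sketch above averages over every pixel.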