Motion-Aware Correlation Filters for Online Visual Tracking
Abstract
1. Introduction
1.1. DCF-Based Trackers
1.2. Solutions to the Problem of Fast Motion
1.3. Our Contributions
- A novel tracking framework, MACF, which corrects the padding region using motion cues predicted by two separate instantaneous motion estimation Kalman filters, one for in-plane position prediction and the other for scale prediction (a hedged sketch of such filters is given after this list);
- A confidence function of the squared response map that identifies frames in which the target is occluded or corrupted, together with an adaptive learning rate that prevents the model from being contaminated;
- Qualitative and quantitative experiments on OTB-50, OTB-100 and a UAV video demonstrating that our approach outperforms most state-of-the-art trackers.
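The two motion predictors in the first contribution can be pictured as constant-velocity Kalman filters, one over the in-plane position and one over the scale. The Python sketch below only illustrates that idea and is not the authors' implementation: the state layout, the noise covariances, the one-frame time step and the class name `ConstantVelocityKF` are assumptions.

```python
# Minimal sketch (assumed, not the paper's exact filters): constant-velocity
# Kalman filters for position (x, y) and scale prediction.
import numpy as np

class ConstantVelocityKF:
    def __init__(self, dim, process_var=1e-2, meas_var=1e-1):
        n = 2 * dim                                # state = [values, velocities]
        self.x = np.zeros(n)                       # state estimate
        self.P = np.eye(n)                         # state covariance
        self.F = np.eye(n)                         # transition model
        self.F[:dim, dim:] = np.eye(dim)           # value += velocity (dt = 1 frame)
        self.H = np.hstack([np.eye(dim), np.zeros((dim, dim))])  # only values are measured
        self.Q = process_var * np.eye(n)           # process noise (assumed)
        self.R = meas_var * np.eye(dim)            # measurement noise (assumed)

    def predict(self):
        # Time update: propagate state and covariance, return predicted value.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.H @ self.x

    def correct(self, z):
        # Measurement update with the position/scale detected by the filter bank.
        y = np.asarray(z, dtype=float) - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(len(self.x)) - K @ self.H) @ self.P

# One filter per motion cue: 2-D centre position and 1-D scale factor.
position_kf = ConstantVelocityKF(dim=2)
scale_kf = ConstantVelocityKF(dim=1)
```

In each frame the tracker would call `predict()` to shift the padding (search) region before detection and, once the correlation filters localize the target, feed the detected position and scale back through `correct()`.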
2. The Reference Tracker
3. Our Approach
3.1. Instantaneous Motion Estimation between Three Adjacent Frames
3.2. Kalman Filters-Based Motion Estimation
3.2.1. Prediction
3.2.2. Measurement and Correction
3.3. Motion-Aware in Our Framework
3.4. Position and Scale Detection
3.5. A Novel Model Update Strategy
Algorithm 1. MACF tracking algorithm

Input:
1: Image of the current frame.
2: Predicted target position and scale from the previous frame.

Output:
1: Detected target position and scale in the current frame.
2: Predicted target position and scale for the subsequent frame.

Loop:
1: Initialize the translation model and the scale model in the first frame by Equations (3) and (4), and initialize the confidence of the squared response map in the initial frame by Equation (19).
2: for each subsequent frame do
3: Position detection and prediction:
4: Extract the pending sample feature at the predicted position and scale.
5: Compute the translation correlation scores by Equation (8).
6: Set the target position to the location that maximizes the correlation scores.
7: Predict the position of the target in the subsequent frame by joint Equations (11) and (17).
8: Scale detection and prediction:
9: Extract the pending sample feature at the detected position and the predicted scale.
10: Compute the scale correlation scores by Equation (8).
11: Set the target scale to the scale that maximizes the correlation scores.
12: Predict the scale of the target in the subsequent frame by joint Equations (11) and (17).
13: Model update:
14: Compute the confidence of the squared response map in the current frame by Equation (17).
15: Compute the adaptive learning rate by Equation (18).
16: Extract the translation and scale sample features at the detected position and scale.
17: Update the motion parameters by Equations (9) and (10).
18: Update the Kalman filters by Equation (18).
19: Update the translation model with the adaptive learning rate.
20: Update the scale model with the adaptive learning rate.
21: Return the detected and predicted position and scale.
22: end for
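Steps 14, 15, 19 and 20 of Algorithm 1 couple a confidence score of the squared response map with an adaptive learning rate. The sketch below shows one plausible reading of that mechanism: the peak-to-sidelobe-style confidence, the clamping bounds and the base learning rate of 0.025 are assumptions that stand in for Equations (17)–(19), which are not reproduced here.

```python
# Hedged sketch of a confidence-driven adaptive update (assumed surrogate for
# the paper's confidence of the squared response map and learning-rate rule).
import numpy as np

def squared_response_confidence(response):
    """Peak-to-sidelobe-style confidence computed on the squared response map."""
    r2 = np.square(response)
    peak = r2.max()
    sidelobe = np.delete(r2.ravel(), r2.argmax())   # everything except the peak
    return (peak - sidelobe.mean()) / (sidelobe.std() + 1e-12)

def adaptive_learning_rate(conf_now, conf_init, base_lr=0.025, lo=0.0, hi=1.0):
    """Scale the base learning rate by the confidence relative to the first
    frame, so the model update shrinks when the target looks occluded or
    corrupted (constants are illustrative assumptions)."""
    ratio = np.clip(conf_now / (conf_init + 1e-12), lo, hi)
    return base_lr * ratio

# Usage inside the tracking loop (hypothetical variable names):
# conf_init = squared_response_confidence(first_frame_response)
# conf_t    = squared_response_confidence(response_t)
# eta_t     = adaptive_learning_rate(conf_t, conf_init)
# model     = (1 - eta_t) * model + eta_t * new_model   # linear interpolation update
```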
4. Experiments and Results
4.1. Implementation Details
4.2. Ablation Experiments
4.3. Experiment on OTB-50
4.4. Experiment on OTB-100
4.5. Comparison of Raw Benchmark Results
4.6. Experiment on UAV Video
4.6.1. Materials and Conditions
4.6.2. Results and Analysis
5. Conclusions
Author Contributions
Funding
Conflicts of Interest
Appendix A
References
- Bolme, D.S.; Beveridge, J.R.; Draper, B.A.; Lui, Y.M. Visual object tracking using adaptive correlation filters. In Proceedings of the 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR), San Francisco, CA, USA, 13–18 June 2010; Volume 119, pp. 2544–2550. [Google Scholar]
- Kalal, Z.; Mikolajczyk, K.; Matas, J. Tracking-Learning-Detection. IEEE Trans. Pattern Anal. Mach. Intell. 2012, 34, 1409–1422. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Wu, Y.; Lim, J.; Yang, M.H. Online Object Tracking: A Benchmark. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition CVPR, Portland, OR, USA, 23–28 June 2013; Volume 9, pp. 2411–2418. [Google Scholar]
- Wu, Y.; Lim, J.; Yang, M.H. Online Object Tracking: A Benchmark. IEEE Trans. Pattern Anal. Mach. Intell. 2015, 37, 1834–1848. [Google Scholar] [CrossRef] [PubMed]
- Kristan, M.; Pflugfelder, R.; Leonardis, A.; Matas, J.; Porikli, F.; Cehovin, L.; Nebehay, G.; Fernandez, G.; Vojir, T.; Gatt, A.; et al. The Visual Object Tracking VOT2014 Challenge Results. In Proceedings of the IEEE European Conference on Computer Vision Workshops (ECCVW), Zurich, Switzerland, 6–12 September 2014; Volume 8926, pp. 191–217. [Google Scholar]
- Kristan, M.; Pflugfelder, R.; Leonardis, A.; Matas, J.; Porikli, F.; Cehovin, L.; Nebehay, G.; Fernandez, G.; Vojir, T.; Gatt, A.; et al. The Visual Object Tracking VOT2016 Challenge Results. In Proceedings of the IEEE European Conference on Computer Vision Workshops (ECCVW), Amsterdam, The Netherlands, 8–10 October 2016; pp. 777–823. [Google Scholar]
- Possegger, H.; Mauthner, T.; Bischof, H. In defense of color-based model-free tracking. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Boston, MA, USA, 7–12 June 2015; Volume 2015, pp. 2113–2120. [Google Scholar]
- Vojir, T.; Noskova, J.; Matas, J. Robust Scale-Adaptive Mean-Shift for Tracking. Pattern Recognit. Lett. 2014, 49, 250–258. [Google Scholar] [CrossRef]
- Danelljan, M.; Khan, F.S.; Felsberg, M.; Weijer, J.V.D. Adaptive Color Attributes for Real-Time Visual Tracking. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Columbus, OH, USA, 23–28 June 2014; Volume 2014, pp. 1090–1097. [Google Scholar]
- Hare, S.; Saffari, A.; Torr, P.H.S. Struck: Structured output tracking with kernels. IEEE Trans. Pattern Anal. Mach. Intell. 2015, 38, 2096–2109. [Google Scholar] [CrossRef] [PubMed]
- He, S.; Yang, Q.; Lau, R.W.H.; Wang, J.; Yang, M.H. Visual Tracking via Locality Sensitive Histograms. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Portland, OR, USA, 23–28 June 2013; Volume 2013, pp. 2427–2434. [Google Scholar]
- Lu, H.; Jia, X.; Yang, M.H. Visual tracking via adaptive structural local sparse appearance model. In Proceedings of the IEEE Computer Vision and Pattern Recognition CVPR, Providence, RI, USA, 16–21 June 2012; Volume 2012, pp. 1822–1829. [Google Scholar]
- Learnedmiller, E.; Sevillalara, L. Distribution fields for tracking. In Proceedings of the IEEE Computer Vision and Pattern Recognition CVPR, Providence, RI, USA, 16–21 June 2012; Volume 2012, pp. 1910–1917. [Google Scholar]
- Galoogahi, H.K.; Fagg, A.; Lucey, S. Learning Background-Aware Correlation Filters for Visual Tracking. In Proceedings of the 2017 IEEE International Conference on Computer Vision Workshop (ICCVW), Venice, Italy, 22–29 October 2017; Volume 2017, pp. 1144–1152. [Google Scholar]
- Galoogahi, H.K.; Sim, T.; Lucey, S. Correlation filters with limited boundaries. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Boston, MA, USA, 7–12 June 2015; Volume 2015, pp. 4630–4638. [Google Scholar]
- Danelljan, M.; Häger, G.; Khan, F.S.; Felsberg, M. Discriminative Scale Space Tracking. IEEE Trans. Pattern Anal. Mach. Intell. 2017, 39, 1561–1575. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Danelljan, M.; Häger, G.; Khan, F.S.; Felsberg, M. Accurate Scale Estimation for Robust Visual Tracking. In Proceedings of the British Machine Vision Conference, Nottingham, UK, 1–5 September 2014; Volume 2014, pp. 65.1–65.11. [Google Scholar]
- Li, Y.; Zhu, J. A Scale Adaptive Kernel Correlation Filter Tracker with Feature Integration. In Proceedings of the IEEE European Conference on Computer Vision (ECCV), Zurich, Switzerland, 6–12 September 2014; Volume 8926, pp. 254–265. [Google Scholar]
- Li, F.; Yao, Y.; Li, P.; Zhang, D.; Zuo, W.; Yang, M.H. Integrating Boundary and Center Correlation Filters for Visual Tracking with Aspect Ratio Variation. In Proceedings of the 2017 IEEE International Conference on Computer Vision Workshop (ICCVW), Venice, Italy, 22–29 October 2017; Volume 2017, pp. 2001–2009. [Google Scholar]
- Zhang, K.; Zhang, L.; Yang, M.H. Real-Time Compressive Tracking. In Proceedings of the IEEE European Conference on Computer Vision (ECCV), Florence, Italy, 7–13 October 2012; Volume 2012, pp. 864–877. [Google Scholar]
- Henriques, J.F.; Caseiro, R.; Martins, P.; Batista, J. Exploiting the circulant structure of tracking-by-detection with kernels. In Proceedings of the IEEE European Conference on Computer Vision (ECCV), Florence, Italy, 7–13 October 2012; Volume 2012, pp. 702–715. [Google Scholar]
- Henriques, J.F.; Caseiro, R.; Martins, P.; Batista, J. High-Speed Tracking with Kernelized Correlation Filters. IEEE Trans. Pattern Anal. Mach. Intell. 2015, 37, 583–596. [Google Scholar] [Green Version]
- Xu, L.; Luo, H.; Hui, B.; Zheng, C. Real-Time Robust Tracking for Motion Blur and Fast Motion via Correlation Filters. Sensors 2016, 16, 1443. [Google Scholar] [CrossRef] [PubMed]
- Li, F.; Zhang, S.; Qiao, X. Scene-Aware Adaptive Updating for Visual Tracking via Correlation Filters. Sensors 2017, 17, 2626. [Google Scholar] [CrossRef] [PubMed]
- Lukezic, A.; Vojir, T.; Zajc, L.C.; Matas, J.; Kristan, M. Discriminative Correlation Filter Tracker with Channel and Spatial Reliability. Int. J. Comput. Vis. 2018, 126, 671–688. [Google Scholar] [CrossRef] [Green Version]
- Danelljan, M.; Hager, G.; Khan, F.S.; Felsberg, M. Learning Spatially Regularized Correlation Filters for Visual Tracking. In Proceedings of the IEEE International Conference on Computer Vision (ICCV), Santiago, Chile, 7–13 December 2015; Volume 2015, pp. 4310–4318. [Google Scholar]
- Bertinetto, L.; Valmadre, J.; Golodetz, S.; Miksik, O.; Torr, P.H.S. Staple: Complementary Learners for Real-Time Tracking. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 1401–1409. [Google Scholar]
- Zhang, K.; Zhang, L.; Yang, M.-H.; Zhang, D. Fast Tracking via Spatio-Temporal Context Learning. Comput. Sci. 2013, 1, 25–32. [Google Scholar]
- Yang, R.; Wei, Z. Real-Time Visual Tracking through Fusion Features. Sensors 2016, 16, 949. [Google Scholar] [Green Version]
- Shi, G.; Xu, T.; Guo, J.; Luo, J.; Li, Y. Consistently Sampled Correlation Filters with Space Anisotropic Regularization for Visual Tracking. Sensors 2017, 17, 2889. [Google Scholar] [CrossRef] [PubMed]
- Li, Y.; Zhu, J.; Hoi, S.C.H. Reliable Patch Trackers: Robust visual tracking by exploiting reliable patches. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Boston, MA, USA, 7–12 June 2015; Volume 2015, pp. 353–361. [Google Scholar]
- Fan, H.; Ling, H. Parallel Tracking and Verifying: A Framework for Real-Time and High Accuracy Visual Tracking. In Proceedings of the 2017 IEEE International Conference on Computer Vision Workshop (ICCVW), Venice, Italy, 22–29 October 2017; Volume 2017, pp. 5487–5495. [Google Scholar]
- Hao, W.; Chen, S.X.; Yang, B.F.; Chen, K. Robust cubature Kalman filter target tracking algorithm based on generalized M-estimation. Acta Phys. Sin. 2015, 64, 1–7. [Google Scholar]
- Gao, S.; Liu, Y.; Wang, J.; Deng, W.; Heekuck, O. The Joint Adaptive Kalman Filter (JAKF) for Vehicle Motion State Estimation. Sensors 2016, 16, 1103. [Google Scholar] [CrossRef] [PubMed]
- Lee, M.S.; Ju, H.; Song, J.W.; Park, C.G. Kinematic Model-Based Pedestrian Dead Reckoning for Heading Correction and Lower Body Motion Tracking. Sensors 2015, 15, 28129–28153. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Pajares Redondo, J.; Prieto González, L.; García Guzman, J.; López Boada, B.; Díaz López, V. VEHIOT: Design and Evaluation of an IoT Architecture Based on Low-Cost Devices to Be Embedded in Production Vehicles. Sensors 2018, 18, 486. [Google Scholar] [CrossRef] [PubMed]
- Ettlinger, A.; Neuner, H.; Burgess, T. Development of a Kalman Filter in the Gauss-Helmert Model for Reliability Analysis in Orientation Determination with Smartphone Sensors. Sensors 2018, 18, 414. [Google Scholar] [CrossRef] [PubMed]
- Li, P.; Zhang, T.; Ma, B. Unscented Kalman filter for visual curve tracking. Image Vis. Comput. 2004, 22, 157–164. [Google Scholar] [CrossRef]
- Funk, N. A Study of the Kalman Filter Applied to Visual Tracking; University of Alberta: Edmonton, AB, Canada, 2003. [Google Scholar]
- Yoon, Y.; Kosaka, A.; Kak, A.C. A New Kalman-Filter-Based Framework for Fast and Accurate Visual Tracking of Rigid Objects. IEEE Trans. Robot. 2008, 24, 1238–1251. [Google Scholar] [CrossRef] [Green Version]
- Bewley, A.; Ge, Z.; Ott, L.; Ramos, F.; Upcroft, B. Simple online and realtime tracking. In Proceedings of the IEEE International Conference on Image Processing, Phoenix, AZ, USA, 25–28 September 2016; Volume 2016, pp. 3464–3468. [Google Scholar]
- Wang, M.; Liu, Y.; Huang, Z. Large Margin Object Tracking with Circulant Feature Maps. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 21–26 July 2017; Volume 2017, pp. 4800–4808. [Google Scholar]
- Gladh, S.; Danelljan, M.; Khan, F.S.; Felsberg, M. Deep Motion Features for Visual Tracking. In Proceedings of the 2016 23rd International Conference on Pattern Recognition (ICPR), Cancun, Mexico, 4–8 December 2016. [Google Scholar]
- Danelljan, M.; Bhat, G.; Gladh, S.; Khan, F.S.; Felsberg, M. Deep Motion and Appearance Cues for Visual Tracking. Pattern Recognit. Lett. 2018. [Google Scholar] [CrossRef]
- Ma, C.; Yang, X.; Zhang, C.; Yang, M.H. Long-term correlation tracking. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Boston, MA, USA, 7–12 June 2015; Volume 2015, pp. 5388–5396. [Google Scholar]
- Wojke, N.; Bewley, A.; Paulus, D. Simple Online and Realtime Tracking with a Deep Association Metric. In Proceedings of the 2017 IEEE International Conference on Image Processing (ICIP), Beijing, China, 17–20 September 2017; Volume 2017, pp. 3645–3649. [Google Scholar]
- Kronhamn, T. Geometric illustration of the Kalman filter gain and covariance update algorithms. IEEE Control Syst. Mag. 2003, 5, 41–43. [Google Scholar] [CrossRef]
- Oron, S.; Bar-Hillel, A.; Levi, D.; Avidan, S. Locally Orderless Tracking. In Proceedings of the 2012 IEEE Conference on Computer Vision and Pattern Recognition, Providence, RI, USA, 16–21 June 2012; Volume 2012, pp. 1940–1947. [Google Scholar]
- Wang, D.; Lu, H.; Yang, M.H. Least Soft-Threshold Squares Tracking. In Proceedings of the IEEE Computer Vision and Pattern Recognition CVPR, Portland, OR, USA, 23–28 June 2013; Volume 2013, pp. 2371–2378. [Google Scholar]
- Zhang, T.; Ghanem, B.; Liu, S.; Ahuja, N. Robust visual tracking via multi-task sparse learning. In Proceedings of the 2012 IEEE Conference on Computer Vision and Pattern Recognition, Providence, RI, USA, 16–21 June 2012; Volume 2012, pp. 2042–2049. [Google Scholar]
- Danelljan, M.; Bhat, G.; Khan, F.S.; Felsberg, M. ECO: Efficient Convolution Operators for Tracking. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; Volume 2017, pp. 6931–6939. [Google Scholar]
- Nam, H.; Han, B. Learning Multi-Domain Convolutional Neural Networks for Visual Tracking. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 4293–4302. [Google Scholar]
- Fan, H.; Ling, H. SANet: Structure-Aware Network for Visual Tracking. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops CVPRW, Honolulu, HI, USA, 21–26 July 2017; Volume 2017, pp. 2217–2224. [Google Scholar]
- Danelljan, M.; Robinson, A.; Khan, F.S.; Felsberg, M. Beyond Correlation Filters: Learning Continuous Convolution Operators for Visual Tracking. In Proceedings of the IEEE European Conference on Computer Vision (ECCV), Amsterdam, The Netherlands, 8–10 October 2016; Volume 2016, pp. 472–488. [Google Scholar]
- Bertinetto, L.; Valmadre, J.; Henriques, J.F.; Vedaldi, A.; Torr, P.H.S. Fully-Convolutional Siamese Networks for Object Tracking. In Proceedings of the IEEE European Conference on Computer Vision (ECCV), Amsterdam, The Netherlands, 8–10 October 2016; Volume 2016, pp. 850–865. [Google Scholar]
- Zhang, T.; Xu, C.; Yang, M.H. Multi-task Correlation Particle Filter for Robust Object Tracking. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition CVPR, Honolulu, HI, USA, 21–26 July 2017; Volume 2017, pp. 4819–4827. [Google Scholar]
- Gašparović, M.; Gajski, D. Two-step camera calibration method developed for micro UAV’s. In Proceedings of the XXIII ISPRS Congress, Prague, Czech Republic, 12–19 July 2016; Volume 2016, pp. 829–833. [Google Scholar]
- Pérez, M.; Agüera, F.; Carvajal, F. Low Cost Surveying Using AN Unmanned Aerial Vehicle. ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2013, 40, 311–315. [Google Scholar] [CrossRef]
- Gašparović, M.; Jurjević, L. Gimbal Influence on the Stability of Exterior Orientation Parameters of UAV Acquired Images. Sensors 2017, 17, 401. [Google Scholar] [CrossRef] [PubMed]
Trackers | Precision Plot OPE (AUC%) | Precision Plot SRE (AUC%) | Precision Plot TRE (AUC%) | Success Plot OPE (AUC%) | Success Plot SRE (AUC%) | Success Plot TRE (AUC%) | Speed (FPS) |
---|---|---|---|---|---|---|---|
FDSST | 62.8 | 55.6 | 60.8 | 50.6 | 44.2 | 53.0 | 49 |
IME_CF | 63.7 | 58.6 | 63.3 | 52.6 | 46.7 | 53.7 | 48 |
KF_CF | 64.5 | 59.3 | 65.5 | 54.4 | 46.9 | 54.0 | 46 |
ALR_CF | 65.0 | 61.1 | 66.6 | 54.6 | 48.6 | 54.6 | 55 |
MACF | 65.1 | 59.7 | 65.6 | 52.3 | 47.1 | 54.4 | 51 |
Trackers | OPE SP (%) | OPE PP (%) | SRE SP (%) | SRE PP (%) | TRE SP (%) | TRE PP (%) |
---|---|---|---|---|---|---|
MACF | 52.3 | 65.1 | 47.1 | 59.7 | 54.4 | 65.1 |
FDSST | 50.6 | 60.8 | 44.2 | 55.6 | 53.0 | 62.8 |
LCT | 49.3 | 61.3 | 44.0 | 57.4 | 52.9 | 63.0 |
DSST | 47.0 | 58.5 | 44.0 | 56.1 | 53.8 | 64.6 |
KCF | 43.9 | 59.7 | 40.1 | 54.7 | 49.0 | 61.8 |
CSK | 36.5 | 47.0 | 33.2 | 43.2 | 43.4 | 53.7 |
CT | 25.4 | 32.1 | 26.1 | 33.6 | 29.0 | 36.7 |
DEF | 26.4 | 32.3 | 25.4 | 32.3 | 33.1 | 39.4 |
Trackers | SV | IV | OPR | OCC | BC | DEF | MB | FM | IPR | OV | LR | AUC |
---|---|---|---|---|---|---|---|---|---|---|---|---|
MACF | 64.6 | 68.4 | 64.9 | 65.6 | 70.8 | 64.3 | 62.1 | 61.0 | 65.8 | 55.6 | 72.1 | 69.5 |
FDSST | 62.6 | 68.0 | 63.4 | 63.5 | 71.4 | 61.6 | 63.7 | 62.5 | 65.2 | 49.4 | 69.9 | 67.7 |
LCT | 61.3 | 67.4 | 64.1 | 62.1 | 68.9 | 63.5 | 60.8 | 57.9 | 65.3 | 45.8 | 66.8 | 68.6 |
DSST | 62.0 | 67.5 | 61.9 | 60.2 | 67.4 | 60.8 | 56.8 | 54.3 | 63.6 | 46.4 | 68.3 | 64.7 |
KCF | 60.5 | 65.2 | 63.1 | 60.4 | 71.6 | 60.8 | 56.3 | 57.0 | 63.4 | 46.5 | 65.1 | 63.7 |
CSK | 50.3 | 54.6 | 52.0 | 48.7 | 57.0 | 51.4 | 42.7 | 41.7 | 52.9 | 33.5 | 54.7 | 55.7 |
CT | 38.4 | 35.8 | 40.9 | 38.2 | 38.3 | 39.9 | 22.9 | 25.9 | 39.9 | 31.8 | 49.3 | 38.7 |
DEF | 38.9 | 43.5 | 46.1 | 43.5 | 47.6 | 45.7 | 35.2 | 34.9 | 45.3 | 29.4 | 41.6 | 46.0 |
Trackers | SV | IV | OPR | OCC | BC | DEF | MB | FM | IPR | OV | LR | AUC |
---|---|---|---|---|---|---|---|---|---|---|---|---|
MACF | 66.2 | 71.8 | 65.5 | 63.5 | 72.6 | 62.5 | 63.6 | 63.9 | 67.0 | 57.3 | 65.2 | 69.6 |
FDSST | 61.8 | 68.4 | 62.8 | 59.5 | 70.5 | 58.5 | 61.6 | 63.2 | 66.9 | 50.4 | 64.7 | 68.8 |
LCT | 61.9 | 67.8 | 66.6 | 60.2 | 66.1 | 61.6 | 60.2 | 62.0 | 69.9 | 52.0 | 64.3 | 68.1 |
DSST | 59.3 | 67.5 | 60.6 | 56.6 | 63.8 | 52.8 | 52.0 | 51.0 | 62.8 | 43.1 | 63.6 | 66.9 |
KCF | 58.2 | 64.2 | 62.9 | 60.0 | 65.2 | 58.6 | 55.3 | 58.1 | 63.8 | 48.0 | 62.3 | 66.5 |
CSK | 44.2 | 47.3 | 46.7 | 42.0 | 52.7 | 42.5 | 34.9 | 38.7 | 49.5 | 27.7 | 43.8 | 49.3 |
CT | 32.8 | 29.7 | 35.6 | 32.4 | 35.8 | 31.5 | 20.7 | 21.1 | 34.9 | 30.8 | 40.3 | 33.0 |
DEF | 34.5 | 39.7 | 43.0 | 41.6 | 43.1 | 40.4 | 27.6 | 30.5 | 41.4 | 34.4 | 41.9 | 40.6 |
Trackers | SV | IV | OPR | OCC | BC | DEF | MB | FM | IPR | OV | LR | AUC |
---|---|---|---|---|---|---|---|---|---|---|---|---|
MACF | 60.6 | 64.5 | 59.8 | 58.7 | 63.4 | 56.1 | 56.9 | 57.9 | 62.3 | 51.7 | 67.5 | 64.1 |
FDSST | 56.9 | 60.0 | 57.8 | 55.7 | 62.5 | 51.2 | 56.1 | 58.4 | 61.4 | 44.2 | 64.1 | 61.8 |
LCT | 57.4 | 61.8 | 62.0 | 56.5 | 59.7 | 58.8 | 52.7 | 49.9 | 64.8 | 44.7 | 62.7 | 63.3 |
DSST | 56.8 | 62.7 | 56.8 | 53.7 | 60.9 | 50.1 | 49.0 | 55.1 | 59.6 | 42.5 | 64.3 | 60.6 |
KCF | 53.7 | 58.9 | 57.0 | 52.9 | 60.1 | 53.8 | 48.8 | 53.1 | 58.3 | 39.7 | 56.9 | 59.4 |
CSK | 41.2 | 44.7 | 45.0 | 41.5 | 45.7 | 38.9 | 33.4 | 36.4 | 46.8 | 28.9 | 45.9 | 46.2 |
CT | 35.0 | 30.9 | 35.8 | 33.7 | 31.6 | 32.8 | 22.2 | 24.6 | 36.4 | 30.2 | 42.6 | 34.4 |
DEF | 31.9 | 34.8 | 38.7 | 36.0 | 40.3 | 35.3 | 28.9 | 30.4 | 40.3 | 28.3 | 34.5 | 37.8 |
Trackers | OTB-50 SP of OPE (%) | OTB-50 PP of OPE (%) | OTB-100 SP of OPE (%) | OTB-100 PP of OPE (%) | Deep Learning | Real Time (FPS) |
---|---|---|---|---|---|---|
ECO | 64.3 | 87.4 | 69.4 | 91.0 | Y | N (6) |
MDNet | 64.5 | 89.0 | 67.8 | 90.9 | Y | N (1) |
SANet | -- | -- | 69.2 | 92.8 | Y | N (1) |
C-COT | 61.4 | 84.3 | 67.1 | 89.8 | Y | N (0.3) |
SiamFC_3s | 51.6 | 69.2 | 58.2 | 77.1 | Y | Y (86) |
MCPF | 58.3 | 84.3 | 62.8 | 87.3 | Y | N (0.5) |
DeepSRDCF | 56.0 | 77.2 | 63.5 | 85.1 | Y | N (<1) |
CSR-DCF | 59.7 | 66.7 | 59.8 | 73.3 | N | Y (13) |
ECO-HC + MACF | 60.7 | 84.6 | 65.6 | 87.5 | N | Y (19) |
ECO-HC | 59.2 | 81.4 | 64.3 | 85.6 | N | Y (21) |
MACF | 52.3 | 65.1 | 56.6 | 69.6 | N | Y (51) |
FDSST | 50.6 | 60.8 | 56.2 | 67.9 | N | Y (49) |
Camera Parameter | Value | UAV Parameter | Value |
---|---|---|---|
Aperture size | F2.2 | Product number | W5 |
Number of pixels | 1200 W | Expanded size | 15.5 × 15.5 × 10 cm |
Pixel size | 1.25 μm | Color | Red |
Focusing speed | 0.23 s | Type of control signal | Wireless Fidelity (Wi-Fi) |
Image dimensions | 3 | Others | No anti-vibration and no gimbal [59] used |
Trackers | MACF | ECO_HC | BACF | STC | STAPLE | SRDCF | DAT | FDSST |
---|---|---|---|---|---|---|---|---|
SP of OPE (%) | 100.0 | 94.8 | 20.6 | 31.4 | 21.8 | 98.7 | 23.1 | 46.8 |
PP of OPE (%) | 100.0 | 98.6 | 24.1 | 32.2 | 29.1 | 99.5 | 32.5 | 52.6 |
© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).