Abstract
Recent optical flow methods are almost exclusively judged in terms of accuracy, while their robustness is often neglected. Although adversarial attacks offer a useful tool to perform such an analysis, current attacks on optical flow methods focus on real-world attack scenarios rather than a worst-case robustness assessment. Hence, in this work, we propose a novel adversarial attack, the Perturbation-Constrained Flow Attack (PCFA), that emphasizes destructivity over applicability as a real-world attack. PCFA is a global attack that optimizes adversarial perturbations to shift the predicted flow towards a specified target flow, while keeping the \(L_2\) norm of the perturbation below a chosen bound. Our experiments demonstrate PCFA’s applicability in both white- and black-box settings, and show that it finds stronger adversarial samples than previous attacks. Based on these strong samples, we provide the first joint ranking of optical flow methods that considers both prediction quality and adversarial robustness, which reveals state-of-the-art methods to be particularly vulnerable. Code is available at https://github.com/cv-stuttgart/PCFA.
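For intuition, the following minimal PyTorch sketch illustrates the kind of perturbation-constrained optimization the abstract describes: a global perturbation is optimized to push the predicted flow towards a target flow, while a soft penalty and a final projection keep its \(L_2\) norm below a chosen bound. The names `flow_net`, `flow_target`, the penalty weight `mu`, and the choice of adding the same perturbation to both frames are illustrative assumptions, not the authors' actual PCFA formulation; for that, see the linked code.

```python
import torch

def pcfa_style_attack(flow_net, img1, img2, flow_target, eps2,
                      steps=20, lr=1e-3, mu=1e3):
    """Illustrative sketch (not the official PCFA): optimize a global
    perturbation delta, added here to both frames, that pushes the predicted
    flow towards flow_target while keeping ||delta||_2 below eps2."""
    delta = torch.zeros_like(img1, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)

    for _ in range(steps):
        opt.zero_grad()
        # Flow prediction on the perturbed (and clamped) image pair.
        flow_pred = flow_net(torch.clamp(img1 + delta, 0.0, 1.0),
                             torch.clamp(img2 + delta, 0.0, 1.0))
        # Drive the prediction towards the target flow ...
        target_loss = (flow_pred - flow_target).pow(2).mean()
        # ... while penalizing any violation of the L2 norm bound.
        norm_penalty = torch.relu(delta.norm(p=2) - eps2).pow(2)
        (target_loss + mu * norm_penalty).backward()
        opt.step()

    # Final safeguard: project delta back onto the L2 ball of radius eps2.
    with torch.no_grad():
        norm = delta.norm(p=2)
        if norm > eps2:
            delta.mul_(eps2 / norm)
    return delta.detach()
```

The soft penalty plus final projection is only one possible way to handle the norm constraint; the paper's attack treats the bound more carefully than this sketch.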
Notes
- 1.
- 2. FGSM [15] and I-FGSM [20] keep the perturbation size below \(\varepsilon_\infty\) by performing only as many steps of a fixed step size \(\tau\) as can be taken without exceeding the norm bound. To this end, the number of steps is fixed to \(N=\lfloor \frac{\varepsilon_\infty}{\tau} \rfloor\), which amounts to a one-shot optimization. Additionally, this “early stopping” reduces the attack strength, as it prevents optimizing in the vicinity of the bound (see the sketch after these notes).
- 3. Ranjan et al. [30] generate a pseudo ground truth for their attack with static patches by prescribing a zero flow at the patch locations.
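As a concrete illustration of the step-count argument in note 2, the following generic sketch shows why an I-FGSM-style loop with a fixed step size \(\tau\) and \(N = \lfloor \varepsilon_\infty / \tau \rfloor\) steps respects the \(L_\infty\) bound by construction. The inputs `model` and `loss_fn` are assumed placeholders; this is the generic I-FGSM scheme, not the implementation of any specific optical flow attack.

```python
import math
import torch

def ifgsm_step_count(model, x, loss_fn, eps_inf, tau):
    """Generic I-FGSM-style loop: with N = floor(eps_inf / tau) sign-gradient
    steps of size tau, the accumulated perturbation can never exceed eps_inf
    in the L-infinity norm, so the bound holds by construction."""
    n_steps = math.floor(eps_inf / tau)  # N = floor(eps_inf / tau)
    x_adv = x.clone().detach()
    for _ in range(n_steps):
        x_adv = x_adv.detach().requires_grad_(True)
        loss = loss_fn(model(x_adv))
        grad, = torch.autograd.grad(loss, x_adv)
        # Each step changes every pixel by at most tau, so after k <= N steps
        # ||x_adv - x||_inf <= k * tau <= eps_inf.
        x_adv = x_adv.detach() + tau * grad.sign()
    return x_adv.detach()
```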
References
Anand, A.P., Gokul, H., Srinivasan, H., Vijay, P., Vijayaraghavan, V.: Adversarial patch defense for optical flow networks in video action recognition. In: Proceedings of the IEEE International Conference on Machine Learning and Applications (ICMLA), pp. 1289–1296 (2020)
Baker, S., Scharstein, D., Lewis, J.P., Roth, S., Black, M.J., Szeliski, R.: A database and evaluation methodology for optical flow. Int. J. Comput. Vis. 92(1), 1–31 (2011)
Barron, J.L., Fleet, D.J., Beauchemin, S.S.: Performance of optical flow techniques. Int. J. Comput. Vis. 12, 43–77 (1994)
Black, M.J., Anandan, P.: A framework for the robust estimation of optical flow. In: Proceedings of the IEEE International Conference on Computer Vision (ICCV), pp. 231–236 (1993)
Brown, T.B., Mané, D., Roy, A., Abadi, M., Gilmer, J.: Adversarial patch. arXiv preprint arXiv:1712.09665 (2018)
Brox, T., Bruhn, A., Papenberg, N., Weickert, J.: High accuracy optical flow estimation based on a theory for warping. In: Proceedings of European Conference on Computer Vision (ECCV), pp. 25–36 (2004)
Brox, T., Malik, J.: Large displacement optical flow: descriptor matching in variational motion estimation. IEEE Trans. Pattern Anal. Mach. Intell. 33(3), 500–513 (2011)
Bruhn, A., Weickert, J., Schnörr, C.: Lucas/Kanade meets Horn/Schunck: combining local and global optic flow methods. Int. J. Comput. Vis. 61(3), 211–231 (2005)
Butler, D.J., Wulff, J., Stanley, G.B., Black, M.J.: A naturalistic open source movie for optical flow evaluation. In: Proceedings of European Conference on Computer Vision (ECCV), pp. 611–625 (2012)
Capito, L., Ozguner, U., Redmill, K.: Optical flow based visual potential field for autonomous driving. In: IEEE Intelligent Vehicles Symposium (IV), pp. 885–891 (2020)
Carlini, N., Wagner, D.: Towards evaluating the robustness of neural networks. In: Proceedings of the IEEE Symposium on Security and Privacy (SP), pp. 39–57 (2017)
Deng, Y., Karam, L.J.: Universal adversarial attack via enhanced projected gradient descent. In: Proceedings of the IEEE International Conference on Image Processing (ICIP), pp. 1241–1245 (2020)
Dong, Y., et al.: Boosting adversarial attacks with momentum. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) (2018)
Geiger, A., Lenz, P., Urtasun, R.: Are we ready for autonomous driving? The KITTI vision benchmark suite. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR) (2012)
Goodfellow, I.J., Shlens, J., Szegedy, C.: Explaining and harnessing adversarial examples. arXiv:1412.6572 (2014)
Horn, B.K.P., Schunck, B.G.: Determining optical flow. Artif. Intell. 17(1–3), 185–203 (1981)
Ilg, E., Mayer, N., Saikia, T., Keuper, M., Dosovitskiy, A., Brox, T.: FlowNet 2.0: evolution of optical flow estimation with deep networks. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) (2017)
Janai, J., Güney, F., Wulff, J., Black, M., Geiger, A.: Slow Flow: exploiting high-speed cameras for accurate and diverse optical flow reference data. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), pp. 1406–1416 (2017)
Jiang, S., Campbell, D., Lu, Y., Li, H., Hartley, R.: Learning to estimate hidden motions with global motion aggregation. In: Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), pp. 9772–9781 (2021)
Kurakin, A., Goodfellow, I., Bengio, S.: Adversarial machine learning at scale. arXiv:1611.01236 (2017)
Li, R., Tan, R.T., Cheong, L.-F.: Robust optical flow in rainy scenes. In: Ferrari, V., Hebert, M., Sminchisescu, C., Weiss, Y. (eds.) ECCV 2018. LNCS, vol. 11219, pp. 299–317. Springer, Cham (2018). https://doi.org/10.1007/978-3-030-01267-0_18
Liu, C., Yuen, J., Torralba, A.: SIFT flow: dense correspondence across scenes and its applications. IEEE Trans. Pattern Anal. Mach. Intell. 33(5), 978–994 (2010)
Menze, M., Heipke, C., Geiger, A.: Joint 3D estimation of vehicles and scene flow. In: Proceedings of the ISPRS Workshop on Image Sequence Analysis (ISA) (2015)
Moosavi-Dezfooli, S.M., Fawzi, A., Fawzi, O., Frossard, P.: Universal adversarial perturbations. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) (2017)
Niklaus, S.: A reimplementation of SPyNet using PyTorch (2018). https://github.com/sniklaus/pytorch-spynet
Nocedal, J.: Updating quasi-Newton matrices with limited storage. Math. Comput. 35(151), 773–782 (1980)
Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer, New York (2006). https://doi.org/10.1007/978-0-387-40065-5
Paszke, A., et al.: PyTorch: an imperative style, high-performance deep learning library. In: Proceedings of the Conference on Neural Information Processing Systems (NeurIPS), pp. 8024–8035 (2019)
Ranjan, A., Black, M.J.: Optical flow estimation using a spatial pyramid network. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) (2017)
Ranjan, A., Janai, J., Geiger, A., Black, M.J.: Attacking optical flow. In: Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) (2019)
Reda, F., Pottorff, R., Barker, J., Catanzaro, B.: flownet2-pytorch: PyTorch implementation of FlowNet 2.0: evolution of optical flow estimation with deep networks (2017). https://github.com/NVIDIA/flownet2-pytorch
Schrodi, S., Saikia, T., Brox, T.: Towards understanding adversarial robustness of optical flow networks. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), pp. 8916–8924 (2022)
Shafahi, A., Najibi, M., Xu, Z., Dickerson, J., Davis, L.S., Goldstein, T.: Universal adversarial training. Proc. AAAI Conf. Artif. Intell. 34(04), 5636–5643 (2020)
Stegmaier, T., Oellingrath, E., Himmel, M., Fraas, S.: Differences in epidemic spread patterns of norovirus and influenza seasons of Germany: an application of optical flow analysis in epidemiology. Nat. Res. Sci. Rep. 10(1), 1–14 (2020)
Stein, F.: Efficient computation of optical flow using the census transform. In: Proceedings of the German Conference on Pattern Recognition (DAGM), pp. 79–86 (2004)
Sun, D., Roth, S., Black, M.: Secrets of optical flow estimation and their principles. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 2432–2439 (2010)
Sun, D., Yang, X., Liu, M.Y., Kautz, J.: PWC-Net: CNNs for optical flow using pyramid, warping, and cost volume. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) (2018)
Szegedy, C., et al.: Intriguing properties of neural networks. In: Proceedings of the International Conference on Learning Representations (ICLR) (2014)
Teed, Z., Deng, J.: RAFT: recurrent all-pairs field transforms for optical flow. In: Proceedings of the European Conference on Computer Vision (ECCV), pp. 402–419 (2020)
Tehrani, A., Mirzae, M., Rivaz, H.: Semi-supervised training of optical flow convolutional neural networks in ultrasound elastography. In: Proceedings of the International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI), pp. 504–513 (2020)
Tsipras, D., Santurkar, S., Engstrom, L., Turner, A., Madry, A.: Robustness may be at odds with accuracy. In: Proceedings of the International Conference on Learning Representations (ICLR) (2019)
Ullah, A., Muhammad, K., Del Ser, J., Baik, S.W., de Albuquerque, V.H.C.: Activity recognition using temporal optical flow convolutional features and multilayer LSTM. IEEE Trans. Ind. Electr. 66(12), 9692–9702 (2019)
van de Weijer, J., Gevers, T.: Robust optical flow from photometric invariants. In: Proceedings of the IEEE International Conference on Image Processing (ICIP), vol. 3, pp. 1835–1838 (2004)
Virmaux, A., Scaman, K.: Lipschitz regularity of deep neural networks: analysis and efficient estimation. In: Proceedings of the Conference on Neural Information Processing Systems (NeurIPS) (2018)
Wang, H., Cai, P., Fan, R., Sun, Y., Liu, M.: End-to-end interactive prediction and planning with optical flow distillation for autonomous driving. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPR-W), pp. 2229–2238 (2021)
Wang, L., Guo, Y., Liu, L., Lin, Z., Deng, X., An, W.: Deep video super-resolution using HR optical flow estimation. IEEE Trans. Image Process. 29, 4323–4336 (2020)
Wong, A., Mundhra, M., Soatto, S.: Stereopagnosia: fooling stereo networks with adversarial perturbations. Proc. AAAI Conf. Artif. Intell. 35(4), 2879–2888 (2021)
Xu, H., et al.: Adversarial attacks and defenses in images, graphs and text: a review. Int. J. Autom. Comput. 17(2), 151–178 (2020)
Yang, G., Ramanan, D.: Volumetric correspondence networks for optical flow. In: Proceedings of Conference on Neural Information Processing Systems (NeurIPS), pp. 794–805 (2019)
Yin, Z., Darrell, T., Yu, F.: Hierarchical discrete distribution decomposition for match density estimation. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), pp. 6044–6053 (2019)
Yu, H., Chen, X., Shi, H., Chen, T., Huang, T.S., Sun, S.: Motion pyramid networks for accurate and efficient cardiac motion estimation. In: Proceedings of the International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI), pp. 436–446 (2020)
Zhang, F., Woodford, O., Prisacariu, V., Torr, P.: Separable flow: learning motion cost volumes for optical flow estimation. In: Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), pp. 10807–10817 (2021)
Zhang, T., Zhang, H., Li, Y., Nakamura, Y., Zhang, L.: FlowFusion: dynamic dense RGB-D SLAM based on optical flow. In: Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), pp. 7322–7328 (2020)
Acknowledgments
Funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) – Project-ID 251654672 – TRR 161 (B04). The International Max Planck Research School for Intelligent Systems supports J.S.
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Schmalfuss, J., Scholze, P., Bruhn, A. (2022). A Perturbation-Constrained Adversarial Attack for Evaluating the Robustness of Optical Flow. In: Avidan, S., Brostow, G., Cissé, M., Farinella, G.M., Hassner, T. (eds) Computer Vision – ECCV 2022. ECCV 2022. Lecture Notes in Computer Science, vol 13682. Springer, Cham. https://doi.org/10.1007/978-3-031-20047-2_11
DOI: https://doi.org/10.1007/978-3-031-20047-2_11
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-20046-5
Online ISBN: 978-3-031-20047-2