2023
Neural Architecture Design and Robustness: A Dataset
(Accepted/in press) S. Jung, J. Lukasik and M. Keuper
Eleventh International Conference on Learning Representations (ICLR 2023), 2023
Abstract
Deep learning models have proven to be successful in a wide range of machine learning tasks. Yet, they are often highly sensitive to perturbations on the input data which can lead to incorrect decisions with high confidence, hampering their deployment for practical use-cases. Thus, finding architectures that are (more) robust against perturbations has received much attention in recent years. Just like the search for well-performing architectures in terms of clean accuracy, this usually involves a tedious trial-and-error process with one additional challenge: the evaluation of a network's robustness is significantly more expensive than its evaluation for clean accuracy. Thus, the aim of this paper is to facilitate better streamlined research on architectural design choices with respect to their impact on robustness as well as, for example, the evaluation of surrogate measures for robustness. We therefore borrow one of the most commonly considered search spaces for neural architecture search for image classification, NAS-Bench-201, which contains a manageable size of 6466 non-isomorphic network designs. We evaluate all these networks on a range of common adversarial attacks and corruption types and introduce a database on neural architecture design and robustness evaluations. We further present three exemplary use cases of this dataset, in which we (i) benchmark robustness measurements based on Jacobian and Hessian matrices for their robustness predictability, (ii) perform neural architecture search on robust accuracies, and (iii) provide an initial analysis of how architectural design choices affect robustness. We find that carefully crafting the topology of a network can have substantial impact on its robustness, where networks with the same parameter count range in mean adversarial robust accuracy from 20% to 41%.
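The first use case above benchmarks Jacobian- and Hessian-based measures for how well they predict robustness. As a rough illustration of what such a surrogate measure can look like (a minimal sketch, not the paper's implementation), the snippet below computes the Frobenius norm of the Jacobian of a PyTorch classifier's logits with respect to a single input; the toy model, input shape, and norm choice are assumptions made purely for illustration.

```python
import torch
import torch.nn as nn

def jacobian_frobenius_norm(model: nn.Module, x: torch.Tensor) -> torch.Tensor:
    """Frobenius norm of d(logits)/d(input) for one input: a rough surrogate
    for local input sensitivity (a larger norm suggests less local robustness)."""
    x = x.unsqueeze(0)  # add a batch dimension
    jac = torch.autograd.functional.jacobian(model, x)  # shape: (1, num_classes, 1, *x.shape[1:])
    return jac.flatten().norm(p=2)

# Hypothetical usage with a tiny CNN and a random CIFAR-10-sized input.
model = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3, padding=1), nn.ReLU(),
    nn.Flatten(), nn.Linear(8 * 32 * 32, 10),
)
score = jacobian_frobenius_norm(model, torch.randn(3, 32, 32))
print(f"Jacobian-norm proxy: {score.item():.3f}")
```

Such a proxy is cheap to compute per architecture, which is exactly why the dataset is useful for checking whether it correlates with the measured adversarial robust accuracies.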
FreeMatch: Self-adaptive Thresholding for Semi-supervised Learning
(Accepted/in press) Y. Wang, H. Chen, Q. Heng, W. Hou, Y. Fan, Z. Wu, J. Wang, M. Savvides, T. Shinozaki, B. Raj, B. Schiele and X. Xie
Eleventh International Conference on Learning Representations (ICLR 2023), 2023
Higher-Order Multicuts for Geometric Model Fitting and Motion Segmentation
E. Levinkov, A. Kardoost, B. Andres and M. Keuper
IEEE Transactions on Pattern Analysis and Machine Intelligence, Volume 45, Number 1, 2023
Abstract
The minimum cost lifted multicut problem is a generalization of the multicut problem and provides a means of optimizing a decomposition of a graph w.r.t. both positive and negative edge costs. Its main advantage is that multicut-based formulations do not require the number of components to be given a priori; instead, it is deduced from the solution. However, the standard multicut cost function is limited to pairwise relationships between nodes, while several important applications either require or can benefit from a higher-order cost function, i.e. hyper-edges. In this paper, we propose a pseudo-boolean formulation for a multiple model fitting problem. It is based on a formulation of any-order minimum cost lifted multicuts, which allows partitioning an undirected graph with pairwise connectivity so as to minimize costs defined over any set of hyper-edges. As the proposed formulation is NP-hard and branch-and-bound is too slow in practice, we propose an efficient local search algorithm for inference in the resulting problems. We demonstrate the versatility and effectiveness of our approach in several applications: geometric multiple model fitting, homography and motion estimation, and motion segmentation.
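To make the objective concrete, the minimal sketch below evaluates the standard pairwise multicut cost of a given graph decomposition, i.e. the sum of the costs of all edges whose endpoints lie in different components. The paper generalizes this cost to arbitrary hyper-edges and lifted edges, which this toy example does not cover; the graph, costs, and labelling are invented for illustration.

```python
from typing import Dict, Tuple

def multicut_cost(edge_costs: Dict[Tuple[int, int], float],
                  labels: Dict[int, int]) -> float:
    """Cost of a decomposition: an edge contributes its cost iff it is cut,
    i.e. its endpoints are assigned to different components."""
    return sum(c for (u, v), c in edge_costs.items() if labels[u] != labels[v])

# Toy graph: negative costs reward cutting an edge, positive costs penalize it.
edges = {(0, 1): 2.0, (1, 2): -1.5, (0, 2): -0.5}
print(multicut_cost(edges, {0: 0, 1: 0, 2: 1}))  # cuts (1,2) and (0,2): -2.0
print(multicut_cost(edges, {0: 0, 1: 1, 2: 2}))  # cuts every edge: 0.0
```

Because the number of components is encoded only through the labelling, it is indeed a by-product of the optimization rather than an input, which is the property the abstract highlights.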
Urban Scene Semantic Segmentation With Low-Cost Coarse Annotation
A. Das, Y. Xian, Y. He, Z. Akata and B. Schiele
2023 IEEE Winter Conference on Applications of Computer Vision (WACV 2023), 2023
Intra-Source Style Augmentation for Improved Domain Generalization
Y. Li, D. Zhang, M. Keuper and A. Khoreva
2023 IEEE Winter Conference on Applications of Computer Vision (WACV 2023), 2023
Revisiting Consistency Regularization for Semi-supervised Learning
Y. Fan, A. Kukleva, D. Dai and B. Schiele
International Journal of Computer Vision, Volume 131, 2023
Online Hyperparameter Optimization for Class-Incremental Learning
(Accepted/in press) Y. Liu, Y. Li, B. Schiele and Q. Sun
Proceedings of the 37th AAAI Conference on Artificial Intelligence, 2023
Joint Self-Supervised Image-Volume Representation Learning with Intra-Inter Contrastive Clustering
(Accepted/in press) D. M. H. Nguyen, H. Nguyen, M. T. N. Truong, T. Cao, B. T. Nguyen, N. Ho, P. Swoboda, S. Albarqouni, P. Xie and D. Sonntag
Proceedings of the 37th AAAI Conference on Artificial Intelligence, 2023