Hierarchical Vector Quantized Transformer for Multi-class Unsupervised Anomaly Detection
Advances in Neural Information Processing Systems, 2023 (proceedings.neurips.cc)
Abstract
Unsupervised image Anomaly Detection (UAD) aims to learn robust and discriminative representations of normal samples. Since separate solutions per class incur expensive computation and limited generalizability, this paper focuses on building a unified framework for multiple classes. Under such a challenging setting, popular reconstruction-based networks with a continuous latent representation assumption always suffer from the "identical shortcut" issue, where both normal and abnormal samples can be well recovered and are difficult to distinguish. To address this pivotal issue, we propose a hierarchical vector quantized prototype-oriented Transformer under a probabilistic framework. First, instead of learning continuous representations, we preserve the typical normal patterns as discrete iconic prototypes, and confirm the importance of Vector Quantization in preventing the model from falling into the shortcut. The vector quantized iconic prototypes are integrated into the Transformer for reconstruction, such that an abnormal data point is flipped to a normal data point. Second, we investigate an exquisite hierarchical framework to relieve the codebook collapse issue and replenish frail normal patterns. Third, a prototype-oriented optimal transport method is proposed to better regulate the prototypes and hierarchically evaluate the abnormal score. Evaluated on the MVTec-AD and VisA datasets, our model surpasses the state-of-the-art alternatives and possesses good interpretability. The code is available at https://github.com/RuiyingLu/HVQ-Trans.
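The vector-quantization step the abstract credits with avoiding the "identical shortcut" can be sketched as a nearest-prototype lookup: each continuous feature is snapped to its closest entry in a discrete codebook of normal prototypes. The following minimal NumPy illustration is a sketch under assumed names, shapes, and toy data, not the paper's actual implementation:

```python
import numpy as np

def vector_quantize(features, codebook):
    """Map each continuous feature vector to its nearest discrete prototype.

    `features` has shape (N, D) and `codebook` shape (K, D); both the
    function name and the shapes are illustrative assumptions.
    """
    # Pairwise squared Euclidean distances between features and prototypes.
    dists = ((features[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    codes = dists.argmin(axis=1)        # index of the nearest prototype
    return codebook[codes], codes       # quantized features and their codes

# Toy example: two features, a codebook of two "normal" prototypes.
feats = np.array([[0.1, 0.0], [5.0, 5.0]])
book = np.array([[0.0, 0.0], [1.0, 1.0]])
quantized, codes = vector_quantize(feats, book)
```

Because reconstruction must pass through these discrete prototypes, a feature far from every codebook entry (like the second row above) cannot be recovered faithfully, which is the intuition for why quantization blocks the shortcut where abnormal samples are reconstructed as well as normal ones.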