Out-of-Distribution Detection with Logical Reasoning

Code: Here · Paper: Here

Our paper Out-of-Distribution Detection with Logical Reasoning has been accepted at WACV 2024.

Abstract

Machine Learning models often only generalize reliably to samples from the training distribution. Consequently, detecting when input data is out-of-distribution (OOD) is crucial, especially in safety-critical applications. Current OOD detection methods, however, tend to be domain agnostic and often fail to incorporate valuable prior knowledge about the structure of the training distribution. To address this limitation, we introduce a novel, hybrid OOD detection algorithm that combines a deep learning-based perception system with a first-order logic-based knowledge representation. A logical reasoning system uses this knowledge base at run-time to infer whether inputs are consistent with prior knowledge about the training distribution. In contrast to purely neural systems, the structured knowledge representation allows humans to inspect and modify the rules that govern the OOD detectors’ behavior. This not only enhances performance but also fosters a level of explainability that is particularly beneficial in safety-critical contexts. We demonstrate the effectiveness of our method through experiments on several datasets and discuss advantages and limitations.
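To give a rough feel for the hybrid idea described above, here is a minimal sketch, not the paper's actual implementation: a neural "perception" model maps an input to symbolic attributes, and a small hand-written rule base checks whether those attributes are consistent with prior knowledge about the training distribution. All names (`AttributePredictor`, `RULES`, `is_ood`, the example attributes) are illustrative assumptions.

```python
# Sketch: neural perception + logical consistency check for OOD detection.
# Inputs whose predicted attributes violate any rule are flagged as OOD.
from dataclasses import dataclass
from typing import Callable, Dict, List

import torch
import torch.nn as nn


class AttributePredictor(nn.Module):
    """Toy perception network: image -> probabilities of symbolic attributes."""

    def __init__(self, num_attributes: int = 4):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(16, num_attributes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.sigmoid(self.backbone(x))


@dataclass
class Rule:
    """One piece of prior knowledge, expressed as a predicate over attributes."""
    name: str
    check: Callable[[Dict[str, bool]], bool]


ATTRIBUTE_NAMES = ["road", "wheels", "vehicle", "pedestrian"]

# Example knowledge base: in-distribution scenes contain a road,
# and anything with wheels must also be a vehicle.
RULES: List[Rule] = [
    Rule("has_road", lambda a: a["road"]),
    Rule("wheels_imply_vehicle", lambda a: (not a["wheels"]) or a["vehicle"]),
]


def is_ood(image: torch.Tensor, model: AttributePredictor, threshold: float = 0.5) -> bool:
    """Flag an input as OOD if any rule in the knowledge base is violated."""
    with torch.no_grad():
        probs = model(image.unsqueeze(0)).squeeze(0)
    attributes = {name: bool(p > threshold) for name, p in zip(ATTRIBUTE_NAMES, probs)}
    return any(not rule.check(attributes) for rule in RULES)


if __name__ == "__main__":
    model = AttributePredictor(num_attributes=len(ATTRIBUTE_NAMES))
    dummy_image = torch.rand(3, 64, 64)
    print("OOD?", is_ood(dummy_image, model))
```

Because the rules live in an explicit, human-readable knowledge base rather than in network weights, they can be inspected and edited, which is the source of the explainability benefit discussed in the paper.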

Video

Below, you can find the presentation video I created for the conference. I used OpenAI's API for writing the script as well as for voice synthesis. The total production cost was $0.15.


Last Updated: 04 Jan. 2024
Categories: Anomaly Detection · Neuro-Symbolic
Tags: WACV · Anomaly Detection · Neuro-Symbolic