Prognostic Bakeoff Evaluation: Oracle MSET2 vs Modern Anomaly Detection Techniques

Guang Wang, Matthew Gerdes, Kenny Gross

01 June 2024

Time series anomaly detection is a hard problem that has been studied across a broad spectrum of research areas because of its diverse applications in different domains. Despite significant advances driven by the wide adoption of modern machine learning algorithms, no single anomaly detector is known to perform best on all time series datasets. Meanwhile, Oracle has launched an Anomaly Detection service that employs an advanced pattern recognition algorithm, while Oracle's major competitors adopt LSTM in their equivalent services. As Oracle researchers, we were motivated to conduct a broad comparative evaluation of Oracle's MSET, LSTM, and other state-of-the-art techniques from the literature to fully understand the value propositions of MSET. We developed a benchmarking test bed to evaluate the detection results, reconstruction accuracy, and computational cost of the anomaly detection techniques of interest. The benchmark datasets consist of synthetic datasets, publicly available datasets, and an Oracle customer's dataset. MSET is demonstrated to achieve a higher average F1 score than LSTM in most test cases, and to deliver a remarkable competitive advantage over the competing methods in false-alarm rate, reconstruction accuracy, and computational cost. The presence of signal correlations in the datasets is found to be an important determinant of MSET's performance. Finally, although explainability could not be quantified in our study, we show that it constitutes a key value proposition of MSET, favoured by the IoT industries that MSET targets.
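
(Illustration only, not from the paper: the test bed itself is not described here. Below is a minimal Python sketch, assuming point-wise binary anomaly labels and reconstruction-style detectors, of how the reported criteria, detection F1, false-alarm rate, and reconstruction accuracy, might be scored, together with a rough cross-signal correlation check of the kind the study identifies as a performance determinant for MSET. All function names are hypothetical.)

    import numpy as np
    from sklearn.metrics import f1_score

    def score_detector(y_true, y_pred, x_true, x_recon):
        """Hypothetical benchmark scoring for one detector on one dataset:
        point-wise detection F1, false-alarm rate, and reconstruction RMSE."""
        f1 = f1_score(y_true, y_pred)                     # point-wise anomaly F1
        fa = float(np.mean(y_pred[y_true == 0]))          # fraction of normal points flagged
        rmse = float(np.sqrt(np.mean((x_true - x_recon) ** 2)))  # reconstruction accuracy
        return f1, fa, rmse

    def mean_signal_correlation(X):
        """Mean absolute off-diagonal Pearson correlation across the columns
        (signals) of X: a crude proxy for the degree of cross-signal
        correlation present in a dataset."""
        c = np.corrcoef(X, rowvar=False)
        n = c.shape[0]
        return (np.abs(c).sum() - n) / (n * (n - 1))

Computational cost, the third benchmark criterion, could be captured by timing each detector's training and scoring calls (e.g., with time.perf_counter()) on the same hardware.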


Venue : a joint Zoom presentation with Lennard to iTrust in Singapore

File Name : MSET_Benchmark_Study_Jan31.pdf