Real Time Empirical Synchronization of IoT Signals for Improved AI Prognostics

Guang Wang, Kenny Gross

14 December 2018

A significant challenge for Machine Learning (ML) prognostic analyses of large-scale time series databases is variable clock skew among multiple data acquisition (DAQ) systems, both across the assets in a monitored fleet and within individual assets, where the sheer number of deployed sensors is so large that multiple DAQs, each with its own internal clock, can create significant clock-mismatch issues. For Big Data prognostic anomaly detection, we have discovered and amply demonstrated that variable clock skew in the timestamps of time series telemetry signatures degrades ML prognostics, resulting in high false-alarm and missed-alarm probabilities (FAPs and MAPs). This paper describes a new Analytical Resampling Process (ARP) that embodies novel time-domain and frequency-domain techniques for interpolative online normalization and optimal phase coherence, so that all system telemetry time series outputs are available in a uniform format and aligned to a common sampling frequency. More importantly, the "optimality" of the proposed technique lets end users select between "ultimate accuracy" and "lowest overhead compute cost" for automated coherence synchronization of collections of time series signatures, whether from a few sensors or hundreds of thousands of sensors, and regardless of the sampling rates and signal-to-noise (S/N) ratios of those sensors.
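The abstract's two core operations, interpolative resampling of each signal onto a common uniform sampling grid, followed by phase alignment across signals, can be illustrated with a minimal sketch. This is not the authors' ARP algorithm: the function names are hypothetical, the interpolation here is simple linear interpolation via `np.interp`, and the phase-coherence step is approximated by an integer-sample lag estimate from cross-correlation.

```python
import numpy as np

def resample_to_common_grid(t, x, fs_common):
    """Interpolate a signal with its own DAQ timestamps t onto a
    uniform grid at the common sampling frequency fs_common (Hz)."""
    t_uniform = np.arange(t[0], t[-1], 1.0 / fs_common)
    return t_uniform, np.interp(t_uniform, t, x)

def estimate_lag(x, y):
    """Estimate the integer-sample delay of y relative to x by
    locating the peak of their full cross-correlation."""
    x0 = x - x.mean()
    y0 = y - y.mean()
    corr = np.correlate(y0, x0, mode="full")
    return int(np.argmax(corr)) - (len(x0) - 1)

# Two DAQs sample the same 1 Hz signal at different rates; the
# second DAQ's clock is skewed by 50 ms.
fs_common = 100.0
t1 = np.arange(0.0, 2.0, 0.013)               # DAQ 1: ~77 Hz
t2 = np.arange(0.0, 2.0, 0.007)               # DAQ 2: ~143 Hz
x1 = np.sin(2 * np.pi * 1.0 * t1)
x2 = np.sin(2 * np.pi * 1.0 * (t2 - 0.05))    # 50 ms clock skew

_, r1 = resample_to_common_grid(t1, x1, fs_common)
_, r2 = resample_to_common_grid(t2, x2, fs_common)
n = min(len(r1), len(r2))
lag = estimate_lag(r1[:n], r2[:n])            # ~5 samples at 100 Hz
```

After the lag estimate, the skewed signal can be shifted by `lag` samples so both series share a coherent timeline; a production system would refine this to sub-sample precision (e.g. in the frequency domain) and weigh the accuracy gain against the extra compute, the trade-off the abstract highlights.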

Venue : IEEE Int'l Symposium on Computational Intelligence (CSCI-ISCI)

File Name : ARP_CSCI_paper_r4.pdf