21 March 2022
Machine learning IoT use cases involve thousands of sensor signals, placing a high demand on the cloud. One challenge for any cloud provider tackling big-data use cases is that peak memory utilization scales non-linearly with the number of sensors, so sizing cloud shapes correctly and autonomously before a program runs is complicated. To address this, Oracle developed an autonomous formularization tool built on OCI Anomaly Detection’s patented MSET2 algorithm that optimally sizes RAM and/or VRAM capacity. This helps developers estimate the required computing resources beforehand and avoid out-of-memory errors, and it also prevents excessively conservative RAM pre-allocations, which saves cost for customers.
Venue: 2022 NVIDIA GPU Technology Conference (Virtual)
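
The idea of "formularizing" memory demand can be sketched as follows. This is an illustrative example only, not the talk's actual method: it assumes hypothetical peak-RAM measurements at a few sensor counts, fits a power-law scaling curve in log-log space, and extrapolates the RAM needed (with a safety margin) before launching a large run.

```python
import numpy as np

# Hypothetical measurements: observed peak RAM (GB) at small sensor counts.
# These numbers are illustrative, not from the talk.
sensors = np.array([100, 200, 400, 800])
peak_gb = np.array([1.2, 2.9, 7.1, 17.5])

# Fit a power law, peak = a * n^b, via linear regression in log-log space.
# b > 1 captures the non-linear (superlinear) scaling with sensor count.
b, log_a = np.polyfit(np.log(sensors), np.log(peak_gb), 1)
a = np.exp(log_a)

def required_ram_gb(n_sensors, headroom=1.25):
    """Predict peak RAM for n_sensors, padded by a safety margin."""
    return a * n_sensors ** b * headroom

# Size a cloud shape for a 10,000-sensor run before it starts.
print(f"estimated RAM: {required_ram_gb(10_000):.1f} GB")
```

The headroom factor trades cost against out-of-memory risk: too small reintroduces OOM failures, too large recreates the conservative over-allocation the tool is meant to avoid.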