Jarvis: Large-scale server monitoring with adaptive near-data processing
Rapid detection and mitigation of issues that impact performance and reliability are
paramount for large-scale online services. For real-time detection of such issues, datacenter …
Reliability Evaluation and Fault Tolerant Design for KLL Sketches
Z Gao, J Zhu, P Reviriego - IEEE Transactions on Emerging …, 2023 - ieeexplore.ieee.org
Quantile estimation is a fundamental task in Big Data analysis. In order to achieve high-
speed estimation with low memory consumption, especially for streaming Big Data …
Holistic Knowledge Framework for Improving PC Design using Big Data Analytics
T Rajendran, O Vignesh, V Priyadharsini… - … and Control Systems …, 2023 - ieeexplore.ieee.org
Cyberspace is massively expanding every day, and the users of these digital devices are
looking for more innovative applications to ease their day-to-day work. The main objective of …
An adaptive placement framework for efficient near-data stream processing over data source-edge-cloud systems
A Sandur - 2022 - ideals.illinois.edu
Large amounts of data are being generated across many different domains including
datacenters, surveillance cameras, mobile devices and other Internet-of-things (IoT) …
An Effective Single-Pass Approach for Estimating the Φ-quantile in Data Streams
Z Xue - … Conference on Algorithms and Architectures for …, 2021 - Springer
Random sampling is a common method to deal with large-scale data sets and in particular
to deal with data streams. However, the accuracy of this method decreases greatly with the …