Emerging Research Trends in Data Deduplication: A Bibliometric Analysis from 2010 to 2023
In both industry and academia today, the demand for efficient utilization of data
storage needs to be taken into account, as large volumes of duplicate data in the cloud lead to a waste …
Exploring query processing on CPU-GPU integrated edge device
Huge amounts of data are generated on edge devices every day, which requires
efficient data analytics and management. However, due to the limited computing capacity of …
Dingo optimization based network bandwidth selection to reduce processing time during data upload and access from cloud by user
Big data processing is considered significant because a massive amount of data is
generated due to the rapid use of the internet by people all over the globe. Cloud …
Certificateless integrity auditing scheme for sensitive information protection in cloud storage
J Wen, L Deng - Journal of Systems Architecture, 2024 - Elsevier
Data integrity auditing provides a method for checking the integrity of outsourced data in
cloud storage. However, outsourced data often contain sensitive information (such as …
UltraCDC: A Fast and Stable Content-Defined Chunking Algorithm for Deduplication-based Backup Storage Systems
P Zhou, Z Wang, W Xia, H Zhang - 2022 IEEE International …, 2022 - ieeexplore.ieee.org
Content-Defined Chunking (CDC) is the key stage of data deduplication since it has a
significant impact on a deduplication system's throughput and deduplication efficiency …
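To make the role of CDC concrete, the following is a minimal sketch of a generic Gear-hash content-defined chunker; it is not the UltraCDC algorithm itself, and the chunk-size limits, the random Gear table, and the power-of-two mask are assumptions for illustration only.

import random

random.seed(0)
GEAR = [random.getrandbits(64) for _ in range(256)]   # assumed per-byte random table

def cdc_chunks(data, min_size=2048, avg_size=8192, max_size=65536):
    # Yield content-defined chunks: a boundary is declared where the Gear
    # rolling hash matches a mask, so cut points move with the content
    # rather than sitting at fixed offsets.
    mask = avg_size - 1                 # assumes avg_size is a power of two
    chunks, start = [], 0
    while start < len(data):
        end = min(start + max_size, len(data))
        h, i = 0, start
        while i < end:
            h = ((h << 1) + GEAR[data[i]]) & 0xFFFFFFFFFFFFFFFF
            i += 1
            if i - start >= min_size and (h & mask) == 0:
                break                   # content-defined cut point found
        chunks.append(data[start:i])    # otherwise cut at max_size / end of data
        start = i
    return chunks

In a deduplication pipeline, each chunk would then be fingerprinted (for example with SHA-256) and stored only if its fingerprint has not been seen before; per the title and abstract, UltraCDC's contribution lies in making this boundary-detection stage faster and more stable.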
Efficient Container Image Updating in Low-bandwidth Networks with Delta Encoding
Containers are a Linux technology for isolating execution environments. By distributing a
container image, which is the collection of files contained in the container, users can use an …
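As a rough illustration of why delta encoding helps in low-bandwidth settings, the sketch below expresses a new file as copy/literal instructions against an old file the receiver already holds, so only unmatched blocks need to cross the network. It is an rsync-style simplification, not the update scheme of this paper; the 4 KiB block size, SHA-256 digests, and instruction format are assumptions.

import hashlib

BLOCK = 4096

def make_delta(old_data, new_data):
    # Index the blocks of the old file by digest, then describe the new file
    # as references to old blocks ("copy") plus raw bytes ("literal").
    old_index = {}
    for off in range(0, len(old_data), BLOCK):
        digest = hashlib.sha256(old_data[off:off + BLOCK]).digest()
        old_index.setdefault(digest, off)
    delta = []
    for off in range(0, len(new_data), BLOCK):
        block = new_data[off:off + BLOCK]
        digest = hashlib.sha256(block).digest()
        if digest in old_index:
            delta.append(("copy", old_index[digest]))   # receiver reuses a local block
        else:
            delta.append(("literal", block))            # block must be transmitted
    return delta

def apply_delta(old_data, delta):
    # Rebuild the new file on the receiver from the old file and the delta.
    out = bytearray()
    for op, arg in delta:
        out += old_data[arg:arg + BLOCK] if op == "copy" else arg
    return bytes(out)

For an image update where most layers are unchanged, nearly every instruction is a "copy", which is the effect the paper targets for container image distribution over constrained links.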
BFDup: Batch Fuzzy Deduplication Scheme for Massive Data in Non-Trusted Environments
Z Tang, S Cheng, S Zeng, L Xiong - International Conference on Frontiers …, 2024 - Springer
Existing fuzzy deduplication solutions focus more on providing secure, low-redundancy
storage services while ignoring the efficiency of deduplication matching: batch checking …
Synchronization of data in heterogeneous decentralized systems
N Boškov - 2023 - search.proquest.com
Data synchronization is the problem of reconciling the differences between large data stores
that differ in a small number of records. It is a common thread among disparate distributed …
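The problem statement can be made concrete with a naive reconciliation sketch: each side exchanges record digests and fetches only the records it is missing. The dissertation studies protocols whose communication scales with the size of the difference rather than with the full stores; the digest exchange below is only an illustrative baseline, and all names are hypothetical.

import hashlib

def digests(records):
    # Map each record (a byte string) to its SHA-256 digest.
    return {hashlib.sha256(r).hexdigest(): r for r in records}

def reconcile(store_a, store_b):
    # Return (records A is missing, records B is missing).
    da, db = digests(store_a), digests(store_b)
    need_a = [db[h] for h in db.keys() - da.keys()]   # records only B holds
    need_b = [da[h] for h in da.keys() - db.keys()]   # records only A holds
    return need_a, need_b

# Example: stores differing in one record each.
# reconcile([b"rec1", b"rec2"], [b"rec2", b"rec3"]) -> ([b"rec3"], [b"rec1"])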
Design of a Hybrid Mobile Application for Browsing Herbal Medicinal Plants Using the Delta Sync Method
RY Bakti - Arus Jurnal Sains dan Teknologi, 2024 - jurnal.ardenjaya.com
This study aims to develop an application that provides a
platform for users to access information about medicinal plants and their benefits …