Light-M: An efficient lightweight medical image segmentation framework for resource-constrained IoMT
The Internet of Medical Things (IoMT) is being incorporated into current healthcare
systems. This technology aims to connect patients, IoMT devices, and hospitals over …
Federated Learning With Selective Knowledge Distillation Over Bandwidth-constrained Wireless Networks
Artificial Intelligence (AI) applications on Internet of Things (IoT) networks often involve
relaying generated data to a server for deep learning training, which poses security risks to …
Federated Distillation: A Survey
Federated Learning (FL) seeks to train a model collaboratively without sharing private
training data from individual clients. Despite its promise, FL encounters challenges such as …
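The core idea behind federated distillation is that clients exchange model outputs rather than model weights. The NumPy sketch below is a rough illustration only, not code from the surveyed papers: hypothetical Client objects upload their logits on a small shared reference set, the server averages them into a consensus, and each client distills that consensus back into its local model.

```python
# Minimal federated-distillation sketch (illustrative only, not from the cited papers):
# clients share predicted logits on a shared reference set instead of model weights.
# All names (Client, reference_x, etc.) are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

class Client:
    """A linear classifier trained locally; only its logits leave the device."""
    def __init__(self, n_features, n_classes):
        self.w = rng.normal(scale=0.1, size=(n_features, n_classes))

    def local_logits(self, x):
        return x @ self.w  # logits on the shared reference set

    def distill(self, x, teacher_probs, lr=0.1, epochs=20):
        # Gradient descent on cross-entropy against the averaged (teacher) probabilities.
        for _ in range(epochs):
            probs = softmax(x @ self.w)
            grad = x.T @ (probs - teacher_probs) / len(x)
            self.w -= lr * grad

# Shared, unlabeled reference data that every client can see.
reference_x = rng.normal(size=(64, 8))
clients = [Client(n_features=8, n_classes=3) for _ in range(5)]

for communication_round in range(3):
    # 1) Each client uploads only its logits (bandwidth scales with |reference| * classes).
    logits = np.stack([c.local_logits(reference_x) for c in clients])
    # 2) The server aggregates knowledge by averaging the soft predictions.
    consensus = softmax(logits.mean(axis=0))
    # 3) Clients distill the consensus back into their local models.
    for c in clients:
        c.distill(reference_x, consensus)

print("round-3 consensus entropy:",
      float(-(consensus * np.log(consensus + 1e-12)).sum(axis=1).mean()))
```

Compared with weight averaging, the payload here depends only on the reference-set size and the number of classes, which is why this style of exchange is attractive for bandwidth-constrained settings.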
Trustworthy federated learning model for the internet of robotic things
Federated learning (FL) has become a viable concept in the Internet of Robotic Things
(IoRT) by allowing local gradients to be shared and used to train a global model without …
A Trustworthy Decentralized Federated Learning Framework for Consumer Electronics: Mitigating Large-Scale AIoT Heterogeneity through Transfer Knowledge …
IoT-enabled consumer electronics can collect and analyze data to improve functionality and
user experiences, increasingly becoming part of edge computing networks. Decentralized …
Knowledge Distillation in Federated Learning: a Survey on Long Lasting Challenges and New Solutions
Federated Learning (FL) is a distributed and privacy-preserving machine learning paradigm
that coordinates multiple clients to train a model while keeping the raw data localized …
Unity is Power: Semi-Asynchronous Collaborative Training of Large-Scale Models with Structured Pruning in Resource-Limited Clients
In this work, we study how to unleash the potential of massive, heterogeneous, weak computing
power to collaboratively train large-scale models on dispersed datasets. In order to improve …
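Structured pruning, as named in this title, removes whole channels or neurons so the remaining network stays dense and cheap to run on weak clients. The sketch below is a generic, hypothetical illustration of the idea (not this paper's method): output channels are ranked by L1 norm and the weakest ones are dropped from both the layer and the following layer's input.

```python
# Minimal structured-pruning sketch (hypothetical, not the paper's algorithm):
# rank the output channels of a weight matrix by L1 norm and drop the weakest,
# shrinking this layer and the next layer's input dimension together.
import numpy as np

rng = np.random.default_rng(1)

def prune_channels(w_this, w_next, keep_ratio=0.5):
    """w_this: (in, out) weights of a layer; w_next: (out, next_out) weights of the
    following layer. Returns both matrices with the weakest output channels removed."""
    l1 = np.abs(w_this).sum(axis=0)                # importance score per output channel
    n_keep = max(1, int(round(keep_ratio * w_this.shape[1])))
    keep = np.sort(np.argsort(l1)[-n_keep:])       # indices of channels to keep
    return w_this[:, keep], w_next[keep, :], keep

w1 = rng.normal(size=(16, 32))   # layer 1: 16 -> 32
w2 = rng.normal(size=(32, 10))   # layer 2: 32 -> 10

w1_p, w2_p, kept = prune_channels(w1, w2, keep_ratio=0.25)
print("layer 1:", w1.shape, "->", w1_p.shape)      # (16, 32) -> (16, 8)
print("layer 2:", w2.shape, "->", w2_p.shape)      # (32, 10) -> (8, 10)

# The pruned sub-network still computes a valid forward pass on the surviving channels.
x = rng.normal(size=(4, 16))
out = np.maximum(x @ w1_p, 0.0) @ w2_p             # ReLU MLP with pruned hidden layer
print("output shape:", out.shape)                   # (4, 10)
```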
Energy-Efficient Federated Knowledge Distillation Learning in Internet of Drones
Federated Learning (FL) in the Internet of Drones (IoD) leverages the distributed
computational resources of drones for collaborative learning, while addressing challenges …
Data analysis algorithm for internet of things based on federated learning with optical technology
V Tiwari, S Ananthakumaran, MR Shree… - Optical and Quantum …, 2024 - Springer
As the Internet of Things (IoT) progresses, federated learning (FL), a decentralized
machine learning framework that preserves every participant's data privacy, has grown in …
Fall Detection using Knowledge Distillation Based Long short-term memory for Offline Embedded and Low Power Devices
H Zhou, A Chen, C Buer, E Chen, K Tang… - arXiv preprint arXiv …, 2023 - arxiv.org
This paper presents a cost-effective, low-power approach to unintentional fall detection
using knowledge distillation-based LSTM (Long Short-Term Memory) models to significantly …
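Knowledge distillation of the kind named in this title typically trains the small on-device model against a blend of temperature-softened teacher probabilities and the hard labels. The sketch below is a generic, hypothetical illustration of that loss (Hinton-style), using plain feed-forward logits rather than an LSTM for brevity; it is not the paper's code.

```python
# Minimal knowledge-distillation loss sketch (illustrative, not the paper's code):
# blend KL divergence against temperature-softened teacher probabilities with the
# usual cross-entropy on hard labels.
import numpy as np

def softmax(z, temperature=1.0):
    z = z / temperature
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def kd_loss(student_logits, teacher_logits, labels, temperature=4.0, alpha=0.7):
    """alpha weights the distillation term; (1 - alpha) weights the hard-label term."""
    t_probs = softmax(teacher_logits, temperature)
    s_probs = softmax(student_logits, temperature)
    # KL(teacher || student), scaled by T^2 as in the original KD formulation.
    distill = (t_probs * (np.log(t_probs + 1e-12) - np.log(s_probs + 1e-12))).sum(axis=1)
    distill = (temperature ** 2) * distill.mean()
    hard_probs = softmax(student_logits)            # T = 1 for the hard-label term
    ce = -np.log(hard_probs[np.arange(len(labels)), labels] + 1e-12).mean()
    return alpha * distill + (1.0 - alpha) * ce

rng = np.random.default_rng(2)
teacher_logits = rng.normal(size=(8, 2)) * 3.0      # e.g. large model's fall / no-fall scores
student_logits = rng.normal(size=(8, 2))            # lightweight on-device model's scores
labels = rng.integers(0, 2, size=8)
print("KD loss:", float(kd_loss(student_logits, teacher_logits, labels)))
```

A higher temperature spreads the teacher's probability mass over both classes, which is what lets the compact student pick up the teacher's confidence structure rather than only its argmax decisions.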